Top 5 Reasons Test Automation Fails
Over the last decade, test automation has gained significant traction in software testing. The adoption of Agile methodologies and DevOps principles has led enterprises and test engineers to plunge headlong into automated testing. Unfortunately, not every automation project has returned the time, money, and resources invested in it. In fact, stories of failed test automation projects have become a cliché; odd, when you consider the substantial promise that test automation offers.
Rather than blaming the automation process itself, we gain more insight by looking at the organizational context surrounding an automation strategy. Failed strategies often result from the actions of the people involved rather than from the automation itself. Below are five common problems encountered in developing and maintaining an enterprise test automation strategy.
1. Lack of Management Buy-in
Management buy-in is absolutely necessary for a successful test automation project, and that buy-in has to be non-trivial and durable. Management must understand exactly what the project's goals and ROI are. Because of the level of script maintenance required to achieve those goals, automated testing demands a long-lasting commitment from all business areas.
2. Lack of Commitment to Test Automation Suite Maintenance
That durable commitment is often difficult to maintain in the face of the ongoing maintenance effort that test automation suites require. High-level automated tests, such as sanity checks, tend to survive multiple version releases because they exercise the top-end functionality of the system, which is usually fairly stable. Automated tests that work directly with feature details and parameters, however, must be rewritten (and verified) with the same effort that goes into the source code changes themselves.
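To make the contrast concrete, here is a minimal sketch in Python. The function names, field names, and behavior are invented stand-ins, not any real application's interfaces:

```python
# Illustrative sketch only: get_app_status, get_user_profile_fields, and the
# field names are invented stand-ins for a real application's interfaces.

def get_app_status():
    """Top-end entry point; its behavior rarely changes across releases."""
    return {"status": "ok"}

def get_user_profile_fields():
    """Feature-level detail; the field list shifts as the product evolves."""
    return ["username", "email", "created_at"]

def sanity_check():
    # High-level test: asserts only that the system is up and responding,
    # so it tends to survive release after release untouched.
    return get_app_status()["status"] == "ok"

def detail_check():
    # Detail-level test: pinned to exact field names, so renaming or adding
    # a single field breaks it and forces a rewrite alongside the code change.
    return get_user_profile_fields() == ["username", "email", "created_at"]
```

A release that adds one more profile field would leave `sanity_check` green but break `detail_check`, which is exactly the maintenance cost described above.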
3. Bleeding Resources
One of the most common automation failure scenarios begins well: everyone has a solid understanding of the project goal and the scripting work required. Then, weeks or months into the project, resources are bled away from automation scripting to more 'important' (read: system development) work. As the code changes and the scripts don't, the scripts drift away from the code they are supposed to test.
4. Short Attention Span
A variant of the resource-loss problem is the loss of attention to script maintenance that occurs when development engineering is also charged with script creation and maintenance. Many automated tests are triggered by automated build systems, the idea being that a build must pass the automated tests or go back to development with an appropriate bug list. Faced with a choice between working on their code and working on scripts, engineers feel a near-irresistible temptation to fix the code and simply disable the 'offending' test scripts.
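In practice, that "disable the offending script" move often looks like the following sketch, written with Python's standard `unittest` module. The test names and the broken discount logic are invented for illustration:

```python
import unittest

def apply_discount(total_cents):
    # Hypothetical regression: a refactor quietly broke the 10% discount.
    # Because the guard test below was skipped instead of fixed, this ships.
    return total_cents

class CheckoutTests(unittest.TestCase):
    def test_cart_total(self):
        # Still passing, so the automated build gate stays green.
        self.assertEqual(1999 + 1999, 3998)

    @unittest.skip("failing after pricing refactor -- TODO re-enable")
    def test_discount_applied(self):
        # The 'offending' script: disabling it lets the build pass without
        # anyone updating the test or fixing the underlying code.
        self.assertEqual(apply_discount(10000), 9000)
```

The build system sees a passing suite either way; the skipped test silently stops guarding the feature it was written for.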
5. Test Automation Platform Failures
An issue that is not as prevalent, but just as severe, is the failure of the selected test platform. With the plethora of test automation platforms available, and their sometimes exaggerated capability claims, it can be a challenge to sort through the offerings and find the system that will really do what the project needs. Few single issues dissipate management's enthusiasm for a test automation project faster than discovering a test requirement that the selected system simply cannot fulfill, especially after significant time and effort have already been invested in the platform.
A Successful Test Automation Project
Even with all these pitfalls to avoid, automation projects can be quite successful. One example from QualityLogic's work involved testing an array of nearly a thousand web service login pages. The pages were quite similar, and the back-end code used the data provided on each of them in the same way. Our engineer created a set of test scripts that served as push-button setups for the login process, which allowed a great deal of manual testing to be performed very quickly. This automation worked because its requirements were narrowly scoped, its goals were specific, and the scripting was precisely defined. Its ROI far exceeded its cost, and its maintenance was simple and clear.
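The details of that project aren't spelled out here, but the data-driven shape it describes can be sketched roughly as follows. All URLs, credentials, and the `FakeDriver` stand-in are invented for illustration; a real script would drive a browser through a tool such as Selenium:

```python
# One parameterized login routine driven by a data table, instead of a
# separate hand-written script per page.

LOGIN_PAGES = [
    {"url": "https://example.com/svc1/login", "user": "qa1", "password": "pw1"},
    {"url": "https://example.com/svc2/login", "user": "qa2", "password": "pw2"},
    # ...one entry per page; the table scales to hundreds of pages
    # without any new scripting work.
]

class FakeDriver:
    """Invented stand-in for a real browser driver; illustrative only."""
    def open(self, url):
        self.url = url
    def fill(self, field, value):
        pass
    def submit(self):
        pass
    def logged_in(self):
        return True  # a real driver would inspect the post-login page

def perform_login(driver, page):
    """The push-button setup: one routine reused for every page."""
    driver.open(page["url"])
    driver.fill("username", page["user"])
    driver.fill("password", page["password"])
    driver.submit()
    return driver.logged_in()

def failed_logins(driver):
    # Pages that fail to log in are listed for manual follow-up testing.
    return [p["url"] for p in LOGIN_PAGES if not perform_login(driver, p)]
```

Keeping the scope this narrow (one routine, one data table) is what keeps maintenance simple: a new page means a new table row, not a new script.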
The key to avoiding automation project killers is to recognize that test automation is, in many ways, just like product design. It requires the same kinds of foresight, planning and preparatory research. Then it has to be managed for the long haul because the automated test suite is going to go through exactly the same life cycle as the product it tests.