A botched implementation of an automated testing tool can cause an entire company to sour on the effectiveness of the quality assurance (QA) process. Before you start touting how much time and money your automated testing solution will save the company, think carefully about the impact of automated testing on the system development life cycle. Answer these essential questions to prevent a QA bottleneck:
- What tests should be automated?
- When should automated load testing be conducted?
- Do you have adequate time and resources dedicated to automation?
Before you sign off on the purchase order for that automated testing tool you've had your eye on, make sure that you have senior management's complete support. Start by creating a test proposal document that clearly outlines the time frame and manages expectations.
Small victories go a long way
Unless you have the in-house expertise to write the test scripts, plan to begin with easier pieces of functionality. Many of the large vendors or their partners offer mentoring that can help you tailor scripts to your needs. Regardless, get some victories under your belt as soon as possible. Showing positive results early can prevent your testing budget from being axed after the next round of bean counting.
How do you get those early victories? According to Tom Doyle, QA analyst for Ceridian, which produces time and attendance software, "Start with the easier items and work your way up to the more difficult-to-test items. An automation testing team must show successes as early in the process as possible to the higher-ups, or the project could be doomed from the very beginning." Doyle’s team began with some less-database-intensive portions of the application. “The scripts were relatively simple, but when we completed the tests successfully, our confidence level increased. It was a great catalyst for more complex tests,” he said.
What to automate
Determining which functional tests should be automated is the next step. Automating repetitive tasks and portability tests leverages the speed and accuracy of a testing tool. If you have manual test plans, use them as guides to create your test scripts. "Our automated tests were based off of our existing test plans, but then they were adapted to utilize various features of the automated tool," said Shari Heath, quality assurance manager at TechRepublic. "For example, we were able to create data-driven tests to go through the same series of tests using multiple data sets. We never had the time to manually test with a lot of different data sets." Using an automated testing tool, Heath’s team wrote a script that parameterized input values for cookie-cutter online order forms, allowing the required fields to be tested with several sets of data.
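The data-driven approach Heath describes can be sketched in a few lines. This is a minimal illustration, not her team's actual script: `validate_order` is a hypothetical stand-in for the order form under test, and the test cases are invented examples. The key idea is that test data lives apart from test logic, so covering a new data set means adding a row, not writing a new script.

```python
def validate_order(fields):
    """Hypothetical validation logic for a simple online order form."""
    errors = []
    if not fields.get("name"):
        errors.append("name is required")
    if "@" not in fields.get("email", ""):
        errors.append("email is invalid")
    if not str(fields.get("quantity", "")).isdigit():
        errors.append("quantity must be a whole number")
    return errors

# Each row pairs one set of input values with the expected result.
TEST_CASES = [
    ({"name": "Ada", "email": "ada@example.com", "quantity": "2"}, []),
    ({"name": "", "email": "ada@example.com", "quantity": "2"},
     ["name is required"]),
    ({"name": "Ada", "email": "not-an-email", "quantity": "x"},
     ["email is invalid", "quantity must be a whole number"]),
]

def run_data_driven_tests():
    """Run the same test logic against every data set."""
    return [validate_order(fields) == expected
            for fields, expected in TEST_CASES]
```

Commercial testing tools provide the same pattern with recording, reporting, and data-source integration built in, but the structure is the same: one parameterized script driven by many data sets.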
When asked for examples of where automated testing proved most valuable, Heath recommended tasks such as cross-browser testing and portions of the application that use multiple data values for the same fields. Automation really shines with regression and performance testing because you can rapidly test application changes. Tom Mochal, director of internal development at a software company in Atlanta and a CNET contributor, suggests that these test types are conducive to automation:
- High-volume stress and system availability
- Production environment simulation
Mochal added that usability, user acceptance testing, and initial unit testing require heavy human intervention and are not good candidates for automation.
Automated testing also doesn’t eliminate the need for manual functional testing for essential pieces of the application. “A computer just knows if it passed, failed, or how fast it could go—sometimes that’s not enough. A person can look at the feature and apply product or service knowledge that catches difficult bugs or illustrates an unusual user case,” said Kent Langley, operations engineering manager at TechRepublic.
When to conduct automated load testing
Clearly, any major code alterations or system adjustments to the production environment demand serious load testing. However, it’s not just code changes that require a stress test. In the case of Web sites, traffic from large-scale marketing promotions can bring even a carefully planned system architecture to its knees. Automated load testing provides the hard numbers needed for capacity planning.
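A load test boils down to firing many concurrent requests and reporting latency under that load. The sketch below shows the shape of such a test using Python's standard thread pool; it is an assumption-laden illustration, not any vendor's tool. `handle_request` is a stand-in that sleeps to simulate server work, where a real load test would issue an HTTP call against a staging server.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(i):
    """Stand-in for one request; a real test would hit the server here."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulate ~10 ms of server work
    return time.perf_counter() - start

def run_load_test(concurrent_users=20, requests_per_user=5):
    """Drive concurrent traffic and return the numbers capacity
    planning needs: request count, average and 95th-percentile latency."""
    total = concurrent_users * requests_per_user
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        latencies = sorted(pool.map(handle_request, range(total)))
    return {
        "requests": len(latencies),
        "avg_ms": 1000 * sum(latencies) / len(latencies),
        "p95_ms": 1000 * latencies[int(len(latencies) * 0.95)],
    }
```

Ramping `concurrent_users` upward until the 95th-percentile latency breaches an acceptable threshold gives a concrete capacity ceiling, which is exactly the kind of hard number a marketing-driven traffic spike will test for you whether you measured it or not.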
Sources contacted for this story agreed that an application should not leave beta until it has been load tested.
The time investment in writing load test scripts is minimal compared with the frustration and impracticality of conducting a “human-driven” load test, Heath said. “Automated load tests allow you to introduce a load on your servers you can't realistically create by having all of your developers and QA engineers sitting at their desks waiting for someone to say ‘Go!’”
The quality assurance and development teams must commit resources to automated testing, especially training and maintenance of the script library. "Your tests are only as good as your automated scripts, which are only as good as your person who wrote the scripts. So training is a huge issue,” Langley said. "The person who writes the scripts has to understand not only the testing process but the software they are tasked with using and how it behaves in various types of networked environments."
A fast-paced release cycle makes this kind of knowledge transfer even more crucial because script libraries, test data, and documentation must constantly be updated. Langley warns that organizations that neglect to nurture automated scripting expertise will end up reinventing the wheel, inevitably writing new scripts from scratch for every test cycle.
However, even in a rapid-build environment, automated testing can still be indispensable, Mochal added. “The biggest challenge around using testing tools is setting up the test cases to begin with. Once the initial tests are in place, the tools support a fast release cycle very well,” Mochal said. “Again, you can test much faster using the tools than you could on your own.”