david southgate: writing for living.

Automated testing saves money if done right

Published on June 10, 2002
on TechRepublic.Com

The words automated testing cause many tech executives to smack their lips in anticipation of reduced overhead, shorter quality assurance (QA) cycles, and lower personnel expenses. You can almost hear their excitement: "Whittle down the project budget and deliver a project early? Give me two or three helpings, please."

But while 20 percent of software development budgets should be dedicated to testing, today's beleaguered QA departments rarely get the resources they need, so the expected return typically never materializes. Though automated testing can save millions of dollars, the payoff won't arrive in a matter of weeks or even months, and it won't shorten your QA cycle or reduce manpower needs, say experts.

The key to success is understanding how automated testing plays into a plethora of project management areas and the many unexpected benefits it can bring if done properly.

The ROI of testing

Automated testing tools require a significant investment in product selection, staff training, implementation, and maintenance. Only after repeated use (as few as three and as many as 10 test runs) will CIOs begin to see an ROI from testing tools. The savings will likely appear in unexpected business areas; after a release, for instance, a company will probably spend less on maintenance.
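The break-even arithmetic behind that three-to-ten-run figure can be sketched in a few lines. This is a hedged illustration: the function and all cost figures are invented for the example, not taken from the article.

```python
# Illustrative sketch: when does an automated suite become cheaper than
# repeating the same regression tests by hand? All figures are assumptions.

def break_even_runs(automation_cost, per_run_maintenance, manual_cost_per_run):
    """Return the number of test runs after which cumulative automation
    cost drops below the cumulative cost of manual testing."""
    runs = 0
    automated_total = automation_cost   # up-front tooling, training, scripting
    manual_total = 0.0
    while automated_total >= manual_total:
        runs += 1
        automated_total += per_run_maintenance
        manual_total += manual_cost_per_run
    return runs

# Example: $40,000 to select, train, and implement; $1,000 of script
# upkeep per run; versus $9,000 per full manual regression pass.
print(break_even_runs(40_000, 1_000, 9_000))  # → 6
```

With plausible numbers like these, the break-even point lands squarely in the three-to-ten-run range the experts cite.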

Close to 80 percent of technology budgets are typically spent on maintaining existing systems, a euphemism for cleaning up post-deployment problems, according to Linda Hayes. Hayes is CTO at WorkSoft, a provider of next-generation testing software, and author of the Automated Testing Handbook (Software Testing Institute, 1995). Automated testing, explained Hayes, can improve the testing coverage of a QA department, resulting in cleaner, better-performing software.

Better-performing software helps deter costly system failures, another place where enterprises can anticipate savings. Take, for instance, the lone beta tester of a car rental company's new Web site. The tester somehow misses a functional defect in the checkout process during testing. When the site goes live, the defect foils corporate customers' ability to complete transactions. The company then receives a deluge of e-mail flames from customers who couldn't reserve cars online, a testing snafu that cost $25 million in lost sales.

How to implement testing the right way

To make automated testing a viable and strategic effort, tech leaders need to spend time examining current tools and determining how to implement a testing process well before pulling a product off the shelf. The following five steps should help you get your automated testing effort started on the right foot:

Step 1: Is test automation the best choice?

Determine if the project is a candidate for test automation. If a company envisions multiple application releases, or if an application requires a handful of tests that must be duplicated thousands of times, automated testing tools should be considered, said A.J. Alhait, president of Betasoft Inc., a Campbell, CA-based provider of automated and manual performance-based testing. "But if you see that your test cases are generally easy, and it will take longer to automate them than manually test them, then manual testing is the way," he explained.
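Alhait's rule of thumb, automate only when the scripting effort is repaid by repetition, can be sketched as a simple comparison. The function name and the example figures here are assumptions for illustration, not anything Alhait specified.

```python
# Hedged sketch of the automate-or-not decision: automation wins only
# when scripting a test costs less than the manual effort it replaces
# over the expected number of runs. Thresholds are illustrative.

def should_automate(expected_runs, hours_to_automate, hours_to_run_manually):
    """True when repetition repays the up-front scripting effort."""
    return hours_to_automate < expected_runs * hours_to_run_manually

# A test repeated across hundreds of builds is a strong candidate:
print(should_automate(expected_runs=200, hours_to_automate=40,
                      hours_to_run_manually=1))   # → True

# An easy test run only twice is better done manually:
print(should_automate(expected_runs=2, hours_to_automate=40,
                      hours_to_run_manually=3))   # → False
```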

Step 2: Budget training time

Allow sufficient time to implement an automated testing solution, including enough time to train the QA staff. A common misstep is jumping into testing without the skills or resources to analyze the results, which can thwart the efficiencies that automated testing offers.

Step 3: Choose a product champion

Make sure you have a product "champion" on the QA team, or the QA department may simply set the tool on the shelf and forget about it, advised Elfriede Dustin, QA director at a software organization that specializes in professional tax software applications. (For more insight on Dustin's advice, check out her Web site or her book, Effective Software Testing: 50 Specific Ways to Improve Your Testing [Addison Wesley, expected to publish in winter 2002].)

Step 4: Bring in an expert

Don't attempt to transform the QA staff into test scriptwriters. Leave that up to an expert, who may be someone on your staff with scriptwriting experience or a consultant you bring in for the project, advised Hayes. You should also consider purchasing a test framework (i.e., a library of commonly used testing scripts).
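A test framework in the sense Hayes describes is a shared library of reusable testing scripts. As a minimal sketch (all names here are invented for the example), a keyword-driven framework lets experts register actions once, so testers compose table-like scripts instead of writing code:

```python
# Minimal keyword-driven framework sketch: an expert registers reusable
# actions once; testers then author scripts as simple (keyword, args) rows.

ACTIONS = {}

def action(name):
    """Register a reusable, expert-written test step under a keyword."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

@action("set")
def set_value(state, key, value):
    # Stand-in for driving the application: record a value in shared state.
    state[key] = value

@action("check")
def check_value(state, key, expected):
    # Verify that the application state matches what the script expects.
    assert state[key] == expected, f"{key}: {state[key]!r} != {expected!r}"

def run_script(script):
    """Execute one data-driven test script and return the final state."""
    state = {}
    for keyword, *args in script:
        ACTIONS[keyword](state, *args)
    return state

# Testers write rows of keywords, not code:
run_script([
    ("set", "cart_total", 120),
    ("check", "cart_total", 120),
])
```

Maintaining the script library (Step 5) then amounts to updating the registered actions when the application changes, while the testers' scripts stay largely intact.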

Step 5: Maintain the script library

Once the automated testing tool is up and running, be sure to budget time and money to maintain the script library. Every time a new version of software needs to roll through QA, the script library should be at the ready.

Automated tools just part of the solution

Because of the amount of revenue now tied to the core transactional infrastructure at many companies, reducing system failure is no longer optional; it's a necessity. If implemented correctly, automated testing, coupled with an appropriate number of testers and a sufficient timeline, can broaden QA's coverage, thus reducing product maintenance and avoiding costly software failures.

"Automated testing tools are merely a part of the solution; they aren't a magic answer to the testing problem," said Dustin. "It has to be seen as an enhancement to the manual testing effort, not as a replacement."
