
Lessons in Test Automation: A Manager's Guide to Avoiding Pitfalls When Automating Testing

Using examples from many years of helping businesses automate their testing processes, automated testing expert Elfriede Dustin provides some useful lessons to help keep you from repeating other people's mistakes.
This article is excerpted from an article by Elfriede Dustin, "Lessons in Test Automation," published in Software Testing & Quality Engineering, September/October 1999. Elfriede is also the author of the book Quality Web Systems, scheduled for publication in August 2001.

Lessons Learned

I have worked on many projects at various companies where automated testing tools were introduced to a test program lifecycle for the first time. In reviewing these projects, I have accumulated a list of "automated testing lessons learned," taken from actual experiences and test engineer feedback. In this article, I'll share examples of this feedback compiled from real projects, hoping that this information will help you to avoid some typical false starts and roadblocks.

The various tools used throughout the development lifecycle did not easily integrate.

A different tool was used for each phase of the lifecycle: a business modeling tool during the business analysis phase, a requirements management tool during the requirements phase, a design tool during the design phase, a test management tool during the testing phase, and so on. For metrics purposes—and to enforce consistency among the elements and traceability between phases—the goal was to have the output of one tool feed into the next tool used in the lifecycle. But because each of the tools used for the various phases was from a different vendor, they didn't easily integrate. Trying to overcome those challenges and integrate the tools on this project was a complex effort. Much time was spent moving information from one tool set to another using elaborate programming techniques, which resulted in extra work. The code written to make those tools integrate was not reusable later, because of new upgrades to the various tools.

One possible corrective action would be for each project team to conduct a feasibility study to assess the need for tools that are already integrated into a unified suite. A cost/benefit analysis should be conducted to determine whether the potential benefits of buying an integrated suite of tools would outweigh the costs.

Duplicate information was kept in multiple repositories.

One of our project teams purchased a test management tool in addition to the existing requirements management and automated testing tools. Duplicate information was kept in multiple repositories and was very difficult to maintain. In several instances, the implementation of more tools actually resulted in less productivity. I've found that a requirements management tool can be used as a test management tool. There's no need to maintain test information in both tool databases. Maintenance can be improved by simply keeping most of the test progress and test status information in one tool.

The automated testing tool drove the testing effort.

Often, when a new tool is used for the first time on a testing program, more time is spent on automating test scripts than on actual testing. Test engineers may be eager to automate elaborate scripts, but may lose sight of the real goal, which is to test the application.

Keep in mind that automating test scripts is part of the testing effort, but doesn't replace the testing effort. Not everything can or should be automated. It's important to evaluate which tests lend themselves to automation. For example, only automate tests that are run many times, such as "smoke" (build verification) tests, regression tests, and mundane tests (tests that include many simple and repetitive steps). Also, automate tests that would be impossible (or prohibitively expensive) to perform manually, such as simulating 1,000 concurrent user accesses.
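To make the last point concrete, here is a minimal sketch of simulating many concurrent user accesses, something no manual team could do repeatably. It is written in Python against a hypothetical login URL; the endpoint, user count, and success criterion are illustrative assumptions, not details from any of the projects described here.

```python
# Minimal sketch: simulate many concurrent user accesses.
# The URL and the notion of "success" are hypothetical placeholders.
import concurrent.futures
import urllib.request

USERS = 1000
URL = "http://testserver.example.com/login"  # hypothetical system under test

def one_user_session(user_id: int) -> bool:
    """Simulate a single user's access; return True if the request succeeds."""
    try:
        with urllib.request.urlopen(URL, timeout=30) as response:
            return response.status == 200
    except Exception:
        return False

def main() -> None:
    with concurrent.futures.ThreadPoolExecutor(max_workers=100) as pool:
        results = list(pool.map(one_user_session, range(USERS)))
    failures = results.count(False)
    print(f"{USERS} simulated users, {failures} failed accesses")

if __name__ == "__main__":
    main()
```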

Everyone on the testing staff was busy trying to automate scripts.

On some projects we found that the division of labor—breaking up responsibilities so that all required testing activities are accomplished—had not been adequately defined. As a result, the entire team focused on the development of automated testing scripts.

It's important to clearly define this division of duties. It's not necessary for the entire testing team to spend its time automating scripts; only those test engineers who have a development background should spend their time automating scripts. The expertise of manual test engineers is still needed for all other aspects of the test effort. Again, let me stress that it's not feasible to automate everything.

Elaborate test scripts were developed, duplicating the development effort.

I have witnessed test script development that resulted in an almost complete duplication of the development effort, through overuse of the testing tool's programming language. In one of our projects, the application itself used a complex algorithm to calculate various interest rates and mortgage rates. The tester re-created these algorithms using the testing tool. Too much time was spent on automating scripts, without much additional value gained. One cumbersome script was developed using the tool's programming language, when the same script could have been created in a fraction of the time by using the tool's capture/playback feature and simply modifying the generated script. The test team must be careful not to duplicate the development effort; this is a risk when developing elaborate test scripts. For each automated testing program, it's important to conduct an automation analysis and determine the best approach to automation by estimating where automation will yield the highest return.
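One way to get the verification without re-creating the algorithm is a data-driven check: feed the application a small set of inputs and compare its output against expected values that were verified independently (for example, by the business analyst). The sketch below assumes a hypothetical calculate_monthly_payment hook into the system under test; the hook and the sample figures are illustrative, not taken from the project above.

```python
# Sketch of a data-driven check: expected values come from an independently
# verified table, so the test does NOT re-implement the application's
# rate algorithm. The figures and the calculate_monthly_payment hook are
# illustrative assumptions only.
from typing import Callable

# (principal, annual_rate_percent, term_years) -> expected monthly payment
EXPECTED_PAYMENTS = {
    (100_000, 6.5, 30): 632.07,
    (250_000, 7.0, 15): 2247.07,
}

def check_payments(calculate_monthly_payment: Callable[[float, float, int], float]) -> None:
    """Drive the application's own calculation and compare against the table."""
    for (principal, rate, years), expected in EXPECTED_PAYMENTS.items():
        actual = calculate_monthly_payment(principal, rate, years)
        assert abs(actual - expected) < 0.01, (
            f"{principal} at {rate}% over {years} years: "
            f"expected {expected}, got {actual}"
        )
```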

Automated test script creation was cumbersome.

All teams involved need to understand that test script automation doesn't happen automatically, no matter what the vendor claims. On one project, test engineers with manual test backgrounds were involved in creating the automated scripts. Basing their assumptions on the vendor claims of the tool's ease of use, the test engineers complained that the creation of automated scripts took longer than expected, and that too many workaround solutions had to be found.

It's important to understand that the tools are never as easy to use as the tool vendor claims. It's also beneficial to include one person on the testing staff who has programming knowledge and appropriate tool training, so that he or she can mentor the rest of the testing staff responsible for automation.

Training was too late in the process, so test engineers lacked tool knowledge.

Sometimes tool training is initiated too late in the project for it to be useful for the test engineers using the tool. On one project, this resulted in tools not being used correctly. Often, for example, only the capture/playback portion of the testing tool was used, and scripts had to be repeatedly re-created, causing much frustration.

When introducing an automated tool to a new project, it's important that tool training be incorporated early in the schedule as one of the important milestones. Since testing needs to be involved throughout the system development lifecycle, tool training should happen early enough in the cycle for it to be useful and to ensure that tool issues can be brought up and resolved early. This involvement allows for testability and automation capabilities to be built into the system under test.

Mentors are also very important when first introducing tools to the testing program. Mentors must be very knowledgeable and should advise, but shouldn't do the actual work.

The test tool was introduced to the testing program with two weeks left for system testing.

I recall one project in which system testing was behind schedule, and management introduced a new testing tool in the hopes of speeding up the testing effort. Since we had a test automation expert on the team, we were able to leverage the use of the testing tool for such efforts as creating a smoke test script. The smoke test script automated the major functionality of the system, and before a new system test build was accepted, the smoke test script was played back to verify that previously working functionality had not been affected by new fixes.

We had taken a risk by introducing the tool so late in the process, but in this case we came out ahead: The script saved some time. If no test automation expert had been on the test team, I would have suggested that the test team not accept the use of an automated testing tool this late in the lifecycle. The tool's learning curve would not have allowed us to gain any benefits from incorporating it this late in the testing lifecycle.
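For readers who have not built one, the outline of such a build-verification gate is simple: run one quick check per piece of major functionality and reject the build if any check fails. The sketch below is a minimal illustration in Python; the individual checks are hypothetical placeholders for whatever the system under test considers its major functions.

```python
# Sketch of a smoke (build-verification) gate: each check exercises one piece
# of major functionality; the build is rejected if any check fails.
# The check functions are hypothetical placeholders for the system under test.
import sys

def check_login() -> bool:          # placeholder: can a user log in?
    return True

def check_create_record() -> bool:  # placeholder: can a record be created?
    return True

def check_run_report() -> bool:     # placeholder: does a standard report run?
    return True

SMOKE_CHECKS = [check_login, check_create_record, check_run_report]

def main() -> int:
    failures = [check.__name__ for check in SMOKE_CHECKS if not check()]
    if failures:
        print("REJECT BUILD - smoke test failures:", ", ".join(failures))
        return 1
    print("ACCEPT BUILD - all smoke checks passed")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```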

Testers resisted the tool.

The best automation tool in the world won't help your test efforts if your team resists using it. In one case, the tool remained in the box—hardly any effort was invested in incorporating it into the process. The test engineers felt that their manual process worked fine, and they didn't want to bother with the additional setup work involved in introducing this tool.

When first introducing a new tool to the testing program, mentors are very important, but you also need tool champions—advocates of the tool. These are team members who have experience with the tool and have seen firsthand how it can be implemented successfully.

There were expectations of early payback.

Often when a new tool is introduced to a project, the expectations for the return on investment are very high. Project members anticipate that the tool will immediately narrow the testing scope, reducing cost and shortening the schedule. In reality, chances are that the tool will initially increase the testing scope.

It's very important to manage expectations. An automated testing tool doesn't replace manual testing, nor does it replace the test engineer. Initially, the test effort will increase, but when automation is done correctly it will decrease on subsequent releases.

The tool had problems recognizing third-party controls (widgets).

Another aspect of managing expectations is understanding the tool's capabilities. Is it compatible with the system under test? On some projects, the test engineers were surprised to find out that a specific tool could not be used for some parts of the application. During the tool evaluation period, it's important to verify that third-party controls (widgets) used in the system under test are compatible with the automated testing tool's capabilities.

If a testing tool is already in-house but the system architecture hasn't been developed yet, the test engineer can give the developers a list of compatible third-party controls that are supported by the test tool vendor. If an incompatible third-party control is proposed, the test engineer should require a justification and explain the consequences.

A lack of test development guidelines was noted.

One program had several test engineers, each using a different style for creating test scripts. Maintaining the scripts was a nightmare. Script readability and maintainability are greatly increased when test engineers can rely on development guidelines.
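As a small illustration of what such guidelines might cover, the skeleton below shows one possible convention: a descriptive script name, a header that traces the script to a requirement, and shared setup and teardown. The conventions and the requirement ID are examples assumed for illustration, not the guidelines from any particular project.

```python
# Sketch of a guideline-conforming test script skeleton.
# Example conventions (illustrative only):
#   - script name:  test_<feature>_<condition>.py
#   - shared setup/teardown used by every script
#   - a header comment stating purpose and the traced requirement
#
# Purpose    : Verify that an expired password forces a reset on login.
# Requirement: SEC-042 (hypothetical requirement ID)
import unittest

class TestLoginExpiredPassword(unittest.TestCase):
    def setUp(self) -> None:
        # Shared setup: start from a known application state.
        self.app = object()  # placeholder for launching/connecting to the app

    def tearDown(self) -> None:
        # Shared teardown: return the environment to its original state.
        self.app = None

    def test_expired_password_forces_reset(self) -> None:
        # Placeholder assertion; the real script would drive the UI or API.
        self.assertIsNotNone(self.app)

if __name__ == "__main__":
    unittest.main()
```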

The tool was intrusive, but the development staff wasn't informed of this problem until late in the testing lifecycle.

Some testing tools are intrusive—for the automated tool to work correctly, actual code has to be inserted into the code developed for the system under test. In this case, the development staff wasn't informed that the automated tool was intrusive; when they finally found out, they were very reluctant to incorporate the necessary changes into their code. Because of uncertainty about the tool's intrusiveness, the first time that the system under test didn't function as expected, the intrusive tool was immediately blamed (even though there was no evidence that it had been the culprit).

To prevent this from happening, the test engineers need to involve the development staff when selecting an automated tool. Developers need to know well in advance that the tool requires code additions (if applicable—not all tools are intrusive). You can reassure developers that the tool will not cause any problems by offering them feedback from other companies that have experience using the tool, and by showing documented vendor claims to that effect.

Reports produced by the tool were useless.

The test engineering staff on one project spent much time setting up elaborate customized reports using Crystal Report Writer, which was part of the automated testing tool. The reports were never used, since the data required for the report was never accumulated in the tool.

Before creating any elaborate reports, verify that the specific type of data is actually collected. Set up only those reports that are specific to the data that will be generated. Produce the reports requested by management or customers, plus those required internally by the test program for measuring test progress and test status.

Tools were selected and purchased before a system engineering environment was defined.

Some teams are eager to bring in automated tools. But there's such a thing as too much eagerness: Tools that are evaluated and purchased without having a system architecture in place can cause problems. When the decision for the architecture is being made, many compatibility issues can surface between the tools already purchased and the suggested architecture. I remember projects in which workaround solutions had to be found while trying to match the system engineering environment to the requirements of the tools already purchased. A lot of vendor inquiries had to be made to determine whether the next release of the tools might be compatible with the middleware layer that the project team wanted to choose.

To avoid these problems, it's important that a system architecture be defined—and test tools selected—with all requirements (tool and architecture) in mind.

Various tool versions were in use.

It's possible for everyone to be using the same tool, and yet still not be able to talk to each other. On one project that had numerous tool licenses, we had various tool versions in use. That meant that scripts created in one version of the tool were not compatible with another version, causing significant compatibility problems and requiring many workaround solutions.

One way to prevent this from happening is to ensure that tool upgrades are centralized and managed by a configuration management department.
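A lightweight safeguard, assuming the tool's version can be queried from a script (most tools expose this in some form), is to pin the approved version in one central place and have every script refuse to run against anything else. The version-query call in the sketch below is hypothetical.

```python
# Sketch: refuse to run test scripts against an unapproved tool version.
# get_tool_version() is a hypothetical call; real tools expose this differently.
import sys

APPROVED_TOOL_VERSION = "7.5.1"  # pinned centrally by configuration management

def get_tool_version() -> str:
    """Hypothetical query of the installed test tool's version string."""
    return "7.5.1"

def require_approved_version() -> None:
    installed = get_tool_version()
    if installed != APPROVED_TOOL_VERSION:
        sys.exit(
            f"Tool version {installed} is not the approved {APPROVED_TOOL_VERSION}; "
            "contact configuration management before running or editing scripts."
        )

if __name__ == "__main__":
    require_approved_version()
    print("Approved tool version detected; scripts may run.")
```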

The new tool upgrade wasn't compatible with the existing system engineering environment.

Keep on top of product "improvements." On one project, a new tool version switched the tool's automatic defect notification from the Microsoft Mail system to the Microsoft Exchange system. The vendor had omitted mentioning this detail of the upgrade. The project had invested a lot of money in a variety of test tool licenses and was caught by surprise by the change. The project's company actually had plans to move to the Lotus Notes mailing system, but was still using Microsoft Mail; using the Microsoft Exchange system, unfortunately, was not part of its plans at all. An elaborate workaround solution had to be found to make the tool compatible with Lotus Notes.

Whenever a new tool upgrade is received, verify any major changes with the vendor. It's often to your benefit to become a beta testing site for new tool upgrades; this is a good way to get your project's issues and needs addressed in the upgrade.

It's also advisable to test a new tool upgrade in an isolated environment to verify its compatibility with the currently existing system engineering environment, before rolling out any new tool upgrade on a project.

The tool's database didn't allow for scalability.

One project implemented a tool that used Access as its database. Once the test bed grew (test scripts, test requirements, and so on) and multiple test engineers tried to access the database, multi-user access problems surfaced. The database became corrupted several times and backups had to be restored.

To avoid this problem, pick a tool that allows for scalability using a robust database. Additionally, be sure to back up your testing tool database daily, if not twice a day.
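A minimal backup sketch, assuming the tool keeps its repository in a single file the way an Access .mdb database does; the paths are placeholders, and the copy should be scheduled (for example, with the operating system's task scheduler) to run at least daily.

```python
# Sketch: timestamped copy of a file-based test-tool repository (e.g., an
# Access .mdb file). Paths are placeholders; schedule this to run at least
# daily, ideally twice a day.
import shutil
from datetime import datetime
from pathlib import Path

REPOSITORY = Path(r"C:\TestTool\repository.mdb")   # hypothetical location
BACKUP_DIR = Path(r"D:\Backups\TestTool")          # hypothetical location

def back_up_repository() -> Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    target = BACKUP_DIR / f"repository_{stamp}.mdb"
    shutil.copy2(REPOSITORY, target)  # copy while no one holds a write lock
    return target

if __name__ == "__main__":
    print(f"Backed up repository to {back_up_repository()}")
```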

Incorrect use of a test tool's management functionality resulted in wasted time.

In one specific test tool that allows for test management, test requirements can be entered into the tool in a hierarchical fashion. On one project, a person spent two weeks entering test requirements into this hierarchy, ending up with 1,600 test requirements. When the test team was then ready to automate these test requirements, they realized that there was not enough time or resources to automate each of the 1,600 requirements. Much of the time spent entering these requirements was wasted, since only a fraction of them would eventually be automated. Planning and laying out milestones (what, when, and how something will be accomplished) still applies to automated testing.
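A rough sketch of the up-front analysis that avoids this kind of waste: score candidate test requirements by how often they will be run and how risky the area is, and plan automation only for as many as the schedule can absorb. The scoring scheme and sample figures below are assumptions for illustration only.

```python
# Sketch: rank candidate test requirements for automation before entering them
# all into the tool. Scoring weights and sample data are illustrative only.
from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str
    runs_per_release: int   # how often the test would be executed
    risk: int               # 1 (low) .. 5 (high), judged by the team

def automation_score(req: Requirement) -> int:
    # Favor tests that run repeatedly in high-risk areas.
    return req.runs_per_release * req.risk

def pick_candidates(reqs: list[Requirement], budget: int) -> list[Requirement]:
    """Return the top `budget` requirements worth automating."""
    return sorted(reqs, key=automation_score, reverse=True)[:budget]

if __name__ == "__main__":
    sample = [
        Requirement("REQ-001", runs_per_release=20, risk=4),  # regression-heavy
        Requirement("REQ-002", runs_per_release=1,  risk=2),  # run once, low risk
        Requirement("REQ-003", runs_per_release=10, risk=5),
    ]
    for r in pick_candidates(sample, budget=2):
        print(r.req_id, automation_score(r))
```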
