Experiences of Test Automation: Test Automation Anecdotes
- 29.1 Three Grains of Rice
- 29.2 Understanding Has to Grow
- 29.3 First Day Automated Testing
- 29.4 Attempting to Get Automation Started
- 29.5 Struggling with (against) Management
- 29.6 Exploratory Test Automation: Database Record Locking
- 29.7 Lessons Learned from Test Automation in an Embedded Hardware-Software Computer Environment
- 29.8 The Contagious Clock
- 29.9 Flexibility of the Automation System
- 29.10 A Tale of Too Many Tools (and Not Enough Cross-Department Support)
- 29.11 A Success with a Surprising End
- 29.12 Cooperation Can Overcome Resource Limitations
- 29.13 An Automation Process for Large-Scale Success
- 29.14 Test Automation Isn't Always What It Seems
An anecdote is a short account of an incident (especially a biographical one). Numerous people told us short stories (anecdotes) of their experiences, and because they merit retelling but don’t constitute full chapters, we collected them in this chapter. The stories vary in length from half a page to five pages. They are all independent, so they can be read in any order.
29.1 Three Grains of Rice
Randy Rice, United States
Consultant, speaker, and author
As a consultant, I see a number of different situations. The following describes three short experiences I have had with a couple of clients.
29.1.1 Testware Reviews
I was a consultant once on a project where we were trying to bring best practices in test automation into a large organization that had only tinkered with test automation. The company's environment spanned web-based, client/server, and mainframe applications. About 15 test designers and 15 test automators were brought in to work on this effort. The test tools in use when we first arrived were versions so old that the vendor no longer supported them. Only a small portion of the applications was automated to any degree, and the automation that was in place consisted of large test scripts that were very difficult to maintain.
The project was initiated as one of several aggressive projects to technically reengineer the entire IT operation. The chief information officer (CIO) who was the original champion of these projects was a believer in test automation. Her successor inherited the projects but did not share the same commitment and enthusiasm for many of them. There was also a 6-month vacancy while the new CIO was being recruited, so things had just coasted along. When the new sheriff came to town, people started trying to figure out who would survive.
Supervising this effort were three senior test automation consultants who really knew their stuff and had a very specific approach to be followed. We had six test automation gurus on the managing consultant side, and we had regular communication based on metrics and goals. In fact, we developed a very nice dashboard that integrated directly with the tools. At any time on any project, people could see the progress being made. We gave demonstrations of how the automation was being created (this went over management’s heads) and also the results of automation, so we had plenty of knowledge and communication.
To their credit, the contracting company trained all the incoming test design and automation consultants out of their own pocket. Although these were experienced consultants, the contractor wanted to set a level baseline of knowledge for how the work would be done on this project.
After about 3 weeks, it became apparent that some of the test automators were going their own way and deviating from the defined approach. This was a big problem because a keyword approach was being used, and certain keywords had to work consistently among applications. There were too many people who wanted to do things their way instead of the way that had been designed.
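The keyword approach at the center of this conflict is worth a brief illustration. Test designers write tables of keyword rows, and automators supply one shared implementation per keyword; if each automator implements a keyword differently for each application, the shared test tables stop being portable. The following is a minimal sketch in Python; the keyword names, registry, and interpreter are my own illustration, not the project's actual framework:

```python
# Minimal keyword-driven interpreter (illustrative only, not the project's framework).
# Test designers write rows of (keyword, *args); automators register one shared
# implementation per keyword. Every application under test must honor the same
# keyword contract, or the shared test tables break.

KEYWORDS = {}

def keyword(name):
    """Register a function as the single implementation of a keyword."""
    def register(func):
        KEYWORDS[name] = func
        return func
    return register

@keyword("open_app")
def open_app(app_name):
    print(f"Launching {app_name}")

@keyword("enter_text")
def enter_text(field, value):
    print(f"Typing {value!r} into {field}")

@keyword("verify_title")
def verify_title(expected):
    actual = expected  # a real handler would query the application UI here
    assert actual == expected, f"Expected {expected!r}, got {actual!r}"

def run_test(rows):
    """Execute a designer-written test table, one keyword per row."""
    for kw, *args in rows:
        KEYWORDS[kw](*args)

run_test([
    ("open_app", "Accounts"),
    ("enter_text", "customer_id", "12345"),
    ("verify_title", "Customer Detail"),
])
```

The daily reviews enforced exactly this kind of contract: one keyword, one agreed behavior, across all applications.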
To correct the issue, the senior consultants required all test designers and consultants to attend daily technical reviews of testware. Technical reviews are not just for application software code or requirements. To get 30 people (more or less) from diverse backgrounds on the same approach is not a trivial achievement! Before long, this became a peer review type of effort, with critiques coming from peers instead of the senior consultants. It had turned into a forum for collaboration and learning.
Some of the test consultants resisted the technical reviews and didn’t last on the project. They were the same test automators who refused to follow the designed approach.
After a few weeks, it was no longer necessary to maintain the frequent reviews, and the test automation effort went a lot more smoothly.
Unfortunately, test management and senior technical management (at the CIO level) in this organization never saw the value of test automation. Therefore, much of the fine work done by this team was scrapped when senior management pulled all support for this effort. They terminated the contracts of everyone who knew anything about the automation and ended up “achieving” a negative return on investment (ROI)—millions of dollars were spent with very little to show for it. I see little future for automation at this company now, in spite of the great work that was done.
This was a huge and very visible project, but the test manager, like many test managers, had been thrust into the role with no training in testing. The client staff were thin in numbers, skills, and motivation.
My bottom line assessment is that the organization simply was not ready for such an aggressive project. Then, when the sponsoring CIO left, there was no one to champion the project. Also, the software wasn’t engineered in a way that was easily automated; it was old and very fragile. The expectations for ROI were very high and it would have been better to take smaller steps first.
29.1.2 Missing Maintenance
There was a move in the late 1990s to go from fractional stock prices to decimal prices. For decades, stock prices had been shown as “$10 1/2” instead of “$10.50.” There were many benefits to the decimal representation, such as ease of computation, standardization worldwide, and so forth. This was a major conversion effort that was almost as significant for the company as the Y2K maintenance effort.
Because the conversion effort was so massive and time was so short, management decided not to update the test automation during the project. This decision later proved to be significant.
By the time the decimalization project was complete, work was well underway for the Y2K conversion effort. We wanted to update the test automation for both efforts, decimalization and Y2K, at the same time. However, the schedule won again, and by the time the Y2K effort was complete, the test automation was deemed so out of date that it would be easier to start over with a new, more modern tool. This was indeed the case. One of the problems was the platform, the DEC VAX, for which only one test tool was on the market. An emulator-based PC tool could have been used, but that would have raised issues with character-based testing.
At the time, keyword-driven and even data-driven approaches were not widely known, and the automators and test managers discovered for themselves how difficult it is to maintain automation code full of hardcoded values. The initial decision not to keep up with maintenance of the automated testware proved to be the death of the entire test automation effort for that application. This was a highly complex financial application, and the original test automation had taken about 3 years to create. New projects were being developed on client/server platforms, so starting again from square one might have been a good idea, but the company hadn't yet realized the ROI from the first effort. Basically, the manual test approach just seemed too compelling.
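The maintenance trap is easy to show in miniature. With expected values hardcoded into scripts, a format change such as decimalization means editing every script that mentions a price; a data-driven script reads the same values from an external table, so only the data file changes. Here is a minimal sketch, assuming an invented format_price function and a two-column CSV layout, neither of which comes from the actual application:

```python
import csv
from fractions import Fraction

def format_price(quote: str) -> str:
    """Convert a fractional quote like '10 1/2' to a decimal string '$10.50'."""
    whole, _, frac = quote.partition(" ")
    value = int(whole) + (Fraction(frac) if frac else 0)
    return f"${float(value):.2f}"

# Hardcoded style (the trap): expected values are welded into the script,
# so a format change like decimalization means editing every such script.
def test_quote_hardcoded():
    assert format_price("10 1/2") == "$10.50"
    assert format_price("7 1/4") == "$7.25"

# Data-driven style: the script reads input/expected pairs from an external
# table, so only the data file changes when the price format does.
def test_quotes_data_driven(path="prices.csv"):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # columns: input,expected
            assert format_price(row["input"]) == row["expected"]

if __name__ == "__main__":
    # Create a sample data file so the sketch is self-contained.
    with open("prices.csv", "w", newline="") as f:
        f.write("input,expected\n10 1/2,$10.50\n7 1/4,$7.25\n")
    test_quote_hardcoded()
    test_quotes_data_driven()
```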
29.1.3 A Wildly Successful Proof-of-Concept
I was hired by a large Wall Street company to assess the quality of its software testing process and make recommendations regarding a workable test automation approach. This company was not new to the idea of test automation. In fact, it already had three major test automation tools in place and was looking for another test automation solution. There was no integration between the various test automation tools, and they were applied in functional silos.
One particular system at this company was being manually regression tested every day! One very unfortunate lady performed the same tests for about 8 hours each day.
As we were considering which tools might be the best fit for this system, I suggested that we contact the various candidate vendors and see if any were willing to send a technical consultant to perform a proof-of-concept using the vendor's tool on my client's system.
My client thought this was an excellent idea, so we contacted the vendors and found one that was willing to send in a consultant at a reduced daily rate. We felt it was worth the risk to pay for the proof-of-concept. It would have taken us weeks to try to get an unfamiliar tool working, and we didn’t want to pay for a tool without knowing it would work.
The 8-hour daily manual regression test seemed to me a good test project for the proof-of-concept, so we asked the vendor's test automation consultant to tackle that application.
After 3 days, the regression tests were completely automated! We had been hoping just to confirm that the tool would work in the environment. What we got instead was our first success! We probably broke even on ROI after 1 month.
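That break-even estimate is plausible on the back of an envelope. Every figure below is my own assumption for illustration (tester and consultant rates, tool license cost, working days per month), not the client's actual numbers:

```python
# Back-of-envelope ROI check; all figures are assumed for illustration.
hours_saved_per_day = 8 - 0.25        # 8h manual run replaced by a ~15 min automated one
working_days_per_month = 21
tester_hourly_cost = 75               # assumed fully loaded rate

monthly_saving = hours_saved_per_day * working_days_per_month * tester_hourly_cost
one_time_cost = 3 * 8 * 150 + 5_000   # 3 consultant-days at an assumed rate, plus an assumed license

print(f"Monthly saving:       ${monthly_saving:,.0f}")                # ~$12,206
print(f"One-time cost:        ${one_time_cost:,.0f}")                 # $8,600
print(f"Months to break even: {one_time_cost / monthly_saving:.1f}")  # ~0.7
```

With these invented figures, break-even lands in under a month, consistent with the rough estimate above.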
My client was thrilled, I looked good for suggesting the idea, and the vendor made a big sale. However, the person happiest with the outcome was the lady who had previously performed the manual regression tests for 8 hours a day. Now she started an automated test, and 15 minutes later it was done. Her time was freed up to design better and more creative tests.