- 29.1 Three Grains of Rice
- 29.2 Understanding Has to Grow
- 29.3 First Day Automated Testing
- 29.4 Attempting to Get Automation Started
- 29.5 Struggling with (against) Management
- 29.6 Exploratory Test Automation: Database Record Locking
- 29.7 Lessons Learned from Test Automation in an Embedded Hardware-Software Computer Environment
- 29.8 The Contagious Clock
- 29.9 Flexibility of the Automation System
- 29.10 A Tale of Too Many Tools (and Not Enough Cross-Department Support)
- 29.11 A Success with a Surprising End
- 29.12 Cooperation Can Overcome Resource Limitations
- 29.13 An Automation Process for Large-Scale Success
- 29.14 Test Automation Isn't Always What It Seems
29.5 Struggling with (against) Management
Kai Sann, Austria
Engineer and test manager
I have had some “interesting” managers over the years, whose attitudes had challenging effects on the way we did test automation.
29.5.1 The “It Must Be Good, I’ve Already Advertised It” Manager
We started software test automation in 2002. The management’s intention was to reduce time for the system test. Furthermore, the management used automation as a marketing strategy before the automation was developed.
At this time, there was no calculation for return on investment (ROI). The approach was this: Software test automation must pay because manual testing is no longer needed. The goal was to automate 100 percent of all test cases.
I had a hard time explaining that software test automation is just one of many methods to achieve better software and that it is not free—or even cheap.
29.5.2 The “Testers Aren’t Programmers” Manager
We started very classically and believed the promises of the vendors. They told us, “You only need to capture and replay,” but we found this was not true. In our experience, this leads to shelfware, not success—it does not pay off.
After some trial and mostly error, we started to write automation code. At this point, we were far away from developing automated test cases. We needed some lessons in writing code. We were lucky to have very good mentors on our development team who taught us to write libraries and functions so we didn’t have to code the same tasks repeatedly.
I had a discussion with my boss about what programming is. He explained to me that he consciously hadn’t hired testers with an informatics degree because he didn’t want to have more developers in his department. You can imagine his surprise when I told him that our automation code included libraries and functions.
He told his superiors that the testers do “advanced scripting” rather than coding because he was afraid that the testers would otherwise be forced to write production code!
29.5.3 The “Automate Bugs” Manager
One manager’s idea was to automate the bugs we received from our customer care center. We suffer the consequences to this day. How did this work? Our developers had to fix our customers’ bugs, and we were told to read each bug report and automate that exact user action. This is where the consequences come in: not knowing any better, we hardcoded the customer’s data into our automation. After 2 years, we were one major release behind the development. We didn’t know anything about data-driven tests at that time.
We were automating bugs for versions that were not in the field anymore. Most of these test cases still exist because we haven’t had time to replace them.
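The difference between what we did then and the data-driven approach we later learned can be sketched as follows. This is a minimal illustration, not our actual framework; the `login` function and the CSV fields are hypothetical stand-ins for a real application call and real customer scenarios.

```python
import csv
import io

def login(user, password):
    # Hypothetical stand-in for the real application under test.
    return user == "alice" and password == "secret"

# Hardcoded style (what we did): the customer's exact data is baked
# into the test, so every release change means rewriting tests.
#
# Data-driven style: the test logic is written once, and the data
# lives outside the code (here an in-memory CSV; in practice a file),
# so a new release only requires new data rows.
TEST_DATA = io.StringIO(
    "user,password,expected\n"
    "alice,secret,True\n"
    "alice,wrong,False\n"
    "bob,secret,False\n"
)

def run_data_driven_tests(data_file):
    """Run the same test logic once per data row; return pass/fail flags."""
    results = []
    for row in csv.DictReader(data_file):
        expected = row["expected"] == "True"
        actual = login(row["user"], row["password"])
        results.append(actual == expected)
    return results

print(run_data_driven_tests(TEST_DATA))  # → [True, True, True]
```

Had our bug-based test cases been structured this way, updating them for a new major release would have meant editing data rows rather than rewriting hardcoded scripts.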
29.5.4 The “Impress the Customers (the Wrong Way)” Manager
My boss had the habit of installing the untested beta versions for presentations of the software in front of customers. He would install unstable versions and then call our developers from the airport at 5:30 a.m. and order immediate fixes to be sent to him by email.
Our programmers hated this so much that we introduced an automated smoke test. This test checks whether we have a new build; if so, it installs the beta and then verifies the basic functions of our product. Our boss was told to only install smoke-tested beta versions.
Today we don’t have this boss issue anymore, but we continue the automated smoke tests for our nightly builds because they give us a good indication of the state of our software. Here we really save money: smoke testing must be done anyway, and the tests alert our developers to integration issues with new modules at an early stage. We expand this test suite every few months. The coolest thing is that we are informed of the latest test results by email.
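The smoke-test flow described above — detect a new build, install it, check the basic functions, report the result — can be sketched roughly as below. Every function name here is a hypothetical placeholder; a real setup would call the installer and the product itself, and step 4 would send an actual email rather than return a string.

```python
def new_build_available(last_seen_version, latest_version):
    """Step 1: has the nightly build produced a new version?"""
    return latest_version != last_seen_version

def install_build(version):
    """Step 2: install the beta (placeholder: record what would happen)."""
    return f"installed {version}"

def run_basic_checks(version):
    """Step 3: exercise the product's basic functions (placeholders)."""
    return {"starts": True, "opens_project": True, "saves_project": True}

def smoke_test(last_seen_version, latest_version):
    """Run the whole nightly flow and produce a report summary."""
    if not new_build_available(last_seen_version, latest_version):
        return "no new build"
    install_build(latest_version)
    results = run_basic_checks(latest_version)
    verdict = "PASS" if all(results.values()) else "FAIL"
    # Step 4: the summary that would be emailed to the team.
    return f"build {latest_version}: {verdict}"

print(smoke_test("1.4.1", "1.4.2"))  # → build 1.4.2: PASS
```

The point of the structure is that each step is cheap to extend — adding a new basic-function check is one more entry in the results, which is how a suite like this can grow every few months without being rewritten.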
So in spite of some challenging managers, we are now doing well with our automation!