Perform Runtime Analysis Together
Runtime analysis is a thankless job. Whether you're a developer or just a really geeky tester (that's me), odds are that no one else on the project team appreciates or even understands your efforts in performing runtime analysis. Having used several of the runtime analysis tools on the market, I feel that anyone doing this type of testing should be sainted. It's very difficult to find and interpret the meaningful data that these tools generate. Even if you can find the data and figure out what it's telling you, most often no one knows what actually needs to be fixed.
As Goran Begic states in his article "An introduction to runtime analysis with Rational PurifyPlus," runtime analysis provides information on the following aspects of an application's execution:
Memory errors and memory leaks in native applications
Memory leaks in .NET managed code and Java applications
Let's consider an example. One of my projects had a problem with pages taking a long time to load (more than 60 seconds). We ran numerous performance tests and couldn't find the problem. The developers looked briefly at it, but they had deadlines, and after a couple of days they decided that the problem could be solved later... when they had more time to deal with it. Our traditional performance tests couldn't isolate the problem at a sufficient level of detail. What to do? Using a simple code-coverage tool in conjunction with one of our simple regression scripts, the testing team was able to isolate the problem to a specific method: a call that was being executed 4,000,000 times every time the page loaded. Armed with that information, the developers fixed the problem the next day, decreasing page load time to three seconds. The team now executes runtime analysis on a regular basis.
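That kind of discovery doesn't require an expensive tool. As a hedged sketch (the project above wasn't Python, and `render_page` and `lookup_price` are invented names for illustration), here is how a stock profiler exposes an absurd call count, much as our code-coverage tool did:

```python
import cProfile
import pstats

def lookup_price(item_id):
    # Hypothetical per-item computation, buried several layers down.
    return item_id * 1.07

def render_page(items):
    # Naive rendering: re-walks every item on each of 1,000 refreshes,
    # so lookup_price runs len(items) * 1000 times for one page load.
    total = 0.0
    for _ in range(1000):
        for item in items:
            total += lookup_price(item)
    return total

profiler = cProfile.Profile()
profiler.enable()
render_page(range(100))
profiler.disable()

# The "ncalls" column is the smoking gun: one function called 100,000
# times for a single page load points straight at the method to fix.
stats = pstats.Stats(profiler)
stats.sort_stats("calls").print_stats("lookup_price")
```

A tester who can produce a report like this, even once, has something concrete to put in front of a developer.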
I've never worked with a developer who was actually tasked with performing runtime analysis. The developer was always doing it to solve a problem discovered by some other method of testing, or in response to something I found while performing my second-rate runtime analysis. I've found that the most effective way to make sure that runtime analysis gets done is to start performing the analysis yourself. As a tester, you don't need to become a runtime analysis expert; all you need to do is learn the basics about some runtime-analysis tools; learn a little about the technologies you're testing (common problems, bottlenecks, and performance problems); and find some time to actually do some testing.
Of all the techniques described in this article, runtime analysis seems to be the most effective at increasing developer/tester communication (your mileage may vary). In my experience as a tester, once you find something (or even just think you may have found something), bring over a developer and show him or her what you have. Suddenly you're no longer a technology-blind tester who doesn't know anything about development, and the developer will likely be interested in helping you understand what you're seeing. Once a developer knows that a tester has the desire and the aptitude to learn, the developer is typically willing to spend whatever time he or she can spare helping the tester understand the applicable technologies. From the developer's point of view, explaining the technologies once, early in the project, saves him or her from having to answer many small questions later on, under greater time pressure. At the very least, the tester gains a basic understanding from which to ask smarter and more meaningful questions.
At the same time that the developer is helping the tester, the developer may in turn look to the tester for help in learning the testing tools; this is an opportunity for the tester to share information on the possible risks and long-term effects of the problems found, if they're not fixed immediately. Together, tester and developer uncover and refine performance requirements and simultaneously learn new skills.
By working with developers on runtime analysis, testers can learn more about the technologies, code, and technical issues that developers face, and developers can learn more about what risks concern the testers. This technique leverages tools that both teams can share, as well as some of the tools specific to each team, to get everyone working together.