Examples of Test Reporting in Action
Here I’ll share two examples of test reporting in action. Both examples are trivial in nature (what you might get after an hour of testing), but both illustrate the aspects of test reporting discussed earlier. As you read through the examples, you’ll see some aspects called out specifically; others will be implied or included as part of the findings. Both of these reports are informal—that is, they don’t follow a rigorous, predefined template—which is my preference. I’m not a big fan of templates, but that’s not to imply that something like this couldn’t be formalized for consistency across the team.
The first example is a test report I put together after about an hour of testing a sample application called ProSum, developed by Earl Everett. Notice that I call out mission, coverage, and risk explicitly. I talk about techniques and environment only as related to status. I also talk about what testing I didn’t do.
Test Report on ProSum Version 1.4 by Earl Everett
Mission:
-----------------------------------
1) Provide information about the application in terms of potential defects.
2) Identify coverage, risk, and test strategy for this application.

Coverage:
-----------------------------------
Functionality:
- Application
- Generate a random number
- Spinner controls
- Add numbers
- Clear fields
- Error checking
- Calculation
- Testability

Data:
- Random numbers
- Bounds
- Types
- Rounding

Usability and Platform:
- User Interface
- Consistency
- Windows compatible
- Look and feel

Operations:
- Stress

Risks:
-----------------------------------
- Incorrect addition
- Incorrect random number generation
- Incorrect error handling
- Other features implemented incorrectly
- Inconsistency with Windows features and user expectations
- Company image

Process:
-----------------------------------
1) Variability Tour
- I started by getting to know the application. I read the About dialog, clicked some buttons, and did the blink test.
2) Functional Exploration
- I looked at functionality and used heuristic oracles based on consistency, image, claims, expectation, and purpose.
- I looked at usability in terms of myself as a user and my expectations of a Windows application.
- I did what limited stress testing I could think of that I thought was valuable.
- The only data analysis I did was to identify boundaries for addition, determine rounding for decimals, and try some simple equivalence classes for input values.

Testing Not Done:
-----------------------------------
- There don’t seem to be any files written to the hard drive, but I did not do an extensive check using any tools.
- Other than when pasting large character sets, there don’t seem to be any noticeable performance problems on my laptop. However, I don’t know whether performance problems might manifest over time or with a high volume of usage.
- I looked for a command line interface, but there does not seem to be one (or at least I can’t get it to work).
- I did not test internationalization.
- I did not get to see the source code.
- I did not test on multiple platforms (only an HP Pavilion ze4600 running Windows XP Pro SP2).

Results of Testing:
-----------------------------------
1) There are spelling, grammar, and stylistic problems on the About dialog. This is inconsistent with purpose and is bad for the company image.
2) There are four buttons on the About dialog that allow you to exit the dialog. They all appear to do the same thing (close the dialog). This is inconsistent with purpose because it may confuse the user (it confused me initially), and it is inconsistent with the product because the error dialog has only one button (the OK button) to exit the dialog.
3) On the About screen, the versions displayed (title bar and text) do not match. This is inconsistent with the product.
4) The way the version is displayed in title bars is inconsistent. On the About dialog the word Version is spelled out ("Version 1.0"), on the error dialog it is just "V1", and on the main screen it is "V 1.0". This is inconsistent with the product and bad for image.
5) The error dialog has some stylistic problems with the text (capitals in the middle of the sentence on "Integer", "Between", and "Only"). This is bad for image.
6) On the About dialog the OK button has a hotkey, but on the error dialog there seems to be no hotkey for OK (or at least if there is, it is not the same as the one on the About dialog). This is inconsistent with the product.
7) Similarly, on the main screen each pushbutton has a hotkey except for the About button (as far as I can tell; if there is one, it is not indicated in the same way as on the other buttons). This is inconsistent with the product.
8) If I push the "Pick Two Random Numbers" button many times, it does not seem to generate negative numbers in the top field. This seems to be inconsistent with the purpose of the random number generator.
9) If I push the "Pick Two Random Numbers" button many times, it will sometimes generate a 100 (or what appears to be a value in the 100 range) in both the bottom and top fields. This is inconsistent with the product, since valid values are from -99 to 99.
10) The spinner control for each field seems to work independently of values entered by hand into the text fields. This could be inconsistent with user expectations: when I enter 33 and click up, I expect 34, not 1. Etc.
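Findings 8 and 9 in the report above are both questions about range and sign coverage of the random number generator. For a product with a scriptable interface, you could probe the same questions mechanically. The sketch below is a minimal Python illustration of that idea; since ProSum is a GUI application with no scripting hooks that I know of, the `pick_two_random_numbers` function here is a made-up stand-in for the button, not the application's actual code.

```python
import random

# Hypothetical stand-in for ProSum's "Pick Two Random Numbers" button.
# A real check would drive the application instead. Per the product,
# valid values are -99 to 99 inclusive.
def pick_two_random_numbers():
    return random.randint(-99, 99), random.randint(-99, 99)

def check_generator(picks=1000):
    """Probe range and sign coverage over many picks (findings 8 and 9)."""
    out_of_range = []
    saw_negative_top = False
    for _ in range(picks):
        top, bottom = pick_two_random_numbers()
        for value in (top, bottom):
            if not -99 <= value <= 99:
                out_of_range.append(value)  # finding 9: e.g. a stray 100
        if top < 0:
            saw_negative_top = True         # finding 8: this should occur
    return out_of_range, saw_negative_top

out_of_range, saw_negative_top = check_generator()
print("out-of-range values:", out_of_range)
print("negative top value seen:", saw_negative_top)
```

Run against the real generator, finding 8 would show up as `saw_negative_top` staying False over many picks, and finding 9 as entries in `out_of_range`.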
You get the point. The full test report is available in this file.
The second example, shown in Figure 3, comes from James Bach’s Rapid Software Testing course appendices and is an excellent illustration of how simple a test report can be. (To see the images in full size, go to page 97 of the appendices.) Notice that this example includes a recommendations section. This is in line with the stated missions. Also worth noting is that only 45 minutes of testing seems to have been done (15 minutes on the .NET application and 30 minutes on the J2EE application). In just 45 minutes, you can already have this much to report!
Figure 3 Hand-written test report by James Bach comparing J2EE and .NET applications.