Most complaints about ineffective measurement programs stem from a failure to properly report the collected measurement data. There are three key ingredients to successful communication of measurement results:
Establishing a metrics repository
Consistently reporting results
Analyzing (not just reporting) the data
Establishing a metrics repository serves several purposes. It organizes the data in an orderly fashion for greater flexibility in sorting and accessing it, and it stores historical data, which is useful for trend analysis and for monitoring rates of change. Repositories can be built with internally developed spreadsheets or database applications, and they should be centrally maintained with secured access. Several commercially available tools include frameworks for building a measurement repository, but the repository is usually a secondary capability of the tool. Unless the tool will also be used in its primary capacity, purchasing it solely as a repository framework is ill-advised.
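The ideas above can be sketched concretely. The following is a minimal, hypothetical example of an internally developed repository using an embedded database; the schema, project names, and metric values are illustrative assumptions, not a prescribed design. A real repository would live in a shared, access-controlled database rather than in memory.

```python
import sqlite3

# Minimal sketch of a metrics repository (hypothetical schema).
# In practice, use a shared, access-controlled database, not ":memory:".
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE measurements (
        project TEXT NOT NULL,
        period  TEXT NOT NULL,   -- e.g. '2024-Q1'
        metric  TEXT NOT NULL,   -- e.g. 'function_points'
        value   REAL NOT NULL,
        PRIMARY KEY (project, period, metric)
    )
""")

# Illustrative data only.
rows = [
    ("billing", "2023-Q4", "function_points", 420.0),
    ("billing", "2024-Q1", "function_points", 455.0),
]
conn.executemany("INSERT INTO measurements VALUES (?, ?, ?, ?)", rows)
conn.commit()

# Storing history in one place is what enables trend analysis later:
# pull an ordered series for a given project and metric.
history = conn.execute(
    "SELECT period, value FROM measurements "
    "WHERE project = ? AND metric = ? ORDER BY period",
    ("billing", "function_points"),
).fetchall()
print(history)
```

The point of the sketch is the shape of the data, not the technology: a keyed, append-only store of (project, period, metric, value) is enough to support the sorting, access, and historical trending described above.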
Consistently reporting measurement results seems like an obvious deliverable of a measurement program, but time and time again we have observed organizations that properly identified and collected measurement data yet achieved only limited effectiveness because the results were not reported properly.
We observed firsthand an example of improper reporting several years ago when we were engaged to audit a well-established measurement program at one of the larger U.S. commercial banks. We were familiar with the measurement activities being conducted at this particular bank from relationships that we had formed with members of its metrics team. A request to audit the bank's measurement program came to us via a senior vice president who had recently been appointed to oversee the metrics initiative. Our audit revealed some very advanced and sophisticated measurement activity, which was supported by formalized and well-documented data collection processes. However, we uncovered a major flaw in the reporting of the measurement data.
The reporting process had not taken into account the requirements of the various business units receiving the reports. The reports contained a wide variety of statistically correct measurement tables and graphs, but the recipients had little or no use for the data or the formats they contained. This was not a case of failing to report, but of reporting the wrong information, perhaps to the wrong audience.
Analysis of the data is another major weakness that we have observed in the industry. The problem may be more accurately stated as simply reporting the data and not analyzing it at all. If a manager received a periodic report indicating the number of function points produced, together with the effective rates of productivity and growth, that information could be useful. However, further analysis could significantly enhance it. For example, further analysis of the reported function points could show a trend toward increased development within certain technical environments, with commensurate increases in productivity. This enhanced information would certainly be useful for setting future strategic direction. The analysis could also yield projections of anticipated future growth rates and identify the need to shift budget dollars to support anticipated growth trends.
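The difference between reporting and analyzing can be shown with a small sketch. The quarterly function point counts below are hypothetical; the analysis step is the period-over-period growth rate and a naive projection of the next period, which is one simple way to turn a reported history into a forward-looking number as described above.

```python
# Hypothetical quarterly function point counts for one portfolio.
fp_counts = [400.0, 420.0, 441.0, 463.0]

# Reporting stops at the raw counts. Analysis derives the
# period-over-period growth rates from the same history.
growth = [b / a - 1 for a, b in zip(fp_counts, fp_counts[1:])]
avg_growth = sum(growth) / len(growth)

# Naive projection: extend the average growth rate one period forward.
# (A real forecast would consider more periods and more factors.)
projected_next = fp_counts[-1] * (1 + avg_growth)
print(round(avg_growth, 4), round(projected_next, 1))
```

Even this crude projection gives a planner something a raw count cannot: an anticipated size for the next period, which is the kind of figure needed to shift budget dollars ahead of the growth rather than after it.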
Depending on the organization and on the ability of the intended audience to understand the reported data, analysis of measurement data can range from basic statistical process control (SPC) analysis to much more advanced statistical techniques. The key ingredient is to take a proactive view of the data being analyzed. Reviewing the past is interesting and useful, but value is added when current business can be managed through the ability to forecast outcomes on the basis of accurate reporting.
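As a sketch of the basic end of that range, the example below applies a standard SPC individuals chart to a hypothetical series of monthly defect densities: control limits are set at the mean plus or minus 2.66 times the average moving range (2.66 is the standard individuals-chart constant), and points outside the limits are flagged for investigation. The data values are invented for illustration.

```python
import statistics

# Hypothetical monthly defect densities (defects per function point).
data = [2.0, 2.1, 1.9, 2.0, 2.1, 4.0, 2.0, 1.9]

# Individuals (X) chart: center line is the mean; the moving range
# between consecutive points estimates short-term variation.
mean = statistics.fmean(data)
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = statistics.fmean(moving_ranges)

# Standard individuals-chart limits: mean +/- 2.66 * average moving range.
ucl = mean + 2.66 * mr_bar
lcl = mean - 2.66 * mr_bar

# Points beyond the limits signal special-cause variation worth investigating.
out_of_control = [x for x in data if x > ucl or x < lcl]
print(round(ucl, 2), round(lcl, 2), out_of_control)
```

The flagged point is the proactive signal: instead of merely reporting that defect density rose one month, the chart tells the reader whether the rise is ordinary variation or an exception that warrants action.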