Why Do We Still Have Software Development Problems?
- “When I use a word,” Humpty Dumpty said in a rather scornful tone,
- “it means just what I choose it to mean—neither more nor less.”
- LEWIS CARROLL, Through the Looking-Glass
This introductory chapter highlights some of the software issues that have been around since the 1960s and are still prevalent today. Whether they have been ignored because they are not publicized, or because we do not want them to cloud our thinking, history is history.
The first question that I hope entered your mind when you opened this book is what you can do to improve software development productivity in your organization. You aren’t the first person in the past 40 years or so to ask this question. In spite of all the attempts that have been made using similar approaches, the real productivity gains have been limited.
I have been trying to wrap my arms around the software productivity problem since the 1970s, when I was formally tasked by my manager to find a way to improve software development productivity in his organization. Because I was new to the organization, I was not bound to consider only potential solutions that were consistent with the standards and culture of the organization. I was able to observe as an outsider the normal development approaches and tools used in that development environment.
I had some early successes in the productivity improvement task that have guided my productivity research since that time, even though most of the task successes were abandoned by the organization because they were not consistent with the organization’s culture.
I also made some of the same errors you have probably made on your way to reading this book. I tried most of the new technology solutions with the hope that each one would have a marked, positive impact on development productivity. One of the first technologies was Programmer’s Workbench (PWB), which made all the current and previous versions of the software code accessible, reviewable, and testable with the idea that it was going to reduce errors and greatly improve the product. PWB worked and removed some of the product problems, but it didn’t really improve productivity. Our organization adopted PWB as a development standard, hoping that the quality and productivity improvements would be a good financial investment. Quality and the development process did improve due to the more efficient handling of the source code and the ability to quickly repair errors. Quick corrections also made it possible to make changes without much thought. Development productivity only improved marginally.
Much of the development work I was involved with during my career included real-time software system development. The techniques I learned using the Hatley/Pirbhai real-time systems specification methods contributed to the quality of that work in a very positive way. However, it didn’t significantly improve productivity even though the resulting designs were better.
I began collecting data in the mid-1970s to investigate the impact of the environment on software development cost and schedule. The environment initially included the organization facilities, tools, and processes, but as time passed, it also began to seriously include organization management and culture. As the quantity and quality of the data improved, the data became the foundation of a cost and schedule estimating model that highlighted the productivity improvement areas of concern. Note: In spite of the technology focus of most publications and the belief of many managers, technology offered the smallest productivity payoff of the development environment elements.
I ultimately constructed a model that will be used throughout this book to help explain the productivity impacts of the elements in the development environment and the decisions you make as a manager relative to your environment.
1.1 Software Crisis
The term “software crisis” refers to a set of problems that highlight the need for changes in our existing approaches to software development. The term originated in the late 1960s about the time of the 1968 NATO Conference on Software Engineering.1 At this conference, a list of software problems was presented as the major development concerns. The problem list included software that was
- Delivered late
- Prohibitive in terms of modification costs
- Impossible to maintain
- Performing at an inadequate level
- Exceeding budget costs
By the way, this list of problems still persists in much of the software development industry today, some 40 years later. The list has been reduced in many organizations, but when we look at the individual problems in the list, we can observe a common thread—a lack of a realistic schedule (delivered late).
A notion pervading the conference was that we can engineer ourselves out of any problem. Hence, the term “software engineering” was coined. Software engineering certainly looked like a potential solution to the problems, but we have to look at what happened after the term was coined. One of the significant conference outputs was a software engineering college curriculum. However, the curriculum that was produced just happened to be identical to the computer science curriculum of the day. Changing the subject name from “computer science” to “software engineering” accomplished very little.
Crisis is a strong word. It suggests a situation that demands resolution. The conditions that represent the crisis will be altered, either toward favorable relief or toward a potential disaster. According to Webster’s definition, a crisis is “a crucial or decisive point or situation.” A heart attack is a crisis where we either live or die. By now, the crisis should have been resolved one way or another. In retrospect, the term exigence2 fits the situation better than crisis because there is no discernible point of change for better or worse. A skin rash is an exigency.
Looking a little deeper into the list of problems, we find that the perceived solution to the software development problems was technology. According to the results in Figure 1.1 from the 2013 Standish Chaos Manifesto,3 technology has not been the total solution to project success.
The Chaos report divides projects into three classes: successful, challenged, and failed. Only about 29 percent of the projects evaluated in the 2004 study were classified as successful. Fifty-three percent were delivered, but with significant overruns in cost and schedule while delivering an average of only 64 percent of the features of the original requirements (challenged). The overruns averaged about 84 percent in schedule and 56 percent in cost. The remaining 18 percent were cancelled before delivery (failed).
About 39 percent of the 2012 projects evaluated were successful. Forty-three percent were delivered, but with significant average overruns of nearly 59 percent of cost and 74 percent of schedule while delivering only 69 percent of the original requirements (challenged). Still, about 18 percent were cancelled before delivery (failed).
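As a quick sanity check on the figures quoted above, the two surveys’ outcome shares can be tabulated and compared. The percentages below are simply those reported in the text; the data structure and layout are my own illustration.

```python
# Outcome shares (percent) from the 2004 and 2012 CHAOS surveys,
# as quoted in the text; the dictionary itself is illustrative.
chaos = {
    2004: {"successful": 29, "challenged": 53, "failed": 18},
    2012: {"successful": 39, "challenged": 43, "failed": 18},
}

# Each survey's three classes should account for all projects.
for year, shares in chaos.items():
    assert sum(shares.values()) == 100, year

gain = chaos[2012]["successful"] - chaos[2004]["successful"]
print(f"Successful-project share rose {gain} points between surveys.")
```

The ten-point rise is the gain discussed next, which the report attributes to shifts in the development environment rather than to technology alone.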
We can observe from this report a gain of successful project completions of about 10 percent during the past decade due to shifts in the development environment, including better schedule and cost estimates, processes, technology, and team performances. Most projects have problems, but more often they are people problems related to culture rather than technological problems.
A second study shown in Figure 1.2, which encompasses a large number of major aerospace (ground and space) software projects,4 illustrates a relationship between system size and development time. Three things are evident from this historic data. First, there is an apparent maximum size of 200,000 source lines for projects that are completed and delivered.
Second, there is also an apparent maximum schedule for the development of software system components. There is little data in this set that required more than four years to complete.
Third, the report indirectly contained two important pieces of information related to development productivity. If a development project lasted more than five years, it was outdated and no longer useful. With projects including more than 200,000 source lines, the number of people required on the development team overwhelmed the team’s ability to produce the product.
There is a close relationship between productivity and software estimating tools. The productivity achieved on the last development project is close to the productivity that will be used to determine the next project’s cost and schedule. Also, the parameters used by the tools are indicators that can, or should, be used to determine management actions to improve software development productivity. Since the beginning of project history recording, the standard metric for software development productivity has been the number of delivered source lines of code per person-month of effort, independent of the development language.
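The metric just described reduces to simple arithmetic. The sketch below, with an illustrative function name of my own choosing, computes delivered SLOC per person-month and applies it to the 1960s weapons-system figures given later in this chapter (about 100,000 SLOC, roughly 35 engineers, close to 36 months).

```python
def sloc_per_pm(delivered_sloc: float, staff: float, months: float) -> float:
    """Delivered source lines of code per person-month of effort.

    Deliberately language-independent: only the delivered line count and
    the total effort enter the calculation, which is both the metric's
    convenience and its known weakness (rework is invisible).
    """
    return delivered_sloc / (staff * months)

# The 1960s airborne weapons system: ~100,000 SLOC, ~35 engineers, ~3 years.
print(round(sloc_per_pm(100_000, 35, 36), 1))  # ~79 SLOC/PM
```

The result lands in the same range as the “a little over 70 SLOC/PM” the chapter reports for that project, given the approximate staffing and schedule figures.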
Figure 1.1 CHAOS 2012 Software project survey results
One consistency that has aided the developers of software estimating tools is the use of a delivered source line of code as a measure of the software product size. That isn’t a perfect measure, because it doesn’t accurately account for rework, but it has been used since the 1960s and illustrates the general productivity trends that we observe.
The software estimating tools in widespread use today evolved from models developed in the late 1970s to early 1980s using historical project data available at the time. The widely used tools today include COCOMO II,5 Price-S,6 Sage,7 and SEER-SEM. It is important to note that these mature tools are as useful today as they were 30 years ago when they were first formulated. Input data parameter sets (analyst and programmer capability, application experience, use of modern practices and tools, etc.) developed for Seer8 and COCOMO9 to describe organizations in the early 1980s are, oddly enough, still accurate and applicable today. The organization parameter definitions have changed very little, and the values of those parameters haven’t changed much, either. Fortunately for the estimating model developer, the traditional software development culture has changed very slowly. Agile software development has introduced a major cultural shift that has already led to a new way of thinking about efficient development.
Figure 1.2 1996 Aerospace Software project completion study
There have been several development technology breakthroughs during the past 40 years that have significantly decreased the cost of software products. For example, the introduction of FORTRAN and COBOL decreased the cost of a given product functionality to one-third of the cost achievable when implemented in assembler, due to the decrease in the source lines of code required to achieve the product functionality. The transitions from C++ to the newer visual languages and the advent of object-oriented structures created significant software cost savings, because the required number of source lines has continually decreased. However, when we look at the effort required to produce a single line of source code in any given programming language (old or new), we see that traditional software development productivity (measured from start of development through delivery or software-system integration) has increased, with little blips and dips, almost linearly at a rate of less than two source lines of code per person-month (SLOC/PM) per year, as shown in Figure 1.3.
We have learned new things about software development during this period. The development environment focus was almost entirely on the product during the 1960s and early 1970s. The principal activity once the requirements were established was programming, or should I say coding. Programmers were simply programmers. Software development technology, namely programming languages, improved to manage the increasing size and complexity of the tasks as system requirements grew. Development platforms improved to support the ever-increasing size of software systems.
Figure 1.3 Traditional software development productivity gains—1960 to 1990
The first major software system I encountered in my career was a real-time airborne weapons system with approximately 100,000 delivered source lines of assembler code. The development was started in the early 1960s and delivered almost three years later. The system included both radar and weapons software. Today that system size would not be memorable, except for several constraints that we do not have today. First, the software amounted to 50 boxes of punched cards implementing a single component. System development processes and standards did not exist. Modern software methods beginning with structured programming were not created until years later. There were no tools to manage source code or other development and test products. Documentation was created with a typewriter. The development team was approximately 35 engineers, of whom 20 were referred to as programmers. Much of the work was performed in a lab environment with the hardware engineers. This project achieved a productivity of a little over 70 SLOC/PM.
Ten years later, software systems took on a different character than the real-time software systems of the 1960s. Development and target computers were much larger and faster. Programmers had become software engineers. The major third-generation programming languages were FORTRAN, COBOL, PL/I, PASCAL, and C. Punched-card decks had given way to digital files, even though environments such as the DEC Programmer’s Workbench were still experimental. Development standards were still in their infancy, but their necessity was obvious. Structured programming was an important new development strategy. Project management was becoming a major development factor.
Had third-generation languages been available and capable of implementing the airborne weapons system just described in 1960, the size could have been reduced to 33,000 source lines, and the product could have been delivered a year earlier using a staff of about 25 people. The improved schedule and the decreased cost were the results of the reduced size.
Programming languages are included in one branch of technology that has continued to change over time to keep the effective product size at a manageable level. From the first assembler programming languages to the third-generation languages, there was a 3:1 reduction in program size. Object-based languages such as Visual C and Visual Basic created the ability to build larger-scale objects with a further reduction in size. The current use of the Unified Modeling Language (UML) and state charts to automatically generate C++ code (auto-code generators) projects a 40:1 size improvement over hand-written C++. We still measure productivity based on the written source code for the project.
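A rough back-of-the-envelope sketch of these ratios, under the assumption that effort per written line stays roughly constant, as Figure 1.3 suggests: the 3:1 ratio, the 100,000-SLOC system size, and the ~70 SLOC/PM figure come from this chapter, while the variable names are mine.

```python
ASM_TO_3GL = 3            # ~3:1 size reduction, assembler to third generation
assembler_sloc = 100_000  # the 1960s weapons system described earlier

third_gen_sloc = assembler_sloc // ASM_TO_3GL  # ~33,000, as in the text

# If effort per *written* line is roughly constant (~70 SLOC/PM),
# the cost saving comes almost entirely from the smaller written size.
productivity = 70  # SLOC/PM
asm_effort = round(assembler_sloc / productivity)   # person-months, assembler
gen3_effort = round(third_gen_sloc / productivity)  # person-months, 3GL
print(asm_effort, gen3_effort)
```

The effort drops by the same 3:1 factor as the size, which is consistent with the chapter’s claim that the improved schedule and decreased cost of the hypothetical 1960 redevelopment were the results of the reduced size, not of faster coding.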
The period from 1960 to 1990 was one in which software development technology improved dramatically. Alvin Toffler described the phenomenon in Future Shock10 as a form of time compression that manifests as a rapid acceleration of requirements changes. Technology changes fed on themselves to produce an ever-increasing frequency of change. Agile software development represents a major developmental cultural change.
The culture shift is not only happening in the development culture, but also in the people (programmers, designers, etc.) who make up the development culture. At the time Future Shock was published, college engineering graduates had a useful life expectancy of about five years before their skills would be outdated; that is, unless these engineers continued their schooling. That useful life is now less than two years in the software world. New college graduates live in an environment where all problems should be solvable in an hour or two. New tools and languages make it possible to create web pages and phone apps almost overnight. Software development is drifting toward artistic design rather than software engineering. This shift is creating a cultural problem for the large-scale system developers.
The work required to produce a line of source code remains almost independent of the computing power packed in one source line. Producing the instructions with the keyboard is not where the work happens. The work is in interpreting the requirements, producing and testing the design, correcting faulty reasoning, coordinating the work with others working on the task, and producing the documentation. An important related comment that has been attributed to Maurice Wilkes, the 1967 Turing Award winner, is:11
- As soon as we started programming, we found to our surprise that it wasn’t as easy to get programs right as we had thought. Debugging had to be discovered. I can remember the exact instant when I realized that a large part of my life from then on was going to be spent in finding mistakes in my own programs.
Size and complexity brought about the need for processes to manage the software development. The early waterfall process of requirements analysis, software design, code, test, and integration became the development standard.
Technology improved along the line shown in Figure 1.3. Each new technology change offered new and often better approaches to the complexity of the software development process. New approaches to the development process such as the software spiral method proposed by Boehm12 and the Gilb13 evolutionary delivery method provided ways to manage requirements complexity and volatility. Traditional software development productivity increased with the new technologies, but at only about 1.5 delivered SLOC/PM per year improvement.