This chapter is from the book

One Reason: The Civil Engineering Analogy

Back in the day when microcomputers first hit businesses, they arrived on the scene (in many cases) without a lot of planning. It was hard to predict what impact small, interactive systems would have on business processes, productivity, the nature of the workday, and so on.

At the same time, a lot of software developers were ahead of the curve on this; very early micros (like the Apple II, the Heathkit, and the TRS-80) were sold more to hobbyists and experimenters than businesspeople. Naturally, these were often the people who were later called on to write the software for these new PCs that started showing up in the workplace.

That was how I started. I was working in broadcast television and radio; making software was not my "job," but people had heard that I had built my own computer and knew how to "program," and they would ask me to write things for them since, at the time, little software existed for the PC, especially the vertical-market software that applied specifically to broadcasting. Many of my friends and colleagues were experiencing the same thing in banking, insurance, education, and other industries.

In those days, software seemed to sort of "happen," but it was extremely difficult to predict when it would be ready, or what it would do, or what it would be like. This was probably annoying to stakeholders, but acceptable since software was only a fringe issue for most businesses.

All that changed in the 80s with the introduction of inexpensive clone computers.

In a remarkably short few years, PCs were everywhere, and the vast majority of business processes, including very critical ones, had moved to the PC and therefore to software. It was rather shocking how fast this happened, and most people did not realize it had until it was already a done deal.

Business people suddenly realized how very vulnerable they were to the software used to run their business. The software's quality, when new software would become available, what it did and did not do, and so forth, had a profound impact on a company's well-being.

In the mid-90s, the CBS Network hosted a new-media conference in Chicago, where the primary topics included the impact of the Internet on broadcasting and exactly how the affiliate stations could get control over their exploding automation. Most of the people in attendance were station managers; I was one of the very few techies there, and as such was a popular person to have at your dinner table. These guys were pretty worried.

They knew, and I knew, that the development of software had to be brought under some form of management, that developers had to be accountable to and supported by some kind of process. The chaotic style of developing software was producing chaos in business, and this had to stop.

Business people also knew that they had no idea how software development worked, or how one could control it. They also did not consider the making of software to be a professional activity, so they didn't ask us (not that we would have known what to say if they had).

They looked around at the other industries they did understand to try to find an analog or metaphor that could guide them. A lot of different things were tried, but what they settled on was what came to be called the waterfall.

The idea was that software is essentially like building a large, one-off structure, like a bridge or a skyscraper. The project is complex, expensive, and will be used by many people. We never build the same thing twice, but there is a lot of repetition from one edifice to another.

We all know this process, or something like it:

  1. Analyze the problem until you "know it," and then document this.
  2. Give this analysis to designers who will figure out how to solve this problem, with the given constraints, using software, and then document this design.
  3. Give the design to the development team, who will write the code.
  4. Hand it off to testers to do the quality assurance.
  5. Release the code.
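The defining property of the steps above is the strict, one-way hand-off: each phase consumes only the documented output of the phase before it, and nothing flows backward. A minimal sketch of that pipeline (the stage names and the "payroll system" example are illustrative, not from the book):

```python
# Illustrative sketch of the waterfall's one-way hand-offs: each stage
# takes only the previous stage's documented output as input, and in a
# pure waterfall there is no path back upstream.

def analyze(problem: str) -> str:
    # Stage 1: study the problem until you "know it," then document it.
    return f"analysis of {problem}"

def design(analysis: str) -> str:
    # Stage 2: designers turn the analysis into a documented design.
    return f"design based on {analysis}"

def code(design_doc: str) -> str:
    # Stage 3: developers implement the design as written.
    return f"code implementing {design_doc}"

def qa(build: str) -> str:
    # Stage 4: testers verify the finished build.
    return f"tested {build}"

def release(artifact: str) -> str:
    # Stage 5: ship it.
    return f"released {artifact}"

# The whole process is one fixed composition of stages.
result = release(qa(code(design(analyze("payroll system")))))
print(result)
```

Note that any change of mind at, say, the design stage has no representation here; the only way to revisit an earlier stage is to start the whole composition over, which is exactly the problem the chapter goes on to describe.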

This has been remarkably unsuccessful, but you can understand the thinking.

Before we build a bridge, we analyze the soils, find the bedrock, measure the high and low points of the river, figure out how many cars it will have to handle per hour at peak times, research local laws, see how the bridge will interact with existing roads and structures, check whether ships will be able to pass beneath it, and so on. Then we get an architect to design a bridge that meets these requirements.

If the architect does the job correctly, the construction team's job is simply to follow this design as accurately and efficiently as possible. Many of the engineering disasters that have become famous in recent history can be traced back to seemingly insignificant changes made by the construction team to the plan prepared by the engineers. Because of this, the virtues of construction are considered to be compliance and accuracy, not invention.

The tester (the building inspector) comes last, to make sure the bridge is safe, work up a punch list for changes, and then finally approve it for use. Finally, the cars roll onto it.

The problem is that analysis for software development is not like analysis in engineering.

When we analyze, we are analyzing the requirements of a human business process, in which very few things are fixed. To communicate what we find, we do not have the highly precise languages of geology, law, hydrodynamics, and so forth to describe the situation; we have the relativistic language of business processes, which does not map well to technology. Thus, communication is lossy.

And even when we get it right, business processes and needs change, as do our perceptions as to what was needed in the first place. Bedrock is where it is, and will still be there tomorrow, but people change their minds about how they should run their businesses all the time.

They have to. The market changes around them. In fact, the realities, priorities, and constraints of business are changing at a faster rate every year. Also, the variations in software systems are generally much greater than the variations encountered when building the 247th overpass on Interstate 5.

Finally, the analog to the construction step in civil engineering would seem to be the coding step in software development. I think that is a misunderstanding of what we really do when we code. If the virtue of construction is compliance and accuracy, I would equate that to the compile process, not the coding process. If so, where does the creation of code fit? Analysis? Design? Testing? This would seem to be a pretty important distinction to make, and yet we have traditionally left coding in the construction position.

So, given our failure rate, why have we continued to develop software like this?
