Software Engineering with Microsoft Visual Studio Team System: A Value-Up Paradigm
- May 19, 2006
- "A theory should be as simple as possible, but no simpler."
Figure 1.1 Einstein’s Theory of Special Relativity was the focal point of a paradigm shift in our understanding of physics. It capped forty years of debate on the most vexing technical challenges of his day—how to synchronize clocks and how to accurately draw maps over long distances.
A Paradigm Shift
Paradigm shifts come in fits and starts, as old theories can no longer explain the world as observed.1 A poster child for the scientific paradigm shift is Albert Einstein’s Theory of Special Relativity, published in 1905. Einstein’s work reduced Newtonian mechanics to a special case, settled forty years of debate on the nature of time and synchronicity, and set the agenda for much of science, technology, and world affairs of the twentieth century.
According to a posthumous legend many of us learned in school, Einstein was a solitary theoretician whose day job reviewing patent applications was a mere distraction from his passionate pursuit of physics. Yet this popular view of Einstein is misguided. In fact, the majority of patent applications that Einstein reviewed concerned the very physics problem that fascinated him—how to synchronize time over distance for multiple practical purposes, such as creating railroad schedules, maritime charts, and accurate territorial maps in an age of colonial expansion. Indeed, the synchronization of time was a great technological problem of the age, for which special relativity became a mathematical solution, capping decades of debate.
Einstein was not the only person to solve the mathematical problem in 1905—the far more prominent Henri Poincaré produced an alternative that has long since been forgotten.2 Why is Einstein’s solution the one taught in every physics class today? Poincaré’s calculations relied on the "ether," a supposed medium of space that had pervaded nineteenth-century physics. Einstein’s Special Relativity, on the other hand, used much simpler calculations that required no ether. This was the first notable example of the principle attributed to Einstein, also posthumously, that "a theory should be as simple as possible, but no simpler."
Three Forces to Reconcile
A shift similar to the contrasting views of physics 100 years ago has been occurring in software development today. On a weekend in 2001, seventeen software luminaries convened to discuss "lightweight methods." By the end of the weekend, they had launched the Agile Alliance, initially chartered around the Agile Manifesto.3 At first, the manifesto was a rallying cry for those who saw contemporary software processes as similar to the "ether" of nineteenth-century physics—an unnecessary complexity and an impediment to productivity. Five years later, "agility" is mainstream. Every industry analyst advocates it, every business executive espouses it, and everyone tries to get more of it.
At the same time, two external economic factors came into play. One is global competition. The convergence of economic liberalization, increased communications bandwidth, and a highly skilled labor force in emerging markets made the outsourcing of software development to lower-wage countries (especially India) profitable.4 The Indian consultancies, in turn, needed to guarantee their quality to American and European customers. Many latched onto Capability Maturity Model Integration (CMMI) from the Software Engineering Institute at Carnegie Mellon University.5 CMMI epitomized the heavyweight processes against which the agilists rebelled, and it was considered too expensive to be practical outside of the defense industry. The offshorers, with their cost advantage, did not mind the expense and could turn the credential of a CMMI appraisal into a competitive advantage.
The second economic factor is increased attention to regulatory compliance after the lax business practices of the 1990s. In the United States, the Sarbanes-Oxley Act of 2002 (SOX) epitomizes this emphasis by holding business executives criminally liable for financial misrepresentations. This means that software and systems that process financial information are subject to a level of scrutiny and audit much greater than previously known.
These forces—agility, outsourcing/offshoring, and compliance—cannot be reconciled without a paradigm shift in the way we approach the software lifecycle. Modern software economics require agility with accountability. Closing the gap requires a new approach, both to the process itself and to its tooling.
What Software Is Worth Building?
To overcome the gap, you must recognize that software engineering is not like other engineering. When you build a bridge, road, or house, for example, you can safely study hundreds of very similar examples. Indeed, most of the time, economics dictate that you build the current one almost exactly like the last to take the risk out of the project.
With software, if someone has built a system just like you need, or close to what you need, then chances are you can license it commercially (or even find it as freeware). No sane business is going to spend money on building software that it can buy more economically. With thousands of software products available for commercial license, it is almost always cheaper to buy. Because the decision to build software must be based on sound return on investment and risk analysis, the software projects that get built will almost invariably be those that are not available commercially.
This business context has a profound effect on the nature of software projects. It means that software projects that are easy and low risk, because they have been done before, do not get funded. The only new software development projects undertaken are those that haven't been done before or those whose predecessors are not publicly available. This business reality, more than any other factor, is what makes software development so hard and risky, and it is what makes attention to process so important.6