Why Design Patterns Still Matter
- From Physical Architecture to Code
- Taming the Maintenance Beast
- The Big Split: Those Who Know Patterns and Those Who Don't
- Where Development Is Headed
If you do a book search on "software patterns," you'll find a slew of titles covering wide-ranging territory: Agile coding patterns, quality-code patterns, design patterns in Ruby, patterns of enterprise architecture, project behavior patterns, and hundreds more.
The grandparent of them all, the big book, was Design Patterns: Elements of Reusable Object-Oriented Software. Today I'd like to introduce the very real possibility that the best code you've ever worked on was influenced by Design Patterns, and that any terrible code, code that made your eyes hurt, could have been fixed if its developers had been influenced by that literature. Yes, it's that powerful.
Let's explore how design patterns have changed the world, starting with where they began and moving forward.
From Physical Architecture to Code
When architect Christopher Alexander proposed a pattern language in 1977, he was thinking of the gateway to a house, a half-hidden garden, common areas, and other ways to lay out space. The idea was that how to make use of space could be communicated between builder and customer more meaningfully, and with less effort, if the two had some shared terms. With patterns in hand, we can discuss powerful ideas more easily. It took the "Gang of Four" (Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides) to apply those ideas to computer programs.
The timing for patterns couldn't have been better. The year was 1994. C++ was reaching the early mainstream as a programming language, and people were actually developing programs with it, but the programming advice available was very thin. I speak from experience! At the time, I was taking computer science course CS220 at Salisbury University, where we learned that object-oriented programming meant inheritance, encapsulation, and polymorphism, but we didn't really learn what good object-oriented code looks like. The samples in our textbook typically modeled the real world: we could create a jar class to hold objects (I think it was an array of pointers to objects), or a car class, which contained an engine rather than inheriting from one. None of those examples went deep enough to guide the writing of a complex program.
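The car-and-engine idea is plain old composition: the car holds an engine as a member rather than deriving from it. A minimal C++ sketch, with class and member names of my own invention rather than from any particular textbook:

```cpp
// A Car *has an* Engine (composition): the engine is a member,
// not a base class, because a car is not a kind of engine.
class Engine {
public:
    void start() { running = true; }
    bool isRunning() const { return running; }
private:
    bool running = false;
};

class Car {
public:
    void drive() { engine.start(); }   // delegate to the contained part
    bool isMoving() const { return engine.isRunning(); }
private:
    Engine engine;  // contained, not inherited
};
```

Writing `class Car : public Engine` would compile, but it would wrongly expose Engine's whole interface as Car's own; composition keeps the relationship "has-a" rather than "is-a."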
At the same time, the graphical user interface was taking off. Suddenly programs moved from keystroke input to event-driven input, where multiple events could fire at any time. Programming-language designers tried to keep up with message passing and events, but the end result was typically either very complex programs (Visual C++/Qt) or too high-level and abstract (Visual Basic/Tcl).
Design patterns provided a language for thinking about code, along with a set of "recipes" to follow. Best of all, they were timely: the subtext running through the book was how to implement an event-driven GUI word processor, discussed throughout in terms of patterns. Windows 95 was just around the corner, the gold rush was on for windowed applications, and design patterns promised to tame the complexity beast, along with that other beast of software: maintenance.
Why maintenance? Remember, this was the 1990s. Programmers wrote design documents, decomposed them into objects, and then wrote functions for those objects, an approach called top-down design. By not writing code until each function was well defined and had a single responsibility, programmers ensured that functions stayed "small," at a few dozen lines of code, and classes "reasonable," at perhaps a few hundred.
After the project shipped, of course, something would happen. The customer would want a new subtotal on a report, so the programmer would add a parameter to the function to split on, plus an if statement, a for loop, or something similar around the existing code. The next week it would be another variable; the next month, a new if statement and one more level of indentation. Eventually the code became too complex for anyone to understand a function easily, and changes made here had unintended consequences there; unless programmers were incredibly careful, bug fixes were likely to introduce even more bugs.
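That decay is easy to sketch in code. The function below is entirely hypothetical (the report, its parameters, and the business rules are invented for illustration): a totaling routine that began with one job and has since sprouted a flag and a branch for each new request:

```cpp
#include <vector>

// Hypothetical report routine. It began as a simple sum; each
// customer request since has bolted on another flag and branch.
double reportTotal(const std::vector<double>& amounts,
                   bool excludeReturns,   // added one week...
                   bool applyDiscount) {  // ...another the next month
    double total = 0.0;
    for (double amount : amounts) {
        if (excludeReturns && amount < 0.0) {
            continue;  // skip refund entries
        }
        if (applyDiscount) {
            amount *= 0.9;  // flat 10% discount on every line
        }
        total += amount;
    }
    return total;
}
```

Each new flag doubles the number of paths through the function; with a handful of them, no single caller exercises every combination, and a fix for one combination can quietly break another.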