Writing a book is far easier than writing software. If the text in a book has “bugs” such as ambiguities, inconsistencies, or, worse, contradictions, you the reader might be annoyed, even angry, but you will still have your wits about you. You can simply shrug your shoulders, turn the page, and read on. This is because, as a human, you are a perceptive creature and can deal, to a greater or lesser extent, with the paradoxical and ambiguous nature of reality. Computers are not nearly so lucky. As Peter Drucker, the legendary management consultant, pointed out in The Effective Executive more than 40 years ago, computers are logical morons. In other words, computers are stupid. This is the first important realization in protecting modern infrastructure. Computers are stupid because logic is essentially stupid: logic only does what logic permits.12 Computers do exactly what their software instructs, no more and no less. If the software is “wrong,” so too is the computer. Unless the software developer anticipates problems ahead of time, the computer cannot simply shrug, turn the page, and move on.
Computers cannot intrinsically deal with ambiguity or uncertainty with as much deftness and acumen as humans. Software must be correct, or it is nothing at all. So whereas humans live and even thrive in a universe full of logical contradictions and inconsistencies, computers live in a neat, tidy little world defined by logic. Yet that logic is written primarily by perceptive creatures known as software developers, who at times perceive better than they reason. This makes the radical malleability of software both blessing and bane. We can do with software as we will, but what we will can sometimes be far different from what we mean.
The radical malleability of software also poses additional explanatory complications. Software is like cement because it is being injected into the foundation of civilization. Software is also like a swimming pool because people opt to use it even though statistics tend to show the high private and social costs of its use. In fact, in this book, software is likened to automobiles, DNA, broken windows, freeways, aeronautical charts, books, products, manuscripts, factories, and so on. Software might even be like a box of chocolates: you never know what you’re going to get. With software, all analogies are fragile and incomplete.
Cement is an imperfect analogy for software, but so too is just about everything else, which means analogies used to understand software tend to break quite easily if overextended. The radical malleability of software means any single analogy used to understand software will be somewhat unsatisfying, as will, unfortunately, any single solution employed to solve the problem of insecure software. As a universal tool, software can take far too many potential forms for any one analogy to allow us to grasp software sufficiently and wrestle it to the ground. This challenge is nowhere more obvious than in the judicial courts of the United States, which reason by analogy from known concepts.13 This does not mean that software cannot be understood, simply that significantly more mental effort must be applied to think about software in a certain way, in the right context, and under the relevant assumptions. As such, this book liberally switches between analogies to make certain points. This is more the nature of software and less the idiosyncrasy of the author (or so I would hope).
Finally, there are many different kinds of software: enterprise software, consumer software, embedded software, open source software, and the list goes on. Experts in the field prefer to distinguish between these types of software because each has a different function and a different relevance to the tasks it is designed for. Such is the radical malleability of software.
A fatal flaw of any book on software, therefore, is a lack of deference to the wild array of software in the world. The software in your car is different from the software in your home computer, which is different from the software in space shuttles, which is different from the software in airplanes, which is different from the software in medical devices, and so on. As such, one can argue that the quality of software will differ by its intended use. Software in websites will have different, and probably lower, quality than software in airplanes. And this is true. There is only one problem with this reasoning: hackers couldn’t care less about these distinctions.
At the point when software is injected into a product and that product is made available to the consumer (or in any other way allows the attacker to touch or interact with the software), it is fair game for exploitation. This includes automobiles, mobile phones, video game consoles, and even nuclear reactors. Once the software is connected to a network, particularly the Internet, the software is nothing more than a target. As a case in point, two men were charged with hacking into the Los Angeles city traffic center to turn off traffic lights at four intersections in August 2006. Because the hackers locked others out of the system, it took four days to return the city’s traffic control system to normal operation.14 Given that more and more products are becoming “network aware” (that is, they are connected to and can communicate across a digital network), software of any kind, regardless of its intended use, is fair game in the eyes of an attacker. As William Cheswick and Steven Bellovin noted in Firewalls and Internet Security, “Any program, no matter how innocuous it seems, can harbor security holes… We have a firm belief that everything is guilty until proven innocent.”
This is not paranoia on the part of the authors; this is the reality.
Therefore, I have chosen to distinguish primarily between two types of software: software that is networked, such as the software on your home computer or mobile phone, and software that is not. The software controlling a car’s transmission is not networked; that is, it is not connected to the Internet, at least not yet. Though not connected to the Internet, weaknesses in this software can still potentially harm the occupants, as I illuminate in Chapter 2. But it is only a matter of time before the software in your transmission, as in almost all other devices, will be connected to a global network. Once that connection occurs, the nature of the game changes, and so too does the impact of even the tiniest mistake in software production. That software has different intended uses by its manufacturers is no excuse for failing to prepare it for an environment proven to be actively hostile, as Chapter 3, “The Power of Weaknesses,” highlights.
Finally, the radical malleability of software has moved me to group multiple aspects of insufficient software manufacturing practices, such as software defects, errors, faults, and vulnerabilities, under the rubric of “software weaknesses.” This might at first appear overly simplistic, but for this type of discussion, it is arguably sufficient for the task at hand.