Why Do Good Companies Go Bad?
Why do good companies go bad? Honestly, I hadn't thought too much about this question. Then a CEO friend of mine brought up the 62 "excellent" companies praised by Tom Peters and Robert Waterman in their early 1980s bestseller In Search of Excellence. A great many of them—including such stalwarts as Sears, Xerox, IBM, and Kodak—had faced serious hardships in the 20-odd years since. Some of them recovered. Some, as I write, are struggling mightily to recover. Some are dead or, in all likelihood, soon will be.
So why do good companies go bad? This heartfelt and insightful question launched me on a journey of discovery. I began with archival research on companies that had failed during the past several decades, interviewed people from some of those companies, and eventually arrived at the conclusions presented here.
Although it is commonly believed that institutions are (at least potentially) immortal while humans are mortal, I found that the average life span of corporations is declining even as that of humans is rising. Others have come to similar conclusions. In the best-known work in this area, The Living Company, Arie de Geus found that one-third of the companies listed in the 1970 Fortune 500 had vanished by 1983 through acquisition, merger, or breakup. De Geus also quoted a Dutch survey showing that the average corporate life expectancy in Japan and Europe was 12.5 years. Another study found corporate life expectancy falling across the major European economies: from 45 to 18 years in Germany, from 13 to 9 in France, and from 10 to 4 in Great Britain.
Much of this decline is the result of heightened merger and acquisition activity in recent decades. Yet most of that activity reflects distress selling rather than strategic buying, precisely because so many companies are in trouble.
Let me say at once that I have no intention of discounting the need to learn the underlying causes of success—the "good habits" of good companies. Nor will I second-guess de Geus or Peters and Waterman or others, like Jim Collins. For very good reasons, they singled out certain companies as models of success—companies that, for very different reasons, have since fallen on hard times. My purpose is not to reexamine why these companies were considered "excellent" or "visionary" in the first place. I am interested in what happened to them afterward—why they fell, why they failed, why they lost the magic touch. What happened?
In my view, when companies rise to excellence, they often unwittingly develop self-destructive habits that eventually undermine their success. As with people, these habits are learned, not innate, and we can watch companies adopt them. Sometimes the habits worsen over time and become, in effect, addictions. But self-destructive habits can also be broken and overcome, and companies can be put back on the road to improved health.
Often the turnaround is precipitated by a crisis. Our self-destructive habits creep up on us, if you will. We overeat, fail to exercise, maybe even smoke, but we think we're still doing okay—until we have that minor heart attack, that potent reminder of mortality. Suddenly our self-destructive habits are gone, and we're eating salads and walking five miles a day. In the case of corporations, the crisis might take the form of an emerging competitor, a sudden erosion of market share, or a technological advance that threatens to leave the company behind. Such developments can spell doom, or they can serve to shake companies out of their destructive behavior patterns.
We'll see plenty of examples of companies that are actively working to curb their self-destructive habits, to change their behavior, as well as companies that have already done so and are "in recovery." Our message is positive: if you're willing to examine yourself honestly enough to discover your weaknesses, you can ultimately transform yourself.
So what are these self-destructive habits? We'll enumerate them one by one in the following chapters (and they're summarized in Figure 1-1). But first, let's see them in action by examining three companies in the technology sector.
Figure 1-1 Self-destructive habits of good companies
It's one of the great success stories in the annals of American business. In 1957 Kenneth Olsen, a 31-year-old engineer at MIT's Lincoln Laboratory, asked for $70,000 from American Research & Development to start a new firm he wanted to call Digital Computer Corp. He got the money, but the venture capitalists made him change the name. They pointed out that too many big companies, like RCA and General Electric, were losing money in the computer business.
So Digital Equipment Corp. set up shop in an old wool mill in Maynard, Massachusetts, and Ken Olsen set about to pursue his dream: to revolutionize the computer industry with the introduction of the "minicomputer"—a smaller, simpler, more useful, and far cheaper device than the bulky mainframes that were the industry standard.
In its first year, Digital had sales of $94,000. Five years later that number reached $6.5 million. In 1977, the company hit the $1 billion mark. Digital found itself leading a Boston-area industry boom that created so many high-paying jobs it came to be called the Massachusetts Miracle. At the same time, the reputation of its founder grew. He was brilliant and eccentric. He protected his innovative engineers. He instituted a no-layoff policy. Digital was known as "a fun place to work."
No wonder that when Tom Peters and Bob Waterman went "in search of excellence" for what became their 1982 bestseller, Digital not only made the list of excellent companies but was also considered one of the 15 "exemplars" that basically did everything right. It was one of the companies that represented "especially well both sound performance and the eight traits [of excellence]" the authors identified. Such high accolades appeared to be borne out when Fortune magazine, in 1986, declared Olsen "arguably the most successful entrepreneur in the history of American business."
Let's jump ahead to the end of that decade. In January 1989, Digital announced it would introduce a range of personal computers, along with their more powerful cousins, workstations. The question was, had Olsen already waited too long? One thing was certain: the stock was trading at $98, down from $199 just a year and a half earlier. Another certainty was that the minicomputer, the radical innovation on which Olsen had staked his company, was rapidly becoming a high-tech dinosaur. Today it's clear that the writing was on the wall. But Olsen had erased it and scrawled his own message: "The personal computer will fall flat on its face in business." Now his company appeared to be acknowledging its failure to see the future.1
Despite the eleventh-hour about-face, the hemorrhaging at Digital continued through 1991. Top executives were fleeing, and the company that abhorred layoffs was in the process of cutting 10,000 employees from the payroll. By then, Olsen had been in charge for 34 years and still entertained no thoughts of retirement. Instead, he used that year's annual shareholders' meeting to introduce the company's next-generation "Alpha" computer chip, which he claimed was four times faster than Intel's top-of-the-line chip. The shareholders were probably not heartened: the stock was now trading at $59 a share.
In the spring of 1992, the company flabbergasted Wall Street with the news that it had lost $294 million in the quarter that had just ended, only the second time in its history that Digital had reported a loss. Olsen responded with a massive restructuring of top-level management. It didn't help. By the end of April, the stock had fallen to $46, its lowest price since 1985, and takeover rumors were circulating.
That same spring, the Wall Street Journal seemed to be working on its first draft of Olsen's obituary. The Journal noted that a secret meeting between Olsen and Apple's John Sculley—a meeting that might have produced an alliance with much potential for Digital—had come to nothing. Instead, Apple had shocked the industry by inking a broad technology-sharing agreement with archenemy IBM.
The Journal described this as another opportunity apparently lost to Digital and Olsen. His persistent doubts about the PC—"he used to call it a 'toy'"—had crippled the nation's second-largest computer maker when the market turned to PCs. The Journal also noted that Olsen's resistance to another major trend of the last decade—so-called "open" systems that use standard operating software—had similarly impeded the company's performance.
Digital was now faced with the danger of being left behind by the industry it was instrumental in creating. As it struggled with huge losses on declining sales, repeated restructurings, and the exodus of key executives who questioned Olsen's decisions, the company watched its value plummet, with shares trading at one-fourth of their 1987 high.
At the same time, Olsen's autocratic style was drawing widespread criticism. John Rose, who a month earlier had resigned as manager of Digital's PC unit, told the Journal that the company "has everything it needs to turn around—good people, good products and great service—but it won't happen while he's still in charge." And one of Digital's former computer designers called Olsen the Fidel Castro of the computer industry: "out of touch, and anyone who disagrees with him is sent into exile."
One who had fallen into disfavor amid the recent turmoil was Digital's chief engineer, William Strecker, who had opposed a mainframe project that Olsen continued to back even as it proved a costly failure. The disbanding of Strecker's group was viewed as an especially strong signal of disarray in the executive suite. A former Digital manager told the Journal that it was a "criminal shame," because Strecker was the only member of the inner circle who could develop a coherent product strategy.
The Journal suggested that Olsen's support of the ill-fated VAX 9000 mainframe, which cost $1 billion to bring to market but attracted few buyers, was partly responsible for his failure to work out a deal with Apple. Roger Heinen, an Apple senior vice president who was privy to the meeting, blamed the stalemate on Olsen's indifference to the personal computer industry and his failure to grasp its importance. The Journal concluded that Olsen's vision of the computer industry was lacking and that his choices were leaving the company at a disadvantage in a rapidly transforming market.2
Just two months later, in July 1992, Digital announced that Olsen would retire as president and CEO, effective October 1. Olsen quickly followed with his own announcement that he would also vacate his seat on the board at that time, thus severing all formal ties to the company he had led since its inception. His resignation would also give a free hand to his successor, Robert Palmer, who faced the unenviable task of rescuing a company that had reported a loss of $2.79 billion in fiscal 1992.
Would the seven-year Digital veteran prove up to the challenge? He certainly seemed to be giving it his best shot. After six months on the job, Palmer had reorganized, slashed costs as well as jobs, recruited a new management team from outside, changed the color of the Digital logo, and, most radically, sold the old mill, the company's first and only home base. Palmer also announced a fundamental change in philosophy: a 19 percent spending cut on product development and engineering. No longer would Digital put competing teams to work on the same or similar problems (a practice highly praised in In Search of Excellence). "We have to rationalize our spending, have less redundancy in hardware and software design," Palmer told the business press.3
Early results were promising. In July 1993, the company announced quarterly earnings of $113 million. The stock price was rising back into the mid-40s. Even more important in many analysts' minds, wrote the Washington Post, was that "under Palmer the company is no longer in denial."4
Too little, too late. Ultimately, Palmer couldn't stop the bleeding. In January 1998, the crippled giant was acquired by Compaq—ironically, the world's largest maker of PCs—for $9.15 billion. The great Digital was dead.
All the postmortems agreed that, in the last analysis, the visionary's vision had failed: the company blinked and missed the PC revolution; blinked again and missed the change to open, rather than proprietary, systems; and, in classic denial, continued through the early '90s to pour money into developing a new mainframe.
As C. Gordon Bell, one of the chief engineers in Digital's early days, told the Boston Globe, the company's success bred its failure. "The VAX [minicomputer] took over the company, and what it allowed them to do was not think. No one had to think from 1981 until 1987 or '88 because the VAX was so dominant."5