Get real, quantifiable value from your IT investments!
This year, more than $1,000,000,000,000—one trillion dollars—will be spent on information technology. Yet few organizations are confident of the value of these enormous investments. It doesn't have to be that way. Best Practices in Information Technology offers a comprehensive approach to finding and applying IT solutions that work—not just now, but for years to come.
In this concise, actionable book, IBM management expert James W. Cortada shows how to implement IT best practices as a long-term strategy, not a one-time fix.
Cortada reviews every major activity that IT organizations perform, showing how to continuously find new ways to maximize cost-effectiveness and customer value. He focuses on the areas where best practices deliver the most powerful payoff, including reducing labor costs, shortening cycle times, and enhancing service.
The competitive frenzy unleashed a decade ago shows no signs of abating: companies must constantly improve, or they will die. Best Practices in Information Technology shows you the secrets of building an IT organization that will keep improving—today, tomorrow, and well into the 21st century.
There are two things we can all agree on when it comes to computers: We spend a lot of money on them and they are too hard to use. Many people would add a third: “I am not sure I am getting real value for my information technology investment.” The third point is keeping today's Chief Information Officers (CIOs) and Chief Executive Officers (CEOs) awake at night. But let's begin with several facts.
First, worldwide, over $400 billion was spent on hardware and software during the mid-1990s, and another $600–700 billion on using them. Most medium-size to large organizations in the industrialized world today spend between 1.5 and 5 percent of their total budgets on Information Technology (I/T). PCs alone sell by the tens of millions each year; in fact, in some countries, such as the United States, sales are approaching or exceeding the number of television sets purchased. This is big business.
Second, the toughest barrier to exploiting computers to their full advantage is their relatively poor usability. They are still, after 50 years, difficult to install, maintain, and, most important, to use. I doubt you can find a single executive or PC user who doesn't have a complaint about the lack of ease of use of these technologies. And yet they have become so important to our businesses that we cannot afford to let cost or usability get in the way of exploiting their power. So we go to extraordinary lengths to overcome problems. Companies support expensive Help Desk functions and 800 hot lines, and assign armies of computer experts to departments to help people connect and use their PCs. Many PC users buy their own books in an attempt to use business assets more effectively. Every major PC manual—and there are thousands of them—sells in the millions and in dozens of languages. The bottom line is that we really want to use computers effectively.
Third, I, along with thousands of other “computer experts,” spend a great deal of time explaining to senior business management how best to use this technology. Indeed, enormous sums are spent each year by corporations such as IBM and Microsoft, by major universities such as Harvard, MIT, and Caltech, and by effective computer users like GE, Westinghouse, and Ford Motors to figure out how to use this expensive technology properly. Whole bodies of knowledge have emerged to help in the search for easier and more effective ways to use computers. The strategists have taught us the importance of aligning technologies with business plans; the technologists have figured out how to give us computers for centralized, decentralized, distributed, and network-centric computing. The quality experts have taught us the value of process management and of sensible measures applied in practical ways. In other words, we can align our strategies, technologies, and processes to just about any size, flavor, cost, or shape that we want. Whole countries are betting big chunks of their economic futures on getting it right: India with programmers, East Asia with its hardware manufacturing, and the United States with software and basic R&D.
This is all well and good, but for those of us who have to run businesses, solve problems, get basic information to make decisions, and exploit technology before our competitors do, there is no time to become an expert in the one area we spend so much money on and yet feel so insecure about. For both this community and the business-focused managers who are increasingly coming to dominate information technology operations, a new strategy is emerging, born out of the quality management movement and the popular application of benchmarking techniques. Simply put, the new strategy is the application of “best practices.” It has become quite fashionable to copy what someone else has figured out, modifying it to fit one's own operations. It turns out that across many organizations, functions, topics, and issues, finding out what others are doing well and attempting to replicate their successes—best practices—is emerging as a practical strategy for improving overall organizational performance in the immediate future.
This book is about best practices in I/T. It will not give you the answer, but it will show you how to arrive at it, because the answer will keep changing, and I would like you not to lose sleep at night over that fact. Constantly applying best practices makes it possible for you to discover, and then achieve, the ways to get the most value out of your investment in computing. That is the long and the short of why I wrote this book.
This book follows a long string of volumes I have written over the past two decades on how to improve the productivity and effectiveness of computing. More specifically, it is a companion to a book I published in 1995: TQM for Information Systems Management: Quality Practices for Continuous Improvement. In that book I documented many wonderful management practices that were seeping into the world of computers, and specifically into the information technology organizations of large corporations. My original intent was to say, “Look, the quality management folks have clearly demonstrated important ways to improve the value delivered by I/T organizations.” I accomplished that objective to the extent that readers could walk away from the book knowing what to change and with a sense of what other people were doing. I did not want to write another treatise on how to do process reengineering or change corporate culture through seminars, posters, and empowered teams. It was enough to say that those things were going on, why, and what benefits people were enjoying.
But what became obvious to me after completing that book was that the quality management practitioners were leading with a powerful strategy—best practices. They would benchmark what others were doing, conduct literature surveys, interview people, bring home their findings, and look for ways to duplicate them. By the way, this was also going on in sales, marketing, manufacturing, finance, personnel management, and accounting. These various activities have been merging into a best practices strategy, and it is now rearing its head in computing. The TQM book had the usual collection of war stories about who was cleverly doing what in computing. What was needed next, however, was a more explicit statement of best practice strategies in I/T, a better sense of what constitutes best practices in the key areas of concern to management, and, finally, how to apply those strategies on an ongoing basis, since the examples in the chapters that follow will, in time, be yesterday's news. So what is needed is an explanation of best practices, along with insight into the patterns of best practices to look for in I/T. Providing some easy-to-use, nontechnical methodologies and tools for applying best practices in I/T also appeared to be of immediate importance.
Let's discuss what a best practices strategy is not. It is not a one-time event in which you go out and find how someone is doing something very well, drop it into your organization, and then declare victory. It is not a substitute for doing the hard thinking about where your business is going and why (visioning). It is not a substitute for continuous improvement. What it is, however, is a strategy for finding and applying the best thinking and experience independent of your own and that of your company. Well executed, best practices in I/T and in every other functional area cause you to keep up continuously with what others are doing well, to learn from their mistakes, and to gain confidence and a path to practices of your own that are better than anyone else's. Those who implement best practices worst use them merely as a rearview mirror on what to do. Those who use them best look over the bow of their ship to what is ahead. I cannot emphasize this point enough, because too many people fail to realize that the effective use of best practices lies in getting you to a novel future.
The foundation for this book is built on three sources of information. First, there is my own personal work as manager, consultant, and user of computing for a quarter of a century. During that time, I have studied, implemented, used, and written about the management of computing, learning a few tricks along the way. Second, colleagues at IBM have studied problems associated with the management of information technology for over a half century; their reservoir of studies and insights on what works well is profound and remains a highly underutilized body of knowledge in the industrialized world. I have tapped into those pools of best practices to buttress the book, particularly in support of comments I make about general trends. Be assured that those comments are based solidly on surveys of many I/T organizations and practices. Third, secondary research on such topics as strategic alignment, operational practices, and measurements has been made possible by the enormous growth in solid research and experimentation in the management of computing over the past decade.
Like all my other books, this one will be light on war stories and heavily focused on management practices. It succeeds or fails to the extent that it makes sense and is actionable. I have written it for management in general, not targeting it at a technical audience. It is a companion to my TQM for Information Systems Management but also can stand on its own. I have purposefully kept the book short—a difficult thing to do since it is far easier to write a 400- to 600-page volume that explains everything in full detail. I kept it short so you would focus only on the most important issues—a key element of a best practices strategy—and not be caught up in the vast quantity of narrowly focused discussions about specific machines, software, application development strategies, and service delivery approaches. The literature on each is vast and has been growing steadily over the past five decades. Yes, the computer has been around for over a half century; it is about time to find out what really works well!
Different audiences can read the book in several ways. While I wrote the book so any business person could read it cover to cover, you have options. The first four chapters are intended for all audiences, particularly management in functional areas trying to figure out what information technology can do, and I/T management trying to learn what they should do. The same applies to Chapter 10 and, to a lesser degree, Chapter 9. Chapters 4 through 8 are clearly targeted at the tactical issues related to the management of information technology. At a minimum they should be read by I/T professionals; however, there is material in each chapter for all managers and professionals. For example, Chapter 5, on legacy systems, teaches senior management what value older systems have, but teaches the I/T professional what to do with them. The same chapter discusses I/T architectures, but points out for senior management why the discussion is not arcane but rather crucial to any strategy they might have to sign up for and spend millions of dollars on. Throughout the book, there are various discussions about the value of I/T, which should always serve as the underlying basis for all important conversations about the role and management of computers, regardless of where one sits in the organization. Each chapter has boxed inserts that tell stories of specific company experiences. I also end each chapter with a box entitled “Implementing Best Practices Now,” in which I list several steps you can take right now to get started implementing the ideas found in the chapter. The steps are those normally taken by well-managed companies.
I could not have written this book without the help of so many experts on the wise use of computer technology. Colleagues within IBM and across the information processing industry have taught me a great deal, and many contributed advice and information for this book. However, I would like to call out the help of several individuals in particular. Mary T. Curnane of the IBM Consulting Group is one of those individuals who, I am convinced, can create successful I/T strategies in her sleep. She went through this manuscript with a fine-tooth comb, making many suggestions for improvement. Tom Jenks, also of the IBM Consulting Group, taught me a great deal about measurements and management systems, critical elements in this book. Mike Albrecht, general manager of consulting services for IBM in North America, went through the manuscript line by line, making many suggestions for improvement based both on his experience in consulting and as an I/T executive. I also want to thank him publicly for contributing the foreword to this book. John W. Dunn, group vice president and chief technology officer at Northern Indiana Public Service Company (NIPSCO), demonstrated how to build massive systems that contribute directly to the bottom line, while the I/T organization at Delmarva Power could write a book on how to make computing end-user focused!
This project renews a business relationship with the first editor to publish one of my management books—Paul Becker—at Prentice Hall. I thought he was excellent 15 years ago, but now he is even better. It was wonderful working with him again. I also want to thank Eileen Clark for shepherding the project from manuscript to book and Martha Williams for copyediting my original manuscript. Gene Barone, an I/T support specialist at IBM, prepared all the artwork, taking my hand-drawn squiggles and turning them into important messages. Finally, I want to thank my family for making it possible for me to find the time to write this book.
This book has benefited from the help of my many friends across the world of I/T, and from colleagues within IBM. However, it is the product of my thinking and is not a statement of how IBM or my colleagues and friends necessarily feel about computing. If you like the book, thank you; if you find errors of fact, judgment, or effectiveness, that is my fault, but let me know so I can improve my practices. Write to me c/o my publisher, Prentice-Hall, Inc., 1 Lake Street, Upper Saddle River, NJ 07458, to share your best practices experiences, because we are only just starting to understand the power of this strategy. I will find ways to share your good stories with other readers.
—James W. Cortada