
How Legacy Architectures Can Foil the Modern Enterprise



As businesses and governments greet the new century, they will be relying on technological advancements to deliver a vast array of initiatives across a variety of industries. Institutions charting a course through an increasingly complex and unpredictable future can count on one thing—computers will be their partners in this journey. Unfortunately, many computers carry significant baggage from the past.

This baggage comes in the form of aging hardware, software, and data architectures that prevent organizations from fully exploiting computers and the value they bring to an organization, its customers, and its partners. Aging legacy architectures can stymie critical business initiatives while preventing an enterprise from responding to competitive pressures in a timely fashion.

Legacy architectures represent the collective set of application software, data structures, and operational platforms currently running in enterprise computing environments. How organizations deal with these aging legacy architectures will largely determine the depth and breadth of value computers will offer institutions and society in the coming century.

1.1 The Computer of the Future Meets Reality

Computers, for all intents and purposes, have been with us for more than half a century. We are living in a future that was both foreseen and unforeseen by past futurists. It is therefore worth examining where computers fit in relation to today's reality versus the hindsight of history.

The vision of modern computing was originally foretold in the science fiction films of the 1950s, 1960s, and 1970s. Computers had to be small enough to fit on a rocket ship and smart enough to communicate with people. HAL 9000, the computer in 2001: A Space Odyssey, is a good example. In that film, Arthur C. Clarke and Stanley Kubrick envisioned a very powerful computer that could recognize voices and faces, communicate fluently, and control a wide variety of functions.

Many of these predictions came true to varying degrees. Voice recognition works fairly well, and facial recognition is just now being widely deployed. Computers are smaller, can talk to us and to other computers, and automate transportation and environmental systems.

Of course, the ability for a computer to perform deductive reasoning is still primitive when compared to the human mind. Today's computers can, however, help diagnose diseases, aid research, control airplanes, guide missiles, and process vast amounts of data at increasingly phenomenal speeds, to name just a few of their capabilities.

In spite of these advances, there is another side to this story. With all of these modern wonders being supported by computers, why is it that the following situations are still commonplace?

  • Bank balances cannot be updated immediately after a deposit is entered; they must instead wait for a nightly batch processing cycle.

  • A long-distance provider's computer repeatedly instructed different service representatives to contact the same customer because several databases contained the same inaccurate information.

  • A healthcare provider's computer sent out cancellation notices when it was supposed to have sent out payment notices. This continued even after the customer reported the problem.

  • Tax payment and tracking systems take well over a year to reconcile what one corporation claims to have paid another with what the receiving corporation claims to have received.

These may seem like isolated incidents, but I personally encountered each of them in the course of doing business over the past couple of years. None of the above scenarios would make good fodder for a science fiction movie, and they are certainly not what Arthur C. Clarke envisioned for the future. They do bring home the point, however, that many of today's computers are not very smart: They have some major underlying problems.

To a highly valued customer, these scenarios may range from a minor annoyance to a major problem. The healthcare scenario in particular was the antithesis of good customer service, and a less-than-supportive response from the customer service department only compounded the problem. It also exemplified how humans can magnify computer errors and make a bad situation worse.

When contrasting the challenges above with the promises of modern technology, there is clearly a chasm between the ideal vision of the future and today's reality. The above situations are symptomatic of shortcomings found within "legacy" systems: shortcomings that could ultimately undermine major Information Technology (IT) initiatives across a variety of industries.

If, for example, a bank wanted to deploy real-time banking for customers through the Web, it seems reasonable that it would need to be able to update bank balances in real time. Likewise, if a long-distance provider wanted to break into local markets or launch new business initiatives that leverage customer data on a global scale, it would need to store and update customer information in a reliable, nonredundant environment.
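The nightly-batch limitation in the bank example comes down to a basic design choice in how a deposit is posted. A minimal sketch, in Python with hypothetical class names (not drawn from any actual banking system), of the difference between batch-posted and real-time balances:

```python
class BatchAccount:
    """Legacy pattern: deposits queue up and post only in a nightly run."""
    def __init__(self, balance=0):
        self.balance = balance
        self.pending = []  # deposits wait here until the batch cycle

    def deposit(self, amount):
        self.pending.append(amount)  # the customer sees no change yet

    def nightly_batch(self):
        # All queued deposits post at once, hours after they were made
        self.balance += sum(self.pending)
        self.pending.clear()


class RealTimeAccount:
    """What Web banking requires: the balance reflects each deposit at once."""
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount  # posted immediately


legacy = BatchAccount()
legacy.deposit(100)
print(legacy.balance)   # still 0 until the batch runs
legacy.nightly_batch()
print(legacy.balance)   # 100

modern = RealTimeAccount()
modern.deposit(100)
print(modern.balance)   # 100 immediately
```

The real difficulty, of course, is not the posting logic itself but that legacy batch architectures were built around the assumption that a delay of hours is acceptable, and that assumption is woven through their data structures and operational schedules.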

Similarly, if a healthcare provider wanted to stay competitive, it would seem apparent that it would stop trying to cancel a customer's policy when it really wanted to collect a payment from that customer. With companies in a continuous battle to redefine themselves, shifting toward new markets and offering global services, legacy systems can stifle strategic business transformation.

We are seeing numerous advances, most accompanied by a good deal of hype, in the ability of computers to redefine our lives as we head into the 21st century. The reality of the situation, however, is that there are significant problems within legacy computing infrastructures that must be addressed before many of these visions can become reality.
