Are Private-Sector Organizations Responsible for Failing to Plan for Natural Disasters? (Part 2 of 3)
- The Contingency Planner's Job: Get Ready for Anything
- Understanding the Issues and Bridging the Vulnerability Gap
- What Does This Mean Today?
Understanding the Issues and Bridging the Vulnerability Gap
Even elite and leading-edge commercial organizations may not have taken the next logical step: performing a formal vulnerability analysis of the threats posed by natural disasters to critical infrastructure. We submit that the technology and resources are now mature enough to perform essential vulnerability analysis of the effects of natural disasters on an enterprise. Much of this technology is affordable, and the new emphasis on natural disasters and terrorism mandates its consideration. The time is now for the serious contingency planner to evaluate what can be done to reduce the organization's exposure to natural disasters.
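To make the idea of a formal vulnerability analysis concrete, here is a minimal sketch of the core arithmetic: score each critical asset against each natural-disaster threat as likelihood times impact, then rank the results. The asset names, threats, and scores below are illustrative assumptions, not data from any real assessment.

```python
# Minimal vulnerability-analysis sketch: risk = likelihood x impact,
# both on a 1-5 scale. All assets, threats, and scores here are
# hypothetical examples for illustration only.

ASSETS = {
    "primary data center":  {"hurricane": (4, 5), "earthquake": (2, 5), "flood": (3, 4)},
    "call center":          {"hurricane": (4, 3), "earthquake": (2, 3), "flood": (1, 2)},
    "branch network links": {"hurricane": (3, 4), "earthquake": (2, 4), "flood": (2, 3)},
}

def risk_scores(assets):
    """Return (asset, threat, risk) tuples sorted by descending risk."""
    rows = [(asset, threat, likelihood * impact)
            for asset, threats in assets.items()
            for threat, (likelihood, impact) in threats.items()]
    return sorted(rows, key=lambda row: row[2], reverse=True)

if __name__ == "__main__":
    # Highest-risk exposures surface first, guiding mitigation spending.
    for asset, threat, risk in risk_scores(ASSETS):
        print(f"{risk:2d}  {threat:<10s} {asset}")
```

Real assessments would replace the hand-assigned scores with data from storm modeling and historical loss records, but the ranking step is the same: it tells the planner where the vulnerability gap is widest.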
As history shows, every time a new technology or business process is introduced, that paradigm shift inevitably creates a "vulnerability gap." The vulnerability gap is the period from the introduction of the new technology until an affordable solution is devised to back it up. Remember Table 1 in part 1 of this series? Let's put the concepts described in that table in terms that most commercial organizations should remember and understand:
- 1980. Most commercial organizations are using large mainframe computers and are just beginning to realize that they've outgrown the ability to fall back to paper in the event that the mainframe fails. Organizations realize that a fire in the computer room would be very bad, and they adopt standards for fire-suppression systems and management of combustibles in a computer room. Companies realize that a catastrophic failure of recording media would be disastrous; therefore, standards are adopted and written for backup of data, etc.
Resulting technologies: Mainframe computer recovery centers, offsite data storage companies.
- 1986. Most commercial organizations have begun to connect the mainframe to branch offices. Service bureaus such as Electronic Data Systems spring up. Suddenly, dependence on the public telephone network has become acute. In fact, the reaction of the corporate contingency planner of the early 1980s to recovery planning for the phone company was very much like the reaction often received today to the notion of planning for natural disasters. Many planners considered such an effort to be beyond their scope of responsibility. After all, they paid big phone bills, so why shouldn't the telecom carrier handle its own recovery plan? But the phone company might be more concerned with restoring payphones (public access to 911 service) than with restoring the T3 line or frame relay network that supported your entire online network of multiple branches. Therefore, much of the 1980s and well into the 1990s was spent writing operating and security standards for telecommunications. Case in point: Leo's 1990 book, Disaster Recovery Planning for Telecommunications.
Resulting technologies: Telecom companies introduced technologies such as command routing, remote call forwarding, diverse cable routing, synchronous optical networking (SONET), on-demand T1 services, and so on.
- 1995. Mission-critical applications moved out of the relative safety of the mainframe and computer center to distributed local area networks. The industry responded with operating and security standards for LANs. Leo played a role in the transition by publishing Writing Disaster Recovery Plans for Telecommunications Networks and LANs.
Resulting technologies: Computer recovery companies, originally rooted in mainframes, added client/server systems and recovery facilities with more workspace for displaced staff.
- September 11, 2001. No need to elaborate on this date.
Resulting technologies: Creation of the U.S. Department of Homeland Security and other federal emergency-management agencies.
- 2001 to date. Hurricanes Katrina, Rita, Gustav, and Ike; the Indian Ocean tsunami; and other megastorms.
Resulting technologies: Proactive planning efforts geared at natural disasters, including data validation, storm modeling, and advanced predictive analysis.