
Unplanned Security: Risk Assessment

Security expert Linda McCarthy shares a case study of security for a hospital computer network and provides a short risk assessment checklist to help you determine if your organization is at risk. Whenever new systems are added, system platforms are changed, or any major organizational modifications are undertaken, you need to redo your risk assessment.
This chapter is from the book

Imagine for just a moment that it's 6:30 a.m. and you're a patient in a hospital waiting for surgery. It's a routine operation to remove your gall bladder (one of those throwaway parts), and no big deal. What you don't know, however, is that the hospital's computer network was recently redesigned. The support staff moved all of the critical applications from the mainframe to a distributed network environment (right-sizing it). In the rush to move from one platform to another, management never developed security policies and procedures for the new systems. So the hospital support staff never configured security. On the surface, the right-sized network is running smoothly. Underneath, however, anyone on the hospital network can steal, modify, or destroy patient information on the servers.

Yesterday, when you were admitted to the hospital, you had some pre-op testing done to make sure that you don't have an infection. They did blood work and a chest X-ray—the standard pre-op stuff. You wake up early the next day, 4:00 a.m., and your surgery isn't for several hours. You wake because you're a little nervous about getting that gall bladder removed. After considering the problems it was giving you, you decide you'll be better off without it. Feeling calm, you fall back to sleep and have a few pleasant dreams.

Six a.m. rolls around. The doctor calls down from the operating room. He tells the nurse that he wants the results of your pre-op tests sent with you to the operating room. Since the results haven't come back to the floor yet, the nurse logs into the computer to get your results. They're normal. Or, at least they are now.

What your nurse doesn't know is that a hacker broke into the server and changed your test results from abnormal to normal. Before the information was modified, the results of your chest X-ray noted a questionable shadow—maybe just congestion, or maybe pneumonia. Results that would tell your doctor to postpone the surgery to avoid possible complications that could lead to respiratory failure.

Since your doctor doesn't get those results, he operates anyway. Your gall bladder takes the route your tonsils took many years ago. It appears to have been a successful operation. That is, until the anesthesiologist notifies your surgeon that he can't seem to get you off the respirator. He orders a repeat chest X-ray, which shows a dense pneumonia. He then requests your pre-op X-ray, which shows a smaller shadow in the same area. He calls your surgeon wanting to know why he performed elective surgery on a patient with preexisting pneumonia. Your doctor can't be reached because he is busy filling out your death certificate. Guess what? Your lungs gave out—you're dead.

This is one case where the safety of the data means more than protecting information—it means protecting lives. Pretty scary when you consider just how much real hospitals rely on their computers.

Transition Plan

Like many other institutions in its league, Rockland General decided to advance its computer operations by right-sizing its network. The plan was fairly straightforward. Roll the legacy systems (mainframes) out the door; roll in advanced architecture to move Rockland into the 21st Century. In preparation, Rockland's staff designed and installed a high-performance distributed network. Then, as planned, they rolled out the mainframes.

Rockland's executive staff saw the right-sizing effort as a huge success. The MIS managers who spearheaded the effort, Joe Davis and Marlene Schmidt, were promoted and glorified throughout the medical profession. Other hospitals sought their advice for similar projects. Joe and Marlene were so successful that they founded their own company, assisting hospitals worldwide with right-sizing needs.

When Matt Borland took over the new systems, he soon found that all was not as great as it appeared to be. Since Joe and Marlene left the hospital as heroes, it was only a matter of time before Matt took the heat for the mess they left behind.

In a previous life, Matt had been a system administrator. So he understood what it took to keep systems up and running. He felt lucky to take over the computer operations for Rockland General. He was advancing in his career, now a third-level manager, and this was an opportunity for him to advance another rung on the corporate ladder. Joe and Marlene's right-sizing efforts had enshrined them as heroes with prior management, and Matt knew that hard work would pay off for him, too.

What Matt soon discovered was that Joe and Marlene painted a pretty picture for the world. What they left behind, however, was a high-risk computer room with tons of proprietary information available to anyone on the hospital network to copy, modify, or steal. Since Matt had thought that he was taking over a top-notch system, he wasn't happy at this discovery.

If you've recently taken over the support responsibilities for new systems, maybe you should take a closer look at Matt's predicament. Do you know where the high-risk systems are on your network? Did your predecessors leave you a security nightmare that you don't know how to deal with?

In the introduction to this chapter, I got a bit melodramatic with the death by poor system installation. Unfortunately, that really could happen.

Every day, managers, CIOs, and system administrators take over systems that were installed by other people. Unless they actually audit those systems (something done only rarely), they simply assume that the systems are safe. That's a risky assumption. It's difficult to place the blame on previous management and support personnel after you have owned systems for six months or more. When you take over new systems, immediately test those systems to find out where you stand. Regardless of what Moe, Larry, or Curly might have done in the past, at the end of the day, it's now your system and your job on the line. If you're a manager, you are responsible for the reliability and integrity of the data. If you're a system administrator, accusing fingers will always point at you. After all, don't you support the systems?

Matt took over management of operations, but he was so busy making sure that the systems were available (because that was his #1 goal), that he never considered security to be a problem. Lucky for him, an unscheduled audit of the computer room uncovered the risks. If that hadn't happened, he might have lost his job, or at least his good reputation. And who knows, someone could have lost their life.

It was strictly by chance that Rockland General's management got a view of the real picture. Unless you have a good view of the security state of your network, you might not be so lucky. So let's take a closer look at the details of this audit.

Day 1: Testing Security

When Matt took over management of the hospital's computer operations, he started to look at ways to improve network performance and support. It was very important for him to keep the system up and running. In fact, that was one of Matt's goals—system availability.

System availability is an important goal. If you can't access patient information because the systems are down all the time, you have a problem (and a highly visible one at that). Matt made sure that he had the right tools to report network availability. The operations crew ran daily and monthly reports on the availability of the systems. One system administrator was even responsible for sending those daily and monthly reports on to Matt. Matt took a keen interest in the performance and availability of the network.

At about the same time, the hospital's auditors decided to conduct an unscheduled security audit. As part of their approach, no one from computer operations was notified of the audit—not even Matt.

It's funny how you can work at a company for ten years and not even know who the internal auditors are. They really do exist. And, they can show up at the drop of a hat. That is, if they smell risk in your area. Auditors are a different breed altogether. They look for risk, report the bad stuff, and try to reduce the risk.

Maria Plank, Rockland General's audit manager, hired me to conduct an audit on their computer room. Like most hospital auditors, Maria didn't understand the systems side of the house. That wasn't a problem for her. She could smell risk a mile away. It was in her blood. She heard some rumblings at a high level about risk in the computer room and decided to hire someone to conduct the audit for her. She didn't need to figure out what (if anything) was at risk: all she needed were the results from the expert. That's where I stepped into the picture.

Understanding Risk

When I spoke with Maria, she didn't give me much information. She just said she suspected risk. I asked her for a network map of the computer room and a list of the suspected high-risk systems.

Maria set up a visitor's office for me with a phone and system. She also gave me an account on the system—just in case I wanted to write my report there. That was nice. I took a look at the network map. Wow, they had a ton of database servers. That wasn't surprising, since that's what happens after a major right-sizing effort. What was surprising was that none of those servers were assigned risk classifications. That is, none were marked critical, mission critical, or noncritical.

Since Maria had no idea which servers were high risk, I needed to discover that myself. There are two ways to get that information. You can log into the servers and look around to see what they are storing. (This method takes a lot of time when you have a lot of servers.) Or, you can ask the system administrator. I couldn't do that because he wasn't supposed to be aware of the audit. Back to approach one. At this point, I was playing a game. The goal of that game was to uncover as much data and risk as possible before being detected.
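The first approach—logging in and looking at what each server stores—can be partly automated. Below is a minimal sketch of that idea: scan a sample of a server's data for patterns that suggest sensitive content. The patterns, labels, and sample text are all invented for illustration; a real audit would use a far richer taxonomy.

```python
import re

# Hypothetical patterns that suggest sensitive content; a real audit
# would cover PHI, financial data, credentials, and much more.
SENSITIVE_PATTERNS = {
    "patient_record": re.compile(r"\b(diagnosis|blood type|medical record)\b", re.I),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(sample_text):
    """Return sorted risk labels whose pattern appears in the sample."""
    return sorted(
        label for label, pattern in SENSITIVE_PATTERNS.items()
        if pattern.search(sample_text)
    )

sample = "Patient: J. Doe  SSN 123-45-6789  Diagnosis: pneumonia"
print(classify(sample))  # ['patient_record', 'ssn']
```

A scan like this only narrows the search; a human still has to confirm what the flagged systems hold and assign the critical/mission-critical/noncritical classification.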

Phase One: Physical Security

To start the game, I needed to put on a suit and dress the part. After all, my first goal was to get into the computer room without authorization. When I put on a suit, Bingo! I look like I belong.

Maria offered to sign me into the computer room, but I turned her down. An important part of my audit would be seeing whether I could let myself in without attracting suspicion. That's why I wore the suit.

Phase Two: Getting Past Physical Controls

I asked Maria to wait for me in my office and told her I'd call if I couldn't get in. It was easy to tell that she liked my approach. (I actually think that she wanted to go with me just to see me get away with it. But she knew that it wouldn't work if she were there.) If it did work, that would say a lot about security right off the bat. Surely a setup that gave virtually anyone access to the main computer room without authorization would indicate very high risk. If I succeeded, Maria would already have her money's worth. Any other risks I found would be icing on the cake.

With that in mind, I made my way down to the basement computer room. On paper, you had to have a badge with access enabled to get in. I picked up the phone by the entrance and waited for a guy standing in the computer room to answer.

When he answered, I informed him that I'd been sent by internal auditing to check on some systems. He immediately opened the first door and welcomed me. There were actually two sets of doors to get into the computer room, meaning, two levels of security. As I passed through the second set of doors, it occurred to me that neither level was being very effective. There were several main consoles to the left where my greeter seemed to be working. After introducing himself, he walked back to his phone to continue the conversation he'd been having when I knocked. Mission accomplished. He was distracted. I looked official. I was in.

The servers were lined up in tidy little rows and the place looked very clean. Not one piece of paper was left out. Nothing was left on the printers, either. Even the floors were spotless. There weren't even any cables hanging from the ceiling; every cable must have been hidden under the floors. You could tell that these guys put a lot of work into making this computer room look good.

I wandered up and down the rows looking for a monitor that someone might have forgotten to log off. No luck. I'd have to get onto the network another way. I thanked the guy who'd let me in, flashed a smile, and walked out.

Even though I couldn't easily access information once I was in the computer room, I could have left a bomb and destroyed their entire operation. You never know. That's why good physical security is necessary.

Although their physical security left a lot to be desired, Rockland General obviously kept a very clean computer room. Or, at least it was clean on the day I showed up. A week later, I might have found it littered with patient files. But for today, they were running 50/50 on my tests.

Phase Three: Unauthorized Access

Walking back to my office, I wondered how I could access those systems in the computer room. When I got back to my office, I logged in. I glanced at the network map to see if I could identify a system that might contain some juicy data.

Sometimes, people give their systems obvious names (like payroll) so that you know what data's on them before you even log in. Not here. The systems were all named with a letter/number combination (PR1, PR2, etc.). No clues.

I started probing a random system for information. Wouldn't you know it? I was able to access the system. I pulled out a travel floppy from my briefcase and loaded some of my favorite tools onto the system. Tools make life easy. A few good tools can make all the difference in the world. My plan was to try to get into a system as a regular user, break root, and take over the system.

I began by testing to see if any of the systems in the computer room trusted me. In this case, trusted means that those systems were set up to trust my system. A trusted system allows you easy access without a password. (Trust relationships on networks can be dangerous, because if a hacker breaks into one system and 50 other systems trust that system, the hacker can then log into those 50 systems without a password.) When the script was done running, the results showed that I was trusted by ten systems. Not too bad for me. But definitely bad for the hospital, bad for the data, and bad for the patients.
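The chapter doesn't name the exact trust mechanism, but the classic Unix version is rhosts-style trust: entries in /etc/hosts.equiv or a user's ~/.rhosts grant the listed hosts password-less access. A minimal sketch of how an auditor might parse such a file (the file contents and hostnames here are invented; exact syntax varies by implementation):

```python
def parse_trust_file(text):
    """Return host entries granted password-less access (rhosts-style syntax)."""
    trusted = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        host = line.split()[0]          # first field is the host entry
        if host == "+":
            trusted.append("ANY HOST")  # a bare '+' trusts every host
        elif not host.startswith("-"):  # '-host' entries explicitly deny
            trusted.append(host)
    return trusted

hosts_equiv = """\
# trusted hosts for PR1
pr2.hospital.example
pr3.hospital.example  operator
"""
print(parse_trust_file(hosts_equiv))  # ['pr2.hospital.example', 'pr3.hospital.example']
```

Every hostname this returns is a system whose compromise silently compromises this one—which is exactly why chained trust is so dangerous.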

I was in. Once I have a login to a machine, I have a pretty good chance of obtaining data. Just as I logged into the first machine, PR1, Maria strolled into my office. I told her that I was able to get into the computer room without any problems, but hadn't been able to access any data from there. She was incredulous that just anyone could walk in. That's when I explained to her my reasons for wearing the suit.

Moving on, I updated Maria on my successful approach to entering a system in the computer room. I let her know that I'd spend the rest of the day gathering data. I asked her to set up meetings with the lead system administrator and operations manager for the next day. She agreed, looked pleased, and walked away.

It was only a matter of time, about a minute, before I had full control of the system. If you're familiar with break-in techniques, that probably seems pretty long. In any case, the ten systems I had easy access to were running an old version of the operating system. (Older operating system versions can leave a system vulnerable because they most likely contain old security bugs that hackers can easily exploit.) It looked like those systems were running applications that hadn't been ported to the new version of the operating system. At least, that was my guess.
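Spotting out-of-date systems is, at bottom, a version comparison against the oldest release you still consider safe. A sketch of that check (the version numbers, host names, and cutoff are invented for illustration):

```python
def is_outdated(version, minimum):
    """Compare dotted version strings numerically, so '2.5.1' < '2.10'."""
    to_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return to_tuple(version) < to_tuple(minimum)

# Hypothetical inventory: host name -> installed OS version.
inventory = {"PR1": "2.5.1", "PR2": "2.5.1", "DB4": "2.10"}
MINIMUM_SAFE = "2.6"

outdated = [host for host, ver in sorted(inventory.items())
            if is_outdated(ver, MINIMUM_SAFE)]
print(outdated)  # ['PR1', 'PR2']
```

Note the numeric comparison: a naive string compare would wrongly rank "2.10" below "2.6", which is a common bug in homegrown version checks.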

I filed that thought and began looking for access to the rest of the systems. You know, it's amazing how quickly time flies when you're having fun. Before I knew it, it was approaching 5:00 p.m. At any time, I expected Maria to stop by and walk me out. I was ready to leave. Tallying up my successes over the day, I'd been able to gain access and obtain full control of 60 servers. It seemed that Rockland's system administrators had installed their systems right out-of-the-box without configuring security or adding patches. They also made a lot of my work easy by configuring a good number of the servers to trust each other.

As a potential patient, I was beginning to find the situation scary. All of the critical systems were accessible and, so far as I could tell, there weren't any audit trails. (An audit trail records which users took which actions, and when.) An experienced hacker could run rampant and leave without ever being detected. After all, I'd personally cracked 60 servers today and no one seemed the wiser.
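At its simplest, an audit trail is an append-only log of who did what, when. The sketch below shows the idea; the field names, usernames, and paths are illustrative, and a production system would use the operating system's native audit facility rather than a hand-rolled log.

```python
import io
import json
from datetime import datetime, timezone

def log_event(stream, user, action, target):
    """Append one audit record as a JSON line: who, what, on which object, when."""
    record = {
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,
        "target": target,
    }
    stream.write(json.dumps(record) + "\n")

# In-memory stream stands in for an append-only log file.
trail = io.StringIO()
log_event(trail, "jrosenberg", "modify", "PR1:/records/patient042")
log_event(trail, "unknown", "read", "PR3:/records/patient107")

for line in trail.getvalue().splitlines():
    event = json.loads(line)
    print(event["user"], event["action"], event["target"])
```

Even a trail this simple would have made the 60-server walkthrough visible to anyone who bothered to look.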

Maria showed up at 5:15 p.m. I didn't give her the full story yet. I let her know that I was able to get into some systems but was still gathering data. Sometimes it's best to get to the end of an audit before you share the information. I also hate to pass on bits and pieces before I have all the facts.

Maria let me know that she'd set up interviews for the next morning. I would be meeting with the operations manager, Matt Borland, and the system administrator, Jill Rosenberg. Since Maria was so prompt in her scheduling, I would need to finish my testing after the interviews.

Day 2: Personal Information at Risk

I met with Matt first. He seemed like a nice guy, but clearly interested in upward mobility. Sometimes you meet people in business and just know that's their agenda. Of course, that was his agenda, not mine. My agenda was understanding the risks and gathering data. Matt didn't have much information for me. He had other managers reporting to him, but I didn't want to waste my time talking to them. I finished the small talk with Matt and decided to move on.

At this point, Matt passed me to Jill, the system administrator supporting the systems. Jill was calm, but hardly thrilled at being interviewed and audited. (I can't say that I blame her, but someone has to do it!)

I began the interview by requesting the policies and procedures. She had them ready for me. For the most part, the documentation looked good. However, the security section was very short (almost nonexistent). Jill explained that they were currently working on that part.

I was curious how they'd configured the systems for security during right-sizing without writing the procedures first. They hadn't. Management knew that there was no security, but the schedule was tight, so they decided to address security issues later. As a result, Rockland had been operating the new network for over a year without security.

In addition to not securing the systems, Rockland's staff had never classified them. My next job was to grill Jill on system contents. Getting those facts from her would allow me more time for testing, information gathering, and report writing. I needed to know which systems contained the most critical information, what kind of information that was, and why she considered it critical. That information would let me target the most important systems for my audit.

Jill knew where some of the critical data was, but it hadn't occurred to her to add a higher level of security to those systems. From Jill's data, however, I now understood where the juicy stuff was.

In my career, I have often seen auditors ask the staff which systems they want the audit tools run on. Sometimes the staff answers honestly. Sometimes they don't. Even without direct subversion, however, support staff don't always understand the high-risk areas on their own networks. So, even if someone tells you that DS19 is the highest-risk system, you still need to verify that information.

Jill pointed out that the patient records were on PR1 through PR10. Aha! Now I knew what the PR was for—Patient Records. I guess I just hadn't given it enough thought before. Anyway, those systems would be considered mission critical and should have security controls in place. As I mentioned earlier, though, those had been the first systems I'd broken into.

Jill's information was right on the mark. I verified that by checking the operating systems and system types, and by examining the data that the systems were holding. Having verified that step, I decided that I had enough data to write a report. Of course, there were a lot of security problems. But the top problems on my list were:

  • No one had ever completed a risk assessment.

  • The policies and procedures were incomplete.

  • Systems containing highly sensitive information had been installed right out-of-the-box.

  • Data could be easily modified, stolen, or destroyed without a trace.

Obviously, no one had paid enough (or any?) attention to the risks of change, destruction, or theft of data when the data was moved from the mainframe to a server environment. As a result, all of the patient records were at risk.
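The findings above boil down to a short checklist that any team inheriting systems can run through. A sketch of scoring such a checklist, with answers filled in as Rockland's would have been; the questions are a distillation of this audit, not a formal standard:

```python
# Each entry: (question, answered_yes). A 'no' answer indicates risk.
checklist = [
    ("Has a risk assessment been completed?", False),
    ("Are security policies and procedures documented?", False),
    ("Were systems hardened beyond out-of-the-box defaults?", False),
    ("Are audit trails enabled on critical systems?", False),
    ("Are systems classified by criticality?", False),
]

failures = [question for question, answered_yes in checklist if not answered_yes]
print(f"{len(failures)} of {len(checklist)} controls missing:")
for question in failures:
    print(" -", question)
```

Rerun the checklist whenever systems are added or platforms change, as the chapter's opening summary advises.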

Summary: Plan Right-Sizing Carefully

Moving systems from one platform to another is not an easy task. Before right-sizing a computing environment or moving systems to a new platform, a risk assessment needs to be done. In addition, new policies and procedures must be developed to reflect the new environment.

At the same time, system administrators need to be trained on how to provide security within the new system.

The old management in this scenario really played their cards right. With great pizzazz, Joe and Marlene built the system, collected the applause, and moved on. Unfortunately, the new network they designed contained some pretty major security problems.

In real life, the cards that Joe and Marlene played are not that unusual. In network design, security often drags down the schedule and blows up the budget. Even worse, it doesn't win the type of attention that precedes big promotions. After all, management doesn't really want to be reminded of what can happen when security fails.
