
SQL Server Reference Guide


Building a SQL Server Lab

Last updated Mar 28, 2003.

We're all busy. With everything a data professional has to do, it seems that the first thing to suffer is learning and testing product features, or learning new products altogether. But I agree with Abraham Lincoln, who said "If I had six hours to chop down a tree, I'd spend the first four sharpening my ax." What he meant was that the more you prepare, the better the end product. While that sounds obvious, deadlines and time pressures often get in the way, and we put things into production before they are fully tested. But many times this lack of practice and familiarity comes back to bite us.

Perhaps you do take time to prepare, and you thoroughly test things before they go into production. If so, good for you. But if you take the time to fully test and prepare things, you may not have enough time (or a place) to "play" with new technologies.

In both cases, you need a lab. And in this tutorial I'll explain what I've done in the past to create an environment that I can use to evaluate software, and even hardware. You might wonder how this subject could command an entire article. I mean, can't you just toss some older equipment in the corner and use that to test and play? You can, and in the smaller shops where I've worked in the past, that's exactly what I've done. But I've come to believe that even in very small shops it's almost as important to have a formal testing and evaluation system as it is to have a well-designed production environment.

Before we start, I want to make sure I differentiate a lab system from a testing system.

The testing function is perhaps the most important component in a development environment. In many development shops, the most common arrangement of servers is a development system, a testing system, a staging system, and a production system. Once again I'm using the term "system" here to mean that there may be more than one server, or perhaps a group of servers and even other kinds of hardware, used to develop, test and distribute software.

The developers normally own the development system, and in some cases they are even administrators on the servers. This server is where the developers have test databases, code, and services that they control and change. Hopefully there is a central system for code check-in and check-out, along with strict version control. If the developers are depending on the development environment to provide that level of control, they will probably have a bad day at some point when it becomes corrupt. I'm beginning to see a new trend for development systems. In some shops each developer gets either a physical system or a virtual machine (VM). There's a central system where the images or backups are stored, and the machines are built from that after each major change.

The testing environment is the system where the developers ship the code from the development system that is intended for release. The code is tested by users, by a "test harness" software suite, or perhaps a little of both. Sometimes they test for performance, sometimes for features, and sometimes for both.

Some shops include a staging system where the tested code lives before it goes into production. I've seen many shops use this server to test service packs for the operating system and so on.

The production system is normally the final link in the chain, and if it is the system that holds the master for the product, then it's off-limits to developers and testers (or at least it should be). There's often a process that runs to synchronize the changes between production, test and development, so that the developers have a clean environment for the next software build.

A lab system is often a place used by a smaller number of people, sometimes even just you. One of the main functions of a lab system is to try out new things. While your boss will probably disagree, this is a vital part of your job. As technology advances, you should be aware of those advances, because they might make your job faster, and they might save the company money. To convince your boss that this is true, you'll have to show how the new technology does or does not add value. For this environment, performance isn't often a prime consideration. You're often just evaluating functionality, not how quickly something will run.

But the main reason for a lab, perhaps even more important than trying out new things, is testing in another sense. Whenever a new patch comes out for a system that you are responsible for, you had better check the patch before you install it on a production server, no matter what a vendor tells you. You not only need to test the service to make sure the base functions of the server still work, but you also need to check the functionality of every application you support. If you don't, you'll quickly get yourself into trouble.

At one of the companies I worked at, we had several hundred SQL Servers. We also had in excess of 1,500 in-house developed applications. When the developers wrote the code for those systems, they would sometimes encounter an issue with the operating system, database connection method, virus scan engine or another component that they would "code around." When a patch came out for the affected component, even if it solved a pressing issue I had, I couldn't just implement it in production. I had to test each and every application to ensure that the in-house code still worked. That took a long time and involved a lot of planning, but it was worth it. Almost every time, the developers had to issue a patch of their own to correspond to the patch from the operating system. If I had implemented the vendor's patch without testing, I might have brought down production, and possibly lost data for the company. That would have been unprofessional, since the company entrusted me to ensure it didn't happen.

It also had an interesting side effect. Because I had a lab where the patches were tested, the developers had a chance to evaluate it themselves. They needed that, because for them, the development system was production — without it, they couldn't do their job. So they appreciated another environment to test in.

With all this in mind, how should you design a lab system?

The first thing you need to decide is the lab's purpose. If the lab only needs to test the base platform, such as the operating system and the database server, you'll have fewer requirements than if you also need to check the applications. In that case you'll need to develop a simple "test harness" that checks base functionality after a patch. I've done this in shops where various departments have their own applications and their own teams responsible for them; my job is to verify that the base operating system and SQL Server still function.

In this case I set up a series of tests using scripts that add and remove users from the operating system, and then run SQL Server commands using osql that connect to the server, add the Windows users to a database, run some queries, back up the database, and then drop the users and the backup. I do all this with logging and checks along the way, looping the tests a few thousand times. Whenever a patch comes out, I log on to the lab system from a client and perform these tests, check the logs, and report the results to the various teams. This takes less than a couple of hours, and has saved my bacon more than once.
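
Here's a minimal sketch of what the SQL Server portion of such a script can look like. It's meant to be driven by osql (or sqlcmd) from a wrapper batch file that also handles the operating-system user creation and deletes the backup file afterward. The server, database, login and path names below are placeholders, and the CREATE LOGIN/CREATE USER syntax assumes SQL Server 2005 or later (older versions use sp_grantlogin and sp_grantdbaccess instead).

    -- smoke_test.sql: run from a wrapper script, for example:
    --   osql -E -S LABSERVER -i smoke_test.sql -o smoke_test.log
    -- LABSERVER, LabDB, TestUser and the backup path are placeholders.

    USE master;
    GO
    -- 1. Add a Windows login and map it to a user in the lab database
    CREATE LOGIN [LABSERVER\TestUser] FROM WINDOWS;
    GO
    USE LabDB;
    GO
    CREATE USER [LABSERVER\TestUser] FOR LOGIN [LABSERVER\TestUser];
    GO

    -- 2. Run a few representative queries so the log shows real results
    SELECT @@VERSION AS ServerVersion, GETDATE() AS TestRunTime;
    SELECT COUNT(*) AS RowsFound FROM dbo.SomeTable;  -- replace with your own checks
    GO

    -- 3. Back up the database to confirm the backup path and service still work
    BACKUP DATABASE LabDB TO DISK = N'C:\LabBackups\LabDB_smoke.bak' WITH INIT;
    GO

    -- 4. Clean up: drop the user and the login (the wrapper deletes the backup file)
    DROP USER [LABSERVER\TestUser];
    GO
    USE master;
    GO
    DROP LOGIN [LABSERVER\TestUser];
    GO

Loop a script like this a few thousand times from the wrapper, checking the output log for errors on each pass, and you have the core of the base-platform check described above.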

In the case where our group controls the applications as well, things get a little more complicated. In essence I'm performing the same kinds of tests as on the base system, but because the applications are involved I have to perform more of them. If you have the same requirements, then you need to use one of the three methods I mentioned earlier. The approach with the least impact on the developers is to install the patch on the lab systems and then have a set of users key in a number of actual application processes. While this is easier on the developers, it's terribly disruptive to the business and takes far too long. Not only that, if the application is complicated it's really not a very good test; the users simply can't exercise enough functionality in a reasonable period of time.

The better approach is to write a set of scripts or stored procedures that mimics the users' activity. These automated scripts or programs should use a known set of inputs so that the outputs can be checked. That way the entire testing process can be automated. Ideally, you would write these "test harnesses" when you write the original code. If you're like me, you've inherited the applications, so it isn't an easy sell to write the harnesses after the fact, but believe me, it's worth it. Even automated testing can't catch everything, though; you may still need a few "keystroke tests" to make sure everything continues to work.
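
As an illustration, here's one way a single harness check might look in T-SQL. The application procedure (dbo.usp_GetOrderTotal), the expected value, and the results table are all hypothetical; the point is the pattern of a known input, a known expected output, and an automated comparison that gets logged.

    -- Hypothetical harness check: call an application procedure with a known
    -- input and log whether the output matches the expected value.
    CREATE TABLE dbo.TestResults
    (
        TestName  sysname  NOT NULL,
        Passed    bit      NOT NULL,
        RunAt     datetime NOT NULL DEFAULT (GETDATE())
    );
    GO

    CREATE PROCEDURE dbo.usp_Test_GetOrderTotal
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @actual   money;
        DECLARE @expected money;
        SET @expected = 125.50;   -- known-good total for order 1001 in the test data set

        EXEC dbo.usp_GetOrderTotal @OrderID = 1001, @Total = @actual OUTPUT;

        INSERT INTO dbo.TestResults (TestName, Passed)
        VALUES (N'usp_GetOrderTotal, order 1001',
                CASE WHEN @actual = @expected THEN 1 ELSE 0 END);
    END;
    GO

    -- The wrapper script runs every test procedure, then reports any failures:
    EXEC dbo.usp_Test_GetOrderTotal;
    SELECT TestName, RunAt FROM dbo.TestResults WHERE Passed = 0;

Multiply that pattern across the important paths in each application and you have a harness you can re-run after every patch, with nothing to eyeball except the failure report.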

So that's how you'll use the lab. The next part of the discussion is the nuts and bolts of setting one up. I'll show you how to use either physical hardware or an emulated environment.

Before I do that, however, you'll need software. You're going to need an operating system (whether it's a physical server or a virtual one), all the drivers (for physical systems) and then SQL Server software. You want to stay legal – to put this to bed, there is no “free” full-up version of SQL Server other than Express. So other than installing your production copies illegally or purchasing production licenses for test machines, what options do you have?

You could use “Evaluation” software. This is software you download for Microsoft products that expires within a month or two – sometimes as long as six months out. It's designed specifically for test-driving a product before you buy it, so it's usually the newest version of the product, which might not be what you want. Also, you have to periodically re-download, register and install the product again, which adds extra steps you might not want to deal with. On the plus side, you can install everything you need for a lab this way, and the install process itself is something that's good to practice from time to time. In fact, this is how I learned to do “scripted installations,” or as Microsoft refers to them, unattended installations. You can download evaluation editions of the operating systems and SQL Server from the product pages at http://microsoft.com.

A far better choice is to purchase an MSDN (Microsoft Developer Network) subscription. This subscription allows you to legally download and install production-code software for use in a testing or evaluation environment. The advantage here is the amazing range of software you get access to, for a smaller price than buying each package commercially. At the time of this writing, and depending on the package, you get things like five licenses per product – not to be used in production, but for you to use on your testing machines. Not only that, you get access to samples, learning resources, support and more. It's really the way to go if you are developing applications on Microsoft platforms. I know folks who pay for their own subscriptions each year and write them off on their taxes as a business expense. Often companies and organizations will purchase MSDN subscriptions for their developers. Re-using a single subscription for multiple developers is illegal – you need one for each developer. That being said, your local Microsoft office will work all kinds of deals to get that in place. I really can't over-emphasize how useful the subscription has been for me, not just for the downloads, but for all of the community, samples and training around it. You can find out more here: http://msdn.com.

General Considerations for Your Lab System

You have two major choices for the systems you can use for a lab: physical and virtual. I'll come to both of those in a moment. But the general idea is that you get the system configured to a certain state, “freeze” it (that is, stop using it), and then “clone” or copy the contents of the system. You can then run whatever tests you like, even corrupting the whole thing, without worry, since you can return it to a known state. When the next version, patch or edition of the software comes out, you repeat the process to “re-freeze” the system at the next desired known state. In fact, I keep several images around and swap them in and out of my lab environment based on what I want to do.

The general process is:

  • Set up the hardware (Virtual or Physical)
  • Install and configure the operating system
  • Install and configure all drivers
  • Install and configure utilities (backups, maintenance, firewalls, virus scanning, etc.)
  • Optional: Install and configure SQL Server – you may delay this step if you plan to re-install manually or have different versions/editions that you want to install
  • Freeze the system and clone the image.

Another consideration, whether the machine is physical or virtual, is that you'll need to start the test system periodically and allow it to pull down the latest updates for the operating system, virus scanner, firewall and drivers, and then re-freeze and re-image the system so that it can be re-used.

Let's Get Physical

Your first choice is to set up a physical system. If you want to test for performance and have the results be relevant to what you'll see in production, you'll have to mirror the production environment all the way down to the user input patterns. I've heard arguments that you don't really have to do that; I've tested under those assumptions and found that the patterns matter a lot. Even when I've tried to mimic the patterns, the testing still doesn't show what will really happen in production. It just doesn't work; there are simply too many variables to replicate. That kind of testing is normally what the “Test” environment is for, not the lab system I'm explaining here.

So should you ever just use hardware? Sure: if you already have hardware that meets at least the minimum requirements and isn't needed for anything else, use it. The differences in performance numbers you get before and after a change may in fact be representative, but don't depend on them. Again, this isn't your full “testing” platform, just testing for yourself, so in some cases even an old laptop will do. I've built systems and then taken the monitor, keyboard and mouse away and just used Remote Desktop software to communicate with them after the initial operating system installation.

The key is being able to "reset" the system. What I mean is that after a change you need to be able to put the system back to the state it was in when you started.

There are "snapshot" solutions that will take an image of your system and save it as a file so that you can re-apply it later. There are commercial and open-source versions of these tools, and a quick search gives this list: http://en.wikipedia.org/wiki/Comparison_of_disk_cloning_software

You could also just use the Windows operating system's built-in backup utilities to take a full backup, but the “bare metal” restore is more involved, which makes re-creating the system a longer process. Longer and more difficult processes mean you're apt not to use them.

Living In a (non) Material World

A better solution for this problem is to use Virtual Machines. I've spoken at length about those in other articles, but using an emulated hardware environment has many advantages.

With a virtual machine you have less hardware, since you can run multiple images on the same system. You also save power, since you're only running on one actual device. You have the ability to reset the system multiple times in a very short time span, and you can even distribute the images to run multiple tests at once. And you can keep multiple images around, since they are just files.

You can also run one on almost any operating system, from Microsoft to Apple to Linux. That means you can use an Apple laptop to run an emulated Windows system to install SQL Server on.

But by far one of the greatest advantages of a VM environment is the reset capability. All of the offerings I'll describe let you shut down the virtual machine and then copy the files that make it up to a new location. You can run the first copy, trash the system if you like, and then simply copy the original configuration back over in a very short period of time. Start that back up, and you're right where you started. Even easier, many of these systems can take a “snapshot” of the system at a known state, even while it is running. You can then continue working, and when you're done you can revert to the snapshot at any point.

There are many contenders for running a “Hypervisor,” or a software-emulated computer. I've written an article here on using Virtual PC from Microsoft, a free offering that runs on Windows XP and later. Hyper-V is another solution from Microsoft, which runs on Windows Server operating systems from 2008 onward. You can read more about that here. VMware has several offerings that run on Windows XP and higher, which you can read more about here. Another popular choice (as of this writing) is Xen, an open-source package – read more about it here – and also VirtualBox, which you can learn more about here. Some of these run on the Windows operating system only; others run on multiple operating systems. Check the links for more information.

In all these cases, there are various drivers, configurations and settings that you should consider for SQL Server. In most cases, these involve performance tuning, so unless you're considering testing performance on a Virtual Machine, you don't have to follow many of these – but they certainly can't hurt. Not only that, many production systems are moving to Virtual Machines, so learning how to tune SQL Server on one is a good idea.
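
For instance, one setting that comes up in almost every virtualized SQL Server discussion is capping the instance's memory so the guest doesn't fight the host for RAM. Here's a minimal sketch; the 4096 MB figure is just a placeholder, and the right value depends entirely on how much memory the host can actually give the guest.

    -- Cap SQL Server's memory use inside the guest (the value below is a placeholder).
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 4096;
    RECONFIGURE;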

The primary considerations for running SQL Server (or any database system for that matter) on a Virtual Machine are the CPU and the Hard Disk access. For the CPU, you should ensure that your physical CPU on the hardware (often called the “host”) has extensions for running VM software. The Hypervisor you use and the hardware you have make a difference, so check your documentation for those.
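
As a quick sanity check from inside SQL Server itself, newer versions (SQL Server 2008 R2 SP1 and later, if memory serves) report whether the instance detects a hypervisor in sys.dm_os_sys_info. This doesn't replace checking the host's BIOS and CPU settings, but it's a handy confirmation:

    -- Does SQL Server think it is running under a hypervisor?
    -- virtual_machine_type_desc returns NONE, HYPERVISOR or OTHER.
    SELECT cpu_count,
           hyperthread_ratio,
           virtual_machine_type_desc
    FROM sys.dm_os_sys_info;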

The hard drive access (called I/O) is the place where you'll see the most performance loss, unless you take special steps to ensure it performs well. I'll cover performance tuning in a Hypervisor environment in another series of articles.

My choice is to use VMs for just about everything, and most of all in a testing environment. The ease of snapshots, along with the ability to keep so many images around (even on a DVD in some cases), makes it a compelling choice.