The Evolution of Systems Management
The landscape of systems and configuration management has evolved significantly since the first release of Microsoft Systems Management Server, and it continues to advance today. The proliferation of compliance-driven controls and virtualization (server, desktop, and application) has added significant complexity and exciting new functionality to the management picture.
Configuration Manager 2007 is a software solution that delivers end-to-end management functionality for systems administrators. It provides configuration management, patch management, software and operating system distribution, remote control, asset management, hardware and software inventory, and a robust reporting framework to make sense of the available data for internal systems tracking and regulatory reporting requirements.
These capabilities are significant because today's IT systems are prone to a number of problems from the perspective of systems management, including the following:
- Configuration "shift and drift"
- Security and control
- Timeliness of asset data
- Automation and enforcement
- Proliferation of virtualization
- Process consistency
This list should not be surprising—these types of problems manifest themselves to varying degrees in IT shops of all sizes. In fact, Forrester Research estimates that 82% of larger IT organizations are pursuing service management, and 67% are planning to increase Windows management. The next sections look at these issues from a systems management perspective.
Hurdles in the Distributed Enterprise
You may encounter a number of challenges when implementing systems management in a distributed enterprise. These include the following:
- Increasing threats—According to the SANS Institute, the threat landscape is increasingly dynamic, making efficient and proactive update management more important than ever (see http://www.sans.org/top20/).
- Regulatory compliance—Sarbanes-Oxley, HIPAA, and many other regulations have forced organizations to adopt and implement fairly sophisticated controls to demonstrate compliance.
- OS and software provisioning—Rolling out the operating system (OS) and software on new workstations and servers, especially in branch offices, can be both time consuming and a logistical challenge.
- Methodology—With the bar for effective IT operations higher than ever, organizations are forced to adopt a more mature implementation of IT operational processes to deliver the necessary services to the organization's business units more efficiently.
With increasing operational requirements unaccompanied by linear growth in IT staffing levels, organizations must find ways to streamline administration through tools and automation.
The Automation Challenge
As functionality in client and server systems has increased, so too has complexity. Both desktop and server deployment can be very time consuming when performed manually. With the number and variety of security threats increasing every year, timely application of security updates is of paramount importance. Regulatory compliance issues add a new burden, requiring IT to demonstrate that system configurations meet regulatory requirements.
These problems have a common element—all beg for some measure of automation to ensure IT can meet expectations in these areas at the expected level of accuracy and efficiency. To get IT operational requirements in hand, organizations need to implement tools and processes that make OS and software deployment, update management, and configuration monitoring more efficient and effective.
Configuration "Shift and Drift"
Even in IT organizations with well-defined and documented change management, procedures fall short of perfection. Unplanned and unwanted changes frequently find their way into the environment, sometimes as an unintended side effect of an approved, scheduled change.
You may be familiar with an old philosophical saying: If a tree falls in a forest and no one is around to hear it, does it make a sound?
Here's the configuration management equivalent: If a change is made on a system and no one knows, does identifying it make a difference?
The answer to this question is absolutely "yes." Every change to a system has some potential to affect the functionality or security of the system, or that system's adherence to corporate or regulatory standards.
For example, adding a feature to a web application component may affect the application binaries, potentially overwriting files or settings put in place by a critical security patch. Or perhaps the engineer implementing the change sees a setting he or she thinks is misconfigured and decides to just "fix" it while working on the system. In an e-commerce scenario involving sensitive customer data, this could have devastating consequences.
At the end of the day, your selected systems management platform must bring a strong element of baseline configuration monitoring to ensure configuration standards are implemented and maintained with the required consistency.
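To make the idea of baseline configuration monitoring concrete, here is a minimal sketch (in Python, not the Configuration Manager 2007 implementation) of comparing a scanned system's settings against an approved baseline. The setting names and values are hypothetical examples.

```python
# Approved baseline: the settings every conforming system should have.
# These names and values are illustrative, not real policy items.
BASELINE = {
    "password_min_length": 8,
    "firewall_enabled": True,
    "smb_signing": True,
}

def detect_drift(current: dict, baseline: dict = BASELINE) -> dict:
    """Return {setting: (expected, actual)} for every deviation,
    including baseline settings missing from the scanned system."""
    drift = {}
    for setting, expected in baseline.items():
        actual = current.get(setting)
        if actual != expected:
            drift[setting] = (expected, actual)
    return drift

# An engineer "fixed" a setting by hand and one setting is absent;
# the periodic scan surfaces both deviations.
scanned = {"password_min_length": 6, "firewall_enabled": True}
print(detect_drift(scanned))
```

The point of the sketch is that drift detection is cheap once a baseline exists in machine-readable form; the hard organizational work is defining and maintaining the baseline itself.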
Lack of Security and Control
Managing systems becomes much more challenging when moving outside the realm of the traditional LAN (local area network)-connected desktop or server computer. Traveling users who rarely connect to the trusted network (other than to periodically change their password) can really make this seem an impossible task.
Just keeping these systems up to date on security patches can easily become a full-time job. Maintaining patch levels and system configurations to corporate standards when your roaming users only connect via the Internet can make this activity exceedingly painful. In reality, remote sales and support staff make this an everyday problem. To add to the quandary, these users are frequently among those installing unapproved applications from unknown sources, subsequently putting the organization at greater risk when they finally do connect to the network.
Point-of-sale (POS) devices running embedded operating systems pose challenges of their own, with specialized operating systems that can be difficult to administer—and for many systems management solutions, they are completely unmanageable. Frequently these systems perform critical functions within the business (such as cash register, automated teller machine, and so on), making the need for visibility and control from configuration and security perspectives an absolute necessity.
Mobile devices have moved from a role of high-dollar phone to a mini-computer used for everything: Internet access, Global Positioning System (GPS) navigation, and storage for all manner of potentially sensitive business data. From the Chief Information Officer's perspective, ensuring that these devices are securely maintained (and appropriately password protected) is somewhat like gravity. It's more than a good idea—it's the law!
But seriously, as computing continues to evolve, and more devices release users from the strictures of office life, the problem only gets larger.
Timeliness of Asset Data
Maintaining a current picture of what is deployed and in use in your environment is a constant challenge due to the ever-increasing pace of change. However, failing to maintain an accurate snapshot of current conditions comes at a cost. In many organizations, this is a manual process involving Excel spreadsheets and custom scripting, and asset data is often obsolete by the time a single pass at the infrastructure is complete.
Without this data, organizations can over-purchase (or worse yet, under-purchase) software licensing. Having accurate asset information can help you get a better handle on your licensing costs. Likewise, without current configuration data, areas including Incident and Problem Management may suffer because troubleshooting incidents will be more error prone and time consuming.
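One way to quantify "timeliness" is to timestamp every inventory record and flag hosts whose data has gone stale. The following sketch assumes a simple list of records with a last-scan date; the hostnames and threshold are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical inventory records; in practice these would come from an
# automated hardware/software scan, not a hand-maintained spreadsheet.
inventory = [
    {"host": "ws-001",  "last_scan": datetime(2007, 3, 1)},
    {"host": "srv-db1", "last_scan": datetime(2007, 1, 10)},
]

def stale_assets(records, as_of, max_age=timedelta(days=30)):
    """Return hostnames whose inventory data is older than max_age."""
    return [r["host"] for r in records if as_of - r["last_scan"] > max_age]

# srv-db1 was last scanned more than two months ago, so any licensing
# or troubleshooting decision based on its record is suspect.
print(stale_assets(inventory, as_of=datetime(2007, 3, 15)))
```

A scheduled report of this kind turns "our asset data is probably out of date" into a concrete, actionable list.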
Lack of Automation and Enforcement
As the technology needs of the business perpetually increase and evolve, the need to automate resource provisioning and to standardize and enforce configurations becomes increasingly important.
Resource provisioning of new workstations or servers can be a very labor-intensive exercise. Installing a client OS and required applications may take a day or longer if performed manually. Ad-hoc scripting to automate these tasks can be a complex endeavor. Once deployed, ensuring the client and server configuration is consistent can seem an insurmountable task. With customer privacy and regulatory compliance at stake, consequences can be severe if this challenge is not met head on.
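Enforcement goes one step beyond detection: when a setting deviates from the standard, the management tool applies the standard value. The sketch below illustrates that check-and-remediate loop; the `get_setting`/`set_setting` callables stand in for whatever real mechanism (registry, WMI, a config file) a tool would use, and the setting names are invented for illustration.

```python
# Standard values every managed system should carry (illustrative only).
STANDARD = {"audit_logging": True, "guest_account": False}

def enforce(standard, get_setting, set_setting):
    """Apply the standard value for any setting that deviates.
    Returns the list of settings that were remediated."""
    remediated = []
    for name, desired in standard.items():
        if get_setting(name) != desired:
            set_setting(name, desired)
            remediated.append(name)
    return remediated

# Simulated target system with one nonconformant setting; a plain dict
# plays the role of the system's configuration store.
system = {"audit_logging": False, "guest_account": False}
fixed = enforce(STANDARD, system.get, system.__setitem__)
print(fixed, system)
```

Run on a schedule, a loop like this is what turns a one-time deployment standard into an ongoing, enforced state.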
Proliferation of Virtualization
There's an old saying: If you fail to plan, you plan to fail. In no area of IT operations is this truer than when considering virtualization technologies.
When dealing with systems management, you have to consider many different functions, such as software and patch deployment, resource provisioning, and configuration management. Managing server and application configuration in an increasingly "virtual" world, where boundaries between systems and applications are not always clear, will require considering new elements of management not present in a purely physical environment.
Virtualization as a concept is very exciting to IT operations. Whether talking about virtualization of servers or applications, the potential for dramatic increases in process automation and efficiency, and for reduced deployment costs, is very real. New servers and applications can be provisioned in a matter of minutes. With this newfound agility comes a downside: virtualization can increase the velocity of change in your environment. The tools used to manage and track changes to a server often fail to address the new dynamics that come when virtualization is introduced into a computing environment.
Many organizations make the mistake of taking on new tools and technologies in an ad-hoc fashion, without first reviewing them in the context of the process controls used to manage the introduction of change into the environment. These big gains in efficiency can lead to a completely new problem—inconsistencies in processes not designed to address the new dynamics that come with the virtual territory.
Lack of Process Consistency
Many IT organizations still "fly by the seat of their pants" when it comes to identifying and resolving problems. Using standard procedures and a methodology can help minimize risk and solve issues faster.
A methodology is a framework of processes and procedures used by those who work in a particular discipline. You can look at a methodology as a structured process defining the who, what, where, when, and why of one's operations, and the procedures to use when defining problems, solutions, and courses of action.
When employing a standard set of processes, it is important to ensure the framework you adopt adheres to accepted industry standards or best practices and takes into account the requirements of the business, ensuring continuity between expectations and the services delivered by the IT organization. Consistently using a repeatable and measurable set of practices allows an organization to quantify its progress more accurately and to adjust processes as necessary to improve future results. The most effective IT organizations build an element of self-examination into their service management strategy to ensure processes can be incrementally improved or modified to meet the changing needs of the business.
With IT's continually increased role in running successful business operations, having a structured and standard way to define IT operations aligned to the needs of the business is critical when meeting the expectations of business stakeholders. This alignment results in improved business relationships in which business units engage IT as a partner in developing and delivering innovations to drive business results.
The Bottom Line
Systems management can be intimidating when you consider that the problems described to this point could happen even in an ostensibly "managed" environment. However, these examples serve to illustrate that the very processes used to manage change in our environments must themselves be reviewed periodically and updated to accommodate changes in the tools and technologies employed from the desktop to the datacenter.
Likewise, meeting the expectations of both the business and compliance regulation can seem an impossible task. At the end of the day, as technology evolves, so must IT's thinking, management tools, and processes. This makes it necessary to embrace continual improvement in those methodologies used to reduce risk while increasing agility in managing systems, keeping pace with the increasing velocity of change.