Certification Processes

Now that you have reduced the diversity in your network, you must still arm yourself with processes that ensure stability. No matter how stringent your guidelines, you will always be faced with a certain amount of diversity. This is simply due to the breadth of functional requirements within any IT network. This is where the certification process comes into play.

Remember DOS?

DOS was a very limited OS. People who produced software for DOS had to worry about every single component of the computer—keyboard drivers, screen drivers, mouse drivers, and so on. This was because DOS provided very limited services to manage the PC. Software manufacturers had to provide every function if they wanted their product to work.

Windows changed all of this (though it wasn't the first program to do this on the PC—remember GEM?) by providing a single, unified environment to manage all of a PC's processes. With Windows, software manufacturers only had to ensure their programs worked with Windows interfaces. This enabled them to concentrate on their product's features.

Unfortunately, this caused another problem: DLL hell.

The Evolution of Software

Software has changed over the years. In the days of DOS and Windows 3.x, software often consisted of a single, unified executable that simply required a copy from the installation disks to the hard drive.

Today, software is much more complex. Software manufacturers have changed their product design from single executables to the creation of master programs that orchestrate the operation of all of a product's components.

This evolution to master programs is an important development. To manage memory more effectively, software manufacturers have divided their products into a series of subcomponents that are loaded into memory only when users require them, freeing memory for other tasks.

For example, when you're working with a word processing program, printing is not required all the time. So the printing module (often called a dynamic link library, or DLL) is only loaded when you want to print. Then, when printing is complete, the DLL is removed from memory to clear it up for more important functions.
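The demand-loading behavior described above can be sketched in a few lines. This is an illustrative simulation, not actual Windows loader code; the module name "print.dll" and the load/unload functions are assumptions made for the example:

```python
# Hypothetical sketch of demand loading: the "printing module" is only
# brought into memory when a print job is requested, then released.
loaded_modules = {}

def load_module(name):
    # Stand-in for the OS mapping a DLL into the process's address space.
    loaded_modules[name] = {"name": name, "ready": True}
    return loaded_modules[name]

def unload_module(name):
    # Stand-in for releasing the DLL's memory once the job completes.
    loaded_modules.pop(name, None)

def print_document(text):
    loaded_modules.get("print.dll") or load_module("print.dll")
    result = f"printed: {text}"      # the module does its work
    unload_module("print.dll")       # freed for more important functions
    return result

output = print_document("report.doc")
```

After the call returns, the module is gone from memory again, which is exactly the memory-saving effect the DLL approach is after.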

In addition, because the OS provides most functions, software applications must often call upon system functions to work. Printing, for example, is provided by the OS, whereas the product only needs to know how to call upon the function (see Figure 4.4).

FIGURE 4.4 The Heart of an Executable. Software programs today include a master program component that calls upon either private or shared subcomponents as required during operation. Figure 4.5 illustrates how this interacts with the PC memory management process.

FIGURE 4.5 The PC Memory Management Process. Most users have difficulty understanding how random access memory (RAM) is related to permanent storage on the hard disk. Note that software developers use the DLL approach to reduce RAM usage.

Facing DLL Hell

The problem with the DLL3 approach is that it is now very difficult for software manufacturers to ensure that all the components that ship with a product are up-to-date. When software is installed on a system, the components it includes often replace existing components that may be more recent. This causes instability within the system.

For example, say Company A decides in 1998 to develop a new product. The product development cycle will take two years to complete before the product reaches the market. They ensure that in 1998 they have the latest and greatest systems and that every OS component is up-to-date.

The problem is that, while Company A is in development, operating systems continue to evolve. In 1998, Company A may have been using Windows NT with Service Pack 4. Meanwhile, Microsoft released Service Packs 5, 6, and 6a along with numerous hot fixes.4 In February 2000, Microsoft released Windows 2000. In June 2000, Microsoft released Service Pack 1 for Windows 2000.

If Company A does not ensure that it keeps all of its systems up-to-date and does not perform compatibility testing of its product when it does upgrade, the product that hits the marketplace in July 2000 will definitely include components that are older than those that may exist on your system. This process is illustrated in Figure 4.6.

FIGURE 4.6 Development and OS Timelines. Software Company A begins its development cycle with updated systems. As time passes, the company updates its systems, but it is difficult to keep up. One of the ways Software Company A can keep up-to-date is to use beta versions of upcoming systems to test the compatibility of its solution.

The software industry has taken steps to reduce the impact of this problem. For one thing, many manufacturers ensure that all new versions of a DLL support all of the functions of all of the older versions. So even if a product thinks it requires version 1, it will still work with version 2 because the new version supports all of version 1's functions—it is backward compatible.
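Backward compatibility amounts to a simple rule: the new version must still export everything the old version exported. A minimal sketch, using illustrative function names (not a real DLL's export table):

```python
# A DLL version is backward compatible if its export list is a superset
# of the earlier version's exports. The names below are illustrative.
dll_v1_exports = {"OpenPrinter", "StartDoc", "EndDoc"}
dll_v2_exports = {"OpenPrinter", "StartDoc", "EndDoc", "StartDocEx"}

def is_backward_compatible(new_exports, old_exports):
    # Nothing the old version provided may be removed; additions are fine.
    return old_exports <= new_exports

compatible = is_backward_compatible(dll_v2_exports, dll_v1_exports)
```

A product that "thinks" it requires version 1 keeps working with version 2 precisely because this superset relationship holds.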

Unfortunately, this does not work with every product. Some manufacturers do not provide backward compatibility for their components. If you want to run two conflicting products on the same operating system, you must find a way to make two versions of the same component work on the same system, an approach known as side-by-side operation.

Table 4.1 describes the differences between backward compatibility and side-by-side operation.

Table 4.1 Backward Compatibility versus Side-by-Side Operation

  Backward Compatibility: Requires consistency of behavior and interface over time.
  Side-by-Side Operation: Interface and behavior change over time.

  Backward Compatibility: All versions can and must share state.
  Side-by-Side Operation: Requires isolation of state by version.

  Backward Compatibility: All applications use the "latest version" of the component.
  Side-by-Side Operation: Applications all use their own version of the component.

  Backward Compatibility: Applications support the activation of the component by its absolute name.
  Side-by-Side Operation: Applications call specific component names.

  Backward Compatibility: Service is straightforward.
  Side-by-Side Operation: Service is complex because care must be taken when changing system components.

As you can see, the DLL problem can become a major headache. It is often this very problem that causes instability in Windows systems.

Windows 2000 and DLL Hell

Microsoft is conscious of the DLL problem because they have received numerous support calls relating directly to it. This is why Windows 2000 is the first Windows version to include several technologies designed to avoid DLL hell, and Windows XP expands on these technologies.

In Windows 2000/XP, the user profile includes every element that is modifiable by the user.

System Security

Like Windows NT, Windows 2000 uses a 32-bit file system called NTFS. The advantage of this system over its predecessors is that every object stored in the system includes attributes. These attributes can contain security features—security features that are different for users, power users, and administrators. The greatest limitations are applied to users. Because users only operate the system, they only need read and execution permissions for every system component. In this way, NTFS "protects" system and application files by restricting access to those files.

Blocking Software Installations by Users

The security features of NTFS enable administrators to block users from installing software. Many users see this as a limitation of their "rights" on a PC. But this is not the case. Because a PC is a corporate asset that provides support for business processes, it must be stable. If a corporation puts in place a strategy for managing software stability within its network, it must restrict users from installing uncontrolled components on their PCs. This is simply because these uncontrolled components will most likely damage critical system components and destabilize PCs.

Conversely, users must have the right to choose the products they need for their job functions. By providing a self-serve center for approved and controlled products, the corporation meets both user needs and the corporate software management strategy.

In Windows NT, users were given too much leeway because software integration was not controlled effectively. Many software products would install into and require constant read and write use of the system directories. Giving users these rights would open the system to potential damage.

Realizing this, Microsoft released the Zero Administration Kit for Windows NT. This kit provided corporations with the tools to increase system lockdown to further limit user access. But this system was complex to use, and organizations often had to invest in its management.

With Windows 2000, Microsoft changed the nature of the NTFS system lockdown. They added further restrictions to users and changed the way applications worked with the OS. As a comparison, users in Windows NT had the same rights that power users do in Windows 2000. Today, users have significant restrictions within the operating system directories and within application directories.

Windows XP and DLL Hell

With Windows XP, Microsoft enhances the DLL hell management process with a new folder to store side-by-side DLLs and virtual DLL redirection. DLLs no longer need to register within the Windows Registry. Applications designed for Windows XP now include descriptive files that are read by XP's DLL Loader before loading the application. This means that in a pure XP environment, DLL hell may become a ghost of the past, but it will be some time before organizations can move to pure XP environments.

In addition, because it is no longer based on DOS, XP includes new virtual Windows and DOS compatibility environments, supporting even more applications than Windows 2000.

Software that is designed for Windows 2000/XP does not install any component in the system directories. All of its components go into its own application directory. In addition, every component that is modifiable by a user (configuration settings, user preferences, and so on) is stored within the directories containing the user profile. Here users rule and can read and write to their heart's content. Moving modifiable components to the user profile is a good strategy because critical system and application files are protected for all users. If users damage something within their own profile, it simply needs to be erased and re-created.5

Windows XP goes beyond the capabilities of Windows 2000 to manage the DLL process as is demonstrated in this flowchart.

Side-by-Side DLL Management

In addition, Windows 2000 and Windows XP include side-by-side DLL management. This means that if two applications require two different versions of the same DLL at the same time, Windows 2000/XP can load both into memory, but into separate memory spaces. Side-by-side DLL management avoids conflict and relieves organizations from having to manage this process.
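Conceptually, the loader keys each loaded component by both name and version instead of name alone, so two versions never collide. A hedged sketch of this idea (the DLL name and versions are illustrative, and real loaders work at the address-space level, not with a dictionary):

```python
# Side-by-side loading sketch: components are keyed by (name, version),
# so two versions of the same DLL occupy separate slots in memory.
loaded = {}

def load_side_by_side(name, version):
    key = (name, version)
    if key not in loaded:
        loaded[key] = f"memory space for {name} v{version}"
    return loaded[key]

space_a = load_side_by_side("shared.dll", "1.0")  # Application A's version
space_b = load_side_by_side("shared.dll", "2.0")  # Application B's version
```

Because the key includes the version, Application A's calls never reach Application B's copy, which is what prevents the conflict.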

Windows System File Protection

Windows 2000 also includes System File Protection (SFP). This feature stores a backup copy of critical system files (within the DLL Cache subdirectory). A special agent constantly watches the system directories. If, during the installation of a new application, a critical system DLL is replaced and the original system DLL is overwritten, this agent automatically replaces the new file with the original and proper file.
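The watch-and-restore behavior can be simulated with ordinary files. This is a sketch of the principle only, assuming a cache directory holding known-good copies; the file name and directory layout are illustrative, not the actual SFP implementation:

```python
import hashlib
import pathlib
import shutil
import tempfile

# SFP sketch: keep a cached copy of each protected file and restore it
# whenever the live copy no longer matches the known-good version.
root = pathlib.Path(tempfile.mkdtemp())
system32 = root / "system32"
cache = root / "dllcache"
system32.mkdir()
cache.mkdir()

def digest(path):
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Protect a critical file by caching the known-good copy at setup time.
critical = system32 / "gdi32.dll"
critical.write_bytes(b"original system version")
shutil.copy2(critical, cache / critical.name)

def sfp_check(path):
    good = cache / path.name
    if digest(path) != digest(good):
        shutil.copy2(good, path)   # replace the intruder with the original
        return "restored"
    return "ok"

# An application installation overwrites the protected file...
critical.write_bytes(b"older DLL shipped by an application")
status = sfp_check(critical)       # ...and the agent puts it back
```

The real agent is event driven rather than polled, but the repair step is the same: copy the cached original back over whatever replaced it.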

In addition, SFP enables only the OS to update files within these directories.

Windows Installer

For software applications, Microsoft introduced the Windows Installer service.6 This system service is designed to assist in the control and management of the software lifecycle on Windows 2000/XP systems. For stability purposes, Windows Installer has the capability to provide the same type of protection to application files that the SFP provides for system directories.

Viewing System File Protection at Work

To view the System File Protection (SFP) feature at work, log on to Windows 2000 as an administrator. Navigate to the WINNT\SYSTEM32 directory. Locate CALC.EXE (the calculator) and delete it. Empty the Recycle Bin.

SFP should replace it within seconds.

Applications whose installation is integrated with Windows Installer include an installation database. This database is stored on the system during installation. Then, every time an application is launched, Windows Installer verifies its consistency based on the database settings. If everything is okay, the application starts normally. But if critical components are missing or damaged, Windows Installer searches the database for the location of the original installation file and repairs or replaces the component, and then launches the application normally. This process is called self-healing.
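The self-healing check described above can be sketched as a launch-time verification against the installation database. The component names, source paths, and dictionary-based "file system" below are all illustrative assumptions, not the actual Windows Installer data structures:

```python
# Self-healing sketch: before launch, verify each component recorded at
# install time and re-copy any missing ones from the installation source.
install_db = {                       # recorded during installation
    "winword.exe": "source/winword.exe",
    "spell.dll":   "source/spell.dll",
}
source_files = {"source/winword.exe": b"app", "source/spell.dll": b"lib"}
installed = {"winword.exe": b"app"}  # spell.dll is missing or damaged

def launch(app_components):
    repaired = []
    for component, source in app_components.items():
        if component not in installed:
            installed[component] = source_files[source]   # repair it
            repaired.append(component)
    return repaired                  # then the application starts normally

repaired = launch(install_db)
```

Note the dependency this creates: the repair only succeeds because `source_files` is still reachable, which is exactly why installation sources must remain available on the network.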

Windows Installer

Windows Installer does much more than just provide self-healing capabilities. It is a complete system designed to manage software on a SPA object. It can provide deployment through Active Directory, Windows 2000's central system and user management service. It can personalize installation settings through transformation files applied during setup. It can provide clean uninstallation of a software product. It can support the application of corrective maintenance (service packs) during the product's lifecycle.

Windows Installer is not perfect, but it does provide some measure of protection for applications. But beware: For self-healing to work, installation files must always be available to the Windows Installer service. This means that installation sources must permanently remain available in the network.

Certified for Windows 2000/XP

With all of these changes, Microsoft has implemented a new approach for software manufacturers to integrate their products with Windows 2000/XP: the Certified for Windows 2000/XP initiative. A product that is certified for Windows 2000/XP supports the following features:

  • Its DLLs are designed to be backward compatible.

  • All of its operation files are stored within an application directory in the Program Files folder.

  • Its installation process is integrated with Windows Installer (the extension of the installation file is then .MSI).

  • It can be delivered to SPA objects (PCs or servers) through Windows 2000/XP's Active Directory service.

  • All user-modifiable files are stored in the user's profile directories (within the Documents and Settings folder).

In fact, when a product is certified for Windows 2000/XP, the product will provide as much stability as is available from the OS.7

Stabilizing Windows Technologies

The problem is that not all applications on the market today comply with Windows 2000/XP standards. Until they do, you will need to have pre-Windows 2000 applications coexisting with Windows 2000/XP–certified applications.

Hardware Certification

Windows 2000/XP includes a hardware certification program. You can find most of the products that have been certified for Windows 2000 in the HCL available at http://www.microsoft.com/windows2000/server/howtobuy/upgrading/compat/search/devices.asp. If the product you want is not on this list, ensure that the manufacturer provides a Windows 2000 driver for the product; otherwise, you may destabilize your systems.

This may be cause for concern, but not if you apply a software management strategy within your network. Because the problem has been defined, you now know where to look for application and system instability issues.

Certifying Software: Managing DLL Conflicts

To face the DLL problem, you can implement a software certification process within your corporation. This process is based on the identification of all of the components you install on your systems and the repair of damaging applications. If an application includes a DLL that is older than the one stored in the system, you can take the following actions.

  • Upgrade the DLL during the installation of the product. This works if the DLL is backward compatible.

  • Move the DLL to another directory. This will take advantage of Windows 2000's side-by-side operational capabilities.
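The decision between these two actions can be expressed as a small rule. This is a simplified sketch of the decision logic, assuming version numbers are comparable and that backward compatibility has already been determined by testing:

```python
# Sketch of the two repair actions: upgrade the system copy when the
# incoming DLL is newer and backward compatible; otherwise isolate it.
def resolve_conflict(incoming_version, system_version, backward_compatible):
    if incoming_version <= system_version:
        return "keep system copy"        # never downgrade a newer DLL
    if backward_compatible:
        return "upgrade system copy"     # safe: old callers still work
    return "install side by side"        # isolate in another directory

action = resolve_conflict(2, 1, backward_compatible=False)
```

The third branch is where Windows 2000's side-by-side capability comes in: the incompatible version is moved to its own directory rather than being allowed to overwrite the shared copy.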

Running Pre-Windows 2000 Software

Pre-Windows 2000 software works on Windows 2000. But, because of the changes in NTFS security and because these applications have a tendency to store components within system directories, you will need to "loosen" system security. Otherwise, users will not have the ability to execute these applications, because in Windows 2000, they do not have write access to these directories.

To run most pre-Windows 2000 applications with user accounts, you must apply the Compatibility Security Kit to your systems. To do so, run the following command:

    secedit /configure /cfg compatws.inf /db compatws.sdb

This command will modify the security level of Windows 2000 to make it operate like Windows NT.8

Other conflicts can arise, but their impact is not critical. They can be repaired on an as-needed basis. The entire DLL management process is outlined in Figure 4.7. Wise Solutions, Inc.,9 a software manufacturer specializing in software delivery technologies, was the first to formalize this process.

FIGURE 4.7 The DLL Conflict Management Process. Managing DLLs involves identifying conflicts and taking proper action.

Because the SLMP requires all products that have a user base larger than ten to be managed centrally, corporations must use automated techniques for installation. Installation automation is required because the corporation does not want users to be faced with configuration decisions during installation. Most products aimed at the corporate marketplace include technologies that enable preconfiguration of installation parameters. Windows 2000/XP applications automatically include such a system because it is one of the functions of the Windows Installer.

Wise Solutions' Conflict Manager

Wise Solutions, Inc., offers a product called InstallManager. This product is designed to perform installation automation. Within InstallManager is a module called Conflict Manager. Conflict Manager uses a database to store all component information and also provides a Conflict Resolution Wizard to repair damaging components.

In some cases, corporations will need to deploy and manage software that does not include automated setup technology. This is generally referred to as software repackaging. Software repackaging consists of taking a "snapshot" of the system before installation, performing and customizing the installation, and taking a snapshot after installation. A software-repackaging program records all of the changes and inserts them into an executable that automates setup.
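The snapshot technique reduces to a set difference over system state. A minimal sketch, using illustrative file paths and treating the system as a simple path-to-contents mapping:

```python
# Repackaging sketch: record system state before and after installation;
# the package is the set of additions and changes between the snapshots.
before = {"c:/windows/user32.dll": "v5", "c:/windows/win.ini": "a=1"}
after = {"c:/windows/user32.dll": "v5", "c:/windows/win.ini": "a=2",
         "c:/program files/app/app.exe": "v1"}

def snapshot_diff(before, after):
    added = {p: v for p, v in after.items() if p not in before}
    changed = {p: v for p, v in after.items()
               if p in before and before[p] != v}
    return {"added": added, "changed": changed}

package = snapshot_diff(before, after)
```

Everything untouched between the two snapshots (here, user32.dll) stays out of the package, which is what keeps repackaged installations small and repeatable.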

These tools are useful, especially when the corporation needs to deploy internally developed software.

It is during repackaging that conflict detection can occur. Here the packaging tool inventories all of the packaged components. This inventory is stored in a database containing all of a system's components. The database is then checked for conflicts. If conflicts are detected, action can be taken to repair damaging components.
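A conflict check against the component database can be sketched as a version comparison. The component names and version tuples below are illustrative assumptions:

```python
# Conflict detection sketch: flag any component in a new package that is
# older than the copy already recorded in the system inventory database.
inventory = {"shared.dll": (6, 0), "common.dll": (5, 8)}
new_package = {"shared.dll": (5, 0), "newlib.dll": (1, 0)}

def detect_conflicts(package, inventory):
    return [name for name, version in package.items()
            if name in inventory and version < inventory[name]]

conflicts = detect_conflicts(new_package, inventory)
```

Components not yet in the inventory (newlib.dll here) are simply recorded; only downgrades are flagged for repair before deployment.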

Lanovation Prism Pack Enterprise Edition

Lanovation was the first to introduce a single, integrated product that both manages DLL conflicts and creates Windows Installer (MSI) installation files. Their product is very simple to use. Very little training is required, so long as technicians understand the concepts of DLL management and Windows Installer integration.

This product is definitely a boon to the software lifecycle and certification processes. More information is available at http://www.lanovation.com.

Proactive and Reactive Conflict Management Strategies

It is clear that with its new features, Windows 2000/XP improves stability. But for applications, it does so only for those that are certified for Windows 2000/XP. Older applications do not benefit because their setup is not integrated with the Windows Installer service.

It is possible, though, to integrate older applications into the Windows Installer service at the repackaging phase. Every application that is repackaged using a Windows Installer–compatible packager will gain some, but not all, of the features of a Windows 2000–ready product. It will, for example, have self-healing, but it will not operate with simple user rights. It will also take advantage of Windows Installer's reactive DLL management techniques (if a DLL has the same name as one already loaded but is a different version, it is loaded into a separate memory space).

Managing Older Applications

Most organizations moving to Windows 2000/XP today will need to live with three types of applications.

  • 8- and 16-bit applications from DOS and Windows 3.x

  • 32-bit applications that are not ready for Windows 2000/XP

  • Windows 2000/XP–certified applications

The latter are by far the most stable, but you need to wait for manufacturers to update their products.

In order to properly manage these applications in the network, use the following three installation directories:

  • APPS16 (for DOS and Windows 3.x)

  • PreWin2K (for older, 32-bit software)

  • Program Files (for certified products)

Then migrate applications from the first two directories as you update them. You will be able to destroy these directories when they are empty.

Several software manufacturers offer repackaging tools—for example, Wise for Windows Installer from Wise Solutions, Inc.; WinInstall LE or WinInstall 2000 from Veritas, Inc.; InstallShield for Windows Installer from InstallShield Corporation; and Prism Pack from Lanovation, Inc.

Using both repackaging for Windows Installer and repackaging conflict detection techniques enables corporations to be both proactive and reactive when designing Windows 2000/XP networks. These techniques proactively repair applications before they are deployed in the network. And, if some conflicts are missed and deployed anyway, they reactively support conflicts after deployment through the Windows Installer service.

A Certification Approach: Complete DLL Management

Introducing a certification process is not much more costly than managing software in traditional ways. Most medium- to large-sized organizations already have software repackaging facilities because they want to take advantage of central deployment strategies.

Veritas WinInstall LE

WinInstall LE is located on the Windows 2000 Professional installation CD. It comes with a 60-day trial license.

Identifying and correcting DLL conflicts while packaging means adding two additional processes to the traditional software repackaging approaches.

  • Inventory components to identify conflicts

  • Repair damaging components

Component Inventories

Eventually, all applications that want to make it in the Windows market will be designed for Windows 2000/XP. As a result, repackaging may no longer be necessary. But it will always be vital for organizations to inventory components. Nothing is better than knowing exactly what is out there. Then, if problems arise, you will at least know what you have.

These two processes fit very well into most packaging methodologies because testing is already covered (if a software product is repackaged, its installation must always be tested before deployment). With conflict detection, testing changes slightly to verify whether the backward compatible or the side-by-side strategy is required. Figure 4.8 illustrates this process.

FIGURE 4.8 The Software Certification Process. The application certification process involves six major steps.

Using a Certification Center

In this process, the software repackaging center becomes a certification center. Its nature is permanent and must be staffed as such. Certification activities will include kernel management and software lifecycle management. Every software addition into the network must pass through this center. No deviation is allowed. The return on investment for the certification center is provided by the increased stability of software in the network. Payback is also evident in a significant decrease of support calls related to nonfunctioning software.

Deploying MSI Software in the Enterprise

Software deployment in a large enterprise is difficult because of all the different source locations you need to manage. It becomes even more difficult when you include self-healing capabilities. Self-healing means that the source of the installation must always be available. Each different site must have a different source address. This may be simple for SPA objects that are always linked to the network, but it is more complex for portables because they must carry the source files with them.

One of the best methods is to use a variable on the SPA object. Here's how it works:

  1. Prepare all of your installation packages using a variable such as %Source%.

  2. On the preparation systems, define this variable as the source of the installation (Set SOURCE=\\Server\Folder).

  3. Define a variable on the SPA object. This variable can map to different locations on different SPA objects, but your installation packages all point to the same destination: %Source%.

For example, on a portable this variable can be set to a local, hidden partition on the hard disk; in a region, it can be a local server; in the certification lab, it can be the staging server; and at headquarters, it can be a central server.
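The mechanics of the %Source% approach can be sketched as follows. The machine types and paths are illustrative assumptions taken from the examples above:

```python
# %Source% sketch: every package references one variable; each machine
# resolves that variable to its nearest installation source.
source_by_machine = {
    "portable":     "d:\\hidden\\source",        # local hidden partition
    "regional":     "\\\\regionserver\\source",  # nearest regional server
    "lab":          "\\\\staging\\source",       # certification lab
    "headquarters": "\\\\central\\source",       # central server
}

def resolve_source(machine_type):
    return source_by_machine[machine_type]

def install_path(machine_type, package):
    # The package always points at %Source%; only the mapping differs.
    return resolve_source(machine_type) + "\\" + package

path = install_path("portable", "office.msi")
```

The package itself never changes between sites; only the per-machine mapping does, which is what makes self-healing workable for disconnected portables.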

However, if you are using Windows XP, all you need to do is redirect the software source URL.

Certification Center Processes

Several processes support the certification center. The first is the software certification process. It includes the following steps:

  1. Package preparation: Prepare the application for automated installation.

    1. Perform software configuration and installation process validation to determine if the installation fits within standards.

    2. Perform repackaging or package analysis. If the application is packaged to Windows 2000/XP standards, proceed with component inventory and conflict detection and resolution. If the application is not ready for Windows 2000/XP, repackage it, inventory components, and perform conflict detection and resolution.

    3. Finalize the package.

    4. Move on to system testing.

    5. Perform system tests.

    6. Document the installation.

  2. Integration/deployment preparation: Prepare and test the remote installation of the application.

    1. Update the deployment mechanisms.

    2. Deploy on test systems along with profile contents.

    3. Update the documentation.

  3. Integration testing: Test application cohabitation.

    1. Test the operation of the application within a complete user profile.

    2. Update the documentation and have it approved by the software owner.

  4. The application is ready for deployment.10

The next process is the kernel update process. This process singles out the kernel from any other software that is in use within the network. This is because the kernel is the only globally used component in the network. Because every user requires the kernel, its level of stability must go beyond that of other applications.

This means that the kernel must be updated on a constant basis. Because there are a number of different products covered, a change request and evaluation process must be put in place. This change request process uses the rationalization guidelines outlined previously to evaluate the validity of the request. If approved, the change is delivered in the next kernel update.

Given the rate of change in the software industry and the number of different software components that often reside in the kernel, it is essential to run this process at least every four months. Waiting longer between update collections creates packages that are very unruly. Deploying more frequently is too expensive.

The kernel includes all of the components required for the operation of a basic SPA object. It is discussed in more detail in Chapter 5.

Figure 4.9 illustrates the kernel update process.

FIGURE 4.9 The Kernel Update Process. The SPA kernel update process should occur every four months. One month before release, a freeze is applied to change requests in order to allow update preparation.

Rate of Updates

Three updates a year has proven to be optimal for the kernel in all of the organizations using this process to date.

In addition to kernel updates, the certification center is responsible for software update validation and preparation. This means applying the same processes used with the kernel to every managed software product.

Though 90 percent of applications are managed because they have a population greater than ten users, it is also necessary to put in place standard procedures for the other 10 percent. Applications with fewer than ten users do not warrant certification because if they have conflicts, they do not impact many people. In addition, so long as they are designed to work with Windows 2000/XP, corporations can use the reactive approach to DLL management. Their installation can be manual, but it should be structured. Specific procedures for manual software installations should be put in place and monitored by the certification center. This is called the "less than ten" management process.

Certification by Profile

Even with rationalization practices, organizations often have several hundred software products to support and manage in the network. Therefore, it is important to regroup applications by user type. These IT profiles11 will include every application that is common to a group of users performing the same type of task. This regrouping by IT role or function simplifies certification testing (only those applications most likely to cohabit on a system will be tested) and deployment (applications are automatically deployed to users that fit the IT profile).
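Grouping by IT profile sharply reduces the number of cohabitation tests, since only pairs within a profile need testing. A small sketch with illustrative profile and application names:

```python
# IT profile sketch: applications are grouped by user role, so
# certification only tests applications likely to cohabit on a system.
profiles = {
    "accounting":  {"office", "ledger", "reporting"},
    "engineering": {"office", "cad", "compiler"},
}

def cohabitation_tests(role):
    # Test only pairs within one profile, not every pair in the network.
    apps = sorted(profiles[role])
    return [(a, b) for i, a in enumerate(apps) for b in apps[i + 1:]]

pairs = cohabitation_tests("accounting")
```

With hundreds of products in the network, exhaustive pairwise testing is hopeless; per-profile testing keeps the test matrix proportional to what users actually run together.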

Finally, the certification process must support two additional situations: emergency updates and project requests. Both situations require a shorter timeline for deployment. If a fatal bug has been missed during testing and has been deployed by mistake, it is vital to perform an emergency update deployment. This deployment must have a very short timeframe because vital operations are nonfunctional.

In addition, when projects are launched, they often require newer versions of components or components that are not already in use in the network. This situation also requires a shorter timeframe. Figure 4.10 displays these processes.

FIGURE 4.10 Emergency or Project Certification. The emergency or project deployment leads to incremental version updates of the kernel.

All of the processes in the certification center help maintain stability within your network. These and the software rationalization processes must become part of your everyday operations if you want to profit from their benefits.

Certification and Consultant PCs

One of the questions you will be faced with when you implement your certification program is "What do you do with systems that operate in your network but don't belong to you?"

It all depends. If the system belongs to a consultant and is used only as a productivity tool, the issue has less importance. If consultants want to manage their own PCs without certification, that is their issue and not yours.

But if the computer is connected to the network, you have some degree of control. The least you can ask for is that the computer use a secure OS that conforms to your corporate security guidelines.

The certification problem is compounded when consultants use their own systems to perform development for your network. If this is the case, these consultants are in a situation where they introduce components into your network. As such, their systems should conform completely to your certification standards.
