
Case Study: Assessing the Production Acceptance Process at Seven Diverse Companies

All the theory in the world about designing world-class infrastructures is of little use if it cannot be applied to real-life environments. In this section, I present real-life applications of infrastructure processes in general and applications of the production acceptance process in particular. All of the material in this part of the book is taken from work involving the production acceptance process that I performed in recent years at seven separate companies. The companies vary significantly in size, age, industry, orientation, and IT maturity. As a result, they offer a wide diversity of real-life experiences in how companies recognize, support, and improve the quality of their production acceptance environments.

In addition to the general company attributes previously mentioned, this initial part of the case study describes several key IT characteristics of each firm. This is to show both the amount and range of diversity among these organizations. I then discuss each company in more detail with emphasis on its particular strengths and weaknesses in its approach to infrastructure processes. Included in this section is a unique feature of this book: a completed assessment worksheet measuring the relative quality and robustness of each company's production acceptance process. The last part of this section summarizes and compares the attributes, relative strengths, weaknesses, and lessons learned from each of the seven companies studied.

The Seven Companies Selected

These seven companies were selected based on my familiarity with each one, either as a client of my professional services or as a firm whose infrastructure I personally managed. It is fortunate that they provided such a wide variety of IT environments, because drawing from diverse environments yields deeper insight into the relative strengths and weaknesses of different production acceptance processes.

The seven companies studied here could not have been more diverse. Each relied primarily on one of the four major platform environments: mainframe, midrange, client/server, or web-enabled. No two were in the same industry. They covered a wide spectrum of businesses that included aerospace, broadcast content, motion pictures, defense contracting, dot-com e-tailing, broadcast delivery, and financial services.

The age of the oldest company, 70 years, was more than 10 times the age of the youngest one. Even more striking was the variation in size: the largest company had roughly 1,000 times as many total employees as the smallest (80,000 versus 75), and the largest IT staff was 80 times the size of the smallest (2,000 versus 25). Despite this diversity, all seven companies had production applications to deploy, operate, maintain, and manage, and all shared a common goal of running these systems as reliably and efficiently as possible. The degree to which they accomplished that goal varied almost as widely as the environments themselves. Studying what each company did well, or not so well, when managing its applications provides important lessons in how to implement a truly world-class production services department.

Types of Attributes

In setting out to study and analyze the production services function of these companies, I first identified attributes of each company that fell into one of three categories: business-oriented, IT-oriented, and production services-oriented. The following characteristics were associated with each category.

Business-oriented attributes:

  • Type of industry of the company

    Manufacturing

    High technology

    Entertainment

    Services

  • Total number of its employees at the time of the study

    Largest had 80,000 workers

    Smallest had 75

    Average number was 17,300

  • Number of years it had been in business

    Oldest was 70 years

    Youngest was 4 years

    Average was 31 years

IT-oriented attributes:

  • Number of IT workers

    Largest had 2,000 employees

    Smallest had 25 employees

    Average was 457 employees

  • Number of processors by platform
  • Number of desktops

Production services-oriented attributes:

  • Total number of applications in production
  • Number of production applications deployed per month
  • Existence of a production services department
  • To which group the production services department reported

Note that the largest IT department in our sample (Company D's 2,000 employees) skews the average of IT workers considerably; with it removed, the average is a more representative 200, as the sketch below shows.
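As a quick check of this arithmetic, here is a minimal Python sketch (illustrative only) that recomputes the IT staffing averages from the Table 9-4 figures:

```python
# IT staff counts for Companies A through G, taken from Table 9-4.
it_staff = [400, 125, 200, 2000, 25, 300, 150]

# Mean across all seven companies: 3,200 / 7, which rounds to 457.
print(round(sum(it_staff) / len(it_staff)))          # -> 457

# Removing the single outlier (Company D's 2,000-person IT department)
# drops the mean to 1,200 / 6 = 200.
trimmed = [n for n in it_staff if n != max(it_staff)]
print(round(sum(trimmed) / len(trimmed)))            # -> 200
```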

Table 9-4 lists all of these attributes for each of the seven companies. We identify these seven firms simply as Company A, Company B, and so on through Company G. A few observations are worth noting aside from the obvious diversity of the companies. One is that the size of the company does not necessarily dictate the size of the IT department. For example, Company A has 80,000 employees, with 400 of them in IT; Company D has 30,000 workers, with 2,000 of them in IT. This is because Company A has many manufacturing workers not directly tied to IT, whereas Company D has major defense programs requiring huge investments in IT.

Table 9-4. Summary Comparison of Case Study Companies

| Attribute | Company A | Company B | Company C | Company D | Company E | Company F | Company G |
|---|---|---|---|---|---|---|---|
| Industry | Aerospace | Broadcast content | Motion pictures | Defense contractor | Dot-com e-tailer | Broadcast delivery | Financial services |
| Number of employees | 80,000 | 1,500 | 3,000 | 30,000 | 75 | 4,000 | 2,500 |
| Age of company (years) | 50 | 15 | 70 | 60 | 4 | 10 | 8 |
| Employees within IT | 400 | 125 | 200 | 2,000 | 25 | 300 | 150 |
| Mainframes | 4 | 0 | 0 | 8 | 0 | 2 | 0 |
| Midrange computers | 4 | 0 | 2 | 10 | 0 | 2 | 0 |
| Servers | 4 | 40 | 50 | 20 | 10 | 30 | 200 |
| Desktops | 1,200 | 600 | 2,000 | 5,000 | 80 | 1,800 | 1,500 |
| Production applications | 350 | 125 | 150 | 500 | 25 | 700 | 250 |
| Applications deployed per month | 2 | 2 | 3 | 4 | 1 | 3 | 5 |
| Production services dept. | Yes | No | No | Yes | No | Yes | No |
| Dept. to which PS reported | Ops | N/A | N/A | Ops | N/A | Application support | N/A |
| Quality assurance dept. | No | No | Yes | Yes | No | Yes | Yes |
| Dept. to which QA reported | N/A | N/A | Enterprise planning | Apps dev. | N/A | Apps dev. | Apps dev. |
| Change mgmt. formality | Medium | Low | Medium | High | None | Low | None |
| Prod. acceptance formality | Medium | None | Low | High | None | None | None |

We will next look at each of the seven companies in more detail, focusing on their use, or non-use, of a production services function. We will also discuss each IT organization's relative strengths and weaknesses and what each learned from attempting to implement robust infrastructure processes.

Company A

Company A is a large, well-established aerospace firm. The company is more than 50 years old and enjoys a reputation for researching, developing, and applying cutting-edge technology for both the military and commercial sectors. At the time of our assignment, it employed 80,000 workers, of whom 400 resided in IT. The IT platform environment of its main corporate computer center consisted primarily of four huge mainframes, an equal number of midrange computers and servers, and approximately 1,200 desktops.

The IT operations department of Company A had a well-established production services function that ran 350 production applications daily (slightly more during month-end processing) and deployed on average two new production applications per month. There was no quality-assurance group at this company, although they did have the beginnings of formal change management and production acceptance processes.

The production services department was staffed by two very competent individuals who thoroughly knew the ins and outs of running virtually every production application in the company, though little of this knowledge was documented. They were very technically knowledgeable, as was most of the IT staff. This reflected part of the company's mission to develop highly technical expertise throughout the enterprise; another part of that mission was to dedicate every department to continuous process improvement. The production services function was still very manually oriented and consequently somewhat inefficient. No automated scheduling systems were in place at the time, but the company was willing to try new techniques and technologies to improve its processes.

Production services was also very segregated from other processes, such as change and problem management. There was only the start of a production acceptance process, and it was not tied to production services at all. This segregation occasionally strained communications between operations and applications development, and the fact that the two groups were located 25 miles apart further limited face-to-face meetings.

Operations did a good job of collecting meaningful metrics such as outages, abnormal terminations, reruns, reprints, and reports delivered on time. However, the emphasis placed on how often and how deeply these metrics should be analyzed was inconsistent, which sometimes undermined their usefulness.

To summarize Company A's strengths, they were willing to try new techniques and technologies, they were committed to continuous process improvement, they hired and developed a technically competent staff, and they collected meaningful metrics. To summarize their weaknesses, they tended not to interact with members of other IT staffs, they provided little documented training, they did not always communicate effectively with the development group (due, in part, to the 25-mile separation), and they did not always analyze the metrics they collected.

Eventually, the operations department implemented a more formal production acceptance process. One of the most important lessons we learned was to involve the operations department very early in a new application project. This helps ensure that the appropriate operations groups provide or receive the proper resources, capacity, documentation, and training required for a successful deployment. The other important lesson we learned was that the other infrastructure support groups (such as network services, the help desk, storage management, and desktop applications) need to give their full support to the production services function. Because this function had worked in an isolated manner in the past, other infrastructure support groups were initially reluctant to support it. They eventually did so as improved processes, automation, and increased communication became more prevalent.

The nonweighted worksheet shown in Figure 9-2 presents a quick-and-simple method for assessing the overall quality, efficiency, and effectiveness of the production acceptance process at Company A. As mentioned previously, one of the most valuable characteristics of a worksheet of this kind is that it can be customized to evaluate each of the 12 processes individually. The worksheets in the following sections of this chapter apply only to the production acceptance process at each of the seven companies studied. However, the fundamental concepts applied in using these evaluation worksheets are the same for all 12 disciplines. As a result, the detailed explanation of the general use of these worksheets presented near the end of Chapter 7 also applies to the other worksheets in the book. Please refer to that discussion if you need more information on how weights are computed.

Figure 9-2 Assessment Worksheet for Company A

Process owners and their managers collaborate with other appropriate individuals to fill out this form. Along the left-hand column are 10 categories of characteristics about a process. The degree to which each of these characteristics is put to use in designing and managing a process is a good measure of its relative robustness.

The categories that assess the overall quality of a process are executive support, process owner, and process documentation. Categories assessing the overall efficiency of a process consist of supplier involvement, process metrics, process integration, and streamlining/automation. The categories used to assess effectiveness include customer involvement, service metrics, and the training of staff.

The evaluation of each category is a very simple procedure. The relative degree to which the characteristics within each category are present and being used is rated on a scale of 1 to 4, with 1 indicating no or barely any presence and 4 representing a large presence of the characteristic. For example, at this particular company, the executive sponsor for the production acceptance process demonstrated some initial support for the process by carefully selecting and coaching the process owner. However, over time, this same executive showed only mild interest in engaging all of the necessary development managers and staffs in the process. We consequently rated the overall degree to which this executive supported the process as small, giving it a 2 on the scale of 1 to 4. On the other hand, the process owner was extremely knowledgeable about all of the critical applications and their deployments, so we rated this category a 4.

We similarly rated each of the categories as shown in Figure 9-2. Obviously, a single column could be used to record the ratings of each category; however, formatting a separate column for each of the four possible scores makes the lowest- and highest-rated categories stand out visually. The next step is to sum the numerical scores within each column. For Company A, four categories rated a 1, three rated a 2, two rated a 3, and one rated a 4, so the column sums total 4 + 6 + 6 + 4 = 20. This total is then divided by the maximum possible total of 40, for an assessment score of 50 percent.
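This arithmetic is simple enough to capture in a few lines of code. The following Python sketch illustrates the nonweighted scoring method; the individual category ratings shown for Company A are assumptions inferred from the narrative (the text gives only the executive-support and process-owner ratings plus the column totals), not values read from the actual Figure 9-2 worksheet:

```python
# A minimal sketch of the nonweighted assessment worksheet arithmetic.
# Ratings run from 1 (no or barely any presence) to 4 (large presence).
MAX_RATING = 4

# Assumed ratings for Company A; only the column totals (four 1s,
# three 2s, two 3s, one 4) and two individual ratings appear in the text.
company_a_ratings = {
    # quality categories
    "executive support":       2,  # initial support, later only mild interest
    "process owner":           4,  # knew virtually every production application
    "process documentation":   1,  # little of the process was documented
    # efficiency categories
    "supplier involvement":    1,
    "process metrics":         2,
    "process integration":     1,  # segregated from change/problem management
    "streamlining/automation": 1,  # manually oriented, no automated scheduling
    # effectiveness categories
    "customer involvement":    2,
    "service metrics":         3,  # meaningful operational metrics collected
    "training of staff":       3,
}

def assessment_score(ratings):
    """Sum the 1-to-4 ratings and divide by the maximum possible total."""
    total = sum(ratings.values())
    return 100.0 * total / (MAX_RATING * len(ratings))

# Company A: 4(1) + 3(2) + 2(3) + 1(4) = 20 of a possible 40, or 50 percent.
print(f"{assessment_score(company_a_ratings):.0f}%")  # -> 50%
```

Any combination of ratings with those same column totals yields the identical 20-point total and 50 percent score; the worksheet's real value lies in showing which categories earned the low and high ratings.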

Company B

Company B is a satellite broadcast venture featuring informational programming. It is a relatively young firm at 15 years old. When it began, the technology of digital informational broadcasting was in its early refinement stages. This, among other reasons, made the company very willing to employ cutting-edge technology. It did so almost to a fault, using very advanced but questionably tested technology at the outset for its satellites. The company learned from these experiences, improved its technology, and eventually applied the same approach to its IT department by implementing cutting-edge but proven infrastructure processes.

Company B employed 1,500 workers, of whom 125 resided in IT. Their IT platform environment consisted of 40 servers and approximately 600 desktops. There was no production services function at Company B, nor was there a quality assurance group. They ran 125 production applications daily and deployed on average two new production applications per month. There was only the start of a change management process and no production acceptance process.

With the company poised to implement major enterprise applications, senior IT management realized they needed a formal production acceptance process. While preferring to do the work with their own staffs, they acknowledged limited in-house process expertise and hired professional consultants to run a pilot program. The IT executives were also very helpful in supplying qualified staff members from both applications development and operations to support the pilot program.

Since this was the first formal implementation of any infrastructure process, there was no integration with other processes and no immediate plans for any. While applications development was extremely helpful in designing the production acceptance process and testing it with a perfect pilot application, they did not provide adequate training and documentation to the operations help desk. This was partly due to a reshuffling of application priorities, which also delayed the implementation of the process with a fully deployed application.

To summarize Company B's strengths, they saw the need for professional support in designing a production acceptance process, they started out with pilot programs, and they staffed those programs with qualified people. As for their weaknesses, the company did not provide adequate training and documentation to the help desk group for their pilot program, and they allowed support for the production acceptance process to weaken.

In a manner similar to that described for Company A, we performed an initial assessment of the production acceptance environment for Company B (see Figure 9-3). Their points totaled 18, for a final assessment score of 45 percent.

Figure 9-3 Assessment Worksheet for Company B

Company C

Our third company is one of the seven major motion picture studios in southern California. Studios in Hollywood tend to be an interesting paradox. On the one hand, they are some of the most creative companies for which one could ever hope to work. This applies to the writing, directing, acting, special effects, and other artistic pursuits that go into the production of a major motion picture. But when it comes to the traditional, administrative support of the company, they are as conservative as can be. This was especially true in their IT departments, and Company C was certainly no different in this regard. By the late 1990s, its IT department needed to be significantly upgraded to meet aggressive new business expansions.

Company C employed 3,000 workers, of whom 200 resided in IT. Their IT platform environment consisted of two key midrange computers, 50 servers, and approximately 2,000 desktops. The company outsourced its mainframe processing, which still ran many of its core financial systems. There was no production services function at Company C, but there was a quality-assurance department that reported to an enterprise-planning group. Operations ran 150 production applications daily and deployed on average three new production applications per month. There was a formal, though not robust, change management process and an informal production acceptance process.

The IT executives at Company C conducted a studio-wide business assessment and determined that its current IT architecture would not support the future growth of the company. Many of the IT business systems would have to be upgraded or replaced, and there would have to be a major overhaul of the IT infrastructure and its processes to support the new application environment. Among the processes needing improvement was production acceptance. IT managers recognized the need and the opportunity to re-engineer their systems development life cycle (SDLC) methodology at the same time, and they committed the resources to do so. Software suppliers played key roles in these upgrades and re-engineering efforts. Managers also ensured that users, both internal and external to IT, received sufficient training on these new processes.

The IT quality assurance group at Company C worked closely with operations and developers in chartering a production services function and in designing a production acceptance process. Since QA reported to the applications development department, IT executives elected to have the production services function report there as well. This proved to be problematic in that the infrastructure group was often excluded from key deployment decisions. Another result of this arrangement was that the production services function provided little documentation or training to the service desk and computer operations teams.

Summing up Company C's strengths, they recognized the need to upgrade their antiquated processes, they committed resources to re-engineer the SDLC process, and they provided considerable training to users on new processes. As to their weaknesses, they did not involve the infrastructure when designing the production acceptance process, they moved the control of production acceptance into applications development and out of operations, and they provided little or no training and documentation for the help desk and operations.

Eventually, the production services function became little more than an extension of the QA department, which still reported to applications development. As a result, although the company did now have a production acceptance process in place, the lack of infrastructure ownership of it made it less robust and less effective. The key lesson learned here was that IT executives must ensure that operations control the production acceptance process and that development be involved in the process design from the start.

Similar to the previous companies, we performed an initial assessment of the production acceptance environment for Company C (see Figure 9-4). Their points totaled 19, for a final assessment score of 48 percent.

Figure 9-4 Assessment Worksheet for Company C

Company D

This company is a major defense contractor which has supplied major weapons systems to the United States and foreign governments for more than 60 years. Its customers are primarily the five branches of the U.S. armed forces and secondarily the militaries of foreign governments. The company manages both classified and non-classified programs, putting an additional premium on fail-safe security systems. It also supplies limited commercial aviation products.

At the time of our involvement, Company D employed 30,000 workers, of whom 2,000 resided in IT. Their IT platform environment consisted of eight mainframes, 10 midrange computers, 20 servers, and 5,000 desktops. There was a relatively formal production services function at Company D that reported to operations and a quality-assurance group that reported to applications development. They ran 500 production applications daily (dozens more on weekends) and deployed on average four new production applications per month. The company had very formal change management and production acceptance processes and was very committed to the practices of total quality and continuous process improvement.

The company also emphasized the use and analysis of meaningful metrics. By meaningful, we mean metrics that both customers and suppliers can use to improve the level of services provided. One of the most refreshing aspects of this company was their support of our prescribed process improvement sequence: integrate first, standardize second, streamline third, and automate last.

As with many government defense contractors, Company D found itself rushing to meet program milestones and this sometimes undermined infrastructure processes such as production acceptance. High-priority projects were allowed to bypass the process to meet critical deadlines. Plans to streamline and automate the production acceptance process became a victim of unfortunate timing. Just as they were about to be put into place, cutbacks in personnel prevented the plans from being implemented. Subsequent mergers and acquisitions brought about some temporary turf wars that further delayed the standardization of processes across all divisions.

To summarize Company D's strengths, they were committed to total quality and continuous process improvement, they analyzed their metrics thoroughly, and they strove sequentially to integrate, standardize, streamline, and then automate processes. To summarize their weaknesses, rushing to meet deadlines undermined the production acceptance process, high-priority projects were allowed to bypass the process, personnel cutbacks prevented the process from being streamlined, and occasional turf wars arose between IT departments.

Eventually, the standardization, streamlining, and automation of processes did occur among departments and across divisions and remote sites, and it brought with it significant operational and financial benefits. The standardization also helped facilitate future company acquisitions and the merging of remote sites.

As we did with our prior companies, we performed an initial assessment of the production acceptance environment for Company D (see Figure 9-5). They scored one of the highest initial assessments we had ever seen. Their points totaled 33, for a final assessment score of 83 percent.

Figure 9-5 Assessment Worksheet for Company D

Company E

Our next company is a dot-com victim, but fortunately not a casualty. Like many dot-com start-ups before it, this company began with a simple idea: to offer pop culture merchandise from television, motion pictures, sports, and other forms of entertainment. It had been in existence barely four years and was poised for significant growth when a shrinking national economy, coupled with fierce competition on the Internet, forced dramatic cutbacks. The company did survive, but on a much smaller scale.

Company E employed 75 workers, of whom 25 resided in IT. Their IT platform environment consisted of 10 servers and 80 desktops. There was no production services function at Company E, nor was there a quality-assurance group. They ran 25 production applications daily and deployed on average one new production application per month. The initial priorities of the company were to get their website up and operational and to start producing revenue. As a result, there were no change management or production acceptance processes in place. As the company started to grow, the need for these processes became more apparent.

Since the company was starting with a clean slate, there were no previous processes to undo, replace, or re-engineer. There were many young, energetic individuals who were eager to learn new skills and methods. The relatively small profile of existing applications, combined with the many new ones planned, gave us a large number of candidates from which to select a pilot program. Even so, a willing staff and a select group of pilot applications could not overcome the problems and changing priorities brought on by the company's rapid growth. Just as a process was about to be implemented, a new crisis would arise, putting the new procedure on hold.

A larger challenge, common to many dot-com companies, was the culture clash between the entrepreneurial spirit of those behind the company's initial success and the more disciplined approach of those charged with introducing structured processes into the environment. The clash was especially evident with the technical gurus, who were used to having free rein when deploying new applications, installing upgrades, or making routine maintenance changes. Those of us tasked with implementing infrastructure processes spent a fair amount of time negotiating, compromising, and marketing before achieving some positive results.

To summarize Company E's strengths, they were a high-energy start-up with no prior processes needing to be re-engineered, they had only a small profile of existing applications with many new ones planned (allowing for a number of pilot programs), and they had a young staff eager to learn new methods. As for their weaknesses, their rapid growth hindered the use of processes, their entrepreneurial culture clashed with disciplined processes, and their influential technical gurus were at times unwilling to support new processes.

Despite these drawbacks, we were able to design and pilot an initial production acceptance process. The process was much more streamlined than normal due to the accelerated nature of web-enabled applications. This streamlining actually helped to integrate it with a pilot change management process also being developed. The frequency of new application builds in this Internet environment at times made change management and production acceptance almost indistinguishable. This integration also facilitated much cross-training between infrastructure groups and applications development to ensure each area understood the other as changes and deployments were being planned.

As we did with our prior companies, we performed an initial assessment of the production acceptance environment for Company E (see Figure 9-6). As you might expect with a start-up, the assessment was relatively low (although they did score well for cross-training). Their points totaled 16, for a final assessment score of 40 percent.

Figure 9-6 Assessment Worksheet for Company E

Company F

This company did everything right, with one exception. It broke off from a relatively rigid, conservative parent company and vowed to be more flexible, progressive, and streamlined. The IT executives understood the importance of robust infrastructure processes and committed the resources to make them a reality. Their only flaw was an eagerness to dive headfirst into production acceptance before any semblance of a change management process was in place.

Company F employed 4,000 workers, of whom 300 resided in IT. Their IT platform environment consisted of two mainframe processors, two midrange computers, 30 servers, and approximately 1,800 desktops. There was a production services department at Company F that reported to an applications-support group, and there was a quality-assurance group that reported to applications development. They ran 700 production applications daily (a dozen or so more on weekends and during month-end closings) and deployed on average three new production applications per month. There was only the start of a change management process and no production acceptance process.

When the company first asked us to upgrade their IT environment by implementing robust infrastructure processes, they suggested we begin with production acceptance. They reasoned that this would be a natural place to start because they were planning to deploy several new critical applications during the upcoming year and already had an applications-support group in place. We conducted an initial assessment of their infrastructure and concluded that a change management process was more urgently needed than production acceptance. We based this conclusion on the number and variety of changes being made to their production environment, both locally and remotely, and on the fact that both were increasing at an accelerating rate.

The IT executives were very receptive to our recommendation about change management and were very supportive of our efforts to involve various departments within IT. They suggested that we include the remote sites as part of our strategy and committed time and resources to the process. Including the remote sites was a key addition since it allowed us to standardize and integrate the process across all locations. Even though a partial change management process was already in place, the IT managers recognized its disjointed nature and its lack of metrics and were willing to design a new process from scratch. They had seen little need in the past to collect or analyze metrics, but they were won over after seeing how effective metrics could be in managing changes and new deployments.

One downside during our involvement at Company F was the frequent reorganizations, especially concerning operations, applications support, and our new production services function. This delayed some of the process approvals and made some of the managers unwilling to select a pilot project for production acceptance because responsibilities for certain applications were likely to change.

As to Company F's strengths, they recognized that change management needed to be implemented prior to any other infrastructure process, their IT executives provided strong support for these processes, they included their remote sites as part of the strategy, and they were willing to start with a clean slate. As to their weaknesses, Company F initially saw little need for the use of metrics, they had no recognition of the need to analyze metrics, they reorganized frequently, which undermined attempts at process improvement, and they were unwilling to nominate a pilot production acceptance project.

Despite these hurdles, a very effective change management process was implemented at Company F. There was total standardization among three sites despite the fact that each site was separated from the others by more than 1,000 miles. Service and process metrics were in place and were regularly collected, analyzed, and distributed. The effort also laid the foundation for a production acceptance process that would shortly follow. The most significant lesson learned was how important it is to implement key processes in the proper sequence. We would not have been as successful with either change management or production acceptance had we not implemented them in the order we did.

As we did with our prior companies, we performed an initial assessment of the production acceptance environment for Company F (see Figure 9-7). Their prior establishment of an applications-support group meant they already had good service metrics, which were collected and analyzed on a regular basis. Their points totaled 27, for a final assessment score of 68 percent.

Figure 9-7 Assessment Worksheet for Company F

Company G

Company G is a relatively young financial services establishment that began eight years ago. It is successfully transitioning from a small start-up to a medium-sized enterprise. We have seen many a company at a similar point in its development struggle to transform from a novice firm into a mature organization. Company G does not seem to be struggling in this transformation. They have effectively promoted a culture of empowerment, honesty, and change, and it is very much in evidence in their everyday manner of doing business.

Company G employed 2,500 workers, of whom 150 resided in IT. Their IT platform environment consisted of 200 servers and approximately 1,500 desktops. The reason they had such a large number of servers in relation to desktops is that, for several years, each new application was given its own server. This was one of several reasons for instituting a production acceptance process. There was no production services function at Company G, although there was a quality-assurance group that reported to applications development. They ran 250 production applications daily and deployed an average of five new production applications per month. There was only the start of a change management process and no production acceptance process at the time we initiated our involvement.

Because the company was so young, it had few infrastructure processes in place. The upside to this was that there were few poor processes that needed to be reworked. IT executives recognized the need to implement robust infrastructure processes and were willing to hire full-time staff to help implement and maintain them, particularly change management, production acceptance, and business continuity. They also saw the huge benefits of integrating these processes and stressed the need to design and implement them in a coordinated fashion.

The company did have a few hurdles to overcome. Audits are a fact of life in the banking and financial services industry, and Company G had its share of them. This sometimes caused the company to focus more on the results of audits than on the quality of its processes and services. Another hurdle was the inexperience of critical team leads. This was no fault of the leads: the company believed strongly in promoting from within, and with such a young organization, this meant the leads needed some time to grow into their jobs. The company did invest well in training and mentoring to address this.

The rapid growth of the company also caused many shifts in priorities. Some pilot applications for production acceptance changed as a result, forcing the pilot to be restarted more than once. The production acceptance process did integrate well into their system development life cycle (SDLC) methodology, although an exorbitant amount of detail went into the analyses of these processes.

To review Company G's strengths, they provided a highly empowering environment, they were a relatively young firm with few poor processes, they integrated their processes well, and they were willing to hire full-time staff to implement a production acceptance process. As to their weaknesses, they sometimes placed more emphasis on audit results than on the quality of their processes and services, they lacked experienced team leads, their rapid growth caused frequent priority changes, and their production acceptance analysis was overly detailed.

This company used three excellent strategies in its process-improvement efforts:

  1. They used a simple design in their processes.
  2. They used widely accepted tools.
  3. They sought widespread involvement and agreement from multiple groups to ensure buy-in from all required areas.

These strategies worked very well in fashioning processes that were efficient, effective, and widely used.

As we did with our prior companies, we performed an initial assessment of the production acceptance environment for Company G (see Figure 9-8). Their points totaled 24, for a final assessment score of 60 percent.

Figure 9-8 Assessment Worksheet for Company G

Selected Companies Comparison in Summary

This concludes our discussion of the process experiences at the seven client companies. Table 9-5 presents a summary comparison of each company's overall assessment score, their relative strengths and weaknesses, and the lessons they and we learned from our process-improvement efforts.
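Each assessment score in the table can be reproduced from the worksheet point totals reported earlier, using the same divide-by-40 arithmetic shown in the Company A sketch. A quick illustrative check in Python:

```python
# Worksheet point totals reported in the text for Companies A through G.
points = {"A": 20, "B": 18, "C": 19, "D": 33, "E": 16, "F": 27, "G": 24}

for company, total in points.items():
    pct = 100 * total / 40
    print(f"Company {company}: {int(pct + 0.5)}%")  # round half up, as the text does
# -> 50%, 45%, 48%, 83%, 40%, 68%, 60%
```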

Table 9-5. Summary of Strengths, Weaknesses, and Lessons Learned for All Companies

Company A (assessment score: 50%)

  • Strengths: willing to try new techniques and new technologies; committed to continuous process improvement; technically competent staff; willing to collect meaningful metrics.
  • Weaknesses: tended not to interact with other IT staffs; little documented training; operations and development groups physically 25 miles apart; collected metrics but did not always analyze them.
  • Lessons learned: development and operations need to work together from the start; infrastructure support groups need to support the PA process and operations.

Company B (assessment score: 45%)

  • Strengths: saw need for professional support in designing PA processes; started out with pilot programs; staffed pilot programs with qualified staff.
  • Weaknesses: did not provide training and documentation to the help desk group; support for the PA process weakened after the pilot program; no plans to integrate with other processes.
  • Lessons learned: ensure the long-range commitment of IT; consider a change management process prior to a PA process.

Company C (assessment score: 48%)

  • Strengths: recognized need to upgrade antiquated processes; committed resources to re-engineer the SDLC; provided much training to users on new processes.
  • Weaknesses: did not involve the infrastructure; moved control of PA into development and out of operations; little or no training and documentation for the help desk and operations.
  • Lessons learned: IT executives must ensure that operations controls the PA process and that development is involved in the process design from the start.

Company D (assessment score: 83%)

  • Strengths: committed to Baldrige quality award criteria; analyzed metrics well; strove to integrate, standardize, streamline, and then automate.
  • Weaknesses: rush to meet deadlines undermined the following of the PA process; high-priority projects allowed to bypass the process; process not streamlined due to cutbacks; turf wars.
  • Lessons learned: there are significant benefits from standardizing across all divisions and remote sites; this helps merger integration.

Company E (assessment score: 40%)

  • Strengths: high-energy start-up with no prior processes to re-engineer; small profile of applications allowed for many pilots; young staff eager to learn.
  • Weaknesses: rapid growth hindered use of processes; entrepreneurial culture clashed with disciplined processes; influential gurus unwilling to support new processes.
  • Lessons learned: be aware of changing and conflicting cultures due to the unstructured and entrepreneurial nature of start-ups.

Company F (assessment score: 68%)

  • Strengths: recognized that change management must come first; total support of IT executives; remote sites part of strategy; willing to start with clean slate.
  • Weaknesses: saw little need to use metrics; no recognition of need to analyze metrics; frequent reorganizations undermined improvements; unwilling to nominate a pilot PA project.
  • Lessons learned: implement key processes in the proper sequence, such as a change management process prior to production services.

Company G (assessment score: 60%)

  • Strengths: highly empowering environment; relatively young firm with few poor processes; integrated processes well; willing to hire full-time staff to implement PA.
  • Weaknesses: more emphasis on audit results than on process quality; lack of experienced team leads; rapid growth caused frequent priority changes; PA analysis overly detailed.
  • Lessons learned: use simple, widely agreed-upon processes, strategies, and tools to ensure the buy-in of all required support groups.

PA = production acceptance
