
Improving Software Organizations: From Principles to Practice

Book

  • Sorry, this book is no longer in print.

Description

  • Copyright 2002
  • Dimensions: 7-3/8" x 9-1/4"
  • Edition: 1st
  • Book
  • ISBN-10: 0-201-75820-2
  • ISBN-13: 978-0-201-75820-7

Many practitioners are frustrated by software process improvement initiatives that fail, but now they can solve the problem, once and for all. Improving Software Organizations shares the field's most practical techniques for overcoming the key obstacles to process improvement. It offers a complete framework for software process improvement that draws upon the experiences of seven leading development organizations. This book contains powerful, experience-based lessons for planning, implementation, and ongoing management. Coverage includes: enhancing learning and knowledge transfer in the organization; choosing the right role for assessment methods such as CMM and Bootstrap; defining appropriate metrics for software improvement; and more. Mathiassen also offers practical guidance for maintaining the progress of a software improvement initiative once the lofty rhetoric and first wave of enthusiasm have died down.

Sample Content

Downloadable Sample Chapter

Download the sample chapter for this title:
mathiassench01.pdf

Table of Contents

I. LEARNING TO IMPROVE.

 1. Learning SPI in Practice.
 2. Mapping SPI Ideas and Practices.

II. LEARNING FROM EXPERIENCE.

 3. The Correct Effort.
 4. The Ambitious Effort.
 5. The Grassroots Effort.
 6. The Adolescent Effort.

III. INITIATING LEARNING.

 7. Learning from Assessments.
 8. From Problem Reports to Better Products.
 9. Problem Diagnosis in SPI.
10. Project Assessments.
11. A Framework for Selecting an Assessment Strategy.

IV. ORGANIZING FOR LEARNING.

12. Knowing and Implementing SPI.
13. Improving Customer Relations.
14. Strategies for Organizational Learning in SPI.

V. TECHNIQUES FOR LEARNING TO IMPROVE.

15. Implementing SPI: An Organizational Approach.
16. Risk Management in Process Action Teams.
17. Principles of Metrics Implementation.
18. Better Requirements.
Appendix A: Risk and Action Tables.
Research Team.
Index.

Preface

Global competition and customer demands for better software quality are pushing companies to undertake software process improvement (SPI) initiatives. However, the scale and complexity of SPI organizational change can be daunting, and when the effort is not managed with great skill, it is likely to fail. Software development managers and engineers know all too well the frustration of investing valuable resources without achieving the desired SPI outcomes.

In this book, Improving Software Organizations, we discuss ways to understand and develop the core competencies required to succeed with SPI. Our approach is pragmatic and action-oriented. We examine SPI experiences from real-world situations and distill from them essential lessons for planning, implementing, and managing SPI initiatives to successful completion.

Our book is the result of a collaboration between four Danish companies—Danske Data, Brüel & Kjær, Ericsson Denmark, and Systematic Software Engineering—three universities—Aalborg University, Copenhagen Business School, and the Technical University of Denmark—and an R&D organization, Delta. The project was part of the Danish National SPI Initiative and lasted from January 1997 to December 1999. It was funded in part by the government of Denmark through the Danish National Center for IT Research. During the three-year project, scientists and engineers from the companies and universities worked together on SPI projects within the companies. A primary objective of our collaboration was not only to successfully implement SPI in the companies but also to develop principles and strategies for effectively executing SPI initiatives. From the beginning, we set out to examine and develop solutions for difficult practical problems reported by other SPI experts. In these pages, we present our findings and reflections based on our experiences practicing SPI. We hope that you find our book informative and that it supports your efforts to solve the practical problems involved in planning and implementing your own SPI programs.

THE FOUR COMPANIES

Following is general information about each company. As you’ll see, the companies vary in size and in the products they make. They also have various objectives and approaches to SPI. Such variety offers us a unique opportunity to examine a broad range of SPI issues of interest to both software managers and engineers. You are thus likely to find many issues and problems presented in this book that are similar to those facing your own organization, as well as solutions that you can adapt and implement.

Brüel & Kjær A/S

Brüel & Kjær is a leading manufacturer of high-precision measuring instruments. These technically advanced instruments are used in many industries—including automotive, telecommunications, electricity, and aerospace—as well as in environmental measuring and university and industrial research. Brüel & Kjær’s measuring instruments are based on both embedded real-time software and Windows NT applications. The Brüel & Kjær product line covers the entire range of measurement equipment, from simple transducers to highly advanced software for calculating and presenting measurement results.

Brüel & Kjær’s main office is in Nærum (just north of Copenhagen), and the company operates more than 50 sales offices and agencies worldwide. In 1998, Brüel & Kjær was divided into two separate companies:

  • Brüel & Kjær Sound and Vibration Measurement
  • Brüel & Kjær Condition Monitoring Systems

    Sound and Vibration is the larger of the two companies, with 550 employees. Approximately 80 of these employees are development engineers, of whom 40 are software developers. Annually, 10 to 15 development projects are carried out, with 4 to 8 people in each project group. Condition Monitoring Systems has some 50 employees, of whom 10 are software developers. Over the past 10 to 15 years, Brüel & Kjær has been transformed from a company focused on hardware, mechanics, and electronics to a company focused on software. Today, two out of three engineers at Brüel & Kjær are software engineers. Most Brüel & Kjær employees have an engineering education; a few have backgrounds in business or computer science.

    In the mid-1990s, Brüel & Kjær transformed itself from a departmental organization to a project-oriented organization. As part of this process, the entire middle management layer was replaced. Several other employees were trained in project management and given responsibility for managing development projects in the new organization. During the 1990s, Brüel & Kjær carried out several other organizational change initiatives. In 1994, the company successfully completed ISO 9000 certification.

    When assessed in October 1996, Brüel & Kjær was measured at level 2.25 on the Bootstrap scale. It was the only one of the four collaborating companies that started the SPI project at maturity level 2. In the fall of 1999, Brüel & Kjær was again assessed using the Bootstrap model, and the result showed an increase in maturity to 2.5.

    Danske Data A/S

    Danske Data is a subsidiary of Danske Bank Group, a financial institution that provides all types of financial services (banking, mortgaging, insurance, and so on). The primary business function of Danske Data is the development of information technology (IT) systems for Danske Bank Group, including Danske Bank, the largest bank in Denmark. Danske Data was originally the IT department within the bank, but on July 1, 1996, it was spun off as an independent company. The company has approximately 900 employees located at four development centers and is one of Scandinavia’s largest IT companies.

    Software development projects at Danske Data vary widely in size; most are small and short-term, but there are also some major projects that have strategic implications for the entire corporation. Project teams of 3 to 5 people typically handle the smaller projects, which usually take 6 to 12 months. Large projects, such as the Year 2000 compliance project, typically involve as many as 150 people and last 6 months to 3 years. Danske Data has four development divisions, each headed by a senior vice president. Each individual division is led by a vice president and organized into departments, typically with 20 to 50 people divided among five or so projects. Project managers oversee regular projects, and the vice president manages high-profile projects. Software developers at Danske Data typically have a bachelor’s degree in either an IT-related field or banking.

    Danske Data develops software mainly for mainframe computers but also develops some applications for client/server environments, such as Internet banking. Danske Data mainframe applications run 24 hours a day and process a daily average of nine million transactions from about 11,000 workstations. The company’s mainframe installation is the largest in Northern Europe and is divided between two operation centers. Systems developed for this platform are based on an advanced event-oriented database principle, which increases data-processing flexibility. Security and reliability are the two main system requirements because data are mirrored in real time between the two operation centers in Århus and Copenhagen. Modern methods for modeling data, functions, and workflow are used along with the all-important business model—the information framework—which is crucial to getting stakeholders from the user organization involved in the development process.

    In May 1997, Danske Data conducted its first assessment of software process maturity. It used both the Capability Maturity Model (CMM) and Bootstrap assessment approaches, which showed the company to be midway between levels 1 and 2 (1.5 on the Bootstrap scale). Danske Data was again assessed in October 1999 and had by then reached level 2.0.

    Ericsson Denmark

    The Ericsson Corporation is one of the world’s largest suppliers of telecom equipment. During the past 20 years, the company has gradually transitioned from hardware-only products to embedded software products and pure software products. Ericsson’s major product areas are fixed and wireless switching equipment, mobile phones, telecommunication management systems, PBX systems, transmission equipment, defense systems, and Internet solutions—all of which rely heavily on software. Ericsson Denmark is a mid-sized systems development division within the Ericsson Corporation, employing approximately 500 people in five product groups.

    In early 1996, Ericsson Corporation changed its organizational structure from a line to a matrix organization. In the period following—from 1996 to 1998—Ericsson Denmark’s staff increased from 250 to 400, and each of its product groups reported to corresponding business units located in other countries. Both the Ericsson Corporation and Ericsson Denmark have a long history of improving software development. In 1992, the company took the first steps to set up a corporatewide SPI program, the Ericsson System Software Initiative (ESSI). From the beginning, ESSI was a strategic effort that ensured alignment, deployment, and follow-up on corporate SPI goals. ESSI’s first intervention was in Ericsson’s largest and most complex software development area, the telephone exchange software group. An aggressive goal was defined to reduce fault density in telephone exchange software products by 50% annually.

    Another important ESSI initiative focused on CMM as a long-term strategy for improving software development performance. The initiative was supported by the creation of an international corps of trained CMM assessors tasked with determining the level of software process maturity throughout the company. At the end of 1996, the ESSI program had been operational worldwide for a couple of years, and most of the company’s international software development sites had shown good progress toward reaching the corporate fault density goals.

    Ericsson Denmark was assessed at level 1 in 1995 and at level 2 in June 1998. In between the two assessments, the division underwent both Light Assessments and UltraLight Assessments.

    Systematic Software Engineering

    Systematic, founded in 1985, produces and integrates software for complex information and communications systems. Systematic’s international customers include military institutions and suppliers as well as data communication, transportation, and manufacturing companies and organizations in the finance and health care sectors. As a systems integrator, Systematic has established a core competency in the management and implementation of complex software projects that require high reliability and secure communications 24 hours a day. Systematic is recognized by its customers for the timely delivery of quality, cost-effective products.

    In 1996, Systematic employed 137 people. Of these employees, 105 were software engineers and 32 worked in finance, administration, internal IT, quality assurance, canteen, and cleaning. By 1999, the number of employees had grown to 155. At Systematic, all software development takes place in project teams, led by a project manager. Most managers started with the company as software engineers and were later trained internally for management responsibilities. In 1998-99, project teams ranged in size from 2 to 18 members and projects lasted from two months to three years. Typically, project members were not rotated out; they stayed with the project from the analysis phase through requirements specification, design, programming, test, documentation, installation, and user training. This practice reflects the company’s belief that such consistency ensures maximum commitment and development of staff competence.

    Despite the small number of graduates in computer science and systems engineering in Denmark, two-thirds of Systematic’s employees hold master’s or doctoral degrees. To ensure high flexibility and preparedness for change, the company recruits highly educated people with knowledge of state-of-the-art technologies. One of the main reasons Systematic undertook SPI was to help meet its goal of becoming an internationally recognized software supplier and systems integrator in communications and interoperability between defense units, and in electronic commerce and data interchange between enterprises. In 1992, Systematic’s quality assurance system was certified in accordance with ISO 9001 and the military standards AQAP 110 and 150. This ISO 9001-certified quality management system is the basis of numerous elements in Systematic’s quality assurance procedures.

    In 1997, Systematic conducted its first software process maturity assessment using both the CMM and Bootstrap approaches and was rated just under level 2 on the Bootstrap scale. In 1998 and 1999, the company conducted additional Bootstrap assessments; by 1999, it had reached level 2.5.

    THE STRUCTURE OF THE BOOK

    The book is divided into five parts. Part I consists of Chapters 1 and 2 and introduces the major learning points of our three-year collaborative project. In this first part, we present an overview—a map—of the theories and models that inspired us and formed the basis of our practice in the projects. Part II, Learning from Experience, is divided into four chapters. Each of these chapters characterizes the SPI experience of one of the four collaborating companies and is named accordingly. For example, Chapter 3, The Correct Effort, describes how Ericsson Denmark attempted first to follow standard advice, only to discover that adherence to general prescriptions did not bring the desired results. Thus, it had to deviate, ultimately producing a truly “correct” effort through innovation and adaptation to its particular circumstances.

    Part III, Initiating Learning, focuses on how to structure learning conditions and initiate learning in SPI initiatives. We discuss maturity level assessments as an important mechanism for learning. We have used a broad range of assessment methods. Some were inspired by formalized approaches, such as CMM or Bootstrap (discussed in Chapters 7 and 10), whereas others were invented in project groups (Chapters 8 and 9). Finally, Chapter 11 discusses how to select an appropriate assessment strategy. Part IV, Organizing for Learning, goes beyond assessments and takes a more reflective look at SPI: In Chapter 12, we reflect on knowledge transfer; in Chapter 13, we discuss customer maturity; and in Chapter 14 we focus on organizational learning in the SPI context.

    Part V examines interesting details in different techniques for SPI. Chapter 15 presents a framework for implementing SPI programs, and the remaining chapters offer detailed discussions of how to carry out risk assessments (Chapter 16), how to implement a metrics program (Chapter 17), and how to improve requirements specification (Chapter 18).

    This book is the result of a truly collaborative effort. The team of engineers and scientists who authored the chapters is listed at the very end of this book. Three of the authors—Lars Mathiassen, Jan Pries-Heje, and Ojelanki Ngwenyama—edited the book, assisted by Keri Schreiner, who worked closely with the authors to help them write for practitioners. Finally, the staff at Addison-Wesley provided valuable support in designing and producing the book.

    Index

    Aaen, Ivan, 50
    Aalborg University, xxi
        Questionnaire Based Assessment, 69, 70, 188–189, 192
    Action prioritization procedure, 284–285
    Action research, 316
    Adolescent effort. See Brüel & Kjær A/S
    Ambitious effort. See Systematic Software Engineering
    Analysis phase, 308–309
        gathering information in, 308–309
        identifying problem-solving techniques in, 309
        prioritizing techniques in, 309
    Assessments
        assessor experiences in, 128–131
        assisted model-based, 187–188, 189–190
        Bootstrap tool in, xxiv, xxv, 18, 30, 99, 101, 110, 115–117, 130–131, 133, 150, 153, 160, 162–163, 175, 185, 240, 257, 260, 290
        framework for selecting strategy, 185–198
            considering combined strategy, 194–195
            designing, 195–196
            intervention versus day-to-day management, 187, 194
            model-based versus problem-based, 186
            rigor versus relevance, 186, 191–192
            selecting primary strategy, 191–194
        independent, of individual projects, 188
        maturity, 154
        organizational learning and, 122–128
        problem diagnosis versus model-based, 162–163, 192–194
        questionnaire based, 69, 70, 188–189, 192
        software process, 190
    Assisted metrics programs, 188
    Assisted model-based assessment, 187–188, 189–190
    Assisted problem diagnosis, 190

    Balanced Scorecard initiative, 63–64
    Balkan syndrome, 16
    Bang, Stig, 50
    Beizer’s bug taxonomy, 136
    Benchmarking, 188
    Best-practices models, 30–31
    Bonuses, 31
    Bootstrap assessment, xxiv, xxv, 18, 30, 115–117, 133, 153, 185, 290
        algorithm in, 116–117
        at Brüel & Kjær, 99, 101, 110, 117, 118, 150, 160, 162–163
        at Danske Data, 117, 118, 175, 257, 260
        relevance of, 117
        at Systematic, 117, 118, 240
        validity of, 130–131
    Broad dissemination phase, 313–316
    Brüel & Kjær A/S, xxi, xxii–xxiii, 99–112, 133
        adolescent effort at, 41–42, 99–112
        Bootstrap assessment at, 99, 101, 110, 117, 118, 150, 160, 162–163
        configuration management initiative at, 107–108
        defect analysis at, 133–151
        development model initiative at, 103–105
        establishment of Center for Software Process Improvement at, 241
        focus on project managers at, 101, 111
        independent project-based metrics program at, 189
        methodology for Preventing Requirements Issues from Becoming Defects (PRIDE) at, 134, 140, 143, 146, 147
        organizational learning at, 123–124
        participatory improvement at, 11, 12
        Prevention of Errors through Experience-Driven Test Efforts (PET) at, 134, 137, 142, 146–147
        problem diagnosis at, 101–102, 108–109, 153–154, 161–162, 190
        problem reports at, 133–151
        problem solving at, 5–6
        product profile at, 100
        project dependency at, 111
        project tracking initiative at, 106–107
        requirements specification initiative at, 105–106
        reuse initiative at, 106
        software development process at, 135, 241–243, 247, 307, 308–309, 313–314
        strengths and weaknesses of, 108–109
        support teams at, 102–103, 108–109
        technology competence at, 111
    Bug classification, taxonomies for, 136
    Bug distribution analysis, 137–140
    Bug distribution over time, 141–142
    Bureaucratic arrangements, limiting, 13
    Bureaucratization, 34
    Business Process Reengineering (BPR), 25, 30

    Calculus-based trust, 222, 228
    Capability Maturity Model (CMM), xxiv, xxv, 18, 30, 133, 153, 185, 290
        at Ericsson Corporation, xxiv, xxv, 18, 30, 50
        accelerator assessment, 54–55, 58
        UltraLight assessments, 55–56, 58–59, 169–173
    CASE tools, 201
    Circumspection, 223
    Combination in knowledge creation, 10
    Commission of the European Communities (CEC), 133
    Commitment-based improvement, alternatives to, 32
    Communication, in metrics implementation, 299–301
    Compass development program for measuring productivity and quality, 188
    Competence, 34–35
    Computer-Aided Software Engineering (CASE), 30
    Configuration management
        at Brüel & Kjær A/S, 107–108
        at Systematic Software Engineering, 72
    Contel, metrics implementation at, 287
    Context for software engineering activities, 35–36
    Continuous improvement, 17–19
        factors that undermine, 18
    Copenhagen Business School, xxi
    Core competence, 31
        at Systematic Software Engineering, 67
    Correct effort. See Ericsson Denmark
    Cost/benefit analysis, in prioritizing techniques, 309
    Cost/time alignment, 31–32
    CSC Denmark, use of assisted model-based assessment strategy by, 188
    Curtis, Bill, 53, 54
    Customer-focused maturity models, 224–227
    Customer perception, 31
    Customer relations, 217–233
        care and engineering in, 223
        collaboration and competition in, 220–221
        early initiatives in, 231–232
        maturity models in, 224–227
        relationship dynamics in, 224
        at Systematic Software Engineering, 217–233
        trust and control in, 221–222
        understanding, 217–220
        workshop for improving, 227–230
    Customer-supplier relationship, 218–219
        bureaucracies in, 220–221
        at the constituting level, 221
        market relationships in, 220
        teams in, 220

    Danish National Center for IT Research, xxi
    Danish National SPI Initiative, xxi
    Danske Bank Group, xxiii
    Danske Data A/S, xxi, xxiii–xxiv, 83–98
        Bootstrap assessment at, 84, 92, 117, 118, 175, 189, 257, 260
        change as cultural process at, 93–94
        CMM at, 87
        continuous improvement at, 17–18
        creating synergy in, 96
        development model at, 85
        establishment of Project Management Competence Center at, 8–9, 86–87, 89, 90–91, 173, 245
        expecting and utilizing conflict, 97
        fostering cross-organizational learning at, 86
        grassroots effort at, 40–41, 83–98
        improvement initiatives at, 85–86
        independent project-based self-assessments at, 191
        involvement of key players, 96
        knowledge creation at, 8–9
        leadership integration at, 14
        long-term thinking in, 97
        metrics implementation at, 291–292, 294, 295, 296–297, 298, 299, 300, 301, 302
        organizational implementation at, 85–86
        organizational learning at, 125–126
        participatory improvement at, 13
        politics of change at, 94–95
        problem solving at, 5
        process measurement at, 85
        project assessment at, 168, 173–177
        project-management initiative at, 86–90, 91
        providing education at, 89–90, 91
        quality assurance at, 85
        questionnaire-based assessment at, 188–189
        rational view to change, 92–93
        remembering customers, 97
        risk management at, 273, 283
        shared vision in, 96
        software process improvement at, 208–212, 245–246, 247
    Datacentralen. See CSC Denmark
    Data usage in metrics implementation, 301–302
    Day-to-day management, intervention versus, 187, 194
    Debate, facilitation of, in metrics implementation, 301
    Defect analysis, 133–151
        at Brüel & Kjær A/S, 133–151
        bug-distribution analysis in, 137–140
        bug distribution over time, 141–142
        identifying prevention techniques in, 142–146
        improvements obtained in, 146–148
        management support for, 148–149
        maturity impact in, 149–150
        measurement limits in, 149
        problem reports in, 133–134, 135
        requirements-related analysis, 140–141
        taxonomy limits in, 149
    Delta, xxi, 150, 195
    Denmark, software process improvement in, xviii, xxi
    Descriptive process model, 160
    Development model
        at Brüel & Kjær A/S, 103–105
        at Danske Data, 85
    Diagnostic competence, building, 6
    Diffusion, as demand-driven, 202
    Dynamic analysis, 143

    Ericsson Denmark, xxi, xxiv–xxv, 49–64
        aftermath at, 56–58
        background, 50–51
        CMM accelerator assessment at, 54–55, 58
        CMM UltraLight Assessment at, 55–56, 58–59, 169–173
        correct effort at, 37–39
        ensuring management attention and feedback, 62
        future for, 63–64
        goals at, 49, 52
        independent model-based self-assessments at, 190–191
        involvement of practitioners, 61–62
        keeping efforts local, 62–63
        lack of process performance measures, 59
        launch of Balanced Scorecard by, 63–64
        launch of software process improvement at, 51–54
        leadership integration at, 16
        loss of momentum in software process improvement effort, 59
        need for focusing and simplifying of software process improvement at, 61
        organization and start-up at, 53–54
        participatory improvement at, 11, 13
        process action teams at, 243, 244
        progress at, 58
        project assessment at, 168–173
        promoting individual responsibility at, 62
        reflections on software process improvement process, 59–61
        resource pool formation at, 51
        software process engineering groups at, 244, 245
        software process improvement at, 243–245, 247
        structural complexity and growth at
    Ericsson System Software Initiative (ESSI), xxiv, 37–39, 50, 52
    European Systems and Software Initiative (ESSI) program, 292–293, 307
    Evolutionary process, software process improvement as, 29–30
    Experience-based requirements engineering methodology
        requirements elicitation and validation, 145
        verification of the requirements specification, 145
    Explicit knowledge, 237
    Exploitation, learning by, 250–251
    Exploration, learning by, 249–250
    Externalization in knowledge creation, 10
    External software stress test, 311

    Facilitator in risk management, 278, 282, 283
    Fayol, Henri, 92
    Feedback
        in problem diagnosis method, 159
        in software process improvement management, 28–29
    Focused pilot phase, 310–313

    Goal determination, in metrics implementation, 298
    Goal Question Metric (GQM) method, 31, 188, 294, 298
    Gold-plating, 250, 251
    Grassroots effort. See Danske Data

    Hewlett-Packard, metrics implementation at, 287
    Hit rate, 143, 144
    Holistic view of software engineering, 33–34
    Hospitality, 223
    Humphrey, Watts, xvii

    IDEAL (Initiate, Diagnose, Establish, Act, and Learn) model, 7, 19
        at Systematic Software Engineering, 66
    Identification-based trust, 222, 228
    Improvement actors, risk resolution actions associated with, 324
    Improvement areas
        risk items associated with, 319
        risk resolution actions associated with, 320
    Improvement diffusion, at Systematic Software Engineering, 76–79
    Improvement ideas
        risk items associated with, 320–321
        risk resolution actions associated with, 321
    Improvement knowledge, in metrics implementation, 294
    Improvement process
        risk items associated with, 322
        risk resolution actions associated with, 322–323
    Incentives, 31
        in metrics implementation, 296–297
    Independent model-based self-assessments, 190–191
    Independent project-based metrics program, 189
        at Brüel & Kjær A/S, 189
    Independent project-based self-assessments, 191
    Individualist perspective, 202, 205, 209
    Information, gathering, in analysis phase, 308–309
    Information technology projects, factors in success of, 11
    Initial value check, 311
    Insider solutions at Systematic Software Engineering, 65, 66, 73–74, 75–76, 78–79, 81
    Institutionalized process, resistance to change, 33
    Interactive-process perspective, 202–203, 206–207, 210–211
    Internalization in knowledge creation, 10
    Intervention assessment versus day-to-day management, 187, 194
    Interviews in problem diagnosis method, 156–158

    Kaizen strategy, 193
    Kierkegaard, Søren, 115
    Knowledge
        diffusion of, about techniques, 314
        explicit, 237
        in metrics implementation, 294–295
        tacit, 237
    Knowledge-based trust, 222, 228
    Knowledge creation
        combination in, 10
        externalization in, 10
        internalization in, 10
        socialization in, 10
        in software process improvement, 7–11
        systematic assessments in, 9, 10
    Knowledge transfer, implementation and, 212

    LDRA Testbed tool, 142
    Leadership
        integrating, 14–16
        organizational, 16
    Leadership principles, at Systematic Software Engineering, 68
    Learning
        by exploitation, 250–251
        by exploration, 249–250

    Management of software process improvement, 25–29
        feedback in, 28–29
        organization in, 25–27
        planning in, 27–28
    Mapping SPI ideas and practices, 23–43
    Market share, 31
    Maturity assessments, 154
        methods in, 168
        role of, in improvement efforts, 167
    Maturity models
        as basis for project assessments, 179
        customer-focused, 224–227
    Methodology for Preventing Requirements Issues from Becoming Defects (PRIDE), at Brüel & Kjær A/S, 134, 140, 143, 146, 147
    Metrics implementation, 287–304
        advanced guidelines in, 290
        communication in, 299–301
        at Danske Data, 291–292, 294, 295, 296–297, 298, 299, 300, 301, 302
        data collection in, 300
        data usage in, 301–302
        debate facilitation in, 301
        establishing incentive structures in, 296–297
        establishing project in, 296
        goal determination in, 298
        improvement knowledge in, 294
        knowledge in, 294–295
        organizational knowledge in, 295
        program design in, 297–299
        program organization in, 295–297
        publication of objectives in, 300
        starting simple in, 299
    Metrics program, independent project-based, 189
    Mission, establishing clear, for project assessment, 178
    Model-based assessments
        assisted, 187–188
        versus problem-based assessments, 186, 192–194
        versus problem diagnosis, 162–163
    Motorola’s Progress Assessment, 168
    Myths in undermining knowledge creation, 9–10

    NASA, metrics implementation at, 287
    Network Products, SPI in, 204–208
    Normative models, 42, 133, 153
        in knowledge creation, 10
    Norm-based approach for software process improvement, 30–31
    Norms, use of accepted, as basis for project assessment, 179
    Not-invented-here syndrome, in undermining knowledge creation, 9–10

    Objectives, publication of, in metrics implementation, 300
    Organizational approach to implementing software process improvement, 257–269
        diffusion in, 258–259
        implementation workshop in, 261–267
        lessons learned in, 267–269
        project context in, 260
    Organizational implementation at Danske Data, 85–86
    Organizational inertia, 31
    Organizational knowledge in metrics implementation, 295
    Organizational leadership, 16
    Organizational learning, 122–128
        at Brüel & Kjær A/S, 123–124
        at Danske Data, 125–126
        strategies for, 235–252
        at Systematic Software Engineering, 124–125
        theory of, 237–239
    Organizational maturity, 31
    Organization in software process improvement management, 25–27
    Orthogonality check, 311

    Participant commitment, 31–32
    Participatory improvement, in software process improvement, 11–14
    PATs. See Process action teams (PATs)
    Peer-based structure at Systematic Software Engineering, 67–68
    People Capability Maturity Model, 30–31
    Perception, 223
    Performance-based trust, 222, 228
    Performance specifications, 311
    Planning in software process improvement management, 27–28
    Prevention of Errors through Experience-Driven Test Efforts (PET), at Brüel & Kjær A/S, 134, 137, 142, 146–147
    Prioritizing techniques in analysis phase, 309
    Problem-based assessment versus model-based assessment, 186, 192–194
    Problem diagnosis
        analyzing immediate results in, 158
        assisted, 190
        at Brüel & Kjær A/S, 101–102, 108–109, 153–154, 161–162, 190
        comparison to model-based assessments, 162–163
        conducting interviews in, 158
        defining scope, 155–156
        offering feedback and obtaining validation in, 159
        organizational resources needed for, 164
        organizing interviews in, 157–158
        personnel for performing, 163–164
        preparing for interviews in, 156–157
        prioritizing findings in, 164
        synthesizing problems in, 159, 160
        timing in performing, 163
        using, 160–161
    Problem reports in product improvement, 133–151. See also Defect analysis
    Problems, synthesizing, in problem diagnosis method, 159, 160
    Problem solving
        at Brüel & Kjær A/S, 5–6
        identifying techniques, in analysis phase, 309
        interaction mode in, 19
        interventionist mode in, 19
        in SPI, 4–7
    Process action teams (PATs)
        at Ericsson Denmark, 243, 244
        risk management in, 273–285
        at Systematic Software Engineering, 74, 75, 239, 240
    Process assessments, 31
    Process improvement activities, 31
    Process improvement infrastructure, 31
    Process measurement at Danske Data, 85
    Product cycle time, 31
    Product development, using problem reports in, 133–151
    Product expert consultation, 311
    Productivity, Compass development program for measuring, 188
    Product profile at Brüel & Kjær A/S, 100
    Profitability, 31
    Program design in metrics implementation, 297–299
    Program organization in metrics implementation, 295–297
    Project, establishment of, in metrics implementation, 296
    Project assessments, 167–184
        at Danske Data, 168, 173–177
        ensuring support in, 179
        at Ericsson Denmark, 168–173
        establishing clear vision or mission for, 178
        establishing separate process for, 178–179
        implementing, in projects’ daily work, 179
        improving approach in, 179
        opportunities in, 180–182
        risks in, 181, 183
        supporting dedicated project-specific improvements, 180–181
        supporting organizational improvements, 181–183
        supporting software process improvement with, 177–183
        use of accepted norm, standard, or maturity model as basis for, 179
    Project conclusion, 160
    Project Diagnosis Workshops at Systematic Software Engineering, 231
    Project Establishment Workshops at Systematic Software Engineering, 231
    Project evaluation workshop, 231–232
    Project management
        at Danske Data, 86–90
        at Systematic Software Engineering, 72
    Project managers, focus on, at Brüel & Kjær A/S, 101
    Project teams at Systematic Software Engineering, 67–68
    Project tracking and control, 160
        at Brüel & Kjær A/S, 106–107
    Prototypes
        experimenting with, 160
        usability test of functional, 311

    Quality, Compass development program for measuring, 188
    Quality assurance
        at Danske Data, 85
        at Systematic Software Engineering, 68–70
    Quality management system, at Systematic Software Engineering, 9
    Questionnaire based assessment, 69–70, 188–189, 192

    Relevance-based assessment versus rigor-based assessment, 186, 191–192
    Requirements engineering techniques, 311
    Requirements-related analysis, 140–141
    Requirements specification initiative at Brüel & Kjær A/S, 105–106
    Reuse initiative at Brüel & Kjær A/S, 106
    Rigor-based assessment versus relevance-based assessment, 186, 191–192
    Risk management, 160
        at Danske Data A/S, 273, 283
        in the literature, 284
        in process action teams, 273–285
    Risk resolution actions
        improvement actors and, 324
        improvement areas and, 320
        improvement ideas and, 321
        improvement process and, 322–323
    Risks
        improvement actors and, 323
        improvement areas and, 319
        improvement ideas and, 320–321
        improvement process and, 322
        in project assessment, 183
    Rollout Workshop, at Systematic Software Engineering, 231

    Scenarios, 311
    Scope, defining, for problem diagnosis method, 155–156
    Self-assessments
        independent model-based, 190–191
        independent project-based, 191
    Silver bullet syndrome, 6
    Socialization in knowledge creation, 10
    Soft Systems Methodology, 7, 19
        interaction mode, 19
        interventionist mode, 19
    Software Acquisition Capability Maturity Model, 31, 224–225, 226
    Software capability evaluation, 190
    Software engineering
        holistic view of, 33–34
        as target of software process improvement, 32
    Software Engineering Institute, xvii
    Software organizations, Balkan syndrome at, 16
    Software process assessment, 190
    Software process engineering groups at Systematic Software Engineering, 238–239, 240
    Software processes, 32–34
        competence in, 34–35
        context in, 35–36
        defined, 32
        persistence in, 32–34
    Software process improvement (SPI), xviii, xxi
        alternatives to measuring effectiveness, 28–29
        assessment of current processes in, 3
        association with maturity assessments, 154
        benefits in collecting appropriate data, 29
        at Brüel & Kjær, 241–243, 247
        commitment in, 31–32
        continuous improvement planning in, 17–19
        at Danske Data, 208–212, 245–246, 247
        development of focused strategy for, 3
        difference from traditional approaches, 4
        encouraging participation in, 11–14
        at Ericsson Denmark, 243–245, 247
        as evolutionary process, 29–30
        goal of, 3–4
        implementation plan for, 27–28
        initiatives in, xxi
        integration of leadership in, 14–16
        knowing and implementing, 201–213
            framework for, 202–204
        knowledge creation in, 7–11
        management of
            feedback, 28–29
            organization, 25–27
            planning, 27–28
        mapping ideas and practices in, 23–43
        in Network Products, 204–208
        norm-based approach for, 30–31
        organizational approach to implementing, 31–32, 257–269
            diffusion in, 258–259
            implementation workshop in, 261–267
            lessons learned in, 267–269
            project context in, 260
        perspectives of, 32–36
        problem diagnosis in, 153–165
        problem solving in, 4–7
        rhetoric in, 3
        software engineering practice as target of, 32
        steps in improving, 3
        strategic factors in, 31
        strategies for organizational learning in, 235–252
        supporting, with project assessments, 177–183
        at Systematic Software Engineering, 239–241, 247
        tactical factors in, 31
    Software process improvement (SPI) group, 238–239
        establishment of, 26
        resources for, 26
        risks to institutionalizing, 26
    Software requirements specification and requirements management, 160
    Software reuse, 160
    SPI. See Software process improvement (SPI)
    SPICE (Software Process Improvement And Capability Determination), 18, 30, 185, 187, 290, 294
    SPIRE, 225, 226–227
    Stability, 31
    Stakeholder analysis, 268
    State-of-the-art knowledge in facilitating a knowledge creation approach to SPI, 9
    Static analysis, 143
    Strategic focus, 31
    Structuralist perspective, 202, 205, 209–210
    Structural control, 222
        balancing trust with, 222
    Support teams at Brüel & Kjær A/S, 102–103, 108–109
    Synquest, 168
    Systematic assessments in knowledge creation, 9, 10
    Systematic Software Engineering, xv–xvi, xxi, 65–82
        Aalborg University’s Questionnaire Based Assessment at, 69, 70
        ambitions at, 65, 73, 76, 80
        ambitious effort at, 39–40
        Bootstrap assessment at, 117, 118, 240
        configuration management at, 72
        core competence at, 67
        culture at, 67–68
        customer relations at, 217–233
        Establishing Requirements Workshop at, 231
        establishment of quality management system at, 9
        expectations for SPI at, 66
        goals at, 65, 71
        IDEAL model at, 66
        improvement diffusion at, 76–79
        insider solutions at, 65, 66, 73–74, 75–76, 78–79, 81
        leadership integration at, 15
        leadership principles at, 68
        organizational learning at, 124–125
        organization at, 67–68
        peer-based structure at, 67–68
        process action teams at, 74, 75, 239, 240
        Project Diagnosis Workshop at, 231
        Project Establishment Workshop at, 231
        Project Evaluation Workshop at, 231
        project management at, 72
        project teams at, 67–68
        quality assurance system at, 68–70
        role of the organization at, 80–81
        Rollout Workshop at, 231
        software process engineering groups at, 238–239, 240
        SPI at, 239–241, 247
        stalemates at, 65, 66, 72–73, 74–75, 76, 77–78, 81
        Test Planning Workshop at, 231
        work groups at, 71, 72
    Systems Engineering Capability Maturity Model, 31

    Tacit knowledge, 237
    Task pool, 89
    Taylor, Frederick, 92
    Teams. See also Process action teams (PATs)
        in customer-supplier relationship, 220
        in software development, 311, 312–313
        support, 102–103, 108–109
    Technical University, Denmark, xxi
    Technology transfer, xvii–xviii
        failures in, xvii
        as supply-driven, 202
    Test Planning Workshop at Systematic Software Engineering, 231
    Time-boxes, 104–105
    Total Quality Management (TQM), 25, 193
    Trust, 221–222
        balancing with structural control, 222
        calculus-based, 222, 228
        identification-based, 222, 228
        knowledge-based, 222, 228
        performance-based, 222, 228

    Understanding, 223
    Unit testing, 134
    Usability test of functional prototype, 311

    Validation, offering, in problem diagnosis method, 159
    Vision, 31
        establishing clear, for project assessment, 178

    Waterfall model, 134
    Weber, Max, 92
    Work groups at Systematic Software Engineering, 71, 72
