
Improving Software Organizations: From Principles to Practice

  • Your Price: $35.99
  • List Price: $44.99
  • Available on demand.


  • Copyright 2002
  • Dimensions: 7-3/8 x 9-1/4
  • Pages: 368
  • Edition: 1st
  • Book
  • ISBN-10: 0-201-75820-2
  • ISBN-13: 978-0-201-75820-7

Global competition, the time sensitivity of the new Internet economy, and increasing customer demand for better software quality are pushing companies to undertake software process improvement (SPI) initiatives. Numerous software organizations worldwide have implemented these initiatives with varying degrees of success. Many have adhered to standard SPI practice, only to experience less-than-satisfactory results when the execution proves more difficult than expected and enthusiasm and resources wane.

Improving Software Organizations offers a modern perspective on SPI, outlining what it takes to move from SPI theory to successful SPI initiatives. Based on the results of the three-year Danish National SPI Initiative, the book draws on the experiences of four world-class companies—Danske Data, Brüel & Kjær, Ericsson Denmark, and Systematic Software Engineering. It distills in-depth studies of these organizations—the strategies, approaches, and specific techniques that yielded tangible results—into a proven roadmap for successful SPI, and it presents a comprehensive framework for planning and executing SPI projects throughout the project lifecycle.

Improving Software Organizations presents the major lessons learned in the four companies. It provides an overview of the theories and models that formed the basis of the SPI initiatives, along with an in-depth examination of the four companies’ development organizations—how each began its SPI initiative, what mistakes were made, and how they ultimately succeeded.

You will learn:

  • The five key principles of SPI: focus on problems, emphasize knowledge creation, encourage participation, integrate leadership, and plan for continuous improvement
  • How diverse companies adapt standard SPI theory to achieve desired results
  • How to structure learning conditions in SPI initiatives
  • Maturity level assessments, including CMM, BOOTSTRAP, and other customized approaches
  • Knowledge transfer, customer maturity, and organizational learning
  • Proper paths for carrying out risk assessments
  • The specifics of implementing a metrics program
  • Tips on improving requirements specification

For each of the five SPI principles, the book offers examples from practice that demonstrate how successful organizations approached the issue. From these examples and the more detailed case studies, you will gain an understanding of how to design, implement, and execute an SPI initiative that is right for your organization.


    Sample Content

    Table of Contents


     1. Learning SPI in Practice.
     2. Mapping SPI Ideas and Practices.


     3. The Correct Effort.
     4. The Ambitious Effort.
     5. The Grassroots Effort.
     6. The Adolescent Effort.


     7. Learning from Assessments.
     8. From Problem Reports to Better Products.
     9. Problem Diagnosis in SPI.
    10. Project Assessments.
    11. A Framework for Selecting an Assessment Strategy.


    12. Knowing and Implementing SPI.
    13. Improving Customer Relations.
    14. Strategies for Organizational Learning in SPI.


    15. Implementing SPI: An Organizational Approach.
    16. Risk Management in Process Action Teams.
    17. Principles of Metrics Implementation.
    18. Better Requirements.
    Appendix A: Risk and Action Tables.
    Research Team.
    Index.


    Global competition and customer demands for better software quality are pushing companies to undertake software process improvement (SPI) initiatives. However, the scale and complexity of SPI organizational change can be daunting, and when the effort is not managed with great skill, it is likely to fail. Software development managers and engineers know all too well the frustration of investing valuable resources without achieving the desired SPI outcomes.

    In this book, Improving Software Organizations, we discuss ways to understand and develop the core competencies required to succeed with SPI. Our approach is pragmatic and action-oriented. We examine SPI experiences from real-world situations and distill from them essential lessons for planning, implementing, and managing SPI initiatives to successful completion.

    Our book is the result of a collaboration among four Danish companies (Danske Data, Brüel & Kjær, Ericsson Denmark, and Systematic Software Engineering), three universities (Aalborg University, Copenhagen Business School, and the Technical University of Denmark), and an R&D organization, Delta. The project was part of the Danish National SPI Initiative and lasted from January 1997 to December 1999. It was funded in part by the government of Denmark through the Danish National Center for IT Research. During the three-year project, scientists and engineers from the companies and universities worked together on SPI projects within the companies. A primary objective of our collaboration was not only to implement SPI successfully in the companies but also to develop principles and strategies for effectively executing SPI initiatives. From the beginning, we set out to examine and develop solutions for difficult practical problems reported by other SPI experts. In these pages, we present our findings and reflections based on our experiences practicing SPI. We hope that you find our book informative and that it supports your own efforts to solve the practical problems involved in planning and implementing your own SPI programs.


    Following is general information about each company. As you’ll see, the companies vary in size and in the products they make. They also have various objectives and approaches to SPI. Such variety offers us a unique opportunity to examine a broad range of SPI issues of interest to both software managers and engineers. You are thus likely to find many issues and problems presented in this book that are similar to those facing your own organization, as well as solutions that you can adapt and implement.

    Brüel & Kjær A/S

    Brüel & Kjær is a leading manufacturer of high-precision measuring instruments. These technically advanced instruments are used in many industries—including automotive, telecommunications, electricity, and aerospace—as well as in environmental measuring and university and industrial research. Brüel & Kjær’s measuring instruments are based on both embedded real-time software and Windows NT applications. The Brüel & Kjær product line covers the entire range of measurement equipment, from simple transducers to highly advanced software for calculating and presenting measurement results.

    Brüel & Kjær’s main office is in Nærum (just north of Copenhagen), and the company operates more than 50 sales offices and agencies worldwide. In 1998, Brüel & Kjær was divided into two separate companies:

  • Brüel & Kjær Sound and Vibration Measurement
  • Brüel & Kjær Condition Monitoring Systems

    Sound and Vibration is the larger of the two companies, with 550 employees. Approximately 80 of these employees are development engineers, of whom 40 are software developers. Annually, 10 to 15 development projects are carried out, with 4 to 8 people in each project group. Condition Monitoring Systems has some 50 employees, of whom 10 are software developers. Over the past 10 to 15 years, Brüel & Kjær has been transformed from a company focused on hardware, mechanics, and electronics to a company focused on software. Today, two out of three engineers at Brüel & Kjær are software engineers. Most Brüel & Kjær employees have an engineering education; a few have backgrounds in business or computer science.

    In the mid-1990s, Brüel & Kjær transformed itself from a departmental organization to a project-oriented organization. As part of this process, the entire middle management layer was replaced. Several other employees were trained in project management and given responsibility for managing development projects in the new organization. During the 1990s, Brüel & Kjær carried out several other organizational change initiatives. In 1994, the company successfully completed ISO 9000 certification.

    When assessed in October 1996, Brüel & Kjær was measured at level 2.25 on the Bootstrap scale. It was the only one of the four collaborating companies that started the SPI project at maturity level 2. In the fall of 1999, Brüel & Kjær was again assessed using the Bootstrap model, and the result showed an increase in maturity to 2.5.
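
    As a rough illustration of how fractional maturity levels such as 2.25 and 2.5 can arise, the sketch below assumes a Bootstrap-style aggregation in which per-process-area capability ratings are averaged and reported in quarter-level steps. The process areas and ratings shown are hypothetical, and the actual Bootstrap algorithm (covered in Chapter 7) may differ in its details.

        # Minimal sketch of Bootstrap-style maturity aggregation (an assumption
        # for illustration, not the book's algorithm): per-process-area ratings
        # are averaged and the result is reported to quarter-level precision.

        def maturity_score(area_ratings):
            """Average the ratings and round to the nearest 0.25."""
            mean = sum(area_ratings.values()) / len(area_ratings)
            return round(mean * 4) / 4

        # Hypothetical ratings for four process areas.
        ratings = {
            "project management": 2.0,
            "configuration management": 2.5,
            "requirements specification": 2.0,
            "testing": 2.5,
        }

        print(maturity_score(ratings))  # -> 2.25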

    Danske Data A/S

    Danske Data is a subsidiary of Danske Bank Group, a financial institution that provides all types of financial services (banking, mortgaging, insurance, and so on). The primary business function of Danske Data is the development of information technology (IT) systems for Danske Bank Group, including Danske Bank, the largest bank in Denmark. Danske Data was originally the IT department within the bank, but on July 1, 1996, it was spun off as an independent company. The company has approximately 900 employees located at four development centers and is one of Scandinavia’s largest IT companies.

    Software development projects at Danske Data vary widely in size; most are small and short-term, but there are also some major projects that have strategic implications for the entire corporation. Project teams of 3 to 5 people typically handle the smaller projects, which usually take 6 to 12 months. Large projects, such as the Year 2000 compliance project, typically involve as many as 150 people and last 6 months to 3 years. Danske Data has four development divisions, each headed by a senior vice president. Each division is organized into departments, each led by a vice president, typically with 20 to 50 people divided among five or so projects. Project managers oversee regular projects, and the vice president manages high-profile projects. Software developers at Danske Data typically have a bachelor’s degree in either an IT-related field or banking.

    Danske Data develops software mainly for mainframe computers but also develops some applications for client/server environments, such as Internet banking. Danske Data mainframe applications run 24 hours a day and process a daily average of nine million transactions from about 11,000 workstations. The company’s mainframe installation is the largest in Northern Europe and is divided between two operation centers. Systems developed for this platform are based on an advanced event-oriented database principle, which increases data-processing flexibility. Security and reliability are the two main system requirements, and data are mirrored in real time between the two operation centers in Århus and Copenhagen. Modern methods for modeling data, functions, and workflow are used along with the all-important business model—information framework—which is crucial to getting stakeholders from the user organization involved in the development process.

    In May 1997, Danske Data conducted its first assessment of software process maturity. It used both the Capability Maturity Model (CMM) and Bootstrap assessment approaches, which placed the company between levels 1 and 2 (1.5 on the Bootstrap scale). When Danske Data was assessed again in October 1999, it had reached level 2.0.

    Ericsson Denmark

    The Ericsson Corporation is one of the world’s largest suppliers of telecom equipment. During the past 20 years, the company has gradually transitioned from hardware-only products to embedded software products and pure software products. Ericsson’s major product areas are fixed and wireless switching equipment, mobile phones, telecommunication management systems, PBX systems, transmission equipment, defense systems, and Internet solutions—all of which rely heavily on software. Ericsson Denmark is a mid-sized systems development division within the Ericsson Corporation, employing approximately 500 people in five product groups.

    In early 1996, Ericsson Corporation changed its organizational structure from a line to a matrix organization. In the period following—from 1996 to 1998—Ericsson Denmark’s staff increased from 250 to 400, and each of its product groups reported to corresponding business units located in other countries. Both the Ericsson Corporation and Ericsson Denmark have a long history of improving software development. In 1992, the company took the first steps to set up a corporatewide SPI program, the Ericsson System Software Initiative (ESSI). From the beginning, ESSI was a strategic effort that ensured alignment, deployment, and follow-up on corporate SPI goals. ESSI’s first intervention was in Ericsson’s largest and most complex software development area, the telephone exchange software group. An aggressive goal was defined to reduce fault density in telephone exchange software products by 50% annually.

    Another important ESSI initiative focused on CMM as a long-term strategy for improving software development performance. The initiative was supported by the creation of an international corps of trained CMM assessors tasked with determining the level of software process maturity throughout the company. At the end of 1996, the ESSI program had been operational worldwide for a couple of years, and most of the company’s international software development sites had shown good progress toward reaching the corporate fault density goals.

    Ericsson Denmark was assessed at level 1 in 1995 and at level 2 in June 1998. In between the two assessments, the division underwent both Light Assessments and UltraLight Assessments.

    Systematic Software Engineering

    Systematic, founded in 1985, produces and integrates software for complex information and communications systems. Systematic’s international customers include military institutions and suppliers as well as data communication, transportation, and manufacturing companies and organizations in the finance and health care sectors. As a systems integrator, Systematic has established a core competency in the management and implementation of complex software projects that require high reliability and secure communications 24 hours a day. Systematic is recognized by its customers for the timely delivery of quality, cost-effective products.

    In 1996, Systematic employed 137 people. Of these employees, 105 were software engineers and 32 worked in finance, administration, internal IT, quality assurance, canteen, and cleaning. By 1999, the number of employees had grown to 155. At Systematic, all software development takes place in project teams, led by a project manager. Most managers started with the company as software engineers and were later trained internally for management responsibilities. In 1998-99, project teams ranged in size from 2 to 18 members and projects lasted from two months to three years. Typically, project members were not rotated out; they stayed with the project from the analysis phase through requirements specification, design, programming, test, documentation, installation, and user training. This practice reflects the company’s belief that such consistency ensures maximum commitment and development of staff competence.

    Despite the small number of graduates in computer science and systems engineering in Denmark, two-thirds of Systematic’s employees hold master’s or doctoral degrees. To facilitate high flexibility and preparedness for change, the company recruits highly educated people with knowledge of state-of-the-art technologies. One of the main reasons Systematic undertook SPI was to help meet its goal of becoming an internationally recognized software supplier and systems integrator in communications and interoperability between defense units, and in electronic commerce and data interchange between enterprises. In 1992, Systematic’s quality assurance system was certified in accordance with ISO 9001 and the military standards AQAP 110 and 150. The ISO 9001-certified quality management system is the basis of numerous elements in Systematic’s quality assurance procedures.

    In 1997, Systematic conducted its first software process maturity assessment using both the CMM and Bootstrap approaches and was rated just under level 2 on the Bootstrap scale. In 1998 and 1999, the company conducted additional Bootstrap assessments; by 1999, it had reached level 2.5 on the Bootstrap maturity scale.


    The book is divided into five parts. Part I consists of Chapters 1 and 2 and introduces the major learning points of our three-year collaborative project. In this first part, we present an overview—a map—of the theories and models that inspired us and formed the basis of our practice in the projects. Part II, Learning from Experience, is divided into four chapters. Each of these chapters characterizes the SPI experience of one of the four collaborating companies and is named accordingly. For example, Chapter 3, The Correct Effort, describes how Ericsson Denmark attempted first to follow standard advice, only to discover that adherence to general prescriptions did not bring the desired results. Thus, it had to deviate, ultimately producing a truly “correct” effort through innovation and adaptation to its particular circumstances.

    Part III, Initiating Learning, focuses on how to structure learning conditions and initiate learning in SPI initiatives. We discuss maturity level assessments as an important mechanism for learning. We have used a broad range of assessment methods: some were inspired by formalized approaches, such as CMM or Bootstrap (discussed in Chapters 7 and 10), whereas others were invented in project groups (Chapters 8 and 9). Finally, Chapter 11 discusses how to select an appropriate assessment strategy. Part IV, Organizing for Learning, goes beyond assessments and takes a more reflective look at SPI: in Chapter 12, we reflect on knowledge transfer; in Chapter 13, we discuss customer maturity; and in Chapter 14, we focus on organizational learning in the SPI context.

    Part V examines interesting details in different techniques for SPI. Chapter 15 presents a framework for implementing SPI programs, and the remaining chapters offer detailed discussions of how to carry out risk assessments (Chapter 16), how to implement a metrics program (Chapter 17), and how to improve requirements specification (Chapter 18).

    This book is the result of a truly collaborative effort. The team of engineers and scientists who authored the chapters is listed at the very end of this book. Three of the authors—Lars Mathiassen, Jan Pries-Heje, and Ojelanki Ngwenyama—edited the book, assisted by Keri Schreiner, who worked closely with the authors to help them write for practitioners. Finally, the staff at Addison-Wesley provided valuable support in designing and producing the book.



    Aaen, Ivan, 50
    Aalborg University, xxi
        Questionnaire Based Assessment, 69, 70, 188–189, 192
    Action prioritization procedure, 284–285
    Action research, 316
    Adolescent effort. See Brüel & Kjær A/S
    Ambitious effort. See Systematic Software Engineering
    Analysis phase, 308–309
        gathering information in, 308–309
        identifying problem-solving techniques in, 309
        prioritizing techniques in, 309
    Assessments
        assessor experiences in, 128–131
        assisted model-based, 187–188, 189–190
        Bootstrap tool in, xxiv, xxv, 18, 30, 99, 101, 110, 115–117, 130–131, 133, 150, 153, 160, 162–163, 175, 185, 240, 257, 260, 290
        framework for selecting strategy, 185–198
        considering combined strategy, 194–195
        designing, 195–196
        intervention versus day-to-day management, 187, 194
        model-based versus problem-based, 186
        rigor versus relevance, 186, 191–192
        selecting primary strategy, 191–194
        independent, of individual projects, 188
        maturity, 154
        organizational learning and, 122–128
        problem diagnosis versus model-based, 162–163, 192–194
        questionnaire based, 69, 70, 188–189, 192
        software process, 190
    Assisted metrics programs, 188
    Assisted model-based assessment, 187–188, 189–190
    Assisted problem diagnosis, 190

    Balanced Scorecard initiative, 63–64
    Balkan syndrome, 16
    Bang, Stig, 50
    Beizer’s bug taxonomy, 136
    Benchmarking, 188
    Best-practices models, 30–31
    Bonuses, 31
    Bootstrap assessment, xxiv, xxv, 18, 30, 115–117, 133, 153, 185, 290
        algorithm in, 116–117
        at Brüel & Kjær, 99, 101, 110, 117, 118, 150, 160, 162–163
        at Danske Data, 117, 118, 175, 257, 260
        relevance of, 117
        at Systematic, 117, 118, 240
        validity of, 130–131
    Broad dissemination phase, 313–316
    Brüel & Kjær A/S, xxi, xxii–xxiii, 99–112, 133
        adolescent effort at, 41–42, 99–112
        Bootstrap assessment at, 99, 101, 110, 117, 118, 150, 160, 162–163
        configuration management initiative at, 107–108
        defect analysis at, 133–151
        development model initiative at, 103–105
        establishment of Center for Software Process Improvement at, 241
        focus on project managers at, 101, 111
        independent project-based metrics program at, 189
        methodology for Preventing Requirements Issues from Becoming Defects (PRIDE) at, 134, 140, 143, 146, 147
        organizational learning at, 123–124
        participatory improvement at, 11, 12
        Prevention of Errors through Experience-Driven Test Efforts (PET) at, 134, 137, 142, 146–147
        problem diagnosis at, 101–102, 108–109, 153–154, 161–162, 190
        problem reports at, 133–151
        problem solving at, 5–6
        product profile at, 100
        project dependency at, 111
        project tracking initiative at, 106–107
        requirements specification initiative at, 105–106
        reuse initiative at, 106
        software development process at, 135, 241–243, 247, 307, 308–309, 313–314
        strengths and weaknesses of, 108–109
        support teams at, 102–103, 108–109
        technology competence at, 111
    Bug classification, taxonomies for, 136
    Bug distribution analysis, 137–140
    Bug distribution over time, 141–142
    Bureaucratic arrangements, limiting, 13
    Bureaucratization, 34
    Business Process Reengineering (BPR), 25, 30

    Calculus-based trust, 222, 228
    Capability Maturity Model (CMM), xxiv, xxv, 18, 30, 133, 153, 185, 290
        at Ericsson Corporation, xxiv, xxv, 18, 30, 50
        accelerator assessment, 54–55, 58
        UltraLight assessments, 55–56, 58–59, 169–173
    CASE tools, 201
    Circumspection, 223
    Combination in knowledge creation, 10
    Commission of the European Communities (CEC), 133
    Commitment-based improvement, alternatives to, 32
    Communication, in metrics implementation, 299–301
    Compass development program for measuring productivity and quality, 188
    Competence, 34–35
    Computer-Aided Software Engineering (CASE), 30
    Configuration management
        at Brüel & Kjær A/S, 107–108
        at Systematic Software Engineering, 72
    Contel, metrics implementation at, 287
    Context for software engineering activities, 35–36
    Continuous improvement, 17–19
        factors that undermine, 18
    Copenhagen Business School, xxi
    Core competence, 31
        at Systematic Software Engineering, 67
    Correct effort. See Ericsson Denmark
    Cost/benefit analysis, in prioritizing techniques, 309
    Cost/time alignment, 31–32
    CSC Denmark, use of assisted model-based assessment strategy by, 188
    Curtis, Bill, 53, 54
    Customer-focused maturity models, 224–227
    Customer perception, 31
    Customer relations, 217–233
        care and engineering in, 223
        collaboration and competition in, 220–221
        early initiatives in, 231–232
        maturity models in, 224–227
        relationship dynamics in, 224
        at Systematic Software Engineering, 217–233
        trust and control in, 221–222
        understanding, 217–220
        workshop for improving, 227–230
    Customer-supplier relationship, 218–219
        bureaucracies in, 220–221
        at the constituting level, 221
        market relationships in, 220
        teams in, 220

    Danish National Center for IT Research, xxi
    Danish National SPI Initiative, xxi
    Danske Bank Group, xxiii
    Danske Data A/S, xxi, xxiii–xxiv, 83–98
        Bootstrap assessment at, 84, 92, 117, 118, 175, 189, 257, 260
        change as cultural process at, 93–94
        CMM at, 87
        continuous improvement at, 17–18
        creating synergy in, 96
        development model at, 85
        establishment of Project Management Competence Center at, 8–9, 86–87, 89, 90–91, 173, 245
        expecting and utilizing conflict, 97
        fostering cross-organizational learning at, 86
        grassroots effort at, 40–41, 83–98
        improvement initiatives at, 85–86
        independent project-based self-assessments at, 191
        involvement of key players, 96
        knowledge creation at, 8–9
        leadership integration at, 14
        long-term thinking in, 97
        metrics implementation at, 291–292, 294, 295, 296–297, 298, 299, 300, 301, 302
        organizational implementation at, 85–86
        organizational learning at, 125–126
        participatory improvement at, 13
        politics of change at, 94–95
        problem solving at, 5
        process measurement at, 85
        project assessment at, 168, 173–177
        project-management initiative at, 86–90, 91
        providing education at, 89–90, 91
        quality assurance at, 85
        questionnaire-based assessment at, 188–189
        rational view to change, 92–93
        remembering customers, 97
        risk management at, 273, 283
        shared vision in, 96
        software process improvement at, 208–212, 245–246, 247
    Datacentralen. See CSC Denmark
    Data usage in metrics implementation, 301–302
    Day-to-day management, intervention versus, 187, 194
    Debate, facilitation of, in metrics implementation, 301
    Defect analysis, 133–151
        at Brüel & Kjær A/S, 133–151
        bug-distribution analysis in, 137–140
        bug distribution over time, 141–142
        identifying prevention techniques in, 142–146
        improvements obtained in, 146–148
        management support for, 148–149
        maturity impact in, 149–150
        measurement limits in, 149
        problem reports in, 133–134, 135
        requirements-related analysis, 140–141
        taxonomy limits in, 149
    Delta, xxi, 150, 195
    Denmark, software process improvement in, xviii, xxi
    Descriptive process model, 160
    Development model
        at Brüel & Kjær A/S, 103–105
        at Danske Data, 85
    Diagnostic competence, building, 6
    Diffusion, as demand-driven, 202
    Dynamic analysis, 143

    Ericsson Denmark, xxi, xxiv–xxv, 49–64
        aftermath at, 56–58
        background, 50–51
        CMM accelerator assessment at, 54–55, 58
        CMM UltraLight Assessment at, 55–56, 58–59, 169–173
        correct effort at, 37–39
        ensuring management attention and feedback, 62
        future for, 63–64
        goals at, 49, 52
        independent model-based self-assessments at, 190–191
        involvement of practitioners, 61–62
        keeping efforts local, 62–63
        lack of process performance measures, 59
        launch of Balanced Scorecard by, 63–64
        launch of software process improvement at, 51–54
        leadership integration at, 16
        loss of momentum in software process improvement effort, 59
        need for focusing and simplifying of software process improvement at, 61
        organization and start-up at, 53–54
        participatory improvement at, 11, 13
        process action teams at, 243, 244
        progress at, 58
        project assessment at, 168–173
        promoting individual responsibility at, 62
        reflections on software process improvement process, 59–61
        resource pool formation at, 51
        software process engineering groups at, 244, 245
        software process improvement at, 243–245, 247
        structural complexity and growth at
    Ericsson System Software Initiative (ESSI), xxiv, 37–39, 50, 52
    European Systems and Software Initiative (ESSI) program, 292–293, 307
    Evolutionary process, software process improvement as, 29–30
    Experience-based requirements engineering methodology
        requirements elicitation and validation, 145
        verification of the requirements specification, 145
    Explicit knowledge, 237
    Exploitation, learning by, 250–251
    Exploration, learning by, 249–250
    Externalization in knowledge creation, 10
    External software stress test, 311

    Facilitator in risk management, 278, 282, 283
    Fayol, Henri, 92
    Feedback
        in problem diagnosis method, 159
        in software process improvement management, 28–29
    Focused pilot phase, 310–313

    Goal determination, in metrics implementation, 298
    Goal Question Metric (GQM) method, 31, 188, 294, 298
    Gold-plating, 250, 251
    Grassroots effort. See Danske Data

    Hewlett-Packard, metrics implementation at, 287
    Hit rate, 143, 144
    Holistic view of software engineering, 33–34
    Hospitality, 223
    Humphrey, Watts, xvii

    IDEAL (Initiate, Diagnose, Establish, Act, and Learn) model, 7, 19
        at Systematic Software Engineering, 66
    Identification-based trust, 222, 228
    Improvement actors, risk resolution actions
        associated with, 324
    Improvement areas
        risk items associated with, 319
        risk resolution actions associated with, 320
    Improvement diffusion, at Systematic Software Engineering, 76–79
    Improvement ideas
        risk items associated with, 320–321
        risk resolution actions associated with, 321
    Improvement knowledge, in metrics implementation, 294
    Improvement process
        risk items associated with, 322
        risk resolution actions associated with, 322–323
    Incentives, 31
        in metrics implementation, 296–297
    Independent model-based self-assessments, 190–191
    Independent project-based metrics program, 189
        at Brüel & Kjær A/S, 189
    Independent project-based self-assessments, 191
    Individualist perspective, 202, 205, 209
    Information, gathering, in analysis phase, 308–309
    Information technology projects, factors in success of, 11
    Initial value check, 311
    Insider solutions at Systematic Software Engineering, 65, 66, 73–74, 75–76, 78–79, 81
    Institutionalized process, resistance to change, 33
    Interactive-process perspective, 202–203, 206–207, 210–211
    Internalization in knowledge creation, 10
    Intervention assessment versus day-to-day management, 187, 194
    Interviews in problem diagnosis method, 156–158

    Kaizen strategy, 193
    Kierkegaard, Søren, 115
    Knowledge
        diffusion of, about techniques, 314
        explicit, 237
        in metrics implementation, 294–295
        tacit, 237
    Knowledge-based trust, 222, 228
    Knowledge creation
        combination in, 10
        externalization in, 10
        internalization in, 10
        socialization in, 10
        in software process improvement, 7–11
        systematic assessments in, 9, 10
    Knowledge transfer, implementation and, 212

    LDRA Testbed tool, 142
    Leadership
        integrating, 14–16
        organizational, 16
    Leadership principles, at Systematic Software Engineering, 68
    Learning
        by exploitation, 250–251
        by exploration, 249–250

    Management of software process improvement, 25–29
        feedback in, 28–29
        organization in, 25–27
        planning in, 27–28
    Mapping SPI ideas and practices, 23–43
    Market share, 31
    Maturity assessments, 154
        methods in, 168
        role of, in improvement efforts, 167
    Maturity models
        as basis for project assessments, 179
        customer-focused, 224–227
    Methodology for Preventing Requirements Issues from Becoming Defects (PRIDE), at Brüel & Kjær A/S, 134, 140, 143, 146, 147
    Metrics implementation, 287–304
        advanced guidelines in, 290
        communication in, 299–301
        at Danske Data, 291–292, 294, 295, 296–297, 298, 299, 300, 301, 302
        data collection in, 300
        data usage in, 301–302
        debate facilitation in, 301
        establishing incentive structures in, 296–297
        establishing project in, 296
        goal determination in, 298
        improvement knowledge in, 294
        knowledge in, 294–295
        organizational knowledge in, 295
        program design in, 297–299
        program organization in, 295–297
        publication of objectives in, 300
        starting simple in, 299
    Metrics program, independent project-based, 189
    Mission, establishing clear, for project assessment, 178
    Model-based assessments
        assisted, 187–188
        versus problem-based assessments, 186, 192–194
        versus problem diagnosis, 162–163
    Motorola’s Progress Assessment, 168
    Myths in undermining knowledge creation, 9–10

    NASA, metrics implementation at, 287
    Network Products, SPI in, 204–208
    Normative models, 42, 133, 153
        in knowledge creation, 10
    Norm-based approach for software process improvement, 30–31
    Norms, use of accepted, as basis for project assessment, 179
    Not-invented-here syndrome, in undermining knowledge creation, 9–10

    Objectives, publication of, in metrics implementation, 300
    Organizational approach to implementing software process improvement, 257–269
        diffusion in, 258–259
        implementation workshop in, 261–267
        lessons learned in, 267–269
        project context in, 260
    Organizational implementation at Danske Data, 85–86
    Organizational inertia, 31
    Organizational knowledge in metrics implementation, 295
    Organizational leadership, 16
    Organizational learning, 122–128
        at Brüel & Kjær A/S, 123–124
        at Danske Data, 125–126
        strategies for, 235–252
        at Systematic Software Engineering, 124–125
        theory of, 237–239
    Organizational maturity, 31
    Organization in software process improvement management, 25–27
    Orthogonality check, 311

    Participant commitment, 31–32
    Participatory improvement, in software process improvement, 11–14
    PATs. See Process action teams (PATs)
    Peer-based structure at Systematic Software Engineering, 67–68
    People Capability Maturity Model, 30–31
    Perception, 223
    Performance-based trust, 222, 228
    Performance specifications, 311
    Planning in software process improvement management, 27–28
    Prevention of Errors through Experience-Driven Test Efforts (PET), at Brüel & Kjær A/S, 134, 137, 142, 146–147
    Prioritizing techniques in analysis phase, 309
    Problem-based assessment versus model-based assessment, 186, 192–194
    Problem diagnosis
        analyzing immediate results in, 158
        assisted, 190
        at Brüel & Kjær A/S, 101–102, 108–109, 153–154, 161–162, 190
        comparison to model-based assessments, 162–163
        conducting interviews in, 158
        defining scope, 155–156
        offering feedback and obtaining validation in, 159
        organizational resources needed for, 164
        organizing interviews in, 157–158
        personnel for performing, 163–164
        preparing for interviews in, 156–157
        prioritizing findings in, 164
        synthesizing problems in, 159, 160
        timing in performing, 163
        using, 160–161
    Problem reports in product improvement, 133–151. See also Defect analysis
    Problems, synthesizing, in problem diagnosis method, 159, 160
    Problem solving
    at Brüel & Kjær A/S, 5–6
        identifying techniques, in analysis phase, 309
        interaction mode in, 19
        interventionist mode in, 19
        in SPI, 4–7
    Process action teams (PATs)
        at Ericsson Denmark, 243, 244
        risk management in, 273–285
        at Systematic Software Engineering, 74, 75, 239, 240
    Process assessments, 31
    Process improvement activities, 31
    Process improvement infrastructure, 31
    Process measurement at Danske Data, 85
    Product cycle time, 31
    Product development, using problem reports in, 133–151
    Product expert consultation, 311
    Productivity, Compass development program for measuring, 188
    Product profile at Brüel & Kjær A/S, 100
    Profitability, 31
    Program design in metrics implementation, 297–299
    Program organization in metrics implementation, 295–297
    Project, establishment of, in metrics implementation, 296
    Project assessments, 167–184
        at Danske Data, 168, 173–177
        ensuring support in, 179
        at Ericsson Denmark, 168–173
        establishing clear vision or mission for, 178
        establishing separate process for, 178–179
        implementing, in projects’ daily work, 179
        improving approach in, 179
        opportunities in, 180–182
        risks in, 181, 183
        supporting dedicated project-specific improvements, 180–181
        supporting organizational improvements, 181–183
        supporting software process improvement with, 177–183
        use of accepted norm, standard, or maturity model as basis for, 179
    Project conclusion, 160
    Project Diagnosis Workshops at Systematic Software Engineering, 231
    Project Establishment Workshops at Systematic Software Engineering, 231
    Project evaluation workshop, 231–232
    Project management
        at Danske Data, 86–90
        at Systematic Software Engineering, 72
    Project managers, focus on, at Brüel & Kjær A/S, 101
    Project teams at Systematic Software Engineering, 67–68
    Project tracking and control, 160
        at Brüel & Kjær A/S, 106–107
        experimenting with, 160
    Prototype
        usability test of functional, 311

    Quality, Compass development program for measuring, 188
    Quality assurance
        at Danske Data, 85
        at Systematic Software Engineering, 68–70
    Quality management system, at Systematic Software Engineering, 9
    Questionnaire based assessment, 69–70, 188–189, 192

    Relevance-based assessment versus rigor-based assessment, 186, 191–192
    Requirements engineering techniques, 311
    Requirements-related analysis, 140–141
    Requirements specification initiative at Brüel & Kjær A/S, 105–106
    Reuse initiative at Brüel & Kjær A/S, 106
    Rigor-based assessment versus relevance-based assessment, 186, 191–192
    Risk management, 160
        at Danske Data A/S, 273, 283
        in the literature, 284
        in process action teams, 273–285
    Risk resolution actions
        improvement actors and, 324
        improvement areas and, 320
        improvement ideas and, 321
        improvement process and, 322–323
    Risks
        improvement actors and, 323
        improvement areas and, 319
        improvement ideas and, 320–321
        improvement process and, 322
        in project assessment, 183
    Rollout Workshop, at Systematic Software Engineering, 231

    Scenarios, 311
    Scope, defining, for problem diagnosis method, 155–156
    Self-assessments
        independent model-based, 190–191
        independent project-based, 191
    Silver bullet syndrome, 6
    Socialization in knowledge creation, 10
    Soft Systems Methodology, 7, 19
        interaction mode, 19
        interventionist mode, 19
    Software Acquisition Capability Maturity Model, 31, 224–225, 226
    Software capability evaluation, 190
    Software engineering
        holistic view of, 33–34
        as target of software process improvement, 32
    Software Engineering Institute, xvii
    Software organizations, Balkan syndrome at, 16
    Software process assessment, 190
    Software process engineering groups at Systematic Software Engineering, 238–239, 240
    Software processes, 32–34
        competence in, 34–35
        context in, 35–36
        defined, 32
        persistence in, 32–34
    Software process improvement (SPI), xviii, xxi
        alternatives to measuring effectiveness, 28–29
        assessment of current processes in, 3
        association with maturity assessments, 154
        benefits in collecting appropriate data, 29
        at Brüel & Kjær, 241–243, 247
        commitment in, 31–32
        continuous improvement planning in, 17–19
        at Danske Data, 208–212, 245–246, 247
        development of focused strategy for, 3
        difference from traditional approches, 4
        encouraging participation in, 11–14
        at Ericsson Denmark, 243–245, 247
        as evolutionary process, 29–30
        goal of, 3–4
        implementation plan for, 27–28
        initiatives in, xxi
        integration of leadership in, 14–16
        knowing and implementing, 201–213
        framework for, 202–204
        knowledge creation in, 7–11
        management of feedback, 28–29
        organization, 25–27
        planning, 27–28
        mapping ideas and practices in, 23–43
        in Network Products, 204–208
        norm-based approach for, 30–31
        organizational approach to implementing, 31–32, 257–269
        diffusion in, 258–259
        implementation workshop in, 261–267
        lessons learned in, 267–269
        project context in, 260
        perspectives of, 32–36
        problem diagnosis in, 153–165
        problem solving in, 4–7
        rhetoric in, 3
        software engineering practice as target of, 32
        steps in improving, 3
        strategic factors in, 31
        strategies for organizational learning in, 235–252
        supporting, with project assessments, 177–183
        at Systematic Software Engineering, 239–241, 247
        tactical factors in, 31
    Software process improvement (SPI) group, 238–239
        establishment of, 26
        resources for, 26
        risks to institutionalizing, 26
    Software requirements specification and requirements management, 160
    Software reuse, 160
    SPI. See Software process improvement (SPI)
    SPICE (Software Process Improvement And Capability Determination), 18, 30, 185, 187, 290, 294
    SPIRE, 225, 226–227
    Stability, 31
    Stakeholder analysis, 268
    State-of-the-art knowledge in facilitating a knowledge creation approach to SPI, 9
    Static analysis, 143
    Strategic focus, 31
    Structuralist perspective, 202, 205, 209–210
    Structural control, 222
        balancing trust with, 222
    Support teams at Brüel & Kjær A/S, 102–103, 108–109
    Synquest, 168
    Systematic assessments in knowledge creation, 9, 10
    Systematic Software Engineering, xv–xvi, xxi, 65–82
        Aalborg University’s Questionnaire Based Assessment at, 69, 70
        ambitions at, 65, 73, 76, 80
        ambitious effort at, 39–40
        Bootstrap assessment at, 117, 118, 240
        configuration management at, 72
        core competence at, 67
        culture at, 67–68
        customer relations at, 217–233
        Establishing Requirements Workshop at, 231
        establishment of quality management system at, 9
        expectations for SPI at, 66
        goals at, 65, 71
        IDEAL model at, 66
        improvement diffusion at, 76–79
        insider solutions at, 65, 66, 73–74, 75–76, 78–79, 81
        leadership integration at, 15
        leadership principles at, 68
        organizational learning at, 124–125
        organization at, 67–68
        peer-based structure at, 67–68
        process action teams at, 74, 75, 239, 240
        Project Diagnosis Workshop at, 231
        Project Establishment Workshop at, 231
        Project Evaluation Workshop at, 231
        project management at, 72
        project teams at, 67–68
        quality assurance system at, 68–70
        role of the organization at, 80–81
        Rollout Workshop at, 231
        software process engineering groups at, 238–239, 240
        SPI at, 239–241, 247
        stalemates at, 65, 66, 72–73, 74–75, 76, 77–78, 81
        Test Planning Workshop at, 231
        work groups at, 71, 72
    Systems Engineering Capability Maturity Model, 31

    Tacit knowledge, 237
    Task pool, 89
    Taylor, Frederick, 92
    Teams. See also Process action teams (PATs)
        in customer-supplier relationship, 220
        in software development, 311, 312–313
        support, 102–103, 108–109
    Technical University, Denmark, xxi
    Technology transfer, xvii–xviii
        failures in, xvii
        as supply-driven, 202
    Test Planning Workshop at Systematic Software Engineering, 231
    Time-boxes, 104–105
    Total Quality Management (TQM), 25, 193
    Trust, 221–222
        balancing with structural control, 222
        calculus-based, 222, 228
        identification-based, 222, 228
        knowledge-based, 222, 228
        performance-based, 222, 228

    Understanding, 223
    Unit testing, 134
    Usability test of functional prototype, 311

    Validation, offering, in problem diagnosis method, 159
    Vision, 31
        establishing clear, for project assessment, 178

    Waterfall model, 134
    Weber, Max, 92
    Work groups at Systematic Software Engineering, 71, 72

