Practical Software Measurement: Objective Information for Decision Makers


Book

  • Sorry, this book is no longer in print.

Description

  • Copyright 2002
  • Edition: 1st
  • Book
  • ISBN-10: 0-201-71516-3
  • ISBN-13: 978-0-201-71516-3

PSM provides you with a way to realize the significant benefits of a software measurement program, while understanding and avoiding the risks involved with a “blind jump.” You’ll find this book a worthwhile starting point for your future software measurement initiatives, as well as a source of continuing guidance as you chart your way through the sea of complex opportunities ahead.
—Barry Boehm, from the Foreword

Objective, meaningful, and quantifiable measurement is critical to the successful development of today’s complex software systems. Supported by the U.S. Department of Defense and a rapidly increasing number of commercial practitioners, Practical Software Measurement (PSM) is a process for designing and implementing a project-based software measurement program. PSM provides essential information on scheduling, resource allocation, and technological performance. It enables software managers and developers to make decisions that will affect the project’s outcome positively.

This book is the official, definitive guide to PSM, written by the leaders of the PSM development initiative. It describes the principles and practices for developing, operating, and continuously improving your organization’s measurement program. It uses real-world examples to illustrate practical solutions and specific measurement techniques. This book examines the foundations of a software measurement program in depth: defining and prioritizing information needs, developing a project-specific information model, tailoring a process model to integrate measurement activities, and analyzing and understanding the results.

Specific topics include:

  • The relationship between project- and organizational-level measurement
  • Defining an information-driven, project-specific measurement plan
  • Performing measurement activities and collecting data
  • The basics of data analysis, including estimation, feasibility analysis, and performance analysis
  • Evaluating the effectiveness of the measurement processes and activities
  • Sustaining organizational commitment to a measurement program
  • Key measurement success factors and best practices

In addition, this book includes numerous detailed examples of measurement constructs typically applied to software projects, as well as two comprehensive case studies that illustrate the implementation of a measurement program in different types of projects. With this book you will have the understanding and information you need to realize the significant benefits of PSM, as well as a guide for a long-term, organization-wide measurement program.

    PSM is founded on the contributions and collaboration of key practitioners in the software measurement field. The initiative was established in 1994 by John McGarry and is currently managed by Cheryl Jones. Both are civilians employed by the U.S. Army. David Card is an internationally known software measurement expert, and is with the Software Productivity Consortium. Beth Layman, Elizabeth Clark, Joseph Dean, and Fred Hall have been primary contributors to PSM since its inception.




    Sample Content

    Online Sample Chapter

    Software Measurement: Key Concepts and Practices

    Downloadable Sample Chapter

    Click below for the sample chapter related to this title:
    mcgarrych01.pdf

    Table of Contents



    Foreword.


    Preface.


    Acknowledgements.


    1. Measurement: Key Concepts and Practices.

    Motivation for Measurement.

    Measurement as an Organizational Discriminator.

    The Foundation—Project Measurement.

    What Makes Measurement Work.

    Measurement Information Model.

    Measurement Process Model.



    2. Measurement Information Model.

    Information Needs.

    Measurement Construct.

    Measurement Construct Examples.



    3. Plan Measurement.

    Identify and Prioritize Information Needs.

    Select and Specify Measures.

    Integrate the Measurement Approach into Project Processes.



    4. Perform Measurement.

    Collect and Process Data.

    Analyze Data.

    Make Recommendations.



    5. Analysis Techniques.

    Estimation.

    Feasibility Analysis.

    Performance Analysis.



    6. Evaluate Measurement.

    Evaluate the Measures.

    Evaluate the Measurement Process.

    Update the Experience Base.

    Identify and Implement Improvements.



    7. Establish and Sustain Commitment.

    Obtain Organizational Commitment.

    Define Measurement Responsibilities.

    Provide Resources.

    Review the Measurement Program.

    Lessons Learned.



    8. Measure For Success.

    Appendix A: Measurement Construct Examples.

    Milestone Completion.

    Work Unit Progress: Software Design Progress.

    Incremental Capability.

    Personnel Effort.

    Financial Performance: Earned Value.



    Appendix B. Information System Case Study.

    Project Overview.

    Getting the Project Under Control.

    Evaluating Readiness for Test.

    Installation and Software Support.



    Appendix C. Xerox Case Study.

    Product and Project Overview.

    Estimation and Feasibility Analysis.

    Performance Analysis.

    Redesign and Replanning.



    Glossary.


    Bibliography.


    Index.

    Preface

    Management by fact has become an increasingly popular concept in the software engineering and information technology communities. Organizations are focusing attention on measurement and the use of objective information to make decisions. Quantitative performance information is essential to fact-based management. Practical Software Measurement: Objective Information for Decision Makers describes an approach to management by fact for software project managers based on integrating the concepts of a Measurement Information Model and a Measurement Process Model. While these concepts apply to non-software activities as well, the examples and terminology presented in this book focus primarily on software.

    The information needs of the decision maker drive the selection of software measures and associated analysis techniques. This is the premise behind the widely accepted approaches to software measurement, including goal/question/metric (Basili and Weiss, 1984) and issue/category/measure (McGarry et al., 1997). Information needs result from the efforts of managers to influence the outcomes of projects, processes, and initiatives toward defined objectives. Information needs are usually derived from two sources: (1) goals that the manager seeks to achieve and (2) obstacles that hinder the achievement of these goals. Obstacles, also referred to as issues, include risks, problems, and a lack of information in a goal-related area. Unless there is a manager or other decision maker with an information need, measurement serves no purpose. The issues faced by a software project manager are numerous. Typically they include estimating and allocating project resources, tracking progress, and delivering products that meet customer specifications and expectations.

    A Measurement Information Model defines the relationship between the information needs of the manager and the objective data to be collected, commonly called measures. It also establishes a consistent terminology for basic measurement ideas and concepts, which is critical to communicating the measurement information to decision makers. The information model in Practical Software Measurement (PSM) defines three levels of measures, or quantities: (1) base measures, (2) derived measures, and (3) indicators. It is interesting to note that the three levels of measurement defined in the PSM information model roughly correspond to the three-level structures advocated by many existing measurement approaches. Examples include the goal/question/metric (Basili and Weiss, 1984), factor/criteria/metric (Walters and McCall, 1977), and issue/category/measure (McGarry et al., 1997) approaches already in use within the software community. A similar approach for defining a generic data structure for measurement was developed by Kitchenham et al., who defined their structure as an Entity Relationship Diagram (1995).
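
    To make the three levels concrete, here is a minimal sketch, in Python, of one way a measurement construct might be represented; the class names, fields, and the defect-density example are illustrative assumptions, not structures defined by PSM or ISO/IEC 15939.

        from dataclasses import dataclass
        from typing import Callable, List

        @dataclass
        class BaseMeasure:
            """A single attribute quantified by one measurement method (e.g., defects found)."""
            name: str
            value: float

        @dataclass
        class DerivedMeasure:
            """A quantity computed from two or more base measures via a function."""
            name: str
            function: Callable[..., float]   # combines the input base measures
            inputs: List[BaseMeasure]

            def compute(self) -> float:
                return self.function(*(m.value for m in self.inputs))

        @dataclass
        class Indicator:
            """A measure interpreted against a decision criterion to address an information need."""
            name: str
            measure: DerivedMeasure
            threshold: float                 # hypothetical decision criterion

            def evaluate(self) -> str:
                return "investigate" if self.measure.compute() > self.threshold else "on track"

        # Hypothetical example: defect density derived from two base measures.
        defects = BaseMeasure("defects found", 42)
        size = BaseMeasure("size (KSLOC)", 10)
        density = DerivedMeasure("defect density", lambda d, s: d / s, [defects, size])
        print(Indicator("quality indicator", density, threshold=5.0).evaluate())  # "on track"

    The point of the sketch is the layering: measured attributes become base measures, a function combines them into a derived measure, and a decision criterion turns that into an indicator a decision maker can act on.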

    An effective measurement process must address the selection of appropriate measures as well as provide for effective analysis of the data collected. The Measurement Process Model describes a set of related measurement activities that are generally applicable in all circumstances, regardless of the specific information needs of any particular situation. The process consists of four iterative measurement activities: establish, plan, perform, and evaluate. This process is similar to the commonly seen Plan-Do-Check-Act cycle (Deming, 1986).

    Recognition of a need for fact-based, objective information leads to the establishment of a measurement process for a project or an organization. The specific information needs of the decision makers and measurement users drive the selection and definition of appropriate measures during measurement planning. The resulting measurement approach instantiates a project-specific information model that identifies the base measures, derived measures, and indicators to be employed, as well as the analysis techniques to be applied, in order to address the project’s prioritized information needs.

    As the measurement plan is implemented, or performed, the required measurement data is collected and analyzed. The information product that results from the perform measurement activity is provided to the decision makers. Feedback from these measurement users helps in the evaluation of the effectiveness of the measures and measurement process so that they can be improved on a continuing basis.
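
    As a rough illustration of that iterative flow (the function names and data shapes below are assumptions made for this sketch, not part of PSM), the plan, perform, and evaluate activities can be pictured as a feedback loop in which each iteration's information products and user feedback shape the next.

        from typing import Dict, List

        def plan_measurement(information_needs: List[str]) -> Dict[str, str]:
            """Select and specify measures that address the prioritized information needs."""
            return {need: f"indicator for {need}" for need in information_needs}

        def perform_measurement(plan: Dict[str, str]) -> Dict[str, str]:
            """Collect and analyze data; produce information products for decision makers."""
            return {indicator: "analysis result" for indicator in plan.values()}

        def evaluate_measurement(products: Dict[str, str]) -> List[str]:
            """Use feedback from measurement users to identify improvements for the next cycle."""
            return [f"refine {name}" for name in products]

        # One pass through the loop; in practice the cycle repeats as needs and feedback evolve.
        needs = ["schedule progress", "product quality"]
        plan = plan_measurement(needs)
        products = perform_measurement(plan)
        print(evaluate_measurement(products))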

    The basic concepts presented in this book evolved from extensive measurement experience and prior research. They were previously introduced in sequentially released versions of Practical Software Measurement (McGarry et al., 1997) and were formalized in ISO/IEC Standard 15939—Software Measurement Process (2001). The measurement process model and measurement terminology from ISO/IEC 15939 have also been adopted as the basis of a new Measurement and Analysis Process Area in the Software Engineering Institute’s Capability Maturity Model Integration (CMMI) project (CMMI Development Team, 2000). This book explains how software development and maintenance organizations can implement a viable measurement process based on the proven measurement concepts of ISO/IEC 15939 and the CMMI in a practical and understandable manner.

    In simple terms, implementing an objective, fact-based measurement process for a software-intensive project means defining and prioritizing the information needs of the project decision makers through the development of a project-specific information model, and then tailoring and executing a project-specific set of measurement process activities. The PSM approach to accomplishing this integrates prior experience and research from many sources across many application domains.

    Practical Software Measurement is structured to provide progressive guidance for implementing a software measurement process. It provides comprehensive descriptions of the Measurement Information Model and the Measurement Process Model, as well as experience-based guidance for applying these models in an actual project environment. No book could ever capture all of the pertinent information and practical examples related to software measurement. As such, the Practical Software Measurement Web site at www.psmsc.com contains additional information, guidance, examples, and tools to augment Practical Software Measurement.

    To enhance readability, the authors have limited most of the in-text references to suggestions for further reading on specific topics. Additional references are provided in the bibliography.

    The following topics are addressed in this book:

    Chapter 1: Measurement: Key Concepts and Practices. Chapter 1 provides an overview of software measurement, explaining how measurement supports today’s information-oriented business models and how measurement can become a corporate resource. It describes the relationships between project- and organizational-level measurement, and introduces the two primary concepts of PSM: the Measurement Information Model and the Measurement Process Model.

    Chapter 2: Measurement Information Model. Chapter 2 presents an in-depth discussion of the Measurement Information Model and its measurement components. It relates the Measurement Information Model to measurement planning and implementation activities.

    Chapter 3: Plan Measurement. Chapter 3 is the first of five chapters that look at the individual measurement process activities in detail. Chapter 3 focuses on the Plan Measurement activity and describes what is required to define an information-driven, project-specific measurement plan.

    Chapter 4: Perform Measurement. Chapter 4 addresses the Perform Measurement activity and discusses how to collect and analyze measurement data. It introduces several concepts related to measurement analysis, including the types of analysis and how to relate information needs and associated issues in terms of cause and effect.

    Chapter 5: Analysis Techniques. Chapter 5 provides an in-depth treatment of the three primary types of measurement analysis: estimation, feasibility analysis, and performance analysis.

    Chapter 6: Evaluate Measurement. Chapter 6 describes the Evaluate Measurement activity. It focuses on the assessment, evaluation, and improvement of applied project measures and the implemented project measurement processes.

    Chapter 7: Establish and Sustain Commitment. Chapter 7 explains the final measurement activity, Establish and Sustain Commitment, which addresses the organizational requirements related to implementing a viable project measurement process. Chapter 7 also addresses measurement “lessons learned.”

    Chapter 8: Measure for Success. Chapter 8 reviews some of the major concepts presented in this book and identifies key measurement success factors.

    Appendix A: Measurement Construct Examples. Appendix A provides detailed examples of measurement constructs typically applied to software-intensive projects.

    Appendix B: Information System Case Study. Appendix B provides a comprehensive case study that addresses the implementation of a measurement process for a typical information system.

    Appendix C: Synergy Integrated Copier Case Study. Appendix C is a second case study that describes how measurement can be applied to a major software-intensive upgrade project.




    Index

    Activity aggregation structures, 52–54
    Activity-based model estimation approach, 92
        varying approaches during projects, 96–97
        versus other approaches, 94–96
    Aggregation structures, 52–54
        data verification, 66
    Analogy model estimation approach, 92–93
        varying approaches during projects, 96–97
        versus other approaches, 94–96
    Analysis models, measurement constructs, 23–24
        examples, 26–29
    Analysis techniques
        estimation, approaches, 90–97
        estimation, basics, 86
        estimation, calibrating and mapping data, 97–98
        estimation, computing, 98–99
        estimation, effort, 100–101
        estimation, estimators, 87–89
        estimation, evaluating estimates, 103–104
        estimation, Integrated Analysis Model, 87
        estimation, poor factors, 86–87
        estimation, process steps, 89–90
        estimation, quality, 102–103
        estimation, schedules, 101–102
        estimation, size, 99–100
        evaluating measures, 127–131
        feasibility, basics, 104–106
        feasibility, indicators, 106–108
        feasibility, process, 108–112
        performance, basics, 112–113
        performance, indicators, 114–117
        performance, Integrated Analysis Model, 113–114, 119
        performance, plans, comparing to performance, 118–121
        performance, plans, evaluating alternatives, 123–124
        performance, problems, assessing impact, 121–122
        performance, problems, predicting impact, 122–123
        performance, process, 117–118
    Analyze Data task
        basics, 65–68
        indicators, generating, 68–71
        indicators, representing graphically, 71–75
        Integrated Analysis Model, 75–81
    Attributes, measurement constructs, 18–20
        examples, 26–29
        table of Measurement Information Model, 160

    Baldrige, Malcolm, Award, 155
    Bar charts, indicators for data analysis, 73
    Base measures, measurement constructs, 18–20, 24–25
        evaluating measures, 127–131
        examples, 26–29
        planned and actual completion, 62–63
        relation to indicators, 68
        selecting/specifying, 39–48
        table of Measurement Information Model, 160

    Capability Maturity Model Integration (CMMI), Software Engineering Institute, 1
        evaluating measurement process maturity, 139
        example, evaluating software, 186–188
        Measurement and Analysis Process Area, 140
        recognition of measurement’s importance, 155
    Charts. See graphs for indicators
    Coding Progress indicators, 47–48
    Component aggregation structures, 52–54

    Data, measurement constructs, 24
        analyzing, indicator generation, 68–71
        analyzing, indicator graphical representation, 71–75
        analyzing, types of analysis, 65, 67–68
        collecting and processing, 61–64
        Integrated Analysis Model, 75–81
        making recommendations after analysis, 81–83
        types of data, 49–50
        verifying, 65–66
    Decision criteria, measurement constructs, 24
        table of Measurement Information Model, 160
    Derived measures, measurement constructs, 22–25
        evaluating measures, 127–131
        examples, 26–29
        relation to indicators, 68
        selecting/specifying, 39–48
        table of Measurement Information Model, 160

    Establish and Sustain Commitment activity (MPM), 10–12
        defining measurement responsibilities, 146–147
        evaluating measurement process, 137–138
        obtaining organizational commitment, 144–145
        providing measurement resources, 147–150
        reviewing measurement program, 150–151
    Estimation, 65, 67
        approaches, activity-based models, 92
        approaches, analogy, 92–93
        approaches, comparing models, 94–96
        approaches, parametric models, 90–92
        approaches, selecting models, 94
        approaches, simple estimating relationships, 93
        approaches, varying during projects, 96–97
        basics, 86
        calibrating models with local historical data, 97–98
        computing, 98–99
        effort, 100–101
        estimators, 87–89
        evaluating estimates, 103–104
        Integrated Analysis Model, 87
        mapping model’s assumptions to project’s characteristics, 97–98
        poor estimation factors, 86–87
        process steps, 89–90
        quality, 102–103
        role in data analysis, 79–81
        schedules, 101–102
        size, 99–100
        Synergy Integrated Copier Case Study, 246–250
    Estimators, type of indicators, 87–89
    Evaluate Measurement activity (MPM), 10–12
        assessing measurement process, conformance, 131, 134–138
        assessing measurement process, maturity, 131, 138–140
        assessing measurement process, performance, 131–134
        assessing products of measurement process, 127–131
        basic tasks, 125–127
        identifying/implementing process improvements, 141–142
        updating experience base, 140–141
    Examples, measurement constructs
        assessing adequacy of personnel resources, 169–171
        assessing earned value performance information, 169–171
        assessing functional correctness, defect density, 182–184
        assessing functional correctness, defects, 179–181
        assessing functional size and stability of requirements, 176–179
        comparing achieved productivity to bid rates, 188–191
        comparing plan to actual code production, 173–176
        comparing plan to actual software design completion rate, 163–165
        comparing plan to completion of incremental builds, 166–169
        evaluating completed milestones, 161–163
        evaluating customer satisfaction from test cases, 191–193
        evaluating customer satisfaction of incremental releases, 193–195
        evaluating efficiency of response time requirements, 184–186
        evaluating software development to CMM, 186–188
        Measurement Information Model, 26–29
        Measurement Information Model table, 160
    Experience base, updating during Evaluate Measurement activity (MPM), 140–141

    Feasibility analysis, 67
        basics, 104–106
        indicators, 106–108
        process, 108–112
        Synergy Integrated Copier Case Study, 246–250
    Functional aggregation structures, 52–54
    Functions, measurement constructs (MIM), 23
        examples, 26–29
        table of Measurement Information Model, 160

    Graphs for indicators, data analysis
        guidelines for effectiveness, 74–75
        types, 71–74

    Historical measurement data, 49–50
        calibrating model’s estimations, 97–98
        computing estimations, 99
        data analysis, 67

    Indicators, measurement constructs (MIM), 23–25
        Coding Progress indicators, 47–48
        data analysis, generating new indicators, 68–71
        data analysis, graphical representation, 71–75
        estimators, 87–89
        evaluating measures, 127–131
        examples, 26–29
        feasibility analysis, 106–108
        performance analysis, 114–117
        selecting/specifying, 39–47
        table of Measurement Information Model, 160
    Information needs, 14–15
        categories, 16–17, 35–36, 160
        identifying, 33–35
        issues, 16
        prioritizing, 36–38
        selecting/specifying measures, 39–48
        table of Measurement Information Model, 160
    Information products, 9, 14–15
    Information-driven measurement, 7–8
    Integrated Analysis Model
        basics, 75–77
        estimation, 87
        example, 79–81
        performance analysis, 113–114, 119
        relationships between measurable concepts, 77–79
    Interval scales, 22
        table of Measurement Information Model, 160
    ISO/IEC 15939 standard, 15
        evaluating measurement process compliance, 134, 138
        evaluating measurement process maturity, 139
        evaluating products of measurement process, 127
        Measurement Process definition, 140
    Issues, information needs, 16

    Line charts, indicators for data analysis, 72–73

    Major Automated Information Systems Review Council (MAISRC), 200–201
    Make Recommendations task, 81–83
    Malcolm Baldrige Award, 155
    MAPS. See Military Automated Personnel System
    Measurable concepts, 14–15
        defining, 41–42
        table of Measurement Information Model, 160
    Measurement constructs, 14–15, 159. See also examples, measurement constructs
        analysis models, 23–24
        attributes (measurable), 18–20
        base measures, 18–21, 24–25
        benefits, 20
        data, 24
        decision criteria, 24
        derived measures, 22–25
        functions, 23
        indicators, 23–25
        levels, 17–18
        measurable concepts, defining, 41–42
        measurement methods, 21
        measures, 24
        scales, 21–22, 160
        specifying, 45–48
        standards, 24–25
        structure, 18–19
        table of Measurement Information Model, 160
        units of measurements, 22
    Measurement Information Model, 159
        basics, 8–9
        information needs, 14–17
        information products, 14–15
        ISO/IEC 15939 standard, 15, 127, 134, 138–140
        measurable concepts, 14–15
        measurement constructs, 14–15, 17–25 (See also examples, measurement constructs)
        measurement plans, 14–15
        measurement procedures, 14–15
        software entities, 14–15
        table of model, 160
        terminology, lack of agreed-on definitions, 13
    Measurement methods, measurement constructs, 21
        examples, 26–29
        table of Measurement Information Model, 160
    Measurement of software projects, 5–6
        benefits for managers, 3–4
        CMMI (Software Engineering Institute), 1
        criteria for effectiveness, 6–8
        Measurement Information Model, 8–9
        Measurement Process Model, 10–12
        organizational necessity, 5
        reasons to measure, 2
    Measurement of software projects, examples, constructs
        assessing adequacy of personnel resources, 169–171
        assessing earned value performance information, 169–171
        assessing functional correctness, defect density, 182–184
        assessing functional correctness, defects, 179–181
        assessing functional size and stability of requirements, 176–179
        comparing achieved productivity to bid rates, 188–191
        comparing plan to actual code production, 173–176
        comparing plan to completion of incremental builds, 166–169
        comparing plan to software design completion rate, 163–165
        evaluating completed milestones, 161–163
        evaluating customer satisfaction from test cases, 191–193
        evaluating customer satisfaction of incremental releases, 193–195
        evaluating efficiency of response time requirements, 184–186
        evaluating software development to CMM, 186–188
        Measurement Information Model, 26–29
        Measurement Information Model table, 160
    Measurement of software projects, examples, MAPS
        Air Force Business Process Modernization Initiative, 201–203
        background information, 199–201
        management plan, comparing performance to revised plan, 217–222
        management plan, evaluating, 210–212
        management plan, revising, 212–216
        project description, 203–204
        software, evaluating readiness for testing and evaluation, 223–231
        software, installing, 233–234
        software, supporting, 234–237
        system architecture and functionality, 204–207
    Measurement of software projects, examples, Synergy Integrated Copier Case Study
        product/project basics, 239–243
        software development, approaches, 243–246
        software development, estimation and feasibility analysis, 246–250
        software development, performance analysis, 251–253
        software development, redesigning/replanning, 253–257
    Measurement plans. See Plan Measurement activity (MPM)
    Measurement procedures, 14–15
        developing, 50–52
    Measurement Process Model (MPM), 10–12
    Measurement tools, 149–150
    Measurement training, 148–149, 152
    Measurements, evaluating. See Evaluate Measurement activity (MPM)
    Measures, measurement constructs, 24
    Metric measurements, 13
    Military Automated Personnel System (MAPS), case study
        Air Force Business Process Modernization Initiative, features, 201–203
        background information, 199–201
        management plan, comparing performance to revised plan, 217–222
        management plan, evaluating, 210–212
        management plan, revising, 212–216
        project description, 203–204
        software, evaluating readiness for testing and evaluation, 223–231
        software, installing, 233–234
        software, supporting, 234–237
        system architecture and functionality, 204–207
    MIM. See Measurement Information Model (MIM)
    MPM. See Measurement Process Model (MPM)

    Nominal scales, 22

    Objective measurement, 21, 160
    Objectives, software projects
        information needs, 33
    Ordinal scales, 22
        table of Measurement Information Model, 160

    Parametric model estimation approach, 90–92
        varying approaches during projects, 96–97
        versus other approaches, 94–96
    Perform Measurement activity (MPM), 10–11
        data analysis, indicator generation, 68–71
        data analysis, indicator graphical representation, 71–75
        data analysis, types, 65, 67–68
        data, collecting and processing, 61–64
        data, verifying, 65–66
        evaluating measurement process, 135–137
        Integrated Analysis Model, 75–81
        making recommendations, 81–83
        measurement constructs, 25
    Performance analysis, 67–68
        basics, 112–113
        comparing plans to performance, 118–121
        comparing plans, examples, design activities, 163–165
        comparing plans, examples, development activities and events, 161–163
        evaluating alternative plans, 123–124
        indicators, 114–117
        Integrated Analysis Model, 113–114, 119
        measurement data, 49–50
        problems, assessing impact, 121–122
        problems, predicting impact, 122–123
        process, 117–118
        Synergy Integrated Copier Case Study, 251–253
    Plan Measurement activity (MPM), 10–11
        basics, 31–33
        documenting plans, 55–57
        evaluating measurement process, 135–137
        information needs, categories, 35–36
        information needs, identifying, 33–35
        information needs, prioritizing, 36–38
        initiating after review of measurement process, 151
        measurement constructs, 25
        measurement plans, 14–15
        measures, integrating approaches into projects, 48–54
        measures, selecting and specifying, 39–48
        reporting plan progress, 54–55
    Planning measurement data, 49–50
        data analysis, 67
    Projects, software measurement, 5–6
        benefits for managers, 3–4
        CMMI (Software Engineering Institute), 1
        criteria for effectiveness, 6–8
        Measurement Information Model, 8–9
        Measurement Process Model, 10–12
        necessity for organizations, 5
        reasons for measurement, 2
    Projects, software measurement, examples, constructs
        assessing adequacy of personnel resources, 169–171
        assessing earned value performance information, 169–171
        assessing functional correctness, defect density, 182–184
        assessing functional correctness, defects, 179–181
        assessing functional size and stability of requirements, 176–179
        comparing achieved productivity to bid rates, 188–191
        comparing plan to actual code production, 173–176
        comparing plan to completion of incremental builds, 166–169
        comparing plan to software design completion rate, 163–165
        evaluating completed milestones, 161–163
        evaluating customer satisfaction from test cases, 191–193
        evaluating customer satisfaction of incremental releases, 193–195
        evaluating efficiency of response time requirements, 184–186
        evaluating software development to CMM, 186–188
        Measurement Information Model, 26–29
        Measurement Information Model table, 160

    Ratio scales, 22
        table of Measurement Information Model, 160
    Reliability models, quality estimation, 102–103

    Scales, measurement constructs, 21–22
        table of Measurement Information Model, 160
    Scatter charts, indicators for data analysis, 73–74
    Simple estimating relationships estimation approach, 93
        varying approaches during projects, 94–96
        versus other approaches, 94–96
    Software Engineering Institute’s Capability Maturity Model Integration (CMMI), 1
        evaluating measurement process maturity, 139
        Measurement and Analysis Process Area, 140
    Software engineering, measurement as standard practice, 1
    Software entities, 14–15
        measurable attributes, 19–20
    Software organizations
        characteristics, 4–5
        measurement as necessity, 5
    Software project measurement, 5–6
        benefits for managers, 3–4
        CMMI (Software Engineering Institute), 1
        criteria for effectiveness, 6–8
        Measurement Information Model, 8–9
        Measurement Process Model, 10–12
        necessity for organizations, 5
        reasons for measurement, 2
    Software project measurement, examples, constructs
        assessing adequacy of personnel resources, 169–171
        assessing earned value performance information, 169–171
        assessing functional correctness, defect density, 182–184
        assessing functional correctness, defects, 179–181
        assessing functional size and stability of requirements, 176–179
        comparing achieved productivity to bid rates, 188–191
        comparing plan to actual code production, 173–176
        comparing plan to completion of incremental builds, 166–169
        comparing plan to software design completion rate, 163–165
        evaluating completed milestones, 161–163
        evaluating customer satisfaction from test cases, 191–193
        evaluating customer satisfaction of incremental releases, 193–195
        evaluating efficiency of response time requirements, 184–186
        evaluating software development to CMM, 186–188
        Measurement Information Model (MIM), 26–29
        Measurement Information Model (MIM) table, 160
    Software project measurement, examples, MAPS
        Air Force Business Process Modernization Initiative, 201–203
        background information, 199–201
        management plan, comparing performance to revised plan, 217–222
        management plan, evaluating, 210–212
        management plan, revising, 212–216
        project description, 203–204
        software, evaluating readiness for testing and evaluation, 223–231
        software, installing, 233–234
        software, supporting, 234–237
        system architecture and functionality, 204–207
    Standards, measurement constructs (MIM), 24–25
    Subjective measurement, 21, 160
    Synergy Integrated Copier Case Study
        product/project basics, 239–243
        software development, approaches, 243–246
        software development, estimation and feasibility analysis, 246–250
        software development, performance analysis, 251–253
        software development, redesigning/replanning, 253–257

    Target-based indicators
        feasibility analysis, 107–108
        performance analysis, 114–115
    Technical and Management Processes activity (MPM), 11–12
    Threshold-based indicators
        feasibility analysis, 107–108
        performance analysis, 114–115
    Transaction models, quality estimation, 102–103
    Trend-based indicators
        feasibility analysis, 106–107
        performance analysis, 114–115

    Units, measurement constructs, 22
        table of Measurement Information Model, 160

    Verification of data, 65–66

    Work Breakdown Structure (WBS), 41
