PSM provides you with a way to realize the significant benefits of a software measurement program, while understanding and avoiding the risks involved with a “blind jump.” You’ll find this book a worthwhile starting point for your future software measurement initiatives, as well as a source of continuing guidance as you chart your way through the sea of complex opportunities ahead.
—Barry Boehm, from the Foreword
Objective, meaningful, and quantifiable measurement is critical to the successful development of today’s complex software systems. Supported by the U.S. Department of Defense and a rapidly increasing number of commercial practitioners, Practical Software Measurement (PSM) is a process for designing and implementing a project-based software measurement program. PSM provides essential information on scheduling, resource allocation, and technological performance. It enables software managers and developers to make decisions that will affect the project’s outcome positively.
This book is the official, definitive guide to PSM, written by the leaders of the PSM development initiative. It describes the principles and practices for developing, operating, and continuously improving your organization’s measurement program, and it uses real-world examples to illustrate practical solutions and specific measurement techniques. The book examines the foundations of a software measurement program in depth: defining and prioritizing information needs, developing a project-specific information model, tailoring a process model to integrate measurement activities, and analyzing and understanding the results.
In addition, this book includes numerous detailed examples of measurement constructs typically applied to software projects, as well as two comprehensive case studies that illustrate the implementation of a measurement program in different types of projects. With this book you will have the understanding and information you need to realize the significant benefits of PSM, along with guidance for a long-term, organization-wide measurement program.
PSM is founded on the contributions and collaboration of key practitioners in the software measurement field. The initiative was established in 1994 by John McGarry and is currently managed by Cheryl Jones. Both are civilians employed by the U.S. Army. David Card is an internationally known software measurement expert, and is with the Software Productivity Consortium. Beth Layman, Elizabeth Clark, Joseph Dean, and Fred Hall have been primary contributors to PSM since its inception.
Table of Contents
Foreword.
Preface.
Acknowledgements.
1. Measurement: Key Concepts and Practices.
Motivation for Measurement.
Measurement as an Organizational Discriminator.
The Foundation—Project Measurement.
What Makes Measurement Work.
Measurement Information Model.
Measurement Process Model.
2. Measurement Information Model.
Information Needs.
Measurement Construct.
Measurement Construct Examples.
3. Plan Measurement.
Identify and Prioritize Information Needs.
Select and Specify Measures.
Integrate the Measurement Approach into Project Processes.
4. Perform Measurement.
Collect and Process Data.
Analyze Data.
Make Recommendations.
5. Analysis Techniques.
Estimation.
Feasibility Analysis.
Performance Analysis.
6. Evaluate Measurement.
Evaluate the Measures.
Evaluate the Measurement Process.
Update the Experience Base.
Identify and Implement Improvements.
7. Establish and Sustain Commitment.
Obtain Organizational Commitment.
Define Measurement Responsibilities.
Provide Resources.
Review the Measurement Program.
Lessons Learned.
8. Measure for Success.
Appendix A: Measurement Construct Examples.
Milestone Completion.
Work Unit Progress: Software Design Progress.
Incremental Capability.
Personnel Effort.
Financial Performance: Earned Value.
Appendix B: Information System Case Study.
Project Overview.
Getting the Project Under Control.
Evaluating Readiness for Test.
Installation and Software Support.
Appendix C: Synergy Integrated Copier Case Study.
Product and Project Overview.
Estimation and Feasibility Analysis.
Performance Analysis.
Redesign and Replanning.
Management by fact has become an increasingly popular concept in the software engineering and information technology communities. Organizations are focusing attention on measurement and the use of objective information to make decisions. Quantitative performance information is essential to fact-based management. Practical Software Measurement: Objective Information for Decision Makers describes an approach to management by fact for software project managers based on integrating the concepts of a Measurement Information Model and a Measurement Process Model. While these concepts apply to non-software activities as well, the examples and terminology presented in this book focus primarily on software.
The information needs of the decision maker drive the selection of software measures and associated analysis techniques. This is the premise behind the widely accepted approaches to software measurement, including goal/question/metric (Basili and Weiss, 1984) and issue/category/measure (McGarry et al., 1997). Information needs result from the efforts of managers to influence the outcomes of projects, processes, and initiatives toward defined objectives. Information needs are usually derived from two sources: (1) goals that the manager seeks to achieve and (2) obstacles that hinder the achievement of these goals. Obstacles, also referred to as issues, include risks, problems, and a lack of information in a goal-related area. Unless there is a manager or other decision maker with an information need, measurement serves no purpose. The issues faced by a software project manager are numerous. Typically they include estimating and allocating project resources, tracking progress, and delivering products that meet customer specifications and expectations.
A Measurement Information Model defines the relationship between the information needs of the manager and the objective data to be collected, commonly called measures. It also establishes a consistent terminology for basic measurement ideas and concepts, which is critical to communicating the measurement information to decision makers. The information model in Practical Software Measurement (PSM) defines three levels of measures, or quantities: (1) base measures, (2) derived measures, and (3) indicators. Notably, these three levels roughly correspond to the three-level structures advocated by many existing measurement approaches, including the goal/question/metric (Basili and Weiss, 1984), factor/criteria/metric (Walters and McCall, 1977), and issue/category/measure (McGarry et al., 1997) approaches already in use within the software community. A similar generic data structure for measurement was developed by Kitchenham et al. (1995), who defined their structure as an entity relationship diagram.
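To make the three levels concrete, here is a minimal sketch in Python of how base measures combine into a derived measure, which decision criteria then turn into an indicator. The names, values, and the 0.90 threshold are illustrative assumptions, not definitions taken from PSM or ISO/IEC 15939.

```python
# Illustrative sketch of the three-level information model:
# base measures -> derived measure -> indicator.
# All names and thresholds are hypothetical, chosen only to show
# how each level builds on the one below it.

from dataclasses import dataclass

@dataclass
class BaseMeasure:
    """A single attribute quantified by a measurement method."""
    name: str
    value: float
    unit: str

def derived_measure(planned: BaseMeasure, actual: BaseMeasure) -> float:
    """Combine two base measures with a measurement function."""
    return actual.value / planned.value  # e.g., a design progress ratio

def indicator(progress_ratio: float, threshold: float = 0.90) -> str:
    """Apply a decision criterion to a derived measure, yielding an
    indicator that addresses the manager's information need."""
    return "on track" if progress_ratio >= threshold else "investigate"

planned = BaseMeasure("design units planned to date", 120, "units")
actual = BaseMeasure("design units completed to date", 96, "units")
ratio = derived_measure(planned, actual)  # 0.80
print(f"progress ratio = {ratio:.2f}: {indicator(ratio)}")
```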
An effective measurement process must address the selection of appropriate measures as well as provide for effective analysis of the collected data. The Measurement Process Model describes a set of related measurement activities that are generally applicable regardless of the specific information needs of any particular situation. The process consists of four iterative measurement activities: establish, plan, perform, and evaluate. This process is similar to the widely used Plan-Do-Check-Act cycle (Deming, 1986).
Recognition of a need for fact-based, objective information leads to the establishment of a measurement process for a project or an organization. The specific information needs of the decision makers and measurement users drive the selection and definition of appropriate measures during measurement planning. The resulting measurement approach instantiates a project-specific information model that identifies the base measures, derived measures, and indicators to be employed, as well as the analysis techniques to be applied to address the project’s prioritized information needs.
As the measurement plan is implemented, or performed, the required measurement data is collected and analyzed. The information product that results from the perform measurement activity is provided to the decision makers. Feedback from these measurement users helps evaluate the effectiveness of the measures and the measurement process so that both can be improved on a continuing basis.
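The sketch below traces one pass through this plan-perform-evaluate loop in Python. The data structures and the simple improvement rule are hypothetical, intended only to show how feedback from measurement users flows back into the plan; they are not the book's procedures.

```python
# Hypothetical sketch of one iteration of the measurement process:
# plan -> perform -> evaluate, with feedback driving improvement.
# All names and data are invented for illustration.

def plan_measurement(information_needs):
    # Select one measure for each prioritized information need.
    return {need: f"measure of {need}" for need in information_needs}

def perform_measurement(plan, collected_data):
    # Analyze collected data into an information product for decision makers.
    return {need: collected_data.get(measure) for need, measure in plan.items()}

def evaluate_measurement(information_product):
    # Feedback: flag needs whose measures yielded no usable data.
    return [need for need, value in information_product.items() if value is None]

needs = ["schedule progress", "product quality"]
plan = plan_measurement(needs)
collected = {"measure of schedule progress": 0.8}  # quality data never arrived
product = perform_measurement(plan, collected)
for need in evaluate_measurement(product):
    print(f"improve data collection for: {need}")  # feeds the next iteration
```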
The basic concepts presented in this book evolved from extensive measurement experience and prior research. They were previously introduced in sequentially released versions of Practical Software Measurement (McGarry et al., 1997) and were formalized in ISO/IEC Standard 15939—Software Measurement Process (2001). The measurement process model and measurement terminology from ISO/IEC 15939 have also been adopted as the basis of a new Measurement and Analysis Process Area in the Software Engineering Institute’s Capability Maturity Model Integration (CMMI) project (CMMI Development Team, 2000). This book explains how software development and maintenance organizations can implement a viable measurement process based on the proven measurement concepts of ISO/IEC 15939 and the CMMI in a practical and understandable manner.
In simple terms, implementing an objective, fact-based measurement process for a software-intensive project encompasses defining and prioritizing the information needs of the project decision makers, developing a project-specific information model from those needs, and then tailoring and executing a project-specific set of measurement process activities. The PSM approach to accomplishing this integrates prior experience and research from many sources across many application domains.
To enhance readability, the authors have limited most of the in-text references to suggestions for further reading on specific topics. Additional references are provided in the bibliography.
The following topics are addressed in this book:
Chapter 1: Measurement: Key Concepts and Practices. Chapter 1 provides an overview of software measurement, explaining how measurement supports today’s information-oriented business models and how measurement can become a corporate resource. It describes the relationships between project- and organizational-level measurement, and introduces the two primary concepts of PSM: the Measurement Information Model and the Measurement Process Model.
Chapter 2: Measurement Information Model. Chapter 2 presents an in-depth discussion of the Measurement Information Model and its measurement components. It relates the Measurement Information Model to measurement planning and implementation activities.
Chapter 3: Plan Measurement. Chapter 3 is the first of five chapters that look at the individual measurement process activities in detail. Chapter 3 focuses on the Plan Measurement activity and describes what is required to define an information-driven, project-specific measurement plan.
Chapter 4: Perform Measurement. Chapter 4 addresses the Perform Measurement activity and discusses how to collect and analyze measurement data. It introduces several concepts related to measurement analysis, including the types of analysis and how to relate information needs and associated issues in terms of cause and effect.
Chapter 5: Analysis Techniques. Chapter 5 provides an in-depth treatment of the three primary types of measurement analysis: estimation, feasibility analysis, and performance analysis.
Chapter 6: Evaluate Measurement. Chapter 6 describes the Evaluate Measurement activity. It focuses on the assessment, evaluation, and improvement of applied project measures and the implemented project measurement processes.
Chapter 7: Establish and Sustain Commitment. Chapter 7 explains the final measurement activity, Establish and Sustain Commitment, which addresses the organizational requirements related to implementing a viable project measurement process. Chapter 7 also addresses measurement “lessons learned.”
Chapter 8: Measure for Success. Chapter 8 reviews some of the major concepts presented in this book and identifies key measurement success factors.
Appendix A: Measurement Construct Examples. Appendix A provides detailed examples of measurement constructs typically applied to software-intensive projects.
Appendix B: Information System Case Study. Appendix B provides a comprehensive case study that addresses the implementation of a measurement process for a typical information system.
Appendix C: Synergy Integrated Copier Case Study. Appendix C is a second case study that describes how measurement can be applied to a major software-intensive upgrade project.
Index

Activity aggregation structures, 52–54
Activity-based model
  estimation approach, 92
  varying approaches during projects, 96–97
  versus other approaches, 94–96
Aggregation structures, 52–54
  data verification, 66
Analogy model estimation approach, 92–93
  varying approaches during projects, 96–97
  versus other approaches, 94–96
Analysis models, measurement constructs, 23–24
  examples, 26–29
Analysis techniques
  estimation, approaches, 90–97
  estimation, basics, 86
  estimation, calibrating and mapping data, 97–98
  estimation, computing, 98–99
  estimation, effort, 100–101
  estimation, estimators, 87–89
  estimation, evaluating estimates, 103–104
  estimation, Integrated Analysis Model, 87
  estimation, poor factors, 86–87
  estimation, process steps, 89–90
  estimation, quality, 102–103
  estimation, schedules, 101–102
  estimation, size, 99–100
  evaluating measures, 127–131
  feasibility, basics, 104–106
  feasibility, indicators, 106–108
  feasibility, process, 108–112
  performance, basics, 112–113
  performance, indicators, 114–117
  performance, Integrated Analysis Model, 113–114, 119
  performance, plans, comparing to performance, 118–121
  performance, plans, evaluating alternatives, 123–124
  performance, problems, assessing impact, 121–122
  performance, problems, predicting impact, 122–123
  performance, process, 117–118
Analyze Data task
  basics, 65–68
  indicators, generating, 68–71
  indicators, representing graphically, 71–75
  Integrated Analysis Model, 75–81
Attributes, measurement constructs, 18–20
  examples, 26–29
  table of Measurement Information Model, 160
Baldrige, Malcolm, Award, 155
Bar charts, indicators for data analysis, 73
Base measures, measurement constructs, 18–20, 24–25
  evaluating measures, 127–131
  examples, 26–29
  planned and actual completion, 62–63
  relation to indicators, 68
  selecting/specifying, 39–48
  table of Measurement Information Model, 160
Capability Maturity Model Integration (CMMI), Software Engineering Institute, 1
  evaluating measurement process maturity, 139
  example, evaluating software, 186–188
  Measurement and Analysis Process Area, 140
  recognition of measurement’s importance, 155
Charts. See Graphs for indicators
Coding Progress indicators, 47–48
Component aggregation structures, 52–54
Data, measurement constructs, 24
  analyzing, indicator generation, 68–71
  analyzing, indicator graphical representation, 71–75
  analyzing, types of analysis, 65, 67–68
  collecting and processing, 61–64
  Integrated Analysis Model, 75–81
  making recommendations after analysis, 81–83
  types of data, 49–50
  verifying, 65–66
Decision criteria, measurement constructs, 24
  table of Measurement Information Model, 160
Derived measures, measurement constructs, 22–25
  evaluating measures, 127–131
  examples, 26–29
  relation to indicators, 68
  selecting/specifying, 39–48
  table of Measurement Information Model, 160
Establish and Sustain Commitment activity (MPM), 10–12
  defining measurement responsibilities, 146–147
  evaluating measurement process, 137–138
  obtaining organizational commitment, 144–145
  providing measurement resources, 147–150
  reviewing measurement program, 150–151
Estimation, 65, 67
  approaches, activity-based models, 92
  approaches, analogy, 92–93
  approaches, comparing models, 94–96
  approaches, parametric models, 90–92
  approaches, selecting models, 94
  approaches, simple estimating relationships, 93
  approaches, varying during projects, 96–97
  basics, 86
  calibrating models with local historical data, 97–98
  computing, 98–99
  effort, 100–101
  estimators, 87–89
  evaluating estimates, 103–104
  Integrated Analysis Model, 87
  mapping model’s assumptions to project’s characteristics, 97–98
  poor estimation factors, 86–87
  process steps, 89–90
  quality, 102–103
  role in data analysis, 79–81
  schedules, 101–102
  size, 99–100
  Synergy Integrated Copier Case Study, 246–250
Estimators, type of indicators, 87–89
Evaluate Measurement activity (MPM), 10–12
  assessing measurement process, conformance, 131, 134–138
  assessing measurement process, maturity, 131, 138–140
  assessing measurement process, performance, 131–134
  assessing products of measurement process, 127–131
  basic tasks, 125–127
  identifying/implementing process improvements, 141–142
  updating experience base, 140–141
Examples, measurement constructs
  assessing adequacy of personnel resources, 169–171
  assessing earned value performance information, 169–171
  assessing functional correctness, defect density, 182–184
  assessing functional correctness, defects, 179–181
  assessing functional size and stability of requirements, 176–179
  comparing achieved productivity to bid rates, 188–191
  comparing plan to actual code production, 173–176
  comparing plan to actual software design completion rate, 163–165
  comparing plan to completion of incremental builds, 166–169
  evaluating completed milestones, 161–163
  evaluating customer satisfaction from test cases, 191–193
  evaluating customer satisfaction of incremental releases, 193–195
  evaluating efficiency of response time requirements, 184–186
  evaluating software development to CMM, 186–188
  Measurement Information Model, 26–29
  Measurement Information Model table, 160
Experience base, updating during Evaluate Measurement activity (MPM), 140–141
Feasibility analysis, 67
  basics, 104–106
  indicators, 106–108
  process, 108–112
  Synergy Integrated Copier Case Study, 246–250
Functional aggregation structures, 52–54
Functions, measurement constructs (MIM), 23
  examples, 26–29
  table of Measurement Information Model, 160
Graphs for indicators, data analysis
  guidelines for effectiveness, 74–75
  types, 71–74
Historical measurement data, 49–50
  calibrating model’s estimations, 97–98
  computing estimations, 99
  data analysis, 67
Indicators, measurement constructs (MIM), 23–25
  Coding Progress indicators, 47–48
  data analysis, generating new indicators, 68–71
  data analysis, graphical representation, 71–75
  estimators, 87–89
  evaluating measures, 127–131
  examples, 26–29
  feasibility analysis, 106–108
  performance analysis, 114–117
  selecting/specifying, 39–47
  table of Measurement Information Model, 160
Information needs, 14–15
  categories, 16–17, 35–36, 160
  identifying, 33–35
  issues, 16
  prioritizing, 36–38
  selecting/specifying measures, 39–48
  table of Measurement Information Model, 160
Information products, 9, 14–15
Information-driven measurement, 7–8
Integrated Analysis Model
  basics, 75–77
  estimation, 87
  example, 79–81
  performance analysis, 113–114, 119
  relationships between measurable concepts, 77–79
Interval scales, 22
  table of Measurement Information Model, 160
ISO/IEC 15939 standard, 15
  evaluating measurement process compliance, 134, 138
  evaluating measurement process maturity, 139
  evaluating products of measurement process, 127
  Measurement Process definition, 140
Issues, information needs, 16
Line charts, indicators for data analysis, 72–73
Major Automated Information Systems Review Council (MAISRC), 200–201
Make Recommendations task, 81–83
Malcolm Baldrige Award, 155
MAPS. See Military Automated Personnel System
Measurable concepts, 14–15
  defining, 41–42
  table of Measurement Information Model, 160
Measurement constructs, 14–15, 159. See also Examples, measurement constructs
  analysis models, 23–24
  attributes (measurable), 18–20
  base measures, 18–21, 24–25
  benefits, 20
  data, 24
  decision criteria, 24
  derived measures, 22–25
  functions, 23
  indicators, 23–25
  levels, 17–18
  measurable concepts, defining, 41–42
  measurement methods, 21
  measures, 24
  scales, 21–22, 160
  specifying, 45–48
  standards, 24–25
  structure, 18–19
  table of Measurement Information Model, 160
  units of measurements, 22
Measurement Information Model, 159
  basics, 8–9
  information needs, 14–17
  information products, 14–15
  ISO/IEC 15939 standard, 15, 127, 134, 138–140
  measurable concepts, 14–15
  measurement constructs, 14–15, 17–25 (See also Examples, measurement constructs)
  measurement plans, 14–15
  measurement procedures, 14–15
  software entities, 14–15
  table of model, 160
  terminology, lack of agreed-on definitions, 13
Measurement methods, measurement constructs, 21
  examples, 26–29
  table of Measurement Information Model, 160
Measurement of software projects, 5–6
  benefits for managers, 3–4
  CMMI (Software Engineering Institute), 1
  criteria for effectiveness, 6–8
  Measurement Information Model, 8–9
  Measurement Process Model, 10–12
  organizational necessity, 5
  reasons to measure, 2
Measurement of software projects, examples, constructs
  assessing adequacy of personnel resources, 169–171
  assessing earned value performance information, 169–171
  assessing functional correctness, defect density, 182–184
  assessing functional correctness, defects, 179–181
  assessing functional size and stability of requirements, 176–179
  comparing achieved productivity to bid rates, 188–191
  comparing plan to actual code production, 173–176
  comparing plan to completion of incremental builds, 166–169
  comparing plan to software design completion rate, 163–165
  evaluating completed milestones, 161–163
  evaluating customer satisfaction from test cases, 191–193
  evaluating customer satisfaction of incremental releases, 193–195
  evaluating efficiency of response time requirements, 184–186
  evaluating software development to CMM, 186–188
  Measurement Information Model, 26–29
  Measurement Information Model table, 160
Measurement of software projects, examples, MAPS
  Air Force Business Process Modernization Initiative, 201–203
  background information, 199–201
  management plan, comparing performance to revised plan, 217–222
  management plan, evaluating, 210–212
  management plan, revising, 212–216
  project description, 203–204
  software, evaluating readiness for testing and evaluation, 223–231
  software, installing, 233–234
  software, supporting, 234–237
  system architecture and functionality, 204–207
Measurement of software projects, examples, Synergy Integrated Copier Case Study
  product/project basics, 239–243
  software development, approaches, 243–246
  software development, estimation and feasibility analysis, 246–250
  software development, performance analysis, 251–253
  software development, redesigning/replanning, 253–257
Measurement plans. See Plan Measurement activity (MPM)
Measurement procedures, 14–15
  developing, 50–52
Measurement Process Model (MPM), 10–12
Measurement tools, 149–150
Measurement training, 148–149, 152
Measurements, evaluating. See Evaluate Measurement activity (MPM)
Measures, measurement constructs, 24
Metric measurements, 13
Military Automated Personnel System (MAPS), case study
  Air Force Business Process Modernization Initiative, features, 201–203
  background information, 199–201
  management plan, comparing performance to revised plan, 217–222
  management plan, evaluating, 210–212
  management plan, revising, 212–216
  project description, 203–204
  software, evaluating readiness for testing and evaluation, 223–231
  software, installing, 233–234
  software, supporting, 234–237
  system architecture and functionality, 204–207
MIM. See Measurement Information Model (MIM)
MPM. See Measurement Process Model (MPM)
Nominal scales, 22
Objective measurement, 21, 160
Objectives, software projects
  information needs, 33
Ordinal scales, 22
  table of Measurement Information Model, 160
Parametric model estimation approach, 90–92
  varying approaches during projects, 96–97
  versus other approaches, 94–96
Perform Measurement activity (MPM), 10–11
  data analysis, indicator generation, 68–71
  data analysis, indicator graphical representation, 71–75
  data analysis, types, 65, 67–68
  data, collecting and processing, 61–64
  data, verifying, 65–66
  evaluating measurement process, 135–137
  Integrated Analysis Model, 75–81
  making recommendations, 81–83
  measurement constructs, 25
Performance analysis, 67–68
  basics, 112–113
  comparing plans to performance, 118–121
  comparing plans, examples, design activities, 163–165
  comparing plans, examples, development activities and events, 161–163
  evaluating alternative plans, 123–124
  indicators, 114–117
  Integrated Analysis Model, 113–114, 119
  measurement data, 49–50
  problems, assessing impact, 121–122
  problems, predicting impact, 122–123
  process, 117–118
  Synergy Integrated Copier Case Study, 251–253
Plan Measurement activity (MPM), 10–11
  basics, 31–33
  documenting plans, 55–57
  evaluating measurement process, 135–137
  information needs, categories, 35–36
  information needs, identifying, 33–35
  information needs, prioritizing, 36–38
  initiating after review of measurement process, 151
  measurement constructs, 25
  measurement plans, 14–15
  measures, integrating approaches into projects, 48–54
  measures, selecting and specifying, 39–48
  reporting plan progress, 54–55
Planning measurement data, 49–50
  data analysis, 67
Projects, software measurement, 5–6
  benefits for managers, 3–4
  CMMI (Software Engineering Institute), 1
  criteria for effectiveness, 6–8
  Measurement Information Model, 8–9
  Measurement Process Model, 10–12
  necessity for organizations, 5
  reasons for measurement, 2
Projects, software measurement, examples, constructs
  assessing adequacy of personnel resources, 169–171
  assessing earned value performance information, 169–171
  assessing functional correctness, defect density, 182–184
  assessing functional correctness, defects, 179–181
  assessing functional size and stability of requirements, 176–179
  comparing achieved productivity to bid rates, 188–191
  comparing plan to actual code production, 173–176
  comparing plan to completion of incremental builds, 166–169
  comparing plan to software design completion rate, 163–165
  evaluating completed milestones, 161–163
  evaluating customer satisfaction from test cases, 191–193
  evaluating customer satisfaction of incremental releases, 193–195
  evaluating efficiency of response time requirements, 184–186
  evaluating software development to CMM, 186–188
  Measurement Information Model, 26–29
  Measurement Information Model table, 160
Ratio scales, 22
  table of Measurement Information Model, 160
Reliability models, quality estimation, 102–103
Scales, measurement constructs, 21–22
  table of Measurement Information Model, 160
Scatter charts, indicators for data analysis, 73–74
Simple estimating relationships estimation approach, 93
  varying approaches during projects, 96–97
  versus other approaches, 94–96
Software Engineering Institute’s Capability Maturity Model Integration (CMMI), 1
  evaluating measurement process maturity, 139
  Measurement and Analysis Process Area, 140
Software engineering, measurement as standard practice, 1
Software entities, 14–15
  measurable attributes, 19–20
Software organizations
  characteristics, 4–5
  measurement as necessity, 5
Software project measurement, 5–6
  benefits for managers, 3–4
  CMMI (Software Engineering Institute), 1
  criteria for effectiveness, 6–8
  Measurement Information Model, 8–9
  Measurement Process Model, 10–12
  necessity for organizations, 5
  reasons for measurement, 2
Software project measurement, examples, constructs
  assessing adequacy of personnel resources, 169–171
  assessing earned value performance information, 169–171
  assessing functional correctness, defect density, 182–184
  assessing functional correctness, defects, 179–181
  assessing functional size and stability of requirements, 176–179
  comparing achieved productivity to bid rates, 188–191
  comparing plan to actual code production, 173–176
  comparing plan to completion of incremental builds, 166–169
  comparing plan to software design completion rate, 163–165
  evaluating completed milestones, 161–163
  evaluating customer satisfaction from test cases, 191–193
  evaluating customer satisfaction of incremental releases, 193–195
  evaluating efficiency of response time requirements, 184–186
  evaluating software development to CMM, 186–188
  Measurement Information Model (MIM), 26–29
  Measurement Information Model (MIM) table, 160
Software project measurement, examples, MAPS
  Air Force Business Process Modernization Initiative, 201–203
  background information, 199–201
  management plan, comparing performance to revised plan, 217–222
  management plan, evaluating, 210–212
  management plan, revising, 212–216
  project description, 203–204
  software, evaluating readiness for testing and evaluation, 223–231
  software, installing, 233–234
  software, supporting, 234–237
  system architecture and functionality, 204–207
Standards, measurement constructs (MIM), 24–25
Subjective measurement, 21, 160
Synergy Integrated Copier Case Study
  product/project basics, 239–243
  software development, approaches, 243–246
  software development, estimation and feasibility analysis, 246–250
  software development, performance analysis, 251–253
  software development, redesigning/replanning, 253–257
Target-based indicators
  feasibility analysis, 107–108
  performance analysis, 114–115
Technical and Management Processes activity (MPM), 11–12
Threshold-based indicators
  feasibility analysis, 107–108
  performance analysis, 114–115
Transaction models, quality estimation, 102–103
Trend-based indicators
  feasibility analysis, 106–107
  performance analysis, 114–115
Units, measurement constructs, 22
  table of Measurement Information Model, 160
Verification of data, 65–66
Work Breakdown Structure (WBS), 41