Practical Guide to Testing Object-Oriented Software, A

  • Sorry, this book is no longer in print.

Description

  • Copyright 2001
  • Dimensions: 7-1/4" x 9-1/4"
  • Pages: 416
  • Edition: 1st
  • Book
  • ISBN-10: 0-201-32564-0
  • ISBN-13: 978-0-201-32564-5

While developers and IT organizations increasingly acknowledge the importance of software testing, few know how to proceed — especially when it comes to testing advanced object-oriented software systems. In this book, two leading O-O test researchers and consultants outline a start-to-finish methodology for testing: what to test, why to test it, how to test it, who should do the testing, and when. The book is organized around a task orientation, encompassing testing models; testing components, systems, and subsystems; and planning for testing. The authors review the unique challenges associated with object-oriented software testing, offer practical insights into testing priorities, introduce the leading testing techniques, and walk step-by-step through applying them. They review the development of custom test software, and demonstrate how to strengthen the ties between testing and the rest of the development process. Features include a detailed object-oriented testing FAQ, and a running case study that ties together all stages and elements of O-O testing. For every IT manager, project manager, software developer and engineer, and for any professional concerned with the measurement of software quality.

Downloads

Source Code

Download the Source Code Files for this book:

Java version of Brickles

C++ version of Brickles

Extras

Related Article

Introduction to Testing Object-Oriented Software

Web Resources

Web Resources related to this title:
Author's Web Site

Sample Content

Table of Contents



1. Introduction.

Who Should Read This Book?

What Software Testing Is and Isn't.

What Is Different about Testing Object-Oriented Software?

Overview of Our Testing Approach.

Test Early.

Test Often.

Test Enough.

The Testing Perspective.

Organization of This Book.

Conventions Used in This Book.

A Continuing Example — Brickles.

Basic Brickles Components.

Brickles Physics.

Game Environment.



2. The Testing Perspective.

Testing Perspective.

Object-Oriented Concepts.

Object.

Message.

Interface.

Class.

Inheritance.

Polymorphism.

Development Products.

Analysis Models.

Design Models.

Source Code.

Summary.



3. Planning for Testing.

A Development Process Overview.

A Testing Process Overview.

Risk Analysis — A Tool for Testing.

Risks.

Risk Analysis.

A Testing Process.

Planning Issues.

Dimensions of Software Testing.

Who Performs Testing?

Which Pieces Are Tested?

When Is Testing Performed?

How Is Testing Performed?

How Much Testing Is Adequate?

Roles in the Testing Process.

Class Tester.

Integration Tester.

System Tester.

Test Manager.

A Detailed Set of Test Activities.

Planning Activities.

Scheduling Testing Activities.

Estimation.

A Process for Testing Brickles.

Document Templates.

Test Metrics.

Summary.



4. Testing Analysis and Design Models.

An Overview.

Place in the Development Process.

The Basics of Guided Inspection.

Evaluation Criteria.

Organization of the Guided Inspection Activity.

Basic Roles.

Individual Inspection.

Preparing for the Inspection.

Specifying the Inspection.

Realistic Models.

Selecting Test Cases for the Inspection.

Creating Test Cases.

Completing Checklists.

The Interactive Inspection Session.

Testing Specific Types of Models.

Requirements Model.

Analysis Models.

Design Models.

Testing Again.

Testing Models for Additional Qualities.

Summary.

Model Testing Checklist.

Addendum: A Process Definition for Guided Inspection.

Steps in the Process.

Detailed Step Descriptions.

Roles in the Process.



5. Class Testing Basics.

Class Testing.

Ways to Test a Class.

Dimensions of Class Testing.

Constructing Test Cases.

Adequacy of Test Suites for a Class.

Constructing a Test Driver.

Test Driver Requirements.

Tester Class Design.

Summary.



6. Testing Interactions.

Object Interactions.

Identifying Interactions.

Specifying Interactions.

Testing Object Interactions.

Testing Collection Classes.

Testing Collaborator Classes.

The Interaction between Testing and Design Approach.

Sampling Test Cases.

Orthogonal Array Testing.

Adequacy Criteria for OATS.

Another Example.

Another Application of OATS.

Testing Off-the-Shelf Components.

Case Study in Component Acceptance Testing.

Protocol Testing.

Test Patterns.

Listener Test Pattern.

Specific Example.

Testing Exceptions.

Testing Interactions at the System Level.

Summary.



7. Testing Class Hierarchies.

Inheritance in Object-Oriented Development.

Subclass Test Requirements.

Refinement Possibilities.

Hierarchical, Incremental Testing.

Organizing Testing Software.

Testing Abstract Classes.

Summary.



8. Testing Distributed Objects.

Basic Concepts.

Computational Models.

Concurrent.

Parallel.

Networked.

Distributed.

Basic Differences.

Non-Determinism.

Additional Infrastructure.

Partial Failures.

Time-Outs.

Dynamic Nature of the Structure.

Threads.

Synchronization.

Path Testing in Distributed Systems.

Thread Models.

Life Cycle Testing.

Models of Distribution.

Basic Client/Server Model.

Standard Models of Distribution.

Comparisons and Implications.

A Generic Distributed Component Model.

Basic Architecture.

Local and Remote Interfaces.

Specifying Distributed Objects.

Interface Definition Language.

Traditional Pre/Post-Conditions and Invariants.

Temporal Logic.

Temporal Test Patterns.

Eventually(a).

Until(a,b).

Always.

A Test Environment.

Class Testing.

Interaction Testing.

Test Cases.

Model-Specific Tests.

Testing Every Assumption.

Infrastructure Tests.

Logic-Specific Test Cases.

The Ultimate Distributed System — The Internet.

Web Servers.

Life Cycle Testing of Internet Applications.

What Haven't We Said?

Summary.



9. Testing Systems.

Defining the System Test Plan.

Features Tested and Not Tested.

Test Suspension Criteria and Resumption Requirements.

Complementary Strategies for Selecting Test Cases.

Use Profile.

ODC.

Use Cases as Sources of Test Cases.

Constructing Use Profiles.

Using Scenarios to Construct Test Cases.

The Expected Results Section of a Test Case.

Brickles.

Testing Incremental Projects.

Legacy Projects.

Testing Multiple Representations.

What Needs to Be Tested.

Testing Against Functional Requirements.

Testing for Qualitative System Attributes.

Testing the System Deployment.

Testing After Deployment.

Testing Environment Interactions.

Test System Security.

Types of Testing.

Stress Testing.

Life Cycle Testing.

Performance Testing.

Testing Different Types of Systems.

Reactive Systems.

Embedded Systems.

Multi-Tiered Systems.

Distributed Systems.

Measuring Test Coverage.

What Is to Be Covered?

When Is Coverage Measured?

When Is Coverage Used?

ODC — Defect Impacts.

More Examples.

Summary.



10. Components, Frameworks, and Product Lines.

Component Models.

Enterprise JavaBeans Component Model.

Testing Components vs. Objects.

Component Test Processes.

Test Cases Based on Interfaces.

Case Study — A GameBoard Component.

Frameworks.

Basic Issues.

Framework Testing Processes.

Inspecting a Framework.

Structuring Test Cases to Support a Framework.

Product Lines.

Testing at the Organizational Management Level.

Testing at the Technical Management Level.

Testing at the Software Engineering Level.

Testing in a Product Line Project.

Future.

Summary.



11. Conclusion.

Suggestions.

Organization and Process.

Data.

Standards.

Software Infrastructure.

Techniques.

Risks.

Brickles.

Finally.



Bibliography.


Index.

Preface

Testing software is a very important and challenging activity. This is a book for people who test software during its development. Our focus is on object-oriented and component-based software, but you can apply many of the techniques discussed in this book regardless of the development paradigm. We assume our reader is familiar with testing procedural software — that is, software written in the procedural paradigm using languages such as C, Ada, Fortran, or COBOL. We also assume our reader is familiar with, and somewhat experienced in, developing software using object-oriented and component-based technologies. Our focus is on what to test in object-oriented development efforts, on techniques for how to test object-oriented software, and on how testing software built with these newer technologies differs from testing procedural software.

What is software testing? To us, testing is the evaluation of the work products created during a software development effort. This is more general than just checking part or all of a software system to see if it meets its specifications. Testing software is a difficult process, in general, and sufficient resources are seldom available for testing. From our standpoint, testing is done throughout a development effort and is not just an activity tacked on at the end of a development phase to see how well the developers did. We see testing as part of the process that puts quality into a software system. As a result, we address the testing of all development products (models) even before any code is written.
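The idea of checking code against its specification — rather than only exercising it — can be made concrete with a small, self-contained sketch. The following Java test driver is a hypothetical illustration (the Counter class and its specification are invented for this example, not taken from the book): a test case is derived from an operation's postcondition and the class invariant, in the spirit of specification-based class testing.

```java
// Hypothetical example: a specification-based check for a simple class.
// Specification assumed here: increment() raises the count by exactly 1,
// and the count is never negative (class invariant).
public class CounterTest {

    static class Counter {
        private int count = 0;
        public int getCount() { return count; }
        public void increment() { count++; }
    }

    // Test case derived from the postcondition of increment().
    static boolean testIncrementPostcondition() {
        Counter c = new Counter();
        int before = c.getCount();
        c.increment();
        // Postcondition: count == old count + 1; invariant: count >= 0.
        return c.getCount() == before + 1 && c.getCount() >= 0;
    }

    public static void main(String[] args) {
        System.out.println(testIncrementPostcondition() ? "PASS" : "FAIL");
    }
}
```

A driver like this evaluates a work product (the class) against what its specification promises, which is the sense of "testing" used throughout the book.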

We do not necessarily believe that you will apply everything we describe in this book. There are seldom enough resources available to a development effort to do all the levels and kinds of testing we would like. We hope you will find a number of approaches and techniques that will prove useful to and affordable for your project.

In this book we describe a set of testing techniques. All of the techniques we describe have been applied in practice. Many of these techniques have been used in a wide variety of industries and on projects of vastly different sizes. In Chapter 3, we will consider the impact of some of these variables on the types of testing that are routinely performed.

To describe these techniques, we rely in many cases on one or more examples to illustrate their application. We hope from these examples and from our explanations that you can apply the same techniques to your project software in a straightforward manner. The complete code for these examples, test code, and other resources can be obtained via a link off this Web site.

In order to make this book as useful as possible, we will provide two major organizational threads. The physical layout of the book will follow the usual sequence of events as they happen on a project. Model testing will be addressed earlier than component or code testing, for example. We will also include a set of questions that a tester might ask when he or she is faced with specific testing tasks on a project. This testing FAQ will be tied into the main body of the text with citations.

We have included alternative techniques and ways of adapting techniques for varying the amount of testing. Testing life-critical or mission-critical software requires more effort than testing an arcade game. The summary sections of each chapter should make these choices clear.

This book is the result of many years of research, teaching, and consulting both in the university and in companies. We would like to thank the sponsors of our research, including COMSOFT, IBM, and AT&T for their support of our academic research. Thanks to the students who assisted in the research and those who sat through many hours of class and provided valuable feedback on early versions of the text. The consultants working for Korson-McGregor, formerly Software Architects, made many suggestions and worked with early versions of the techniques while still satisfying client needs. The employees of numerous consulting clients helped us perfect the techniques by providing real problems to be solved and valuable feedback. A special thanks to Melissa L. Russ (formerly Major) who helped teach several tutorials and made her usual insightful comments to improve the material.

Most of all, we wish to thank our families for enduring our mental and physical absences and for the necessary time to produce this work: Gayle and Mary Frances McGregor; Susan, Aaron, Perry, and Nolan Sykes.

JDM
DAS




Index

Abstract classes, 35-36
testing of, 263-66
Abstraction, 39
Acceptance-level component test process, 348
Acceptance testing, 133, 323
of components, 237-41
on frameworks, 360
of systems, 310
Accessor (inspector) operations, 24-25, 53
baseline testing of, 193, 194
Action, 49
Activity, 49
Activity diagrams, 41
in analysis models, 54-55
Actor profiles, 316-17
Actual parameters of messages, 20
Adequacy criteria for orthogonal array testing system, 234-35
Adequacy of testing, idea of, 84-86
"All events" level of coverage, 335
Always temporal operator, 293
Analysis
in STEP technique, 72
testing during, 5-6
Analysis models, 40-55
activity diagrams in, 54-55
application analysis, 41
class diagrams in, 45-48, 49
class specifications in, 49-51
domain analysis, 40-41
guided inspection of, 138-41
level of detail in testing, 82
purpose of, 42
sequence diagrams in, 51-54, 55
state diagrams in, 49, 50, 53
use case diagrams in, 42-45
Analysis phase of development process, 66
Ancestors, 32
API testing, 304
Application, system vs., 310
Application analysis, 41, 66, 115
testing activities, 88
Application-analysis models
guided inspection of, 140-41
mapping domain models onto, 140
Application implementation testing activities, 90
Application-level use cases, 44
Application life cycles, 331-32
Arcade games, domain-level use cases for, 43-44
Architects, role in testing class diagram, 149
Architectural design, 115
guided inspection of, 142-48
evaluating performance and scalability, 147-48
inspection criteria, 143
roles in inspection, 142
test case for, 142, 145, 146-47
test execution, 146
verification of results, 146
testing activities, 89
Architectural models, performance goals in, 120
Architecture
software, 142
three-tiered, 146-47
Assertion checking in class under test (CUT), 194
Asynchronous messages, 287, 302
Attitudes toward testing, 2
Base class (superclass), 32
Baseline testing, 193-95
BeanBox, 346
Beans, 345-46
Behavior
constrained by substitution principle, 33
of instances, 27
BetterState, 143
Black box testing (functional testing), 84, 187
Boundary conditions, 180
Boundary values for states, 179
Brickles, 10-14
abstract classes in, 36
activity diagram, 54
all events coverage for, 335
application analysis of, 41
architectural model for, 145
change case for, 152
class diagram for, 45-47
class risks in, 166
class testing in, 169
collaborating classes in, 219, 223
collection classes in, 218, 219
components of, 11
design class diagram for, 56-57, 58
detailed design model for, 148
domain analysis of, 40
exceptions in, 245
frameworks in, 359
game environment of, 14
guided inspection of, 111-14
incremental iterative development process, 68-69
inheritance in, 34-35, 256-57
Java implementation of, 270, 278, 303
objects in, 18
package diagram for, 47
performance testing of, 333
physics of, 11-13
player actor in, 316, 317
risk analysis of, 76
sequence diagram for, 51, 52, 57-58, 60
Sprite class in, 194
state diagram for, 57, 59
summary of testing, 371-73
system test cases from, 320-22
system test plan for, 312
testing process for, 93-94
test pattern for, 244-45
use-case diagram for, 132, 133, 134-36
use cases in, 42-44, 137
use profiles in, 130
Buddy testing, 70, 80, 92, 151
Business risks, 74
C++
class of referents in, 38
friends supports, 61
inclusion polymorphism in, 34
risks associated with, 75
templates in, 38-39
thread synchronization in, 275
typing in, 61
Callbacks, distributed, 296-97
Capability Maturity Model for Software (SW-CMM), 362
Cell phone, state machine for, 337
Change cases, 151-54
Checklists
completing, 128
design phase, 129
model-testing, 155-56
Class(es), 22-24
abstract, 35-36, 263-66
base, 32
collaborator, 213, 218-20, 223
collection, 218, 219, 222, 237
container, 218
derived, 32, 34
implementation, 22, 30-31, 90
nonprimitive, 215-16, 218
objectives for, 99
as objects, 22
primitive, 215
risk in, 166
sets representing, 35
specification, 22, 24-29
in analysis models, 49-51
subsystems and, 31
Tester. See Tester class
Timer, 279-80
Class diagrams, 41
in analysis models, 45-48, 49
in design models, 56-57
layering for, 122
linking, 123, 124
Class family, 227
Class hierarchies, 249-67
abstract classes, 263-66
organizing testing software, 262-63
subclass test requirements, 250-62
hierarchical incremental testing (HIT), 253-57
implementation-based test cases, 261-62
refinement possibilities, 251-52
specification-based test cases, 257-61
top-down, 250-51
Class invariants, 26-27, 289, 290
Class libraries, 166
container classes in, 218
Class tester, 86-87
Class testing, 78, 163-211. See also Interactions; Parallel architecture for class testing (PACT)
adequacy of, 168, 179-82
code-based coverage, 181-82
constraint-based coverage, 181
state-based coverage, 179-81
buddy approach to, 92
continuum for assignments of roles in, 80
defined, 164
dimensions of, 166-68
of distributed objects, 272, 293-94
execution-based, 263-65
guided inspections, 266
integration testing and, 164
of objects with threads, 277-78
parallel architecture for (PACT), 249, 262-63
partial, 215
regression, 167
scheduling, 91
test cases for, 168-79
describing, 177
identification of, 168
from pre- and postconditions, 169-74
from state transition diagrams, 174-79
test driver for, 168, 183-210
design structures, 183-86
example code for, 196-210
requirements of, 186-88
Tester class, 187-210
unit testing and, 164, 165
ways of, 164-66
Class under test (CUT), 187
assertion checking in, 194
Client/server model, 281
tests for, 296-97
Code-based coverage, 181-82
Code coverage, 106
Code inspections, 165
Code reviews, 186
Code testing, 6
Collaboration. See Interactions
Collaboration diagrams, 51n, 54
Collaborator classes, 213, 218-20, 223
Collaborators, addressing, 217
Collection classes, 218, 219, 222, 237
Commercial-off-the-shelf (COTS) software, 323
Common Object Request Broker Architecture. See CORBA
Communication, component integration failure triggered by, 348
Compatibility, lateral, 125
Compilation, conditional, 265
Completeness, 119
of diagrams, 118
of frameworks, 360
of requirements model, 133
Components, 344-58
of architecture, 142
defined, 344
distributed, 345
Enterprise JavaBeans (EJB) model, 345-49
GameBoard, 351-58
objects vs., 346-48
packaging technology for, 350
protocol between, 350-51
test cases based on interfaces, 349-51
testing off-the-shelf, 237-41
test processes, 348-49
Component test plan, 98-100, 101, 197
Computational models, 271-72
sequential vs. nonsequential, 272-74
Concrete subclasses, 36
Concurrency, 51, 270. See also Distributed objects
among components, 348
as inspection trigger, 125
Concurrent model of computation, 271
Concurrent state diagram, 49
Concurrent state machines, 334
Conditional compilation, 265
Configurations, hardware, 315
Connectors of architectural components, 142
Consistency, 119-20
checks for, 143
between diagrams, 155
of frameworks, 361
of requirements model, 133
Constraint-based coverage, 181
Construction, in STEP technique, 72
Constructors, 25, 30, 266
baseline testing of, 193, 194
default, 280
Container classes, 218
Context
defining, 329
for performance testing, 333
Contract approach, 27, 29, 31, 171
object interaction testing and, 221, 224-25
CORBA, 282-83, 284, 300, 301, 345
CORBA TestBox, 350
Core assets, 362
Correctness
of frameworks, 361
of models, 119
Correctness faults of requirements model, 133
Coverage, 85-86, 106
"All events" level of, 335
code, 106
code-based, 181-82
concept of, 84
constraint-based, 181
estimating levels of, 91-92
model-element, 106
in models, 117
of path testing, 275-76
postcondition, 106, 181
of preconditions, 181
state-based, 179-81
of system testing, 338-41
CTester, 185
Data
collecting for testing, 368
problems with real, 332
static, 168
Database, relational, 325
Data members, 30
Dates, 289
DCOM, 283-84, 285, 300, 345
Debugging, testing vs., 3
Defect categories, 369. See also Orthogonal Defect Classification (ODC)
Defect impacts, 339-40
Defensive design approach, 27-29, 31, 171
object interaction testing and, 221, 222, 224-25
Definitional aspect of object-oriented programming, 19
Deployment testing, 300, 327-28
Derived class. See Subclass
Descendants, 32
Design
for testability, 53, 279-80
tests for, 5-6
Design conformance, as inspection trigger, 125
Design models, 56-59, 141-51
architectural, 142-48
evaluating performance and scalability, 147-48
inspection criteria, 143
roles in inspection, 142
test case for, 142, 145, 146-47
test execution, 146
verification of results, 146
class diagrams in, 56-57
coverage in, 117
detailed class, 148-51
sequence diagrams in, 57-59
state diagrams in, 57
testing, 78
Design patterns, 242-43
vetoable-change, 346, 351, 354-56
Design phase, 66
checklist, 129
Desk check, 121
Destructors, 25, 30, 266
Detailed class design models, 148-51
Detailed design, 115
testing activities, 90
Developer
class testing by, 166-67
in guided inspection, 120-21
role of, 80
Developer hours/number of defects metric, 107
Development-level component test process, 348
Development process, 66-69
changes in, 5
feedback loop to, 70-71
incremental, iterative, 66-69
main activities in, 66
place of guided inspection in, 115
testing's fit into, 2
Development schedule, testing schedule and, 86
Diagrams. See also Activity diagrams; Class diagrams; Sequence diagrams; State diagrams
collaboration, 51n, 54
consistency between, 155
layering, 122-23, 125
message-sequence, 146
package, 41
use case, 41, 42-45
Distributed callbacks, 296-97
Distributed Common Object Model (DCOM), 283-84, 285, 300, 345
Distributed model of computation, 272
Distributed objects, 296-307
additional infrastructure required by, 273
class testing of, 272, 293-94
"component" aspect of, 345
dynamic nature of, 274
failures in, 269-70
partial, 273-74, 301, 303
generic distributed-component model, 284-87
architecture of, 285-86
local and remote interfaces, 287
interaction testing of, 295
Internet, 303-6
failures in, 303-4
life-cycle testing of applications for, 305-6
Web servers, 304-5
life cycle testing of, 280-81
models of distribution, 281-84
basic client/server model, 281
CORBA, 282-83, 284, 300, 301
DCOM, 283-84, 285, 300
RMI, 284, 300
multitiered system, 336-37
path testing in, 275-80
performance of, 148
sequential systems vs., 272-74
specifying, 287-88
system testing of, 338
temporal logic in, 288-93
tests, 295-302
for client/server model, 296-97
of infrastructure, 300-301
language dependence issues, 300
logic-specific cases, 301-2
platform independence issues, 300
threads in, 270-71, 274-75, 278-80
DLLs, 328, 350
Documentation, source code, 59
Document templates. See Test plans
Document/View architecture, 142
Domain analysis, 40-41, 66, 115
coverage in, 117
guided inspection of, 138-39
testing activities, 88
Domain experts, 120
division into teams, 119
in requirements inspection, 134-35, 137
Domain-level use cases, 43
Domain life cycles, 331-32
Domain models, mapping onto application-analysis model, 140
Domain type, estimation based on, 92
Drawer, in interactive inspection session, 131
Drivers, interactions among, 315
Dynamic binding, 34n
Dynamic link library (DLL), 328, 350
E-commerce systems, 303
Editor inheritance, 265
Effectiveness of testing process, 106-7
Efficiency of testing process, 106, 107
Embedded systems, 335-36
End-to-end scenarios, 133
End-to-end system-level use cases, 101
Enterprise JavaBeans (EJB) model, 345-49
Environmental interactions, 328-30
Equipment required, estimating, 92
Estimation, 91-93
Event, 49
Eventually temporal operator, 291-92
Exceptions, 20, 28
provider-related, 288
testing, 245-48
Execution and evaluation, in STEP technique, 73
Execution-based testing, 164, 165, 263-65
Execution time, 289
Expected results, 72
Experts, domain, 120
division into teams, 119
in requirements inspection, 134-35, 137
Extends relation (in use cases), 43-44
Facade, 324
Factors, 228-31
Factory Methods, object under test (OUT), 189-93, 195, 222n
Fagan inspections, 110
Frameworks, 344, 359-61
Friend functions, 61, 223
Functionality use cases, 101
Functional sub-use cases, 101
Functional test cases, 99, 187
Functional testing, 84, 187
GameBoard component, 351-58
Generalization of inheritance relationship, 250
Grand tour test cases, 144, 320
Guard condition, 49
Guided inspection, 109-62, 300, 370
abstract class testing using, 266
of analysis model, 138-41
of design models, 141-51
architectural, 142-48
detailed, 148-51
of diagram completeness and consistency, 118
evaluation criteria, 118-20
of frameworks, 360
metalevel questions about system using, 151-54
organization of, 120-21
overview of, 110-14
place in development process, 115
preparing for, 121-31
checklist completion, 128
creating test cases, 127-28
interactive inspection session, 128-31
realistic models, 121-23
selecting test cases for, 123-27
specifying inspection, 121
process checklist for, 155-56
process definition for, 157-62
of product line architecture, 364
of requirements model, 131-38
evaluation criteria, 132-34
roles for, 134-37
testing, 137-38
use case for, 132-36
requirements of, 99
scope and depth of, 111, 116, 121
to transfer knowledge about model under test, 151
GUI testing, 304
Hardware configurations, 315
Hierarchies, class. See Class hierarchies
Hierarchical incremental testing (HIT), 253-57, 370
High-level use cases, 101
Hot spots, 359
IEEE standards, 94, 96, 311, 368
Implementation-based testing (structural testing), 84, 167, 182, 187, 261-62
Implementation for software entity, 83
Implementation phase of development process, 66
Implicit specifications, 288
Implicit tests, 288
In, out attribute, 287
Include directive, 22
Inclusion polymorphism, 32, 34-38, 83
Incremental, iterative development process, 66-69
Independent testers, 80, 92
Infeasible test case, 320
Inheritance, 31-32, 33, 37, 249, 250. See also Class hierarchies
editor, 265
inclusion polymorphism and, 34
multiple, 31n, 264
private, 34n
protected, 34n
public, 34n
Inheritance hierarchy, 32
Inherited test cases, 253
Input for test cases, 72, 177
Insertion operator, 223
Inspection, in software testing, 16
Inspection team, assigning, 111
Inspector (accessor) operations, 24-25, 53
baseline testing of, 193, 194
Instances, 22, 27
Instance variables, 30
Instantiation, 22
Integration-level component test process, 348-49
Integration tester, 87
Integration testing, 2, 5, 70, 83
class testing and, 164
scheduling, 91
Integration test plan, 101-2
Interactions
among components, 348
environmental, 328-30
in multiple representations, 324
object. See Object interactions
parameter, 216
standard patterns of, 347
test cases for memory/disk, 328-29
Interaction test cases, 100, 187
Interaction testing, 78, 213-48
of distributed objects, 295
of exceptions, 245-47
focus of, 214
of off-the-shelf components, 237-41
of protocols, 241
sampling test cases, 225-37
dimensions for, 227-28
orthogonal array testing system (OATS), 228-37
at system level, 247-48
test patterns, 242-45
Interaction test suite, 187
Interactive inspection session, 128-31
Interface(s)
approaches to defining, 27-29
in Java, 61
to legacy software, 323
object-oriented, 21-22
PACT approach to, 349
test cases based on, 349-51
between two data representations, 325
Interface definition language (IDL), 287
Internet, 303-6
failures in, 303-4
life-cycle testing of applications for, 305-6
Web servers, 304-5
Internet Inter-Orb Protocol (IIOP), 284
Invariant conditions, 27, 49
Invariants, 26-27
checking, 191
class, 26-27, 289, 290
for distributed components, 288
object, 290
Iterative enhancement, 7
Java
class of referents in, 38
inclusion polymorphism in, 34
interfaces in, 61
permissions file in, 330
risks associated with, 75
thread synchronization in, 270, 274, 275
typing in, 61
Java Archives (JAR), 350
JUnit, 370
Languages, programming
effect on testing, 61
risks associated with, 75
Lateral compatibility as inspection trigger, 125
Legacy projects, 323-24
Levels, 229
Life cycle(s)
for an object, 18
types of, 331-32
Life-cycle scenario, 238, 240-41
Life-cycle testing
of distributed system, 280-81
of Internet applications, 305-6
system, 331-32
Lifetime of software, expected, 84-85
Listener design pattern, 242-43, 346
Listener test pattern, 242-45
Load test, 326-27
Logic, temporal, 288-93, 370
Maintenance, 66
Management-level testing, 362-63
Manager, test, 87
Member function call, 20n. See also Messages
Member functions, 30
Memory/disk interaction, test cases for, 328-29
Memory handling, 328
Messages, 20-21
asynchronous, 51, 287, 302
directional, 214
Message-sequence diagram, 146
Metaclass, 19
Methods, 30
invocation, 20n. See also Messages
object interaction through, 217
operations distinguished from, 17-18
signature for, 287
static, 168
test case, 189
Microsoft Foundation Classes (MFC), 57, 359
Model-element coverage, 106
Modeling, 109. See also Guided inspection
diagrams and, 47-48
Models. See also Analysis models; Design models; Requirements model, guided inspection of
application-analysis, 140-41
completeness of, 119
computational, 271-74
consistency of, 119-20
correctness of, 119
coverage in, 117
organizational, 92
realistic, 121-23
testing for additional qualities, 151-54
thread, 278-80
Model testing, 6
checklist for, 155-56
Model/View architecture, 144
Model/View/Controller (MVC) architecture, 142
Moderator in interactive inspection session, 130
Modifier methods, baseline testing of, 193, 194
Modifier (mutator) operations, 24-25
Modularity of executables, security issues with, 330
Movable sprite, 257
Multiple inheritance, 31n, 264
Multitiered systems, 336-37
Mutator (modifier) operations, 24-25
National Oceanic and Atmospheric Administration (NOAA), 271
Navigability, 57
Networked model of computation, 271-72
Networked system, time-outs in, 274
Nonprimitive classes, 215-16, 218
Null "out" parameter, 288
Null provider reference, 288
Number of defects/developer-hour metric, 106
N-way switch cover, 336
Object Constraint Language (OCL), 49-51, 53, 221
Object interactions, 214-25
collaborator classes, 218-20, 223
collection classes, 218, 219, 222, 237
defined, 214
identifying, 214-18
testing, 222-25
"chunk" size for, 220-21
defensive vs. contract approach and, 221, 223, 224-25
Object invariants, 290
Object locator, 285
Object Management Group (OMG), 282
Object-oriented concepts, 17-39
class, 22-24
implementation, 22, 30-31
specification, 22, 24-29
inheritance, 31-32, 33
interface, 21-22
message, 20-21
object, 18-20, 22
polymorphism, 32-39
inclusion, 32, 34-38, 83
parametric, 38-39
Object-oriented software, testing of, 5-6
Object request broker (ORB), 282
Objects, 18-20, 22. See also Distributed objects
classes as, 22
components vs., 346-48
definitional vs. operational semantics of, 19
with threads, 277-78
ObjectTime, 143
Object under test (OUT), 177, 189-93
Object under test (OUT) Factory Methods, 189-93, 195, 222n
ODC. See Orthogonal Defect Classification (ODC)
Off-the-shelf components, 237-41, 323
One-off systems, 359
One-way designation, 287
Operation, 24-29
Operational aspect of object-oriented programming, 19
Operational profile, 313
Operations
methods distinguished from, 17-18
private, 30
Operations semantics, 26-27
Operators, temporal, 289, 291-93
Oracle, testing, 119
Organization, testing, 367-68
Organizational management level, testing at, 362-63
Organization model, estimation based on, 92
Orthogonal array testing system (OATS), 86, 228-37, 274, 320, 322, 347, 370
adequacy criteria for, 234-35
example of, 235-37
factors, 228-31
levels, 229
standard arrays, 231, 234
test cases in, 234
Orthogonal Defect Classification (ODC), 124, 125-26, 314-15, 369
defect impacts, 339-40
Output for test cases, 177
Package diagram, 41
Parallel architecture for class testing (PACT), 71, 249, 262-63, 291, 370
frameworks, 360, 361
interfaces, 349
Parallel model of computation, 271
Parameters
interaction among, 216
of messages, actual, 20
Parametric polymorphism, 38-39
Partial class testing, 215
Path testing in distributed system, 275-80
Patterns in bean design, 346
Performance
architectural design and, 147
of distributed systems, 148
Performance-based claims, 326-27
Performance testing of systems, 333-34
Permissions file, 330
Pinball game, state diagram for, 154
Planning for testing, 65-107. See also Test plans
development process and, 66-69, 70-71
incremental, iterative, 66-69
main activities in, 66
document templates for, 94-103
component test plan, 98-100, 101
integration test plan, 101-2
project test plan, 97-98
system test plans, 104, 105
use-case test plans, 101, 102-4
effort expended in, 105
estimation, 91-93
iterations in, 105
risk analysis, 74-78
testing activities, 87-90
scheduling, 91
testing process, 68-73, 78-87
adequacy of testing, 84-86, 94
dimensions of, 78-79
feedback loop to development process, 70-71
performance of testing, 83, 84
pieces tested, 81, 94
roles in, 80, 86-87
STEP technique, 72-73
testers, 80, 93
timing of testing, 82, 84
test metrics, 106-7
Plug and play, dynamic, 347
"Plug-in" modules, 303
Pointers, 217
Polymorphic substitution principle, 347
Polymorphism, 32-39, 217
inclusion, 32, 34-38, 83
parametric, 38-39
Population, 225
Postconditions, 49
checking, 191, 195
coverage of, 106, 181
for distributed components, 288
for operation, 26
test cases for class testing from, 169-74
Preconditions, 27, 31, 49
coverage of, 181
for distributed components, 288
for operation, 26
test cases for class testing from, 169-74
Primitive classes, 215
Private inheritance, 34n
Private operations, 30
Probability distribution, 225
Process, 66
Product line architecture, 362
Product lines, 343-44, 362-64
Product validation, 364
Programming languages
effect on testing, 61
risks associated with, 75
Project criteria, 98
Project procedures, 98
Project risks, 74
Project test plan, 97-98
Protected inheritance, 34n
Protocols
between components, 350-51
interaction testing of, 241
Internet Inter-Orb (IIOP), 284
Providers, 285, 286
component test plan for, 298
exceptions from, 288
service specifications, 287
Public inheritance, 34n
Public operations, 216-17
Qualitative system attributes, 326-27
Quality assurance, testing vs., 4
Rapide, 143, 144
Rational Rose, 143
Reactive systems, 334-35
Real calendar time, 289
Receiver, 214
object, 20
Recorder, in interactive inspection session, 131
References, 217
Regression testing, 5, 66, 78
of classes, 167
Relational database, 325
Reliability, computation of, 313
Remote Method Invocation (RMI), 284, 300, 346
Reports, 368-69
Requester, 285-86
component test plan for, 299
Requester surrogate, 285
Requirements model, guided inspection of, 131-38
evaluation criteria, 132-34
roles for, 134-37
testing, 137-38
use case for, 132-36
Requirements specification, 71
Requirements-to-use-case mapping matrix, 105
Results, expected, 72
Resumption requirements for system testing, 311
Return code, 28
Return value, 20
Reuse
framework approach to, 359
in software product line, 362
of test cases, 336
Reviews
class testing by, 164
code, 186
in software testing, 16
Rheostat effect, 86
Risk analysis, 74-78
coverage and, 86
Risks, 74
in class, 166
sources of, 75
as test-case selector, 126
in testing roles, 371
RMI. See Remote Method Invocation (RMI)
Root, 32
SAAM approach, 147
Sample
defined, 225
stratified, 226
Sampling test cases, 182, 225-37
dimensions for, 227-28
orthogonal array testing system (OATS), 228-37
adequacy criteria for, 234-35
example of, 235-37
factors, 228-29, 230-31
levels, 229
standard arrays, 231, 234
test cases in, 234
Scalability, evaluating, 147
Scalability test case, 148
Scenarios, 44-45, 143
to construct test cases, 317-19
software-development, 3
"sunny-day," 127
Scheduling of testing, 86-87, 91
Security, system, 330
Self-test functionality, 328
"Self-test" mode, 336
Semantics of operations, 26-27
OCL for, 49-51
Sender, 214
object, 20
Sequence, component integration failure triggered by, 348
Sequence diagrams, 41, 148
in analysis models, 51-54, 55
in design models, 57-59
layering for, 123, 125
Sequential processing of program, 271, 272-74
Servers, Web, 304-5
Service providers, 285, 286
specification for, 287
Service requester, 285-86
Signature for method, 287
Singleton design pattern, 290
Skeletons, in generic distributed-component model, 286
Smalltalk, 61, 75
Software
commercial-off-the-shelf (COTS), 323
defined, 4
object-oriented, 5-6
Software architecture, defined, 142
Software Architecture Testing (SAT) technique, 144-48
Software-development scenario, 3
Software Engineering Institute (SEI), 362
Software engineering level, testing at, 363
Software infrastructure, 370
Software product line, 362
Software system, components of, 81
Source code, 59-62
Source code documentation, 59
Specialization of inheritance relationship, 250
Specification-based test cases, 257-61
Specification-based testing (functional testing), 84, 187
Specifications
class, 22, 24-29
in analysis models, 49-51
component, 347
implicit, 288
of requirements, 71
for service providers, 287
for software entity, 83
Specificity of frameworks, 361
Sprites, 45-47, 256-57
Standard orthogonal array, 231, 234
Standard review techniques, principal shortcoming of, 109-10
Standards, 368
Standard test environment, 273
State-based coverage, 179-81
State-based test cases, 99
State diagrams, 41
in analysis models, 49, 50, 53
in design models, 57
for pinball game, 154
States, 27
boundary values for, 179
sampling test cases and, 228
State transition diagrams
boundary conditions identified from, 180
test-case construction from, 174-79
Static data members and methods, 168
Static keyword, 22
Static operations, 22
Stationary sprite, 257
STEP testing technique, 72-73
Stratified sample, 226
Stress testing
of systems, 331, 332
of Web sites, 306
Structural test cases, 99, 187, 261-62
Structural testing (implementation-based or white-box testing), 84, 167, 182, 187, 261-62
Stubs, 264, 286
Subclass, 32, 34, 36
Subclassing, 37-38
Subclass test requirements, 250-62
hierarchical incremental testing (HIT), 253-57
implementation-based test cases, 261-62
refinement possibilities, 251-52
specification-based test cases, 257-61
top-down, 250-51
Substates, 49
Substitution principle, 33, 250, 252
polymorphic, 347
Subsystems, classes and, 31
Subtyping, 37-38
"Sunny-day" clauses, 106n
"Sunny-day" scenario, 127
Superclass (base class), 32
Superstate, 49
Suspension criteria, for system testing, 311
Swap space issues, 328-29
Swim lanes, in activity diagrams, 55
Synchronization of threads, 270, 274-75
SYN-path analysis, 276-78, 370
System configuration, 315
System tester, 87
System testing, 2, 5, 78, 80, 309-42
acceptance tests, 310
coverage of, 338-41
defined, 309
of deployment, 327-28
after deployment, 328
distributed systems, 338
embedded systems, 335-36
of environmental interactions, 328-30
focus of, 310
against functional requirements, 326
of incremental projects, 323-24
life-cycle testing, 331-32
of multiple representations, 324-25
multitiered systems, 336-37
performance testing, 333-34
for qualitative system attributes, 326-27
reactive systems, 334-35
scheduling, 91
of security, 330
stress testing, 331, 332
test cases
selection strategies, 313-15
sources of, 315-22
test plan for, 311-12, 315
System test plans, 104, 105, 311-12, 315
System vs. application, 310
Technical management level, testing at, 363
Technical risks, 74
Templates, 368. See also Test plans
in C++, 38-39
use case, 369
Temporal logic, 370
in distributed objects, 288-93
Temporal operators, 289, 291-93
Testability, 71
design for, 279-80
TestBox, 346, 349-50
Test cases, 72, 189, 368-69
adequacy of, 84
for application-analysis model, 140-41
for architectural design models, 142, 145, 146-47
availability prior to inspection session, 118
based on interfaces, 349-51
for class testing, 168-79
describing, 177
identification of, 168
from pre- and postconditions, 169-74
from state transition diagrams, 174-79
coverage of, 117
creating, 127-28
for detailed class design, 149-50
for distributed objects, 273
for domain-analysis model, 139-40
functional, 99, 187
grand tour, 144, 320
implementation-based (structural), 99, 187, 261-62
infeasible, 320
inherited, 253
interaction, 100, 187
for memory/disk interaction, 328-29
naming, 190
in orthogonal array testing system, 234
for regressive check, 118
reuse of, 336
sampling, 182, 225-37
dimensions for, 227-28
orthogonal array testing system (OATS), 228-37
selecting, 81, 123-27
Orthogonal Defect Classification (ODC), 124, 125-26
risk for, 126
use profiles for, 124, 126, 127
specification-based, 257-61
state-based, 99
to support a framework, 361
Test classes
constraints on design of, 243
TestBox versus, 350
Test driver, 165-66
for class testing, 168, 183-210
design structures, 183-86
example code for, 196-210
requirements of, 186-88
Tester class, 185, 187-200, 215
baseline testing, 193-95
OUT factory methods, 189-93, 195
logging mechanisms in, 368
reporting results, 196
running test suites, 195-96
test case methods, 189
Testers, 17, 73, 80, 93
class, 86-87
in guided inspection, 120
independent, 80, 92
integration, 87
system, 87
Test execution, 16, 146
Testing, 3-8
activities in, 16
of object-oriented software, 5-6
overview of, 6-8
perspective on, 8
Testing effort estimate, 93
Testing interactions. See Interaction testing
Testing perspective, 8, 15-63
classes from, 30
defined, 15
development products, 39-62
analysis models, 40-55
design models, 56-59
source code, 59-62
inclusion polymorphism viewed from, 38
inheritance from, 32
interfaces from, 22, 29
messages from, 21
objects from, 19-20
Testing phase of development process, 66
Test manager, 87
Test patterns, 242-45, 371
temporal, 291-93
Test plans, 368-69
component, 98-100, 101, 197
for provider, 298
for requester, 299
features of, 44-46
hierarchical incremental testing from context of, 256-57
IEEE format, 94, 96, 311, 368
integration, 101-2
project, 97-98
relationships among, 95
system, 104, 105, 311-12, 315
timing of development of, 167
use-case, 101, 102-4
Test script methods, 189, 195
Test suites, 72-73, 187
baseline, 193-95
building and retaining, 99
sampling test cases to construct, 226
Thin client, 337
Thread models, 278-80
Threads, 270-71, 274-75, 278-80
Time-outs in networked system, 274
Timer class, 279-80
Timing, component integration failure triggered by, 348
Traceability matrices, 105
Transitions, 27, 49
testing of, 181
Typing system, 32
Unified Modeling Language (UML), 3, 40, 41, 142-43
Uniform probability distribution, 225
Unit testing, 2, 5, 70
class testing and, 164, 165
Until temporal operator, 292
Use case diagrams, 41, 42-45
Use cases, 101
application-level, 44
change cases, 151-54
creating test cases from, 127
domain-level, 43
estimating coverage levels based on, 91-92
example of, 132, 133, 134-36
levels of, 101
risk ratings for, 75-77
system-level test development and, 73
system test cases from, 315-22
in Brickles, 320-22
expected results section of, 319-20
scenarios to construct, 317-19
templates, 369
types of, 101
Use case test plans, 101, 102-4
Use profiles, 130, 148
constructing, 316-17
sampling based on, 225
system test case selection using, 313-14
as test-case selector, 124, 126, 127
Uses relation (in use cases), 43-44
Validation testing, 78, 311
Variables, instance, 30
"Variance from requirements" defects, 310
Variation points, 362, 364
Velocity, 169, 257
Vetoable change design pattern, 346, 351, 354-56
"V" testing model, new, 110
Web pages, testing, 303
Web servers, 304-5
Web sites, stress tests of, 306
White-box testing (structural testing; implementation-based testing), 84, 167, 182, 187, 261-62
XML-based system, 325
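
Several entries above (Tester class; OUT factory methods; parallel architecture for class testing; baseline testing) refer to the book's test-driver pattern for class testing. As a hedged illustration only, here is a minimal Java sketch of that pattern; the class and method names (Velocity, VelocityTester, createOUT) are invented for this example and may differ from the book's actual Brickles code.

```java
// Hypothetical sketch of a PACT-style Tester class: one tester class per
// class under test, with an OUT factory method and a baseline test suite.

class Velocity {
    private int speed;

    Velocity(int speed) { setSpeed(speed); }

    int getSpeed() { return speed; }

    // Contract-approach precondition: s must be non-negative.
    void setSpeed(int s) {
        if (s < 0) throw new IllegalArgumentException("speed must be >= 0");
        speed = s;
    }
}

class VelocityTester {
    // OUT factory method: every test obtains the object under test through
    // this hook, so a subclass tester (hierarchical incremental testing)
    // can override it and rerun inherited test cases on subclass instances.
    protected Velocity createOUT() { return new Velocity(0); }

    // One test case method per pre/postcondition pair exercised.
    boolean testSetSpeed() {
        Velocity out = createOUT();
        out.setSpeed(5);
        return out.getSpeed() == 5;   // postcondition check
    }

    // Baseline suite: the test cases a subclass tester inherits and reruns.
    boolean runBaselineSuite() {
        return testSetSpeed();
    }
}
```

The factory-method indirection is the key design choice: because tests never call `new` on the class under test directly, a tester for a subclass can reuse every inherited test case unchanged.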
