Evaluating Software Architectures: Methods and Case Studies

Book

  • Your Price: $63.99
  • List Price: $79.99
  • Usually ships in 24 hours.

Description

  • Copyright 2002
  • Dimensions: 6-1/4" x 9-1/4"
  • Pages: 368
  • Edition: 1st
  • Book
  • ISBN-10: 0-201-70482-X
  • ISBN-13: 978-0-201-70482-2

The foundation of any software system is its architecture. Using this book, you can evaluate every aspect of an architecture in advance, at remarkably low cost, identifying changes that can dramatically improve a system's performance, security, reliability, and maintainability. As the practice of software architecture has matured, it has become possible to identify causal connections between architectural design decisions and the qualities and properties that result downstream in the systems built from them. This book shows how, offering step-by-step guidance and detailed practical examples, complete with sample artifacts reflective of those that evaluators will encounter. The techniques presented here apply not only to software architectures but also to system architectures encompassing computing hardware, networking equipment, and other elements. The book is intended for all software architects, software engineers, developers, IT managers, and others responsible for creating, evaluating, or implementing software architectures.

Sample Content

Online Sample Chapter

Evaluating a Software Architecture

Downloadable Sample Chapter

Download the sample chapter for this title:
clementsch02.pdf

Table of Contents



List of Figures.


List of Tables.


Preface.


Acknowledgments.


Reader's Guide.


1. What Is Software Architecture?

Architecture as a Vehicle for Communication among Stakeholders.

Architecture and Its Effects on Stakeholders.

Architectural Views.

Architecture Description Languages.

Architecture as the Manifestation of the Earliest Design Decisions.

Architectural Styles.

Architecture as a Reusable, Transferable Abstraction of a System.

Summary.

For Further Reading.

Discussion Questions.



2. Evaluating a Software Architecture.

Why Evaluate an Architecture?

When Can an Architecture Be Evaluated?

Who's Involved?

What Result Does an Architecture Evaluation Produce?

For What Qualities Can We Evaluate an Architecture?

Why Are Quality Attributes Too Vague for Analysis?

What Are the Outputs of an Architecture Evaluation?

Outputs from the ATAM, the SAAM, and ARID.

Outputs Only from the ATAM.

What Are the Benefits and Costs of Performing an Architecture Evaluation?

For Further Reading.

Discussion Questions.



3. The ATAM—A Method for Architecture Evaluation.

Summary of the ATAM Steps.

Detailed Description of the ATAM Steps.

Step 1: Present the ATAM.

Step 2: Present the Business Drivers.

Step 3: Present the Architecture.

Step 4: Identify the Architectural Approaches.

Step 5: Generate the Quality Attribute Utility Tree.

Step 6: Analyze the Architectural Approaches.

Step 7: Brainstorm and Prioritize Scenarios.

Step 8: Analyze the Architectural Approaches.

Step 9: Present the Results.

The Phases of the ATAM.

Phase 0 Activities.

Phase 1 Activities.

Phase 2 Activities.

Phase 3 Activities.

For Further Reading.

Discussion Questions.



4. The Battlefield Control System—The First Case Study in Applying the ATAM.

Preparation.

Phase 1.

Step 1: Present the ATAM.

Step 2: Present the Business Drivers.

Step 3: Present the Architecture.

Step 4: Identify the Architectural Approaches.

Step 5: Generate the Quality Attribute Utility Tree.

Step 6: Analyze the Architectural Approaches.

Phase 2.

Step 7: Brainstorm and Prioritize Scenarios.

Step 8: Analyze the Architectural Approaches.

Step 9: Present the Results.

Results of the BCS Evaluation.

Documentation.

Requirements.

Sensitivities and Tradeoffs.

Architectural Risks.

Summary.

Discussion Questions.



5. Understanding Quality Attributes.

Quality Attribute Characterizations.

Performance.

Availability.

Modifiability.

Characterizations Inspire Questions.

Using Quality Attribute Characterizations in the ATAM.

Attribute-Based Architectural Styles.

Summary.

For Further Reading.

Discussion Questions.



6. A Case Study in Applying the ATAM.

Background.

Phase 0: Partnership and Preparation.

Phase 0, Step 1: Present the ATAM.

Phase 0, Step 2: Describe Candidate System.

Phase 0, Step 3: Make a Go/No-Go Decision.

Phase 0, Step 4: Negotiate the Statement of Work.

Phase 0, Step 5: Form the Core Evaluation Team.

Phase 0, Step 6: Hold Evaluation Team Kick-off Meeting.

Phase 0, Step 7: Prepare for Phase 1.

Phase 0, Step 8: Review the Architecture.

Phase 1: Initial Evaluation.

Phase 1, Step 1: Present the ATAM.

Phase 1, Step 2: Present Business Drivers.

Phase 1, Step 3: Present the Architecture.

Phase 1, Step 4: Identify Architectural Approaches.

Phase 1, Step 5: Generate Quality Attribute Utility Tree.

Phase 1, Step 6: Analyze the Architectural Approaches.

Hiatus between Phase 1 and Phase 2.

Phase 2: Complete Evaluation.

Phase 2, Step 0: Prepare for Phase 2.

Phase 2, Steps 1-6.

Phase 2, Step 7: Brainstorm and Prioritize Scenarios.

Phase 2, Step 8: Analyze Architectural Approaches.

Phase 2, Step 9: Present Results.

Phase 3: Follow-Up.

Phase 3, Step 1: Produce the Final Report.

Phase 3, Step 2: Hold the Postmortem Meeting.

Phase 3, Step 3: Build Portfolio and Update Artifact Repositories.

For Further Reading.

Discussion Questions.



7. Using the SAAM to Evaluate an Example Architecture.

Overview of the SAAM.

Inputs to a SAAM Evaluation.

Outputs from a SAAM Evaluation.

Steps of a SAAM Evaluation.

Step 1: Develop Scenarios.

Step 2: Describe the Architecture(s).

Step 3: Classify and Prioritize the Scenarios.

Step 4: Individually Evaluate Indirect Scenarios.

Step 5: Assess Scenario Interactions.

Step 6: Create the Overall Evaluation.

A Sample SAAM Agenda.

A SAAM Case Study.

ATAT System Overview.

Step 1: Develop Scenarios, First Iteration.

Step 2: Describe the Architecture(s), First Iteration.

Step 1: Develop Scenarios, Second Iteration.

Step 2: Describe the Architecture(s), Second Iteration.

Step 3: Classify and Prioritize the Scenarios.

Step 4: Individually Evaluate Indirect Scenarios.

Step 5: Assess Scenario Interactions.

Step 6: Create the Overall Evaluation—Results and Recommendations.

Summary.

For Further Reading.

Discussion Questions.



8. ARID—An Evaluation Method for Partial Architectures.

Active Design Reviews.

ARID: An ADR/ATAM Hybrid.

The Steps of ARID.

Phase 1: Rehearsal.

Phase 2: Review.

A Case Study in Applying ARID.

Carrying Out the Steps.

Results of the Exercise.

Summary.

For Further Reading.

Discussion Questions.



9. Comparing Software Architecture Evaluation Methods.

Questioning Techniques.

Questionnaires and Checklists.

Scenarios and Scenario-Based Methods.

Measuring Techniques.

Metrics.

Simulations, Prototypes, and Experiments.

Rate-Monotonic Analysis.

Automated Tools and Architecture Description Languages.

Hybrid Techniques.

Software Performance Engineering.

The ATAM.

Summary.

For Further Reading.

Discussion Questions.



10. Growing an Architecture Evaluation Capability in Your Organization.

Building Organizational Buy-in.

Growing a Pool of Evaluators.

Establishing a Corporate Memory.

Cost and Benefit Data.

Method Guidance.

Reusable Artifacts.

Summary.

Discussion Questions.



11. Conclusions.

You Are Now Ready!

What Methods Have You Seen?

Why Evaluate Architectures?

Why Does the ATAM Work?

A Parting Message.



Appendix A: An Example Attribute-Based Architectural Style.

Problem Description.

Stimulus/Response.

Architectural Style.

Analysis.

Reasoning.

Priority Assignment.

Priority Inversion.

Blocking Time.

For Further Reading.



References.


Index.

Preface

The foundation of any software system is its architecture, that is, the way the software is constructed from separately developed components and the ways in which those components interact and relate to each other. If the system is going to be built by more than one person—and these days, what system isn't?—it is the architecture that lets them communicate and negotiate work assignments. If the requirements include goals for performance, security, reliability, or maintainability, then architecture is the design artifact that first expresses how the system will be built to achieve those goals. The architecture determines the structure of the development project. It is the basis for organizing the documentation. It is the first document given to new project members, and the first place a maintenance organization begins its work. Schedules, budgets, and workplans all revolve around it. And the senior, most talented designers are paid to create it.

A system's longevity—how viable it remains in the face of evolutionary pressure—is determined primarily by its architecture. Some architectures go on to become generic and adopted by the development community at large: three-tier client-server, layered, and pipe-and-filter architectures are well known beyond the scope of any single system. Today, organizations are recognizing the importance and value of architectures in helping them to meet corporate enterprise goals. An architecture can give an enterprise a competitive advantage and can be banked like any other capitalized asset.

The right architecture is the first step to success. The wrong architecture will lead to calamity. This leads to an important question: If your organization is betting its future—or at least a portion of it—on an architecture for a system or family of related systems, how can you be sure that you're building from the right architecture and not the wrong one?

The practice of creating an architecture is maturing. We can identify causal connections between design decisions made in the architecture and the qualities and properties that result downstream in the system or systems that follow from it. This means that it is possible to evaluate an architecture, to analyze architectural decisions, in the context of the goals and requirements that are levied on systems that will be built from it.

And yet even though architecture is regarded as an essential part of modern system development, architecture evaluation is almost never included as a standard part of any development process. We believe it should be, and this book is an attempt to help people fill that gap.

The time has come for architecture evaluation to become an accepted engineering practice for two reasons. First, architecture represents an enormous risk in a development project. As we've said, the wrong one leads to disaster. It makes good sense to perform an evaluation on such a pivotal artifact, just as you would plan risk-mitigation strategies for other sources of uncertainty. Second, architecture evaluation can be remarkably inexpensive. The methods described in this book add no more than a week to the project schedule, and some abridged forms require no more than a day or two. Architecture evaluation represents a very cheap insurance policy. Compared to the cost of a poor architecture, the modest expense of a software architecture evaluation makes all the sense in the world. What has been lacking up to this point is a practical method for carrying it out, which is where this book comes in.

This is a guidebook for practitioners (or those who wish to become practitioners) of architecture evaluation. We supply conceptual background where necessary, but the intent of the work is to provide step-by-step guidance in the practice of architecture evaluation and analysis. To help put the methods into practice, we have included sample artifacts that come into play during an architecture evaluation: viewgraph presentation outlines, scenarios, after-action surveys, final report templates, and so forth. The goal is that after reading this book, you will feel confident enough to try out the methods on an architecture in your own organization. Throughout, we have tried to help answer the question an evaluator faces at every step: "What should I do now?"

While the book is written from the point of view of the evaluator, others involved in an evaluation—project managers, architects, other stakeholders—will also gain valuable insights by reading this book. They will come to understand how their products will be evaluated and can thus position themselves to make those products fare better with respect to the evaluation criteria. This is rather like scoring well on a test because you've seen an advance copy; in this case, though, it isn't cheating but sound management and engineering practice. But know that when we use the word you in the text, we are speaking to the evaluator.

The techniques in this book are based on actual practice in government and industry. Most of the methods were developed by us and others at the Software Engineering Institute and applied, by us and by others, to our customers' and collaborators' systems. Other material was gleaned from industrial workshops we held, whose participants were experts in the analysis and evaluation of architecture. In short, we have learned by doing, and we have learned from others' doing.

This book will not teach you how to become a good architect, nor does it help you become fluent in the issues of architecture. We assume that you already have the good grasp of architectural concepts that comes from practical experience. Neither will this book help you assess the job performance of any individual architect or a project's architecture (or development) process. What it will do is show you how to evaluate an architecture with respect to a broad spectrum of important quality attributes having to do with the architecture and the future system(s) that will be built from it.

Finally, we should say a word about software versus system architecture—that is, the architecture of software-intensive systems. This is a book about the evaluation of software architectures, but we often hear the question, "Well, what about the architecture of the system, not just the software? It's just as vital." We couldn't agree more. System architectures embody the same kinds of structuring and decomposition decisions that drive software architectures. Moreover, they include hardware/software tradeoffs as well as the selection of computing and communication equipment, all of which are completely beyond the realm of software architecture. System architectures hold the key to success or failure of a system every bit as much as the software architecture does for the software. Hence, they deserve to be evaluated every bit as much and for exactly the same reasons.

The methods presented in this book will, we believe, apply to system architectures just as well as they do to software architectures. If modifiability is a concern, the methods can be used to gauge the expense of making changes over the system's lifetime; if performance is a concern, the methods can be used to spot bottlenecks and problem areas in the system as well as in the software; and so forth.

Why, then, do we call it a book about software architecture evaluation? Because that is the realm in which the methods were invented, developed, tested, and matured. In the remainder of this book when we speak of architecture, you can always safely prefix it with software. You can prefix it with system depending on how applicable you feel the methods are to system architectures and how confident you are about our intuition in the matter.

As a final word, we invite you to share your experiences with us. We would be keenly interested in knowing what you discover works well and what doesn't work so well. Writing a book is an opportunity to share lessons, but more importantly to us, it is an opportunity to gather new ones.

—PCC, Austin, Texas
—RK, Pittsburgh, Pennsylvania
—MHK, Pittsburgh, Pennsylvania




Index

A

ABASs. See Attribute-Based Architectural Styles
Abowd, Gregory, xvii, 273
Acknowledged communication, 95
Acknowledging backups, 96, 98, 100
Acme language, 266
Active design reviews, 241, 245, 288
conventional reviews versus, 242-243
origins of, 243-244
"Active Design Reviews: Principles and Practices" (Parnas and Weiss), 243
Active Reviews for Intermediate Designs, xiv, 20, 32, 241-253, 287
as ADR/ATAM hybrid, 245
advantages with, 252
case study, 248-252
overview of, 288
Phase 1 for, 246
Phase 2 for, 246-248
SAAM and ATAM compared with, 256
sample agenda for review, 250
steps of, 245-248. See also Architecture Tradeoff Analysis Method; Software Architecture Analysis Method
Actuators, 8
ADD. See Attribute Driven Design
ADLs. See Architecture description languages
ADRs. See Active design reviews
Alice in Wonderland (Carroll), 27
Allocation policies, 111
Analysis, within Attribute-Based Architectural Style, 125
Analysis support, with ATAT, 222
Analytical point of view, 302
APIs. See Application program interfaces
Application builder, and architecture evaluation, 66
Application program interfaces, 88
Aqua satellite, 128
Architects
and architectural approach analysis, 197
architectural approaches identified by, 162
and architecture evaluation, 67
architecture presented by, 157-158, 159
and ARID, 251
checklist of questions for, 145
and evaluation process, 215-216
expressing appreciation to, 152
finding, 277
missing, 292
Phase 1 for EOSDIS Core System and attendance by, 143, 146
and quality attribute characterizations, 125. See also Stakeholders
Architectural approaches, 13, 81, 186
analysis of, for Battlefield Control System, 92-94, 102
analyzing, 45, 68, 172-173, 196
benefits in identifying, 181
example of analysis of, 59
example of description of, 123
identifying, 44, 47-50, 90, 162-163
template for analysis of, 58, 122
Architectural decisions, 111, 112, 122
availability-related, 116, 117
modifiability-related, 119, 120
performance-related, 114, 115
Architectural documentation, improving quality of, 37
Architectural risks, with Battlefield Control System, 106-107
Architectural styles, 12-13, 43, 302
attribute-based, 124-125
importance of, 49
Architectural views, 4, 47
Architecture, 3, 20, 21-23
audits, 259, 260, 276
example template for presentation of, 48
importance of, ix
presentation of, 47
presentation of, for Battlefield Control System, 89
quality attributes determined by, 109, 110
reasons for evaluating, 288-290
subdesign problems in, 241
Architecture-based design, 16
Architecture description languages, 10, 266
Architecture documentation, 133, 143, 162
discussing/reviewing, 148
as reusable artifacts, 284
reviewing, 164
walk throughs of, 145, 147
Architecture evaluations, x, 19-42
benefits and costs of, 37-41
comparison of methods for, 255-273
countering skepticism during, 28
customizing methods for, 296
danger signals to look for with, 258-259
follow-up survey: organizational impacts, 281
follow-up survey: project impacts, 280
full readiness for, 287, 291-294
goals of, 109, 289
growth of capability for, by organizations, 275-285
importance of, x
leading, 182-183
maximizing evaluators' efforts in, 97-98
outputs of, 34-36
people involved in, 26
qualities in, 30-32
and quality of stakeholders, 187
reasons for, 23-24
results produced by, 27, 29-30
stakeholders for, 65-68
techniques comparison, 271-272
timing for, 24-25
vague quality attributes and, 32-33
and what's architectural?, 21-23
Architecture presentations, distillation of, 164
Architecture review, people issues involved in, 182
Architecture Tradeoff Analysis Method, xiii, xvii, 20, 30, 32, 43-85, 241, 245, 272, 273, 287
analysis put in, 80-82
approximate cost of medium-size checklist-based evaluation with, 41
approximate cost of medium-size evaluation with, 39
approximate cost for small evaluation with, 40
conceptual flow of, 295
consensus-building qualities of, 133
description of steps within, 45-51, 55-62, 68-70
end-of-exercise participants' survey, 279
enhanced ability of, in finding risks, 294
EOSDIS Core System (NASA) application of, 127-210
experiences with, 104-105
hiatus between Phase 1/Phase 2, for EOSDIS Core System, 183-184
as hybrid technique, 268
maximizing evaluators' efforts with, 97-98
outline of standard, 151
outputs only from, 35-36
overview of, 288
Phase 0 in, 71-76
Phase 0 of, for EOSDIS Core System, 129-148
Phase 1 information exchange in, 149
Phase 1 of, for EOSDIS Core System, 148-167, 170-181
Phase 2 of, for EOSDIS Core System, 184-202
Phase 3 of, for EOSDIS Core System, 202-209
phases of, 70-78, 79, 82-83
postmortem meeting sample agenda, 206
presentation of, for Battlefield Control Systems, 89
quality attribute characterization used in, 121-124
reasons behind success of, 290, 294-296
results presented from, 68-70
SAAM, ARID compared with, 256
sample agenda for Phase 1, 147
sample outline for final report, 204
and sensitivities and risks, 69
steps and outputs of, correlated, 71
summary of steps within, 44-45
supply checklist for exercise in, 144
two faces of, 80-82. See also Active Reviews for Intermediate Designs; Software Architecture Analysis Method
Architecture Tradeoff Analysis Tool
assessment of scenario interactions, 235-236
classifying/prioritizing scenarios, 228-231
creation of overall evaluation: results/recommendations, 236-238
describing architecture(s) of, 224-225, 227-228
evaluation of indirect scenarios, 231
functional view of architecture, 225
SAAM applied to, 222-231, 235-238
system overview, 222-223
ARID. See Active Reviews for Intermediate Designs
Artifact repositories, maintaining/updating, 83, 202, 207, 285
Asynchronous calls, and metrics, 264
AT&T, 259, 276
ATAM. See Architecture Tradeoff Analysis Method
ATAT. See Architecture Tradeoff Analysis Tool
ATAT editor
code view of generic, 238
code view of MVC-based candidate architecture of, 228
code view of PAC-based candidate architecture of, 229
ATAT-Entities, 228
Attribute-Based Architectural Styles, xiii, 49-50, 124-125, 126, 181
analysis, 299-302
architectural style, 298
example of, 297-302
parts of, 297
problem description, 297-298
as reusable artifacts, 284
stimulus/response, 298
Attribute Driven Design, 16
Audits, architecture, 259, 260, 276
Authentication, 11
Automated tools, 266
Availability, 19, 31, 93, 94, 164
and backup commander approach, 94, 95, 96
and BCS system, 102
characterization, 116
and ECS architecture, 166
and ECS utility tree, 169
as quality attribute, 115-117
sample questions about, 118
and scenarios, 52
Avritzer, A., 259, 260, 261

B
Back-of-the-envelope performance model, 98
Backup commander approach, 94-96, 98
Backups, and BCS architecture, 96
Bandwidth limitations, mitigating against, 103
Barbacci, Mario, 268
Bass, Len, xvii, 8, 15, 239
Battlefield Control System (BCS), 87-108, 289
evaluation results for, 103, 105-107
hardware view of, 90
Phase 1, 89-90, 92-96, 98-100
Phase 2, 100-103
preparation, 88
sample scenarios for evaluation of, 101
Benefits surveys, 83
Berra, Yogi, 109
Blocking
sources of, 301-302
time, 299, 300, 301-302
Boehm, Barry, 19, 84
Bosch, J., 15
Bottom-up strategy, for organizational buy-in, 276
Brooks, Fred, 1
Buildability, 27, 29
Build-time, 31
Buschmann, F., 15, 126, 239, 302
Buses, 111
Business case presentation, example template f
