
Testing Extreme Programming



  • Your Price: $31.99
  • List Price: $39.99
  • Usually ships in 24 hours.


  • Copyright 2003
  • Dimensions: 7-3/8x9-1/4
  • Pages: 336
  • Edition: 1st
  • Book
  • ISBN-10: 0-321-11355-1
  • ISBN-13: 978-0-321-11355-9

The rapid rise in popularity of Extreme Programming (XP) has put the practice of software testing squarely in the spotlight of application development. At one time, testing was a neglected practice, a highly specialized activity that came as an afterthought as complex, code-intensive projects were rushed to completion. But in today's world of escalating quality expectations, testing is a key component of the development process.

XP accelerates testing by demanding its complete integration with development. This in turn has pushed software professionals to rethink their traditional attitudes toward testing. XP asks the entire development team to embrace testing. In fact, testing is so critical to the XP methodology that programmers are required to write automated tests before they begin coding. Until now, however, there has been a distinct lack of instruction specific to testing and how it relates to XP.
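The test-first rule can be sketched in a few lines of code. In the sketch below, the test method is written first, fails until the production class exists, and only then is the minimal code added to make it pass. This is a hypothetical illustration, not an example from the book: the `Story` and `StoryTest` names, and the idea of tracking a story's estimate, are invented for this sketch.

```java
// Hypothetical sketch of XP's test-first practice. The test method was
// conceptually written before the Story class; the class below is the
// minimal production code needed to make the test pass.

class Story {
    private final String title;
    private final int estimate; // estimate in ideal days

    Story(String title, int estimate) {
        this.title = title;
        this.estimate = estimate;
    }

    String getTitle() { return title; }

    int getEstimate() { return estimate; }
}

public class StoryTest {
    // A JUnit-style check reduced to a plain assertion so the sketch
    // runs without any test framework.
    static void testNewStoryKeepsTitleAndEstimate() {
        Story story = new Story("Track iteration velocity", 3);
        if (!story.getTitle().equals("Track iteration velocity")
                || story.getEstimate() != 3) {
            throw new AssertionError("Story did not keep its title and estimate");
        }
    }

    public static void main(String[] args) {
        testNewStoryKeepsTitleAndEstimate();
        System.out.println("Tests passed");
    }
}
```

The point of the cycle is the order: watch the test fail, write just enough code to make it pass, then refactor with the test as a safety net.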

Testing Extreme Programming is a practical tutorial that gives software builders a lucid understanding of this important aspect of development. This book demonstrates how testing is central to the XP project, clearly spelling out what testing should be done and when and how it should be performed. The authors teach by example, and readers will be able to improve their knowledge of the testing process by completing the book's exercises.

In addition, this book:

  • Provides a general overview of the XP methodology
  • Defines the roles of XP team members
  • Shows how to write effective tests before coding begins
  • Helps you avoid the traps and pitfalls that can derail software projects
  • Sheds light on the important practice of refactoring and how it relates to testing
  • Compares and contrasts manual and automated tests

Many software engineers have dismissed XP as a throw-out-the-rulebook, anything-goes technique. It isn't. As this book shows, XP is a deliberate and disciplined approach to software development. Many software engineers have reaped the benefits of this agile methodology because its emphasis on testing eliminates much of the risk inherent in software projects. XP helps developers produce software on time, under budget, and at a higher quality level. But you can't XP if you don't test. With this book as a guide, you will learn to embrace testing. A sound testing program is the engine that drives an XP project.


Sample Content

Online Sample Chapter

Why XP Teams Need Testers

Downloadable Sample Chapter

Sample Chapter 2

Table of Contents



1. An Overview.
Overview of XP.
How XP Solves Testing and Quality Assurance Problems.
System and Acceptance Testing Resources Wasted on Unit- and Integration-Level Bugs.
Missing and Out-of-Date Requirements.
Huge Gaps between the System and User Expectations.
Wolves in Sheep's Clothing.

2. Why XP Teams Need Testers.
Definition of Tester.
The Tester's Contribution, Illustrated.
Shun the Dark Side.

3. How XP Teams Benefit from Having Testers.
Checks and Balances.
Acceptance Tests versus Unit Tests.
Navigating for XP Projects.

4. XP Testing Values.

5. Overview of the XP Tester Role.
XP Tester's Bill of Rights.
XP Tester Activities.

6. Quality and XP.
Defining Quality.
Setting Quality Criteria.
Who Is Responsible for Quality?


7. User Stories and Release Planning.
The Tester's Role in Up-Front Activities.
Goals of Up-Front Tester Activities.
Exercise 1.

8. Identifying Hidden Assumptions.
A Process for Finding Hidden Assumptions.
Example 1.
Exercise 2.
Introducing the XTrack Application.

9. Defining High-Level Acceptance Tests.
Basic Acceptance Test Definitions.
Example 2.
Example 3.
Exercise 3.

10. High-Level Acceptance Test Estimates.
Ways to Estimate Acceptance-Test Effort.
Quick-and-Dirty Approach.
Example 4.
A More Detailed Estimating Method.
Example 5.
Exercise 4.

11. Enabling Accurate Estimates during Release Planning.
Why We Care about Estimates.
How You Can Improve Estimate Accuracy.
Exercise 5.

12. Planning the First Iteration.
Overview of Iteration Planning.
The Tester's Role in Iteration Planning.
Thinking of All the Tasks.
Enhancing Communication.
Exercise 6.

13. Defining and Estimating Testing and Test Infrastructure Tasks.
Identifying and Estimating Test Infrastructure Tasks.
Identifying and Estimating Functional and Acceptance Testing Tasks.
A Note on Separate Test Teams.
Example 6.
Test Infrastructure Tasks.
Acceptance Testing Tasks.
Exercise 7.

14. Acceptance Tests and Quality.
Acceptance Test Details.
Internal and External Quality.
Exercise 8.

15. Nailing Down the Details.
Picking the Customer's Brain (and the Programmers'!).
The Good, the Bad, and the Ugly.
Example 7.
Optional Tests.
Getting Creative.
Lights-Out Test Design.
Exercise 9.

16. Writing Acceptance Tests.
Executable Tests.
If You Have Trouble Getting Started.
Exercise 10.

17. Organizing Acceptance Tests.
Version Control of Acceptance Tests.
Executable Test Files.
Organizing Acceptance Tests in Spreadsheets.
Exercise 11.

18. Test Design and Refactoring.
Establishing the Initial System State.
Tests That Leave the System State Unchanged.
Coupling between Tests.
Exercise 12.

19. Manual Tests.
Exercise 13.

20. What!?!!
Manual Tests Are Unreliable.
Manual Tests Undermine the XP Testing Practice.
Manual Tests Are Divisive.
The Wings-Fall-Off Button.
What If You Have Manual Tests?
Exercise 14.

21. Test Automation.
Modular Tests.
Data-Independent Tests.
Self-Verifying Tests.
Exercise 15.

22. Making Executable Tests Run.
Linking the Executable Test to an Application Test Class.
Defining the Application Test Class.
Calling the Code to be Tested.
Running the Test.
Getting Additional Tests to Run.
Combining Multiple Tests into Test Suites.
Exercise 16.

23. Running Executable Tests through Other Interfaces.
Code Missed by Direct Calls.
Expanding Coverage of the Executable Tests.
Interfacing to a Test Tool.
Creating an Application Test-Interface Class.
Refactoring the Direct-Call Interface.
Refactoring the Application Test Class.
Creating a Tool-Specific Interface Class.
One Team's Experience with Direct-Call Test Automation.
Exercise 17.

24. Driving the System with a Test Tool.
WebART Overview.
Main WebART Script.
Login Module.
Validation Criteria.
Exercise 18.

25. Bugs on the Windshield: Running Acceptance Tests.
How Often Do You Run Acceptance Tests?
Educating the Customer.
Acceptance Criteria.
Defect Management.
Road Food for Thought.
Exercise 19.

26. Looking Back for the Future.
Exercise 20.

27. Keep On Truckin': Completing the XP Road Trip.
Regression Testing.
Catching Up.
The Release.
When XP Projects End.
Exercise 21.


28. Challenges in “Testability”.
Designing for Testability.
A Real-Life Example.
Exercise 22.

29. Selecting and Implementing Tools.
Evolving Tools.
Test Tools.
Other Tools Related to Quality.
Choosing an Off-the-Shelf Tool.
Implementing Tools.
Experimenting with Tools.

30. Project Tune-Ups.
Office Space.
Accessorizing for XP.
Celebrating Successes.
Test Environment.
Other Obvious Best Practices.
Additional Tester Duties.

31. Introducing XP to Your Organization: A Tester's Point of View.
Test Phases and Practices.
Introducing People to the XP Tester Role.
Helping XP Testers Succeed.
XP Testing with Blended Practices.
What If You Don't Have Enough Testers?

32. XP for Projects of Unusual Size.
Adjusting XP.
Advance Planning Pays Off.
Working with Customers.
Satisfying Customer Test Documentation Requirements.
Iteration Planning and Execution for Large or Multilocation Projects.

33. Extreme Testing without Extreme Programming.
Gathering Requirements.
System Design.
Planning and Defining Tests.
Running Tests.
Let Worry Be Your Guide.

34. In Closing: May the Road Rise Up to Meet You.
Answers to Exercises.
Index.


This is a book about being a tester on an Extreme Programming (XP) team. It plugs a gap in the currently available XP materials by defining how an XP tester can contribute to the project, including what testers should do, when they should do it, and how they should do it. We are writing it because we think that XP is a better way to develop software and should be used by more teams. We believe that an acknowledged place in XP teams for testing and quality assurance will help bring that about.

Our goals in this book are to:

  1. Convince current XP practitioners that there is a valid role for a tester on the team
  2. Convince testing and quality assurance professionals that XP offers solutions to some of their worst problems
  3. Convince both groups that testers are needed just as much in an XP project as in a traditional development project
  4. Provide enough detail and practical examples to allow you to either perform the XP tester role yourself or work productively with a tester on your team, whether you are an XP newbie or veteran, tester, programmer, guide, customer, or manager

We hope that if you are not currently using XP, you can influence your own organization to try it. Even if your team uses some other process for software development, we think you can apply "extreme testing" practices to add value.

Because not everyone will be familiar with XP, we provide an overview of the basic concepts in the introduction, and describe a few aspects in more detail as necessary throughout the text. But this will be a bare-bones summary, at best, and there are several excellent books on the subject, as well as a wealth of information on the Web.

The book is divided into three major parts:

Part I - The XP Tester Role

This is where we define what we think the tester role is (and is not), how a project will benefit from it, what is in it for the tester, and generally why XP needs a tester role.

Part II - The XP Test Drive

Here we go through an XP project step by step and suggest what goals to shoot for, what activities to engage in, and which techniques to try as a tester on an XP project.

Part III - Road Hazard Survival Kit

Finally, we provide some resources to help you cope when the real world doesn't conform exactly to the ideal XP project: large projects, for instance, where an XP team is embedded in a larger, non-XP effort, or situations where critical XP practices are modified or omitted.

We've tried to keep things as practical as possible, and provided real-life examples as well as exercises for you to try the techniques out for yourself. The exercises are built around an XP project to develop a simple web-based tracking application, and we provide portions of the application at various stages for you to practice on.

We think you will find this book helpful if you are already a member of an XP team, or if you are a testing/quality assurance professional, or if you are in any software development role and considering XP.



As I see it, I have two jobs to do in this foreword. The first is to persuade you that it's worth your time to keep reading this book. The second is to place the book in context: what does it say about the world of testing and about how that world is changing?

My first job is easy. The reason is that the book you're holding is both skimmable and eloquent. Stop reading this foreword. Go to Part II. Browse some chapters. Do you see tidbits you can put to immediate, practical use on an XP or agile project? That's a sign Lisa and Tip are writing from experience. Do the chapters seem to hang together into a coherent testing strategy? That's a sign they've thought about their experience. Is the tone flexible, avoiding dogmatism? That's a sign that you'll readily be able to adapt what you read to your local circumstances. These are the things that make the book worth reading. Don't believe me; check for yourself.

The second job is harder. How does this book fit in? As I write (May 2002), the world of testing seems stuck. We're all supposed to know certain fixed concepts: what the purpose of testing is, what the relationship of testers to programmers should be, what test planning means. When presented with a new project, it seems we're intended to take those concepts as givens, follow some methodology, and fill in the blanks through a process of top-down, stepwise refinement.

But that's not really what happens, at least not on good projects. The tester comes to a project equipped with a hodgepodge of resources: concepts, attitudes, habits, tools, and skills--some complementary, some contradictory. She begins by modeling her new situation after one she's experienced before. She applies her habits and skills. She keeps what works and changes what proves awkward. She fluidly adapts any of her resources to the situation. In her project, testing goes through what Andrew Pickering calls "the mangle of practice" (in his book of the same name).

All this is hidden, because it's disreputable in at least two ways. First, everything is up for grabs, including those seemingly fixed concepts we're all supposed to know. The practice of testing, when applied in specific contexts, changes the purpose of testing, the relations of people, what test planning means. There's precious little solid ground to stand on. Second, the trivial can be as important as the lofty. Knowing how to use 3 x 5 cards well may matter more than knowing test design techniques. Unthinking work habits may have more effect than reasoned decisions.

We need to make the mangle respectable. It's harmful when people feel vaguely ashamed of doing what has to be done. It's even worse when they don't do it because they think they must follow the rules to be "professional."

And that marks this book's significance beyond simply (!) helping XP testers. In it, you can see the traces of two people interactively adapting their practice of testing to a new sort of project, arriving at last at a point of stability: something that works well enough that they can gift it to others. It's an inspiring example. I mean that literally, in that I hope it inspires you to model your testing after it, run that model through the mangle of practice, and report back to us, saying, "Here. Here's what I do. Here's what I've seen. Here's what works for me. Try it."

Brian Marick
Champaign, Illinois
May 2002

