Conventional Software Testing on an Extreme Programming Team

For a conventional software tester, an Extreme Programming project may be an intimidating challenge. Testers often find that they're not welcome on XP projects, but Jonathan Kohl shares how skilled testing can overcome such difficulties. Jonathan discusses lessons he learned from working on two different XP project teams, only one of which initially welcomed his testing and feedback.

If you're a professional software tester, or work in quality assurance, I consider you to be (like me) a "conventional software tester." Lately, conventional software testers are finding themselves on Extreme Programming (XP) projects. XP is one of the best-known Agile development methodologies; it employs an iterative lifecycle, team collaboration, and customer involvement.

I'm often asked questions about the challenges faced by conventional software testers on XP teams. In my experience, conventional testers can end up on XP teams in many situations:

  • A customer requests "QA people" on a project being conducted by an XP team.
  • An XP project has such good results that a development lead from that team is put in charge of a QA group.
  • A software company with an existing QA department tries XP.
  • XP developers who want to learn more about testing request conventional testers on the project.

If you can relate to these scenarios, or if you're a conventional software tester who's curious about Extreme Programming, this article is for you. I'll discuss two experiences in which I worked as a conventional tester on an XP team.

Experience 1: Early Success

When I first read Kent Beck's book Extreme Programming Explained, I felt that it was the values of XP that were most important. I wanted to apply these values immediately in my testing, whether or not I was on an XP project. Yet, early on, Extreme Programming literature focused very much on activities for developers. Without explicit guidance on what to do, I thought the best thing would be to experiment with familiar testing activities and use the XP values as a guide for my behavior. I did testing as described by the context-driven testing school, including risk-based testing, exploratory testing, and whatever I could do to help provide feedback to the team. Testing was a service role, providing information to the project team.

By the time I found myself on a team with experienced XP developers, a lot more had been written about testing on XP projects. Much of the talk of testing on XP teams was focused on automated testing using a framework such as JUnit. The general opinion was that there was no need for conventional software testers on an XP team. Most people believed that with the developers testing and the customer representative testing, there was no place for a dedicated tester. As a tester, I felt torn. I identified with the XP values and was a fan of XP's communicative, collaborative, and iterative lifecycle, so I wanted to try to work on an XP team, but I didn't want to be on a team where I wouldn't provide value. To address this issue, I asked the developers to let me know if I wasn't providing value, in which case I would ask to be removed from the project and go to work on a team that needed my help. The developers promised that they would let me know.
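
For readers who haven't seen one, the developer-side testing discussed here centered on tests like the following minimal sketch, shown in JUnit 4 syntax. The PriceCalculator class and its discount rule are invented purely for illustration; the point is the shape of the test, not the domain.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// A minimal sketch of the developer-written unit tests an XP team relies on.
// PriceCalculator and its discount rule are hypothetical, invented for
// illustration only.
public class PriceCalculatorTest {

    @Test
    public void appliesTenPercentDiscountAtThreshold() {
        PriceCalculator calc = new PriceCalculator();
        // Assumed rule: orders of $100.00 or more get a 10% discount.
        assertEquals(90.00, calc.total(100.00), 0.001);
    }

    @Test
    public void appliesNoDiscountBelowThreshold() {
        PriceCalculator calc = new PriceCalculator();
        assertEquals(99.99, calc.total(99.99), 0.001);
    }
}
```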

So the developers and I began working together to figure out how I could add value on the team. I explained my view of testing as a service role, wherein the tester provides information to the project team and decision makers. I plugged myself into the team, focusing on providing testing ideas and working on finding faults before we released the software. The developers on this team were well-practiced in test-driven development, but they were interested in having my help to write other kinds of tests. One developer wanted my help in writing acceptance tests before writing code that would pass the test. Another developer wanted me to work with him on unit test development. The team was excited to have me involved in planning activities (known as the Planning Game) to provide feedback on project risk and testability.
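
To make the test-first workflow concrete, here is a sketch of an acceptance test written before any production code exists. Account and AccountTransfer are hypothetical names used only for illustration; when such a test is first written, it fails (indeed, it won't even compile) until a developer implements the behavior it describes.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Written first, before the production code: this test captures the
// agreed-upon behavior of a funds transfer. Account and AccountTransfer
// are hypothetical; the test exists to drive their implementation.
public class TransferAcceptanceTest {

    @Test
    public void transferMovesFundsBetweenAccounts() {
        Account from = new Account(500.00);
        Account to = new Account(0.00);

        AccountTransfer.transfer(from, to, 200.00);

        assertEquals(300.00, from.getBalance(), 0.001);
        assertEquals(200.00, to.getBalance(), 0.001);
    }
}
```

The failing test then serves as the acceptance criterion for the story: when it passes, the story's behavior is done.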

Because most of the developers were focused on having me do more automated test development, it took some convincing to get them to see value in manual exploratory testing. I did some automated functional tests to complement their unit tests (sketched below), and we did a lot of tester/developer pairing. I worked with one developer to set up a test environment in which I could pull a daily build and do manual testing on the installation, on the application working as a system, and on the application itself. At first he was indifferent, because he thought I was duplicating the work of the automated unit tests that ran on automated builds throughout the day. Even so, I continued doing exploratory testing on the daily builds. I even had my doubts that I would find anything, because I had worked so much with the developers (especially on test ideas for unit tests) and had seen how much they were doing with automated unit tests. In the end, we were both wrong. I found relatively few obvious bugs (installation errors, bad builds, input overflows), but most of the important, show-stopping bugs came out of my manual exploratory testing.
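
The automated functional tests mentioned above differ from unit tests in that they exercise the installed, running application from the outside. As a rough sketch of the idea, assuming purely for illustration a web application exposing a status endpoint on localhost, such a test might look like this:

```java
import java.net.HttpURLConnection;
import java.net.URL;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Sketch of a functional test run against the daily installed build.
// Unlike a unit test, it treats the running application as a black box.
// The localhost URL and /status endpoint are assumptions for illustration.
public class InstalledBuildFunctionalTest {

    @Test
    public void installedApplicationResponds() throws Exception {
        URL url = new URL("http://localhost:8080/app/status");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        assertEquals(200, conn.getResponseCode());
    }
}
```

A test at this level catches installation and integration problems that in-process unit tests, by design, never see.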

One developer was disappointed that I couldn't write test cases up front for the bugs I found during exploratory testing. He wondered why I wasn't thinking of these ideas at the beginning of the iteration. I realized then that, like Agile development itself, exploratory testing requires adaptive rather than predictive behavior. As Cem Kaner, James Bach, and others will tell you, exploratory testing is simultaneous learning, test design, and test execution. A lot of the learning I needed to do to design new test cases came from actually using the software and observing its behavior. In fact, a huge benefit of an XP project is that a tester can start doing this very early in the project. The developer agreed that the information I was able to provide early on was beneficial, even if I couldn't predict every fault with an up-front test case.

As we started to hit our stride on the project, we discovered something simple, seemingly obvious, and yet very profound: The less I worked on developer-like activities (such as test automation and design) and the more I focused on conventional testing activities (manual exploratory testing, thoughtful feedback, and test idea generation), the happier the developers were with my work. By engaging in conventional testing activities, I gave developers the information they needed most. This included test design ideas and what the developers were most pleased with: detailed bug reports.

Testing and development activities can require different thought processes. When thinking of useful testing activities within a programming methodology, the developers were thinking in terms of programming, because they found automated testing so useful. I was thinking about testing in terms of assessing risk: asking questions about the product using tools and techniques, and assessing its behavior against some mechanism that would tell me whether there was a problem.

When we realized the value that conventional testing was providing, the developers started worrying about having a specialist on the team: XP teams favor generalists, and they thought that having specialists would lead to problems. However, when we did an inventory of my testing work, it turned out that I didn't work like a specialist at all. My work involved manual testing, basic programming for test automation, working with customers, writing documentation, and helping to administer test equipment. If I had worked the way they thought a generalist should, I would have worked only on automated test development and maintenance. To a conventional tester, that's a specialist role in automated testing.

As pragmatists, we let the issue of whether I was a generalist or specialist slide, and our testing strategy became a hybridized Agile/conventional testing approach that moved toward complementary tasks with a unified focus. Because I bought into the values of XP and used those values to guide my work, the developers didn't feel that the process was threatened. They also liked that I was willing to step up and do any job that needed to be done. Finally, my approach in reporting bugs wasn't belittling or accusatory, so they grew to respect and value what I had to say.

Of course, the developers are only part of the picture on an XP team. Another important role is the customer representative, who is the voice of the business and the user community. The customer representatives provide specialized subject matter expertise and also do testing, called user acceptance testing (UAT). I spent quite a bit of time working with the customer representatives. At first, this work mainly involved helping set testing strategies during planning. I would work with them to assess the risk for each proposed feature, asking, "What is the potential cost to the business if this functionality is missing or doesn't work?" This risk assessment helped the customer to determine which parts of the system were critical. At first, the customer wanted to include all the features that came to mind, but asking the right questions narrowed the list to the important features and helped prioritize the team's development work. Once we had the feature list, I also helped identify testing techniques that we could use to mitigate risk for those features. Finally, when testing time came, I coached the customer on testing the software.

Working with customers showed me that what techies consider to be bugs, customers usually think are their own mistakes made while using the software. If customers see error messages, they usually blame themselves. It turned out that the most important thing I taught the customers was how to recognize a bug for what it is, rather than to see it as a user error. We used Bret Pettichord's bug definition: "A bug is something that bugs someone." I encouraged customers to voice concerns whenever something bothered them. (Indeed, I worked with customers the way I do with any new tester, much the same way I was trained as a tester myself.) I provided support, answered questions, and encouraged customers to voice concerns over any issue that felt wrong. If they saw an error message and blamed themselves, I explained why this was actually a failure in the software. One particular customer representative was naturally gifted as a tester, and caught on quickly with just a bit of coaching; she ended up logging excellent bugs that the rest of us had overlooked.

A typical day working as a conventional tester on this XP team might include the following tasks:

  • Act as sounding board on an emerging design for developers working on a story.
  • Pull a new build, install the software, and run automated acceptance tests against it.
  • Lightly test software from a story in progress to provide initial feedback to developers.
  • Heavily test software from a recently completed story to provide more thorough feedback.
  • Integration-test the software delivered to date.
  • Work with a customer on new test ideas.
  • Answer questions and provide support for customer testing.
  • Add to or do maintenance on the automated acceptance test suite.
  • Provide testing status information at a standup meeting.

Later, at the end of an iteration, more of my time might be spent working with customers on testing, or working on risk assessments and testing strategy for emerging stories.

Looking back, I found that I was approaching testing from two fronts:

  • Supporting the development team with testing and feedback
  • Supporting customers by helping them assess risk and perform testing activities

These tasks fit nicely into what Cem Kaner and Brian Marick call business-facing and technology-facing activities. The information gathered in each of these areas helped the team to have more confidence in the software they were delivering, and helped customers to have more confidence in the product they were receiving.

When I did a retrospective with the developers, they gave me good feedback. They said I found bugs that they would never catch with their own unit testing. They were pleased that someone was doing integration tests on an installed system daily, because those tests revealed new sources of errors. They said my exploratory testing was very effective, and, unlike them, I was able to spend time tracking down intermittent errors. They were pleased to have someone on the team who thought about testing all the time, complementing their ideas. They liked my attitude about bug reporting: They didn't feel I was trying to humiliate them or catch them out when I found a bug, and bug reports were not the only way I provided feedback. They said I supported them and the customer and provided feedback in areas I hadn't even been aware of, such as during initial design. They liked that I would work with customers and help them verbalize questions and concerns in terms everyone understood. Most importantly, the developers wanted to work with me again.

Overall, we found that my conventional software testing skills were complementary to what development and customers were doing. When I focused on areas the developers weren't testing with unit tests, such as installing the software in a test system daily and testing it, I found bugs that the automated tests didn't catch. When I worked with customers on testing ideas, I helped them learn how to test and taught them the technical vocabulary used by the development team. When a customer was the lone voice raising a concern, I was able to encourage her and back her up when facing down several developers. When coming up with features to develop for an iteration (called story cards because they're written on index cards), I could help to determine whether a story was testable and written coherently; incoherent stories were usually a sign of an incoherent design or areas we might have missed.
