
Software Development: Agile vs. Disciplined Methods


Agile and disciplined methods each have a vision of how software development should be.

So what does it feel like to develop software according to plan-driven or agile methods? In this chapter we try to portray the activities in a typical day on a software development project as performed by a plan-driven, PSP/TSP-trained team and by an agile, XP-trained team. First, we describe the activities of a typical day—one that falls well within the normal range of expected activities. Then we show how different the days might be if a significant event disrupts the normal activities—a crisis day. Finally, we discuss the similarities, differences, and implications of our two examples.




Typical Days

The typical day is based on a generic software development project. The product is a tool that processes a complex sales reporting and inventory management file. The file contains a large amount of data and is automatically generated by a legacy system and delivered electronically to all relevant organizations. Data applicable to any one specific department is spread throughout the file, making it difficult for the department to manually extract the information it needs. The product under development will automatically break out the information needed by a specific department. A prototype has been developed, and the current project is to enhance the prototype and port it to another platform and operating system. The total project was initially estimated by the customer at around 20 thousand source lines of code (KSLOC) and approximately eight months' duration.
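The product's core task, breaking one department's records out of the combined file, can be sketched as a simple filter. The record layout and department codes below are hypothetical illustrations; the chapter does not specify the file format.

```python
# Sketch of the tool's core task: extracting one department's records
# from the combined sales/inventory file. The pipe-delimited layout and
# the department codes are hypothetical.

def extract_department(lines, dept_code):
    """Return only the records belonging to dept_code.

    Assumes each record begins with a 4-character department code,
    e.g. 'SLS1' or 'INV2'.
    """
    return [line for line in lines if line[:4] == dept_code]

feed = [
    "SLS1|2003-06-13|widgets|120",
    "INV2|2003-06-13|gadgets|45",
    "SLS1|2003-06-13|gizmos|300",
]
sls1 = extract_department(feed, "SLS1")  # just the SLS1 records
```

The real product would of course have to handle the legacy file's actual record structure, but the essence is this kind of selective extraction.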

While the activities are fictionalized, they are representative of the types of tasks undertaken on a typical project work day. We've selected Friday as an example day because it allows us to recap the week's work, thus showing a broader view of activities performed.

For each of the approaches, we identify the specific training that the participants have undergone in order to apply the method. We describe the preproject planning work that was undertaken and summarize the development environment and current status of the project. The team is identified, and we define any specific roles.

A Typical Day Using PSP/TSP

Training

Each of the nine team members completed a 125-hour PSP for Engineers course, which provides the basic process, discipline, measurement, estimating, and quality management skills. Jan, the team leader, has participated in two TSP teams, one as Design Manager. She and Fahad have completed the five-day TSP Launch Coach Training. Fahad, Greg, and Jim have also worked on TSP-based projects. Two of the members are familiar with the legacy system and two others have studied parts of the legacy documentation. Their assigned roles are shown in Table 3-1. Jan and Greg have Cockburn Level 2 skills. The other team members include five Level 1As and two Level 1Bs.

Table 3-1. PSP/TSP Roles

Jan (Team Leader): Leading the team, ensuring engineers report process data, and ensuring work is completed as planned.

Fran and all team members (Programmer/Analyst): Developing requirements, design specifications, and software; reviewing designs and software; testing software; participating in inspections.

Fahad (Implementation Manager): Ensuring the implementation package is complete and ready for delivery.

Greg (Planning Manager): Supporting and guiding team members in planning and tracking their work.

Jim (Design Manager): Technical lead for developing and validating the overall architecture as well as package design.

Panitee (Quality/Process Manager): Defining process needs, making the quality plan, and tracking process and product quality.

Bashar (Support Manager): Determining, obtaining, and managing the tools needed to meet the team's technology and administrative support needs.

Margaret (Customer Interface Manager): Communicating with the customer.

Felipe (Test Manager): Package-level and integration testing.

Tools and Environment

The team is using a Web-based tool to support PSP/TSP data collection and earned value reporting. The development environment is typical of object-oriented (OO) programming. Office layout is standard modular offices with each developer having a dedicated machine. A conference room with computer network and projection capabilities is available for meetings and inspections. There is a dedicated integration and test machine in an unoccupied cubicle.

Project Planning

At the beginning of the project, a four-day planning session (referred to as a TSP Launch workshop) was held with all the team members and the project management personnel. During the workshop the team defined the project goals, established team roles based on TSP role scripts (Figure 3-1 is an example of a role script from TSPi, Humphrey's instructional version of TSP), defined the project processes, developed quality and support plans, and agreed on an overall product plan and schedule. Over 180 tasks were identified, estimated, and planned. Upper management, marketing, and customer representatives communicated their requirements to the team on the first day of the launch and agreed to the plan the team created on the final day of the launch.

Figure 3-1. TSPi Script for the Quality/Process Manager Role[1]

The team recognized the need for considerable agility during the early prototyping phase of the project, and relaxed the requirements for specification change control, inspections, defect tracking, and statistical process control during that phase. The current phase, however, is more critical to the delivery of the system, so more of the TSP discipline is being applied.

Status

The project is in the third month, and a relaunch workshop is planned within the next two weeks. Nothing has been delivered, but a prototype of the enhanced functionality has been demonstrated to management. Integration testing for the first phase is scheduled to begin next week.

The Day's Activities

Usually the team begins work on their planned tasks upon arrival. However, today, because of some organizational information that needs to be discussed, Jan gathers the group and provides a brief summary of her organizational staff meeting. While they are gathered, Fahad raises an issue regarding an item in the graphical interface as specified in the Software Requirements Specification (SRS). Margaret indicates she has a meeting with the customer at 2:00 and will address the question. With no other concerns raised, the group disperses to their work areas. Panitee reminds Bashar and Jim that there is a detailed design inspection of Fran's inventory order projection module at 1:00.

Fran finishes the unit test development on her module design in preparation for the inspection this afternoon. Her design is based on the Software Requirements Specification developed following the tailored REQ script earlier in the project.

Jim begins a personal code review on the inventory status reporting module he finished coding yesterday. He knows from his personal review history that he typically injects 27 coding defects per KSLOC and that in his code reviews he removes about 6 defects per hour. This morning he plans to spend one hour on the review. As he starts, he logs into the time-tracking system and indicates the activity he is working on. Jim reads through the code multiple times, each time looking for a different kind of defect, using a checklist of defect types.
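Jim's one-hour plan follows directly from his personal data. The arithmetic can be sketched as follows; the 220-LOC module size is a hypothetical example, while the two rates come from the text.

```python
# Back-of-the-envelope review planning from personal PSP data, as Jim
# does it. The module size is hypothetical; the rates are from the text.

INJECTION_RATE = 27   # coding defects injected per KSLOC (personal history)
REMOVAL_RATE = 6      # defects removed per review hour (personal history)

def plan_review(module_loc):
    """Estimate defects present in a module and review hours to find them."""
    expected_defects = INJECTION_RATE * module_loc / 1000
    review_hours = expected_defects / REMOVAL_RATE
    return expected_defects, review_hours

# For a 220-LOC module: about 5.9 expected defects, about one hour of review.
defects, hours = plan_review(220)
```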

The phone rings and he changes the mode in the time-tracking system. After conferring with his wife on the weekend's activities, he changes the mode back to code review. When he is confident he's found and corrected all the defects that he can and that the number is sufficiently close to his target, he again changes the mode on the time-tracking system, compiles the module, fixes any compilation errors, and begins using the test procedures he developed to make sure the module behaves properly. He logs all the defects he finds according to whether they were found in review, compilation, or test so he can maintain his personal defect rates as well as support project-wide tracking.
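The per-phase defect log Jim maintains can be sketched minimally as below. The field names are assumptions; actual PSP tools define their own defect recording forms.

```python
# A minimal sketch of per-phase defect logging. Each defect is tagged
# with the phase in which it was found so personal rates (and
# project-wide tracking) can be computed later.
from collections import Counter
from dataclasses import dataclass

@dataclass
class DefectRecord:
    module: str
    phase: str        # "review", "compile", or "test"
    defect_type: str  # e.g. from the review checklist

log = [
    DefectRecord("inv_status", "review", "logic"),
    DefectRecord("inv_status", "review", "interface"),
    DefectRecord("inv_status", "compile", "syntax"),
    DefectRecord("inv_status", "test", "logic"),
]

by_phase = Counter(rec.phase for rec in log)  # defects found per phase
```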

Bashar works with the corporate IT staff to resolve an issue with the automated configuration management system. He uses a TSP manual form for recording his time distribution in minutes, including breaks.

Jan attends a divisional strategic planning meeting.

Panitee reviews the current component data collected against the team quality plan to make sure the modules completed are of sufficient quality to be added to the baseline. She uses the automated data collection tool to check the ratios of coding time to design time, review time to combined design and coding time, and defects found in review to defects found in compiling, as well as the defect discovery rate and the review rate. The tool produces a component quality profile (see Figure 3-2), which Panitee uses to identify any questionable work. No modules were identified as problematic, so no decisions need to be made about how to proceed. Overall, at this point in the development, all the metrics seem in line with those projected, except for the review rate, which has been consistently higher than the planned 200 lines of code per hour. Given that the defect detection yield has not decreased, this is not seen as a problem.

Figure 3-2. Example Quality Profile Showing Anomalous Design Review Time[2]
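The kind of automated checks behind such a profile can be sketched as threshold tests on component data. The design-time rule below is an assumed illustration; the planned review rate of 200 LOC per hour comes from the team's plan.

```python
# Sketch of component quality-profile checks. The design/code rule is
# an assumed illustration; 200 LOC/hour is the team's planned rate.

PLANNED_REVIEW_RATE = 200  # LOC per hour

def review_rate(loc, review_hours):
    return loc / review_hours

def flags(loc, design_h, code_h, review_h):
    """Return the names of metrics that look questionable."""
    problems = []
    if design_h < code_h:  # assumed rule: design time at least comparable to coding time
        problems.append("design/code time")
    if review_rate(loc, review_h) > PLANNED_REVIEW_RATE:
        problems.append("review rate")
    return problems

# A 600-LOC component reviewed in 2 hours (300 LOC/hour) trips the
# review-rate check, just as Panitee observes project-wide.
issues = flags(loc=600, design_h=5, code_h=4, review_h=2)
```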

Greg and Jan begin preparations for the relaunch workshop for the next cycle of the project. They compare the actual progress against the original launch plan, noting any significant deviations that have occurred since that time and how they might impact the next cycle. It is clear that at least one module that was slated for completion in this cycle won't be ready, but there are two others that actually came in under the estimates and so some work has begun on modules originally slated for the next cycle.

The other team members continue working, making sure they log the time spent on their various project tasks as well as interruptions to their work due to responding to e-mail, phone calls, meetings, breaks, and such. So far the average time spent on project tasks has been around 18 hours per week per person—slightly lower than their initial projections.

Panitee conducts the detailed design inspection of Fran's inventory order projection module according to the TSP script (see Figure 3-3 for the TSPi inspection script as an example). Fran, Jim, and Bashar support Panitee in the inspection. They follow a formal inspection process, and data is collected on the number and severity of defects found, time spent, and the size of the module. The inspection rate as well as an estimated inspection yield are calculated at the end of the inspection. The team's experience has shown that high inspection yield (the percentage of the overall module defects that were found in an inspection) can significantly reduce time spent in testing. The inspection of Fran's module takes just over three hours to complete with an estimated yield of 64 percent, which is slightly better than the 61 percent average inspection yield for the project to this point.

Figure 3-3. TSPi Inspection Script[3]
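The yield arithmetic is straightforward once a total defect estimate exists. In practice that estimate would come from the team's historical models; the defect counts below are illustrative.

```python
# Inspection yield as the team computes it: the percentage of a
# module's defects found by the inspection. The total-defect estimate
# would normally come from historical data; these numbers are
# illustrative.

def inspection_yield(found_in_inspection, estimated_total):
    return 100.0 * found_in_inspection / estimated_total

# 16 defects found against an estimated 25 total gives the 64 percent
# yield reported for Fran's module.
y = inspection_yield(16, 25)
```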

Margaret meets with the customer representative to update the status of the project. She raises Fahad's issue regarding the user interface and is told she'll have an answer on Monday. Since this will not impact the schedule, she agrees.

The team members not involved in the design inspection meet to discuss the integration testing, which will begin next week. As the Test Manager, Felipe leads the discussions and tasking, with help from Fahad and Greg. They discuss what strategy to employ in the integration, given their current status and the experience so far in the cycle. Integration testing will show that all necessary components are present and that the interfaces between them work as specified. They identify two units that still need to satisfy their unit test completion criteria and flag this for management attention.

The team ends the week by completing and discussing the weekly status reports and establishing the earned value status for the project. The reports include a Role Report, Risk Report, and Goal Report. Earned value is tracked based on the responsible engineer's estimates for task completion as well as a top-down allocation by the Team Leader and the Planning Manager. Today, the cumulative earned value is 33.74 percent compared to a planned value of 34.5 percent, validating Greg and Jan's findings regarding progress. The Top 10 risk list is reviewed, and no change is made to the order or thresholds. Margaret shares the results of her discussions with the customer. Felipe brings up the unit test completion issue. The problem is some missing customer validity criteria. Margaret takes an action item to resolve this with the customer representative. Panitee brings up the anomaly in review rate, and the team suggests that she investigate the relationship between the review rate and defect detection yield at the unit level. She agrees to do this. All modules are checked into the CM facility, and the team heads home for the weekend.
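The earned value figures above follow a simple rule: a task earns its planned share of the total effort only when it is complete. A minimal sketch, with hypothetical task hours:

```python
# Minimal earned-value sketch consistent with the team's reporting:
# each task earns its planned percentage of total effort only when
# complete. Task hours are hypothetical.

def earned_value(tasks):
    """tasks: list of (planned_hours, is_complete) pairs."""
    total = sum(hours for hours, _ in tasks)
    earned = sum(hours for hours, done in tasks if done)
    return 100.0 * earned / total

tasks = [(10, True), (20, True), (30, False), (40, True)]
ev = earned_value(tasks)  # 70 of 100 planned hours complete -> 70.0
```

Comparing this cumulative figure against the planned value for the date (33.74 versus 34.5 percent in the team's case) is what shows the project slightly behind plan.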

A Typical Day Using Extreme Programming

Training

The Cockburn Level 2 team leader (coach) and one of the senior programmers attended a 40-hour XP development workshop. Two of the senior programmers had been on an XP team before. One is a Level 2, the other is a strong Level 1A. The remainder of the nine-person team are junior Level 1As who were introduced to the concept through in-house presentations by an XP consultant, enhanced by books and papers suggested by the XP-experienced team members. However, an essential part of the training is done when each of the seven inexperienced team members works side-by-side with one of the trained and/or experienced team members. Two team members are familiar with the legacy system; the others have all reviewed parts of the legacy documentation. The environment ensures that the experts are within earshot of the inexperienced team members. In this way, their experiences are shared (and tacit knowledge is transferred) through their normal work activities. “Default” roles have been established for the team members, as shown in Table 3-2. However, all the team members are always ready to assume other roles within the team in order to get the job done.

Tools and Environment

The team is using an automated tool to support test case and test suite development. The development space is an open bullpen where all of the team members work and where everyone can respond immediately to questions or calls for help. Workstations are set up in the bullpen to support pair programming, but have been occasionally rearranged when different configurations are needed. Wall space around the bullpen provides white boards and bulletin boards. Areas for phone calls and private time are provided on the periphery. A lounge room with refreshments and comfortable furniture is available for informal discussions and relaxation.

Project Planning

A one-day exploration session was held with the customer, and the fundamental stories needed for planning were captured on story cards. A two-day planning session was then conducted to allow the customer to refine and prioritize the work to be done based on the developers' estimate of the resource effort required for implementing the stories. The result of this iterative work determined the iteration length (two weeks), the release length (three iterations), and the release schedule. The workshop also derived an estimate of project velocity (the number of reasonably equal-size stories that can be done in an iteration). Based on the customer priorities, the developer resource estimates, and this velocity, the iteration plan is developed by the developer team and the customer. At the start of each two-week iteration, the programmers select which user stories to own (any left over are assigned). Tasks (which break down the work elements of the stories) are then established from the stories and are described on task cards.
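The velocity-driven selection step can be sketched as filling an iteration with the highest-priority stories that fit. The story names and point estimates below are hypothetical.

```python
# Sketch of XP iteration planning: take customer-prioritized stories in
# order until the team's velocity (points per iteration) is used up.
# Story names and estimates are hypothetical.

def plan_iteration(stories, velocity):
    """stories: (name, points) pairs in customer priority order."""
    chosen, used = [], 0
    for name, points in stories:
        if used + points <= velocity:
            chosen.append(name)
            used += points
    return chosen

backlog = [("report breakout", 3), ("error handling", 2),
           ("new platform I/O", 5), ("audit trail", 2)]
iteration = plan_iteration(backlog, velocity=8)
```

Note the third story is skipped because it would exceed the velocity; in a real planning game the team would negotiate such cases with the customer rather than silently defer them.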

Table 3-2. XP Roles

Jill (Coach): Responsible for the process as a whole; supports programmers; maintains stability and depth of understanding of XP; nudges rather than directs; separates technical from business decisions.

Melissa (Customer): Responsible for writing and prioritizing stories, making decisions about how to resolve problems, developing functional tests, and representing the various user departments' interests.

Ferdinand (Tester): Helps the customer choose and write functional tests; supports programmers in running their tests.

Patricia (Tracker): "Conscience" of the team; ensures validity of estimates, checks that the velocity is maintained, and acts as historian.

Ford, Gary, Ben, John, Felicity (Programmer): Main players in XP. Responsible for selecting, estimating, and completing tasks; developing unit tests; pair programming; refactoring.

Status

The project's second release is due at the end of next week, the project having been through five and a half iteration cycles. The first release was deployed with two fewer stories than originally planned due to the predictable overestimate of project velocity in the first two iterations. One of those stories is included in the current release and the other has been deferred to the next release. Customer feedback has been generally positive, leading to increased clamor for the complete functionality.

The Day's Activities

The team begins the day (as they do every day) with a standup meeting. The whole team stands in a circle. Going around the circle, each person reports specifics on what they accomplished the prior day, any problems they encountered, and what they plan to accomplish today. As they go around the circle, Ford has a question about part of the user interface and sets up a time with Melissa to work the issue. Felicity has a concern about the estimate for her communications utility task, and Ferdinand says he'll take a look at it with her. John asks Gary to work with him on his task to add functionality to the inventory report generator since Gary and Ford did the original development of the functionality. Other pairs form according to the current needs. Jill asks if anyone sees any showstoppers for the planned release. With none identified, the developer pairs settle at workstations to begin the day's work.

Gary and John sit at the workstation to enhance the report generator functionality according to the story card and the task description (see Figure 3-4). Initially, John “drives” the workstation and Gary “navigates” as John begins developing test cases for the enhancements. Gary points out some reuse opportunities from the test cases he and Ford developed. He takes over the keyboard and pulls them up to modify them for this new task. The pair works the rest of the morning, synchronizing breaks and iterating between test case generation and code implementation on a minute-by-minute basis, using the automated test tool to make sure the tests pass consistency and compatibility checks with respect to the rest of the project tests.

Figure 3-4. Story Card and Task Description
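The minute-by-minute test/code cycle can be shown in miniature: write the test for the enhancement first, then just enough code to pass it. The report function below is a hypothetical stand-in for the generator John and Gary are enhancing.

```python
# Test-first in miniature, as John and Gary work: the test states the
# enhanced behavior, the function is written to satisfy it. The report
# function is a hypothetical stand-in.
import unittest

def inventory_totals(records):
    """Enhancement under test: total quantity per item."""
    totals = {}
    for item, qty in records:
        totals[item] = totals.get(item, 0) + qty
    return totals

class TestInventoryTotals(unittest.TestCase):
    def test_totals_accumulate_per_item(self):
        records = [("widget", 5), ("gadget", 2), ("widget", 3)]
        self.assertEqual(inventory_totals(records),
                         {"widget": 8, "gadget": 2})

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestInventoryTotals))
```

The pair would keep the whole suite green continuously, not just this one test, which is what the automated tool's consistency checks enforce.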

Ford and Melissa sit in the lounge to discuss Ford's user interface questions. Melissa lets Ford know that the story doesn't quite fit what they need now, and that's why it may be confusing. She explains what the users need to see to be most effective. Ford takes notes on the story card. Melissa agrees to his suggestion of a quick mockup, and they schedule a meeting after lunch to show her what he's done. Melissa agrees to bring along one of the report users who had a different interpretation of the story. Ford then moves to a free workstation to develop some prototype screens.

Jill meets with her boss to make sure the completion bonus funding is available and that the consultant she wanted to support planning on the next release will be available at the right time.

Ferdinand and Felicity meet to discuss the estimate for her communication utility. When Ferdinand looks at the task card for Felicity's story he is not certain exactly which of the target platform's system functions would be the most appropriate. He suggests that they perform a spike to resolve the problem. A spike is a narrow but deep coding experiment that allows the developers to see how various functions might work and to better estimate both system performance and the difficulty of the coding. Felicity agrees and she begins looking at options for the spike.
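A spike of this kind can be sketched as a small throwaway timing experiment. The two candidates below are generic stand-ins, not the platform functions Felicity would actually compare.

```python
# A spike in miniature: a narrow, disposable experiment that times two
# candidate implementations before committing to one. Both candidates
# are hypothetical stand-ins for the platform's system functions.
import timeit

def candidate_join(parts):
    return ",".join(parts)

def candidate_concat(parts):
    out = ""
    for p in parts:
        out += p + ","
    return out[:-1]

parts = [str(i) for i in range(1000)]
t_join = timeit.timeit(lambda: candidate_join(parts), number=200)
t_concat = timeit.timeit(lambda: candidate_concat(parts), number=200)
# The timings (and the coding effort each approach took) feed back
# into a better task estimate; the spike code itself is discarded.
```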

Ben has asked Jill to help him with his parsing task. As they sit at the workstation, Jill suggests that the new capability enhancements won't fit easily into the existing code. So together they begin to refactor the initial code to remove some redundancy and provide a better design for adding the enhancements.
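Refactoring of this sort, removing redundancy so an enhancement slots in cleanly, can be sketched before-and-after. The parsing rules here are hypothetical, not Ben's actual code.

```python
# Refactoring in the same spirit: duplicated parsing logic is factored
# out so a new record kind becomes a one-line variation. The parsing
# rules are hypothetical.

# Before: two near-identical functions with duplicated splitting logic.
def parse_order_line_old(line):
    fields = [f.strip() for f in line.split("|")]
    return {"item": fields[0], "qty": int(fields[1])}

def parse_return_line_old(line):
    fields = [f.strip() for f in line.split("|")]
    return {"item": fields[0], "qty": -int(fields[1])}

# After: the shared logic lives in one place; behavior is unchanged,
# and adding the planned enhancement means one new call, not a third copy.
def _parse_fields(line):
    return [f.strip() for f in line.split("|")]

def parse_line(line, sign=1):
    fields = _parse_fields(line)
    return {"item": fields[0], "qty": sign * int(fields[1])}

order = parse_line("widget | 4")
refund = parse_line("widget | 4", sign=-1)
```

The key property of a refactoring is visible here: the old and new code return identical results, which the existing unit tests would confirm.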

Melissa and Ferdinand begin to finalize the acceptance tests for the current iteration. Because the functionality in this release is more complex than the first release, the tests are more complicated. Ferdinand helps to organize the stories in a way that helps Melissa to develop scenarios that are closer to the way her organization does business and that will cover all of the activities involved in using the new functions. They also refine the approach for stress testing and the always appropriate “Can my 6-year-old break it?” test for unanticipated inputs or sequences. Patricia, in her role as tracker, both participates in and takes notes for the meeting. When the tests are run, she'll collect from the developers the defects found and who's responsible for correction, as well as the new test cases that will be generated to test for those defects in future releases.
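The "6-year-old" test can be sketched as feeding random junk to an entry point and asserting that it never crashes, only rejects. The quantity validator below is a hypothetical stand-in for the product's input handling.

```python
# Sketch of the unanticipated-input check: hammer an entry point with
# random input and assert it rejects gracefully rather than crashing.
# The validator is a hypothetical stand-in.
import random
import string

def handle_quantity(text):
    """Entry point under test: must never raise, only accept or reject."""
    try:
        value = int(text)
    except (ValueError, TypeError):
        return None            # rejected, not crashed
    return value if value >= 0 else None

random.seed(0)                 # reproducible junk
survived = True
for _ in range(500):
    junk = "".join(random.choice(string.printable) for _ in range(8))
    try:
        handle_quantity(junk)
    except Exception:
        survived = False       # a crash fails the robustness check
```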

John and Gary code their changes, working at the terminal, but breaking a couple of times for snacks and to answer questions from Felicity regarding her spike effort.

Ben and Jill continue their refactoring.

Ford and Melissa and the report user grab a workstation and Ford presents his prototype screens. Melissa and the user make some small suggestions (and one pretty big one). Ford counters with an easier but similar suggestion that can be done within the current schedule. The user agrees, but asks that a new story be written to capture the enhancement she wanted for a later iteration. Melissa agrees to coordinate the change with the other report users. Ford decides to get John to pair with him on the implementation of his prototypes on Monday.

John and Gary finish their code and make sure all the unit tests run correctly. Corrections are made until the code passes all of their tests.

The team migrates into the lounge area for a wrapup of the day and the week. Progress is reported (and dutifully captured by Patricia) on the various tasks. Patricia then states that according to her data, the project velocity is satisfactory; the team is a little behind schedule, but will not need to renegotiate stories for this iteration. She has updated the schedule and task chart on the wall with her new data. Felicity reports on the findings from her spike, since other tasks will be facing the same problem. Jill congratulates all on a great week, reminds everyone of the postrelease party next Saturday night after the official installation and acceptance of the new release, and tells everyone to hit the road and have a great weekend.
