For those not familiar with FDD, I'll try to summarize in a few pictures and paragraphs (please do refer to [Coad] for a more detailed introduction). Those who are familiar with FDD might want to skip to the comparison.
FDD is a model-driven, short-iteration process. It begins with establishing an overall model shape. Then it continues with a series of two-week "design by feature, build by feature" iterations. The features are small, "useful in the eyes of the client" results. FDD consists of five processes or activities (see Figure 1), as discussed in the following sections.
Figure 1 The five processes of FDD.
1. Develop an Overall Model
For the first activity, domain and development members work together, under the guidance of an experienced component/object modeler (chief architect). Domain members present an initial high-level, highlights-only walkthrough of the scope of the system and its context. The domain and development members produce a skeletal model, the very beginnings of that which is to follow. Then the domain members present more detailed walkthroughs. Each time, the domain and development members work in small sub-teams (with guidance from the chief architect), present sub-team results, and merge the results into a common model (again with guidance from the chief architect), adjusting model shape along the way.
2. Build a Features List
Using the knowledge gathered during the initial modeling, the team next constructs as comprehensive a list of features as they can. A feature is a small piece of client-valued function expressed in the following form:
<action> <result> <object>
For example: "Calculate the total of a sale."
Existing requirements documents, such as use cases or functional specs, are also used as input. Where they don't exist, the team notes features informally during the first activity. Features are clustered into sets by related function, and for large systems, these feature sets are themselves grouped into major feature sets.
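The feature template and grouping described above can be sketched as a small data model. This is a minimal illustration in Python, not part of FDD itself; all class and variable names here are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    """A feature named in the <action> <result> <object> form."""
    action: str  # e.g. "Calculate"
    result: str  # e.g. "the total"
    obj: str     # e.g. "of a sale"

    @property
    def name(self) -> str:
        return f"{self.action} {self.result} {self.obj}"

@dataclass
class FeatureSet:
    """Features clustered by related function."""
    name: str
    features: list = field(default_factory=list)

@dataclass
class MajorFeatureSet:
    """For large systems, feature sets grouped into major feature sets."""
    name: str
    feature_sets: list = field(default_factory=list)

sale_total = Feature("Calculate", "the total", "of a sale")
selling = FeatureSet("Selling products", [sale_total])
sales = MajorFeatureSet("Sales management", [selling])
print(sale_total.name)  # Calculate the total of a sale
```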
3. Plan by Feature
The third activity is to sequence the feature sets or major feature sets (depending on the size of the system) into a high-level plan and assign them to chief programmers. Developers are also assigned to own particular classes identified in the overall object model.
4 and 5. Design by Feature / Build by Feature
Activities four and five are the development engine room. A chief programmer selects a small group of features to develop over the next one to two weeks and then executes the "Design by Feature (DBF)" and "Build by Feature (BBF)" activities. He or she identifies the classes likely to be involved, and the corresponding class owners become the feature team for this iteration (see Figure 2). This feature team works out detailed sequence diagrams for the features. Then the class owners write class and method prologs. Before moving into the BBF activity, the team conducts a design inspection. In the BBF activity, the class owners add the actual code for their classes, unit test, integrate, and hold a code inspection. Once the chief programmer is satisfied, the completed features are promoted to the main build. It's common for each chief programmer to be running two to three feature teams concurrently and for class owners to be members of two to three feature teams at any point in time.
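The team-formation rule above — the owners of the classes a feature touches become the feature team — is mechanical enough to sketch. The class names and owner names below are invented for illustration.

```python
# Hypothetical class-ownership map, as assigned during Plan by Feature.
class_owners = {
    "Sale": "Ann",
    "SaleLineItem": "Ben",
    "Product": "Cara",
    "Customer": "Dev",
}

def feature_team(classes_involved):
    """The distinct owners of the classes a feature is likely to touch
    form the feature team for this DBF/BBF iteration."""
    return {class_owners[c] for c in classes_involved}

# A feature touching Sale, SaleLineItem, and Product pulls in three owners.
team = feature_team(["Sale", "SaleLineItem", "Product"])
print(sorted(team))  # ['Ann', 'Ben', 'Cara']
```

Because teams are derived from class ownership per feature, they form and dissolve dynamically, which is the point Figure 2 makes.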
Figure 2 Feature teams are formed dynamically as needed.
Track by Feature
With FDD, we can track and report progress with surprising accuracy. We begin by assigning a percentage weighting to each step in a DBF/BBF iteration (see Figure 3).
Figure 3 A set of sharp milestones and percentage weightings.
The chief programmers indicate when each step has been completed for each feature they're developing. Now we can easily see how much of a particular feature has been completed. Simply posting the list of features on a wall, color-coded green for "complete," blue for "in progress," and red for "requiring attention" provides a good visual feel for overall progress, with the ability to "zoom in" to read the detail by simply walking closer to the wall (see Figure 4).
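Per-feature tracking reduces to summing the weightings of the completed steps. The weightings below are the commonly cited FDD milestone values; treat them as an illustration of the kind of table Figure 3 shows, not a transcription of it.

```python
# Commonly cited FDD milestone weightings (percent); the actual
# Figure 3 values may differ.
MILESTONE_WEIGHTS = {
    "Domain Walkthrough": 1,
    "Design": 40,
    "Design Inspection": 3,
    "Code": 45,
    "Code Inspection": 10,
    "Promote to Build": 1,
}

def feature_percent_complete(completed_milestones):
    """Sum the weightings of the steps the chief programmer has
    marked complete for this feature."""
    return sum(MILESTONE_WEIGHTS[m] for m in completed_milestones)

# A feature that has passed its design inspection is 44% complete.
pct = feature_percent_complete(
    ["Domain Walkthrough", "Design", "Design Inspection"]
)
print(pct)  # 44
```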
Figure 4 Part of the Features wall chart.
Straightforward tools then roll these percentages up to the feature set and major feature set levels, providing highly accurate, color-coded progress reports for development leads, project managers, project sponsors, and upper management (see Figures 5 and 6).
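The roll-up itself is simple arithmetic. As a sketch, a set's figure can be taken as the average of its features' percentages; a real tool might instead weight by feature size, and the numbers below are invented.

```python
def rollup(percentages):
    """Average per-item completion percentages into one figure
    for the containing feature set or major feature set."""
    return sum(percentages) / len(percentages)

# Four features at various stages roll up to one feature set figure...
feature_set_pct = rollup([100, 44, 0, 86])
print(feature_set_pct)  # 57.5

# ...and feature set figures roll up again to the major feature set.
major_set_pct = rollup([feature_set_pct, 75.0])
print(major_set_pct)  # 66.25
```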
Figure 5 Project status report.
Figure 6 Key to project status report.
Graphing and trending these figures over time makes it easy to monitor progress rates (see Figure 7).
Figure 7 Graph of features complete against time.