Let’s Go!
If you really want to learn how to do TDD, the best way is to code the examples as we go along. I’ll post the code that we build in each installment, so that you can pick up at any given point.
We’ll start by building enough of a core domain object to get going, then start driving down from the application level. I don’t like to go too far with domain objects up front; their interfaces usually change as you detail application needs.
Key domain objects in poker might include players, hands, rules, and pots. But the core component is a deck of cards, so we’ll start by building a Deck class.
Well, not really. We’ll start by building a test class to drive out the specification for the Deck class. Create a class named DeckTest (see Listing 1), and have it extend the JUnit class junit.framework.TestCase. Add to it an empty method with the signature public void test().
Listing 1 Basic DeckTest class.
package domain;

import junit.framework.*;

public class DeckTest extends TestCase {
    public void test() {
    }
}
You now have a valid JUnit test class with a single test method. If you run the test through JUnit, you’ll have one passing test—something JUnit indicates with a green bar. By definition, test methods that fall through to the end pass. One way to make a test fail is to explicitly insert a failure point, as shown in Listing 2.
Listing 2 DeckTest with a failure point.
package domain;

import junit.framework.*;

public class DeckTest extends TestCase {
    public void test() {
        fail("just because");
    }
}
Running JUnit against DeckTest will now result in a single failure, represented by a red bar.
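Under the hood, JUnit’s runner finds test methods by reflection and treats any method that throws as a failure. The following standalone sketch illustrates the idea; MiniRunner and SampleTest are invented names, and real JUnit does considerably more (fixtures, counting, reporting):

```java
import java.lang.reflect.Method;

// Invented example class standing in for a real TestCase subclass.
class SampleTest {
    public void testPasses() { }                                   // falls through: pass
    public void testFails() { throw new AssertionError("just because"); } // throws: fail
    public void helper() { }                                       // ignored: no "test" prefix
}

public class MiniRunner {
    public static void main(String[] args) throws Exception {
        for (Method m : SampleTest.class.getDeclaredMethods()) {
            // JUnit 3 convention: run public, no-argument methods named test*
            if (!m.getName().startsWith("test") || m.getParameterCount() != 0)
                continue;
            try {
                m.invoke(new SampleTest());
                System.out.println(m.getName() + ": pass");   // contributes to the green bar
            } catch (Exception e) {
                System.out.println(m.getName() + ": FAIL");   // red bar
            }
        }
    }
}
```

You never write this plumbing yourself; extending TestCase and following the naming convention is enough for JUnit to pick your tests up.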
Let’s build a real first test for Deck. Rename the test method to testCreate. JUnit recognizes any public, no-argument method as a test, as long as its name starts with the word test. This first test will specify the state of a new deck. The most obvious fact about a poker deck is that it holds 52 cards (see Listing 3).
Listing 3 DeckTest with testCreate() method.
package domain;

import junit.framework.*;

public class DeckTest extends TestCase {
    public void testCreate() {
        Deck deck = new Deck();
        assertEquals(52, deck.cardsRemaining());
    }
}
The assertEquals method is one of many methods the TestCase superclass provides to verify that things are as expected. It compares its first argument, the expected value, to its second argument, the actual result. If they don’t match, the test method halts immediately and JUnit reports a failure.
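Internally, an assertion like this amounts to little more than a comparison plus a thrown error on mismatch. Here’s a minimal stand-in (JUnit’s real assertEquals lives in junit.framework.Assert and throws its own AssertionFailedError; plain AssertionError substitutes here):

```java
// Simplified sketch of assertEquals for ints; not JUnit's actual implementation.
public class MiniAssert {
    static void assertEquals(int expected, int actual) {
        if (expected != actual)
            throw new AssertionError(
                "expected:<" + expected + "> but was:<" + actual + ">");
    }

    public static void main(String[] args) {
        assertEquals(52, 52);        // matches: execution continues silently
        try {
            assertEquals(52, -1);    // mismatch: throws, halting the test method
        } catch (AssertionError e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Because the error propagates out of the test method, any assertions after the failing one never execute.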
The test won’t even compile yet—there’s no Deck class! Build just enough code to get everything compiling, as shown in Listing 4.
Listing 4 Basic Deck class.
package domain;

public class Deck {
    public int cardsRemaining() {
        return -1;
    }
}
Now run the tests. You should expect to see a JUnit failure:
"expected:<52> but was:<-1>"
The failure is valuable feedback. It sets you in the right direction, which is to hard-code a return value of 52 from cardsRemaining. Sometimes a test will pass when you know it shouldn’t. That’s even more valuable feedback; things aren’t as they seem—a potentially very bad sign.
Make the change to Deck as shown in Listing 5. Going forward, I’ll show only relevant or changed bits of code.
Listing 5 Deck with hard-coded return value.
public int cardsRemaining() {
    return 52;
}
Rerun JUnit. You should expect it to pass. (Of course it does!) As the final step in getting this first test going, review the code we just introduced. Look for opportunities to improve the design by eliminating duplication and by improving comprehensibility. Here, the hard-coded literal is a magic number. The literal is also duplicated between the test and production (Deck) code. You can introduce a constant as a way of addressing the duplication problem, as shown in Listings 6 and 7.
Listing 6 DeckTest with SIZE constant.
public void testCreate() {
    Deck deck = new Deck();
    assertEquals(Deck.SIZE, deck.cardsRemaining());
}
Listing 7 Deck with SIZE constant.
public class Deck {
    public static final int SIZE = 52;

    public int cardsRemaining() {
        return SIZE;
    }
}
Rerun JUnit to ensure that we haven’t broken anything. This is a very important step!
You got the test passing, but then you went back and fixed problems you had just introduced. Most developers think of this as a risky proposition: Once they get code working, they don’t touch it. "If it ain’t broke, don’t fix it!" It’s this pervasive attitude that has led to the poor state of code in most systems. Here, you have tests. You have the opportunity to make code changes with impunity. You can simply rerun your test suite after each change to ensure that everything is still okay.
Let’s summarize the technique:
- Write a bit of test specification.
- Get the test to compile by providing stub production code.
- Run JUnit, expecting failure.
- Write just enough production code to get the test to pass.
- Run JUnit, expecting success.
- Fix any newly introduced design deficiencies in the code.
- Run JUnit, expecting success.
You’ll learn more about what to do in each of these steps as we go forward. The sequence is very short: at any point, getting back to a green bar should take from perhaps thirty seconds to five minutes. You’ll repeat this cycle many times over the course of a programming session.
This is how I actually build code. I make a bunch of very small, very rote, very simple steps, each providing some level of negative or positive feedback. Negative feedback isn’t a bad thing; it’s just an indicator that says I have a bit more work to do before I can consider that small step complete.
Going forward, I’ll provide the tests and production code in pairs, instead of stepping you through this very incremental development process. Were I to describe the entire development of the poker application in such detail, both you and I would be dead of boredom somewhere around the second installment. Just remember that the tests and production code don’t get baked all at once. The incremental feedback is key to making TDD work.