Back to the Test Lab
It's high time you reconsider test-driven development.
By Mike Gunderloy
July 01, 2004
"My application was at first fluctuating and uncertain; it gained
strength as I proceeded and soon became so ardent and eager that the stars
often disappeared in the light of morning whilst I was yet engaged in
my laboratory."
—Frankenstein, Chapter 4
Plenty has been written about test-driven development in the last year
or two. Indeed, I even tackled the subject in this space last
September, when I reviewed Kent Beck's book Test-Driven
Development: By Example. But a new book has brought me back to
revisit the subject sooner than I'd expected.
James W. Newkirk and Alexei A. Vorontsov have written Test-Driven
Development in Microsoft .NET (Microsoft Press, 2004), and it's
become the book I'd recommend as a first book on the subject. Not only
is it a reasonably easy read, but it covers topics that haven't been treated
extensively in other books.
From Dinosaurs to Mammals
Let's pause for a moment to review the basic lifecycle of test-driven
development. Here's how the authors lay out their cycle:
- Write the test code.
- Compile the test code.
- Implement just enough to compile.
- Run the test and see it fail.
- Implement just enough to make the test pass.
- Run the test and see it pass.
- Refactor for clarity and to eliminate duplication.
- Repeat from the top.
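The cycle above is easiest to see in code. The book's examples are C# with NUnit; what follows is a hypothetical Python analog using the standard unittest module, showing the end state of a few such cycles on a small Stack class (every name here is my own, not the authors').

```python
import unittest


class Stack:
    """Minimal stack, grown one failing test at a time."""

    def __init__(self):
        self._items = []

    def is_empty(self):
        return not self._items

    def push(self, item):
        self._items.append(item)

    def pop(self):
        # This guard was added only after a test demanded it.
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()


class StackTest(unittest.TestCase):
    # Each test below was written *before* the code that makes it pass.
    def test_new_stack_is_empty(self):
        self.assertTrue(Stack().is_empty())

    def test_push_makes_stack_non_empty(self):
        s = Stack()
        s.push(42)
        self.assertFalse(s.is_empty())

    def test_pop_returns_last_pushed_item(self):
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)

    def test_pop_on_empty_stack_raises(self):
        with self.assertRaises(IndexError):
            Stack().pop()
```

Run with `python -m unittest`; the point is not the finished class but the rhythm that produced it, one red-then-green test at a time.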
The first few books on TDD were content to explain this process in detail
and, perhaps, to show a simple example or three. But, by now, many developers
know the basics and we're ready for a second generation of TDD books that
dig into the subject more deeply. This is the first of those new, more
in-depth books.
No More Cherrypicking
The book starts with a few simple examples to set the stage and
to demonstrate the effective use of NUnit
(the authors' tool of choice; hardly surprising, as Newkirk helped develop
NUnit). There's a stack class with 14 different tests to drive its development
and a prime number sieve that needs to be refactored. But from there,
the authors rapidly delve into more advanced (and interesting) topics.
Some of the things they cover include:
- Testing database code
- Testing ASP.NET pages
- Testing Web services
- Testing user requirements
- Testing transactions
All require at least some careful thought and design, both on the test
fixture side and in the test itself. The authors show plenty of code (and
their complete examples are available on the Web), and they don't try
to water it down. You should expect to put some real work into understanding
how everything fits together as you get into the later chapters of the
book. Of course, you could just play "monkey see, monkey do"
and try to copy their code to test your own classes, but that rather defeats
the purpose of moving forward with your code slowly and deliberately.
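Database testing in particular shows why fixture design takes careful thought: each test needs a known database state to start from. The book does this with ADO.NET; as a rough sketch of the same pattern, here is a hypothetical Python analog that builds a fresh in-memory SQLite database in setUp() so no test depends on leftovers from another.

```python
import sqlite3
import unittest


class CustomerTableTest(unittest.TestCase):
    def setUp(self):
        # Fresh, isolated database per test: the heart of the fixture.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL)"
        )

    def tearDown(self):
        self.conn.close()

    def test_insert_then_read_back(self):
        self.conn.execute("INSERT INTO customers (name) VALUES (?)", ("Ada",))
        row = self.conn.execute(
            "SELECT name FROM customers WHERE id = 1"
        ).fetchone()
        self.assertEqual(row[0], "Ada")
```

The same discipline applies whatever the database engine: the fixture, not the test, is responsible for putting the world into a known state.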
Keep Your Tests in Tip-top Shape
One thing you might not have picked up from other TDD books and
articles is that TDD superstars pay as much attention to their tests as
they do to the code being tested. When you go back after a successful
test run to refactor, you shouldn't refactor just the application code.
After that's done, spend a bit of time looking over the test harnesses,
and see whether any of them need to be refactored as well. To some extent
your test cases are documentation of your application; they show all of
the things that it is designed (and guaranteed) to do. So it only makes
sense to make sure that the tests are as clear as possible, just as your
application code should be as clear as possible.
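A typical test refactoring is hoisting duplicated setup out of individual tests. In this hypothetical Python sketch (my example, not the book's), the construction that used to open every test has moved into setUp(), so each test now states only what it actually checks:

```python
import unittest


class Account:
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount


class AccountTest(unittest.TestCase):
    def setUp(self):
        # Shared fixture: previously repeated at the top of every test.
        self.account = Account(balance=100)

    def test_deposit_increases_balance(self):
        self.account.deposit(50)
        self.assertEqual(self.account.balance, 150)

    def test_withdraw_decreases_balance(self):
        self.account.withdraw(30)
        self.assertEqual(self.account.balance, 70)

    def test_overdraft_is_rejected(self):
        with self.assertRaises(ValueError):
            self.account.withdraw(500)
```

After the refactoring, each test reads like a one-line statement of intent, which is exactly what you want if the tests are to double as documentation.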
It's a well-known paradoxical result that test code tends to be longer
than the code being tested. In their first example, Newkirk and Vorontsov
end up with four times as much test code as application code. This just
emphasizes the need to keep your test code clean and understandable. A
rat's nest in test code will ultimately result in an application that
you can no longer test. And computer programming being what it is, that
will happen at the worst possible time: when the customer demands additional
features.
Those Grapes Were Sour, Anyway
My one qualm about this book is more of a qualm about TDD in general:
The authors really do let the tests drive the development. This is true
even when more traditional software engineering practices might suggest
a different design. For instance, having admitted that there's no good
technique for testing code behind ASP.NET Web pages directly, they design
their ASP.NET pages to be thin wrappers atop utility classes. This makes
more of the code testable, at the cost of some cohesion lost within the
pages themselves.
Another example: when they refactor nearly identical code out of two
methods, they remark, "We did have to make a change to the message that was
in the exception—it is more generic now, which is a small price to
pay for consistency in the code." Of course, you might feel otherwise
and prefer specific error messages to slightly less duplicated
code. If so, I'd recommend adding the techniques in this book to your
toolset, without necessarily adopting them blindly.
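The thin-wrapper design the authors apply to ASP.NET pages is worth seeing concretely. Their code is C# code-behind; as a loose, framework-free Python analog (all names hypothetical), the idea is that the "page" does nothing but delegate to a plain class you can test directly:

```python
class OrderCalculator:
    """All the logic lives here, testable without any web framework."""

    TAX_RATE = 0.08  # assumed rate, for illustration only

    def total(self, line_items):
        # line_items is a list of (quantity, unit_price) pairs.
        subtotal = sum(qty * price for qty, price in line_items)
        return round(subtotal * (1 + self.TAX_RATE), 2)


class OrderPage:
    """The 'page' is a thin wrapper: it only formats what the logic computes."""

    def __init__(self, calculator=None):
        self.calculator = calculator or OrderCalculator()

    def render(self, line_items):
        return f"Total: ${self.calculator.total(line_items):.2f}"
```

Everything interesting now sits in OrderCalculator, where a unit test can reach it; only trivial formatting remains in the untestable layer. Whether that trade is worth the lost cohesion is exactly the judgment call discussed above.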
If Not TDD, Then What?
Having some minor qualms isn't enough to keep me from recommending
TDD, though. I don't use the technique on every project, but it is slowly
sinking into my normal way of working. There are times (particularly when
the problem domain is familiar to me) that it feels more productive to
me to write a lot of code and then debug as necessary. But especially
when exploring new ground, TDD keeps the amount of time wasted in pursuing
dead ends to an absolute minimum—and that almost always translates
to better software faster. If you haven't tried it yet, I urge you to
pick up this book and give TDD a shot. It might just be the technique
you need to make your software projects come out better.
Have you tried TDD yet? Or are you waiting for the hype machine to
abandon it and go on to something else? You can get hold of me at MikeG1@larkfarm.com.
I'll use the most interesting comments in a future issue of Developer
Mike Gunderloy, MCSE, MCSD, MCDBA, is a former MCP columnist and the author of numerous development books.