Wednesday, July 22, 2009

Test Driven Development

I always need to take a deep breath when this subject comes up. I don't want to pretend that I know the best method of developing software. I deem it highly unlikely that there actually is one single best method. If you have found the best method, my opinion on the subject is obviously no longer relevant. If I had found the best method, I wouldn't be writing just a couple of sentences about it in a blog entry.

I did, however, work for a while as a test consultant and thus have some understanding of testing and QA. I also did some work on deriving test cases from formal specifications. And opinions start to take shape after a while. Opinions are like weeds in that respect. Come to think of it, opinions are also appreciated like weeds. Rather than writing a lengthy entry on my opinion on the whole thing, I'll provide a link to an article that captures the essence of my opinion in rather compact form: http://www.eiffel.com/general/column/2004/september.html . In case the link breaks, here it is in full:

Test or spec? Test and spec? Test from spec!

Which came first? As an intermediate result towards a more general research goal, it has recently been demonstrated that the egg precedes the omelet. A full proof falls beyond the scope of this modest EiffelWorld column, but here is the idea: you can construct the omelet from the egg, but not the egg from the omelet. (The chicken is covered by a separate lemma.) The reader will already have jumped mentally to an important special case; for omelet read test, and for egg read specification, particularly in the form of Eiffel contracts.

We are being told from some quarters that you can't specify anyway, and that development should be "test-driven". Eiffel programmers know from daily practice and from observation of the libraries that the first proposition is false: precise specifications in the form of contracts, even if they cover only part of the functionality, radically change the software development process. In the new scheme, tests play a critical role (see another little article of a few years back, "Quality first" at:
http://www.inf.ethz.ch/~meyer/publications/computer/quality_first.pdf which I believe already suggested a few of the good ideas of agile methods), but tests are not a replacement for specification: they follow from these specifications. It is indeed remarkable to see how the presence of contracts can drive the testing process. If a slogan is needed and two can do, I will venture "Contract-driven testing" and "Test-obsessed development". That works very well -- you should test all the time, with the intent of finding bugs -- but it's not a reason to drop specification and design. Specification and design are what propels both the testing process and the test cases themselves.

Going from specifications to test is one-way: you lose the abstractions. A specification describes the general properties of a computation, for all possible inputs; a test addresses one particular result for one particular input. From the general you can deduce the particular, but not the other way around: even a billion tests don't reveal the insight -- the abstraction -- of a specification. Omelets beget no eggs.

For an Eiffel programmer, testing consists of turning on contract monitoring, exercising components, and waiting for a contract to fail. Since any test oracle can be expressed by a postcondition, but no collection of test oracles can replace a specification, we lose nothing, but we gain a great advantage over naïve (contract-less) test-driven development: retaining the fundamental abstractions that underlie our programs and their individual modules.

In some later installment of this column focused more on research, I'll discuss how it is becoming possible to automate the component testing process completely, test case generation included. Suffice it for the moment to note that while we should be grateful to our extreme friends for helping to rehabilitate the much maligned process of software testing, we -- Eiffel programmers -- know something they don't seem to have realized yet: you can test from specs, but you can't spec from tests.

-- Bertrand Meyer
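
To make Meyer's asymmetry concrete outside Eiffel, here is a minimal sketch in Python. It is my own illustration; the postcondition decorator is a hypothetical stand-in for contract monitoring, not anything from the article or from Eiffel itself:

# A postcondition as a universal test oracle: the check must hold for
# every input, whereas a unit test pins down one input/output pair.
def postcondition(check):
    def wrap(f):
        def inner(*args):
            result = f(*args)
            assert check(result, *args), "postcondition violated"
            return result
        return inner
    return wrap

# The contract states properties of the result for ALL inputs:
# the result is a permutation of the input, in non-decreasing order.
@postcondition(lambda res, xs: sorted(res) == sorted(xs)
               and all(res[i] <= res[i + 1] for i in range(len(res) - 1)))
def my_sort(xs):
    return sorted(xs)

# A test is one point drawn from that specification; no pile of such
# points ever implies the general property.
assert my_sort([3, 1, 2]) == [1, 2, 3]

Exercising my_sort on arbitrary inputs while the assertion is enabled is exactly "turning on contract monitoring, exercising components, and waiting for a contract to fail"; going the other way, recovering the permutation-and-order property from any collection of passing tests, is not on offer.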


Use cases, and tests derived from them, are a great tool for exploration. A specification, or a mathematical proof for that matter, does not just appear out of thin air. It can start in many ways, for instance as an observation of a particular instance or scenario that inspires an idea. You can call that a use case. And some use cases can make you rethink what you thought was a good specification, just as a single counterexample can cause a mathematical proof to bite the dust. (As a side note, I did enjoy reading "Proofs and Refutations: The Logic of Mathematical Discovery" by Imre Lakatos: http://www.amazon.co.uk/Proofs-Refutations-Logic-Mathematical-Discovery/dp/0521290384 .) Test-driven development can be a useful tool for learning about the specification. But don't take it as an excuse not to even try to understand the fundamental abstractions that underlie the software.

As an example, I would like to reference http://bitsthatbite.blogspot.com/2009/07/five-orders-of-ignorance.html (wondering whether referencing my own blog entries within my blog entries will up my score in search engines). Obviously, for the test case with the lambda containing the multiplication, the original code works well. The code was more than likely developed with the multiplication operator in mind as the test case. However, the contract of the Fold method captures an abstraction beyond just multiplication: you can pass lambdas that apply other binary operators, such as addition, subtraction or division. It is easily shown that the original implementation does not handle every binary operator correctly; a sketch of the failure mode follows below.
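
Here is a minimal sketch of that failure mode in Python. The post in question uses different code; the seeded accumulator below is my own hypothetical reconstruction, not the original implementation:

# Hypothetical fold whose accumulator is seeded with 1. The seed happens
# to be the identity element of multiplication, so a multiplication-based
# test passes, yet the contract promises to fold ANY binary operator.
def fold(items, op):
    acc = 1  # bug: only harmless when 1 is the identity element of op
    for x in items:
        acc = op(acc, x)
    return acc

print(fold([2, 3, 4], lambda a, b: a * b))  # 24 -- the test we had in mind passes
print(fold([2, 3, 4], lambda a, b: a + b))  # 10, but 2 + 3 + 4 = 9: the seed leaks in

The multiplication test never exposes the seed; stating the contract -- fold the operator over the elements, and nothing else -- immediately suggests addition or subtraction as inputs that do.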

Also, don't forget the refactoring step in test-driven development. I have seen bad code. The images are still floating in my memory banks. Introduce as many variables as you can when you are not sure. Don't remove code you no longer use or shouldn't have written in the first place. Keep tacking onto the same method until it's several pages long. When you happen to be bored, make sure you spread your abstractions across as many classes as you can create within 30 minutes. Refactoring doesn't help at that point. Jumping off a bridge might, but that's rather final. The little trash can on your desktop might be your best friend. An unexplained hard disk crash on all development machines is about the only thing that can save you from explaining to management why the application needs to be reworked. In short: the fact that you plan to refactor later should not open the door to bad practice.

Here's my dilemma: the above story just doesn't go over very well as casual conversation at the coffee machine. I tried. Doesn't work. So if you ask me about test-driven development, I'll say: "It's marvellous. Sugar?"
