Wednesday, May 31, 2006

Taking things into consideration

I've been working on a new style of writing tests for a while. One thing we're finding at the moment is that you often want to narrow down the available objects to pick out a particular part, and then make a number of assertions about that same part.

For example, you might be modelling a garden, and you want to write some tests concerning a certain set of plants, maybe those that are yellow. Using the style that Tom White and I have discussed before, we could write something like:

assertThat(garden, has(flowers().coloured(yellow)));
assertThat(garden, has(flowers().in(flowerbed()).coloured(yellow)));
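For readers who haven't seen this style before, here is a minimal sketch of how such a chained assertion might be put together. All of the names here (Flower, FlowerFinder, GardenAssertions) are hypothetical stand-ins, not the real API:

```java
import java.util.Collection;

// A flower with a single property, for illustration only
class Flower {
    final String colour;
    Flower(String colour) { this.colour = colour; }
}

// A builder whose chained calls read like a sentence
class FlowerFinder {
    private String wantedColour;

    static FlowerFinder flowers() { return new FlowerFinder(); }

    FlowerFinder coloured(String colour) {
        this.wantedColour = colour;
        return this; // returning this keeps the chain going
    }

    // True if at least one flower in the garden matches the criteria
    boolean existsIn(Collection<Flower> garden) {
        for (Flower f : garden) {
            if (wantedColour == null || wantedColour.equals(f.colour)) {
                return true;
            }
        }
        return false;
    }
}

class GardenAssertions {
    // has(...) adds nothing mechanically; it is there for readability
    static FlowerFinder has(FlowerFinder finder) { return finder; }

    static void assertThat(Collection<Flower> garden, FlowerFinder finder) {
        if (!finder.existsIn(garden)) {
            throw new AssertionError("No matching flowers in the garden");
        }
    }
}
```

The point of the builder is that the test reads aloud as the requirement does: "assert that the garden has flowers coloured yellow".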

These check that there exist some yellow flowers; the second also narrows the search to a particular flowerbed. But what if we want to check that all the flowers in the flowerbed are yellow? We've come up with the idea of a consideration. We use a considering(...) clause to refine the scope of our assertions, and have also created an assertThatAll() assertion to express the universal quantification. The results look like this:

considering(flowers().in(flowerbed()));
assertThatAll(areColoured(yellow));
assertThatAll(areInBloom());
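As a rough illustration of the mechanics, a considering(...) clause and an assertThatAll() could be sketched like this. The Matcher interface and all the names are assumptions for the sketch, not our actual implementation:

```java
import java.util.ArrayList;
import java.util.Collection;

// Stand-in for whatever matcher abstraction the tests use
interface Matcher<T> {
    boolean matches(T item);
    String description();
}

class Considerations {
    // The set of objects that subsequent assertions apply to
    private static Collection<?> inScope = new ArrayList<Object>();

    // Refine the scope of the assertions that follow
    public static <T> void considering(Collection<T> items) {
        inScope = items;
    }

    // Universal quantification: every object in scope must match
    @SuppressWarnings("unchecked")
    public static <T> void assertThatAll(Matcher<T> matcher) {
        for (Object item : inScope) {
            if (!matcher.matches((T) item)) {
                throw new AssertionError("Expected all to be "
                        + matcher.description() + " but " + item + " was not");
            }
        }
    }
}
```

The consideration is just shared state between the clause and the assertions, which is what lets several assertThatAll() calls in a row apply to the same set without repeating the selection.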

I think this will be subject to some refinement as we put it to use in more places, and we'll see whether or not it is useful. We've certainly found a few places so far where we want to check a property for a whole set of objects, and writing an explicit loop isn't in keeping with our literate style.

Wednesday, May 24, 2006

Tying Tests to Requirements

On my current project, we've got a large requirements document with a lot of numbered requirements. We want to be sure that we've covered all the requirements in our automated test suite. Rather than going through by hand, checking off tests against the list, we had the idea of marking each test, in the code, with the requirements it covers.

As we're working with Java 5, it seemed appropriate to do this using Java annotations. We created an annotation that can be applied to test methods, parameterised by the requirement numbers.
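As a sketch, such an annotation might look like this. The name Requirement, the use of an int array, and the example test are all hypothetical, not our project's actual code:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Retention(RetentionPolicy.RUNTIME) // keep the annotation visible to tools
@Target(ElementType.METHOD)         // it only makes sense on test methods
@interface Requirement {
    int[] value(); // the numbered requirements this test covers
}

class WithdrawalTest {
    @Requirement({42, 43}) // hypothetical requirement numbers
    public void testWithdrawalReducesBalance() {
        // ... test body ...
    }
}
```

With runtime retention, both the annotation processing tool and ordinary reflection can read the numbers back off each test method.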

Even better, we were able to use the annotation processing tool to run over our test code, and produce a report of which tests test which requirements, and which requirements are not covered by any of our test cases. When I add a test, I can quickly regenerate the report. I have a feeling this is going to be very useful.
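We used the annotation processing tool for the real report, but the cross-referencing idea can be sketched more simply with plain reflection. Everything here is hypothetical, including the stand-in Requirement annotation, which is defined inline so the sketch compiles on its own:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.util.Set;
import java.util.TreeSet;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Requirement {
    int[] value();
}

class AccountTests {
    @Requirement({1, 3}) // hypothetical requirement numbers
    public void testDeposit() { }
}

class CoverageReport {
    // Collect every requirement number from 1..highest that no annotated
    // test method on the given classes claims to cover.
    static Set<Integer> uncovered(int highest, Class<?>... testClasses) {
        Set<Integer> missing = new TreeSet<Integer>();
        for (int i = 1; i <= highest; i++) {
            missing.add(i);
        }
        for (Class<?> c : testClasses) {
            for (Method m : c.getMethods()) {
                Requirement r = m.getAnnotation(Requirement.class);
                if (r != null) {
                    for (int covered : r.value()) {
                        missing.remove(covered);
                    }
                }
            }
        }
        return missing;
    }
}
```

The advantage of doing it with apt rather than reflection is that the report can be generated at build time, without loading and running the test classes.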

Literate Functional Testing

For a while now, I've been working on some ideas about writing automated tests that are readable even by non-programmers. This makes it easier to write tests in language that users or customers understand, helping to ensure that the requirements are agreed.

I've been working on this with Tom White, who has written up some good posts on the subject on his blog, talking about our use of constraints and our ideas about using anaphora.

I really meant to write these up myself, but haven't had time for much blogging lately. Thanks to Tom for his articles.
