Wednesday, September 17, 2008

Agile Acceptance Testing

Gojko Adzic has been doing some good writing on acceptance testing in agile development. See his article here: http://gojko.net/2008/09/17/fitting-agile-acceptance-testing-into-the-development-process/
I'm not sure whether Gojko considers acceptance testing part of what he calls 'normal agile development', but to me it's an integral part of development: I see coding and testing as two parts of a whole. I wrote a long posting in response to his post, and so as not to waste it, I'm including it here too.

Here's an example of the business-facing testing activities we do during the iteration. I wish I could do a nice graphic like Gojko.

Day before start of iteration - product owner goes over stories with us, we ask questions, get some examples, perhaps break a big story into thin slices, go away and think about it. Examples and overview may be put on wiki, along with any pictures/flow diagrams drawn. Product owner posts "conditions of satisfaction" for each story on wiki.

Day 1 of iteration - Retrospective, then iteration planning: we write and estimate task cards for all coding and testing activities for each story, until we have enough stories to keep us busy for at least the first few days of the iteration. Testing cards for a story might be "Define high level tests", "Write FitNesse test cases", "Manual exploratory testing", "Write GUI smoke test", and "Obtain test data". Then we demo the last iteration to customers, and release the last iteration's code.

Day 2-3 of iteration - Write high level test cases, which involves more discussions with customers, perhaps design meetings. Do paper/whiteboard prototyping with customers where needed.

Day 2+ - When a programmer picks up the first task card for a story, start writing detailed test cases, in FitNesse if that's possible for the story. When a happy-path FitNesse test passes, write more interesting test cases, collaborating closely with the programmer (sometimes it's the programmer writing the FitNesse tests).
When a testable chunk is available for exploratory testing, start on that. Show things to customers and get feedback whenever possible.
When all coding task cards are done, do exploratory end-to-end tests and automate GUI smoke tests where appropriate. (Or, if we're working in thin slices, do this for each slice.)
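To make the FitNesse step above concrete, here's a sketch of the kind of plain Java class that typically backs a FitNesse table (inputs as public fields, outputs as methods, in the style of a ColumnFixture). The story, class name, and pricing rules are all invented for illustration; they aren't from the post.

```java
// Hypothetical "shipping cost" story, sketched as a FitNesse-style fixture:
// each table row sets the input fields and checks the output method.
// The business rules here are made up for illustration only.
public class ShippingCostFixture {
    public double orderTotal;   // input column: order total in dollars
    public boolean rushOrder;   // input column: rush delivery requested?

    // Output column: rush orders pay a flat fee; otherwise orders of
    // $100 or more ship free, smaller orders pay a standard fee.
    public double shippingCost() {
        if (rushOrder) {
            return 15.00;
        }
        return orderTotal >= 100.00 ? 0.00 : 5.00;
    }

    public static void main(String[] args) {
        ShippingCostFixture row = new ShippingCostFixture();

        // Happy-path row written first: $120 order, no rush.
        row.orderTotal = 120.00;
        row.rushOrder = false;
        System.out.println(row.shippingCost()); // prints 0.0

        // More interesting row, added once the happy path passes.
        row.orderTotal = 120.00;
        row.rushOrder = true;
        System.out.println(row.shippingCost()); // prints 15.0
    }
}
```

Starting with the happy-path row and layering on edge cases afterward mirrors the collaboration described above: the first passing test proves the plumbing, then tester and programmer add the interesting cases together.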
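The post doesn't name a GUI testing tool, so as a tool-agnostic sketch, a smoke test can be as simple as asserting that the rendered page source still contains the handful of elements users depend on. Everything here (class, page fragment, element names) is hypothetical.

```java
// Hypothetical GUI smoke check: given a page's rendered HTML source,
// verify that every required fragment is present. A real smoke test
// would fetch the page with a browser-driving tool first; this just
// illustrates the checking step.
public class LoginPageSmokeTest {
    // Returns true only when every required fragment appears in the source.
    public static boolean smokeCheck(String pageSource, String[] requiredFragments) {
        for (String fragment : requiredFragments) {
            if (!pageSource.contains(fragment)) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        String page = "<html><title>Login</title>"
                + "<form id='login'><input name='user'/><input name='pass'/></form></html>";
        String[] required = { "<title>Login</title>", "id='login'", "name='user'" };
        System.out.println(smokeCheck(page, required)); // prints true
    }
}
```

The point of a smoke test is breadth, not depth: a few cheap checks per page, run against every build, so a broken screen is caught long before the exploratory end-to-end pass.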

We focus on completing one story at a time, in priority order. Not everyone can work on the same story at once, but the idea is that we're always trying to get stories done. We bring in more work as we have room. If we took on too much, we figure out what to set aside and focus on finishing.

So not all our integration and exploratory testing is done in the last couple of days; it's spread out more. On the last day, we're wrapping up the last story or two.

If we have a big theme coming up, we write cards to have brainstorming, research and design discussions for an iteration or two in advance of actually starting on the stories.

For my team, we've found that getting into too much detail in advance wastes time and leads to confusion. We do gather examples, but we keep the discussion fairly high level rather than diving into details.
