Wednesday, December 24, 2008

New! and Improved!

Now that our book is almost out (and it IS out on Safari Rough Cuts), I've decided to start a new website, where I will blog as well as post articles and interesting links. This Blogspot blog will be deprecated as I transition over.

Please visit the new site, lisacrispin.com, and comment. I'm developing it with agile practices and I need "customer" feedback!

My old website will remain the prime location for new action photos of the wonderfully agile miniature donkeys, Chester and Ernest!

Tuesday, December 23, 2008

A Plague of Updates

I have a basic distrust of OS updates. My coworker Joe used to religiously update his Windows box, and it died several times. I never used to update mine and it just kept marching along.

Now, I know that's bad; I need to do the updates for security and so on. So I am better about doing it on my PC and Mac. As for my Linux box, I once asked our sys admin Tony whether I should install the updates, and he said no. Then recently, he noticed I had over 100 pending updates and said I should update. I pointed out that he had recommended against that earlier, but he couldn't remember any reason he would have said that.

After updating, my network connection no longer worked. Tony suddenly remembered why he had told me not to update. He worked some magic and I was connected again.

That was only a couple of weeks ago. I had forgotten about the bad part of that updating, so yesterday when I noticed some pending updates, I let the box go ahead and install them. Even after I lost my network connection, I couldn't remember that the updates had caused the same problem two weeks earlier, so I had to call Tony over again.

This time I watched what he did, and when I twittered about it, Jeffrey Fredrick suggested that blogging about things you might forget is a good idea. So next time I let my Linux box do updates and can't connect to the network anymore, I'll remember to su to root, cd to /root/l1-linux-v1.2.40.2, and run make install. This is only needed for kernel updates. Note to self!

I'm trying to get organized in other ways too. Janet Gregory has her own personal storyboard, just sticky notes on the wall next to her desk. I've adopted this at work, and it really helps. I'm going to do it at home too; I'm thinking about using the microwave door for it!

Wednesday, December 17, 2008

Tester-Developer Ratios on Agile Teams

A question that came up yesterday in the Agile Testing Realities discussion, and that comes up a lot everywhere, is "What is the ideal tester-developer ratio on agile teams?"

Naturally, the answer is "it depends". Here are some factors to consider:
  • Are the developers doing a good job of TDD? Does the team use acceptance tests to drive development, also?
  • How good is the automated regression test coverage?
  • How good is the communication between developers and customers?
  • Is the application testing-intensive, or does testing go pretty fast compared to coding?
  • Do the developers help with testing activities such as automating acceptance tests?
My current team works with a financial services web app that is quite testing-intensive. The functionality is complex, and since we are dealing with people's money, we have to get it right. For some stories, testing takes more time than coding. Our programmers are quite experienced in TDD and our regression test coverage is excellent. While our developers and customers sit close to one another and communicate well, and the developers have a high degree of domain knowledge, our stories are often quite complex. A lot of tester time is devoted to working with customers to get examples of desired behavior and sort out technical challenges and other questions. The developers take on a lot of testing tasks, including automating FitNesse tests and often writing the FitNesse test cases themselves. We have five programmers and two testers, and even so, the developers often have to pitch in extra on testing.

I worked on another team of 20 or more developers (subdivided into smaller teams but all working on one project) where I was the only official tester. While the team did a good job of TDD, and we had good communication with customers despite the fact that we were a distributed team, this was clearly not enough. Fortunately, we were developing a non-critical internal application that would be used by a few expert users, so it wasn't the end of the world if the functionality wasn't perfect. One or two developers or business analysts wore a tester "hat" each iteration and performed testing activities. I worked hard to transfer my testing knowledge to everyone on the team, so that our unit tests covered more test cases, and we could automate at least some of the functional regression tests.

My third successful agile team consisted of 8 programmers and me, the tester. At first, the programmers could not get a handle on TDD or even unit test automation after coding, and I was buried. I asked our coach for help. He helped us figure out how to get traction on unit testing, by providing training and time to learn. Once the team mastered TDD, and pitched in on functional test automation, I was able to handle the other testing activities and we all worked well together.

Those are my stories on tester-developer ratio. Got any of your own to share? Please comment.

Tuesday, December 16, 2008

Agile Testing Realities

Today I participated in an Agile Testing Realities live Q&A discussion with Elisabeth Hendrickson, Tauseef Kahn and Tom Wissink. It was fun and interesting, especially hearing Tom's experiences working on huge government projects. (How fun would it be to work on the Hubble Telescope?)

I was frustrated, though, because there were so many excellent questions being posted, and we only got to a fraction of them. So, I've decided to start addressing some of the question areas here. If you posted a question in the session today and didn't get it answered, feel free to send it directly to me. Another good forum is the agile-testing Yahoo group.

Test automation is a big challenge for everyone and I noticed questions related to "How can I get tests automated with back-to-back, short iterations? I don't have time". Obviously, I can write whole book chapters on this subject, but here are a few key points for a successful automation strategy:

  • The whole development team, not only the testers, needs to tackle test automation.
  • Start by implementing a continuous integration and build process so you have a way to run your tests automatically, and get quick feedback from them.
  • Unit tests have the best ROI, so start test automation efforts there. For legacy code, try a lightweight automation tool and do simple smoke tests that don't have a lot of logic in them and are easy to maintain (there's a sketch of one after this list). You'll be surprised how many regression bugs they find.
  • Cut down the scope of work your development team commits to in each iteration so you make sure there is time to finish all test automation tasks. Don't put them off until the next iteration - that's the road to perdition.
  • Repeat this mantra, write it on your story board or whiteboard: No story is done until it's tested! This includes automating regression tests for it too!
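
To make the smoke-test idea concrete, here's a minimal sketch of the kind of logic-free check I mean, written as a JUnit 4 test. The URL and the "Log In" marker text are made up for illustration; point them at whatever page your own app serves.

```java
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

import org.junit.Test;

// A logic-free smoke test: is the app up and serving its login page?
// The URL and the "Log In" marker text are placeholders for illustration.
public class LoginPageSmokeTest {

    @Test
    public void loginPageIsUp() throws Exception {
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://localhost:8080/app/login").openConnection();
        assertEquals("expected HTTP 200 from login page", 200, conn.getResponseCode());

        // Also check for a marker string, so an error page that still
        // returns 200 doesn't slip past us.
        BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()));
        StringBuilder body = new StringBuilder();
        for (String line = in.readLine(); line != null; line = in.readLine()) {
            body.append(line);
        }
        in.close();
        assertTrue("login form should be present", body.toString().contains("Log In"));
    }
}
```

Hook a handful of these into the continuous integration build and they run on every check-in. Because they assert so little, they're cheap to maintain, yet they catch the "whole page blew up" class of regression bug.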
Test automation obviously has many more facets. What about exploratory testing, what about non-functional tests such as performance and security testing? We'll need more blog posts for those.

Tuesday, December 9, 2008

'Tis the Season

I just ran across this cool inspirational story.

HP has a "magic giveaway" where they give top bloggers $6,000 in HP products to review, and in turn, each blogger runs their own contest to give this equipment away.

Liz Henry's winner is Sara Moriera, a Portuguese professor who teaches computer engineering in East Timor. Sara is using the computers and other equipment to help young women in East Timor.

What cool women these are to be able to use technology to help educate other women who are in the most difficult of circumstances. What a difference the computers and other equipment will make for them.

Now that makes me feel this really is the season of giving (not that all year shouldn't be). Hooray for HP, Liz and Sara.

Tuesday, November 18, 2008

Much Ado

Agile Vancouver's Much Ado about Agile conference was terrific. Sold out at 220 participants. Agile Vancouver held "Agile 101" tutorials before the conference, so everyone in our tutorial and track sessions was pretty savvy. Lots of agile practitioners doing cool things in BC! I had a lot of fun, learned a lot, and enjoyed the nice BC wine that was presented to me by the organizers! Really a cool agile group there; I look forward to going back.

Vancouver is a beautiful place too, with amazingly friendly people.

This past Thursday, our manager, Trevor, did a really cool thing. He gave us a presentation on how our team has done for the past 6 months. Every group in our company sets goals every 6 months and evaluates how it did, and bonuses are partly based on whether we meet our goals. This is the first time we had such complete feedback on how we did.

Trevor started with a list of all the projects and features we have released in the past 6 months. We couldn't believe how much we did. When you focus on a few stories at a time, it's hard to remember the big picture. We delivered a half-dozen major projects and lots of smaller things. Many of them were highly complex, and our customers were happy with all of them. One major accomplishment of our IT team was to move our production servers to a new ISP with no downtime. Our company signed a major new partner. We met 100% of our goals!

The coolest thing was that Trevor gathered feedback from the business. Questions he asked them included:
  • Are we building features fast enough for the business?
  • Are the features we build high quality, and do they meet end users' needs?
  • What do you think of the volume, quality, and turnaround times for defects and production support?
  • What does the team do that makes your job easier, or derails you?
  • Are you getting the information you need from the team?
  • Do you see anything in our way that we don't realize?
  • Are there any additional tools or software your team needs?
The feedback wildly exceeded our expectations. In my next post, I'll go into some details.

Meanwhile, back at the Agile Testing book publishing effort - our book will be available December 26. See the news to the right for a coupon code. Also, you can see an announcement of it on my website.

Thursday, October 23, 2008

Much Ado About Agile conference - Vancouver

I'm getting my tutorial and track session ready for Agile Vancouver: "Much Ado About Agile".
This conference is put on by the all-volunteer Agile Vancouver user group, and they've put together an impressive lineup of both 'thought leaders' and practitioners, including Ken Schwaber, David Anderson, Jim Shore, Jonathan Kohl, Frank Maurer, David Hussman, Janet Gregory, Jennitta Andrea and many more. The tutorials are on Tuesday, Nov. 4, and the conference is on the 5th and 6th.

I'm presenting a tutorial called "Crossing the Chasm: Helping Testers Make an Agile Transition", to help testers new to agile, and teams wanting to help their testers get engaged in agile development. Agile development may have "crossed the chasm", but a lot of times, testers are left stranded on the other side. This tutorial is designed to show how various members of the organization can help testers get the skills they need to be successful agile team members, and help testers understand how to transition traditional testing activities to an agile setting. We'll do some group exercises to help each participant be able to go back to their teams and build bridges across the 'chasm'.

I'm also presenting a track session, "Are Agile Testers Different?" This material came out of questions I have heard a lot: If developers are writing tests, what do these testers do? What's so special about an agile tester? Do agile testers need different skill sets? How do I hire a tester for an agile team? I'll present a list of Five Principles for an Agile Tester that participants can use to learn new ways to add value to their organizations.

There are several other sessions on agile testing, including one on the Agile Testing Matrix by my co-author Janet Gregory, and one on Agile Requirements by Jennitta Andrea. Both Janet and Jennitta are awesome presenters, and both sessions should be quite valuable to testers and agile teams. In addition, Gerard Meszaros is doing a tutorial on Unit Test Patterns, and Mike Stockdale is doing an Automated Testing Clinic.

I haven't been to Vancouver but I'm really looking forward to seeing it, and meeting all the folks at the conference. It's worth traveling to get to this conference, especially if you haven't been to any agile or testing conferences recently. There's a lot of good new information to learn.

If you sign up for my tutorial, please send me any questions or issues you want covered in it.

Wednesday, October 15, 2008

An Example of How Testers Contribute

Here's an example to support my previous post about the "hidden resources" that may be lurking in testers, but not used for whatever reason (lack of support from team or managers, lack of time, lack of resources, lack of imagination).

Our web-based app manages all aspects of 401(k) plans, including doing trades. Lots of money moves around, but a lot of it moves around outside the app itself, in QuickBooks, via emails to a custodian, and so on. We, the development team, are quite expert in the 401(k) domain, and fairly lacking in the accounting domain.

We had a story three iterations ago to "tweak" a report so the accountant could use it to help track down the sources of problems when the cash accounts didn't reconcile. This was a small story, but because we never took time to really understand the big picture of how cash flows around the system (both inside and outside of the app itself), and really understand what the accountant needed to do, it grew into a giant story. There were plenty of issues that to me were smells. It was difficult to test because we didn't have production-like data for it and couldn't produce it; the results of running the report were erratic and not always explainable; it turned out that if certain batch processes ran at the same time, the data for the report would be corrupted and become useless.

Long story short, I raised all kinds of flags, but there was pressure from the business to just get this "little" story done. Although both the programmer and I spent time with the accountant trying to understand what was needed, we didn't even learn enough to know that we didn't know enough. We went through two iterations of fixing and re-releasing, to no avail. We wasted time in meetings discussing how to fix the report, when in truth the report was the wrong approach for what the accountant needed from the get-go.

I finally convinced everyone that we needed to take a step back and have the accountant explain to us the larger picture of how she has to reconcile accounts, all the different ways money can move around and the possible sources of imbalance, and what information she really needed to have. It was finally decided that a completely different report was needed. The time on this story was wasted, except for the lessons we learned about how little we understood about the accounting domain.

Was it my job, as the tester, to put the brakes on this story and insist that we start from scratch to understand the requirements? No, the programmer could have raised that issue, or the ScrumMaster, or the product owner (the poor accountant tends to get ignored, for no good reason other than accounting doesn't sell 401(k) plans, and she was already doing all she could to try to make herself understood). However, my testing of this report showed that there were some basic issues in the system itself that would have to be fixed before this report could work, so either we had to address those with new stories, or take a different approach to get the necessary data.

I feel that as the tester, I had a better opportunity to see the big picture of what was going on. I took some time to sit with the accountant and understand more about her domain, and what she needed for her reconciliation process. We testers can detect smells and provide information about them to the rest of the team. We can, as I did in this case, bring together customers and developers to re-think a problematic story. I didn't throw myself down on the floor in a fit and say "we can't release this report because I don't think it works". I explained the various issues and asked if key players could meet to understand the problem and find a new approach.

I think many testers would do just what I did, but some may feel it's "not my job", or not feel confident enough to call for a dramatic change in approach. How can we empower all testers to dig into problems and bring together all the people needed to solve them?

Wednesday, October 8, 2008

Testers: The Hidden Resource

I spent last week at STARWEST, and enjoyed being amongst the fine testers who are interested in learning new things and delivering value. I learned a lot myself - great Ajax testing tutorial by Paco Hope, fun six hat idea from Julian Harty, and more.

I didn't have as much time as I would have liked to join in the interesting evening discussions at the Lost Bar (I had a lingering cough to tend), but we did have a brief discussion of how to raise the standard of the testing profession. I feel there are too many people calling themselves testers who are basically punching a clock, waiting for work to fly over the wall at them, and doing some manual scripted testing. It gives testers a bad name.

Someone suggested maybe we should invent a better title than "tester" to attract good people. Personally I am proud to be called a tester, but then I think back to my mom, a career secretary who had a business degree and good business knowledge, but insisted on being called a 'secretary'. She felt it was an honorable profession, and the title was not demeaning. Despite herself, she became an executive assistant.

Yesterday it occurred to me that maybe the way to approach this is to get employers interested in mining the hidden resources of their testing and QA teams. Testers have a lot to contribute. We're good at talking to customers and understanding requirements. We're good at raising a flag when we see a problem or a gap in communication. (I just raised one yesterday, because our accounting person is not getting the software support she needs, because we don't understand her domain). We're good at collaborating with developers to help them understand the stories and help ourselves understand the architecture and design. We're good at writing test cases, and good at exploratory testing that helps the whole business learn more about how to improve the software.

But most companies leave testers' potential untapped. This is a waste. What if managers sent all their testers to at least one good conference a year, or if that's not in the budget, at least to local user group meetings? What if managers budgeted time for testers to learn and improve their skills? What if managers helped break down barriers between teams and enable collaboration? What if managers supported testers and gave them a chance to show how they can contribute?

Some of us (such as yours truly) are pushy and we step up and do these things for ourselves, but there's no reason businesses shouldn't help testers help themselves, and in turn, contribute more value to their teams. When I've managed QA teams, I helped each team member set goals for professional development, gave them time to learn, bought them the books they needed, got them the training they needed. I had amazing teams of self-motivated people who worked together well and collaborated with customers and developers well (and this was in my pre-agile days). I'm not saying I'm some genius of a manager, I'm saying that giving time, training and support pays off in spades.

How can we get the word out to those that don't understand the huge hidden resource which may lie in their testing and QA teams?

Wednesday, September 17, 2008

Agile Acceptance Testing

Gojko Adzic has been doing some good writing on acceptance testing in agile development. See his article here: http://gojko.net/2008/09/17/fitting-agile-acceptance-testing-into-the-development-process/
I'm not sure whether Gojko considers acceptance testing as a part of what he calls 'normal agile development' or not, but to me, it's an integral part of development. I see coding and testing as two parts of a whole. I wrote a long posting in response to his post, and so as not to waste it, I'm going to include it here too.

Here's an example of the business-facing testing activities we do during the iteration. I wish I could do a nice graphic like Gojko.

Day before start of iteration - product owner goes over stories with us, we ask questions, get some examples, perhaps break a big story into thin slices, go away and think about it. Examples and overview may be put on wiki, along with any pictures/flow diagrams drawn. Product owner posts "conditions of satisfaction" for each story on wiki.

Day 1 of iteration - Retrospective, then iteration planning: we write and estimate task cards for all coding and testing activities for each story, until we have enough stories to keep us busy for at least the first few days of the iteration. Testing cards for each story might be "Define high level tests", "Write FitNesse test cases", "Manual exploratory testing", "Write GUI smoke test", "Obtain test data". Then we demo the last iteration to customers. Then we release the last iteration's code.

Day 2-3 of iteration - Write high level test cases, which involves more discussions with customers, perhaps design meetings. Do paper/whiteboard prototyping with customers where needed.

Day 2+ - When a programmer picks up the first task card for a story, start writing detailed test cases, in FitNesse if that's possible for the story. When a happy path FitNesse test passes, write more interesting test cases, collaborating closely with the programmer (sometimes it's the programmer writing the FitNesse tests).
When a testable chunk is available for exploratory testing, start on that. Show things to customers and get feedback whenever possible.
When all coding task cards are done, do exploratory end to end tests, automate GUI smoke tests where appropriate. (Or if we're working in thin slices, do this for each slice).
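
As an aside for readers who haven't seen FitNesse: here's a rough sketch of what one of those detailed test cases rests on. A table on the wiki page maps to a FIT fixture class; the fixture below and its domain numbers are invented for illustration, not our real code.

```java
import fit.ColumnFixture;

// A FIT column fixture sketch (the names and numbers are invented).
// A FitNesse wiki page would exercise it with a table like:
//
//   |EmployerMatchFixture               |
//   |contribution|matchPercent|match() |
//   |1000.00     |50          |500.00  |
//   |0.00        |50          |0.00    |
//
// FIT sets the public fields from each row's input columns, calls match(),
// and colors the match() cell green or red against the expected value.
public class EmployerMatchFixture extends ColumnFixture {

    public double contribution;   // input column
    public double matchPercent;   // input column

    public double match() {       // calculated column, checked by FIT
        return contribution * matchPercent / 100.0;
    }
}
```

In real life the method would call into the application code rather than compute the answer itself - that's what makes the table a business-facing test instead of a spreadsheet.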

We focus on completing one story at a time, in priority order. Not everyone can work on one story, but the idea is we are trying to get stories done. Bring in more work as we have room. If we took on too much, figure out what to set aside, and focus on finishing.

So not all our integration and exploratory testing is done the last couple of days, it's spread out more. On the last day, we are wrapping up the last story or two.

If we have a big theme coming up, we write cards to have brainstorming, research and design discussions for an iteration or two in advance of actually starting on the stories.

For my team, we've found that getting into too much detail in advance wastes time and leads to confusion. We do try to get examples, but we try to keep the discussion pretty high level and not get too much into details.

Thursday, September 11, 2008

Much Ado About Agile! Agile Vancouver 2008

I am fortunate enough to have been invited to present at Agile Vancouver (www.agilevancouver.ca). They're calling their conference "Much Ado About Agile", and Ken Schwaber is giving a keynote. Here are just a few of the presenters: Jim Shore, David Anderson, Janet Gregory, Frank Maurer, David Hussman, James Grenning, Jennitta Andrea. It's scheduled for Nov. 4 - 6, and registration will open in the next few days.

I'm doing a track session called "Are Agile Testers Different?" and a tutorial on an iteration in the life of an agile tester. I'll only get to do the tutorial if enough people sign up, so if you're interested, please sign up early!

The best part of any conference is the informal connections you make, talking to people facing the same issues you face, getting lots of ideas and inspiration. This conference will be a wonderful opportunity for that kind of networking. And it's put on by a fabulous group of volunteers, who are going out of their way to give back to our agile community. Please consider it!
(And a note to U.S. residents - flying to Seattle and renting a car to drive to Vancouver is WAY less expensive than flying straight to Vancouver, plus, I think you still don't need a passport at the border if you drive, but I could be wrong about that).

Tuesday, September 2, 2008

Agile Book Writing

Janet and I turned in our "final" manuscript (still have copy editing, proofreading, etc. to go) last night. The book is available for preorder on Amazon - how do they know how many pages it will be???

One cool thing (at least, in our opinion) is that we used agile practices to write the book. A year ago, we got together and mind mapped the whole book. Janet came up with a release plan, and we started in on it. We had two-week iterations. At the start of each iteration, we mind mapped two chapters, then started writing. We used an FTP site to keep the latest version of each chapter, and to pass chapters back and forth. (This was important, because otherwise it would've been easy to lose track of the latest version, or 'walk on' each other's changes). We each had a folder - 'Awaiting Janet Review', 'Awaiting Lisa Review'.

At the end of each iteration, we 'released' two chapters - that is, we posted them on our book site for review by our volunteer reviewers. We used a spreadsheet on Google Docs to help our 'source code control', checking chapters in and out. We also kept a 'to do' list and notes for the bibliography on Google Docs.

We got continual feedback throughout this development cycle. We were able to visually share ideas with our mind maps (using MindJet MindManager software). We kept in constant touch with IM, and met a couple of times in person. We built a couple of extra weeks into the schedule during the holidays, and kept working at a sustainable pace (it's really hard to write a book when you have a full-time job!)

By March, we had all the draft chapters posted and informally reviewed. Janet put together a new release schedule to revise all the chapters in time to send our draft manuscript to the publisher June 1. We did a major refactoring of the book, moving sections around and completely reorganizing the automation section. We made our deadline, although I confess we were more 'agile' about our release plan: we worked on the chapters in a different order than planned.

We waited a few weeks for our official reviewers to give us their feedback, then we did the last, most punishing stretch, making our final revisions. We again did a fairly major refactoring, eliminating two chapters and folding the information in them into different chapters where it made more sense.

If we took even more time, we're sure we could make the book even better. In fact, we've gotten some ideas here at the end that we could have used. However, people have been clamoring for this book (which is gratifying) since we started writing it. Every time we go to present at a conference or help a team, people want to know when it will be out, and complain when we say "January 2009". We feel (and so does our publisher) that it's better to get some really useful information out earlier, rather than to keep polishing the book and get a more perfect book out later. I don't know about you, but that's how my agile development team works, too.

It's been really interesting to see how we were able to apply agile values, principles and practices to writing a book, as well as to developing software. Wonder what else could be done better with agile!


Tuesday, August 19, 2008

Alternative Metaphors for Technical Debt

Brian Marick (see link on side nav) has been blogging about new metaphors for technical debt. He used "fertile assets", a gardening metaphor, which I like. If you don't amend the soil and plant your seeds at the right depth and weed thoughtfully, you won't get a good harvest. You might garden rather lazily like me and use ground covers and minimum tillage, but those are still good practices and produce good results. But if you let all the weeds grow up, it might be hard to remove them without damaging the good plants. Just like code that's hard to change if you don't have automated tests to tell you something got damaged.

I like Brian's metaphor, but to me, it's too positive. If we don't protect our code with tests, and don't refactor continually, and just hack in code changes without thinking, we won't just have a bad harvest, we might ruin the soil so it can't grow anything again.

I think technical debt is more like global warming. Lots of people claim it's just a myth and that we can ignore it. If I, as a tester, try to use good practices but the programmers don't, it's like Denmark being green while China isn't. If everyone collaborates on a solution, emissions will go down quicker, and we will all breathe easier.

But, since global warming is a politically charged subject, it will get people arguing about that instead of thinking about how to make teams understand how shortcuts, poor design and lack of automation will grow an ever-larger burden for them to carry.

I'm now working on a metaphor that involves donkeys, perhaps log skidding. Recently my friends cut some small trees and branches near their creek bottom to make room for a bridge. We used the mini donkeys to haul the wood and brush over to the cabin's woodpile. If we tried to put too many logs in the bundle, or tied them so they dragged too much in the dirt, the donkeys had trouble pulling them up the hill. If we didn't take the time to tie them together well, they fell off. But when we took enough time to prepare optimally-sized, securely-tied, drag-dynamic bundles, we could make quick trips, and the woodpile grew fast. A little experimentation taught us that cutting corners didn't pay off.

Thursday, August 14, 2008

Testers can have an influence (so can anyone!)

We needed to do a story which had several tentacles, some into ugly parts of the legacy app. The story was basically to fix invalid data that's already out there, because business rules weren't enforced through the app.

Our product owner never wants to "spend points", so he wanted just the very basic things that would get the immediate need resolved, and leave some other parts of the process manual. That means humans have to remember a lot of detailed, obscure business rules. That means in a couple of years, we'd get right back to where we are, with invalid data.

We do this sort of thing fairly often. It usually starts as "OK, just to get this out the door we'll do parts A and B, and we'll do C next sprint", but we never do C because the business can work around it by asking for manual updates to the database, which they seem to feel are "free". In fact, the very problem we were solving was caused by the fact that we never did the second part of a two-part series of stories, so a system requirement was just flung into a black hole.

I'm "only" (as some would say, not me) a tester, but I made my case. With fairly minor and inexpensive (say, 6-8 hours each) to 3 pages of the UI, we would prevent a lot of requests for manual updates later, and there was a high risk that the person doing the manual updates wouldn't know all the rules and would make an error. We'd also present the end users with valid data instead of, well, no data.

The product owner didn't like it much, but the developers saw the same problems I did, and wrote task cards for the three extra changes.

No matter what your main activities are on a team, if you see an issue, raise it, and present the problems and potential costs clearly. If your team can listen to reasonable arguments, you may be able to show them your point of view, and they'll likely agree with you. This shouldn't be adversarial in any way, but you can use your passion for delivering value to good effect.

We testers aren't here to be the Quality Police, but agile development should be about doing the right things "right". When you smell a hack, point it out and ask if it's reasonable to take the time to do it right.

Sunday, August 10, 2008

Thoughts on a Sunday

Why can't we Americans get a grip on how to reverse our energy problems? Thomas Friedman has a great column today.

And we could all step back and take a more generous, common-sense, and not-deathly-serious view of the world, like the Unitarian Jihadists.

I'm missing Tim Russert today.

Go, U.S. Dressage Team and especially Steffen Peters. The Chinese need to get their silly Olympic website working right, BTW.

OK, that's all off my chest, back to working on the final manuscript revisions for the Agile Testing book!

Friday, August 8, 2008

Back from Agile 2008 - what did I learn?

Agile 2008 was hectic and I couldn't stay for all 4 days, so I know I missed out on lots of great ideas. However, I have the CD, and the phone-book-sized book of research papers and experience reports, which I can delve into as soon as we get done with our final manuscript revisions!

It's always so good to see the friends I only get to see once a year, as well as meet people I only know through the Internet. There were so many attendees this year that I didn't run into everybody I know, but it was great to see the ones I did.

I was reminded of how welcoming the agile community is in general, and what good people they are. One of my role models is Kay Johansen. She rocks because she's good at developing AND testing. She's both highly technical and great with people. Her workshops are always fun and super effective. She and her husband Zhon taught me mind mapping, which Janet and I have used to great effect to write our book. I'm looking at her materials from the "testing with a purpose" clinic she did at Agile 2008, and it reminded me that I don't focus enough on declarative testing, I get too involved in the "how" instead of the "what".

I met Patrick Wilson-Welsh this year because he asked me to help him with his tutorial on 'flipping the automated test triangle'. Patrick is such an engaging speaker, and he has a gift for fun group exercises - who else would think of having each group compose a Haiku about obstacles to automated unit tests? Like Kay, he is one of those quietly competent people, who is ready to learn from others. I loved hearing how he has been pairing with the developers at his current gig, letting them see he knows what he's doing, before helping them think of ways to improve. People learn a lot more from that than from some "expert" just coming in and telling them what to do.

And the best part about Patrick and Kay is they are two of the nicest people I know. So, I will strive to achieve their level of expertise and knowledge, while trying to help people as best I can, learning something from everyone.

Someone at Agile 2008 observed that there are lots of women at this conference, as opposed to, say, OOPSLA, and wondered if it was because agile creates an environment that women enjoy. I think there's a lot of truth to that. Most women I know, myself included, like the people side of things; we like collaborating with people. Sitting in a corner writing code by myself isn't my idea of fun. If agile development draws more women into our profession, so much the better.

Friday, August 1, 2008

Agile 2008

Agile 2008 starts next Tuesday. Janet Gregory and I will do a tutorial Tuesday morning on "The Tester Who Came In from the Cold". We designed this tutorial in response to meeting so many teams where the programmers got trained in TDD and CI, the project managers all went to ScrumMaster training, but the testers were just left wondering what they were supposed to do on this new agile team. We'll give examples of how managers can help testers "come in from the cold", as well as how testers can help themselves. We have a cool exercise planned, we're pumped.

On Wednesday, I'll help Patrick Wilson-Welsh with his super tutorial on how to get your "test pyramid" right side up. Most new agile teams have the most test automation at the GUI layer, because those tools are usually easy for testers to learn, and the least at the unit level, because TDD is hard to learn. Patrick has added to Mike Cohn's test pyramid metaphor with a "three little pigs" reference. The unit level base of the pyramid is bricks, the behind-the-GUI middle layer is sticks, and the top GUI layer is straw. Of course you need GUI test automation, but it's usually the highest maintenance and the lowest overall ROI. He'll demo examples of tools for each layer. We'll both share stories of how our teams inverted their test pyramids. If time permits, we'll have a "micro open space" for participants to brainstorm how they'll go back and get their pyramids flipped around!

Please let me know if you'll be at Agile 2008. I'll only be there Tuesday and Wednesday (many conferences this year, a full time job and not enough vacation time!) but I'd like to catch up with as many friends (old and new) as I can. I'm especially looking forward to catching up with all the great agile Canadians (particularly those from Calgary!)

I've been to every XP, XP/AU, Agile Development Conference and Agile conference that's been held, starting with XP 2001 in N.C. I'm glad I'll make it to Toronto! (I love Toronto too, wish I had more time to spend there...)

Thursday, July 24, 2008

Optimizing with a Remote Team Member

Last December, our manager moved back to India. He's still a VP and still writes code, although he's not our direct manager anymore.

Although he arranges his hours to accommodate us, working until almost midnight his time, there are still those hours where he's working and we're asleep. Sometimes he's left with no work to do, especially the first day of the sprint when we haven't planned our work yet.

This sprint we're trying an experiment. Our sprint starts Friday. On Wednesday morning, we had a meeting with the remote team member (let's call him "G"), our ScrumMaster, the product owner, our manager, and me (a tester, so I could get the feel for how much testing might be involved). The product owner had come up with three stories to "earmark" for G. We discussed the first story at some length, as it's rather complex and involves a business partner. We didn't discuss the other two stories. G felt the first story would keep him busy for a few days.

He proposed to write his task cards on his Friday morning, and go over them with the rest of the team in our sprint planning. (For all of you who are down on task cards, I can see why, but they're working for our team.) We'll have to write the testing task cards then, too.

Today (Thursday), we had our "pre-planning" meeting. We went over the stories earmarked for G, plus the other stories that might be in this sprint. Tomorrow we will plan the sprint.

The three stories for G add up to almost as many points as the whole team does in a sprint. Now, G is a superman, for sure, but that seems kinda crazy. I have raised this issue. What if G stays busy for several days with Story 1, and other developers on the team free up? Do they have to keep their mitts off G's other stories?

But it's just an experiment. We'll see how it goes. Do you have a distributed team? How do you make sure everyone always has enough work? Please comment.

Friday, July 18, 2008

Agile Tester Skills

A StickyMinds column by Johanna Rothman about exploratory testing on agile projects prompted a discussion about what skills agile testers need on the agile-testing mailing list.

Janet Gregory posted this response, I thought it was insightful.

Johanna talks about the perfect tester skills but acknowledges generalists without coding skills may have a place. I have seen a few teams where none of the testers had coding skills, but worked with the developers to create the automation framework using FIT or some other similar tool.
If the team works closely, and develops a framework that the 'non-coding' testers can use easily, the testers can be very productive. I do agree that it helps the team if the testers can code, but it is not absolutely necessary. Because the whole team is responsible for quality, the whole team usually can solve any problem.
I think that some of the most valuable skills a 'tester' can bring to the table, are critical thinking and understanding of the big picture - implications of proposed changes. Exploratory testing helps to expose issues if testers have those skills, and can use them properly.

She adds in a later post:
I didn't take it one step farther which is to say, that many of the "manual" testers who started using the spreadsheets for Ruby/watir implementation, eventually got comfortable enough to start understanding the code and actually doing some of their own scripting. That shows initiative and proves that everyone is capable of learning :-)
I do think it is absolutely beneficial to understand scripting and coding, and I do encourage learning those skills. I think testers will go so much farther if they do. However, it is important that we don't discourage those testers that don't have any coding skills when they first come to an agile team. I guess those are the testers I work with mostly.

I've worked with "manual" testers who were happy to learn new skills such as automation when given the time and support they needed. I've met so many teams where the testing team freaked out when the development organization adopted agile. Telling testers they need these skills they've possibly never had the chance to acquire is not going to help with the transition.

I also know lots of testers who had no interest in learning new skills or improving. People like that won't fit on an agile team, for sure. But anyone who gets excited about agile and wants to work on an agile team is worth a bit of investment, in my book.

Tuesday, July 8, 2008

The "Not My Job" Game

Expanding on last week's post...

The NPR news quiz show "Wait Wait, Don't Tell Me" always includes a segment called the "Not My Job Game". A celebrity has to answer questions about something completely unrelated to their own profession. Sometimes, through logical thinking, intuition or both, they're able to guess the correct answers.

Agile team members play this game every day. OK, maybe the questions or tasks aren't completely unrelated to our normal job, but we don't worry much about whether something is really our job or not. If it needs doing, we do it.

My team is about to embark on a theme to rewrite the code that produces account statements we send out for every 401(k) plan participant. We have several outstanding issues about the statements, and I wasn't sure if they are bugs or stories, as we are going to rewrite the statements anyway.

I asked the product owner and the head of plan administration for a quick meeting. We discussed each issue, and decided that the product owner would write new stories to ensure they are addressed. We'll probably need an engineering meeting to talk about potential solutions for one or two of them. Then we'll estimate the stories and we'll have a better idea when we'd better start on this theme, because it has to be done in time for the 3rd quarter statements.

Was it my job to worry about writing stories and organizing this meeting? Some people might say that was the job of the ScrumMaster, product owner, coach or manager. I didn't want these issues to fall through the cracks. Nobody here thinks it's weird when a team member takes on a task such as this.

This is one thing I love about agile development. Each of us is empowered to take the action needed to make sure we can deliver the most business value.

Thursday, July 3, 2008

What an Agile Tester Does

This has been kind of a crazy week. I've had to multitask a lot, which I know is bad and inefficient. Here's a sample of the types of tasks I've done this week:
- Write high level test cases
- Automate tests in FitNesse
- Debug problems with Canoo WebTest GUI tool and tests
- Look into three different production issues
- Manual exploratory tests of various stories
- Participate in estimation meeting
- Spend a lot of time getting requirements and clarifications from various customers
- Help team strategize best way to make sure we get everything completed this sprint

If you're a tester on an agile team, are those in line with the things you do? Are you surprised by any of them?

I think some people might not expect a tester to work on production issues, but as testers, we tend to learn a lot about the business domain. It's natural for business people to come ask me where something is documented on our wiki, or how a piece of functionality is supposed to work (at least, how we delivered it - maybe we were wrong about how it should work!). I get interested in production problems because I want to know why we didn't run across a particular issue in testing. Working on production issues isn't in my job description, I suppose, but it helps me learn a lot about the app and what we can do to improve it.

I know a lot of agile testers who do similar tasks. I'm interested in what other things agile testers do.

Monday, June 30, 2008

CITCON Takeaways

Sounds like CITCON Melbourne was a huge success. Reading about it made me think back to CITCON Denver in April, and what my team is doing differently as a result of what I took away from that. The biggest concrete change is our sys admin is working on moving to Hudson instead of CruiseControl. He's setting up a new build machine with it.

Another takeaway was how intrigued I am with behavior-driven development and tools such as Easyb, although I'm not sure yet how or whether to apply it here. It did affect my thought pattern when writing test cases, although I know BDD is supposed to be about coding and design, not customer-facing tests.

SO MUCH FUN!

The donkey boys (our miniature donkeys) did real work this weekend, hauling brush and logs from a friend's creek bottom to their woodpile. Driving is a much bigger challenge when you have to go up and down steep banks, and navigate around cactus and willows. My skills improved, but I need to get a lot better! Will post photos when I find time...

I suppose I should think of a metaphor to connect this to agile testing in some way, but it was just plain fun, and a great break from so many months of hard work on the book and on presentations. Now, back to work!

Friday, June 27, 2008

Improving

Last sprint was tough: we didn't get finished and we couldn't release. Fortunately this wasn't a big deal, as there was no big deadline to meet. However, we only finished one story - that's not good. And it's only about the third time in four years that we didn't finish all our stories.

We had a long discussion about what went wrong and how to do better. One thing was that although we identified 6 "steel threads" (aka thin slices, tracer bullets) in our big UI story, we didn't finish the third thread until the 7th day of the sprint - by which time four more threads were also mostly done. It was incredibly confusing to try to test: we couldn't tell what should work and what shouldn't, and we missed a couple of gaps in the features. Plus, unit tests were being written after the fact.

Here's our stop/start/continue list that resulted from our retrospective:

Start:
  • Transfer whiteboard drawings from conference room to Scrum arena, and check off threads as they are done
  • First thread of first story has to be done by first Tuesday of Sprint
  • Log remaining hours on task card (in Mingle) prior to Scrum
  • Test first
Stop:
  • Working more than one thread ahead of unfinished thread
We know the steel thread concept works for us - we just have to do it properly. UI stories are harder to do test-first, but we suffer when we don't. There were lots of reasons we got off track last sprint, but we think these changes will get us back on track.



Wednesday, June 25, 2008

Even Agile Testing Can Be Frustrating

We're having a hard sprint. We thought the stories were pretty straightforward, but they turned out to be hard, and there are also a lot of interdependencies. We did our steel thread exercise, but for whatever reason, the first thread took many days to finish and that put everything way behind. It will be an interesting retrospective.

I've been multitasking from one story to another and run into roadblocks, both in my manual testing and my automated testing. FitNesse fixtures don't work like I expect, and they're hard to fix. We're replacing old code with new code, but don't want to rip out the old code yet just in case we can't release. So that's confusing. I feel like I just can't get anything really done.

There's much testing left to do, and only one more day left in the sprint. We rarely have sprints like this, and when we do, I have a hard time feeling productive.

Plan of action: Try to quit multitasking, and focus on getting one thing done. Hope for lots of good luck!

Monday, June 23, 2008

Pass-It-On Grants for Women in Technology

If you or anyone you know might be interested in one of these grants, please check it out. If you're a woman in a technology field and not already on the Systers mailing list, check that out too.

The Anita Borg Systers Pass-It-On Grants honor Anita Borg’s desire to create a network of technical women helping one another. The grants, funded by donations from the Systers Online Community, are intended as means for women established in technological fields to support women seeking their place in the fields of technology. The grants are called “Pass-It-On” grants because they come with the moral obligation to “pass on” the benefits gained from the grant.

Pass-it-on Grants are open to any woman over 18 years old in or aspiring to be in the fields of computing. Grants are open to women in all countries and range from $500.00 to $1000.00 USD. Applications covering a wide variety of needs and projects are encouraged, such as:

  • Small grant to help with studies, job transfers or other transitions in life.
  • A broader project that benefits girls and women.
  • Projects that seek to inspire more girls and women to go into the computing field.
  • Assistance with educational fees and materials.
  • Partial funding source for larger scholarship.
  • Mentoring and other supportive groups for women in technology or computing.
For more information, go to:

Pass It On Grants

Thursday, June 19, 2008

Interdisciplinary Awareness

One of the most interesting sessions I attended at Better Software was "You Just Don't Understand Me: Interdisciplinary Awareness to the Rescue" by Mike Tholfsen, principal test manager of the Microsoft OneNote team. (See his blog at http://blogs.msdn.com/onenote_and_education/) He presented a "Team Pyramid" (don't we love all these pyramids?) showing that for a successful team, you need trust as your base. Results are the little top of the pyramid; they come from trust, healthy conflict, commitment, and accountability.

Mike feels that one way to achieve this is to help people understand their peers' viewpoints better. He introduced an exercise to help people in different roles on the team understand the important traits of each discipline, and trade ideas on what teammates in other roles like or dislike about each discipline.

An interesting point of the presentation was that there was a development team where all the team members scored the same on a Myers-Briggs style test. The manager had hired himself four times. Lack of diversity is not a good thing.

This made me think about the times we've hired a tester onto my agile team. I brought in testers who were great at communication, collaborating with customers, and exploratory testing, but without a lot of skills on the automation/technical side. I felt they could contribute value, but my developer coworkers gave them thumbs down. In each case, we hired a very techy tester (fortunately, also good at all the other things).

Mike's presentation made me realize that the developers were most comfortable hiring someone like themselves. This is understandable, but not always in the best interests of the team. Having realized this will help us in future hiring efforts, I think. Do we really not like something about the candidate, or is it just that they're different than we are? Could the differences make us better?

There appears to still be a lot of controversy in the agile community over testers and their role on the team. Interdisciplinary awareness might help agile teams realize that people with a different skill set might add tremendous value to their team.

Friday, June 13, 2008

Better Software

I attended Better Software this week. Each time I go to a conference such as this, I am pleasantly surprised that more and more people are on agile teams and have a good understanding of agile principles, values and practices. There are also plenty of people wanting to learn about agile.

In my class on "Ten Principles of an Agile Tester", a QA manager described an interesting problem (to which I had no ready answer!) The programmers in her company have decided to implement agile, but they think they can do it all by themselves, in isolation. They feel they don't need the testers, analysts and all the other groups to go along. From what I understood, the programmers felt they could still just hand off the code to the QA group when they were finished, and they apparently believed they could communicate with the customers directly and get the requirements.

The QA manager is understandably frustrated with this situation. She has tried repeatedly to sit down with the programmers and figure out how the testers can work with them, and show them how the testers can add value. They aren't cooperating.

What to do? I was once in a situation where I joined a team that had never had testers, only developers, although they did have BAs. They wouldn't let me join the daily standup, they wouldn't include test estimates in story estimates or in iteration planning, and they declared stories "done" before they were tested. I argued that I should be a part of the team, to no avail. After we missed the release date because the testing wasn't finished, I asked the coach if we could try things my way, and work together all as one team, everyone taking responsibility for testing, not considering stories done until they were tested. He agreed to try it for the next release cycle. We released on time and the customers were happy. So there was no more debate.

In my experience, sometimes people have to feel the pain before they'll change. I know for sure that evangelizing doesn't work. (Blogging feels a bit like evangelizing which is one reason I never did it before).

I'm interested in your comments: if you've ever had a situation like this, what did you do?

BTW, a couple of posts back I mentioned that we were tracking our refactoring tasks in our DTS, in our online story board tool, and in our Wiki. We have decided to try tracking them only in the online story board tool, as task cards, and add some fields so we can find them easily and include all the information needed before and after doing the task. We'll see how it works! Experimenting is good.

Friday, June 6, 2008

The Future is Now

I heard Ray Kurzweil on Talk of the Nation Science Friday today. As the NY Times put it last Tuesday in the Science section (The Future Is Now), he's a futurist with a track record. He observes that certain aspects of technology follow predictable trajectories. Computing power first doubled every three years, then every two, now every year. IT is revolutionizing biology, medicine, energy and other fields. Nanotechnology, gene sequencing and brain scan resolution all progress exponentially.

I'm no futurist, but I think we are seeing this in testing as a result of agile development. Since the late 90s, tools such as JUnit have led many teams to use practices such as TDD and CI that vastly improve software quality. Getting programmers interested in testing and test automation has led to an explosion of useful open-source testing-related tools.

We testers (and when I say that, I mean all you programmers and everyone else who tests) have been left out in terms of IDEs and other tools that would make writing and automating tests way faster and easier. That's changing now and I bet it will change really fast. Maybe in a year or so we will have fabulous tools for writing and refactoring tests, tools to help us focus better on exploratory testing, maybe tools for types of testing we haven't thought of yet. The Agile Alliance functional test tools group headed by Jennitta Andrea is pushing an effort to get to a new generation of functional test tools.

I think the future of software testing is just about here.

Speaking of the future being now, Janet Gregory and I turned in our draft manuscript for our Agile Testing book last Sunday night. Whoohoo! Four lovely people will be doing technical reviews of it, and then we get to revise it some more, and it's supposed to go into production in August. I hope that means it will be out in the fall. I wish I had a way to express our gratitude to the many people who have helped us, and in turn agile testers and teams everywhere, with this book.

Thursday, June 5, 2008

What to Track Where

We are in the midst of a "refactoring" sprint. This is an opportunity to do tasks that reduce technical debt, and not have to deliver any stories for the business. We're upgrading Spring, log4j, Canoo WebTest, refactoring user searches to use a custom DTO for searching, removing an unnecessary and confusing column from a table (which requires a ton of code changes), and stuff like that.

How to keep a record of what we changed, so that later, someone can say "When was it that we removed that tax id column and changed all that code?" and we can find it quickly? We have a wiki page for each sprint, and up to now, we wrote all the refactoring changes on the sprint wiki page. However, now that we can't use physical cards (because one team member is remote), we have more tools to track things. We put all our refactoring "cards" in a "refactoring" database in our DTS. When we started this sprint, we wrote cards in our online story board (Mingle) for all the refactoring cards we might do (which seems like extra work, but Mingle has some visibility advantages over the DTS).

As we started to log all the refactoring changes actually made on the wiki, it began to seem like double work. In the spirit of DRYness, we want to pick one place to track the refactoring. The DTS and Mingle seem like the most likely options, but we should pick one or the other.

I'm trying to get a consensus - watch this space for what we decide. If you have solved a similar problem, please post a comment!

I haven't figured out the answer to yesterday's dilemma either, about some automated way to verify our test schema whenever we refresh it from production. So far this blog seems to be about questions, rather than answers.

Wednesday, June 4, 2008

Finally!

I've been meaning to start a blog for ages. Now that Janet and I have finished our draft manuscript, it feels like there might be time for a post here or there. Thanks to Dawn Cannan for inspiring me with her great new blog.

So how about something interesting to start out with. Today there was a thread on the agile-testing Yahoo group about defect tracking systems (DTS), and someone asked for an example of a time when information from a DTS was useful. Today our test system got an error that seemed mighty familiar, but I couldn't remember the cause from when it happened before. I was able to find it easily in the DTS: it was a missing trigger. Periodically we replace one of our test schemas with a fresh copy of production, and in the process, triggers get disabled or lost (not sure why). If I hadn't been able to find this in the DTS, it would have taken longer to track down the problem.

Someone replied to this post wondering why I didn't have a test to detect the problem. The regression suite that runs in our build uses a different schema that has canonical data. However, I could easily have run a regression suite against the new copy of production and found the problem. I'll do that from now on, but how to remember to do it? Manual steps are easily forgotten...
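
One idea (just a sketch, not something we've built): make the check itself a test that runs right after the schema refresh, so nobody has to remember. Assuming an Oracle schema, where disabled triggers show up in the user_triggers dictionary view, it could be as simple as this; the driver class, JDBC URL and credentials are placeholders.

```java
import static org.junit.Assert.assertEquals;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.junit.Test;

// A sanity check to run right after refreshing the test schema from production.
// Assumes Oracle (user_triggers is Oracle's data dictionary view); the driver
// class, JDBC URL and credentials below are placeholders for illustration.
public class SchemaTriggerCheck {

    @Test
    public void noTriggersAreDisabled() throws Exception {
        Class.forName("oracle.jdbc.OracleDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@testdb:1521:TEST", "testuser", "secret");
        try {
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery(
                    "select trigger_name from user_triggers where status = 'DISABLED'");
            // Collect any disabled trigger names so the failure message says which ones.
            StringBuilder disabled = new StringBuilder();
            while (rs.next()) {
                disabled.append(rs.getString("trigger_name")).append(' ');
            }
            rs.close();
            assertEquals("disabled triggers: " + disabled, 0, disabled.length());
        } finally {
            conn.close();
        }
    }
}
```

This only catches triggers that are present but disabled; comparing the trigger names against a checked-in list would also catch ones that disappeared entirely.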