Thursday, October 23, 2008
I'm getting my tutorial and track session ready for Agile Vancouver: "Much Ado About Agile".
This conference is put on by the all-volunteer Agile Vancouver user group, and they've put together an impressive lineup of both 'thought leaders' and practitioners, including Ken Schwaber, David Anderson, Jim Shore, Jonathan Kohl, Frank Maurer, David Hussman, Janet Gregory, Jennitta Andrea and many more. The tutorials are on Tuesday, Nov. 4, and the conference is on the 5th and 6th.
I'm presenting a tutorial called "Crossing the Chasm: Helping Testers Make an Agile Transition", aimed at testers new to agile and at teams wanting to help their testers get engaged in agile development. Agile development may have "crossed the chasm", but testers are often left stranded on the other side. This tutorial is designed to show how various members of the organization can help testers get the skills they need to be successful agile team members, and to help testers understand how to transition traditional testing activities to an agile setting. We'll do some group exercises so each participant can go back to their team and build bridges across the 'chasm'.
I'm also presenting a track session, "Are Agile Testers Different?" This material came out of questions I have heard a lot: If developers are writing tests, what do these testers do? What's so special about an agile tester? Do agile testers need different skill sets? How do I hire a tester for an agile team? I'll present a list of Five Principles for an Agile Tester that participants can use to learn new ways to add value to their organizations.
There are several other sessions on agile testing, including one on the Agile Testing Matrix by my co-author Janet Gregory, and one on Agile Requirements by Jennitta Andrea. Both Janet and Jennitta are awesome presenters, and both sessions should be quite valuable to testers and agile teams. In addition, Gerard Meszaros is doing a tutorial on Unit Test Patterns, and Mike Stockdale is doing an Automated Testing Clinic.
I haven't been to Vancouver but I'm really looking forward to seeing it, and meeting all the folks at the conference. It's worth traveling to get to this conference, especially if you haven't been to any agile or testing conferences recently. There's a lot of good new information to learn.
If you sign up for my tutorial, please send me any questions or issues you want covered in it.
Wednesday, October 15, 2008
An Example of How Testers Contribute
Here's an example to support my previous post about the "hidden resources" that may be lurking in testers but go untapped for whatever reason (lack of support from the team or managers, lack of time, lack of resources, lack of imagination).
Our web-based app manages all aspects of 401(k) plans, including doing trades. Lots of money moves around, but a lot of it moves around outside the app itself, in QuickBooks, via emails to a custodian, and so on. We, the development team, are quite expert in the 401(k) domain, and fairly lacking in the accounting domain.
We had a story three iterations ago to "tweak" a report so the accountant could use it to help track down the sources of problems when the cash accounts didn't reconcile. This was a small story, but because we never took time to really understand the big picture of how cash flows around the system (both inside and outside of the app itself), and really understand what the accountant needed to do, it grew into a giant story. There were plenty of issues that to me were smells. It was difficult to test because we didn't have production-like data for it and couldn't produce it; the results of running the report were erratic and not always explainable; it turned out that if certain batch processes ran at the same time, the data for the report would be corrupted and become useless.
Long story short, I raised all kinds of flags, but there was pressure from the business to just get this "little" story done. Although both the programmer and I spent time with the accountant trying to understand what was needed, we didn't even learn enough to know that we didn't know enough. We went through two iterations of fixing and re-releasing, to no avail. We wasted time in meetings discussing how to fix the report, when in truth the report was the wrong approach for what the accountant needed from the get-go.
I finally convinced everyone that we needed to take a step back and have the accountant explain to us the larger picture of how she has to reconcile accounts, all the different ways money can move around and the possible sources of imbalance, and what information she really needed to have. It was finally decided that a completely different report was needed. The time on this story was wasted, except for the lessons we learned about how little we understood about the accounting domain.
Was it my job, as the tester, to put the brakes on this story and insist that we start from scratch to understand the requirements? No, the programmer could have raised that issue, or the ScrumMaster, or the product owner (the poor accountant tends to get ignored, for no good reason other than accounting doesn't sell 401(k) plans, and she was already doing all she could to try to make herself understood). However, my testing of this report showed that there were some basic issues in the system itself that would have to be fixed before this report could work, so either we had to address those with new stories, or take a different approach to get the necessary data.
I feel that as the tester, I had a better opportunity to see the big picture of what was going on. I took some time to sit with the accountant and understand more about her domain, and what she needed for her reconciliation process. We testers can detect smells and provide information about them to the rest of the team. We can, as I did in this case, bring together customers and developers to re-think a problematic story. I didn't throw myself down on the floor in a fit and say "we can't release this report because I don't think it works". I explained the various issues and asked if key players could meet to understand the problem and find a new approach.
I think many testers would do just what I did, but some may feel it's "not my job", or not feel confident enough to call for a dramatic change in approach. How can we empower all testers to dig into problems and bring together all the people needed to solve them?
Wednesday, October 8, 2008
Testers: The Hidden Resource
I spent last week at STARWEST, and enjoyed being amongst the fine testers who are interested in learning new things and delivering value. I learned a lot myself - great Ajax testing tutorial by Paco Hope, fun six hat idea from Julian Harty, and more.
I didn't have as much time as I would have liked to join the interesting evening discussions at the Lost Bar (I had a lingering cough to tend to), but we did have a brief discussion of how to raise the standard of the testing profession. I feel there are too many people calling themselves testers who are basically punching a clock, waiting for work to fly over the wall at them, and doing some manual scripted testing. That gives testers a bad name.
Someone suggested maybe we should invent a better title than "tester" to attract good people. Personally I am proud to be called a tester, but then I think back to my mom, a career secretary who had a business degree and good business knowledge, but insisted on being called a 'secretary'. She felt it was an honorable profession, and the title was not demeaning. Despite herself, she became an executive assistant.
Yesterday it occurred to me that maybe the way to approach this is to get employers interested in mining the hidden resources of their testing and QA teams. Testers have a lot to contribute. We're good at talking to customers and understanding requirements. We're good at raising a flag when we see a problem or a gap in communication. (I just raised one yesterday, because our accounting person is not getting the software support she needs, because we don't understand her domain.) We're good at collaborating with developers to help them understand the stories and help ourselves understand the architecture and design. We're good at writing test cases, and good at exploratory testing that helps the whole business learn more about how to improve the software.
But most companies leave testers' potential untapped. This is a waste. What if managers sent all their testers to at least one good conference a year, or if that's not in the budget, at least to local user group meetings? What if managers budgeted time for testers to learn and improve their skills? What if managers helped break down barriers between teams and enable collaboration? What if managers supported testers and gave them a chance to show how they can contribute?
Some of us (such as yours truly) are pushy and we step up and do these things for ourselves, but there's no reason businesses shouldn't help testers help themselves, and in turn, contribute more value to their teams. When I've managed QA teams, I helped each team member set goals for professional development, gave them time to learn, bought them the books they needed, and got them the training they needed. I had amazing teams of self-motivated people who worked together well and collaborated well with customers and developers (and this was in my pre-agile days). I'm not saying I'm some genius of a manager; I'm saying that giving time, training, and support pays off in spades.
How can we get the word out to those that don't understand the huge hidden resource which may lie in their testing and QA teams?