I spent last week at STARWEST, and enjoyed being amongst the fine testers who are interested in learning new things and delivering value. I learned a lot myself - great Ajax testing tutorial by Paco Hope, fun six hat idea from Julian Harty, and more.
I didn't have as much time as I would have liked to join in the interesting evening discussions at the Lost Bar (I had a lingering cough to tend), but we did have a brief discussion of how to raise the standard of the testing profession. I feel there are too many people calling themselves testers who are basically punching a clock, waiting for work to fly over the wall at them, and doing some manual scripted testing. It gives testers a bad name.
Someone suggested maybe we should invent a better title than "tester" to attract good people. Personally I am proud to be called a tester, but then I think back to my mom, a career secretary who had a business degree and good business knowledge, but insisted on being called a 'secretary'. She felt it was an honorable profession, and the title was not demeaning. Despite herself, she became an executive assistant.
Yesterday it occurred to me that maybe the way to approach this is to get employers interested in mining the hidden resources of their testing and QA teams. Testers have a lot to contribute. We're good at talking to customers and understanding requirements. We're good at raising a flag when we see a problem or a gap in communication. (I just raised one yesterday, because our accounting person is not getting the software support she needs; we don't understand her domain.) We're good at collaborating with developers to help them understand the stories and help ourselves understand the architecture and design. We're good at writing test cases, and good at exploratory testing that helps the whole business learn more about how to improve the software.
But most companies leave testers' potential untapped. This is a waste. What if managers sent all their testers to at least one good conference a year, or if that's not in the budget, at least to local user group meetings? What if managers budgeted time for testers to learn and improve their skills? What if managers helped break down barriers between teams and enable collaboration? What if managers supported testers and gave them a chance to show how they can contribute?
Some of us (such as yours truly) are pushy and we step up and do these things for ourselves, but there's no reason businesses shouldn't help testers help themselves, and in turn, contribute more value to their teams. When I've managed QA teams, I helped each team member set goals for professional development, gave them time to learn, bought them the books they needed, got them the training they needed. I had amazing teams of self-motivated people who worked together well and collaborated with customers and developers well (and this was in my pre-agile days). I'm not saying I'm some genius of a manager, I'm saying that giving time, training and support pays off in spades.
How can we get the word out to those that don't understand the huge hidden resource which may lie in their testing and QA teams?
12 comments:
Saw your battlecry on Twitter - decided to stop by to say a few words.
There is a reason we have two hands. Building things is easier with two hands: the right hand swings a hammer while the left hand holds a nail. A potter pushes the clay wall out with one hand while the other presses in to support it and not allow it to go out of whack.
Or... a bridge over the river requires two sides to push against to retain its function.
Walking is effortless when one foot meets gravity firmly on the ground while the other floats to its next position for the next point of support... it is hard indeed to walk on one leg; one would have to jump.
The point is that the two are always there to push against each other to create balance and assure the integrity of structure and functionality.
The carpenter measures twice, cuts once, and nails the needed plank. It would be truly strange if he were only to cut and nail without testing the length needed.
It would be strange for a cook to make a meal without smelling ingredients to ensure freshness and tasting mixtures as he prepares the next stage of the meal.
So 'testing' - the problem is we don't see it as a complementary force in software development.
'Testing' helps to 'validate' the decisions that go into 'development'. The carpenter measures to validate the 'decision' to use this or that piece of wood. The cook 'validates' the 'decision' to use this or that amount of salt, or nutmeg. The potter validates her decision to push out the clay by the tension needed from the other hand. Those 'validations' are not called 'testing' because they happen in parallel, simultaneously, as we engage in making things.
So why does software development have this thing called a 'tester'?
Isn't it a waste of time to have her on the team? Surely the developer alone pressing on the 'code' will not destroy the potter's clay, surely nailing some large pieces of code together with some other code will not destroy the house, and surely causing a NullPointerException will not collapse the bridge. This is, after all, software: machines can be rebooted, threads can be killed and restarted, database records can be reinitialized. In software there is no 'penalty cost' at development time. So why would you want to have 'testers'? Just hire the few necessary to test at the end of development and that's it. Really, one question constantly on my mind is: why don't we look at 'testers' as that opposing force ensuring the balance of the work being produced?
With Test Driven Development we have a glimpse of why testing is important. The developer starts with a 'decision' that is 'unvalidated', in the form of a test. The validation comes as she writes a class and methods that satisfy the 'decision' offered by the test. In this way developers work like someone using both hands on a task, using the two opposing motions to arrive at the realized integrity of the intention given by a set of decisions. But you need 'testers' because they are experts in 'pushing back'. Developers alone can use TDD to validate the 'building', but 'testers' many times validate the 'intentions' of the building, pointing at things that a 'builder' may not see.
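To make that rhythm concrete, here is a minimal sketch in JUnit (the Discount class, its apply method, and the 10% rule are hypothetical, invented only for illustration): the test states the unvalidated 'decision' first, and the class written afterwards validates it.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// The 'decision', stated first as a failing test:
// orders of $100 or more get 10% off.
public class DiscountTest {
    @Test
    public void ordersOfOneHundredOrMoreGetTenPercentOff() {
        Discount discount = new Discount();
        assertEquals(90.0, discount.apply(100.0), 0.001);
    }
}

// The code written afterwards to satisfy (validate) that decision.
class Discount {
    double apply(double amount) {
        return amount >= 100.0 ? amount * 0.9 : amount;
    }
}

Run it red first (the test fails because Discount does not exist yet), then write just enough code to make it green. A tester's 'push back' adds the cases the builder never decided on at all: what should apply(99.99) or a negative amount do?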
Well, this is an interesting topic but I got to run. It is fun to write something like this from time to time. Cheers. marekj (aka testrus)
Yes, so there is lots of value testers can add - how do we get testers who aren't adding enough value yet to learn how, and how do we get their managers to realize there is untapped potential?
Wow Lisa! I am still hung up on the statement in your second paragraph: the assumption that there are people calling themselves testers who are just punching a clock and waiting for code to be thrown at them, and that this gives testers a bad name. In my career, I have yet to meet a team of testers that fits that description. Perhaps I am a lucky one.
Testers in my experience, whether they specifically handle manual or automated scripting or both, are highly involved in their work teams: identifying design, requirements, and risk and contingency issues with their dev teams; working with the customers, BAs, and project managers, coaching for optimal results; plugged deeply into the lifecycle every step of the way.
Testers should not rely on management to send them to conferences to 'elevate' the profession, wishing and hoping that managers would simply 'get it' and see the long-term benefits of investments in resources. I agree that good conferences help develop the skill sets of the attendees, as does independent study. However, I have been to a few conferences in the past where I would have liked my $2700+ back in tutorial and travel expenses, because the presentations were not only too high-level but somewhat impractical for the work I was doing at the time. A conference in San Diego some years ago comes to mind, not to mention an SDC in Orlando this past spring that seemed to be a never-ending sales pitch for the tools they develop.
Testers elevate their profession through their performance and desire for self-development. I see it as a perk to get a chance to go to a good conference to rub elbows with colleagues and share tips, tricks, and solutions as well as war stories. It's cathartic, and yet expensive too. So I understand why managers are not able, or perhaps unwilling, to send their testers to the latest conference du jour. Schedules have to be considered, as well as the tightening of budgetary belts in this economy. An investment in a resource who may just bail for the next pretty ship that sails also gives management justification to hold back on the professional development dollars. And we all know the movement in this industry.
As you know, I have a snarky little saying on my twitter account indicating that I am just a happy little testing junkie traipsing merrily through the code thrown over the wall. Yeah that's me. Am I a clock puncher? Far from it.
There's more to that profile statement than what you see on the surface. Those who know me well know that my profile statement is a farce as to the kind of tester I am. As a tester, and a very proud one at that, I take my job very seriously and tend to inject myself into every portion of the iterations I support (currently supporting 4 per month, actually). I read independently to improve my skill sets and serve on the process improvement teams to further develop our lifecycle process from each retrospective to the next. Someone in my work team once called me 'x2', because I tend to do the work of more than just one tester. I take software quality and testing very seriously, and I attend conferences of value on my own dime when time allows. If I left it to management to elevate my profession I would be sadly disappointed. As a result I have a tech library that has overtaken my house.
So I take matters into my own hands. Because I love what I do and the end result that comes from it.
When I was hired into my current gig they told me I wouldn't actually be the one testing the software for the tests I design. That's not how I roll. I saw great room for improvement and a huge disconnect in the role of testing within their definition of agile. It was a challenge I was willing to accept. I have sufficiently changed the mindset on what it is to test in this organization. I took the job knowing that the dev organization undervalued the role of testing. I knew I could change that; and with the help of my converts, we are making the improvements to turn this organization right around and tap the testers for what they can truly provide in terms of service and value.
Although I get the gist of your blog entry, I am still shaking my head over the clock puncher statement. Help me understand.
You ask "How can we get the word out to those that don't understand the huge hidden resource which may lie in their testing and QA teams?"
I think this is a serious question. Naturally, I try to raise the credibility of the test team whenever I consult with an organization. But your question made me think, what else can we do?
What about teaching testers how to promote themselves? Perhaps a good venue for this would be a tester user group.
I'd like to hear about successful tester user groups. Does anyone have one in their area that they'd recommend?
Since my company is currently going through some organisational changes, with a realignment of the methodology we use, we experienced something related to your blog entry.
During one of the success presentations, the head of our product development department stated that during development of the next increment of our product, they were surprised to see how much our people from the PQA department had been underestimated. They took on responsibility to the extent that they led a team with many developers under them. Therefore I can fully acknowledge your points concerning testing resources.
The interesting point here is that I keep asking myself whether I underestimate the four people under me as well...
T is very lucky if he/she hasn't come across testers who are just clock punchers. Lee Copeland is still giving his 9 Forgettings talk, in which he relates that most testers haven't read a testing book, don't surf the web to read the forums and blogs, etc.
Any replies to your blog will be somewhat biased, as the people reading it are sadly in the minority.
Some of it is down to managers not understanding software development; that's why Gerry Weinberg can write his Perfect Software: And Other Illusions About Testing book. Can we arrange for every manager to be given a copy of that book?
My personal story is that I had to train myself - a library of books and hours on the web - and was only offered training by management when I handed my notice in.
To T: I know lots and lots of great testers too. However, when I've been hiring testers, I can't tell you how often I've asked, in a phone screen, "What do you do for professional development?" and gotten dead silence. I get resumes from people who clearly haven't updated their skills in five years. Maybe that's why they're looking for a job, I don't know.
Over the past few years at conferences, I've talked to quite a few people who complain that they can't get their testers interested in participating in agile development at all. The QA teams hold their ground and refuse to cooperate, and say "I'm not touching that code until you are finished, and then only if you provide a requirements document ahead of time". They have their entry criteria and by golly, they're sticking to it. Maybe those people are skilled testers, but they don't want to change their approach at all.
I'm like you, I've never worked with people like that myself, but what other people have told me, plus these resumes and phone screens from the many times I've hired testers, tells me that these clock-punchers are out there. And common sense tells me that while programmers can't refuse to learn new technology or they won't be employable, "black box manual testers" can keep on doing the same old thing.
Kay, I was also thinking about a user group. What about a mentoring group of some kind? I'd be happy to give newbie testers advice on what skills they might want to develop, where they can find resources about different aspects of testing, that sort of thing. Could we have a user group of mentors and mentees? testing-career-building or something?
Hi Lisa,
A member of my QA team directed me to your post today, and it resonated with me because it is something that I've clearly seen reflected in my work experiences.
My solution has been, and still is, to "upgrade" my teams. In a world of Business Analysts, Project Management Professionals, and the like, it's easy for the title "tester" to seem like it's missing a certain luster reserved for those with fancier titles.
So, for this reason, my team has always consisted of "Quality Assurance Analysts."
You'd be surprised at the amount of respect that one simple change has restored to my teams over the years. It was as if we had achieved some mark of distinction in our area, or some hallmark of training, even though we were still the same testers.
People, especially managers in a large corporation, find a sense of comfort in buzzwords and titles, and unfortunately, tend to respect those with more of each than those with simple titles. For better or worse, that's been my experience. (For the record I've spent most of the last 12 years in one of the largest financial institutions in the U.S.)
The flip side is that there are larger expectations of my teams, not that this is necessarily a bad thing. We are expected to be more persistent with our clients, the development and design teams, etc. and to be more vocal when we see issues. That gives us more leverage and allows us to drive better quality throughout the process.
It doesn't solve all the issues that you raise in your post, but it certainly has paved the way for my teams to proceed down a better path.
To freebfrost: Thanks for sharing your success story. Why did you choose "Quality Assurance Analyst" as the title? I know "QA" has come to be a common term for people who do testing activities, but is it really what your team does - do you make sure the code is done right the first time? I think agile testers contribute to this - but it takes an effort of the whole dev team. Are we more quality control?
It's easy to get pigeonholed into just "testing," but what is it really that we do? As I like to tell people who ask me what I do for a living: "we break things." We break them so that others can fix them and produce a better product.
Now what is it that we break? We break code, for certain. But don't we also have input (or should have input) into the other parts of the equation?
As you said yourself, we are good at talking to customers and understanding requirements. I would extend that to include we communicate well and have a basic understanding of what our developers do also. My teams are engaged in the requirements gathering piece as well as working side-by-side with development. Do we necessarily understand their code line-by-line? No. But we know enough to be able to identify a problem area when a defect occurs and that helps the developer fix the problem in a timely manner.
So yes, there are some people who just "test" and likely are comfortable with the label "tester." I have higher expectations for my team and myself, and that includes injecting quality throughout the process and understanding how to do so. Hence, "Quality Assurance Analysts."
:)
As I once heard Cem Kaner say, we testers don't break software, it comes to us already broken. But the great thing about agile development is the opportunities it gives us to prevent any breakage from occurring.