Thursday, October 28, 2010

Am I Agile or Mini Waterfall? Do I care?

After our development team migrated to agile, we assumed that we were an agile test team.  After some time and belly button staring, we came to the conclusion that we were a mini-waterfall test team and not very agile.

We tested and delivered code every four weeks, and we went to scrum meetings.  It looked agile, but our mini waterfall approach was very traditional.  Programmers and product owners designed and implemented the product enhancements, and testers stood against the wall waiting for code to test at the end of the sprint.  We were second-class members of the development team.

The move to agile testing started by dealing with the denial that we were not already there and by trying to articulate what exactly agile testing is.  While not everything changed (we still analyze and test software), we changed our testing philosophy.  We found ways to change our role from tester to “developer.”  We took steps to own the early days of a sprint by taking on more of a business analyst role.  We engaged the product owners earlier and more often, changing from software watchdog to advocate for functionality and usability.  And we actively sought ways to provide immediate feedback throughout the sprint, reducing the end-of-sprint rush.

So, should you care if you are mini waterfall or agile?  Agile testing gave us something that we lacked with mini waterfall: a way to become full-share members of the development team.

Tuesday, October 26, 2010

How HP Failed BPT

HP failed Business Process Testing (BPT) when it rolled out the automated test framework several years ago, and it has not really done much since.

HP marketed BPT as a way for non-technical content experts to build automated tests.  The original tutorials showed an implementation where the non-technical content experts designed and built components (the reusable building blocks of tests) by simply stringing together objects and keywords.  Then, the content experts dragged the components together to build tests.  That tutorial demonstrated the functionality of BPT, but it failed to even give a hint about how to implement it successfully.

Non-technical content experts are the wrong group to use when building the infrastructure for a test system.  The system is good, but it is not automagic.

Non-technical content experts should never be the ones to decide the scope and purpose of components.  The components are way too important, and the approach is way too random.  There are too many ways to define a component, and if you don’t have a pattern or model in mind, you will end up with a bunch of components that are redundant, inefficient, and hard to maintain.  The strategy for designing components should be treated like a software development activity.  An automator should be involved here.

In addition to confusing the issue of who should design components, HP provided no advice on how to design them, and there is still a fair amount of confusion about this.  For example, should we build components that perform a business process?  That makes sense based on the name, “Business Process Testing.”  But, when you have a component called “Register a New Member,” you are limited to using the same positive flow.  How would you use that component if you wanted to verify negative tests such as what happens when you attempt to register a new member with an invalid social security number?  Business processes like this would make sense if you only wanted to test “happy path” cases.
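To illustrate the granularity problem, here is a sketch in Python rather than BPT keyword components.  The function names and the toy "app" dictionary are hypothetical stand-ins for components driving a real application, not anything HP ships; the point is only that fine-grained components can be recombined for negative tests while a monolithic "Register a New Member" component cannot:

```python
# Hypothetical illustration of component granularity.  The "app" dict is a
# toy stand-in for the application under test.

def enter_member_details(app, name, ssn):
    """Fine-grained component: fill in the registration form only."""
    app["form"] = {"name": name, "ssn": ssn}

def submit_registration(app):
    """Fine-grained component: submit the form and record the result."""
    ssn = app["form"]["ssn"]
    # Toy validation rule standing in for the real application's logic.
    valid = len(ssn) == 9 and ssn.isdigit()
    app["result"] = "registered" if valid else "error: invalid SSN"

def register_new_member(app, name, ssn):
    """Monolithic 'business process' component: happy path only."""
    enter_member_details(app, name, ssn)
    submit_registration(app)
    assert app["result"] == "registered"  # bakes the positive flow in

# The fine-grained components support a negative test the monolith cannot:
app = {}
enter_member_details(app, "Ann", "12-34")   # invalid social security number
submit_registration(app)
assert app["result"].startswith("error")
```

Because the monolithic component asserts success internally, any attempt to reuse it for the invalid-SSN case fails inside the component rather than verifying the error; the two smaller components cover both flows.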

After this confusing rollout, BPT languished without a lot of attention from HP or the testing community.  This has been really frustrating to those of us who use and see the value of BPT.  As I see it, BPT is the perfect tool for agile testing.  It is a great tool for getting the power of automation to all members of the team without being limited by the level of technical skills.

Monday, October 25, 2010

Testers Need Not Apply

The fault may lie with me and the job description I posted, but I am not seeing the candidates I would like to see for my agile testing team.  This is causing me to slow down and question what I want.

A few minutes of brainstorming and here is what I came up with (“up with which I came,” if you prefer):  
  • A question asker who makes no assumptions about requirements or how to test
  • A project manager who can herd developers
  • A negotiator adept at talking to gunmen in hostage situations and skilled at getting programmers to do his bidding
  • A questioner of authority
  • A nerd repelled by bureaucracy and unnecessary process
  • A get-it-done-no-matter-whose-job-it-is person who can own a project
  • A curious technician who knows how to build testing tools
  • A voracious reader and active learner who takes responsibility for professional skills and knowledge
  • A programmer and an automator (automation skills are not the same as programmer skills, but that is another story)
  • A lovable pain in the ass who looks for practical solutions, not the “process” that is always done
  • An optimist who has not been beaten down by bad experiences in conventional “QA” organizations

After reading a pile of resumes and talking with a number of candidates, I am not finding what I am looking for (or “that for which I am looking,” jerk).   I don’t think that my expectations are unreasonably high.  I want great, smart investigators who don’t know “no.”  Too many testers have been taught to limit themselves and to have low expectations.  Too many testers don’t take pride in professional knowledge and growth.

I am beginning to think that maybe I don’t want a “tester.”  Many testers have to unlearn what they know about testing before they can learn to be true agile testers.  They have to unlearn limits. That unlearning may not be worth the effort. 

Hopefully, I will find the right agile tester from the testing community, but I may look for a smart recent graduate – someone I can train and never allow to develop low expectations.

In testing, we often create our own limits. I hate that.

Sunday, October 24, 2010

The Magic of Automation Hour

We had a problem.  As a test team, we had two conflicting goals.  We needed to develop new automated regression tests, and we needed to work with developers and product owners at the beginning of the sprint.  Before introducing automation hour, we didn’t do either well. 

We develop software in four-week sprints.  In QA, our goal is for everyone to develop automated tests – we don’t want automation to be a specialized skill. 

Before we introduced automation hour, we found ourselves working on our backlog of automation projects after one sprint was completed – during weeks one and two of the following sprint.  When we focused on automation at the beginning of a sprint, we missed out on the planning and requirements discussions with programmers and product owners.  When we did become involved with the sprint work, we were behind and didn’t know the project well.  Weeks three and four were full-on testing efforts and our work on automation stopped.  We were never in sync with the developers.

About a year ago, we started automation hour.  Automation hour is one shared hour a day that everyone on the QA team sets aside to work on automation.  We now work on automation during all four weeks of the sprint, even the hectic third and fourth weeks.  During weeks one and two, we limit our automation work to that hour, which frees and encourages us to involve ourselves more fully in the opening days and weeks of the sprint, long before anything is ready to test.  We are able to keep automation hour during weeks three and four because our automation work is spread evenly throughout the sprint.

Automation hour has freed us to focus on the broader aspects of agile testing while continuing to grow our automated test coverage.

Saturday, October 23, 2010

Introduction

In this blog, I intend to discuss topics related to agile testing: manual testing, automated testing, innovation, and professional growth. 

Some of my perspectives are colored by my frustration with how QA/test teams typically operate: testers are second-class team members with minimal technical and professional skills who have little influence on the projects being developed.  Additionally, most automated test implementations are failures, either too simple or too complex to work.  Among other things, I will share our approach to automated testing using HP’s Business Process Testing (BPT).

I look forward to sharing ideas and perspectives with you and learning from your feedback…Bob