For me one of the most difficult challenges I have faced as a tester is the move from a traditional project methodology to an agile one.
Adopting agile is tricky for a manual tester, and often it is the testers who offer the most resistance when teams make the move. Stories about testers being negative, throwing their toys out of the pram and generally being a bad egg are common.
And I completely understand why.
When I made the transition from traditional to agile it felt like my face was melting and my mind was bursting.
It was the toughest challenge of my career. I hated those first few weeks, wondered whether I still had a role in the team, contemplated a change of career and felt completely and utterly undervalued. I was terrified that this was the future of software testing and that I didn’t get on with it.
For a tester, it’s not just about doing the same work in a different order or with tighter time constraints; it’s about changing your outlook on testing and how you fit into the team. It’s about redefining your role (and your skills) and evolving to stay relevant. You need to make a mental shift that at first seems completely alien. A shift that seems so very wrong.
In the end I just let go, took the rough with the smooth and worked at seeing what all the fuss was about. And here’s what I found out.
The focus of the whole team shifts to quality
- You will become the quality expert, no longer the person who just tests at the end
- You may need to devise tests with little to no formal documentation…fast
- You will need to feed back your test results rapidly
- You will need to be confident, vocal, capable, responsive and communicative, often taking charge and leading on quality
- The rest of the development team will come to you early for feedback on their tests and code
You will bridge the gap between the business and the techies
- Your role should now mean you liaise closely with the customer. You will need to adopt a customer satisfaction role
- You will help to define the stories and acceptance criteria – these will become your tests and guidance, so your input is essential
- You will have to report findings about quality to the customer and stakeholders… fast, timely, accurately and with diplomacy
You will need to put your trust in the Product Backlog
- Traditional projects with 100 requirements often end up delivering a large percentage of those 100, but with poor quality, misunderstandings and incomplete features
- Agile projects with 100 requirements at the start *may* end up delivering only 60. But these will be complete, exactly what the customer wanted and of course, be superb quality.
- This original number of 100 may grow and shrink with changing markets and business decisions. Trust the backlog.
- The customer will define and decide the next sprint of work for your team.
- You will simply advise, manage expectations and communicate
- This is a tough one – letting the customer decide what to do next….
- You will need to consider the longer term and bigger picture, but your main focus is the sprint in hand
You will need to increase your exploration and automation
- You will need to replace the tedious, checklist type manual tests with automation if possible.
- Your regression suite will get too large unless you make the most of automation and get the basics covered.
- The only other option is to hire a load of undervalued and demotivated testers to simply ‘checklist’ basic functionality.
- Your automation should be integrated with your continuous integration and automated build deployments.
- This constant quality feedback is key to success as it allows you more time to do exploratory testing.
- It also gives you added confidence in your new features.
- Elisabeth Hendrickson summed up agile testing very nicely indeed (taken from her ruminations blog – http://testobsessed.com/):
- Checking and Exploring yield different kinds of information.
- Checking tells us how well an implementation meets explicit expectations.
- Exploring reveals the unintended consequences of meeting the explicitly defined expectations and gives us a way to uncover implicit expectations. (Systems can work exactly as specified and still represent a catastrophic failure, or PR nightmare.)
- “Checking: verifying explicit, concrete expectations”
- “Exploring: discovering the capabilities, limitations, and risks in the emerging system”
- A negative side effect of increased exploration is how you go about managing the test information.
- Most mainstream test tools don’t support recording exploratory testing sessions, and the agile process tools are just as poor.
- An idea might be to use Guidance Level Test Ideas in a traditional tool (http://pac-testing.blogspot.com/2009/03/normal-0-false-false-false-en-gb-x-none.html), notepad, various session tools or write your own tool.
- Excel is also often overlooked.
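To make the checking side of this concrete, here is a minimal sketch of turning a manual regression checklist into automated checks that a CI server can run on every build. The `authenticate` function and the account names are invented stand-ins for whatever your application actually exposes; the shape of the checks is the point, not the feature.

```python
# A toy stand-in for the system under test. In a real project you would
# call your application's API or UI driver here instead.
def authenticate(username, password, locked_accounts=frozenset()):
    if username in locked_accounts:
        return "locked"
    return "ok" if password == "s3cret" else "denied"


# Each item from the old manual checklist becomes an assertion,
# so the "basics" are covered on every single build.
def test_valid_login():
    assert authenticate("alice", "s3cret") == "ok"


def test_wrong_password():
    assert authenticate("alice", "guess") == "denied"


def test_locked_account():
    assert authenticate("bob", "s3cret", locked_accounts={"bob"}) == "locked"


if __name__ == "__main__":
    for check in (test_valid_login, test_wrong_password, test_locked_account):
        check()
    print("all checks passed")
```

Once checks like these run automatically in CI, the feedback arrives without you lifting a finger, which is exactly what frees up time for exploratory testing.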
You will need to drop the concept of test case preparation and spec analysis
- It’s unlikely you will get a detailed spec.
- The acceptance criteria become your test cases and design.
- The software becomes the UI design.
- As you test and develop, fine tune it and get it right.
- There’s no way your upfront design will be 100% accurate (http://parlezuml.com/blog/?postid=760).
- Use wireframes if you must mock. http://www.balsamiq.com/ is awesome for this.
- If you must write a test plan, plan for the sprint only.
- Don’t assume you know how or what you will be testing in three sprints time.
- Prepare to be dynamic in your tool selection, approach and thinking to testing. You may need to change your tools to cater for new information.
- Don’t be too prescriptive.
- Add a quality toolsmith to your team. They will save you a fortune in the long run.
- Invest time in researching free, open source or cheap tools.
- The more tools you know of, the more likely you will be able to respond to changes.
- Don’t even consider what are supposedly Best Practices.
- Do what is right for your team, on that project and at that moment in time.
- Trust me, letting the stories and software guide the UI and design is a revelation. It’s just tricky changing your mindset to accept this.
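One way to picture acceptance criteria doubling as your test cases and design is to treat the story’s criteria as data that directly drives the checks. The story, criteria and `apply_discount` function below are invented purely for illustration; in practice they come straight off your backlog.

```python
# A hedged sketch: a story's acceptance criteria used directly as test design.

STORY = "As a shopper I can apply a discount code at checkout"


# Toy stand-in for the feature under test.
def apply_discount(total, code):
    return round(total * 0.9, 2) if code == "SAVE10" else total


# Each acceptance criterion pairs its plain-English wording with a check,
# so the criteria ARE the test cases.
ACCEPTANCE_CRITERIA = [
    ("a valid code reduces the total by 10%",
     lambda: apply_discount(100.0, "SAVE10") == 90.0),
    ("an unknown code leaves the total unchanged",
     lambda: apply_discount(100.0, "BOGUS") == 100.0),
]


def story_is_done():
    # The story only counts as "done" when every criterion passes.
    return all(check() for _, check in ACCEPTANCE_CRITERIA)


if __name__ == "__main__":
    print(STORY, "->", "done" if story_is_done() else "not done")
```

Keeping the wording and the check side by side also makes it easy to report back to the customer in their own language.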
You will need to get over the obsession with defect stats and metrics
- Working software is fundamental. It’s what the end goal is.
- Each sprint you aim to deliver releasable standard software that meets the acceptance criteria.
- So along the way there is less emphasis on raising and recording every single defect in a tracking system.
- It’s more about shouting over to the programmer and getting it sorted between you.
- Look at low tech dashboards as a way of reporting metrics
- Defects that relate to the acceptance criteria and story under test mean the story is not done (even if it has been coded and the programmer has moved to a new story).
- Defects are no longer used to cover our backsides or blame other people.
- Defects that aren’t related to the story should be on the backlog, where the customer can prioritise.
- After all a defect is a piece of functionality that either exists and shouldn’t or doesn’t exist and should.
- Let the customer decide what to do with them.
- They may be more or less important to the customer than you think.
- If you truly must report, then do it in the lightest way possible. My guess is that if you really are having to report each and every defect encountered, along with test case metrics and stats, in a formal way, then someone in the process has not truly bought in to agile.
- Note: I’m not saying be slack with defect tracking and reporting.
- Far from it, if you need to put a defect on the backlog for the customer then you need to consider how you will describe this successfully for that audience.
- When shouting to the programmer it’s often easier as you can show them the defect in action.
- The people you report to, the information you report and the way you report it changes.
After getting my head around these differences and new concepts I noticed a few unexpected side effects:
- I was re-ignited with my passion for software testing
- I was being consulted far more on quality issues meaning I spent less time complaining and raising obvious bugs after the software was dropped
- I started to use my creativity and critical thinking in a rapid and responsive way, rather than testing a spec and thinking of a few edge cases up front.
- I was being engaged and used for my creativity, skill and critical thinking
- I started to work in teams where the whole team valued quality rather than an ‘over the wall’ mentality.
- I noticed that the customers were far happier with the process. They got to control the focus of the work and ended up with software that met their needs at that moment in time, not the software they thought they wanted 6 months ago.
- I lost a huge amount of negativity and became more positive, motivated and accommodating.
- I spent far less time sitting around after raising a barrel load of defects.
- I no longer waited for the triage–fix–build–release–retest–close cycle.
- I got them fixed asap, released asap and retested asap.
- My job didn’t feel futile. I felt I was adding value.
Now I know some people have frustrations with agile, and there will be teething problems for every new team. Agile really may not be suitable for all types of work, but there are certainly some awesome principles and techniques we can all learn from it.
If you have any agile testing stories to share then please let me know in the comments.