Easy Tiger – Don’t dismiss record and playback just yet

Again this week I’ve been reading blogs, forums and tweets from people dismissing record and playback as a viable automation option. Which is fine, provided reasons and justifications can be cited for not using it. But empty statements and reiterations of other people’s reasons don’t wash too well, especially in the complicated world of testing where context appears to be king.

Sure, it should probably never be used as a long-term automation strategy, but I’ve done a couple of projects recently where simple, low-tech, “dirty” record and playback has been the perfect choice. And here’s why:

  • The project had incredibly tight timescales
  • The project was unlikely to be re-run in the same guise, meaning a fully considered automation strategy could have been a waste of money
  • The testers didn’t have time to plan, build and utilise a full automation strategy
  • The appropriate skills weren’t available
  • Quick feedback and regression coverage were needed

Given that time was of the essence, I needed a quick and dirty way of smoke testing the UI and using automation to load data. I didn’t need long-term, dynamic, flexible and wide-ranging automation; otherwise I would have adopted a different strategy.

I needed a simple and quick smoke test that hit some key acceptance criteria, gave me confidence that core functionality was still working, and loaded some data at the same time.
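
For a flavour of what that looked like, here’s a rough, illustrative sketch of the kind of thing a recorded script boils down to once exported to code. It’s a minimal Selenium-style example in Python; the URL, element IDs and test data are made up for illustration and aren’t from the real project.

```python
# Minimal sketch of a recorded-and-exported smoke test (illustrative only).
# The URL, element IDs and test data are invented placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    # Recorded steps: open the app and log in
    driver.get("https://example.test/login")
    driver.find_element(By.ID, "username").send_keys("smoke.user")
    driver.find_element(By.ID, "password").send_keys("not-a-real-password")
    driver.find_element(By.ID, "login").click()

    # Crude oracle: the run fails if the expected page never turns up
    assert "Dashboard" in driver.title

    # Recorded data entry doubles up as data loading for later manual testing
    driver.find_element(By.LINK_TEXT, "New order").click()
    driver.find_element(By.ID, "customer").send_keys("Test Customer 001")
    driver.find_element(By.ID, "save").click()
    assert driver.find_element(By.CLASS_NAME, "confirmation").is_displayed()
finally:
    driver.quit()
```

Crude, brittle and throwaway, which is exactly the point: a few recorded steps, a blunt oracle, and some data loaded as a side effect.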

It was right for me. It gave me confidence. It showed up basic functionality that was no longer working. It wasn’t time-consuming or difficult to maintain. It took only 5 minutes to run. It did the job.

Think of record and playback as a tool the manual tester can use to help them achieve their testing goal. In essence, this was a project that had no automation, with a manual tester who used record and playback to lower the regression burden and load states. Does that make it sound more viable and appealing? The tester was using it to lower the burden and make their testing more efficient, not as an automation strategy or plan. Far more appealing now.

So when someone says that record and playback is wrong, costly and pointless, ask them to qualify why that’s so and under what circumstances. It’s always best to have a balanced view of these things. There’s a time and a place for all types of automation. And if that person has never used it, never worked under the conditions it can be suitable for, or simply prefers to spend time manually checking basic tests that a computer could be doing, then maybe their point of view should be taken with a pinch of salt. My guess is that point of view may also contain the words ‘best’ and ‘practice’.

Long-term automation with a framework and a key skillset is the way forward for most projects. However, record and playback still has its place, so don’t dismiss it just yet.

8 thoughts on “Easy Tiger – Don’t dismiss record and playback just yet”

  1. Oh BRAVO, Rob! It takes some guts to stand up and say “capture/playback” to this audience; I couldn’t agree with you more. We’ve got a sophisticated automation framework here (for all the “right” reasons), but I can and have used plain ole capture/playback for quick and dirty jobs and as a consultant, in those cases where the client didn’t have either the money, talent, or time to do things any other way. And remarkably enough, it worked just fine in those cases… - Linda

  2. Hi Linda, Thanks for the kind comments. Glad you find a need for record and playback too. Rob..

  3. A good plea for recognizing context. Like everything else, record/playback isn’t intrinsically a bad thing. I think the objection that most reasonable people make is towards its unreasonable application – that is, to believe that it provides thorough testing of a product. In a peer conference, you’d get a couple of context questions: 1) When using record/playback, what was the oracle? That is, was the sole criterion for determining a problem that the script didn’t complete? 2) What were the risks that you were trying to address? What sort of testing were the programmers (not) doing to address those risks? Cheers, —Michael B.

  4. Hi Michael, Thanks for the comment. In answer to your questions: 1. Absolutely. The script was a regression script and very basic. So if it failed then there was something wrong. Some functionality that used to exist now didn’t. Whether that is the software or the script is to be determined later, but something is clearly wrong. If it passed, it still didn’t prove that the software was right, but it added confidence that basic functions were still working. 2. In an agile environment code is always changing, and often in areas that, at first, might not be closely related to previously completed functionality. So the basic smoke test added confidence that functions were still working. The risk was that a small code change over here could seriously affect something over there. In this case there was little unit testing. The app was a prototype, so didn’t warrant the same overhead. When the project becomes ‘real’ then record and playback will be gone and replaced with an automation strategy to suit that context. I was also using the scripts to visually check the site. Instead of me clicking around constantly in different browsers, resolutions and operating systems checking for alignment issues, I can have a script do it for me. Cheers, Rob..

  5. The key thing you talked about is “appropriate time and place” for using automation in this manner. Or as Michael said, the context. The concern most of us have is that we do not want this to become the norm. Record/Playback does have its place for building skeleton scripts and quick-and-dirty tests that are limited use and/or throwaway. I totally agree with that. Even some of the “frameworks” are overkill for a lot of projects (they add their own level of complexity and usability/maintenance issues). The problem is that we also deal with a misrepresentation (fantasy) by the Tool Vendor sales people to our management that R&P is the “best” way to go and that any monkey can do it (automate testing). Bottom line is that this is a tool, and it has to be used correctly by someone who knows how in the first place. As the sayings go: “A fool with a tool is still a fool” and “It’s Automation, Not Automagic!” Finally, did you explain to other people & management on the project/team that this is why you did the automation work this way? Did they really understand why (or even care)? Hopefully you did not set a precedent for yourself that you will not be able to change later on.

  6. Hi Jim, Thanks for the comment. Indeed you are absolutely spot on with your comment; I especially like “A fool with a tool is still a fool”. In answer to your question about setting a precedent: I absolutely always make it clear why I use something and any limitations it will have. Managing expectations is one of the most important parts of my job… I think you are right to ask that though, because there are a huge number of testers out there who do use out-of-the-box R&P, and managers become easily amused, impressed and hypnotized by a UI that moves on its own after nothing but running through the app yourself. It therefore becomes standard, and before long you have an automation management nightmare. But. Any tester that implements R&P and doesn’t manage expectations is probably the same tester that believes holding an ISEB certificate makes them instantly skilled to do the job. As for the tool vendors. Well, they are a business after all and are just doing their job – selling. Any tester who is suckered into believing R&P will solve their automation problems will be the same tester who’ll buy the pointlessly complicated test management tool too. Blaming record and playback on salesmen is avoiding the painful truth – that too many testers out there don’t do their research, don’t say no to management and don’t manage expectations. A fool with a tool is still a fool. No doubt. But a good craftsman knows which tool is right for which job under which circumstance. A carpenter using a power saw when they should be using a hand saw has chosen the wrong tool for the job – it’s not the tool vendor’s fault – it’s a lack of understanding, experience and skill; just like a tester using R&P when they should be using a framework. I think the sales people are just doing their job. We, as testers, need to take more control over our own actions and be responsible for the decisions we make. And the only way to do that is to know what’s right for each context we work in, by planning, researching, proofs of concept and plain old trial and error. Maybe that’s something we should push back up the chain to ISEB. Do they teach how to choose the right tool under the right circumstance? Thanks for your feedback Jim. P.S. Is this the Jim Hazen of http://www.sqablogs.com/jimhazen? Rob

  7. Rob, Testing is Irrelevant, Shipping is Futile! Yep, that’s me. And I agree with your points in your response, Rob. I guess the only thing I can debate is that the sales people from the Tool Vendors are more interested in talking to the people with control of the checkbook: management. Thus they (management) are not informed/forewarned as we are and get sucked into the “Snake Oil” as James Bach talks about in one of his whitepapers. Then once that is done we get handed the tool and told to go get work done. In this respect it is our responsibility to educate management beforehand, and if it is after the fact then we need to say “Whoa, hold your horses!” (using a US phrase) and reset expectations and understanding. That I definitely agree on. But how do we (testers) get that knowledge and experience about the tools and their uses? Typically it comes from OJT and hard knocks (trial & error). Even in formal training classes you only get so much. Real world experience is the best teacher in this situation. And that takes time. Best to you. Jim

  8. Hi Jim, Yeah, you’re spot on there. Uninformed management decisions are often difficult to carry forward and be productive with. One of the many challenges some face each day. Very much enjoy your blog. Thanks again for the comment. Cheers, Rob
