Test Retreat – Bournemouth


Last night was the Bournemouth Test Retreat organised by Rajya Bhamidipati (@peppytester).

Redweb office

The event was hosted at Redweb’s awesome social bar with drinks and nibbles provided. I was particularly interested in the UX lab for conducting usability testing.

Jealous of the UX Lab at Redweb

We had Adrian Howard (@adrianh), John Stevenson (@steveo1967), myself (@Rob_Lambert), Rajya (@peppytester), Danny Dainton (@DannyDainton), Dave Wardlaw (@DaveyBoyWardlaw) and Lee Harvey (@LeeHarveyLand) turn up for the session.

Drink and retro games at Redweb

We chatted about testing, the role social media is playing in bringing communities and people together and why it’s so hard to find good testers. We then did an exploratory session; a 30 minute session with the intent of finding the maximum price a web site would give us for some parking. We got some pretty good numbers.

Redweb’s awesome hospitality

We had copies of cheat sheets, test design books and various other notes to give us ideas and inspiration whilst testing.

Exploring the site

It was great to see everyone digging around and exploring the product. I paired up with Danny to explore the app. The guy is unbelievably passionate about learning about testing. It’s always great to pair with someone so eager to learn.

Exploring the site

After the session we did a debrief that was regularly diverted onto interesting side topics such as writing clear notes, knowing when to stop the session, knowing when to stop testing around a feature and having an environment set up and ready to go (i.e. Firebug, Fiddler, Burp Suite etc).

Rajya and John during the debrief
Dave Wardlaw and Danny Dainton during the debrief

Overall it was a great event with loads of good ideas being shared.


Thanks to Rajya for organising it. Thanks to Redweb for the use of their chill out bar. Thanks to John for leading the sessions and thanks to everyone who turned up to chat about testing. See you at the next one.


EuroSTAR 2013 – The Unofficial Guide

It’s here.

The Social Tester’s Unofficial Guide to EuroSTAR 2013.


I thought I’d get the guide out early this year. Hopefully I’ll see you all there.

Note: Since I completed this I’ve been informed that there is no gala dinner…..I have yet to confirm or deny this. If there is, the section on dressing smart still stands, if there isn’t….well….we’ll find somewhere to go on that night 🙂

Click the image above or here for the PDF download.

A conference story

Across the world there were similar journeys taking place.

Delegates are boarding planes, trains and automobiles to attend a European conference in the wonderful city of Amsterdam. For some this journey is short. For others it’s epic.

Some delegates would be travelling on their own. Others would be travelling with people they know: colleagues, friends, peers or family.

For the first-time conference attendee the journey can be daunting, and nerves can start to form around what to expect.

Some journeys go better than others.

In England one conference delegate is flying from a small city airport in a tired looking Bombardier Q400.



The noise on-board is deafening, the coffee cold and the seats cramped. Despite these downsides the delegate can’t help but notice that the flight staff are welcoming, well trained and during the safety overview, incredibly well synchronised. He notices the menu in the seat-back and does some impromptu testing of the contents. The first obvious bug he spots is the use of the word “gourmet” to describe the microwaved cheese toastie.

Those who often attend testing conferences are somewhat obsessed. Testing isn’t just a career – it’s a calling (or a lucrative business). It’s something they’ve been doing their whole lives. Some might call this devotion to it strange, weird or sad. Others just see it as a way of life.

Over the Atlantic ocean is one of the conference speakers, asleep on a Virgin Atlantic jet. His journey started several hours ago. It won’t end for a few more hours yet. Tiredness is inevitable.

In Amsterdam there is already a large group of delegates descending on the city’s hotels, hostels, rentals and flats. Pockets of testers are occupying three or four of the main hotels. Many don’t know each other and are destined to spend the rest of their stay oblivious to the fact that the person next to them at dinner is a tester and attending the same conference.

All of the delegates share something in common; testing. They may approach the subject differently. They may oppose each other’s views. They may dislike each other’s personalities, but they all have a common thread to at least start a conversation. Yet many won’t. Many will spend the entire time alone. Some may enjoy this, others will inevitably feel isolated.

Some delegates arrive on the Monday, others won’t turn up until the start of the presentations on Tuesday. The Twitter stream starts to fill with people discussing meet-ups, comments on the tutorials already taking place and general banter about the event.

Chat sessions fire up, text messages are sent and tweets are broadcast to arrange meals, drinks and gatherings. Many of these people are meeting others in the flesh for the first time. Relationships on social channels have paved the way for a seamless ability to meet-up in person. The hard work is done. The relationships can flourish further in person, or dwindle away at the realisation that online personas don’t always match reality.

As the evening comes around small groups of Testers are travelling to bars, hotels and restaurants to catch up and relax before the conference starts.

In an Indonesian restaurant in the city centre, a group of testers are gathered around vast quantities of food and beer. Conversations are flowing about testing, life and work. People are getting to know new people or are refreshing relationships with people they met at other conferences. As more beer flows the conversations become more lucid and discussions about the state of testing inevitably crop up.

As this particular group discusses certification schemes, standardisation and best practices, in the context of them destroying the industry, they are oblivious to the fact that across town, in a posh hotel where Heineken is served in impossibly tall glasses, there is another group of testers talking about how context doesn’t matter and best practices are what the testing world needs.

Just down the road in a small bistro is a group of well funded testers talking about how maturity models are the future for their giant consulting business. They are busy putting the final touches to new methods of quantifying the true value of testing to an organisation.

Across Amsterdam that evening there are many tribes of testers discussing testing. Some with polar opposite views, some in perfect alignment and some who are too tired, or drunk, to talk about testing. In hotel lobbies and rooms across Amsterdam there are also testers with no-one to talk to.

As Tuesday afternoon spins around the conference centre starts to get busy. In the corner is a tester filming the queues forming using a time delay app on his iPad. Upstairs are testers sitting around talking about strategies for handling regression testing and exploratory testing. Meetings are happening and discussions are flowing.

Many testers are pacing around the venue embroiled in conversation with someone on the other end of a phone call. Are they conducting business? Speaking to relatives? Pretending to speak to someone to look important?

In the Test Lab the lab rats are getting ready for people to do some testing at the conference – a concept that is alien to some delegates.

In the expo centre are salespeople and demonstrators trying to draw attention to their products, services and goods. Some are doing better than others. Some are more involved than others.

For some of the vendor representatives it is their 20th conference this year. Some of the vendor reps haven’t been home for the last 8 months; their life is a constant cycle of conferences.

Throughout the event some presentations go to plan, some of the speakers don’t quite communicate their intended message well and some experience a few technical glitches.

Some talks are funny and insightful, others masked in terminology and concepts that aren’t appealing to everyone. Some are about systems, some about space rockets, some about video games and some about standards.

There’s a talk for everyone. Many delegates are complaining that there is too much to see and they are having to make tough decisions on which talk to get to. Some consider that struggle to decide the marker of an excellent conference.

This year sees a real sense of community and involvement. The community hub took a day to get going, but by the last day it was well attended, with delegates wanting to listen to lightning talks, chat to new people and contribute their views and ideas to the Testa Mappa.

It’s in the community hub that delegates get to talk to the speakers and other attendees in a welcoming and safe environment. Many delegates are changing their minds about a presentation after discussing the topic further with others. Many speakers are learning a lot about themselves, their presentations and their ability to communicate their core message.

Throughout the event there are many smaller and much more focussed meetings happening, some private and by invite only, some open to a wider audience. There is business happening, friendships being made, social connections starting and also ending.

As the conference comes to a close the delegates, speakers, organisers and vendors travel home, many leaving with new friends and connections. Many will return home to families, friends and colleagues more tired than they thought they would be. Conferences take their toll both physically and emotionally. The challenge for many will be filtering and putting in to action many of the interesting ideas they took away from the conference.

The challenge for some will be working out how they can talk to more people next time and how they can ensure they are invited to the many social gatherings that are happening.

For many though the conference didn’t end when the doors closed and they travelled home; the conference continued on the social channels for many more months. And that is often the real marker of a successful conference.

(I wrote this short observational story at EuroSTAR 2012 last year. It was a great conference and this year’s looks like it will be another great conference too – see you there hopefully.)

Moving from 8 Month releases to weekly

The other week I facilitated a session at the UK Test Management Summit.

I presented the topic of moving to weekly releases from 8 month releases.

I talked about some of the challenges we faced, the support we had, the outcomes and the reasons for needing to make this change.

It actually turned into a question and answer session, and despite my efforts to facilitate a discussion it continued down the route of questions and answers. It seems people were very interested in some of the technicalities of how we made this move with a product with a large code base of both new and legacy code (and that my facilitation skills need some fine tuning).

Here are some of the ideas.

We had a vision

Our vision was weekly releases.
It was a vision that everyone in the team (the wider team of more than just development) knew about and was fundamentally working towards.

This vision was clear and tangible.

We could measure whether we achieved it or not and we could clearly articulate the reasons behind moving to weekly releases.

We knew where we were
We knew exactly where we were and we knew where we were going. We just had to identify and break down the obstacles and head towards our destination.

We had a mantra (or guiding principle)

The mantra was “if it hurts – keep doing it”
We knew that pain was inevitable but suffering was optional.

We could endure the pain and do nothing about it (or turn around) or we could endure the pain until we made it stop by moving past it.
We knew the journey would be painful but we believed in the vision and kept going to overcome a number of giant hurdles.

Why would we do it?

We needed to release our product more frequently because we operate in a fast moving environment.

Our markets can shift quickly and we needed to remain responsive.

We also hated major releases. Major feature and product releases are typically painful, in a way that doesn’t lead to a better world for us or our customers. There are almost always issues or mis-matched expectations with major releases, some bigger than others. So we decided to stop doing them.

The feedback loop between building a feature and the customer using it was measured in months not days meaning we had long gaps between coding and validation of our designs and implementations.

What hurdles did we face?

The major challenge when moving to more frequent releases (we didn’t move from 8 months to weekly overnight btw) was working out what needed to be built. This meant us re-organising to ensure we always had a good customer and business steer on what was important.

It took a few months to get the clarity but it’s been an exceptional help in being able to release our product to our customers.

We also had a challenge in adopting agile across all teams and ensuring we had a consistent approach to what we did. It wasn’t plain sailing but we pushed through and were able to run a fairly smooth agile operation. We’re probably more scrumban than scrum now but we’re still learning and still evolving and still working towards reducing waste.

We had a major challenge in releasing what we had built. We were a business based around large releases and it required strong relationships to form between Dev and Ops to ensure we could flow software out to live.

What enablers did we have?

We had made a major architectural and service design decision that aided rapid deployments: our business offering of true cloud. This means the system has just one multi-tenanted version. We had no bespoke versions of the product to support, and this enables us to offer a great service, but also a great mechanism for rolling products out.

We owned all of our own code and the clouds we deploy to. This enabled us to make the changes we needed to without relying on third party suppliers. We could also roll software to our own clouds and architect these clouds to allow for web balancing and clever routing.

We had a growing DevOps relationship, meaning we could consider these perspectives of the business together and prepare our plans in unison, allowing smoother roll-outs and bringing a growing mix of skills and opinions into the designs.

What changes took place to testing?

One of my main drivers leading the testing was to ensure that everyone took the responsibility of testing seriously.

Everyone in the development team tests. We started to build frameworks and implementations that allowed Selenium and SpecFlow testing to be done during development. We encouraged pairing between devs and testers and we ensured that each team (typically 4/5 programmers and a tester) would work through the stories together. Testing is everyone’s responsibility.

Testing is done at all stages in the lifecycle. We do TDD, Acceptance Test Driven Development and lots of exploratory testing.
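To make the test-first rhythm concrete, here’s a minimal sketch of the TDD loop in Python. This is a hypothetical example for illustration (not our actual product code): the checks are written first, then just enough code to make them pass.

```python
# Hypothetical example of the TDD rhythm: the assertions below were
# (conceptually) written before the function, and drove its design.

def format_call_duration(seconds):
    """Render a call duration as mm:ss, e.g. 125 -> '02:05'."""
    minutes, secs = divmod(int(seconds), 60)
    return f"{minutes:02d}:{secs:02d}"

# The "test first" checks that drove the implementation:
assert format_call_duration(0) == "00:00"
assert format_call_duration(59) == "00:59"
assert format_call_duration(125) == "02:05"
```

The same idea scales up to the acceptance level: the expected behaviour is pinned down as an executable check before the feature code exists.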

We do a couple of days of pre-production testing with the wider business to prove the features and catch issues. We also test our system in live using automation to ensure the user experience is as good as it can be. We started to publish these results to our website so our customers (and prospective customers) could see the state of our system and the experience they would be getting.

We started to use techniques like KeyStoning to ensure bigger features could be worked on across deployments. This changed the approach to testing because testers have to adapt their mindsets from testing entire features to testing small incremental changes.
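A rough sketch of the keystoning idea, using a hypothetical feature flag as the "keystone" (names and mechanism are illustrative, not how our product is built): the feature code ships across several weekly releases, but stays unreachable until the final piece is put in place.

```python
# Hypothetical keystoning sketch: new_dashboard() can be shipped and
# tested incrementally across releases, because no user can reach it
# until the keystone (here, a simple flag) is flipped in the last
# deployment.

FEATURE_FLAGS = {"new_dashboard": False}  # flipped in the final release

def new_dashboard(user):
    return f"new dashboard for {user}"

def legacy_dashboard(user):
    return f"legacy dashboard for {user}"

def render_dashboard(user):
    if FEATURE_FLAGS["new_dashboard"]:
        return new_dashboard(user)   # dark-shipped, tested in slices
    return legacy_dashboard(user)    # what every user still sees
```

This is why the testing mindset has to shift: each weekly release contains a slice of the feature to test, rather than the finished whole.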

Why we love it
Releasing often is demanding but in a good way. The pressure is there to produce. The challenge we have is in balancing this pressure so as not to push too hard too often but have enough pressure to deliver. We don’t want to burn out but we want to ship.

We exceed the expectations of our customers and we can deliver value quickly. In an industry that has releases measured in months (sometimes years) we’re bucking the trend.

As a development team we get to see our work in production. This gives us validation that we are building something that is being used. Ever worked on a project that never actually shipped? Me too. We now see none of that.


It’s been tough getting to where we are now but we’ve had amazing support from inside and outside of the business which has helped us to really push ahead and set new markers of excellence in our business domain. We’ve still got lots to get done and lots to learn but that’s why we come to work in the mornings.


These are just a few of the factors that have helped us to push forward. There are companies releasing more often, and some releasing less often to good effect. Each business has a release cadence that works for them and their customers.

Did I mention We’re Recruiting?


Side Notes:

I got asked the other day how I come up with ideas for talks/blogs, how I think through these ideas and how I go about preparing for talks. I’ll take this opportunity to add a short side note of how I do this. This approach may not work for you.

I firstly create a central topic idea in a mind map (I use XMind).

I then brainstorm ideas around the central topic. After the brainstorm I go through the map and re-arrange, delete, add and rename until I feel I have a story to tell.

Moving to weekly releases

I then start planning the order and structure of the story. Every story has a beginning, a middle and an end.

I start by writing the beginning and then the end. The middle is the detail of the presentation.


I then doodle, sketch and plot.



I then move to my presentation tool of choice. In this case it is PowerPoint – sometimes it is Prezi.

The presentation typically takes a long time to prep, even for a very short intro like this. This is because I don’t like including too much text in my slides and also because I think simple, but attractive slides can add some impact to the topic. So I spend some time making sure they are right. Saying that, no amount of gloss in the slides will help with a bad/poor/boring story.




Testers need to learn to code

Testers need to learn to code… and any number of other ways of paraphrasing Simon Stewart’s comments in his keynote at EuroSTAR 2012.

I’m not entirely sure on the exact phrase he used but I heard a number of renditions after.

They all alluded to the same outcome:

“Testers need to code or they will have no job”

Speaking with Simon briefly after the event it’s clear his message was somewhat taken out of context.

I get the impression he was talking about those testers who simply sit and tick boxes. Checkers, as I think they are becoming known.

Simon suggested that these people should learn to code otherwise they will be replaced by a machine, or by someone who can code. There were some who took great exception to the general sentiment, some who were in total agreement, and of course probably some in between and some who didn’t care.

Simon himself was greatly pragmatic about it and suggested that learning to code will free people up to do good testing, a sentiment I can’t imagine many arguing with.

Those who know me have often commented that I often sit on the fence and try to see the positive in each side of an argument. It will therefore come as no surprise that I agree, and disagree about the message Simon was communicating.

I agree that Testers who do purely checking will (and should) be replaced by machines to perform the same action. Some of these people, (with the right peers and support, the right company and willing market conditions) could become great “Testers” whilst the machines automate the tedium. Some won’t and may find themselves out-skilled in the market.

But I don’t believe that all Testers have to learn to code. I know of a great many who don’t but are doing exceptional testing. Saying that, I think each team needs to have the ability to code. This could be a programmer who helps out, or a dedicated “technical tester” who can help to automate, dig deep and understand the underlying product.

I’m an advocate of encouraging Testers to learn to code, especially those new to the industry who are looking for early encouragement to tread down certain career paths. I’m also an advocate of using the wider team to solve team wide problems (and automating tests, reducing pointless activities and having technical assistance to testing are team wide problems).

When you hear a statement like “Testers need to learn to code”, don’t always assume it tells the whole story. Don’t always assume it means learning enough code to build a product from scratch. Don’t always take these statements at face value. Sometimes the true message can become lost in the outrage in your (or someone else’s) head, sometimes it might not have been communicated clearly in the first place and sometimes it might just be true, but too hard for people to accept.

  • Learn to understand code and your job as a Tester may become easier.
  • Learn to understand how the world around us works and your job as a Tester may become easier.
  • Learn to understand how you, your team and your company works and your job as a Tester may become easier.
  • Learn how to sell, convince, persuade and articulate problems to others and your job as a Tester may become easier.
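As a small illustration of the first point, here’s a hypothetical snippet (invented for this post, not from any real product) showing how simply *reading* code can make a tester’s job easier: a glance at the condition suggests exactly which boundary tests matter.

```python
# Hypothetical example: a tester who can read this function spots the
# interesting question immediately - does an order of exactly 100
# items qualify for the discount, or not?

def bulk_discount(quantity):
    """Return the discount rate for an order of the given size."""
    if quantity > 100:      # strict comparison: 100 itself gets nothing
        return 0.10
    return 0.0

# The boundary tests the code itself suggests:
assert bulk_discount(100) == 0.0
assert bulk_discount(101) == 0.10
```

No code was written to find that question; understanding the comparison operator was enough to sharpen the testing.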

There are many things which could help you become a good, reliable, future proof(ish) tester. You can’t learn all of them. You can’t do everything. Coding is just one of these elements just like understanding social sciences, usability, performance or critical thinking are some others.

Learning to code can be valuable, but you can also be a good tester and not code. It’s not as black and white as that.

Eclipse Testing Day 2012

I was really hoping to get to Eclipse Testing Day 2012 this year, but sadly can’t make it.

Check out the talks though! Some great ideas…and who can fault the price (especially if you’re in Germany) 🙂

It looks like there will be some good panel discussions too if the submitted questions are anything to go by:

  1. Some people say that testing, as we know it, is dead, because a testing phase is something that only happens in waterfall projects at the very end. Do you agree with this statement?
  2. When an active community of users works with a product, they are likely to give feedback in the form of bug reports and enhancement requests. How do you suggest incorporating their feedback into the process for the further development of the product?
  3. The community is a great source of feedback. Nevertheless, it is advantageous to release as few bugs as possible. What do you recommend to gain the balance between community feedback and minimizing bugs in production?
  4. A frequent claim when talking about testing is “my developers don’t make any errors, so we can save money and time by not testing”. What is your view on this?


Getting a grip on exploratory testing

Last week we had the pleasure of inviting James Lyndsay to our offices in rainy Basingstoke for a two day course on “getting a grip on Exploratory Testing”.

The whole test team was there for the two days and we worked through a number of elements of Exploratory Testing, right from time management and focus through to techniques and approaches. James is incredibly knowledgeable and his experience was tapped by the team for the two days.

Each of the team took away a number of insights, but here are my own personal observations:

  • I focus very heavily on the user and their experience of using software
  • I also tend to visualise the problem and walk through it often randomly
  • I also focus heavily on any claims made in any documents, guidance or specifications.
  • The team all approach the same problem in subtly different ways. This is a good thing.
  • Knowing how long we have to test is crucial. It changes our approaches and styles.
  • My approach of using Mind Maps for planning and documenting is a style that works well for me, but not always for others.
  • I tend to use a different map for different problems. I use mind maps, tables, doodles and general free form notes. I also tend to scrap a map if it’s not working.
  • Our judgements on what is an issue and where to focus testing differ between people even in the same business. What I see as an issue, others might not, and vice versa. This highlights the need to understand your target audience; something we’ve been working hard at here.
  • I need to work on my “straying” from charter. I often get too involved in side paths which are not on charter. This is not always a bad thing but I need to be mindful of it.
  • I often explore the application to drive out charters and ideas for further testing as a form of planning
  • We need to instigate more exploratory testing debriefs

The two day course was insightful, very well delivered and fun. There were lots of hands-on practicals and James tailored the content for our needs here at NewVoiceMedia. Overall it was a great two days and I know the team very much enjoyed it. We just need to put our learnings into practice, something I am already seeing the guys doing.

Here’s James’ website with course details.

Agile South Coast Meetup

I attended the Agile South Coast Meetup last night in Southampton, the first one I’ve been to in about 3 years. It was good to meet up with the guys again and despite a small crowd it was an interesting evening. The first part of the talk was about change models and then the second part talked about how we could change the group and move it forward. I for one didn’t realise it was a group committee planning session but nevertheless some interesting discussions were had.

The group can be found on LinkedIn here and Twitter here.

Mike Williams kicked off the introduction followed by Plamen Balkanski who talked about making a difference using the change management 3.0 model by Jurgen Appelo.

Plamen talked about why we should make change happen.
He cited a 2009 Prosci study which showed that teams that can change were successful.

Plamen also mentioned a number of excellent books about motivation, change and inspiration including Drive (By Dan Pink), and Creating Passion-Driven Teams (By Dan Bobinski)

How did we get here?
Plamen mentioned the three major changes in management

Management 1.0 – i.e. Taylorism
Management 2.0 – the same frameworks plus add-ons like TQM and Six Sigma
Management 3.0 by Jurgen Appelo – which talks about complexity

Plamen introduced the following ideas from Management 3.0

1. Energise people
2. Empowering teams
3. Align constraints
4. Develop competence
5. Grow structure
6. Improve everything

He then moved on to talk about change management 3.0.

He talked about dancing with the system if you cannot control it, and then moved on to talk about the Deming Plan-Do-Check-Act framework.

He then talked about the ADKAR model:
– awareness
– desire
– knowledge
– ability
– reinforcement

He then talked about stimulating the network and the innovation adoption curve (initiators, innovators, early adopters, early majority, late majority, laggards)

He also then moved on to talk about the 5 I’s (information, identity, incentives, infrastructure, institutions)

As you can see we talked about a lot of useful models and frameworks which were then put in to practice in the second half of the sessions as we worked through each model with reference to Agile South Coast user group. There were some good ideas generated for the future and how the group can move forward.

If you’re in the South region of the UK (Hampshire, Dorset etc) and interested in scrum, agile, lean etc then it’s worth joining the group and getting involved.

Test Bash > Done

It was so great to be part of the Test Bash last Friday in Cambridge. I for one had a great day and I’m so glad that others did too.

The feedback we received about the event was great and it was clear from the vibe throughout the day that people enjoyed themselves, mingled and generally felt connected with others.

I’d like to thank the speakers for such an awesome day:

David Evans
Alan Richardson
Ben Wirtz

Steve Green
Huib Schoots and Markus Gartner
Adam Knight
Andy Glover

I’d also like to thank the attendees who brought such enthusiasm with them. Our sponsors MagenTys and QASymphony. And of course our community members who helped out on the day.

The venue was superb and the food delicious. Me and Rosie were more than happy with how it went and we were thrilled to see Twitter buzzing and of course, Markus Gartner live blogging*.

We ran the Low Tech Social Network throughout the day which brought people together to connect with those who shared similar interests and hobbies. I witnessed a number of people seeking out others with the same hobbies such as surfing and fast cars and then talking testing with them. That’s the point of it; connections to people you might not normally talk to. Fantastic to see this in action.

Test Bash is a celebration of connectedness, community, creativity, ideas and most importantly, the sharing of these ideas in a safe and trusted environment. We saw loads of good ideas being bounced around and it was amazing how many people felt safe to share and talk about testing openly.

It was also great to finally put faces to online names. It was good to meet so many people I already knew in the digital world, and great to meet so many new faces too. I only wish I had more time to chat testing with everyone.

Events like the Test Bash fill me with positivity. They make me realise that Testing isn’t a battlefield of right versus wrong. Instead, it’s a melting pot of ideas; some better and some worse depending on what context they find themselves in. There were some frank discussions and opinions aired, but the collaborative nature of the event meant these discussions were constructive, useful and open for all to experiment with.

I didn’t hear the words “Best Practices”; there was no certification bashing, no evangelism, nor any discussions of “my way or no way”. It was a day of optimism. A day of hope for our often negative and frequently argumentative domain.

It was also a great day of learning and a great chance to get together to talk about what matters; new ideas for testing.

And this, in a sense, is what me and Rosie and our community managers work so very hard for. These moments where people connect. Where people share ideas. Where the craft of testing is pushed forward. Where there is real hope of a shift from “one size fits all” testing to a value driven activity across all stages of the life cycle.

Yet, throughout all of this I couldn’t help but notice something blindingly obvious; even within our own industry we have wildly different ideas about what testing is and who is in actual fact a “Tester”.

I don’t think this is a bad thing. In fact, I think it could be the most amazing part of what we do and who we are. We get to do what we think is right. We get to add the value we are employed to add. And that means we’re edging ever so slowly away from the long held stereotypes of Testing and Testers.

We get to nudge our craft towards the future we want for it. And that’s a pretty cool thing.



* There is no one in the business faster than Markus for blogging. I swear he had posted the final remarks before the speakers had even uttered them 🙂







Professional Skeptics, Dispeller of Illusions and Questioner

Bear with me as I clear a few posts out of draft. This is my last one for a few weeks. Promise.


This one came about as a response to James Bach’s excellent Open Lecture presentation. I took away a number of lessons from that lecture (including the title: Professional Skeptics, Dispeller of Illusions and Questioner).


Off the back of that I decided to post some questions about “Testing the Spec” as an intriguing LinkedIn forum post got me thinking about why “Testing the Spec” is so common. “Testing the Spec” is where a Tester takes the spec and the system and then validates/verifies that the spec is correct. Yes, you read that correctly….that the spec is correct.

It got me thinking and asking a couple of questions:

  1. What benefits will you receive by testing the system against the spec?
  2. What don’t you know about the system? Will the spec help you?
  3. Do you have any other information sources or Oracles?
  4. Is the information sufficient? Or is it insufficient? Or redundant? Or contradictory?
  5. Would a system diagram suffice?
  6. At what point do you know you are complete? Once every page of the Spec has been “tested” – what about other parts of the system not covered by the spec?
  7. Have you seen a system like this before? Will it help you?
  8. Have you seen a slightly different system? Will it help you?
  9. Who has asked you to do this?
  10. What value do they see in you doing this?
  11. Does the spec accurately depict the system? (I guess this is what they were testing for….)
  12. Would it matter if the spec didn’t match the system? (which one would give you most value; the spec or the system?)
  13. Is it essential to bring the spec up-to-date to match the system?
  14. Will the spec tell you everything you need to know about the system under test?
  15. Would the spec tell you anything that the system wouldn’t?
  16. Do you know how out-of-date the spec is?
  17. Is the spec important as a communication medium? Or just something that gets produced?
  18. How would you communicate your findings? In Bug Reports? Or in a report?
  19. How would you know what was a bug and what wasn’t?
  20. Could the spec confuse you more than simply exploring the system?
  21. Are you using the spec as a crutch or a guide or an Oracle?
  22. How much of the unknown can you determine?
  23. Can you derive something useful from the information you have?
  24. Do you have enough information to model the system in your mind?
  25. Have you used all the information?
  26. Have you taken into account all essential notions in the problem?
  27. How can you visualise or report the results and progress?
  28. Can you see the result? How many different kinds of results can you see?
  29. How many different ways have you tried to solve the problem?
  30. What have others done in the past that might help?
  31. Where should you do your Testing?
  32. When should it be done and by whom?
  33. Who will be responsible for what?
  34. What milestones can best mark your progress?
  35. How will you know when you are successful?
  36. How will you know when you are done?

I’m not doubting that testing a spec against a system can be valuable. It *could* be the best thing you can do. But I would ask a lot of questions first. The system is always different to the spec, yet in your context the spec could be the main reference point, so only you will know whether “Testing the Spec” is valuable for you.


I rattled out this list through a combination of the Phoenix Checklist and the very excellent video of James Bach doing an Open Lesson on Testing.


As with all things, I tend to sketch and draw as both a record of my thoughts, but also as a way of distilling some ideas. I read better in visuals so these rubbish doodles aid my learning and increase the chance I will revisit these points. You might struggle to see some of the text in the image but there are plenty of tools for opening images and zooming.


Note: The diagram shows my take-aways from James’ lecture. There are many more lessons which I’ve taken away earlier from James’ blogs and talks. I *may* have interpreted some things differently to how you would, or even how James intended, but they are my take-aways. I share them here for completeness and urge you to watch the video.

Unicom – Next Generation Testing Conference Review #ngtc2010

A change of venue for the “Next Generation Testing Conference” for last week’s Unicom event, and a welcome one at that. It’s much easier to get to now at the Grosvenor Hotel, Victoria, London. The room itself was quite long, with tables for delegates, a fairly small projector screen and a catering/display area at the back. This had a fairly bad effect on acoustics and lighting, but not so bad that it spoiled the enjoyment. The lighting did mean I had difficulty getting some good photos though.

To start with, Niel Molataux was the chairperson, but this soon switched to Dorothy Graham. The change wasn’t made very clear, but was probably a wise move.


Dot Graham opened proceedings with a good talk about Test Automation Objectives. She extolled the virtue of making sure we are clear about what we want to achieve with automation. Especially so when trying to sell it to management. This led to a series of contentious metrics and measures you could use to work out a Return On Investment which Julian Harty and John Stevenson questioned and challenged at various points.

To start with though, Dorothy made it clear that we need to be sure we know why and when to automate. Here are some key points.

  • 75% of automation efforts fail.
  • Objectives for automation should not be the same as the objectives for testing
  • Automation takes longer. 10 times maybe.
  • Effectiveness is a characteristic of testing. Efficiency is a characteristic of automation.
  • We need to look at the objectives and decide whether they are valuable
  • Are the tests actually worth running at all?
  • Automation often requires more people.
  • If we begin to replace testers with automation then managers are lowering the testers to the level of machines.
  • Regression tests add confidence; they do not find many bugs.
  • Testers, tests, exploratory testing, testing new code – these are all the most effective at finding bugs
  • The worst time to automate is when the project is running late
  • Over time our knowledge and understanding changes. Go back and apply our new knowledge to our objectives
  • Often automation is used to support testers rather than actually automate the tests. Dorothy mentioned Jonathon Kohl had written an article on this. I found this link, which might be helpful, but I’m not sure it’s the one Dorothy was referring to: http://www.stickyminds.com/BetterSoftware/magazine.asp?fn=cifea&id=103
  • Lisa Crispin suggests an automation refactoring sprint to focus on making the automation more effective.
  • We should measure the success of our automation and ask WHY are we automating?

I’m not a fan of measuring too much, mainly because I’ve found I spend too long measuring the activity rather than doing the activity. So in a sense I decrease my efficiency and effectiveness by trying too hard to measure it. But Dorothy gave a case for measuring automation using a simple calculation of working out how long it would take to run a manual test versus an automated one. Before you all get annoyed with how simple that is, Dot did explain some more examples and other measures to add to the mix.
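Dot’s break-even calculation can be sketched in a few lines. This is a hedged illustration of the manual-versus-automated comparison, not the formula from the talk, and every number in it is an invented placeholder.

```python
# A rough sketch of the manual-vs-automated break-even idea Dorothy described.
# All figures below are invented placeholders, not numbers from the talk.

def break_even_runs(manual_minutes, automated_minutes, build_minutes):
    """How many runs before automating beats repeating the manual test?"""
    saving_per_run = manual_minutes - automated_minutes
    if saving_per_run <= 0:
        return None  # automation never pays back on run time alone
    # ceiling division: the run on which the build cost is recouped
    return -(-build_minutes // saving_per_run)

# A 30-minute manual test, a 2-minute automated run, 300 minutes to build it:
print(break_even_runs(30, 2, 300))  # prints 11
```

Of course, as Dot pointed out, run time is only one measure; maintenance cost, failure analysis and the value of the tests themselves all belong in the mix too.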


Next up was Martin Gijsen, who talked about Domain Specific Languages. I’ve seen Martin before at SIGIST, and both times I’ve enjoyed the content but struggled with the delivery, mainly because he was very quiet. He knows his stuff, but he is naturally a fairly quiet and dry presenter, which meant the audience often got distracted. I actually tweeted midway that I wanted to see real working examples and not just screenshots. I was too early though, because Martin included a few real FitNesse tests running against Amazon. It’s always nice to see examples running rather than code snippets. It’s a really great topic.

Julian Harty was up next talking about mobile testing and how to use automation more effectively. Julian is a really talented speaker and his knowledge of the mobile web browser world is incredible. Julian suggested that testing on mobiles was very slow. Like wading through quicksand. It takes too long to test so many only test on a subset of available mobiles.

Julian’s suggestion for getting around this is to design with testability in mind; the problems then evaporate. Test Driven Development is also essential. Doing the testing first focuses attention, and code that is easy for the programmer to test is easy for everyone to test. The faster the feedback, the better your code.

Some handy hints when doing mobile testing:

  • Use SMS to send URLs to the phone, rather than trying to type them in to the phone.
  • Capture User-Agent data from the request headers. You can then hijack this value and pretend to be a different phone.
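Julian’s User-Agent trick can be sketched like this. It’s a hypothetical illustration using Python’s standard library; the User-Agent string below is an invented example, not one from the talk.

```python
# Pretend to be a phone by replaying a captured User-Agent string.
# The string below is an invented example, not a real captured value.
from urllib.request import Request

MOBILE_UA = "Mozilla/5.0 (Linux; Android 4.0; Nexus) AppleWebKit/534.30 Mobile"

def mobile_request(url, user_agent=MOBILE_UA):
    """Build a request that claims to come from a specific phone."""
    return Request(url, headers={"User-Agent": user_agent})

# urllib.request.urlopen(mobile_request("http://example.com")) would then
# fetch whatever variant of the page the server serves to that phone.
```

Swap in different captured strings and you can walk one desktop machine through a whole fleet of phones, which is rather faster than wading through quicksand.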

Julian suggested you should all check your pocket/wallet/purse when anyone talks to you about UI automation. Make sure you have the same amount of money in there after as you did at the start. UI automation tools are easy sells to the unsuspecting.


Lunch was good and a great opportunity to network. To be honest I was a little disappointed with the stands this time around. It felt like there should have been more. The two sponsors/stands who were there had good stands with little giveaways but I didn’t get the feeling they were interacting with the crowd as much as at previous events.

After lunch Clive King from Oracle talked about the need to automate and load/scalability test your applications, with some great case studies on how he has done this. Clive’s a good speaker and I won’t attempt to repeat the examples he used here, as they were insightful and in depth.

Next up was one of the most contentious talks of the day for me. It drew the most comments from the audience and was, in some respects, a good example of the false messages that circulate about agile. It was a great talk by Jenine Thorne, who is a very good presenter with some of the best designed slides of the day, but a few people took issue with a lot of the content, mainly the people who work on agile projects. In one respect it was a real-life story of agile adoption at the Norwich and Peterborough Building Society: a complete warts-and-all description of a big-bang agile adoption, showing the trials and tribulations she and her teams encountered.

The main points raising questions were the following:

  • Jenine mentioned how they had consultants in to move them to agile. However, the consultants moved the ‘development’ team to agile but not the “test” team. The consultant then left. I raised the question to Jenine about whether or not she felt she had been mis-sold the consultancy.
    • She didn’t believe she had. In my mind though this is exactly what is troubling about the agile movement. There are too many consultants going in and making a Dev team agile, but no-one else. For me, I can’t really comprehend what that actually means. I can’t see how a Dev team can be agile without the test team.
    • It’s also not useful to think in terms of teams when you are in a more agile world. Agile, for me, doesn’t work unless the ENTIRE team (including PMs, BAs, management etc) is moving to agile at the same time and with the same motives.
  • Jenine described the process of delivery to test and release and it was clear that it was a series of mini waterfalls. Iterative development with a testing phase at the end. They didn’t release for 6 months.
    • For a few people I spoke to, not being able to deliver for 6 months meant this, in essence, was not agile.
  • It sounded like the automation process wasn’t in place which is crucial for agile success, especially when working on a large project.
  • There were some invalid descriptions of some very crucial concepts, for example, TDD was described as developers automating the boundary cases and tedious tests that the testers didn’t want to do.

A lot of the audience didn’t work in an agile environment and were there to source information and find out how it’s done. Messages about agile vary wildly, some incorrect, some confused, some right, some slightly right.

Unfortunately this happens a lot at conferences where experience reports show a negative side of agile, or a side that actually isn’t right. In this case many would say that it is not agile. With a false statement of what TDD is, no release for 6 months and little explanation of how regression and automation was done it sounds like another example of mislabelling agile. It was though, an experience report, regardless of what label was applied to it. They are by their very nature…a personal experience. Moving to agile is hard. For sure. But moving to a mis-interpreted place called “agile” is almost impossible. No wonder so many experience reports tell a similar tale. Aim for the wrong goal, call it agile, report it as a disaster but keep calling it agile and keep claiming you are nearly there.

The thing is though, Jenine’s talk was the highlight for me. Although it was contentious to those who work in an agile environment, it was also obviously an emotive topic. For me, there is no point going to a conference if you agree with everything that is being said.

Next up was Keith Braithwaite. Keith is a seasoned agile professional and he gave a superb talk on testing with checked examples. In one example he showed a FitNesse test driven from an Excel spreadsheet of financial reference data. The interesting thing was that this checked example was provided to the developers before the code was written. This is a great form of Test Driven Development, as the development team have the test data before the code. Perfect for making sure the test team won’t later report cases and checks the developers had never thought of.
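The shape of a checked example is easy to sketch outside FitNesse too. Here’s a hedged illustration in plain Python, with invented simple-interest reference data standing in for the financial spreadsheet:

```python
# Checked examples: the reference rows exist before the code, and the
# implementation is written to satisfy them. All data here is invented.

# (principal, annual rate, years, expected interest)
REFERENCE_ROWS = [
    (1000.0, 0.05, 1, 50.0),
    (1000.0, 0.05, 2, 100.0),
    (250.0, 0.04, 3, 30.0),
]

def simple_interest(principal, rate, years):
    """The implementation, written after the examples were agreed."""
    return principal * rate * years

def failing_rows(rows):
    """Return the rows the implementation gets wrong, FitNesse-table style."""
    return [row for row in rows
            if abs(simple_interest(*row[:3]) - row[3]) > 1e-9]

print(failing_rows(REFERENCE_ROWS))  # prints [] when every check passes
```

The point isn’t the arithmetic; it’s the ordering. The table of examples is agreed first, so the code is written to make the checks pass rather than the checks being reverse-engineered from the code.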

Keith explained how he saw the tester’s role in agile development changing: becoming a provider of tests and information at the start, rather than an executor of tests at the end. Testers inform the devs of the test cases so they can be automated early; more test advising than test executing. It’s a sentiment I can truly get behind. Testing at the start is the fastest way of getting test feedback, so why shouldn’t devs and testers pair to write unit tests before the code? How great would that be for a tester: you could spend your time *really* testing the app when you get it, safe in the knowledge you’ve essentially already checked it. It makes finding bugs hard work. Exactly why we get paid. Right?

Keith said “Testers ticking boxes day after day is a waste of human existence”. As a developer himself, he asked: “What can we do to make developers understand more about testing?”

At the end of the day there were some lightning talks by Dot, Jenine and Gojko Adzic. What I found intriguing was how many people in the audience had no experience of agile yet were happy to chirp in with comments like “pie in the sky” and “unrealistic”. I don’t buy that. I think what they should be saying is “my thinking, interpretation and understanding of agile is pie in the sky”. It’s not the concepts of agile that are wrong. If you’ve never tried it, you’ll never really know. “But it would never work here” is another common one. How do people know unless they try?

But for all the agile bashing that happens at any testing event, there are always some interesting and balanced views being presented too, suitable for any methodology. I had a really great day and unfortunately couldn’t make it back for day 2. I heard that was good too.

And by the way, if you are anywhere around Cambridge, looking for a job and want to work for Redgate, then they are recruiting. Here’s a YouTube video on why it’s awesome to work for Redgate: http://www.youtube.com/watch?v=K9M2sA8EOVM