Testing Exchange #agiletestingX

So that’s the Agile Specifications, BDD and Testing Exchange done. It was held at Skills Matter’s new centre in London last Friday. It was an interesting day with a really good turnout of around 125 people. The venue itself was brand new, finished the night before, I believe.

I’m not going to highlight everything that was said at the event in this blog as no doubt someone has already summarised this elsewhere. I know Nathan Bain was doing some awesome coverage on Wave. Search string here: “with:public tag:#agiletestingX”

It was Skills Matter’s first conference in the new venue so there were always going to be niggles. Overall the day was very enjoyable and I took away some very interesting information and tips to try to put into practice, though the day simply wasn’t varied enough. The highlights for me were Gojko Adzic’s hilarious talk about requirements, Dan North’s passionate and fired-up talk about BDD and Dave Evans’ relaxed and witty review of current testing dualities.

As the event was informal and relaxed there were several questions from the crowd during people’s presentations. On paper this sounds positive however it had the following effects:

  • the flow of the presentation was interrupted and, in some cases, seriously affected
  • the timings of the talks were put out by quite a margin
  • the questions themselves were often either so specific or so general that answering them seemed pointless

Throughout the day the sound was also sketchy, with many questions and answers discussed without half of the crowd being able to hear anything. As the venue is open plan I would suggest sitting as near as possible to the front of the room. The reception/food area is at the rear, separated only by a rack of coats, so it can be quite noisy. When people are talking in this area it seriously affects the ability of those at the back to hear the speakers. It’s also annoying.

The seats were also quite narrow, uncomfortable and cramped, meaning people had to take it in turns to lean slightly forward just so they could actually sit next to each other.

The content overall was very good, but maybe a little samey. There was nothing vastly revolutionary, nothing controversial, and not many insightful or practical examples either, but lots of theory, often reinforcing what you already knew. That was fine, but some concrete examples or little breakout exercises would have been good. One talk did try this but unfortunately it followed a series of questions most people at the back didn’t hear, so we had no idea whether this was part of the presentation or simply a response to questions. A few other people also commented that there was “something” missing. We just couldn’t really put our fingers on what it was.

In my opinion the park bench idea (a panel of experts for questions) didn’t really work. Despite the fact that there were questions already defined, the people on the bench were too fluid. In fact, anyone could stand up and get on the bench, meaning no consistency and a feeling of a free-for-all. Again the sound hindered things slightly. There were some really good answers though, especially the one from Keith (I don’t know his surname).

Despite these niggles the day was extremely good and I felt I got real value for money. The food and drink were great and the service was excellent. The venue was also nice, clean and new, but quite cold and noisy. Things that no doubt Wendy and the team at Skills Matter will be addressing.

One of the highlights for me was the sheer interaction taking place between people on Wave and Twitter. Skills Matter were keen to promote the use of Twitter and many people in the crowd joined in; debating, promoting and discussing the talks.

I’d suggest keeping your eyes open for further courses and conferences at Skills Matter, as it’s run by a group of people who are incredibly passionate about what they do. With a few changes to the venue I reckon Skills Matter conferences will become the highlight ones not to miss. Good crowd, good venue (despite some niggles) and passionate organisers. A recipe for success.

Agile Testing Exchange #agiletestingX

Very much looking forward to Agile Specifications, BDD and Testing Exchange tomorrow at Skills Matter. I’ll be sure to get a blog post done some time over the weekend with a round up and a more detailed posting over at my iMeta blog.

For those people attending – if you’d be interested in being involved in a little accessibility project then do speak to me. It’s quite interesting and I need some help if you’re up for it.

Should be an excellent day. I’m looking forward to the sessions and, with Google Wave and the Twitter tag #agiletestingX, it’s going to be an interactive day too.

Hopefully see some of you there tomorrow.


Taking a break from the old routine

I’m a regular on many of the testing forums and mailing lists and I’ve noticed a very worrying trend recently: many testers simply cannot appreciate different ways of working. Their way is a Best Practice. But it’s more than that; it’s not just about Best Practices but the vitriol with which they are delivered.

I see posts on forums asking for help on how to solve a problem. The responses are usually success stories of working X way, to which someone replies with Y way. Then someone chips in with Z way and pretty soon the whole thread is hijacked by an argument over which way is better. Mine, mine, mine.

Some of these threads are extremely useful where people learn a new way of approaching a similar problem. Other threads though have a slightly more sinister feel to them.

Now, I’m not one to hold back and I often respond with critical comments to posts but I like to think I never tell people how they should do something…my way. I can offer help, suggestions, stories, advice, mentoring and guidance but I can’t *tell* someone what to do. I have no best practices that will work for other people, just some honest suggestions from experience that could work.

However, when answering forum posts many testers don’t appear to consider the fact that there are thousands of testers out there, all working under different methodologies or interpretations of methodologies, time frames, technology, budgets, skill sets and software development life cycles. Some are working in highly structured environments, some with helpful programming teams, some with middle managers bean counting and some with processes that are old, destructive or wasteful. But it’s their environment and it’s their problems they are asking for help with. So let’s help – not dictate and ridicule.

Over the years I’ve created many many many Best Practices for myself. In fact, every time something goes right, it’s a best practice. Yippee. I have also developed some worst practices too. Boo. And along the way forums, user groups and mailing lists have been enormously helpful in helping me forge my way through my testing life. However, the time has come to stop visiting certain testing groups, forums and sites. To stop subscribing to some LinkedIn and Yahoo Groups; the job feeds are taking over and too many articles have Best Practice in the title.

The sad thing is, these resources are ones I’ve long been a supporter of. Resources that have helped me enormously in the past. Resources that have become a staple of the testing community. And it’s a shame they are becoming hunting grounds for vitriol-spitting testers pushing Best Practices.

If the negativity and anger included in some of the replies on these sites is the routine (which it appears it is), then I’d recommend we take a break from the old routine. Unfortunately for many of these forums, this seems to be the action many testers have already taken. No longer are people going to sit through a barrage of Best Practice suggestions, people calling other people’s suggestions ‘stupid’ and people leaving sarcastic comments like ‘maybe you didn’t read the question correctly’. We are not finding it helpful.

There are plenty of newer testing forums, groups, projects and communities that are there to help. That still have a community feel. That still serve the community they are part of. Communities that help testers and grow. Communities whose members don’t try to out-do one another, push their Best Practice as the only solution, or degrade and belittle anyone who doesn’t agree. No forum is ever immune to this, but some have a much lower rate of angry testers and vitriolic Best Practice pushers.

Now let me clear something up. I’m not talking about people who offer criticism, constructive feedback, honest statements and points of view – sometimes in a heated way – this at times can be healthy – very healthy. I’m talking about people who pick apart other people’s ideas for fun, who say that other ways can’t work, who are so hung up on terminology that it’s painful, who actively take it upon themselves to counter-argue forever despite real world success stories, who simply can’t understand that people work in different ways, who have black and white views on automation and who plainly abuse and call people stupid (believe me – it happens quite a lot – and a lot worse sometimes).

And I do sincerely hope that these testers who can’t appreciate contexts come to understand that the testing community is complicated, multi-threaded, diverse and evolving. It’s made up of people with no experience, some experience and a wealth of experience. And in a world where collaboration, community and learning are becoming more valuable it seems alien to be abused for posting on a forum a real and genuine question, problem or response. Sure there are “better” ways of working but there are very few absolute “best” ways of working.

But I’m not going to end on a low note. Nope. For every one of the negative, bullying and belittling testers, there are a thousand welcoming, positive and helpful testers. Testers who are happy to help. Testers who give honest opinions and feedback. Testers who understand that their way may not be the “best” way. Testers who are happy to improve, learn and share knowledge.

And that is a really great thing. In fact, it’s a truly great thing. It makes me glow with pride.

Planning for when cows attack

A picture of a cow

A few weeks back at one of the testing conferences I was lucky enough to meet a very interesting man, let’s call him Mr F. He was small, stout and incredibly beardy. He was a fascinating man who had some truly incredible stories to tell. A charming man with a razor sharp wit and incredibly strong opinions. I didn’t agree with 99.9% of what he was saying, but that’s beside the point.

Mr F was telling me a rather long-winded but entertaining story around the old saying: “He who fails to plan, plans to fail”.

Mr F had worked in the financial industry for most of his career but had recently moved to a big brand web company and was leading the testing of the latest version of this company’s new website solution. Mr F explained to me how he planned for everything and anything. He told me, with great pride, how he refused to start the project without a risks and issues register. He even chuckled at the stupidity of the project team for thinking they could start without these crucial documents.

He also explained to me how he set up a team wide calendar entry for everyone to check the risks and issues register daily. He also, with some gusto, explained how he had a wealth of action plans he could refer to when the risks became issues or when the issue was becoming a real showstopping problem. (Note: When I refer to issue I mean an issue that is present and affecting the team/project, not an issue in the software or a defect).

He explained how he spent the first one-month iteration finalising the documents, ensuring each team member had input into the list of possible things that could go wrong. He had plans in place in case the first release of software fell below the quality gate set in his test plan, in case any member of the team was off ill, for power failures, for the loss of test environments, for scope creep, for office security breaches, for testers not achieving 10 test cases per day, yada yada yada. He listed many more. It was a really fascinating (but long) story and I have to say I was impressed with how much planning had gone into it. I was also incredibly impressed with how passionate Mr F was about planning for the knowns, the unknowns, the known unknowns and the unknown unknowns.

The problem was that Mr F wasn’t a happy man. It turns out on day three of the second monthly iteration three testers threatened to hand in their notice because of Mr F’s insistence on planning for everything and anything (which, by the way, is an endless and impossible task).

Mr F was then removed as test manager and effectively given the boot. Something I’m not entirely sure he had planned for.

Mr F explained to me how he had never understood that there were different ways of working and that some teams simply didn’t need to plan for everything. He seemed genuinely surprised that different industries used different ways of working and that he faced so much resistance when he asked the team to plan. He said they kept saying things like “let’s just get something done” and “let’s build software, not write pointless documents”.

He’d heard all of the stories of teams just “doing” things with low-cost, high-efficiency tools and frameworks and only just enough documentation, and how some teams just seem to get stuff done, but he’d never experienced them first hand. He told me how he had shaken his head in anger at how misguided and misplaced these teams were with their “just do it” attitude, without truly realising that there is a world outside of *his* testing domain.

Mr F was still a naysayer of many ‘new-fangled’ techniques and approaches like Exploratory Testing, Test Driven Development, automated acceptance tests etc, but he was beginning to realise that there are other ways of working and that planning too much in advance was pointless for some teams. It was a hard lesson for him to learn; that there are other contexts out there.

He was a fascinating man but it highlighted to me just how many people simply have no idea there are other ways of working. There were a few others at this same event where I met Mr F who simply couldn’t comprehend that teams are releasing software each month to a high quality without the need for massive plans up front. And this is not agile versus waterfall, some of the stories were of waterfall teams just getting stuff done – and some of these teams were more agile than most.

Planning, it seems, is something many testers love to do. And I believe all testers plan, after all “He who fails to plan, plans to fail”. But there is a point in which that extra planning buys nothing. It is waste. There are always things we will never plan for. Things we will never think of. There are always things we will plan for that will never happen. That is life. It is the way of the world.

Finding the right balance is tricky. Some plans are good. Too many can be bad. None may be very dangerous indeed. What works for one team might not work for another.

I do a fair amount of running and (maybe because I’m a tester) I’ve got a plan for various things happening whilst out running. Like if I fall and break something (the joys of having brittle bones), if I’m attacked by some Hoodies (the joys of living in the UK), or chased by a skulk of foxes (the joys of living in rural Hampshire) or hunted by a massive driverless truck (the joys of watching Duel too many times). But I have never ever planned for being mugged by a rabbit.

But sure enough, one day a rabbit the size of the England Rugby team tripped me up whilst I was running and I swear it tried to steal my MP3 player and new running shoes. I would never have planned for this. I still struggle to believe it even happened. And apparently it’s not that uncommon to be attacked by animals. It was in the news a while back about a man who was surrounded by cows and forced to jump in to the River Thames. And another story about a man being chased home from the pub by an angry badger. But do we plan for these things?

Anyway. I digress. But the point is there are too many variables involved in software development (and life) for us to effectively plan for everything we think we will ever encounter, so we need to find the line between careful and necessary planning, and time-wasting processes that don’t lead to anything useful.

There are just some things that happen that you simply cannot plan for. So why try? Plan for the basics but don’t spend too long planning for the unknown, for the low probability and for everything you could ever think of. Most of it might not happen and I can guarantee there will be bigger problems you’ve never thought of.

So instead, be ready to accept new ideas and concepts, and be prepared to be flexible in how you approach a problem. Most important of all, be open-minded to new ways of working.

And I leave you with a perfect quote:

“One should not respond to circumstance with artificial and “wooden” prearrangement”

And do you know who said that?
The legendary Kung Fu master Bruce Lee.





At the Unicom conference I attended last week I noticed for the first time an organisation called QAI. I'm sure they are not new to most people but I've not seen them in the UK before. They are an American organisation (I think) offering testing certification. Like we need yet more to choose from…

But despite the fact I was not interested in their certs their banner did make me pay attention. They have a certification called CAST (Certified Associate in Software Testing).

I'm not sure whether there is any link to CAST (AST conference), I doubt there is, but I've heard of CAST (obviously) and I wondered whether they were playing off the association or simply hadn't known about the link when they came up with the cert name.

QAI seem like a fairly large player and are just starting to push into the UK market. More certifications, more arguments, more expense… Whether we agree with certification or not, I doubt the certification arena needs any more certs (not including mine, obviously). So watch the certification space. Not only could it become very crowded, but also fiercely contested. It could ultimately be the downfall of the certs arena: as more become available, employers get even more confused, the certs start to lose their value and people stop taking them. Then again, it could be a good thing for the cert industry in that testers start to get all of them…

With even more players offering certification it's made me realise I really need to crack on and get mine released. Mine (ST certs), unlike all the others, will be the ONLY one you need 🙂


Does testing on an Agile project negate the need for Metrics?

I've been interested by a post Mark Crowther put up on Test Republic a while back about requirements coverage and code coverage and any other sort of test coverage. (http://www.testrepublic.com/forum/topics/code-coverage-with-test-cases)

I've been testing in an agile environment for some time now and I've not yet had the need to record metrics of any type other than project level CI, code coverage and burndown; but nothing test case or defect related. And here's why (NOTE: this is my example; it might not be the same for you…):

  1. The definition of done tells everyone what needs to be done for a story/sprint to be marked as complete.
  2. One of the criteria in the definition of done, in our instance, states it should be tested.
  3. After checking the software against the acceptance criteria I explore the app looking for unexpected or unknown bugs. I do a lot of this and record my test activities as exploratory notes. There are no metrics from this.
  4. We handle defects directly relating to stories by chatting them through with the programmers, who fix immediately and deploy within minutes (if possible). After all, the story is not done if it has an outstanding bug.
  5. For defects that can't be fixed immediately we raise them as tasks against the story. They then get done. Again, story is not done if outstanding tasks.
  6. For defects that don't relate to a story in the current sprint we raise them as stories on the backlog for the customer to prioritise. It's the customer's decision in the end.
  7. We automate a regression suite which runs nightly so there is no need for regression test cases and no need for metrics other than a daily web page telling me something has failed.
  8. There is no need for requirement coverage because we are only really concerned with the current sprint and the stories included.
  9. Code coverage for unit tests is on by default, as are CI and automatic deployments. It's how we roll, whether agile or not. And these metrics are machine generated and not really traditional test metrics.
  10. The stories in the sprint are prioritised by the customer, so if the top story is programmer complete and I find a serious issue, the programmer stops story 2 and fixes story 1. The aim is the top-priority stories 100% complete rather than all of them 75% complete.
  11. The customer decides the next sprint so there is no need to produce test cases and metrics for future requirements. There's no guarantee that requirements will even be implemented.
  12. Our end sprint goal is always working, shippable software.
  13. I don't have testers who need detailed test cases.
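Point 7 above is the closest thing to a metric we consume, and it can be sketched in a few lines. This is purely a hypothetical illustration (the function name and message format are mine, not from any real tool): a nightly job could boil the whole regression run down to the single line the daily web page shows.

```python
from datetime import date

def status_line(results):
    """Render the one-line daily status from named pass/fail results.

    `results` maps check name -> True (pass) / False (fail). The returned
    string is all the "metric" the team reads each morning.
    """
    failures = [name for name, passed in results.items() if not passed]
    if not failures:
        return f"{date.today().isoformat()}: all {len(results)} regression checks passed"
    return f"{date.today().isoformat()}: FAILED: {', '.join(sorted(failures))}"

# A nightly cron job would run the suite, feed the outcomes in, and write
# the result out as a static page:
page = status_line({"login": True, "search": True, "checkout": False})
```

If the line contains FAILED, someone investigates; otherwise nobody needs to look at it. That is the whole reporting pipeline.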

So given the above, I've not yet, in three years on agile projects, had to gather and analyse the usual metrics, yet I know people still produce them. I appreciate that many agile teams work in many different ways so I'm really interested in finding out what these other ways are. I'd be interested to know in the comments how you all manage your tests and metrics.

Do you still really need to produce them?
Who needs them?
Why do they need them?
What are they used to measure?
How do you measure?


Shared: Some interesting topics. Some are test related too.

Thought I'd share some social / tester links that I've found quite interesting recently.

Set up by Matt Locke of http://test.org.uk/, this new event is set to be one of the most interesting events of next year. The event is not test related but is instead based around good storytelling. The ability to tell a good story about anything is a very human trait, one we testers can learn a lot from, especially when the story is real. As testers we are asked to tell stories each and every day. When we raise bugs, when we write test cases, plans and scripts. They are all stories telling something, to someone, in some medium. We may not realise they are stories but they are.

And it's our job to make sure we tell the story right. To get the tone, pitch, level of terminology and structure right. It's our job to get the Purpose, Audience and Context right. It's our job to tell a story. And despite the fact that TheStory event is not about testing, I think it's a great story to follow. Sign up to the RSS feed, recommend a storyteller or offer to speak at the event yourself.

Hasn't everyone got a story to tell?

This is just the greatest document I think I've ever read. On the surface it is just a guide about how to create an environmentally aware, safer and friendlier New York. Dig a little deeper and it becomes a fantastic example of how teams can pull together to create a collaborative document. It shows that collaboration does work. It shows that for anything that seems to already work or be in place, there are always ways to improve it. But for me it shows that there is no hard and fast way of doing something, but a set of recommendations, some tried and tested, some not, that can help to guide people in making difficult decisions.

And if you don't see any of that in there, it's still a fantastically fun read.

One of my all-time favourite articles, written about the street becoming a platform for technology. This article shows me that, as testers, we need to get cleverer about how we are going to test and how we are going to approach new problems. We can argue all we like about software and websites, but how are we going to test the next generation of technology solutions?

It reminded me of a talk James Whittaker did at SIGIST early this year in which he introduced some cool concepts from Microsoft and asked how we would approach the testing of it.

Great article.

A great guide for creating and running a conference. Some really good ideas but the one that really struck me was about blogging. Inviting people to the conference just to blog….sounds like a fab idea. As conferences become more visible through social media why not have people there blogging and tweeting about the presentations, stands, vibe, organisation, hotels, cities etc? Whether it is good or bad it shows some awesome openness and willingness for transparency on behalf of the organisers.

At Agile Testing Days I saw this first hand as some people were blogging and tweeting about the event. It was brilliant to see the stream coming through. As more testers realise that social media sources are becoming a hugely important way of sourcing information, blogging and tweeting about events could become a great commercial separator.

The problem comes if the organisers start trying to influence the peeps doing the blogging.

[slideshare id=697534&w=500&h=417&sc=no] (http://equalweb.net/is/ann-mcmeekin/)
Stumbled across Ann McMeekin's presentation on how to make web accessibility sexy the other day whilst researching some accessibility links and online checkers. The presentation doesn't make a lot of sense without the speaking (or accompanying notes) but gives you some idea of how we can start to think about accessibility as part of our roles.

I was in Berlin when this blog post dropped through my reader and it really resonated with me. It's Simon Morley's blog posting on bouncing ideas around. I've long been a fan of involving the whole team in solving a problem. There are points of view, mental models, personal theories and opinions in each and every single member of the team, and often bringing these views together generates the most suitable outcome.

It's kind of like the Wisdom of Crowds theory. I love the process of bouncing ideas around and, more often than not, the final outcome is fine-tuned by it. Great posting.

Stumbled across this article on certification and how actually, the employer needs to do a little bit of work and see past the certs. Really liked the tone of this one. Well worth a read.

Whether you believe in Social Media or not (although the fact you are reading this blog suggests you do), you simply cannot ignore the outstanding facts surrounding Facebook.

  • Average number of friends per user is 130
  • Total number of Facebook applications is 350,000
  • It is the number 1 photo site on the web
  • More events are created on the site per day than on Evite
  • People spend 8 billion minutes per day on Facebook, which makes it the number 1 site on the web according to many measurement services, including Nielsen and ComScore
  • The fastest growing demographic is users aged 35 and older
  • The page with the most fans is Barack Obama's, with 6.8 million fans
  • The TV show with the most fans is South Park
  • 70% of its users are outside the United States of America

I put a rant out on twitter a while ago about really stupid questions being asked on some forums. Questions like "how do I test?", "how can I test without a certification", "please answer fully how I test this problem" etc etc.

And Ben Simo (qualityfrog) responded with a link to a cool post he did a while ago. It's about stupid questions. I think this should be compulsory reading for all testers.

Wow. I love this theory, even though I'm not entirely sure I fully understand it. Let's say it's going to be taking a bit more of my time over the next few weeks. Social Objects – objects we interact with. Good reading, although heavy to understand at times. Well, for my tiny brain it is.

http://johnnyholland.org/ – awesome site. I visit this site to learn about social media, new ideas, concepts, communication and how we, as testers, can use social media to learn and develop. Really great content and some of it directly applicable to my daily testing tasks.

Let me know what's keeping you busy by leaving a comment.