Sunday, June 22, 2014

There is no such thing as Agile Testing

I have struggled for a long time to find a reasonable meaning and definition for the phrase "Agile Testing". So far I have been unsuccessful in finding one definition that can stand my scrutiny. Probably there is no such thing as "Agile Testing" at all. Possibly yes.

Before I proceed – let me make a distinction between "Agile" and "agile", a distinction James Bach has long suggested. The word "agile" is a dictionary word meaning "swift" or "quick"; when applied to software, it simply means what it means in the dictionary. Good, reasonable software people have been attempting to be "agile" in their project context, as demanded by stakeholders, since long before the industry invented the buzzword "Agile" (note the capital "A" here). The word "Agile" is more of a marketing term, invented to describe a ceremony-laden model of developing software. It promises continuous, small, iterative and quick pieces of deployable software, delivered straight to market. It is the fashion of the day and is often seen as a panacea for all the problems of slow, buggy, boring, year-long projects draining millions of dollars, where the first 4-6 months would be spent agreeing upon the requirements or the initial design. In today's world the market demands speed and flexibility from businesses making or using software – the days of big upfront design and year-long software projects are getting over.

You can think of "agile" as the drinking water you get from the tap, and "Agile" as your favorite brand of mineral water, bottled and sold for a price with the promise of a certain level of purity.

Also, for the purpose of this post, let me define what testing is. Testing is an open-ended activity of evaluation, questioning, investigation and information gathering around software and its related artifacts. It is typically done to inform stakeholders about potential problems in the product and to advise them about risks of failure, as quickly and as cheaply as possible. There is NO one "right" (certified) way to do testing, and no one right time in the project lifecycle to start it. The context of the project, defined by the people in the project including the stakeholders, dictates the form and essence of testing. Testing does not assure ANYTHING; it informs (to the best of the tester's ability and intent) about problems in the software that can threaten its value. Given constraints of time and money, testing – even though it is an open-ended evaluation and investigation activity – constantly seeks to optimize its course to find problems faster and report them in the right perspective. This requires testers to be good, quick learners, skeptics and thinkers, with a diverse set of skills in business, technology, economics, science, philosophy and maths/statistics, among others. In some sense testing is like a sport or a performing art that becomes better with practice and improvisation. A professional tester needs to practice (meaning do) testing the way a professional musician or sportsperson practices.

Good testing thus:

  • Focuses on working closely with programmers
  • Uses tools/automation to perform tasks that are best done by a computer
  • Favors a lightweight bug tracking process, primarily focused on a faster feedback cycle to developers and speedy fixing of important bugs (important to stakeholders)

When books, blog posts, articles and conference presentations talk about "Agile testing", it is always in contrast with so-called "traditional testing". Any meaning or interpretation of traditional testing assumes a stereotypical "traditional" tester. So, let me attempt to define one.

A traditional tester is one who has worked on a waterfall software project as part of a dedicated (independent) testing team. There would be a wall between the development and testing teams, and the code to be tested would be thrown over the wall for testing. Testers used heavily documented test cases and relied on elaborate requirement documentation. Bugs were reported in a formal bug tracking system, and it was a tester's pride to fight to defend the bugs they logged. Testers resisted changes to requirements in the middle of the project, insisting that changes would force them to rework test cases and retest the application, adding to the overall cost of the project. Testers assumed the role of quality police and took pride in being the final arbiters of the "ship" decision.

For the uninitiated, here are a few examples of what (I believe) is NOT Agile testing:

  • Writing unit tests in an xUnit framework – you are not testing
  • Doing xDD – there is a host of three/four-letter acronyms along the lines of /something/-driven development. As many agile folks admit, these are development methodologies, so let me not go deep into explaining why they are not related to testing.
  • If you are working with continuous integration tools and your automation gets kicked off in response to a new build/check-in – you are not testing
  • If you are writing stories or participating in scrum meetings – you are not testing
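To make the first bullet concrete, here is a minimal sketch of the kind of xUnit-style check the post is talking about, using Python's standard unittest module. The `discount` function is hypothetical, invented purely for illustration – the point is that writing such checks is part of development, which the post argues is distinct from testing as defined earlier.

```python
import unittest

# Hypothetical function under development; not from any real codebase.
def discount(price, percent):
    """Return price reduced by the given percentage, rounded to cents."""
    return round(price * (1 - percent / 100.0), 2)

# A developer's xUnit-style unit checks: useful, but a development
# activity rather than the open-ended investigation called testing.
class DiscountTest(unittest.TestCase):
    def test_ten_percent_off(self):
        self.assertEqual(discount(100.0, 10), 90.0)

    def test_zero_percent_is_identity(self):
        self.assertEqual(discount(49.99, 0), 49.99)

if __name__ == "__main__":
    # exit=False so the script can continue after the test run
    unittest.main(exit=False)
```

Such checks confirm that the code does what the programmer intended; they do not, by themselves, investigate what the software might do wrong for a stakeholder.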

Finally, here are 3 reasons why I believe there is no such thing as "Agile Testing":

Agile Testing people do not talk about testing skills

If you know what testing is and you do it, it is obvious that you know what skills you need and how to work to improve them. Agile people are often confused about what is testing and what is not, hence you cannot expect them to articulate testing skills. You typically hear things like "collaboration", "programming skills", "think like the customer" and so on. I strongly feel that these folks have no clue about testing or testing skills; I bet they are just making it up. Software testing is a special skill in itself. Many people study and practice it as a profession and a lifetime pursuit. There are testing conferences happening all over the world. There is a growing body of knowledge about the craft of software testing.

It is sad that Agile folks have no idea about these skills. All they talk about is how developers or team members on Agile projects work and what they believe. This is what really bothers me about the idea of Agile testing: the idea is being badly articulated.

"Something that everyone in the team does" – that is how Agile folks define testing. While everyone in a project team owning responsibility for the project's success is a noble and unquestionable idea, making testing everyone's responsibility is shooting yourself in the foot. Very soon we get into the "everybody-anybody-somebody" type of problem. Expecting developers to excel at their bit of testing is OK, and expecting business analysts/story writers to capture requirements well is fine too. But making everyone responsible for testing turns a blind eye to the skills required of professional testers. This idea of everyone-does-testing is rampant in Agile teams. Why call this testing by a special name, "Agile testing"? In terms of roles, since everyone does testing, you may not even have a designated role called tester.

Agile Testing is different from Traditional Testing – but not quite

Inevitably, I now need to introduce the term "traditional testing". Agile folks would argue that the testing that happens in an Agile project is different from "traditional testing"; they point to testing against user stories as opposed to detailed requirements. Wow – if your test basis is a story instead of a detailed requirement document, you are doing Agile testing. But how different is that, really?

Much of the trouble for testers transitioning to agile projects comes from their dominant beliefs about testing. For someone who worked in a typical outsourced IT environment, it was difficult to work with stories instead of elaborate requirement documents. It was challenging to work closely with developers/programmers and speak their language when, all along, a wall had stood between them and the development team. Automation, for these testers, was something along the lines of QTP or some other GUI automation tool, whereas agile teams used the likes of Selenium, API testing and unit testing.

  • Many testers cannot work with leaner documentation (requirements)
  • When requirements constantly change, they are thrown off track – they cannot test without test cases
  • There is no more wall between dev and test, so a tester is expected to work directly with developers. Some are intimidated by this possibility
  • Testers who are familiar with GUI automation tools like QTP are suddenly exposed to tools that work under the skin – the expectation is to understand and work with formal programming languages. This is terrifying to many testers.
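To illustrate what "working under the skin" means in contrast to driving a GUI, here is a small hedged sketch in Python. The `create_user` handler is entirely hypothetical, standing in for a real application's service layer; the point is that a tester on such a team exercises the application's API directly in code, with no screens or GUI recorder involved.

```python
import json

# Hypothetical application-layer handler (a stand-in for a real service).
# Testing "under the skin" means calling this directly, bypassing the GUI.
def create_user(request_body):
    """Parse a JSON request and return a response-like dict."""
    data = json.loads(request_body)
    if not data.get("email"):
        return {"status": 400, "error": "email is required"}
    return {"status": 201, "id": 1, "email": data["email"]}

# Direct API-level checks a tester might script, instead of recording
# clicks against a registration form:
ok = create_user('{"email": "a@example.com"}')
assert ok["status"] == 201

bad = create_user('{}')
assert bad["status"] == 400 and "error" in bad
```

Scripting at this level requires reading and writing a programming language, which is exactly the shift that unsettles testers who only ever drove GUI tools.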

So, there is no such thing as "Agile Testing", but there is "good testing". If you are a good tester asked to work on an Agile project, what do you do? Fit yourself into the project context and keep doing the good testing that you always did. Do not get distracted by the jargon and marketing terms that people and consultants throw around.

I think there are some ideas I have not touched upon – the Agile testing quadrants, and why exploratory testing is such a hit with "Agile" people. Well, that is for part 2. Let's see how this pans out.