Agile Testing FAQs and Mythbuster – Software Testing Atlanta Conference 2015

Here is my slide deck from my talk at the Software Testing Atlanta Conference. It was lots of fun to deliver. Full, engaged room (even though it was one of those dreaded post-lunch slots!), good questions, good laughs. It was also refreshing to do a US conference without checking into a hotel, let alone a 12+ hour flight… (one of the advantages of living in Boston…)

Thank you to the conference team for inviting me! Always fun to visit Atlanta (and have some BBQ on the way back to the airport…)

Why Agile Testing

Background

I recently had a couple of weeks with several activities related to “Agile Testing”. For those not familiar with it, “Agile Testing” is the name we give to the set of thinking guidelines, principles, and practices that help run the testing aspects of product development/maintenance more effectively under an Agile delivery approach.

A question that came up while presenting the concepts today at a client was “What’s broken? Why do we need this?”. While my presentation covered some of the rationale, the question made it even clearer (not that I needed any convincing…) that the guided evolutionary approach to improvement is a winner. If people don’t yet feel the need/pain, there is a lower chance they will do something about it.

So the question comes up – why don’t they feel the pain? Or alternatively, maybe they feel the pain but don’t associate it with the need for “Agile Testing”.

So I wanted to briefly touch on a few questions/indications that you might need to pull ideas from “Agile Testing” as your next improvement step.

Some indications you might need to look at “Agile Testing”

  • You applied WIP Limits and testing is becoming a bottleneck. Not once or twice, but quite consistently.
  • You agreed to include test automation as part of the “Definition of Done” and you are seeing a queue build up, meaning automation is slowing the entire process down and creating significant slack for the people NOT doing automation.
  • You find a lot of defects that send you back to rework the technical design due to a lack of mutual understanding of the functional requirements / stories, or you find yourself leaving things ugly since there is no time for the rework – earning you customer feedback that you are not really providing high-quality deliverables.
  • You are not able to run a very granular flow – everyone claims smaller stories are not useful, since the overhead of delivering them to testing and testing them is so high. “Let’s just keep using bigger items and deliver to testing no more than every 1-2 weeks.”
  • People feel that the test automation approach you have now doesn’t cut it. The total cost of ownership / lifetime costs are very high, and even though people understand the need to have automated coverage in order to integrate often, they are very frustrated by the costs.
  • Testers are confused. Do they need to be automation specialists? Domain experts? Technical experts? Supermen? In this Agile Whole Team approach where there is flexibility and collective ownership – what is their unique value? What should they focus on?

My latest presentation touches on some of the reasons these issues come up when going Agile, and introduces how “Agile Testing” can help. For more about this, you are welcome to join me at one of the upcoming Agile Testing workshops AgileSparks runs in Israel and Europe. Contact us for more information.

Read my article on the journey of a Tester from waterfall land to Agile/Kanban land (in Hebrew…)

A couple of months ago I ran into Think Testing – a new magazine for testers in Israel (published in Hebrew).

I found it very interesting, and decided I wanted to contribute.

My article was published in issue #3. It tries to provide some insight into the experience of a tester when their organization/team decides to go Agile.
(Update: the article is no longer available at its original location, so use SlideShare instead.)

I also like the design work they are doing for the magazine. It has a really professional feel.

Let me know what you think.

Hag Sameach! (Happy Holiday!)

UPDATE: An English translation is now available on SlideShare.

Collaborating with specialized roles using kanban classes of service

I want to share a solution I came up with together with a performance / non-functional testing team working in a product group at a large enterprise. This solution deals with the challenge of bridging the principles of "Those who build the system test it" and "Non-functional testing is a collaboration role" with the fact that specialized roles such as performance testers are usually stretched to cover the demand for their services.

This group, of course, wanted performance testing for the product. What usually happened, though, is that the performance team only got usable features towards the end of the release (think waterfall-like behaviour). This usually meant serious waivers and compromises around performance testing.

The first step taken by the product group was to work on breaking features into smaller viable features and stories more effectively. Once this was done, it was easier for the performance testing team to get involved throughout the release and achieve a more reasonable flow.

Things should have been great by now. 

BUT then another problem surfaced, even while we were discussing how this would work. 

It was clear that the capacity of the performance testing team wasn't sufficient to really address everything. 

The naive policy meant that when a feature arrived at performance testing, they would decide whether they had time to cover it, do risk management, and either test it or skip it.

The downside of this was that it's a black/white decision. It misses the opportunity for the agile delivery teams to do at least SOME performance testing, even if they don't have the capabilities, tools, or expertise of the dedicated performance testing team.

Our suggested solution was to use the concept of Kanban Classes of Service to improve on this naive policy.

Since we already know not every feature requires the same performance-testing attention, let's not wait until it arrives at the performance team to make this classification. Let's do it earlier, before we go into development/testing.

With this classification, policies can be set up that involve both the performance testing team and the agile delivery teams in the work of performance / non-functional testing.

We came up with a flag system:

  • Red – the performance team MUST be involved hands-on – probably by joining the Feature team working on this feature for the duration of the effort.
  • Yellow – the performance team advises/consults, but most work is done in the teams. A representative of the performance team will visit the Feature team quite often while the Feature is being worked on.
  • Green – no involvement needed from the performance team.

This system helps drive collective ownership of non-functional testing. One of the first things that happened is that the Feature teams told the performance testers that there are some kinds of tests they can run on their own, although they don't have highly specialized performance tools.
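To make the flag policy concrete, here is a minimal sketch in Python. The feature names, function, and policy strings are hypothetical illustrations of the Red/Yellow/Green classification described above, not part of the actual (physical-board) solution:

```python
from enum import Enum

class PerfFlag(Enum):
    RED = "red"        # performance team joins the feature team hands-on
    YELLOW = "yellow"  # performance team advises; feature team does most work
    GREEN = "green"    # no performance-team involvement needed

def performance_policy(flag: PerfFlag) -> str:
    """Return who owns the performance-testing work for a feature."""
    if flag is PerfFlag.RED:
        return "performance team embedded in the feature team"
    if flag is PerfFlag.YELLOW:
        return "feature team, with a performance-team consultant visiting"
    return "feature team only"

# Classify before development starts, not when work reaches the perf team.
backlog = {"new search API": PerfFlag.RED, "settings page": PerfFlag.GREEN}
for feature, flag in backlog.items():
    print(f"{feature}: {performance_policy(flag)}")
```

The point of the sketch is that the decision is made up front, per feature, and only the Red class consumes scarce performance-team capacity hands-on.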

We are also experimenting with systems like this for involving architecture teams, and it applies to most kinds of super-specializations that are not easily integrated into Feature teams.

Managing the overall flow using kanban will also help us see whether a bottleneck develops, and what can be done to overcome it.
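The bottleneck signal mentioned above can be sketched very simply: a column that is persistently at its WIP limit while items queue upstream of it is a bottleneck candidate. The column names and counts below are hypothetical, for illustration only:

```python
# WIP limits per board column (None = unlimited), current item counts,
# and items waiting to enter each column.
wip_limits = {"dev": 4, "perf testing": 2, "done": None}
board = {"dev": 2, "perf testing": 2, "done": 7}
queue_upstream = {"perf testing": 3}

# Flag columns that are at their limit with work piling up in front of them.
for column, limit in wip_limits.items():
    at_limit = limit is not None and board[column] >= limit
    if at_limit and queue_upstream.get(column, 0) > 0:
        print(f"possible bottleneck at '{column}': "
              f"{board[column]}/{limit} WIP, {queue_upstream[column]} waiting")
```

In a real kanban tool you would watch this over time rather than from a single snapshot, since a column briefly at its limit is normal; persistence is what matters.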

Lean/Agile Testing

I've been a bit quiet lately on the blog front (as well as on Twitter, for those who are following).

Mainly I've been busy preparing an AgileSparks Agile Testing training with Ronen Bar Nahor. While it was a lot of work, it has been a great experience. We tried to take some of the Lean/Kanban work we've been focused on lately and apply it to the testing domain. Those following my blog can see some of that thinking already.

Applying concepts like limited WIP to defects, hardening, DONE DONE, and instilling collective test ownership makes a lot of sense for Agile Testing in our view.
It also helps that one of the teams that has progressed the furthest towards an Agile Testing approach is a Scrumban team…

The first round runs next week and is already sold out, and we are scheduled to run it both publicly and internally several times in the upcoming months, all in Israel (for the moment…)

If you are interested in hearing more about this, feel free to contact me or AgileSparks.

I hope in the upcoming days I'll return to blogging more regularly. My next area of focus is the synergy between Agile and the Theory of Constraints. Stay tuned…