Wednesday 24 June 2015

A pairing experiment for sharing knowledge between agile teams

Over the past month I've started running a pairing experiment in my organisation. The primary purpose of this experiment is to share knowledge between testers who are working in different agile teams, testing different applications and platforms.

The Experiment Framework

After researching pair testing, I decided to create a structured framework for experimenting with pairing. I felt there was a need to set clear expectations in order for my 20+ testers to have a consistent and valuable pairing experience.

This did feel a little dictatorial, so I made a point of emphasising the individual responsibility of each tester to arrange their own sessions and control what happened within them. There has been no policing or enforcement of the framework, though most people appear to have embraced the opportunity to learn beyond the boundaries of their own agile team.

I decided that our experiment would run for three one-month iterations. Within each month, each pair will work together for one hour per week, alternating each week between the project team of each person in the pair. As an example, imagine that Sandi in Project A is paired with Danny in Project B. In the first week of the iteration they will pair test Project A at Sandi's desk, then in the second week they will pair test Project B at Danny's desk, and so on. At the end of the monthly iteration each pair should have completed four sessions, two in each project environment.
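To make the cadence concrete, the schedule for a single iteration can be sketched in a few lines of code. This is purely illustrative, using the hypothetical Sandi and Danny pairing from the example above:

    # A minimal sketch of one monthly iteration's schedule, alternating the
    # hosting ("native") tester each week. Illustrative only.
    def iteration_schedule(pair, projects, weeks=4):
        first, second = pair
        schedule = []
        for week in range(1, weeks + 1):
            # Odd weeks: the first tester hosts; even weeks: the pair swaps.
            native, visitor = (first, second) if week % 2 else (second, first)
            schedule.append((week, native, visitor, projects[native]))
        return schedule

    for week, native, visitor, project in iteration_schedule(
            ("Sandi", "Danny"), {"Sandi": "Project A", "Danny": "Project B"}):
        print(f"Week {week}: {visitor} joins {native} to pair test {project}")

Running this lists four sessions, two hosted in each project environment, which is exactly the pattern described above.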

In between iterations, the team will offer their feedback on the experiment itself and the pairing sessions that they have completed. As we have yet to complete a full iteration, I'm looking forward to receiving this first round of feedback shortly. I intend to adapt the parameters of the experiment before switching the assigned pairs and starting the second iteration.

At the end of the three months I hope that each person will have a rounded opinion about the value of pairing in our organisation and how we might continue to apply some form of pairing for knowledge sharing in future. We'll then hold an in-depth retrospective to determine what we, as a team, want to do next.


An example of how one tester might experience the pairing experiment

A Sample Session

In our pair testing experiment, both participants are testers. To avoid confusion when describing a session, we refer to the testers involved as a native and a visitor.

The native hosts the session at their workstation, selects a single testing task for the session, and holds accountability for the work being completed. The native may do some preparation, but pairing will be more successful if there is flexibility. A simple checklist or set of test ideas is likely to be a good starting point.

The visitor joins the native to learn as much as possible, while contributing their own ideas and perspective to the task.

During a pairing session there is an expectation that the testers should talk at least as much as they test so that there is shared understanding of what they're doing and, more importantly, why they are doing it.

When we pair, a one hour session may be broken into the following broad sections:

10 minutes – Discuss the context, the story and the task for the session.

The native will introduce the visitor to the task and share any test ideas or high-level planning they have prepared. The visitor will ask a lot of questions to be sure that they understand what the task is and how they will test it.

20 minutes – Native testing, visitor suggesting ideas, asking questions and taking notes.

The native will be more familiar with the application and will start the testing session at the keyboard. The native should talk about what they are doing as they test. The visitor will make sure that they understand every action taken, ask as many questions as they have, and note down anything of interest in what the native does, including heuristics and bugs.

20 minutes – Visitor testing, native providing support, asking questions and taking notes.

The visitor will take the keyboard and continue testing. The visitor should also talk about what they are doing as they test. The native will stay nearby to verbally assist the visitor if they get confused or lost. Progress may be slower, but the visitor will retain control of the workstation through this period for hands-on learning.

10 minutes – Debrief to collate bug reports, reflect on heuristics, update documentation.

After testing is complete, it's time to share notes. Be sure that both testers understand and agree on any issues discovered. Collate the bugs found by the native with those found by the visitor and document them according to the traditions of the native's team (post-it notes, Rally, etc.). Agree on what test documentation to update and what should be captured in it. Discuss the heuristics listed by each tester and add any that were missed.

After the session the visitor will return to their workstation and the pair can update documentation and the wiki independently.
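For a compact reference, the one-hour structure above can also be written down as data. The sketch below is purely illustrative (the phase wording paraphrases the sections above) and simply prints an agenda for a session starting at a given time:

    # Illustrative only: the 10/20/20/10 minute structure expressed as data,
    # so a pair can print a quick agenda for their session.
    from datetime import datetime, timedelta

    PHASES = [
        (10, "Discuss the context, the story and the task for the session"),
        (20, "Native testing; visitor suggesting ideas, asking questions, taking notes"),
        (20, "Visitor testing; native providing support, asking questions, taking notes"),
        (10, "Debrief: collate bugs, reflect on heuristics, update documentation"),
    ]

    def print_agenda(start):
        current = start
        for minutes, description in PHASES:
            print(f"{current:%H:%M} ({minutes} min) {description}")
            current += timedelta(minutes=minutes)

    print_agenda(datetime(2015, 6, 24, 10, 0))  # e.g. a session starting at 10:00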

To support this sample structure and emphasise the importance of communication, every tester was also given the following graphic, which includes potential questions to ask in each phase:

Questions to ask when pair testing

I can see possibilities for this experiment to work for other disciplines - developers, business analysts, etc. I'm looking forward to seeing how the pairing experiment evolves over the coming months as it adapts to better fit the needs of our team.

Wednesday 17 June 2015

Notes from Nordic Testing Days 2015

Nordic Testing Days 2015 was my first European testing conference, both as an attendee and a speaker. I really enjoyed my time in Estonia. It was fantastic to listen to, and learn from, a set of speakers that I had never heard before. I also enjoyed meeting a number of people that I had previously only known through Twitter.

As a speaker, I presented a two-hour workshop on the first day of the conference titled "Become someone who makes things happen". I was nervous about presenting this to an international audience. I needed people to interact with one another for the workshop to be successful, so it was a relief to have a group of participants who were prepared to discuss, debate and role-play scenarios.

On the second day of the conference I was a last-minute replacement for a speaker who was ill, delivering a presentation titled "Sharing testing with non-testers in agile teams". This was a repeat of a talk I had given several times last year.

I'm particularly grateful to those who have included one of my sessions in their highlights of the conference:



Of the sessions that I attended, here are my key takeaways.

Security Testing

I started the conference by attending a full day tutorial titled Exploring Web Application (In)Security. Bill Matthews and Dan Billing presented some fundamentals of security testing by providing an intentionally vulnerable application. Each participant installed the application on a local virtual machine, which meant we could all independently exploit it - a fantastic learning environment.

This session was the first time I learned about the STRIDE security testing mnemonic, which I captured in a mind map format:



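For anyone unfamiliar with the mnemonic, the six STRIDE threat categories are easy to list. The prompts in the small sketch below are my own illustrative examples rather than content from the mind map or the tutorial:

    # The six STRIDE threat categories (Microsoft's threat modelling mnemonic).
    # The prompts are my own illustrative examples, not from the tutorial.
    STRIDE = {
        "Spoofing": "Can I pretend to be another user or system?",
        "Tampering": "Can I modify data in transit or at rest?",
        "Repudiation": "Can I perform an action and then deny having done it?",
        "Information disclosure": "Can I read data I shouldn't be able to see?",
        "Denial of service": "Can I make the application unavailable to others?",
        "Elevation of privilege": "Can I gain rights I wasn't granted?",
    }

    for threat, prompt in STRIDE.items():
        print(f"{threat}: {prompt}")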
Effective Facilitation

Neil Studd presented a session, which I only later realised was his first full-length conference talk, on Weekend Testing Europe: A Behind-the-scenes Guide to Facilitating Effective Learning.

As a co-founder and intermittent organiser of the WeTest Workshops MeetUp in Wellington, I found it great to listen to some real-life experiences of selecting and facilitating practical workshops. I particularly liked the reminders about the limits of the facilitation role, including "each attendee has their own light bulb moment, don't try and manufacture one" and "attendees are there to learn from each other and not just the facilitator".

I took the opportunity to talk to Neil after his presentation, as I had some specific questions for him. As a result I'm definitely planning to use the Weekend Testing Europe archive as a resource in future, both for internal and external workshops.

Gamification

Gamification is something I'd heard of, but never dug into. Kristoffer Nordström shared his experience of engaging end users using gamification: applying game techniques in non-game situations to motivate people and drive behaviours.

I found his experience report really interesting and would encourage you to look through his slides in the Nordic Testing Days Archive. Though I'm not sure whether gamification will work in my organisation, or where it might be applied, this talk certainly gave me a better understanding of how others use it.

Bad Work & Quitters

Rob Lambert delivered a keynote on Why remaining relevant is so important. The point that particularly resonated with me was perhaps tangential to the main topic: his view of "bad work".

Rob talked about how it seems that quitting has become trendy, with many people voicing their opinions about leaving a place that does "bad work". He questioned the definition of "bad work" by challenging how much of the concept was based on perception.

Rob also said that "sometimes 'bad work' is where the real change happens". This made me reflect on the opportunities I've had to make change. Perhaps it is from the worst situations that we can make the biggest difference.

Reflection

Erik Brickarp delivered a really interesting experience report on Going Exploratory. As he spoke, Erik kept returning to the idea that the learning from his attempt to implement change came through regular reflection. Only when he stopped and thought about how he was working did he have the opportunity to realise how he could have approached things differently.

Erik said "whenever I feel like I don't have time to reflect, that's a strong indication that I should stop and reflect." This was a good reminder to me. When I'm busy at work, that's when I need to take the time to pause and assess.

*****

I really enjoyed my experience at Nordic Testing Days. Thank you to Helena Jeret-Mäe for selecting my workshop as Content Owner, Kadri-Annagret Petersen for being my track co-ordinator, and Grete Napits for running a fantastic conference. I hope to be back again in the future.