
Community Building


David Högberg tagged me on a thread about teams in the Rapid Software Testing Slack the other day. I've paraphrased the conversation here:
Our team has expanded to include groups working on many different products. I'm thinking of starting a Testing Community of Practice with a Slack channel and perhaps a monthly meeting. I'd like to share things like articles, how we test our applications, interesting bugs, what we've learned, the business problems our products solve, who our users are, and so on. Looking for thoughts, ideas, advice, articles, etc.
I've spoken to David about this stuff in the past, and I'm an agreeable kind of chap, so I started listing some of the things my team at Linguamatics has done over the years until it became apparent that I had quite a lot to say and I'd be better off typing it up in a proper editor and posting it somewhere other than Slack. So here it is.

-- 00 --

Our setup is a little different to the scenario above in that we are a standalone Test team whose members support multiple development teams by working in them, or very closely with them, on a day-to-day basis. However, despite being nominally part of the same functional team, it can be a challenge for testers to keep abreast of what's going on elsewhere in the company (in terms of product or testing), to get broader context for their work, or to contextualise the work of others.

Because we recognise this challenge, we've experimented with a bunch of different devices for collaboration, information sharing, and empathy building over the years. Here are some of them.

We have a weekly team meeting. It's a brief sharing of status and then a presentation on something by a member of the team or, occasionally, a guest from another part of the company. There's no formal rota on presenting but it's part of our culture that we'll each regularly present, and that includes me.

Topics are not heavily policed but are expected to be relevant to our work. For example, the tester on each core product feature will give an overview of it, the motivation for it, the implementation, how it was tested, and so on. Team members who've been to conferences will summarise something back to the team about what they learned, or found interesting. If someone's been experimenting with a new tool or approach, they might do a demo of it.

The information exchange is useful in these meetings but there are valuable side benefits too, including a safe space to practice reporting on testing, to get experience presenting, and to build empathy with team mates.

On the other four days of the week we have a stand up meeting. This is intended to be short and is a place to give relevant updates, flag issues, or ask for help. We also make sure that any test suite failure investigations that haven't been picked up are allocated in this meeting. If conversation here starts to go deep, we ask for it to be taken to some other forum. When someone has something they'd like to share quickly, they'll often offer to do it for 5-10 minutes after stand up, as an optional thing for anyone who wants to stick around.

In stand up we have to regularly remind ourselves to keep focused on outcomes over outputs. It's so easy to slip into a laundry list of "I did this, I did that, I did the other" which consumes time and achieves little of benefit. As a manager, I also resist turning them into status reports from others to me by scheduling those conversations separately.

Around Christmas I run a session that I (jokingly) call Testing Can Be Fun. This takes the form of some kind of puzzle or game which requires skills that we might also use in testing. I've written about some of these over the years, for example in Merry Cryptmas, Going On Ahead, and One Way To Test. It's good to see each other in non-standard situations, be a bit more relaxed with each other, and watch how we all approach the same problems in an area where we're equally lacking in context and expertise.

One of my team started an occasional session called What I Don't Know About ... in which someone would present what they knew about an area of one of our products and then invite the rest of the team to fill in the holes. This benefitted both the presenter and the participants equally in terms of learning, but additionally showed us that it's OK to expose our lack of knowledge and to ask for help.

We have a reading group, again facilitated by someone on the team, which meets every three weeks or so and is currently working through categories of test ideas from The Little Black Book on Test Design. We're discussing a category and then trying to apply it as a kind of mob to one of the applications we work on. Again, I've written about some of the things we've looked at in the past, e.g. Observability, Safety, and Checklists.

Some of the team are running a Python coding club that meets a couple of times a month. The current project is a tool to extract information from our continuous integration system and chart some build statistics.
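As a flavour of the kind of thing the coding club builds, here's a minimal sketch of summarising build outcomes. The record format and job names are invented for illustration; a real version would pull the data from the CI system's API and might chart the results with a plotting library rather than printing them.

```python
from collections import Counter
from datetime import date

# Hypothetical build records, standing in for data pulled from a CI API.
builds = [
    {"job": "core-tests", "date": date(2020, 5, 4), "status": "passed"},
    {"job": "core-tests", "date": date(2020, 5, 4), "status": "failed"},
    {"job": "ui-tests",   "date": date(2020, 5, 5), "status": "passed"},
    {"job": "core-tests", "date": date(2020, 5, 5), "status": "passed"},
]

# Count outcomes per (job, status) pair; a fuller version might group by
# week and plot pass rates over time.
outcomes = Counter((b["job"], b["status"]) for b in builds)

for (job, status), count in sorted(outcomes.items()):
    print(f"{job}: {count} {status}")
```

Even a small script like this gives the club something concrete to refactor, extend, and review together.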

We have a quarterly retrospective for our team in which we try to talk about issues (positive and negative, but naturally more of the latter) that are on us rather than part of a specific project or product. This aims to be a place where we can reflect on and constructively criticise what we do. It's also a place where people get a chance to practice facilitation in a friendly environment. In a similar vein, one of the team has recently begun a series of events to build a Team SWOT analysis.

On a regular basis, over lunch, another team member sets up TestSphere Card of the Week. The format changes from session to session but always involves the drawing of TestSphere cards in some combination to provoke conversation about testing.

A couple of years ago we ran an internal test conference where everyone from the team had the opportunity to present on some aspect of testing. We budgeted it as a training day, but one in which we didn't have to pay for a trainer, and set aside some work time for preparation. When we retrospected on it, the consensus was that more preparation time would have been welcome, but that spending a day together talking about our craft was valuable. I blogged about my own presentation in Exploring It!

Katrina Clokie inspired a team member to start a pairing experiment. Anyone who signed up would get another tester to work with once a week for a few months. It ran for a couple of years and, while pairing isn't happening all day every day on our team, it's certainly been normalised. I carried on when the experiment ended and have been pairing with a different member of my team each quarter for a long time now.

In common with many places these days, we've got chat software (currently Microsoft Teams) in which we've made a few channels for standard work topics but also an informal channel (currently called Tea, Coffee, Beer, and Wine) in which we can chew the fat about very non-work stuff. To give just one example, I've been posting pictures of the unusual crisps I've bought during lockdown.

When we grew big enough to have line managers in the team, we started a Community of Practice for line management. It ran its course for us but has evolved to become a company-wide thing. We also introduced a brown bag lunch meeting that we call Team Eating, which has been running for five years now at around one session a month. It brings together colleagues from across the company to build inter-team relationships.

We've recently been trying to set up a Community of Practice with other test groups in our parent company. This is proving more challenging for us as we're geographically distant with little shared context. Three of us from Cambridge visited a group in Brighton last year and we were intending to host a return visit before circumstances got in the way. We've managed to have several virtual meetings but nothing regular has stuck yet. I feel that there's still enthusiasm for this kind of thing, but we need to find the right format and frequency.

If that feels like a looooong list, don't be fooled into thinking that we're all doing all of those things all of the time. Everything except for the daily and weekly meetings and Testing Can Be Fun is optional, and there's no negative impact, and no offence taken, for non-attendance.

Even the meetings that are non-optional are missed by all of us from time to time, when something else needs our attention, and we trust each other to gauge when that's appropriate.

I've found that it's natural for enthusiasm to wax and wane, and that it's OK to let the initiatives that have run out of energy stop or change direction. I was delighted when Drew Pontikis said that his experience was the same in his talk, Leading Communities of Practice, recently.

Also, some of these initiatives just won't work, and all of them will likely receive criticism and scepticism from some members of the team. I wouldn't say don't listen, because you might learn something useful by doing that, but I would definitely take the perspective that the people who turn up and participate are the right people to be there.

That doesn't mean I'd judge anyone who doesn't attend negatively: it's their choice and they'll have their reasons. I once gave a presentation to one person at a local meetup (even the organiser didn't turn up that night!) and, yes, of course it was disappointing in some respects but the person who attended was very engaged and we had a great conversation.

As the team manager I try to attend everything of this kind that anyone in my team organises. I want to show support for them as an individual, and to give them confidence that they can attempt something and it will be taken seriously. If they have an idea for something they'd like to try, I'll encourage them to set it up and see what happens, and how they feel about it.

Finally, while I'm very interested in sharing ideas and experience, retrospecting, and learning together, there's a secondary aim here that I don't discount at all: the chance to see each other as rounded personalities rather than as one-dimensional words in a bug report, the opportunities to build trust and personal relationships with colleagues, and places to see that our team has shared purpose and values and to discover what they are.

Comments

  1. This is great! Thank you! This gives me several good ideas. I especially like the idea of using TestSphere cards to facilitate discussions, and your "What I don't know about" segment sounds brilliant.

