This month's Lean Coffee was hosted by DisplayLink. Here are some brief, aggregated comments and questions on topics covered by the group I was in.
How to spread knowledge between testers in different teams, and how often should people rotate between teams?
- How do you know the right length of time for someone to spend in a team?
- When is someone ready to move on?
- How do you trade off e.g. good team spirit against overspecialisation?
- When should you push someone out of their comfort zone and show them how much they don't know?
- Fortnightly test team meetings playing videos of conference talks.
- Secondment to other teams.
- Lean Coffee for the test team.
- Daily team standup, pairing, weekly presentations, ad hoc sharing sessions after standup.
- Is there a desire to share?
- Yes. Well, they all want to know more about what the others do.
- People don't want to be doing the same thing all the time.
- Could you rotate the work in the team rather than rotate people out of the team?
- It might be harder to do in scenarios where each team is very different, e.g. in terms of technologies being tested.
- There are side-effects on the team too.
- There can't be a single standard period of time after which a switch is made: the team, person, project etc. must be taken into account too.
- Can you rotate junior testers around teams to gain breadth of experience?
What piece of testing wisdom would you give to a new tester?
- Be aware of communities of practice. Lots of people have been doing this for years.
- ... for over 50 years, in fact, and a lot of what the early testers were doing is still relevant today.
- There is value in not knowing - because you can ask questions no-one else is asking.
- Trust your instinct and gut when you're exploring a new feature or area.
- Learn to deal with complexity, uncertainty and ambiguity. You need to be able to operate in spite of them.
- Learn about people. You will be working with them.
- ... and don't forget that you are a person too.
- Use the knowledge of the experienced testers around you. Ask questions. Ask again.
- Make a list of what could be tested, and how much each item matters to relevant stakeholders.
- Pick skills and practice them.
Where you look from changes what you see.
- I was testing a server (using an unfamiliar technology) from a client machine and got a result I wasn't sure was reasonable.
- ... after a while I switched to another client and got a different result.
- Would a deeper technical understanding have helped?
- Probably. In analogous cases where I have expertise I can more easily think about what factors are likely to be important and what kinds of scenarios I might consider.
- Try to question everything that you see: Am I sure? How could I disprove this?
- Ask what assumptions are being made.
- What you look at changes what you see: we had an issue which wasn't repeatable with what looked like a relevant export from the database, only with the whole database.
- Part of the skill of testing is finding comparison points.
- Can you take an expert's perspective, e.g. by co-opting an expert?
Using mindmaps well for large groups of test cases.
- With such a large mindmap I can't see the whole thing at once.
- Do you want to see the whole thing at once?
- I want to organise mindmaps so that I can expand sub-trees independently, because they aren't overly related.
- Is wanting to see everything a smell? Perhaps the structure isn't right?
- Perhaps it's revealing an unwarranted degree of complexity in the product.
- Or in your thinking.
- A mindmap is your mindmap. It should exist to support you.
- What are you trying to visualise?
- Could you make it bigger?
- Who is the audience?
- I don't like to use a mindmap to keep track of project progress (e.g. with status).
- I like a mindmap to get thoughts down.
- I use a mindmap to keep track of software dependencies.
Image: https://flic.kr/p/iv7P