The Association for Software Testing has been crowdsourcing a book, Navigating the World as a Context-Driven Tester, for the last three years. Over that time 28 questions or statements about testing have been posed to our community and the various answers collected and collapsed into a single reply.
Lee Hawkins, the coordinator of the project, has just blogged about the experience in The wisdom of the crowd has created an awesome resource for context-driven testers. He pulled some statistics from the records he's kept, showing the level of interest in each question or statement as measured by the number of responses from the community. Those are the red bars on the chart at the top, with values ranging from 4 to 28.
I replied every single time Lee posted, with a very specific mission in mind:
I've decided to contribute by answering briefly, and without a lot of editing or crafting, by imagining that I'm speaking to someone in software development who's acting in good faith, cares about their work and mine, but doesn't have much visibility of what testing can be.
I thought it might be fun to see whether interest in my answers correlated with interest in the question. The view counts from Blogger are the blue bars on the chart, ranging from a few hundred to several thousand, and they show no obvious relationship at all.
I don't trust Blogger's numbers very far, but I'm prepared to believe that the posts with higher counts were somewhat more popular. These are the top three:
- 69.3%, OK? (What percentage of our test cases are automated?)
- Meet Me Halfway? (Stop answering my questions with questions)
- Can Code, Can't Code, Is Useful (If testers can’t code, they’re of no use to us)
You can find all of my responses here.