One of the challenges of testing is coming up with new things to try. Sure, there are common software anti-patterns to look for, and probably specific, recurring kinds of issue in your particular product, and these can form part of a test strategy. But, particularly as your application matures, you're more likely to find higher-value problems through novelty than through simple repetition.
During a recent round of exploratory testing we spent some time brainstorming test ideas. As a driver for productivity, spontaneity and creativity we tried something we hadn't done before: a strict two minutes on each of the CRUSSPIC STMPL letters (the quality criteria mnemonic from the Heuristic Test Strategy Model), with every suggestion that was shouted out noted straight onto the whiteboard without discussion or elaboration.
After a brief review and some aggregation and refinement, we generated a bunch of session charters which included some new general approaches. Here are a couple:
- tool-driven sessions: take a tool we'd never used but which we thought might be useful to us in testing, or which could interact with our product in some way, and attempt to apply it. We'd hope to get unusual interactions with the application under test (AUT), and perhaps additional value from knowledge of the tool and some idea of whether we should look at it properly.
- live issue sessions: take currently unresolved issues from our live customer ticketing system and try to resolve them with the development build. Here, our motivation is not so much to find an answer for the customer - although that might be a fringe benefit - as to put the tester into the customer's world, with a problem to solve and no known solution.
Coda: The two approaches I've mentioned did give us the returns we were after, along with some others, including these: we exercised our skills in learning enough about something to make use of it quickly; we found a bug in a third-party tool; we identified a tool we're definitely going to look at further, and wrote it up on our wiki to share the knowledge within the team; and we learned enough about XSLT to solve a customer issue which, as it happened, could be applied almost directly to the version of the product in the field too.
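The post doesn't show the transform we actually wrote, but for anyone doing the same kind of quick study, the usual starting point when learning just enough XSLT is the identity transform plus a single override; a minimal sketch might look like this (the legacyField element name is purely hypothetical, standing in for whatever the customer's data needed fixed):

```
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative only: an identity transform that copies the input
     document through unchanged, with one override added on top. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

  <!-- Default rule: copy every attribute and node as-is -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>

  <!-- Override: silently drop a hypothetical obsolete element -->
  <xsl:template match="legacyField"/>

</xsl:stylesheet>
```

The appeal of this pattern for a time-boxed session is that the identity transform does all the copying for you, so each new requirement is just one more small template override rather than a rewrite.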
Image: http://flic.kr/p/58KgGS