

Showing posts from December, 2017

When Support Calls

Just over a year ago now, I put out an appeal for resources on testers doing technical support. A tester on my team had asked for background material before his first support call and I didn't know of any, beyond our internal doc for support staff. It turns out there's not much out there: I got one book recommendation, The Mom Test, which isn't strictly about either testing or technical support, and a couple of offers to pool experiences from local testers Neil Younger and Chris George. I bought and blogged about The Mom Test, and started a Google doc where Neil, Chris, and I began to deposit notes, stories, and advice (our fieldstones). Some of my own material was culled from blog posts here on Hiccupps (e.g. 1, 2) at a time when I was managing both the Test and Support teams at Linguamatics. When the doc had got to about 20 pages, I began the painstaking process of editing it into shape. Eventually four broad categories emerged: What even is technical supp

Cambridge Lean Coffee

This month's Lean Coffee was hosted by us at Linguamatics. Here are some brief, aggregated comments and questions on topics covered by the group I was in.

Performance testing

We have stress tests that take ages to run because they are testing a long time-out... but we could test that functionality with a debug-only parameter. Should we do it that way, or only with production, user-visible functionality? It depends on the intent of the test, the risk you want to take, and the value you want to extract. Do both? Maybe run the long one less often?

Driving change in a new company

When you join a new company and see things you'd like to change, how do you do it without treading on anyone's toes? How about when some of the changes you want to make are in teams you have no access to, on other sites? Should I just keep my head down and wait a couple of years until I understand more? Try to develop face-to-face relationships. Find the key players.
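The debug-only parameter idea from the performance testing discussion can be sketched roughly like this. This is a minimal illustration, not anyone's actual system: the Session class, its names, and the numbers are all invented for the example. The point is that making the time-out configurable lets the same expiry logic be exercised in milliseconds during a debug run, while the long, production-realistic run can still happen, perhaps less often.

```python
import time

# Hypothetical session whose expiry time-out is configurable.
# In production the time-out might be an hour; exposing it as a
# parameter lets a test exercise the same code path in milliseconds.
class Session:
    def __init__(self, timeout_seconds=3600):
        self.timeout = timeout_seconds
        self.started = time.monotonic()

    def expired(self):
        """True once more than the time-out has elapsed since creation."""
        return time.monotonic() - self.started > self.timeout

def test_session_expiry(timeout=0.01):
    session = Session(timeout_seconds=timeout)
    assert not session.expired()   # a fresh session is still live
    time.sleep(timeout * 2)        # wait past the time-out
    assert session.expired()       # the session has now lapsed

test_session_expiry()                # fast, debug-style run
# test_session_expiry(timeout=3600)  # slow, production-realistic run
```

Whether this counts as testing "the same functionality" depends, as the group said, on the intent of the test: the short run checks the expiry logic, but only the long run exercises realistic timer behaviour.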

Compare Testing

If you believe that testing is inherently about information then you might enjoy Edward Tufte's take on that term: information consists of differences that make a difference. We identify differences by comparison, something that, as a working tester, you'll be familiar with. I bet you ask a classic testing question of someone, including yourself, on a regular basis:

Our competitor's software is fast. Fast... compared to what?
We must export to a good range of image formats. Good... compared to what?
The layout must be clean. Clean... compared to what?

But while comparison as a tool to get clarification through conversation is important, for me it feels like testing is more fundamentally about comparisons. James Bach has said "all tests must include an oracle of some kind or else you would call it just a tour rather than a test." An oracle is a tool that can help to determine whether something is a problem. And how is the value extracted from an orac
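The oracle-as-comparison idea above can be sketched in code. This is a minimal illustration under invented names, not a claim about any particular tool: here the oracle is an independent, trusted implementation, and the test extracts value from it by comparing its answer against the output of the code under test.

```python
# A sketch of an oracle as a comparison; all names are invented
# for illustration.

def trusted_sort(items):
    """Oracle: an independent implementation we trust for comparison."""
    return sorted(items)

def sort_under_test(items):
    """The behaviour being tested (a stand-in; imagine a hand-rolled
    quicksort here instead of delegating to the built-in)."""
    return sorted(items)

def check(items):
    """A test as a comparison: the actual output versus the oracle's
    answer. A difference is information; it may point at a problem."""
    actual = sort_under_test(items)
    expected = trusted_sort(items)
    return actual == expected

assert check([3, 1, 2])
assert check([])
```

Without the comparison, running sort_under_test and eyeballing the result would be closer to Bach's "tour": the oracle is what turns an observation into a test.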