

Showing posts from December, 2015

State Your Business

In their write-up of the State of Testing 2015 survey, the organisers say: "we can say with confidence that demand for the 'Thinking Tester' is on the rise, as it appears that today's Industry needs people who are more than just 'a tester'." I don't know whether two data points (2013 and 2015) are really enough to give (statistical) confidence in such a rise, but it certainly reflects my own intentions and desires for the team I run. With that in mind, although the annual snapshots can be interesting, the value of this kind of enterprise is often in the visibility of changes over time, and I hope that with another year of data we'll begin to find evidence for some interesting trends. In general, analyses of this sort become more reliable with larger numbers of participants, so why not help us all to help ourselves and get over to the State of Testing survey for 2016, which launches at the beginning of January. Image: https://flic.kr/p/4N8WJP

My Two Cents

This is the 200th post on Hiccupps. At a milestone like this it's common to pause and reflect, and I've done so a couple of times to date. If you are of an historical bent you might try the lengthy hundredth, or, if introspective is your thing, then number 150 is perhaps more up your street. But this one, this double centenary, this one is short and sweet and about ideas. My blog, I see more and more, is a repository of ideas I've had, and sometimes of aspects of those ideas: meta ideas such as the paths to those ideas, connections between ideas, and the way that ideas breed ideas. I try hard not to knowingly regurgitate other people's ideas unless I am commenting on them, or questioning them, or testing my own against them. Which doesn't mean that I don't value them. Quite the opposite, in fact: I am a fan of ideas, for and by us all; not least because with no ideas there are no good ideas. Image: Old Book Illustrations

You Meta Watch Out

You are presented with a problem. You are a problem-solver, so you suggest solutions and eventually find one that satisfies the problem-poser. Along the way you find out a lot of implicit things that might have been useful to know earlier. But well done anyway. Another satisfied customer! You are presented with a problem. You are a problem-solver, but you know that diving into the detail of potential solutions is only one way to skin a cat (although the problem is rarely about feline furectomies in my experience). So you think about asking questions that help you to understand the problem. You might ask questions that help to constrain your search for solutions to the problem. You might ask questions that help to understand the history of the problem, the needs and intent of the problem-poser, the permitted ways in which a solution can be found, the scope of the solution, the time-frame for the solution, the priority of the solution, the necessity of the solution. You learn about the …

Cambridge Lean Coffee

We hosted this month's Lean Coffee at Linguamatics. Here are some brief notes on the topics covered by the group I was in.

Have You Ever Said "No" to a Testing Task?
- are we talking about can't or won't? ... or shouldn't (e.g. on moral or ethical grounds)?
- does the time in a project's cycle make a difference to the acceptability of rejection?
- does the responsibility you take on when offering an alternative (and things later go wrong) prevent people from rejecting?
- we've questioned the need
- we've questioned the value that could be returned given the effort to be expended
- we've questioned the resource available for testing
- we've asked whether it could be done by someone else
- we've asked whether it has been done by someone else
- we've tried to understand the intent, in order to counter-propose

Exploratory vs Fixed Testing
- the question concerned the automation of exploratory testing ... where the assumed definition …

Why Isn't Testing Easier?

We've been building software for decades. And testing it for pretty much the same length of time. Over the years we've built up expertise and methods and approaches and heuristics and skills and theories and practices and communities and schools and courses and accreditations and conferences and blogs and, well, all the kinds of things you associate with a profession. So why isn't testing easier? Tongue somewhat in cheek, I've asked this question of other testers on several occasions recently, including at both the EuroSTAR 2015 and Cambridge Lean Coffees. Here's a selection of answers I received:

- Testing is easy. You just find out what the system is supposed to do and see whether it does it (sketched below). (I loved Rikard Edgren's take on this.)
- The environment in which we test is constantly changing; perhaps at a rate faster than many testers are prepared to change.
- Testing is always bespoke to the particular software in its context.

Interestingly, although …
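As a minimal sketch of that first, tongue-in-cheek answer, here is what "find out what it's supposed to do and see whether it does it" might look like in shell. The sum program, its usage and the expected value are all assumptions made purely for illustration:

    #!/bin/bash
    # Naive checking: ask a hypothetical program (sum, assumed to add its
    # two arguments) for an answer and compare it with the answer we were
    # told to expect. Finding out what "supposed to do" actually means is,
    # of course, the hard part.
    expected=5
    actual=`./sum 2 3`
    if [ "$actual" -eq "$expected" ]; then
      echo "OK"
    else
      echo "NO: expected $expected, got $actual"
    fi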

Reserved for the Testers

So it was a bit rude but I couldn't help myself when, at last night's Cambridge Tester Meetup, I laughed out loud during a particularly harrowing work war story. The reason? I'd just imagined the pub landlord as some kind of amateur Myers-Briggs enthusiast putting the sign on our table earlier in the evening: "Testers? Hmmmm."

More TCs, Vicar?

Several times in recent months I've found myself being asked how many test cases there are in automated suites at Linguamatics. On each occasion I have had to admit that I don't know and, in fact, I'm not particularly motivated to calculate that particular metric. Depending on how the conversations went, I've asked in return what my questioner thinks we can understand about the quality, coverage, relevance, value and so on of testing by knowing the number of cases. (And let's leave vocabulary aside for now.) Are you a counter? Try this scenario: you wrote sum, a program that adds two numbers, and have asked me to test it...

Sure! I can test that for you. To do it, I'll write a test case generator. Here it is:

    #!/bin/bash
    echo "#!/bin/bash"
    for i in `seq 1 $1`
    do
      s=$(($i + $i))
      echo "r=\`./sum $i $i\`; if [ \$r -eq $s ]; then echo \"OK\"; else echo \"NO\"; fi"
    done

A s…
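To labour the point with a hypothetical usage sketch (the file names gen.sh and test_sum.sh are mine, not from the original post): the number of test cases in the generated suite is simply whatever argument you pass, which is one reason the count alone tells us so little about quality, coverage or value.

    # Save the generator above as gen.sh; sum must be runnable as ./sum
    # when the generated suite is executed.
    chmod +x gen.sh
    ./gen.sh 10 > test_sum.sh        # a suite of 10 test cases
    ./gen.sh 1000000 > test_sum.sh   # or, just as easily, a million
    chmod +x test_sum.sh
    ./test_sum.sh                    # prints OK or NO for each generated case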