Wednesday, December 21, 2011

A Gradual Decline into Disorder



I like to listen to podcasts on my walk to work and I try to interleave testing stuff with general science and technology. The other day a chap from Cambridge University was talking about entropy and, more particularly, the idea that the natural state of things is to drift towards disorder.

Entropy: "Historically, the concept of entropy evolved in order to explain why some processes (permitted by conservation laws) occur spontaneously while their time reversals (also permitted by conservation laws) do not; systems tend to progress in the direction of increasing entropy." 

In an ironic reversal of its naturally confused state, my brain spontaneously organised a couple of previously distinct notions (entropy in the natural world and the state of a code base) and started churning out ideas:
  • Is the development of a piece of code over time an analogue of entropy in the universe? Could we say that as more commits are made, the codebase becomes more fragmented and any original order becomes slowly obscured?
  • Is it possible that a metric based on entropy could be derived to measure the "organisedness" of a codebase, something like a cyclomatic complexity analysis? And, if so, could this metric predict software quality? (There's a rough sketch of one possibility just after this list.)
  • Could there be strategies, perhaps again analogues of those in the natural world, for holding back the increasing entropy in the software?
  • Could testing, with an aim of creating order at the external interface to an application, actually make the internal entropy worse by forcing more edits?
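
On the second question, here's a minimal sketch of what such a metric might look like. It treats the share of recent commits touching each file as a probability distribution and computes its Shannon entropy. To be clear, the function name, file names, and commit data below are all invented for illustration; a real tool would mine them from version control.

    import math
    from collections import Counter

    def change_entropy(file_touches):
        """Shannon entropy (in bits) of how commits spread across files.

        file_touches: one file path per (commit, file) pair. Low entropy
        means change is concentrated in a few files; high entropy means
        change is smeared across the codebase.
        """
        counts = Counter(file_touches)
        total = sum(counts.values())
        return -sum((n / total) * math.log2(n / total)
                    for n in counts.values())

    # Hypothetical recent history: which files the last 8 commits touched.
    touches = ["parser.py", "parser.py", "parser.py", "utils.py",
               "parser.py", "api.py", "utils.py", "parser.py"]
    print(change_entropy(touches))  # ~1.30 bits; log2(3) ~ 1.58 is the max here

Whether a number like this actually predicts quality is exactly the open question, but it would at least rise as edits scatter across the codebase and fall as they concentrate.
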
I strolled into the office feeling quite chuffed, made a cup of tea, turned my computer on, noted the 500 emails, ignored them, searched for "entropy testing" and found that, as usual, the good ship Original Thought had long since sailed. My day began its gradual decline into disorder.

Credit where it's due:
Image: graur razvan ionut / FreeDigitalPhotos.net

1 comment:

  1. Entropy is a concept in Physics that describes how energy disperses as it is transferred. But, in typical fashion, the Physicists used a strange, counter-intuitive, and confusing rubric: that the propagation of energy results in more or less "disorder", i.e., entropy. Stranger still, more disorder is equated with more "information". This, by the way, is central to the Stephen Hawking black hole controversy.

    In the 1940s, Physics was highly influential. Claude Shannon (a telecom engineer) discovered, in the great tradition of engineering math hacks, that certain aspects of entropy mathematics were very useful for analyzing communication channels. Thus the strange idea that energy propagation is characterized as increasing or decreasing the relative disorder ("information") of physical entities turned out to be very useful for the analysis of certain aspects of communication channels and their messages.

    Sixty years later, we have Information Theory, a broad collection of quantitative models derived from Shannon's work and applied to a very wide range of phenomena (biologists, for example, use it a lot).

    Since first encountering it, I've felt the terms "information theory" and "relative disorder" (entropy) are poorly chosen and confusing (again, thank the Physicists for that.) The math suffers from no such problem and is quite interesting and useful.

    Information theory has a great intrinsic appeal and some software researchers have attempted to find uses for it. So far, I don't think much can be said for this. I've developed an application of it to support the stop-testing decision. In the grand tradition of information theory obfuscation, I call my use "relative proximity." The deck is at
    http://www.robertvbinder.com/docs/talks/TestersDashboard.pdf

    The accompanying paper is
    http://robertvbinder.com//docs/arts/Testers-Dashboard-Final.pdf

    Just my 2 bits.

    Bob Binder
