I've just finished reading Thinking in Systems: A Primer by Donella Meadows. It's not a new book but I'd managed to be unaware of it until recently when Marianne Bellotti mentioned it on her podcast, Marianne Writes a Programming Language.
Bellotti is writing a programming language (duh!) for modelling system behaviour using the concepts of stocks and flows, inspired by Meadows' book. The image at the top gives an idea of how such models can work, notably making explicit the feedback relationships for resources in the system, and allowing modellers to reason about the system state under different conditions over time.
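To make the idea concrete, here is a minimal sketch of a stock-and-flow model in Python. This is my own illustration, not Bellotti's language or anything from Meadows' book: a single stock (a "bathtub"), a constant inflow, and an outflow proportional to the stock level, which acts as a balancing feedback loop. All names and numbers are invented for the example.

```python
# A minimal "bathtub" stock-and-flow model: one stock, a constant
# inflow, and an outflow proportional to the current stock level.
# Illustrative only; names and numbers are made up.

def simulate(stock, inflow, outflow_rate, steps):
    """Step the model forward, returning the stock level at each step."""
    history = [stock]
    for _ in range(steps):
        # Balancing feedback: the fuller the tub, the faster it drains.
        outflow = outflow_rate * stock
        stock = stock + inflow - outflow
        history.append(stock)
    return history

levels = simulate(stock=100.0, inflow=10.0, outflow_rate=0.2, steps=50)
# The stock settles toward the equilibrium inflow / outflow_rate = 50.
print(round(levels[-1], 2))
```

Even a toy like this lets you ask the questions Meadows cares about: where is the equilibrium, how quickly does the system approach it, and what happens if a flow or feedback strength changes?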
I have been aware of systems models similar to this since I first saw Diagrams of Effects in Quality Software Management Volume 1: Systems Thinking by Jerry Weinberg. Weinberg's book, and other reading I was doing early in my career as a tester, inspired me to look deeply at the systemic context in which the software I'm working on sits and to be explicit about the models of it that I'm creating.
Architecture diagrams are one common, and often useful, way to view a system graphically but they tend to miss out a couple of important factors: external, particularly non-technical, influences and system dynamics. Diagrams of Effects can remedy that, although I have always found them tricky to create, wondering both where to begin and where to stop.
When I heard Bellotti talk about systems models with two core types, stocks and flows, I was intrigued. Could this be a way for me to simplify the creation of more formal systems models? The answer turns out to be both yes and no. The conceptualisation is simple, for sure, but the struggle to construct a useful formal model after reading Thinking in Systems is still real.
I shouldn't be surprised: part of the challenge of making a model, however formal or informal, is deciding what to include in it, and at what granularity. A significant part of that is down to your intended use of the model and the kind of insights you hope to gain by making it. Somewhat meta, this places the model itself as an entity in the system that includes you, the creator, the audience of the model, the constraints on your use of it, and so on.
I'm not going to review the content of Thinking in Systems here. If what I've said above sounds interesting, this handful of related links gives some background:
- A lengthy review of the book by Susan Stepney
- A 1999 lecture on Sustainable Systems by Donella Meadows
- Stocks, Flows, and Feedback Loops, a blog post by Andrew Hening
- Places to intervene in a system, an essay by Donella Meadows
- How to Create a Diagram of Effects by Rachel Davies
- How Complex Systems Fail by Richard I. Cook
What I will do here is pull out a few quotes from the book that spoke to me about systems and models, for testing and for work more generally:
Be a quality detector. Be a walking, noisy Geiger counter that registers the presence or absence of quality. (p. 176)
A system is an interconnected set of elements that is coherently organized in a way that achieves something. If you look at that definition closely for a minute, you can see that a system must consist of three kinds of things: elements, interconnections, and a function or purpose. (p. 11)
A system’s function or purpose is not necessarily spoken, written, or expressed explicitly, except through the operation of the system. (p. 14)
An important function of almost every system is to ensure its own perpetuation. (p. 15)
Because resilience may not be obvious without a whole-system view, people often sacrifice resilience for stability, or for productivity, or for some other more immediately recognizable system property. (p. 77)
Large organizations of all kinds, from corporations to governments, lose their resilience simply because the feedback mechanisms by which they sense and respond to their environment have to travel through too many layers of delay and distortion. (p. 78)
System structure is the source of system behavior. System behavior reveals itself as a series of events over time. (p. 89)
Nonlinearities are important not only because they confound our expectations about the relationship between action and response. They are even more important because they change the relative strengths of feedback loops. They can flip a system from one mode of behavior to another. (p. 92)
There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion—the questions we want to ask. (p. 97)
At any given time, the input that is most important to a system is the one that is most limiting. (p. 101)
Insight comes not only from recognizing which factor is limiting, but from seeing that growth itself depletes or enhances limits and therefore changes what is limiting. (p. 102)
Policy resistance comes from the bounded rationalities of the actors in a system, each with his or her (or “its” in the case of an institution) own goals. Each actor monitors the state of the system with regard to some important variable—income or prices or housing or drugs or investment—and compares that state with his, her, or its goal. If there is a discrepancy, each actor does something to correct the situation. Usually the greater the discrepancy between the goal and the actual situation, the more emphatic the action will be. (p. 113)
The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality. (p. 115)
Rule beating is usually a response of the lower levels in a hierarchy to overrigid, deleterious, unworkable, or ill-defined rules from above. (p. 137)
Systems, like the three wishes in the traditional fairy tale, have a terrible tendency to produce exactly and only what you ask them to produce. Be careful what you ask them to produce. (p. 138)
[confusing effort with result is one] of the most common mistakes in designing systems around the wrong goal. (p. 139)
Listen to any discussion, in your family or a committee meeting at work or among the pundits in the media, and watch people leap to solutions, usually solutions in “predict, control, or impose your will” mode, without having paid any attention to what the system is doing and why it’s doing it. (p. 171)
Images: Mental Pivot, Weinberg