The following introduction to the essentials of systems thinking is an excerpt from Donella Meadows’s book Thinking in Systems. It was originally published in issue 74 of Timeline magazine, March 2004.
People who are raised in the industrial world and who get enthused about systems thinking are likely to make a terrible mistake. They are likely to assume that here, in systems analysis, in interconnection and complication, in the power of the computer, here at last, is the key to prediction and control. This mistake is likely because the mindset of the industrial world assumes that there is a key to prediction and control.
I assumed that at first, too. We all assumed it, as eager systems students at the great institution called MIT. More or less innocently, enchanted by what we could see through our new lens, we did what many discoverers do. We exaggerated our own ability to change the world. We did so not with any intent to deceive others, but in the expression of our own expectations and hopes. Systems thinking for us was more than subtle, complicated mindplay. It was going to Make Systems Work.
“But self-organizing, nonlinear feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way. The goal of foreseeing the future exactly and preparing for it perfectly is unrealizable. The idea of making a complex system do just what you want it to do can be achieved only temporarily, at best. We can never fully understand our world, not in the way our reductionistic science has led us to expect. Our science itself, from quantum theory to the mathematics of chaos, leads us into irreducible uncertainty. For any objective other than the most trivial, we can’t optimize; we don’t even know what to optimize. We can’t keep track of everything. We can’t find a proper, sustainable relationship to nature, each other, or the institutions we create, if we try to do it from the role of omniscient conqueror.
“For those who stake their identity on the role of omniscient conqueror, the uncertainty exposed by systems thinking is hard to take. If you can’t understand, predict, and control, what is there to do?
“Systems thinking leads to another conclusion, however–waiting, shining, obvious as soon as we stop being blinded by the illusion of control. It says that there is plenty to do, of a different sort of ‘doing.’ The future can’t be predicted, but it can be envisioned and brought lovingly into being. Systems can’t be controlled, but they can be designed and redesigned. We can’t surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them. We can’t impose our will upon a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.
“We can’t control systems or figure them out. But we can dance with them!”
Dana learned about dancing with great powers “from whitewater kayaking, from gardening, from playing music, from skiing. All those endeavors require one to stay wide awake, pay close attention, participate flat out, and respond to feedback. It had never occurred to me that those same requirements might apply to intellectual work, to management, to government, to getting along with people.
“But there it was, the message emerging from every computer model we made. Living successfully in a world of systems requires more of us than an ability to calculate. It requires our full humanity: our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality.”
Dana then presents her “systems wisdoms,” which she says also apply to all of life.
1. GET THE BEAT.
“Before you disturb the system in any way, watch how it behaves. If it’s a piece of music or a whitewater rapid or a fluctuation in a commodity price, study its beat. If it’s a social system, watch it work. Learn its history.” Keep good records, Dana advises, because you can’t always rely on memory. “Focus on facts, not theories. It keeps you from falling too quickly into your own beliefs or misconceptions, or those of others….I have been told with great authority that the milk price was going up when it was going down, that real interest rates were falling when they were rising.
“Starting with the behavior of the system directs one’s thoughts to dynamic, not static, analysis–not only to ‘what’s wrong?’ but also to ‘how did we get there?’ and ‘what behavior modes are possible?’ and ‘if we don’t change direction, where are we going to end up?’
“And finally, starting with history discourages the common and distracting tendency we all have to define a problem not by the system’s actual behavior, but by the lack of our favorite solution.”
2. LISTEN TO THE WISDOM OF THE SYSTEM.
“Aid and encourage the forces and structures that help the system run itself. Don’t be an unthinking intervener and destroy the system’s own self-maintenance capacities. Before you charge in to make things better, pay attention to the value of what’s already there.”
3. EXPOSE YOUR MENTAL MODELS TO THE OPEN AIR.
“Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be shot at. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible. Consider all of them plausible until you find some evidence that causes you to rule one out. That way you will be emotionally able to see the evidence that rules out an assumption with which you might have confused your own identity.”
4. STAY HUMBLE. STAY A LEARNER.
“Systems thinking has taught me to trust my intuition more and my figuring-out rationality less, to lean on both as much as I can, but still to be prepared for surprises. Working with systems, on the computer, in nature, among people, in organizations, constantly reminds me of how incomplete my mental models are, how complex the world is, and how much I don’t know.
“The thing to do, when you don’t know, is not to bluff and not to freeze, but to learn. The way you learn is by experiment or, as Buckminster Fuller put it, by trial and error, error, error. In a world of complex systems it is not appropriate to charge forward with rigid, undeviating directives. ‘Stay the course’ is only a good idea if you’re sure you’re on course. Pretending you’re in control even when you aren’t is a recipe not only for mistakes, but for not learning from mistakes.”
5. HONOR AND PROTECT INFORMATION.
“A decision-maker can’t respond to information he or she doesn’t have, can’t respond accurately to information that is inaccurate, can’t respond in a timely way to information that is late. I would guess that 99 percent of what goes wrong in systems goes wrong because of faulty or missing information.”
6. LOCATE RESPONSIBILITY IN THE SYSTEM.
“Look for the ways the system creates its own behavior. Do pay attention to the triggering events, the outside influences that bring forth one kind of behavior from the system rather than another. Sometimes those outside influences can be controlled (as in reducing the pathogens in drinking water to keep down incidences of infectious disease). But sometimes they can’t. And sometimes blaming or trying to control the outside influence blinds one to the easier task of increasing responsibility within the system. ‘Intrinsic responsibility’ means that the system is designed to send feedback about the consequences of decision making directly, quickly, and compellingly to the decision-makers.
“Dartmouth College reduced intrinsic responsibility when it took thermostats out of individual offices and classrooms and put temperature-control decisions under the guidance of a central computer. That was done as an energy-saving measure. My observation from a low level in the hierarchy is that the main consequence was greater oscillations in room temperature. When my office gets overheated now, instead of turning down the thermostat, I have to call an office across campus, which gets around to making corrections over a period of hours or days, and which often over-corrects, setting up the need for another phone call. One way of making that system more, rather than less, responsible might have been to let professors keep control of their own thermostats and charge them directly for the amount of energy they use. (Thereby privatizing a commons!)”
7. MAKE FEEDBACK POLICIES FOR FEEDBACK SYSTEMS.
“President Jimmy Carter had an unusual ability to think in feedback terms and to make feedback policies. Unfortunately he had a hard time explaining them to a press and public that didn’t understand feedback.
“He suggested, at a time when oil imports were soaring, that there be a tax on gasoline proportional to the fraction of U.S. oil consumption that had to be imported. If imports continued to rise, the tax would rise until it suppressed demand and brought forth substitutes and reduced imports. If imports fell to zero, the tax would fall to zero.
“The tax never got passed.
“Carter was also trying to deal with a flood of illegal immigrants from Mexico. He suggested that nothing could be done about that immigration as long as there was a great gap in opportunity and living standards between the U.S. and Mexico. Rather than spending money on border guards and barriers, he said, we should spend money helping to build the Mexican economy, and we should continue to do so until the immigration stopped.
“That never happened either.
“You can imagine why a dynamic, self-adjusting system cannot be governed by a static, unbending policy. It’s easier, more effective, and usually much cheaper to design policies that change depending on the state of the system. Especially where there are great uncertainties, the best policies not only contain feedback loops, but metafeedback loops–loops that alter, correct, and expand loops. These are policies that design learning into the management process.”
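To make the contrast between a static rule and a feedback rule concrete, here is a toy sketch. It is not from Meadows’s text: the numbers and the simple linear demand response are invented purely for illustration. It compares a fixed gasoline tax with a tax proportional to the imported fraction of consumption, in the spirit of the Carter proposal described above.

```python
# Toy comparison of a static policy and a feedback policy.
# All parameters below are invented for illustration only.

def static_tax(_import_fraction):
    """A static policy: the tax never responds to the state of the system."""
    return 0.10  # fixed 10% tax, no matter what imports do


def feedback_tax(import_fraction, k=1.0):
    """A feedback policy: the tax is proportional to the imported fraction,
    so it rises as imports rise and falls to zero if imports fall to zero."""
    return k * import_fraction


def simulate(tax_policy, steps=10):
    """Crude loop: a higher tax suppresses demand, which lowers the imported fraction."""
    import_fraction = 0.40  # assumed starting share of consumption that is imported
    history = []
    for _ in range(steps):
        tax = tax_policy(import_fraction)
        # Invented response: imports drift up 2% per step but fall with the tax.
        import_fraction = max(0.0, import_fraction + 0.02 - 0.1 * tax)
        history.append(round(import_fraction, 3))
    return history


if __name__ == "__main__":
    print("static tax  :", simulate(static_tax))
    print("feedback tax:", simulate(feedback_tax))
```

Under these made-up assumptions, the fixed tax lets the imported fraction drift steadily upward, while the proportional tax strengthens as imports rise and settles them toward a lower level, which is the self-adjusting behavior the passage describes.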
8. PAY ATTENTION TO WHAT IS IMPORTANT, NOT JUST WHAT IS QUANTIFIABLE.
“Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can’t measure. You can look around and make up your own mind about whether quantity or quality is the outstanding characteristic of the world in which you live….Don’t be stopped by the ‘if you can’t define it and measure it, I don’t have to pay attention to it’ ploy.”
9. GO FOR THE GOOD OF THE WHOLE.
“Don’t maximize parts of systems or subsystems while ignoring the whole. As Kenneth Boulding once said, don’t go to great trouble to optimize something that never should be done at all. Aim to enhance total systems properties, such as creativity, stability, diversity, resilience, and sustainability–whether they are easily measured or not.
“As you think about a system, spend part of your time from a vantage point that lets you see the whole system, not just the problem that may have drawn you to focus on the system to begin with. And realize that, especially in the short term, changes for the good of the whole may sometimes seem to be counter to the interests of a part of the system. It helps to remember that the parts of a system cannot survive without the whole. The long-term interests of your liver require the long-term health of your body, and the long-term interests of sawmills require the long-term health of forests.”
10. EXPAND TIME HORIZONS.
“The official time horizon of industrial society doesn’t extend beyond what will happen after the next election or beyond the payback period of current investments. The time horizon of most families still extends farther than that–through the lifetimes of children or grandchildren. Many Native American cultures actively spoke of and considered in their decisions the effects upon the seventh generation to come. The longer the operant time horizon, the better the chances for survival….We experience now the consequences of actions set in motion yesterday and decades ago and centuries ago.”
11. EXPAND THOUGHT HORIZONS.
“Defy the disciplines. In spite of what you majored in, or what the textbooks say, or what you think you’re an expert at, follow a system wherever it leads. It will be sure to lead across traditional disciplinary lines. To understand that system, you will have to be able to learn from–while not being limited by–economists and chemists and psychologists and theologians. You will have to penetrate their jargons, integrate what they tell you, recognize what they can honestly see through their particular lenses, and discard the distortions that come from the narrowness and incompleteness of their lenses. They won’t make it easy for you.”
12. EXPAND THE BOUNDARY OF CARING.
“Living successfully in a world of complex systems means expanding not only time horizons and thought horizons; above all it means expanding the horizons of caring. There are moral reasons for doing that, of course. And if moral arguments are not sufficient, systems thinking provides the practical reasons to back up the moral ones. The real system is interconnected. No part of the human race is separate either from other human beings or from the global ecosystem. It will not be possible in this integrated world for your heart to succeed if your lungs fail, or for your company to succeed if your workers fail, or for the rich in Los Angeles to succeed if the poor in Los Angeles fail, or for Europe to succeed if Africa fails, or for the global economy to succeed if the global environment fails.
“As with everything else about systems, most people already know the interconnections that make moral and practical rules turn out to be the same rules. They just have to bring themselves to believe what they know.”
13. CELEBRATE COMPLEXITY.
“Let’s face it, the universe is messy. It is nonlinear, turbulent, and chaotic. It is dynamic. It spends its time in transient behavior on its way to somewhere else, not in mathematically neat equilibria. It self-organizes and evolves. It creates diversity, not uniformity. That’s what makes the world interesting, that’s what makes it beautiful, and that’s what makes it work.
“There’s something within the human mind that is attracted to straight lines and not curves, to whole numbers and not fractions, to uniformity and not diversity, and to certainties and not mystery.” One part of us, Meadows says, “designs buildings as boxes with uncompromising straight lines and flat surfaces. Another part of us recognizes instinctively that nature designs in fractals, with intriguing detail on every scale from the microscopic to the macroscopic. That part of us makes Gothic cathedrals and Persian carpets, symphonies and novels, Mardi Gras costumes and artificial intelligence programs, all with embellishments almost as complex as the ones we find in the world around us.”
14. HOLD FAST TO THE GOAL OF GOODNESS.
“Examples of bad human behavior are held up, magnified by the media, affirmed by the culture, as typical. Just what you would expect. After all, we’re only human. The far more numerous examples of human goodness are barely noticed. They are Not News. They are exceptions. Must have been a saint. Can’t expect everyone to behave like that.
“And so expectations are lowered. The gap between desired behavior and actual behavior narrows. Fewer actions are taken to affirm and instill ideals. The public discourse is full of cynicism. Public leaders are visibly, unrepentantly, amoral or immoral and are not held to account. Idealism is ridiculed. Statements of moral belief are suspect. It is much easier to talk about hate in public than to talk about love.
“We know what to do about eroding goals. Don’t weigh the bad news more heavily than the good. And keep standards absolute.
“This is quite a list. Systems thinking can only tell us to do these things. It can’t do them for us.
“And so we are brought to the gap between understanding and implementation. Systems thinking by itself cannot bridge that gap. But it can lead us to the edge of what analysis can do and then point beyond–to what can and must be done by the human spirit.”