Thinking in Systems – by Donella H. Meadows
Date read: 7/9/19. Recommendation: 8/10.
Great introduction to systems thinking – the ability to step back and appreciate the complexity of the interconnected whole. Meadows emphasizes the dangers of generalizing about complex systems and explains the key elements of resilient systems. These include feedback loops, self-organization, experimentation, and alignment. She also digs into concepts like the tragedy of the commons, bounded rationality, modeling, and how to avoid the pitfalls of each. The benefit of systems thinking is that it helps you avoid isolated, shallow decision-making. With this comes the ability to appreciate the complexity of large systems, their connections, and how to improve or redesign them when needed. This is an important book for anyone who’s working on complex problems or wants to grow into a more strategic thinker.
See my notes below or Amazon for details and reviews.
The system lens:
Helps us reclaim intuition about whole systems, hone our ability to understand parts, see interconnections, ask “what if” questions about future behavior, and be creative in redesigns.
Ancient Sufi story about a king visiting a city of blind citizens on his mighty elephant. Each citizen touched a small part of the elephant (ear, trunk, legs) and drew false conclusions. Need a better understanding of the whole, not just the elements it’s made of.
Questions for testing the value of a model:
Are driving factors likely to unfold this way?
If they did, would the system react this way?
What is driving the driving factors?
Systems studies are not designed to predict, they’re designed to explore what would happen if factors unfold in a range of different scenarios.
Dangerous to generalize about complex systems.
Rich structure of many feedback loops allows a system to thrive in a variable environment. Similar to Taleb’s concept of anti-fragility.
Resilience is similar to a plateau that a system can play safely upon. The more resilient a system, the larger the plateau and the greater its ability to bounce back when near the edges. Less resilient, smaller plateau.
Awareness of resilience allows you to harness, preserve, or improve a system’s restorative powers.
Self-organization is the strongest form of system resilience. A system that can evolve can survive almost any change. That’s why biodiversity is so important.
“Insistence on a single culture shuts down learning and cuts back resilience. Any system, biological, economic, or social, that gets so encrusted that it cannot self-evolve, a system that systematically scorns experimentation and wipes out the raw material of innovation, is doomed over the long term on the highly variable planet.” DM
Experimentation is key to anti-fragility and innovation. But it’s difficult because this means giving up control.
Related to antifragility: “In the end, it seems that mastery has less to do with pushing leverage points than it does with strategically, profoundly, madly, letting go and dancing with the system.” DM
DM: “Complex systems can evolve from simple systems only if there are stable intermediate forms.” This is why hierarchies are so common in nature.
Hierarchies are system inventions – they provide stability and resilience, and reduce the amount of information a system needs to keep track of. Too much central control overwhelms and breaks a system, leaving it unable to achieve more complex tasks.
“Our knowledge of the world instructs us first of all that the world is greater than our knowledge of it.” Wendell Berry
Everything we know about the world is a model – languages, maps, statistics, mental models. These usually correspond well with the world (hence our success as a species), but will never fully represent the world with 100% accuracy. If they did, we would never make mistakes or be surprised.
Mental flexibility = willingness to redraw boundaries.
Alignment + policy resistance:
Bounded rationality: People make reasonable decisions based on the information they have about the parts of the system they’re closest to. But they don’t have perfect information or the ability to see more distant parts of the system. This is why narrow-minded behavior arises.
Policy resistance occurs when goals of subsystems are misaligned. Need an overarching goal to tie things together. Feedback loops should serve the same goal. Much of that is identifying what problem you’re trying to solve.
In 1967, the Romanian government decided it needed more people, so it made abortion illegal. In the short term the birth rate tripled; then resistance set in. People pursued dangerous illegal abortions, which tripled maternal mortality.
Hungary, at the same time, was also worried about low birth rate. Discovered it was partially due to cramped housing so they incentivized larger families with more living space. Only partially successful because it was only part of the problem, but not a disaster like Romania.
Sweden was most successful because it recognized that the shared goal of the population and the government was not family size but the quality of child care. The birth rate has gone up and down since then without causing panic, because Sweden focused on long-term welfare and a more robust goal, not a narrow, short-sighted one.
Silver Rule Example from Garrett Hardin (see Nassim Taleb, Skin in the Game): people who want to prevent other people from having an abortion aren’t practicing intrinsic responsibility unless they’re personally willing to raise the resulting child.
“If you want to understand the deepest malfunctions of systems, pay attention to the rules and who has power over them.” DM
The tragedy of the commons:
Results from simple growth in a system where a resource is not only limited, but erodible when overused. Selfish behavior is more convenient and profitable than responsibility to the whole community and a shared future.
E.g., uncontrolled access to a national park (over-tourism) brings in crowds that destroy the park’s natural beauty.
Three ways to avoid the tragedy of the commons:
Educate and exhort (moral pressure)
Privatize the commons (creates a direct feedback loop)
Regulate the commons (mutual coercion agreements, i.e. traffic lights, parking spaces).
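The commons dynamic above can be sketched in code. This is a toy model of my own, not from the book: a shared pasture regenerates in proportion to what’s left of it (the “erodible” part), while each herder adds animals whenever grazing was profitable. The function name, parameters, and numbers are all illustrative assumptions; the third remedy (regulation via a per-herder cap) is modeled, while education and privatization are not.

```python
def simulate(years, cap_per_herder=None):
    """Toy erodible commons. cap_per_herder models regulation
    (mutual coercion); None means uncontrolled access."""
    grass = 100.0          # condition of the shared pasture (max 100)
    herders = 5
    herd = [2] * herders   # animals per herder
    for _ in range(years):
        total = sum(herd)
        grazed = min(grass, total * 2.0)   # each animal eats 2 units/year
        # Regeneration is proportional to the grass that remains:
        # an eroded commons regrows more slowly, and can collapse to zero.
        grass = min(100.0, (grass - grazed) * 1.25)
        # Selfish move: each herder adds an animal whenever everyone got
        # fully fed, unless a regulated cap stops them.
        for i in range(herders):
            fed = grazed >= total * 2.0
            under_cap = cap_per_herder is None or herd[i] < cap_per_herder
            if fed and under_cap:
                herd[i] += 1
    return grass, sum(herd)

open_grass, open_herd = simulate(30)                  # uncontrolled access
reg_grass, reg_herd = simulate(30, cap_per_herder=2)  # cap at sustainable yield
```

Under open access the herds grow each good year until grazing outruns regeneration and the pasture collapses to zero; with the cap held at the sustainable yield, the pasture stays intact indefinitely. The point of the sketch is the structure Meadows describes: the individual incentive (add an animal) is decoupled from the shared consequence (erosion) until regulation closes the feedback loop.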