The Undoing Project: A Friendship That Changed Our Minds – by Michael Lewis
Date Read: 11/26/17. Recommendation: 9/10.

A fascinating look into the unlikely relationship and original contributions of two Israeli psychologists, Daniel Kahneman and Amos Tversky. Together they examined how people make decisions and predictions, and uncovered the systematic biases and errors inherent in both. They found that the human mind replaces the laws of chance with rules of thumb–what they referred to as "heuristics," including availability, representativeness, anchoring, and simulation. Each heuristic reveals itself in the form of a cognitive bias (hindsight, recency, vividness, etc.). Kahneman and Tversky's papers have had widespread positive implications, helping to educate experts in various fields (economics, policy, medicine) about their own biases, and ultimately leading to the creation of "behavioral economics."

See my notes below or Amazon for details and reviews.


My Notes:

"Your mind needs to be in a constant state of defense against all this crap that is trying to mislead you." -Daryl Morey (GM, Houston Rockets)

It seemed to him that a big part of a consultant's job was to feign total certainty about uncertain things.

His main question to the NBA scouts he interviewed: "Who did you miss?" He wanted to find some degree of self-awareness.

But this raised a bigger question: Why had so much conventional wisdom been bullshit? And not just in sports but across the whole society. Why had so many industries been ripe for disruption? Why was there so much to be undone?

Danny Kahneman's defining emotion is doubt. An outsider who was displaced by WW2.
Amos Tversky was the quintessential Israeli and thrived in the role of a fearless hero. Self-assured.

The question the Israeli military had asked Kahneman–Which personalities are best suited to which military roles?–had turned out to make no sense. And so Danny had gone and answered a different, more fruitful question: How do we prevent the intuition of interviewers from screwing up their assessments of army recruits? Remove their gut feelings, and their judgments improved.

Walter Mischel's "marshmallow experiment" inspired Danny to study "the psychology of single questions."
-A child's (ages 3-5) ability to wait turned out to be correlated with his IQ, his family circumstances, and some other things. Tracking the children through life, researchers found that the better they were able to resist temptation, the higher their future SAT scores and sense of self-worth, and the lower their body fat and likelihood of suffering from some addiction.

"It's hard to know how people select a course in life. The big choices we make are practically random. The small choices probably tell us more about who we are. Which field we go into may depend on which high school teacher we happened to meet. Who we marry may depend on who happens to be around at the right time of life. On the other hand, the small decisions are very systematic. That I became a psychologist is probably not very revealing. What kind of psychologist I am may reflect deep traits." –Tversky

Economic theory, the design of markets, public policy making, and a lot more depended on theories about how people made decisions.

By changing the context in which two things are compared, you submerge certain features and force others to the surface. "The similarity of objects is modified by the manner in which they are classified." –Tversky

Danny thought, this is what happens when people become attached to a theory. They fit the evidence to the theory rather than the theory to the evidence. They cease to see what's right under their nose.

"Belief in the Law of Small Numbers"
Teased out the implications of a single mental error that people commonly made–even when those people were trained statisticians. People mistook even a very small part of a thing for the whole...They did this, Amos and Danny argued, because they believed–even if they did not acknowledge the belief–that any given sample of a large population was more representative of that population than it actually was.

If you flipped a coin a thousand times, you were more likely to end up with heads or tails roughly half the time than if you flipped it ten times. For some reason human beings did not see it that way. "People's intuitions about random sampling appear to satisfy the law of small numbers, which asserts that the law of large numbers applies to small numbers as well."
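A quick simulation (mine, not the book's; the sample sizes are arbitrary) makes the point concrete–small samples swing far more widely around the true 50%:

```python
import random

# Average distance between the observed share of heads and the true 50%,
# for small versus large samples of fair coin flips.
def avg_deviation(n_flips, n_trials=10_000):
    total = 0.0
    for _ in range(n_trials):
        heads = sum(random.random() < 0.5 for _ in range(n_flips))
        total += abs(heads / n_flips - 0.5)
    return total / n_trials

print(avg_deviation(10))    # ~0.12: ten flips often stray far from 50%
print(avg_deviation(1000))  # ~0.013: a thousand flips hug 50%
```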

The failure of human intuition had all sorts of implications for how people moved through the world, and rendered judgments and made decisions.

The entire project, in other words, was rooted in Danny's doubts about his own work, and his willingness, which was almost an eagerness, to find error in that work...It wasn't just a personal problem; it was a glitch in human nature.

Oregon researchers (with Lew Goldberg) asked doctors to judge the probability of cancer in 96 different individual stomach ulcers by looking at a stomach X-ray. Without telling the doctors, researchers mixed duplicates randomly into the pile so the doctors wouldn't notice they were being asked to diagnose the exact same ulcer they had already diagnosed...Doctors' diagnoses were all over the map: The experts didn't agree with each other. Even more surprisingly, they often rendered different diagnoses for the duplicates–the doctors couldn't even agree with themselves.

"Belief in the Law of Small Numbers" raised obvious next question: If people did not use statistical reasoning...what kind of reasoning did they use?

"Availability: A Heuristic for Judging Frequency and Probability"
The mind did not naturally calculate the correct odds...It replaced the laws of chance with rules of thumb. These rules of thumb Danny and Amos called "heuristics."

Four Heuristics: availability, representativeness, anchoring, simulation.

Representativeness: When people calculate the odds in any life situation, they are often making judgments about similarity.
-You have some notion of a parent population: "storm clouds" or "gastric ulcers" or "genocidal dictators" or "NBA players." You compare the specific case to the parent population.

The smaller the sample size, the more likely it is to be unrepresentative of the wider population.

Availability: Any fact or incident that was especially vivid, or recent, or common–or anything that happened to preoccupy a person–was likely to be recalled with special ease, and so be disproportionately weighted in any judgment.

"The use of the availability heuristic leads to systematic biases." Human judgment was distorted by...the memorable.

The bias was the footprint of the heuristic. The biases, too, would soon have their own names, like the "recency bias" and the "vividness bias."

"We often decide that an outcome is extremely unlikely or impossible, because we are unable to imagine any chain of events that could cause it to occur. The defect, often, is in our imagination."

The stories people told themselves, when the odds were either unknown or unknowable, were naturally too simple.

It's far easier for a Jew living in Paris in 1939 to construct a story about how the German army will behave much as it had in 1919, for instance, than to invent a story in which it behaves as it did in 1941, no matter how persuasive the evidence might be that, this time, things are different.

Simulation: Power of unrealized possibilities to contaminate people's minds. As they moved through the world, people ran simulations of the future.

Danny wanted to investigate how people created alternatives to reality by undoing reality. He wanted, in short, to discover the rules of imagination.

Imagination wasn't a flight with limitless destinations. It was a tool for making sense of a world of infinite possibilities by reducing them. The imagination obeyed rules: the rules of undoing...
-The more items there were to undo in order to create some alternative reality, the less likely the mind was to undo them.
-"An event becomes gradually less changeable as it recedes into the past." With the passage of time, the consequences of any event accumulated, and left more to undo.
-In undoing some event, the mind tended to remove whatever felt surprising or unexpected.

"On the Psychology of Prediction"
People predict by making up stories.
People predict very little and explain everything.
People live under uncertainty whether they like it or not.
People believe they can tell the future if they work hard enough.
People often work hard to obtain information they already have. And avoid new knowledge. –Tversky

Man's inability to see the power of regression to the mean leaves him blind to the nature of the world around him.
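A minimal sketch of regression to the mean (the population, tests, and numbers are invented for illustration): score everyone twice on skill plus luck, select the top tenth on the first test, and watch their average fall back toward the mean on the second.

```python
import random

random.seed(0)
N = 10_000
skill = [random.gauss(100, 10) for _ in range(N)]
test1 = [s + random.gauss(0, 10) for s in skill]   # skill + luck
test2 = [s + random.gauss(0, 10) for s in skill]   # same skill, fresh luck

top = sorted(range(N), key=lambda i: test1[i], reverse=True)[:N // 10]
print(sum(test1[i] for i in top) / len(top))  # ~125: skill plus good luck
print(sum(test2[i] for i in top) / len(top))  # ~112: the luck doesn't repeat
```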

When they wrote their first papers, Danny and Amos had no particular audience in mind...They sensed that they needed to find a broader audience. Began targeting high-level professional activities: economic planning, technological forecasting, political decision making, medical diagnosis, and the evaluation of legal evidence.

They hoped that the decisions made by experts in these fields could be "significantly improved by making these experts aware of their own biases, and by the development of methods to reduce and counteract the sources of bias in judgment."

That is, once people knew the outcome, they thought it had been far more predictable than they had found it to be beforehand, when they had tried to predict it–"hindsight bias."

In his talk to historians, Amos described their occupational hazard: the tendency to take whatever facts they had observed (neglecting the many facts that they did not or could not observe) and make them fit neatly into a confident-sounding story:
"All too often, we find ourselves unable to predict what will happen; yet after the fact we explain what did happen with a great deal of confidence. This "ability" to explain that which we cannot predict, even in the absence of any additional information, represents an important, though subtle, flaw in our reasoning. It leads us to believe that there is a less uncertain world than there actually is, and that we are less bright than we actually might be."

"Creeping determinism"–Sports announcers, political pundits, historians all radically revise their narratives so that their stories fit whatever just happened. Impose false order upon random events probably without even realizing what they were doing.

"He who sees the past as surprise-free is bound to have a future full of surprises."

Error wasn't necessarily shameful; it was merely human. "They provided a language and a logic for articulating some of the pitfalls people encounter when they think. Now these mistakes could be communicated. It was the recognition of human error. Not its denial." -Don Redelmeier

"So many problems occur when people fail to be obedient when they are supposed to be obedient, and fail to be creative when they are supposed to be creative." -Tversky

"The secret to doing good research is always to be a little underemployed. You waste years by not being able to waste hours." -Tversky

"Selective Matching"
–Amos had collected data on NBA shooting streaks to see if the so-called hot hand was statistically significant...but the streaks were illusions–false patterns that people mistakenly assigned meaning to (see the sketch after this list).
–"For arthritis, selective matching leads people to look for changes in the weather when they experience increased pain, and pay little attention to the weather when their pain is stable...[A] single day of severe pain and extreme weather might sustain a lifetime of belief in a relation between them."

Even though insurance is a stupid bet, you buy it because you place less value on the $1,000 you stand to win flipping a coin than you do on the $1,000 already in your bank account that you stand to lose.
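The arithmetic behind that intuition, as a sketch (the 2.25 loss weight is Kahneman and Tversky's later published estimate; the stakes are illustrative):

```python
# A fair coin flip for $1,000 has an expected value of zero, but if a
# dollar lost weighs about 2.25 times a dollar gained, it feels negative.
LOSS_WEIGHT = 2.25
felt_value = 0.5 * 1000 + 0.5 * (-1000 * LOSS_WEIGHT)
print(felt_value)  # -625: why a "fair" bet is declined and insurance sells
```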

By the summer of 1973, Amos was searching for ways to undo the reigning theory of decision making, just as he and Danny had undone the idea that human judgment followed the precepts of statistical theory.

"It is the anticipation of regret that affects decisions, along with the anticipation of other consequences." -Kahneman

People did not seek to avoid other emotions with the same energy that they sought to avoid regret. When they made decisions, people did not seek to maximize utility. They sought to minimize regret.

People regretted what they had done, and what they wished they hadn't done, far more than what they had not done and perhaps should have.

"The pain that is experienced when the loss is caused by an act that modified the status quo is significantly greater than the pain that is experienced when the decision led to the retention of the status quo. When one fails to take action that could have avoided a disaster, one does not accept responsibility for the occurrence of the disaster." -Kahneman

"Theory of Regret" (they soon left this behind and focused on "Risk Value Theory")
-Coming close: Nearer you come to achieving a thing, the greater the regret you experienced if you failed to achieve it.
-Responsibility: The more control you felt you had over the outcome of a gamble, the greater the regret you experienced if the gamble turned out badly.

"For most people, the happiness involved in receiving a desirable object is smaller than the unhappiness involved in losing the same object...Happy species endowed with infinite appreciation of pleasures and low sensitivity to pain would probably not survive the evolutionary battle."

"Risk Value Theory"
-Realization that people responded to changes rather than absolute levels.
-Discovery that people approached risk very differently when it involved losses than when it involved gains.
-People did not respond to probability in a straightforward manner.

A loss, according to their theory, was when a person wound up worse off than his "reference point." This could be wherever you started from, your status quo, or an expectation (e.g., an anticipated annual bonus).
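These three features are commonly formalized as a value function over changes from the reference point: concave for gains, convex and steeper for losses. A sketch using Kahneman and Tversky's later published parameter estimates (alpha = 0.88, lambda = 2.25), not numbers from Lewis's book:

```python
# x is the change from the reference point, not total wealth.
def value(x, alpha=0.88, lam=2.25):
    if x >= 0:
        return x ** alpha           # diminishing sensitivity to gains
    return -lam * (-x) ** alpha     # losses loom larger, same curvature

print(value(100))   # ~57.5
print(value(-100))  # ~-129.4: a loss hurts more than the same-sized gain helps
```

The steeper loss side is also what makes framing matter: describe the same outcome as a loss rather than a gain and it lands on the steep part of the curve.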

"Framing"–simply by changing the description of a situation, and making a gain seem like a loss, you could cause people to completely flip their attitude toward risk, and turn them from risk avoiding to risk seeking.

"I have vivid memories of running from one article to another. For a while I wasn't sure why I was so excited. Then I realized: They had one idea. Which was systematic bias." –Thaler

"Reality is a cloud of possibility, not a point." –Tversky

"The Conjunction Fallacy"
People were blind to logic when it was embedded in a story. Describe a very sick old man and ask people: Which is more probable, that he will die within a week or die within a year? More often than not, they will say, "He'll die within a week."

Amos created a lovely example. He asked people: Which is more likely to happen in the next year, that a thousand Americans will die in a flood, or that an earthquake in California will trigger a massive flood that will drown a thousand Americans? People went with the earthquake.
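The logic both examples violate is the conjunction rule: a joint event can never be more probable than either of its components. A one-line check with invented probabilities:

```python
p_flood = 0.01                       # chance of any such flood next year (invented)
p_quake_given_flood = 0.3            # share of those floods caused by quakes (invented)
p_quake_and_flood = p_flood * p_quake_given_flood  # 0.003
print(p_quake_and_flood <= p_flood)  # True, necessarily: P(A and B) <= P(A)
```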

The force that led human judgment astray in this case was what Danny and Amos had called "representativeness," or the similarity between whatever people were judging and some model they had in their mind of that thing.

The work that Amos and Danny did together awakened economists and policy makers to the importance of psychology.

Richard Thaler–the first frustrated economist to stumble onto Danny and Amos's work and pursue its consequences for economics single-mindedly–helped to create a new field and give it the name "behavioral economics."