Thinking, Fast and Slow

by Daniel Kahneman

Thinking, Fast and Slow: Part 4, Chapter 34 Summary & Analysis

Kahneman demonstrates that two statements about the result of the 2006 World Cup final, “Italy won” and “France lost,” are logically equivalent but evoke different associations and meanings. This feature of System 1 makes it difficult for people to act consistently when presented with different frames.
Frames, like primes, influence how System 1 processes information. Because the frame is often not fully apparent, it is difficult to deliberately adjust our perspective and make less biased judgments.
Themes: Intuition, Deliberation, and Laziness
Kahneman and Tversky applied frames to gambles in these two scenarios. They asked some participants, “Would you accept a gamble that offers a 10% chance to win $95 and a 90% chance to lose $5?” They asked others, “Would you pay $5 to participate in a lottery that offers a 10% chance to win $100 and a 90% chance to win nothing?” These two problems are identical, but the second usually attracts many more positive answers. Losses evoke stronger negative feelings than costs.
This example plays into both the concept of framing and the concept of loss aversion. In the second version, the question is framed to minimize the sense that the person stands to lose money (even though, of course, paying to play is itself a cost).
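To make the equivalence concrete, here is a minimal Python sketch (an illustration, not something from the book; the name expected_value is just a label) that lists the net outcomes of each framing and confirms they describe the same gamble.

```python
# Net outcomes of the two framings of the same gamble.

# Framing 1: a 10% chance to win $95 and a 90% chance to lose $5.
framing_1 = {0.10: +95, 0.90: -5}

# Framing 2: pay $5 for a 10% chance to win $100 and a 90% chance to win nothing.
framing_2 = {0.10: -5 + 100, 0.90: -5 + 0}

def expected_value(outcomes):
    """Probability-weighted average of the net outcomes."""
    return sum(p * value for p, value in outcomes.items())

print(framing_1 == framing_2)                     # True: identical net outcomes
print(round(expected_value(framing_1), 2))        # 5.0 (average gain of about $5)
print(round(expected_value(framing_2), 2))        # 5.0
```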
Themes: Intuition, Deliberation, and Laziness; Choices, Losses, and Gains
In an essay, Richard Thaler describes another example: the credit card lobby fought against allowing gas stations to charge more when people paid with a credit card. Its fallback position was to ask that vendors call the difference a “cash discount” rather than a “credit surcharge,” because people will more readily forgo a discount than pay a surcharge.
Kahneman gives a real-world example that is essentially equivalent to the prior one: the credit card lobby manipulates customers by presenting the extra cost of paying with a card as a forgone discount rather than a surcharge, deemphasizing that cardholders are losing money.
Themes: Intuition, Deliberation, and Laziness; Choices, Losses, and Gains
In another experiment, conducted by British psychologists, participants are given £50 and then asked to choose between a sure outcome of keeping £20 and a gamble with a 2/5 chance to keep the entire amount they have been given. The sure outcome can be framed either as KEEP £20 or as LOSE £30. When the frame is KEEP £20, subjects are more likely to choose the sure thing; when it is LOSE £30, they are more likely to gamble.
Again, the frame affects our decisions because of our inherent loss aversion. Rather than accept what is framed as a sure loss, people take a gamble; when that same outcome is framed as a gain, they opt for the sure thing.
Themes: Intuition, Deliberation, and Laziness; Choices, Losses, and Gains
Participants’ brain activity was monitored during this experiment. When subjects chose the more frequent option (in either frame), a region associated with emotional arousal was active. When they did not do what comes naturally, a brain region associated with conflict and self-control was active.
The brain-activity data from this experiment illustrate how the emotions triggered by a word like KEEP or LOSE can shape people’s choices. When people tried to resist this emotion, their System 2 was mobilized.
Themes: Intuition, Deliberation, and Laziness
Another example of emotional framing comes from an experiment that Tversky carried out, in which two outcomes of surgery were described to physicians: “The one-month survival rate is 90%” and “There is a 10% mortality in the first month.” Recommending surgery was more popular in the former frame than in the latter.
This framing problem directly relates to WYSIATI (What You See Is All There Is). When physicians see the positive outcome of surgery (survival), they are more likely to recommend it; when they see the negative outcome (mortality), they are less likely to do so.
Themes: Intuition, Deliberation, and Laziness
Kahneman and Tversky also explored framing with this example: the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. If program A is adopted to combat it, 200 people will be saved. If program B is adopted, there is a 1/3 probability that 600 people will be saved, and a 2/3 probability that no people will be saved. A majority of people choose program A.
As in other examples, when the emphasis is on a positive word (“saved”), people prefer the sure positive outcome over the gamble (i.e., they become risk-averse).
Themes: Choices, Losses, and Gains
Now consider different framings: If program A’ is adopted, 400 people will die. If program B’ is adopted, there is a 1/3 probability that nobody will die and a 2/3 probability that 600 people will die. The consequences of A and A’ are the same, as are B and B’. In the second frame, however, most people choose the gamble.
When people examine the opposite framing of the problem, they focus on the negative outcomes (deaths). They want to avoid a sure loss and therefore become risk-seeking. Yet unlike the earlier examples Kahneman describes to illustrate prospect theory, here the only difference between the two versions is how the outcomes are presented, which highlights the inconsistency of people’s preferences.
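As a quick check on the arithmetic, here is a minimal sketch (an illustration, not from the book; the variable names are hypothetical) showing that each program in the “lives saved” frame has the same expected outcome as its counterpart in the “lives lost” frame.

```python
# Expected number of survivors (out of 600) under each framing of the programs.

TOTAL = 600

# "Lives saved" frame
program_a = 200                                 # 200 people will be saved for sure
program_b = (1 / 3) * TOTAL + (2 / 3) * 0       # 1/3 chance that all 600 are saved

# "Lives lost" frame
program_a_prime = TOTAL - 400                   # 400 people will die for sure
program_b_prime = (1 / 3) * TOTAL + (2 / 3) * (TOTAL - 600)   # 1/3 chance nobody dies

print(program_a, program_a_prime)                 # 200 200 -> A and A' are identical
print(round(program_b), round(program_b_prime))   # 200 200 -> B and B' are identical
```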
Themes: Intuition, Deliberation, and Laziness; Choices, Losses, and Gains
When people are confronted with this inconsistency, they often don’t know how to decide. They know intuitively that saving lives with certainty is good, and certain death is bad. But System 2 does not always have a way of answering the question on its own. Our moral intuitions too often rely on descriptions, not substance.
The framing in this example exposes that we sometimes make decisions—even important moral ones—based on intuitive reactions to emotional words. Though Kahneman doesn’t exactly propose a solution to the fact that System 2 seems to be without a compass, he highlights the necessity of understanding how System 1’s mode of reasoning can be flawed, even when it comes to decisions that seem obvious.
Themes: Intuition, Deliberation, and Laziness
Some frames can be more useful than others. Kahneman asks readers to consider a pair of problems. In the first, a woman loses two $80 theater tickets; in the second, a woman loses $160 in cash. Kahneman asks whether the first woman would buy two more tickets, and whether the second would buy tickets anyway. People who see the problems believe that the first woman will not buy two more tickets, but that the second woman will. Kahneman advises that in each case sunk costs should be ignored. He would ask the first woman, “Would you have bought the tickets if you had lost the equivalent amount of cash?” If yes, she should buy two more tickets.
In this pair of problems, the only difference between the two women is the framing of the loss: one has lost money intended for tickets, while the other has simply lost money from her overall wealth (even though her money was also intended for tickets). By framing the lost tickets as part of a broader loss of wealth, the first woman can avoid the bias of feeling that the price of the tickets has doubled, which shows that frames can also be helpful.
Themes: Intuition, Deliberation, and Laziness
Another example of bad framing centers on two drivers, Adam and Beth. Adam switches from a gas-guzzler that gets 12 mpg to a car that gets 14 mpg; Beth switches from a 30 mpg car to a 40 mpg car. Intuitively, people think that Beth is saving more gas, but if each drives 10,000 miles, Adam will reduce his consumption by about 119 gallons, while Beth will reduce hers by only about 83 gallons. The mpg frame is misleading, and Cass Sunstein, a legal scholar who worked with Richard Thaler, helped change policy so that fuel economy information is now presented in gallons per mile alongside mpg.
This example is perhaps even more concrete than the last—not only because driving is common to many people, but also because nearly everyone’s intuitive answers about the problem are wrong. The miles-per-gallon frame is misleading not only to consumers but also to policy makers. Sunstein’s role in changing public policy demonstrates the need for more people who understand these biases to help come up with ways to combat them.
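For readers who want to verify the numbers, here is a minimal sketch of the fuel arithmetic over 10,000 miles (an illustration, not from the book; the name gallons_used is hypothetical).

```python
# Gallons of gas consumed over 10,000 miles at each fuel economy rating.

MILES = 10_000

def gallons_used(mpg):
    """Fuel consumed over MILES miles at the given miles-per-gallon rating."""
    return MILES / mpg

adam_savings = gallons_used(12) - gallons_used(14)   # Adam: 12 mpg -> 14 mpg
beth_savings = gallons_used(30) - gallons_used(40)   # Beth: 30 mpg -> 40 mpg

print(round(adam_savings))  # 119 gallons saved by Adam
print(round(beth_savings))  # 83 gallons saved by Beth
```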
Themes: Intuition, Deliberation, and Laziness
Framing is important in yet another example regarding organ donation. A 2003 study found that nearly 100% of people in Austria choose to be organ donors, but only 12% in Germany, and 86% in Sweden but only 4% in Denmark. The difference is that in the high-donation countries people must check a box to opt out of organ donation, while in the low-donation countries they must check a box to become a donor. An important choice is controlled by an inconsequential feature of the situation.
The difference in organ donation rates serves as a good demonstration of how frames can also play into the laziness of our brains. For those who have thought about organ donation, the frame does not matter. For those who have not, the easiest course is to accept whichever option is already selected for them.
Themes: Intuition, Deliberation, and Laziness