Thinking, Fast and Slow

by Daniel Kahneman

Thinking, Fast and Slow Summary

Daniel Kahneman begins by laying out his idea of the two major cognitive systems that comprise the brain, which he calls System 1 and System 2. System 1 operates automatically, intuitively, and involuntarily. We use it to solve simple math problems, read simple sentences, or recognize objects as belonging to a category. System 2 is responsible for thoughts and actions that require attention and deliberation: solving problems, reasoning, and concentrating. System 2 requires more effort, and thus we tend to be lazy and rely on System 1. But this reliance causes errors, particularly because System 1 has biases and can be easily influenced by environmental stimuli (an effect called priming).

Kahneman elaborates on System 1’s biases: statements that are easier to process and more familiar seem truer than statements that require additional thought (a feeling called cognitive ease). System 1 also tends to search for examples that confirm our previously held beliefs (the confirmation bias). This in turn causes us to like (or dislike) everything about a person, place, or thing (the halo effect). System 1 also causes us to substitute easier questions for hard ones, answering the easy question “What is my mood right now?” in place of the hard question “How happy am I these days?”

The second part of the book focuses on biases in calculation. Our brains have a difficult time with statistics: we often fail to appreciate that small samples are inherently more extreme (more variable) than large samples, which leads us to draw conclusions from insufficient data. Our brains also tend to construct causal stories about statistical data, even when no real cause lies behind the pattern.
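
To make the small-sample point concrete, here is a minimal simulation sketch (mine, not from the book); the 40–60% band, the sample sizes, and the function name are arbitrary illustrative choices:

```python
import random

def extreme_rate(sample_size, trials=2_000):
    """Fraction of samples whose share of heads falls outside the 40-60% band."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share < 0.4 or share > 0.6:
            extreme += 1
    return extreme / trials

# Small samples routinely produce extreme proportions; large samples rarely do.
print(extreme_rate(10))     # roughly a third of 10-flip samples are "extreme"
print(extreme_rate(1000))   # virtually no 1,000-flip samples are
```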

If we are asked to estimate a number and are first given a number to anchor us (like being asked whether Gandhi was over 35 when he died, and then being asked how old Gandhi was when he died), that anchor will have a large effect on our estimate. If asked to estimate the frequency of a category or event (like people who divorce after the age of 60), we rarely try to work out the underlying statistical rate; instead we overestimate whenever vivid examples of that thing, or personal experiences with it, come easily to mind (the availability heuristic).

We overlook statistics in other ways: if we are given a description of a fictional person who fits the stereotype of a computer science student (Kahneman names him Tom W), we will overestimate the probability that he actually belongs to that group, even though computer science students are a small group relative to students in other fields, so the base rate alone argues against the stereotype. In the same vein, if a fictional person fits the stereotype of a feminist (Kahneman calls her Linda), people will be more likely to say that she is a feminist bank teller than simply a bank teller, despite the fact that this violates the logic of probability: every feminist bank teller is, by definition, a bank teller.
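
The probability rule that the Linda problem violates can be stated in one line; the labels below are just shorthand for the two descriptions, not notation from the book:

```latex
P(\text{bank teller} \cap \text{feminist}) \;\le\; P(\text{bank teller})
```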

When trying to make predictions, we often overestimate the role of qualities like talent, stupidity, and intention, and underestimate the role of luck and randomness. A golfer who has an unusually good first day in a tournament, for instance, is statistically likely to have a worse second day, and no causal explanation beyond regression to the mean is necessary. In this continuous attempt to make coherent sense of the world, we also create flawed explanations of the past and believe that we understand the future to a greater degree than we actually do. In hindsight, we overestimate how well we could have predicted what happened, a tendency Kahneman calls the hindsight illusion.
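
The golfer example can be stated as a standard regression-to-the-mean identity, assuming the two days’ scores are equally variable and linearly related (the notation is mine, not the book’s):

```latex
\mathbb{E}[\text{day-2 score} \mid \text{day-1 score}] = \mu + \rho\,(\text{day-1 score} - \mu), \qquad 0 \le \rho < 1
```

Here μ is the golfer’s long-run average and ρ is the day-to-day correlation; because ρ is less than 1, an unusually good first day predicts a second day closer to the average, with no causal story required.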

Kahneman next focuses on overconfidence: we sometimes confidently believe that our intuitions, predictions, and points of view are valid even in the face of evidence that those predictions are useless. Kahneman gives an example in which he and a peer observed group exercises with soldiers and tried to identify good candidates for officer training; despite the fact that their forecasts proved to be completely inaccurate, they did not change their forecasting methods or behavior. People also often overlook statistical information in favor of gut feelings, but checklists, statistics, and numerical records are generally more reliable than subjective impressions. An example is the Apgar score used in delivery rooms, which standardized the assessment of newborn infants, helped identify which babies were in distress, and greatly reduced infant mortality.

Kahneman spends a good deal of time discrediting people like financial analysts and newscasters, who he believes are treated like experts even though, statistically, they have no demonstrable predictive skill. He works with Gary Klein to identify when “expert” intuition can be trusted, and discovers that only some environments lend themselves to developing genuine expertise: people must be exposed to environments that are sufficiently regular to be predictable, and must have the opportunity to learn those regularities through practice. Firefighters and chess masters are good examples of true experts.

Kahneman elaborates on other ways in which we are overconfident: we often take on risky projects because we assume the best-case scenario for ourselves. We are ignorant of others’ failures and believe that we will fare better than other people in ventures like starting a small business or, as Kahneman himself experienced, designing a curriculum.

Kahneman then moves on to the theory he and Amos Tversky developed, called prospect theory. He first introduces Daniel Bernoulli’s utility theory, which argues that money’s value is not strictly fixed: $10 means the same thing to someone with $100 as $100 does to someone with $1,000, because what matters is the proportional change in wealth. But Kahneman highlights a flaw in Bernoulli’s theory: it does not consider a person’s reference point. If one person had $1 million yesterday and another had $9 million, and today they both have $4 million, they are not equally happy; their wealth does not have the same utility to each of them.
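
Bernoulli’s idea is usually written as logarithmic utility; the formula below is the standard textbook form rather than anything quoted from this summary, and it shows why equal proportional gains feel equal:

```latex
u(w) = \log w \quad\Rightarrow\quad u(110) - u(100) = \log\tfrac{110}{100} = \log\tfrac{1100}{1000} = u(1100) - u(1000)
```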

Prospect theory has three features that distinguish it from utility theory: 1) Prospects are evaluated with regard to a reference point, usually a person’s current state of wealth. 2) A principle of diminishing sensitivity applies to wealth: the subjective difference between $900 and $1,000 is much smaller than the difference between $100 and $200. 3) Losses loom larger than gains: offered a gamble with equal chances to win $150 or lose $100, most people refuse it because they fear the loss more than they value the possible gain, even though the gamble’s expected value is positive. Loss aversion applies to goods as well: the endowment effect shows that a good is worth more to us once we own it, because losing the good is more painful than gaining it is pleasant.
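
A minimal sketch of a prospect-theory value function helps show why the $150/$100 gamble is refused. The functional form and the parameters below (alpha = 0.88, lambda = 2.25) are the commonly cited Tversky–Kahneman estimates, used here only for illustration; they are not figures given in this summary.

```python
ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAMBDA = 2.25  # loss aversion: losses weigh roughly twice as much as gains

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0) relative to the reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# Equal chances to win $150 or lose $100: the expected value in dollars is positive,
# but the expected subjective value is negative, so most people turn the gamble down.
print(0.5 * 150 - 0.5 * 100)                  # +25.0 in dollars
print(0.5 * value(150) + 0.5 * value(-100))   # negative in subjective value
```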

Standard economic theory holds that people are rational and will weigh the outcomes of a decision in accordance with the probabilities of those outcomes. But prospect theory demonstrates that people sometimes do not weigh outcomes strictly by probability. For example, when people have a 95% chance to win $10,000, they overweight the small probability that they may not win the money; they become risk averse and will often accept a smaller, guaranteed amount instead. When there is only a 5% chance of winning $10,000, people overweight the probability of winning and hope for a large gain (which explains why people buy lottery tickets).
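
For reference, the expected values of the two gambles are straightforward to compute (this arithmetic is mine, not quoted from the book):

```latex
0.95 \times \$10{,}000 = \$9{,}500 \qquad\qquad 0.05 \times \$10{,}000 = \$500
```

Risk aversion shows up when people accept a sure amount below $9,500 in the first case; the lottery-ticket pattern shows up when people value the long shot in the second case at more than its $500 expected value.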

Prospect theory explains why we overestimate the likelihood of rare events, and also why in certain scenarios we become so risk-averse that we avoid all gambles, even though not all gambles are bad. Our loss aversion also explains other biases: we hesitate to cut our losses, and so we often double down on the money or resources we have already invested in a project (the sunk-cost fallacy), despite the fact that that money might be better spent on something else.

Our brains can lack rationality in other ways: for instance, we sometimes make decisions differently when we consider two scenarios in isolation than when we consider them together. For example, when the two causes are presented separately, people will on average contribute more to an environmental fund that aids dolphins than to a fund that helps farmers get check-ups for skin cancer. But when the two are viewed together, people will contribute more to the farmers, because they generally value humans more than animals.

How a problem is framed can also affect our decisions: we are more likely to agree to surgery when it is described as having a 90% one-month survival rate than when the same outcome is framed as a 10% mortality rate. Frames are difficult to combat because we are rarely presented with the alternative frame, and thus we often fail to realize how the frame we do see affects our decisions.

Kahneman also worked on studies that evaluated measures of happiness and experience. He found that we have an experiencing self and a remembering self, and that the remembering self often determines our choices more than the experiencing self. For example, how an experience ends carries greater weight in memory than the experience as a whole (the peak-end rule), and we largely ignore the duration of experiences in favor of how painful or pleasurable we remember them to be (duration neglect). This causes us to evaluate our lives in ways that prioritize our global memories rather than the day-to-day experience of living.

Kahneman concludes by arguing for the importance of understanding the biases of our minds, so that we can recognize situations in which we are likely to make mistakes and mobilize more mental effort to avoid them.