Thinking, Fast and Slow

by Daniel Kahneman

Thinking, Fast and Slow: Part 1, Chapter 7 Summary & Analysis

System 1 allows us to use intuition to draw conclusions. Kahneman introduces two shapes that can look like either the letter “B” or the number “13” depending on the surrounding context. He also describes a sentence, “Ann approached the bank,” whose associations change depending on whether an earlier sentence has to do with money or with rivers. In each case, our minds make a definite choice, and people are often unaware of the ambiguity of the shape or the sentence.
Kahneman’s B/13 example, and the Ann example, demonstrate not only how we automatically (and unknowingly) process things, but also how we immediately move to construct stories and patterns from previously encountered contexts.
Psychologist Daniel Gilbert argues that to understand a statement, one must first attempt to believe it and only then consider whether it is untrue. Even a nonsensical statement, like “whitefish eat candy,” will initially evoke belief until it is proven false. When people are made to hold digits in memory while simultaneously judging whether statements are true, they end up believing many false statements. The conclusion: when System 2 is otherwise engaged, we believe almost anything.
Like the earlier example of the woman in the gorilla suit, when our minds are preoccupied with other tasks it is difficult to process and evaluate information. And because, as Gilbert shows, we tend to err on the side of belief rather than disbelief, we can make mistakes about even the most basic information.
This concept contributes to a general confirmation bias. Asking “Is Sam friendly?” leads to a different thought process than asking “Is Sam unfriendly?”: we automatically look for confirming evidence. This runs contrary to the rules of science, which advise testing hypotheses by trying to refute them.
The “halo effect” is an outgrowth of confirmation bias—it is the tendency to like (or dislike) everything about a person, including things that we might not have observed. The halo effect is one of the ways that System 1 generates a simpler representation of the world than actually exists.
Kahneman describes a scenario in which one might meet a woman named Joan at a party. Joan is personable and easy to talk to. If her name comes up as a possible donor for a charity, we retrieve good feelings about her and think that she is likely to be generous (a relatively baseless assumption). And now that we believe she is likely to be generous, we like her even more.
Another psychologist, Solomon Asch, presented descriptions of two people and asked for comments on their personalities. The descriptions used exactly the same words, but Alan’s began “intelligent—industrious—impulsive” and ended “critical—stubborn—envious,” while Ben’s listed the words in the opposite order. People form a much more positive view of Alan because his initial traits are positive, while Ben’s are negative: we interpret the second set of words in the context of the first.
Kahneman describes the halo effect he himself experienced when grading students’ exams: he was often biased by the score he gave a student’s first essay. When he switched to grading blind, scoring all students’ answers to one question before moving on to the next, he found that the new system made it harder to assign a coherent overall score (because a single student’s essay grades varied wildly), but it was less biased. He no longer experienced cognitive ease.
One procedure for taming the halo effect is to “decorrelate error” by drawing on the independent opinions of many people. If people are asked to guess the number of pennies in a jar, their individual estimates will be relatively poor, but the average of a group of estimates tends to be quite accurate. The one caveat is that the estimates must be independent; they cannot be allowed to influence one another. Organizations can learn from this: open discussions often give too much weight to the opinions of those who speak early and assertively.
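The statistical logic behind averaging independent estimates can be sketched in a few lines. The numbers below (true count, number of guessers, size of the noise) are hypothetical, chosen only to illustrate why individual errors are large while the error of the group average is small:

```python
import random
import statistics

# Hypothetical illustration of the pennies-in-a-jar example: each of
# 1,000 independent guessers makes a noisy estimate of the true count.
# The noise level (standard deviation 200) is an assumed figure.
random.seed(42)

TRUE_COUNT = 850
estimates = [random.gauss(TRUE_COUNT, 200) for _ in range(1000)]

# Individual errors are large on average...
individual_errors = [abs(e - TRUE_COUNT) for e in estimates]

# ...but independent errors largely cancel when averaged.
group_error = abs(statistics.mean(estimates) - TRUE_COUNT)

print(f"typical individual error: {statistics.mean(individual_errors):.0f}")
print(f"error of the group average: {group_error:.0f}")
```

The cancellation only works because each estimate's error is independent of the others; if the guessers conferred first, their errors would be correlated and would no longer average out, which is the caveat noted above.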
Kahneman next introduces a principle, which he terms “What You See Is All There Is” (WYSIATI). If we are asked whether a person will be a good leader and are told first that they are intelligent and strong, we automatically assume that they will be, even though the description might go on to say that the person is also corrupt and cruel. We jump to conclusions based on the information available to us.
In an experiment constructed by Kahneman’s long-time scientific partner Amos Tversky, people were presented with a legal scenario. Some heard only the defense, some only the prosecution, and some heard both sides. The participants knew the setup and could have generated the argument for the other side. Nevertheless, the presentation of one-sided evidence strongly affected judgments, and people who saw only one side were far more confident in their judgments than people who saw both.
WYSIATI implies that neither the quality nor the quantity of the evidence counts for much. The confidence that people have in their beliefs is based on the quality of the story they can tell about what they see. We often fail to allow for “the possibility that evidence that should be critical to our judgment is missing.” 
WYSIATI also accounts for framing effects. The statement that “the odds of survival one month after surgery are 90%” is more reassuring than “mortality within one month of surgery is 10%,” and people’s decisions about whether to go through with surgery can be affected by the framing that they see.
Lastly, WYSIATI accounts for what Kahneman calls “base-rate neglect.” Kahneman briefly described a fictional man named Steve in the introduction and reintroduces him here. Steve is “a meek and tidy soul” with a “need for order and structure.” Asked whether Steve is more likely to be a librarian or a farmer, people say a librarian, even though there are about twenty male farmers for every male librarian. The statistical facts do not come to mind, only the description. What we see is all there is.
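Bayes’ rule makes it concrete why the base rate should matter here. The 20:1 ratio of farmers to librarians comes from the text; the likelihood ratio of 4 (how much more probable the “meek and tidy” description is for a librarian than for a farmer) is an assumed number for illustration:

```python
# Sketch of the Steve problem under Bayes' rule. The 20:1 base rate
# is from the text; the likelihood ratio of 4 is a made-up assumption.
prior_odds = 1 / 20        # librarians : farmers among males
likelihood_ratio = 4       # P(description | librarian) / P(description | farmer)

# Posterior odds = prior odds x likelihood ratio.
posterior_odds = prior_odds * likelihood_ratio

# Convert odds to a probability.
p_librarian = posterior_odds / (1 + posterior_odds)

print(f"P(librarian | description) = {p_librarian:.2f}")
```

Even granting that the description fits a librarian four times better, the posterior probability that Steve is a librarian is only about one in six: the neglected base rate dominates the vivid description.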