Kahneman and Tversky once rigged a wheel of fortune that was marked from 0 to 100, so that it would only stop at 10 or 65. Participants would write down the number it landed on. They then asked participants two questions: “Is the percentage of African nations among UN members larger or smaller than the number you just wrote?” and “What is your best guess of the percentage of African nations in the UN?”
Kahneman and Tversky’s experiment introduces the anchoring effect, which provides another way in which System 1 influences System 2’s calculations in a way that makes us more prone to error.
The spin of a wheel of fortune cannot possibly provide useful information, but the participants did not ignore it. The average estimates of those who saw 10 and 65 were 25% and 45%, respectively. This is called the anchoring effect: the estimates stay close to the number that people consider.
The anchoring effect shows how even System 2 can be lazy: we use deliberate calculation to try to estimate the answer, but we rely on whatever information is available, even when that information is plainly random.
Kahneman describes another example: if people are asked whether Gandhi was more than 114 years old when he died they will give a much higher estimate of his age at death than if the first question referred to death at 35. Any number people are asked to consider as a possible solution to an estimation problem will induce the effect.
Again, the numbers here are obviously wrong: 114 is much too high, and 35 is much too low. But they provide a basis from which to start, as Kahneman goes on to explain.
Tversky believed this effect was due to the idea that someone would start from an anchoring number and adjust from there. The adjustment usually ends prematurely, because people stop when they are no longer certain that they should continue. This will happen when people are asked when George Washington became president, or the boiling point of water at the top of Mount Everest. An immediate anchor comes to mind, and then people move from it until they are no longer sure they should go farther.
Tversky’s explanation for anchoring places more blame on System 2 than Kahneman’s does: it argues that we start from answers that come to mind (in these examples, the year 1776 and 100 degrees Celsius) and then deliberately move away from them until we no longer feel confident going farther.
Other studies found that this adjustment is deliberate: people whose mental resources are depleted or who are doing another task at the same time will adjust less (staying closer to the anchor), implying that System 2 is involved.
These findings support the involvement of System 2. As Kahneman explained in the earlier chapters, mental effort is the domain of System 2; when we are mentally depleted, we cannot supply that effort.
Kahneman, on the other hand, believed that anchoring is produced by priming. In the Gandhi example, if the anchoring number is 114 years old at death, people do not adjust down from that number, but they are still affected by it. They are primed by the image of an ancient person and are then prone to believe that Gandhi was very old when he died.
Kahneman, unlike Tversky, focuses his theory on System 1, particularly because we sometimes do not realize that we are affected by the number that has been presented to us.
Another experiment conducted by German psychologists demonstrated this aspect of anchoring. People were asked to estimate if the annual mean temperature in Germany was higher than 20 degrees Celsius or lower than 5 degrees Celsius. Those who were shown the first question had an easier time recognizing summer words, and those who were shown the second question had an easier time with winter words.
The experiment here supports Kahneman’s hypothesis, as people’s automatic responses become primed based on the number with which they are presented—either for summer or for winter, and therefore for the words associated with those seasons.
Powerful anchoring effects have been observed in real estate (with asking prices for homes) and in charitable giving (where people are presented with a suggested donation amount). There are situations, however, in which relying on an anchor seems reasonable: if asked a difficult question about which we know little, we will use any information available to us.
Our brains are laziest when we have no way of figuring out the answer to a question. As in the Gandhi example, we use the random number because we assume it is close to the actual answer, in effect treating it as a hint, even though we have no reason to believe it should be taken that way.
Anchoring reveals that we are highly suggestible, a fact that is often exploited. Arbitrary rationing is an effective marketing ploy: if a sign on a shelf of soup cans says “Limit of 12 Per Person” rather than “No Limit Per Person,” people will take twice as many cans. Still, many people believe that they are not affected by anchors, just as they believe they are not affected by primes, because the influence is not part of their subjective experience.
Again, people are prone to overconfidence, believing themselves immune to universal psychological phenomena, and that belief makes them easy to manipulate. But if people become aware of these unconscious effects, they can resist them more effectively.
The anchoring effect can be combated, however, by mobilizing System 2 in the right way. For example, people are much more successful at resisting anchors in negotiations if they focus their attention on the minimum offer the opponent would accept, rather than being drawn toward the opponent’s opening offer.
Kahneman implies that the anchoring effect can be avoided if people are able to ignore the anchor. But in order to do so, they must first be aware that they have been exposed to an anchor. This is part of Kahneman’s purpose in writing the book in the first place: to give people the tools to recognize the things that make them prone to error.