The planning fallacy is an example of optimistic bias. Optimistic people view their own attributes as more favorable than they truly are and see their goals as more achievable. Optimists overestimate their ability to forecast the future and are therefore overconfident. Their self-confidence leads them to take more risks than they realize and to underestimate the odds they face.
Optimistic bias, like the planning fallacy, is an aspect of overconfidence—usually in one’s own abilities. This chapter explores how human overconfidence in the face of difficult odds can lead to poor decisions, both financial and otherwise.
The chance that a small business will survive for five years in the United States is about 35%, but individual owners do not believe that this statistic applies to them: on average, they estimate the chance of success of a business like theirs at 60%, and 81% put their own personal odds of success at 7 out of 10 or higher.
People who begin small businesses are Kahneman’s first example of the way in which people are overly optimistic, even in the face of difficult odds.
Optimism encourages persistence in the face of obstacles, but that persistence can be costly. A Canadian organization called the Inventor’s Assistance Program rates inventions on a letter grade scale, where D and E predict failure. Their predictions are largely accurate: none of the 411 projects with a D or E grade became commercially successful. Still, after hearing this result, 47% of the inventors with those grades continued developmental efforts even in the face of hopeless odds, often doubling their initial investment.
Perhaps one factor behind this overconfidence is Kahneman's recurring theme that people have a hard time weighing statistics against personal experience. They understand their own drive and skill better than they can fully digest the odds they have been given. Kahneman will also go on to demonstrate in later chapters how people overweight small probabilities.
Years earlier, Kahneman and his wife were on vacation and stayed at a nice but deserted motel, owned by a couple, in a little-traveled area. The couple said that they had been able to buy it cheaply because six or seven prior owners had failed to make the business profitable. Yet they felt no need to explain why they expected to succeed.
The motel owners provide a concrete anecdote of overconfidence and optimism: the couple, like most people, believed that they could succeed where others had failed.
Cognitive biases play an important role in optimism. We focus on our goal and neglect base rates, exposing ourselves to the planning fallacy. We focus on our own qualities and neglect the plans and skills of others. We focus on skill and neglect the role of luck. We focus on what we know and neglect what we do not know.
Overconfidence in business ventures sums up many of the points that Kahneman introduced up to this point, as each factor here contributes to the extreme optimism that we have about our own chances of success.
Kahneman asks readers to consider two questions: “Are you a good driver?” and “Are you better than average as a driver?” Most people say yes to the first. They have a more difficult time with the second, and usually substitute the first answer for the second. When people are asked about tasks they find difficult, they readily rate themselves as lower than average. Thus, “people tend to be overly optimistic about their relative standing on any activity in which they do moderately well.”
The example in which Kahneman asks whether a person is a good driver illustrates both a logical impossibility and a cognitive illusion. Most people say that they are good drivers, but it is hard to reconcile this with the fact that not everyone can be a better-than-average driver, and we have only a vague reference point for what average driving looks like.
Returning to the example of people starting a business venture, Kahneman writes that they also overestimate their own effect on outcomes rather than considering the actions of the market and their competitors. This competition neglect explains why several big-budget movies sometimes open on the same weekend: each studio focuses on its own abilities and ignores the competition.
This concept is another example of WYSIATI. Instead of seeking outside information that the studio does not have, film executives may focus only on how to market and sell their own film to audiences.
Professors at Duke University surveyed the chief financial officers (CFOs) of large corporations, asking them to estimate the returns of the Standard & Poor's index over the following year. In addition to this estimate, the CFOs provided two others: a value they were 90% sure would be too high and one they were 90% sure would be too low. The range between these two values is an 80% confidence interval. In reality, their intervals were far too narrow: about 67% of outcomes fell outside the range, more than three times the expected 20%. The CFOs were grossly overconfident about their estimates.
Confidence intervals of this kind are standard practice among statisticians, providing a range within which an outcome can be predicted with a stated degree of certainty. But outside of statistical analysis, people have a hard time broadening their estimates in this way and are often far too specific (and therefore far too confident) in the ranges they give.
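The CFO result can be made concrete with a toy simulation. This is a hypothetical illustration, not the Duke data: it assumes normally distributed annual returns (the mean, standard deviation, and interval widths below are invented for the sketch) and compares a well-calibrated 80% interval against an overconfident one about a third as wide.

```python
import random

random.seed(0)

# Hypothetical return distribution: mean 8%, standard deviation 18%.
MU, SIGMA = 0.08, 0.18
N = 10_000

# A calibrated 80% interval for a normal distribution runs from the
# 10th to the 90th percentile: mu +/- 1.2816 * sigma.
calibrated_low = MU - 1.2816 * SIGMA
calibrated_high = MU + 1.2816 * SIGMA

# An "overconfident" forecaster states an interval one third as wide,
# mimicking the CFOs' too-narrow ranges.
narrow_low = MU - 1.2816 * SIGMA / 3
narrow_high = MU + 1.2816 * SIGMA / 3

outcomes = [random.gauss(MU, SIGMA) for _ in range(N)]

def miss_rate(low, high, xs):
    """Fraction of outcomes falling outside the stated interval."""
    return sum(1 for x in xs if x < low or x > high) / len(xs)

print(f"calibrated miss rate: {miss_rate(calibrated_low, calibrated_high, outcomes):.0%}")
print(f"narrow miss rate:     {miss_rate(narrow_low, narrow_high, outcomes):.0%}")
```

Under these assumptions the calibrated interval misses about 20% of outcomes, while the too-narrow interval misses roughly two thirds of them, close to the 67% the CFOs actually experienced.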
Kahneman acknowledges, however, that if CFOs had given the accurate 80% range, they would have been laughed out of any company because it is far too broad for financial standards. Organizations like to take the word of overconfident experts. This is true not only in financial institutions, but also in fields like medicine, where high confidence (even if it’s unwarranted) earns the trust of clients.
Kahneman also demonstrates why this overconfidence is rewarded: people naturally associate expertise with confidence. People are also comforted by experts who will confidently confirm their existing beliefs (an aspect of confirmation bias).
Overconfident optimism is difficult for individuals to tame but perhaps possible for organizations. Gary Klein proposed a procedure called the "premortem." Before finalizing a decision, Klein instructs a group to imagine that it is a year in the future, the plan has been implemented, and the outcome was a disaster; each member then takes a few minutes to write a brief history of that disaster. The premortem overcomes groupthink and legitimizes doubts. It also encourages supporters of a decision to search for pitfalls of the plan that they may not have considered earlier. It will not protect against every surprise, but it reduces the damage of plans born of uncritical optimism.
The premortem overcomes optimistic bias in the same way that averaging many people's estimates of the number of pennies in a jar makes the group more accurate than most individuals: it aggregates different perspectives and helps cancel out cognitive biases that some individuals hold and others do not. And simply by forcing people to consider a negative outcome, it prevents them from following a plan merely because doing so is cognitively easier.
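The penny-jar analogy can be sketched as a small simulation (all numbers here are hypothetical, chosen only for illustration): when individual errors are independent, they partly cancel in the average, so the group estimate beats the typical individual, just as pooling independent doubts in a premortem beats any single optimist's view.

```python
import random
import statistics

random.seed(1)

TRUE_COUNT = 850  # hypothetical number of pennies in the jar

# Each guesser is noisy, but the errors are independent, so they
# partly cancel when the guesses are averaged together.
guesses = [random.gauss(TRUE_COUNT, 250) for _ in range(100)]

group_estimate = statistics.mean(guesses)
group_error = abs(group_estimate - TRUE_COUNT)
median_individual_error = statistics.median(abs(g - TRUE_COUNT) for g in guesses)

print(f"group error: {group_error:.0f} pennies")
print(f"typical individual error: {median_individual_error:.0f} pennies")
```

Under these assumptions, the averaged estimate lands far closer to the true count than the typical individual guess does.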