Economist Howard Kunreuther noticed that the availability heuristic explained the pattern of insurance purchase after disasters. After disasters, people are very concerned and buy insurance, but this concern dims over time. He also observed that protective actions are usually designed to be adequate for the worst disaster that has been experienced. It is difficult for people to prepare for disasters that may be worse.
The fact that people have a hard time preparing for disasters worse than any they have experienced again demonstrates how people put more stock in their own experiences than in objectively assessing possible risks.
An influential study designed by Paul Slovic asked survey participants to consider pairs of causes of death (e.g., diabetes and asthma). Participants indicated the more frequent cause and estimated the ratio of the two frequencies. The results were clear: estimates of causes of death were warped by media coverage, which is biased toward novelty and poignancy (for example, death by accident was judged to be more than 300 times more likely than death by diabetes, though diabetes actually causes four times as many deaths). Causes of death that yielded frightening and visceral images were particularly overestimated.
Slovic’s study provides another example of how we overestimate the frequency of certain things—in this case, causes of death—based on the ease with which we can think of examples of them happening. This thought process also seems inextricably linked to the emotional aspect of System 1, as frightening images become even more available and more overestimated in our minds.
Paul Slovic eventually developed the notion of the affect heuristic, in which people answer the question “How do I feel about it?” in place of the harder question “What do I think about it?” When people were surveyed about the benefits and risks of various technologies, those who liked a technology exaggerated its benefits and underplayed its risks; those who disliked a technology did the opposite.
The example of the risks and benefits of technologies also recalls the halo effect, in which people tend to like or dislike everything about a person or thing in order to simplify their thinking about it.
After completing the initial survey, participants then read brief passages arguing in favor of certain technologies. The researchers found that simply reading about the benefits of a technology changed people’s minds about its risks, even though they had received no evidence about the risks.
This second example Kahneman provides also exhibits the halo effect: learning about a technology’s benefits led people to form a more simplistic and coherent idea of it, thus decreasing their worry about its risks.
Slovic has done extensive work on risk, and he paints a picture of the average person as guided by emotion rather than reason, whereas expert judgments about risk are often grounded in logic and statistics. Slovic argues, however, that expert opinions should not be accepted without question when they conflict with those of average people—that risk is subjective, and people’s emotions about it should be taken into account when creating public policies.
This disagreement between the “average person” and the “expert” highlights the way in which people, most of the time, are deeply affected by stories and are highly subjective. The experts, on the other hand, represent confidence in statistical objectivity.
Another scholar, Cass Sunstein, argues against Slovic. Sunstein believes that the system of regulation in the United States reflects public pressure and therefore sets poor priorities. He believes that risk can be measured objectively, in lives and dollars lost.
While Slovic sides more with the average person, Sunstein takes on the objective viewpoint and sides with the experts in trying to deal with risk.
Sunstein’s research focuses on two examples that are still debated: the Love Canal affair and the Alar scare. At Love Canal, buried toxic waste was exposed during a rainy season in 1979. There were daily news stories about it, and the control of toxic waste became the major environmental issue of the 1980s. But scientists believed that the risks were overstated, and that the expense incurred in cleaning up the waste could have saved many more lives if directed to other priorities. The Alar scare provides a similar example, in which the small risks of a chemical sprayed on apples became hugely overstated.
These two stories play not only on the tension between subjective fear and objective risk assessment, but also on the availability heuristic that Kahneman brought up before. In both examples, the large amount of media attention made people fearful of these risks and swayed them into overestimating the dangers.
The Alar scare demonstrates how we have a difficult time dealing with small risks: we either ignore them or give them far too much weight. This “availability cascade,” in which events given prominent media attention garner a large overreaction, explains why terrorism is so potent. Even in a country plagued by terror campaigns, like Israel, the weekly number of casualties never comes close to the number of traffic deaths, but the media attention biases our perception.
Again, the availability cascade relies on a human fault: that we tend to become particularly fearful or particularly affected by gruesome and unique events. Media stories then often focus on these kinds of events because we have such strong reactions to them, and as a result we become even more fearful.
Kahneman writes that he sees merit in both arguments: availability cascades are real and do lead to overreactions, but fear is also painful in itself, and policy makers must endeavor to protect the public from fear, not only from real dangers. Additionally, availability cascades can alert people to whole classes of risk, such as environmental concerns. Risk policies, he concludes, should combine expert knowledge with the public’s emotions and intuitions.
This is perhaps the only time in the book where Kahneman sees merit in creating policy or making decisions based on emotion. He understands that on a large scale, fear is hard to control, and sometimes it is just as important to make the public feel safe as it is to solve the actual problem, because the issue really stems from the fear generated by a given event.