The Mind Killer: Cognitive Biases Enrolled in the Service of Fear

“Fear is the mind-killer...” – Frank Herbert, 'Dune'

A highly regarded woman was buried in eastern Oregon a week or so back. There was no national media attention. Millions of dollars were not donated to the Red Cross for her family or for the families of those who died like her. Nations will not be invaded in her name, and civil rights will not be revoked to punish anyone suspected in her death, or in the deaths of the 16,000 or so who will die violently like her this year. You see, she did not die in the Pentagon or the WTC; she joined the 84,770 killed on our nation's highways by drunk drivers in the years 2001-2006 and the roughly 33,000 who joined or will join them in 2007-2008. 2,973 people died in the terrorist attacks in NYC and Washington, DC, in September 2001. But 40 times more people have been killed by drunk drivers or in drunk-driving accidents since 2001 than died on 9/11, and 5.5 times as many will die this year alone. Drunk drivers account for between 30 and 40% of all traffic fatalities, so more than 80 times as many people died on U.S. roads in the last six to seven years as died on 9/11/2001.
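As a sanity check, the ratios above can be recomputed from the post's own figures. This is a rough sketch: the total-road-deaths number is back-computed here from the post's claim that drunk drivers cause 30-40% of traffic fatalities, not taken from an official statistic.

```python
# Re-derive the post's risk ratios from its own figures.
SEPT_11_DEATHS = 2973
DRUNK_DEATHS_2001_2006 = 84770
DRUNK_DEATHS_2007_2008 = 33000
DRUNK_DEATHS_PER_YEAR = 16000
DRUNK_SHARE_OF_ROAD_DEATHS = 0.35   # midpoint of the post's 30-40% range (assumption)

drunk_ratio = (DRUNK_DEATHS_2001_2006 + DRUNK_DEATHS_2007_2008) / SEPT_11_DEATHS
annual_ratio = DRUNK_DEATHS_PER_YEAR / SEPT_11_DEATHS
annual_road_deaths = DRUNK_DEATHS_PER_YEAR / DRUNK_SHARE_OF_ROAD_DEATHS
road_ratio_6_years = 6 * annual_road_deaths / SEPT_11_DEATHS

print(f"Drunk-driving deaths since 2001 vs 9/11: {drunk_ratio:.0f}x")        # ~40x
print(f"Annual drunk-driving deaths vs 9/11:     {annual_ratio:.1f}x")       # ~5.4x
print(f"All road deaths over 6 years vs 9/11:    {road_ratio_6_years:.0f}x") # >80x
```

The numbers hang together: the "40 times" and "5.5x" claims fall straight out of the cited figures, and the "more than 80x" road-death claim follows from the 30-40% drunk-driver share.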

According to the EPA, between 8,000 and 45,000 Americans die each year from radon-induced lung cancer (best estimate: 21,000).

So then explain why people who have no trouble getting into a car or going down into their basements are still afraid to visit NYC, or why they sheepishly disrobe in front of rent-a-cops at the airport while relinquishing their toothpaste without complaint. In Washington State, when the Governor suggested that roadblocks be instituted to catch drunks, the citizenry was up in arms about this 'unnecessary intrusion without just cause', yet body scans and pat-downs of random airline passengers fail to start a travelers' revolt. Based upon factual risk assessment, something is greatly amiss.

Don't get me wrong: the attacks on the WTC and Pentagon were unspeakable acts of evil. But the deaths of the innocents killed by drunk drivers are no less tragic than any other violent death. Each year hundreds of people die in fires, accidents, crimes, and the like. Their terror and suffering, and the tragedy for those left behind, are no less than those of the people who died in the WTC.

Fear is a useful evolutionary facet that was critical to our early survival. But like most things, fear is a double-edged sword. In appropriate doses it protects us from harm and rash decisions; in larger doses it can paralyze us or serve us up to those who use fear as a tool of manipulation and control. Fear is a great motivator, and properly nurtured it can be used to make people do almost anything. Use fear to turn the heat up just a bit at a time and you can boil a lot of frogs. Just ask Karl Rove (spit).

Why are we so obsessed with one event? I think the answer is a combination of cognitive vulnerabilities exploited by masters of manipulation, and poor leadership. The latter two aspects will no doubt be debated, but the first is pretty well described. It is there that I will concentrate my efforts, leaving the others for a later time.


The Availability Heuristic:

We have talked about this one before. The endless repetition of the images of the burning towers and their collapse reinforces this heuristic and makes it seem as if the threat is omnipresent. It's easier for people to recall those images and fears than a picture of a twisted auto after a fatal car crash. People were repelled but could not look away. The coverage provided a sense of intimacy that made people feel as if they were in the heat of the calamity. This sense of vulnerability resulted in a feeling that everyone was at risk – even in small towns that I am sure OBL and company would miss on a geography test.

Neglect of Probability:

The facts speak for themselves: terrorists appear to be a lot less of a threat to your survival than 'Joe six-pack behind the wheel', and factual risk assessment bears this out, at least for now. People have a very poor appreciation of probability and tend to believe that vivid or negative events occur at a much higher rate than they actually do. Ironically, the number of people actually at risk from an event or decision seems to have a minimal effect on decision weight (called scope neglect). This may help explain why we fuss about terrorists and ignore drunk drivers.
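To make the probability neglect concrete, the death counts can be converted into rough per-person annual odds. The death figures come from the post; the U.S. population figure (~300 million, mid-2000s) is an outside assumption, and averaging all terrorism deaths of the period over seven years is a deliberate simplification.

```python
# Rough per-person annual risk comparison (illustrative only).
US_POPULATION = 300_000_000          # mid-2000s U.S. population (assumption)
DRUNK_DEATHS_PER_YEAR = 16000        # from the post
TERROR_DEATHS_2001_2007 = 2973       # all from one event, averaged over 7 years

p_drunk = DRUNK_DEATHS_PER_YEAR / US_POPULATION
p_terror = (TERROR_DEATHS_2001_2007 / 7) / US_POPULATION

print(f"Annual odds of dying via a drunk driver: 1 in {1 / p_drunk:,.0f}")
print(f"Annual odds of dying via terrorism:      1 in {1 / p_terror:,.0f}")
print(f"Relative risk: {p_drunk / p_terror:.0f}x")
```

Under these assumptions the drunk-driver risk comes out dozens of times larger per person per year, yet it is the smaller risk that dominates policy and anxiety.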

Causality and Hindsight Bias:

How many times have we heard this: “There have been no terrorist attacks in this country since 9/11; therefore, our security efforts are worth the cost.” This is a classic logic failure based upon an assumption which may or may not be true. Remember that the claim 'since the WTC bombing of 1993 our security efforts have protected America' could have been made right up to 8:46 am EST, September 11, 2001. The point of this is not to bash the Bush administration but to point out that the explanation for why we haven't been attacked again is multi-factorial, and causality has not been established. Prudence demands that all possible explanations be explored, including the unholy patience of this enemy. What are the dangers of misplaced causality? As it turns out, they may be huge.

Let's examine another vicarious moment in American history: the Challenger explosion and the destruction of the Columbia. Lots of school kids got to watch the Challenger explode, heard possibly the greatest understatement in history (the NASA spokesperson saying 'obviously a major malfunction' as bits of the shuttle blasted in all directions), and saw those bits fall to the sea. The recriminations flew for months and included Richard Feynman's classic o-ring dunking and the dull thud his sample o-ring made on the floor after a few moments in cold water (Rogers Commission, 1986). The o-rings were fixed and the shuttle went back into space. So what was the problem with that? Hindsight bias was strongly at work. Everyone 'knew' that NASA had been lazy or negligent in not fixing the o-rings before the fatal flight. In reality, that is unfair. UNFAIR! Seven astronauts died; how can that be unfair? Because of the effects of hindsight bias. The space shuttle is an engineering marvel. It also has the largest number of single-point failure modes (a single problem or malfunction leading to mission loss) of any machine in existence. Prior to Challenger exploding, o-ring problems were only one of several possible mission-loss risk factors. Without the benefit of hindsight, improving the safety of the shuttle would have required addressing ALL of the risks of equal gravity – something NASA had expressed to Congress and the administration in hopes of additional funding. But absent exploding shuttles, it was not a national priority at the time.

The o-ring problem may have been solved, but another single-point failure mode killed Columbia, leading to another patchwork fix and, ultimately, the quiet decision to terminate the shuttle program.

The moral of the story: hindsight bias can affect future decisions even in cases where causality is determined for a specific instance (we know the o-rings killed Challenger, but concentrating efforts on fixing that problem did not save Columbia). It does this by preventing people from performing a thorough risk assessment and accounting for the true probability and predictability of similar risks. The fact that something happens does not imply that it was reasonably predictable based upon pre-disaster models – the assumption that past events were more predictable than they were is another classic cognitive bias (one that greatly complicates civil liability law). One type of attack may have been thwarted by these airline security measures, but equating that with improved overall security is false, as the Columbia loss demonstrates. How much of a patchwork solution is making you undress for airline security? Hard to tell, but it is interesting that the cargo stored in the hold of the airliner (whose owners required you to roll onto your back and submit before boarding) is still not screened...
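The shuttle argument can be put in numbers. If a system has many independent single-point failure modes, eliminating only the one that already failed barely moves the overall risk. The figures below (50 modes at 1% per flight) are invented purely for illustration; they are not real shuttle reliability data.

```python
# Illustrates why fixing only the observed failure mode barely helps
# when many independent single-point failure modes remain.
def mission_loss_probability(failure_probs):
    """P(at least one failure) assuming independent failure modes."""
    survive = 1.0
    for p in failure_probs:
        survive *= (1.0 - p)
    return 1.0 - survive

modes = [0.01] * 50                           # 50 modes, each 1% per flight (made up)
before = mission_loss_probability(modes)
after = mission_loss_probability(modes[1:])   # "fix" the mode that already failed

print(f"Loss risk with all 50 modes:     {before:.1%}")
print(f"Loss risk after fixing one mode: {after:.1%}")
```

With these toy numbers the overall loss risk drops by well under one percentage point: the hindsight-driven fix feels decisive but leaves the system almost exactly as dangerous as before, which is the Challenger-then-Columbia pattern in miniature.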

Complicating all these discussions of hindsight and causality is the next topic: the risk and effect of exceptionally rare but extremely catastrophic events, or Black Swans.

Black Swans:

Nassim Nicholas Taleb characterized this bias in his book 'The Black Swan'. In it he discusses the difficulty of risk assessment in circumstances where the majority of negative risk comes from one extreme case – i.e. a black swan. Was 9/11 a black swan? Of course it was. The problem is how likely such future events are, how you prevent them, and whether their nature will be predictable from assessments of previous events. Placing too much emphasis on the value of hindsight is dangerous. Fischhoff (1982) described this issue well:
"When we attempt to understand past events, we implicitly test the hypotheses or rules we use both to interpret and to anticipate the world around us. If, in hindsight, we systematically underestimate the surprises that the past held and holds for us, we are subjecting those hypotheses to inordinately weak tests and, presumably, finding little reason to change them."
In other words, we assume that future events will look much like past ones. We assume that we can easily predict them, and so pay little attention to the possibility of similar but nonidentical events – making those events much harder to predict while we live under a false sense of security.

In a wonderful paper on cognitive bias in global risk assessment in 2006, Eliezer Yudkowsky wrote:
“After September 11th, the U.S. Federal Aviation Administration prohibited box-cutters on airplanes. The hindsight bias rendered the event too predictable in retrospect, permitting the angry victims to find it the result of 'negligence' - such as intelligence agencies' failure to distinguish warnings of Al Qaeda activity amid a thousand other warnings. We learned not to allow hijacked planes to overfly our cities. We did not learn the lesson: 'Black Swans occur; do what you can to prepare for the unanticipated.'”
At this point you may be wondering which side of the fence I am on. Am I arguing for restraint in our concerns about terrorism because of the obvious tendency to neglect the true probability of genuine risks such as drunk drivers? Or am I saying that we need to be even more vigilant against major catastrophes because of the unrecognized risk of 'Black Swan' events such as the WTC attack? In fact I support both, but not in the way we have thus far approached the problems. The truth is this: terrorism is a real threat, but it should be incorporated into our assessment of, and response to, the wide range of dangers we face in life. We need a measured response to terrorism in light of the fact that, thus far, the risk is less than many common dangers. By the same token, we need to prepare, as Yudkowsky suggested, for the real possibility of terrorist Black Swan events by preparing as best we can for the unexpected.

What does that mean? First of all, we are unlikely to be able to prevent all such attacks by a resourceful enemy. We should not worry so much about the fact that it will happen again; we should go on with life as best we can with a slight uptick in daily risk. We need to concentrate not on how they might attack us but rather on assessing where such attacks would be most devastating, then take steps to mitigate those risks. We cannot be everywhere at once; we need to concentrate on those areas where the overall risk to the nation as a whole is greatest. What will be hard is that if we are honest in this assessment, we will probably find that limited-loss-of-life situations are less of a threat than certain infrastructure attacks that could impact millions.

We need to make our judgments based not upon irrational fear and emotion but upon careful deliberation. True risk assessment and assumption will make us better able to counter those whose reactionary response is to throw the baby out with the bath water and curtail liberties unnecessarily. Our response should be: “True patriots carefully assess the risks and act as if they have a spine!” Or: “True patriots do not cower in the presence of false risks.” True patriots are not afraid to analyze the nature and rationality of their fears, and to act accordingly as adults.

These are tough calls and preparing in this way ideally will be a thankless job. As Taleb notes in his book:
“It is difficult to motivate people in the prevention of Black Swans... Prevention is not easily perceived, measured, or rewarded; it is generally a silent and thankless activity. Just consider that a costly measure is taken to stave off such an event. One can easily compute the costs while the results are hard to determine. How can one tell its effectiveness, whether the measure was successful or if it just coincided with no particular accident? ... Job performance assessments in these matters are not just tricky, but may be biased in favor of the observed 'acts of heroism'. History books do not account for heroic preventive measures.”
Realizing how we can be manipulated and deceived is the first step in correcting the problem. In closing I will avail myself of one of the four quotes that hang in my office: “The only thing we have to fear is fear itself.” (FDR's first inaugural – talk about leadership in a crisis.) 9/11 didn't change everything – our response to it and our continued obsession with it did. And if that is true, and if, as Shakespeare wrote in Julius Caesar, “the fault... is not in our stars, but in ourselves”, then we can do something to banish this pernicious fear that continues to eat away at everything we hold dear.

A Word of Caution. Cognitive biases are a very real part of human decision-making and can be quite deleterious. But like all things they need to be viewed in moderation, and we should realize that although human decision bias is present in almost all decisions, its relative importance to the final outcome varies from case to case. Some decisions are complex and nuanced and, as such, are vulnerable to cognitive bias; others are straightforward or involve such a preponderance of evidence that bias is a trivial component. Learning the difference is critical.


GearHedEd said...

cogito, ergo sum.

Also, perception is reality.

Pliny-the-in-Between said...

perception is reality.
To a point. But perceived reality is not truth. And if we understand what skews perception away from truth, we can (hopefully) steer it closer and closer to objective reality. Or not ;). Then we truly will be distinguishable from the apes!

Asylum Seeker said...

It's not just terrorism either. Any time a person dies in a violent manner, or more than one person is involved, we care a little bit more about those boring (though far more abundant) deaths from heart disease. We inherently care more about such tragedies according to how much we identify with the victims, how violent the deaths were, how many victims were involved, and just how unique it feels. That availability heuristic... it is an incredible annoyance.

Asylum Seeker said...

Ahem...should be a "than the" between the words "those" "boring". Hate this comment format...

Liz said...

Chris Burns, who's an expert in information management, has studied HOW people make the wrong decisions, including the shuttle disaster, Three Mile Island, the Iraq War, even the Titanic. His book, Deadly Decisions, offers a simple explanation of how the mind constantly fools us into the wrong decision, and why. It's really, really interesting.

Pliny-the-in-Between said...

Thanks, Liz - I'll take a look at it. This is an area of great interest to me because my specific work is in machine cognition. Avoiding a repeat of human cognitive biases and mitigating their effects are critical to these systems.

Thanks for your comment.