The normalcy bias, or normality bias, is a mental state people enter when facing a disaster. It leads people to underestimate both the possibility of a disaster and its possible effects, because they assume that things will always function the way they normally do. This may result in individuals failing to prepare adequately and, on a larger scale, in governments failing to include the populace in their disaster preparations.

The assumption underlying the normalcy bias is that since one has never personally experienced a disaster, one never will. It can result in the inability of people to cope with a disaster once it occurs. People with a normalcy bias have difficulty reacting to something they have not experienced before. They also tend to interpret warnings in the most optimistic way possible, seizing on any ambiguities to infer a less serious situation. Normalcy bias is essentially a "desire for the status quo."[1]

"With a normalcy bias," writes one observer, "we project current conditions into the future. Normalcy bias is a form of denial where we underestimate the possibility and extent of a looming disaster even when we have incontrovertible evidence that it will happen. We assume that since a disaster never has occurred, then it never will occur. Consequently, we fail to prepare for a disaster and, when it does occur, we may be unable to deal with it."[1]

A famous observation by Patrick Henry can be seen as a reference to the normalcy bias: "We are apt to shut our eyes against a painful truth, and listen to the song of that siren till she transforms us into beasts. Is this the part of wise men, engaged in a great and arduous struggle for liberty? Are we disposed to be of the number of those who, having eyes, see not, and having ears, hear not, the things which so nearly concern their temporal salvation? For my part, whatever anguish of spirit it may cost, I am willing to know the whole truth; to know the worst, and to provide for it."[2]

Normalcy bias has also been called analysis paralysis, the incredulity response, the ostrich effect,[1] and, by first responders, the negative panic.[3] The opposite of normalcy bias is overreaction, or "worst-case scenario" bias,[4][5] in which small deviations from normality are treated as signals of an impending catastrophe.

Extent

About 70% of people reportedly display normalcy bias in disasters.[6]

Examples

"Normalcy bias flows into the brain no matter the scale of the problem," David McRaney has written. "It will appear whether you have days and plenty of warning or are blindsided with only seconds between life and death."[3] It can manifest itself in the case of phenomena, such as car crashes, that occur very frequently, but the average individual experiences only rarely, if ever. It also manifests itself in connection with world-historical events.

Experts in various fields have attributed certain familiar behaviors to normalcy bias. One relationships expert has written that "Lack of a working knowledge of the warning signs of abuse is the main reason millions of women unwittingly enter into dangerous relationships and normalcy bias is a contributing factor to their getting stuck in them."[7] One woman who witnessed a horrific traffic accident was convinced that she "saw a scarecrow fly through the air," and even after discovering that it had been a young man flung from a truck she still had "the image of a scarecrow...imprinted" in her brain, "because humans don’t fly through the air." She recognized this as an example of normalcy bias.[8]

As for world-historical events, the normalcy bias explains why, when the volcano Vesuvius erupted, the residents of Pompeii watched for hours without evacuating. It explains why thousands of people refused to leave New Orleans as Hurricane Katrina approached.[8] Officials at the White Star Line made insufficient preparations to evacuate passengers on the Titanic because they were incapable of imagining a scenario in which this "indestructible" ship could sink. Similarly, experts connected with the Fukushima nuclear power plant were convinced that a multiple-reactor meltdown could never occur.[2]

The normalcy bias also explains why many Jews stayed in Nazi Germany despite increasing discrimination, harassment, and oppression.[8] Eric Sammons has written: "Normalcy bias is prevalent in any crisis. During the rise of Nazi Germany, the idea that Hitler was historically evil was simply too horrible to consider."[9] In his book Wealth, War, and Wisdom, global investment strategist Barton Biggs writes: "By the end of 1935, 100,000 Jews had left Germany, but 450,000 still [remained]. Wealthy Jewish families... kept thinking and hoping that the worst was over... Many of the German Jews, brilliant, cultured, and cosmopolitan as they were, were too complacent. They had been in Germany so long and were so well established, they simply couldn't believe there was going to be a crisis that would endanger them. They were too comfortable. They believed the Nazis' antisemitism was an episodic event and that Hitler's bark was worse than his bite. [They] reacted sluggishly to the rise of Hitler for completely understandable but tragically erroneous reasons. Events moved much faster than they could imagine." Quoting this passage, investment advisor Porter Stansberry commented: "This is one of the most tragic examples of the devastating effects of the 'normalcy bias' the world has ever seen."[10]

A publication for police officers has noted that members of that profession have "all seen videos of officers who were injured or killed while dealing with an ambiguous situation, like the old one of a father with his young daughter on a traffic stop." In the video, "the officer misses multiple threat cues...because the assailant talks lovingly about his daughter and jokes about how packed his minivan is. The officer only seems to react to the positive interactions, while seeming to ignore the negative signals. It's almost as if the officer is thinking, 'Well I've never been brutally assaulted before so it certainly won't happen now.' No one is surprised at the end of the video when the officer is violently attacked, unable to put up an effective defense." This professional failure, notes the publication, is a consequence of normalcy bias.[11]

Normalcy bias, David McRaney has written, "is often factored into fatality predictions in everything from ship sinkings to stadium evacuations." Disaster movies, he adds, "get it all wrong. When you and others are warned of danger, you don’t evacuate immediately while screaming and flailing your arms." McRaney notes that in the book Big Weather, tornado chaser Mark Svenvold discusses "how contagious normalcy bias can be. He recalled how people often tried to convince him to chill out while fleeing from impending doom. He said even when tornado warnings were issued, people assumed it was someone else's problem. Stake-holding peers, he said, would try to shame him into denial so they could remain calm. They didn't want him deflating their attempts at feeling normal."[3]

People who promote conspiracy theories or apocalyptic future scenarios have cited the normalcy bias as a prime reason why others scoff at their pronouncements. For example, survivalists who fear that the U.S. will soon descend into totalitarianism cite normalcy bias as the reason why most Americans do not share their worries.[2] Similarly, fundamentalist Christians use the normalcy bias to explain why others scoff at their beliefs about the "End Times." One fundamentalist website writes: "May we not get blinded by the 'normalcy bias' but rather live with the knowledge that the Lord’s coming is near (James 5:8)."[12][13]

Effects

Normalcy bias has been described as "one of the most dangerous biases we have." It often results in unnecessary deaths in disaster situations. The lack of preparation for disasters often leads to inadequate shelter, supplies, and evacuation plans. Even when all these things are in place, individuals with a normalcy bias often refuse to leave their homes.[1]

Normalcy bias can cause people to drastically underestimate the effects of a disaster. As a result, they believe that everything will be all right even as information from the radio, television, or neighbors gives them reason to believe there is a risk. This creates a cognitive dissonance that they then must work to eliminate. Some manage to eliminate it by refusing to believe new warnings and refusing to evacuate (maintaining the normalcy bias), while others eliminate the dissonance by escaping the danger. The possibility that some may refuse to evacuate causes significant problems in disaster planning.[14]

Hypothesized cause

The normalcy bias may be caused in part by the way the brain processes new data. Research suggests that even when the brain is calm, it takes 8–10 seconds to process new information. Stress slows the process, and when the brain cannot find an acceptable response to a situation, it fixates on a single and sometimes default solution that may or may not be correct. An evolutionary reason for this response could be that paralysis gives an animal a better chance of surviving an attack, since predators are less likely to see prey that is not moving.[15]

Prevention

The negative effects of normalcy bias can be combated through the four stages of disaster response:[16]

  • preparation, including publicly acknowledging the possibility of disaster and forming contingency plans.
  • warning, including issuing clear, unambiguous, and frequent warnings and helping the public to understand and believe them.
  • impact, the stage at which the contingency plans take effect and emergency services, rescue teams, and disaster relief teams work in tandem.
  • aftermath, reestablishing equilibrium after the fact by providing both supplies and aid to those in need.

Various sources have proffered more detailed advice for overcoming normalcy bias. First, be aware of your own normalcy bias and of the need to overcome it. Learn to think for yourself instead of relying on information from authorities or expecting help from the government. Challenge the natural human tendency to avoid unpleasant news. As you go through your day, don't take for granted that everything will be as it always is: be constantly aware of your surroundings so as to avoid being startled by the unexpected. Carry out "thought experiments anticipating possible problems and getting mentally ready for them. What would you do if you were mugged? How would you handle a job loss?...The more you practice this, the greater the likelihood you'll be able to choose between fight or flight rather than freezing up or surrendering." Engage in "situational analysis." Avoid wishful thinking and denial, and cultivate self-honesty, strength, and determination. Develop survival plans, for example an evacuation plan in case of a fire in your home.[1]

Overreaction

The opposite of normalcy bias is overreaction bias. As regression to the mean would suggest, most deviations from normalcy do not lead to catastrophe, despite regular predictions of doomsday.[17][18] Logically, both underreaction (normalcy bias) and overreaction (worst-case thinking) are cognitive flaws.

In popular culture

Free Fall, a novel about Somali pirates by Brad Thor, features the statement that "the fact that even Somalis suffered from normalcy bias made Harvath shake his head."[11]

References

  1. "Beware Your Dangerous Normalcy Bias". Gerold Blog. Retrieved 24 May 2017.
  2. "Titanic and Costa Concordia Examples Reveal Normalcy Bias, Cognitive Dissonance And Denial; Fukushima Mega Nuclear Disaster And Entire Nuclear Industry". A Green Road Journal. Retrieved 23 May 2017.
  3. "Normalcy Bias: Why Nobody is Doing Anything About the State of the Planet". Cassiopaea. Retrieved 23 May 2017.
  4. Schneier, Bruce. "Worst-case thinking makes us nuts, not safe". CNN, May 12, 2010 (retrieved April 18, 2014); reprinted in Schneier on Security, May 13, 2010 (retrieved April 18, 2014).
  5. Evans, Dylan. "Nightmare Scenario: The Fallacy of Worst-Case Thinking". Risk Management, April 2, 2012 (retrieved April 18, 2014); from Risk Intelligence: How To Live With Uncertainty, by Dylan Evans, Free Press/Simon & Schuster, Inc., 2012. ISBN 9781451610901.
  6. Inglis-Arkell, Esther (May 2, 2013). "The frozen calm of normalcy bias". Gizmodo. Retrieved 23 May 2017.
  7. Moss, Anna. "What Is Normalcy Bias and Why Is It So Dangerous?". SelfGrowth. Retrieved 23 May 2017.
  8. "Normalcy Bias: It's All In Your Head". The Survival Mom. Retrieved 23 May 2017.
  9. Sammons, Eric. "Normalcy Bias and Papal Positivism". One Peter 5. Retrieved 23 May 2017.
  10. Elliott, Robin. "Will You Let Normalcy Bias Condemn Your Future?". Robin J. Elliott. Retrieved 23 May 2017.
  11. Smith, Dave. "Normalcy Bias". Police: The Law Enforcement Magazine. Retrieved 23 May 2017.
  12. "The Normalcy Bias". Relevant Bible Teaching. Retrieved 23 May 2017.
  13. Strandberg, Todd. "The Normalcy Bias and Bible Prophecy". Prophezine. Retrieved 23 May 2017.
  14. Oda, Katsuya. "Information Technology for Advancement of Evacuation" (PDF). National Institute for Land and Infrastructure Management. Retrieved 2 November 2013.
  15. Ripley, Amanda (25 April 2005). "How to Get Out Alive". TIME Magazine. Retrieved 11 November 2013.
  16. Valentine, Pamela V.; Smith, Thomas Edward (Summer 2002). "Finding Something to Do: the Disaster Continuity Care Model" (PDF). Brief Treatment and Crisis Intervention. 2 (2): 183–196. Oxford University Press. ISSN 1474-3329. Retrieved 2 November 2013.
  17. Furedi, Frank. "Fear is key to irresponsibility". The Australian, Oct. 9, 2010 (retrieved April 18, 2014); extracted from the speech "The Precautionary Principle and the Crisis of Causality", September 18, 2010. Furedi writes: "This is a world where a relatively ordinary, technical, information-technology problem such as the so-called millennium bug was interpreted as a threat of apocalyptic proportions, and where a flu epidemic takes on the dramatic weight of the plot of a Hollywood disaster movie. Recently, when the World Health Organisation warned that the human species was threatened by the swine flu, it became evident that it was cultural prejudice rather than sober risk assessment that influenced much of present-day official thinking."
  18. Furedi, Frank. "Fear is key to irresponsibility". FrankFuredi.com. Accessed 2016-02-18.