Normalcy bias

The normalcy bias, or normality bias, is a mental state people enter when facing a disaster. It causes them to underestimate both the likelihood of a disaster and its possible effects. As a result, individuals may fail to prepare adequately and, on a larger scale, governments may fail to include the populace in their disaster preparations.

The assumption underlying the normalcy bias is that, because a disaster has never occurred before, it never will. The bias can leave people unable to cope with a disaster once it does occur: they have difficulty reacting to something they have not experienced. People also tend to interpret warnings in the most optimistic way possible, seizing on any ambiguity to infer a less serious situation.

The opposite of the normalcy bias is overreaction, or "worst-case thinking" bias,[1][2] in which small deviations from normality are treated as signals of an impending catastrophe.

Effects

The normalcy bias often results in unnecessary deaths in disaster situations. The lack of preparation it produces leads to inadequate shelter, supplies, and evacuation plans; even when these are in place, individuals with a normalcy bias may still refuse to leave their homes.

Normalcy bias can cause people to drastically underestimate the effects of a disaster, so they assume that everything will be all right even as information from radio, television, or neighbors gives them reason to believe there is a risk. This creates a cognitive dissonance that they must then work to eliminate. Some resolve it by refusing to believe new warnings and refusing to evacuate (maintaining the normalcy bias), while others resolve it by escaping the danger. The possibility that some may refuse to evacuate causes significant problems in disaster planning.[3]

Hypothesized cause

The normalcy bias may be caused in part by the way the brain processes new data. Research suggests that even when the brain is calm, it takes 8–10 seconds to process new information. Stress slows the process, and when the brain cannot find an acceptable response to a situation, it fixates on a single and sometimes default solution that may or may not be correct. An evolutionary reason for this response could be that paralysis gives an animal a better chance of surviving an attack; predators are less likely to see prey that is not moving.[4]

Prevention

The negative effects of the normalcy bias can be combated through the four stages of disaster response:[5]

  • preparation, including publicly acknowledging the possibility of disaster and forming contingency plans.
  • warning, including issuing clear, unambiguous, and frequent warnings and helping the public to understand and believe them.
  • impact, the stage at which the contingency plans take effect and emergency services, rescue teams, and disaster relief teams work in tandem.
  • aftermath, reestablishing equilibrium after the fact by providing both supplies and aid to those in need.

Overreaction

The opposite of the normalcy bias is overreaction bias. As regression to the mean would predict, most deviations from normality do not lead to catastrophe, despite regular predictions of doomsday.[6][7] Logically, both underreaction (normalcy bias) and overreaction ("worst-case thinking") are cognitive flaws.

References

  1. Bruce Schneier, "Worst-case thinking makes us nuts, not safe", CNN, May 12, 2010 (retrieved April 18, 2014); reprinted in Schneier on Security, May 13, 2010 (retrieved April 18, 2014)
  2. Dylan Evans, "Nightmare Scenario: The Fallacy of Worst-Case Thinking", Risk Management, April 2, 2012 (retrieved April 18, 2014); from Risk Intelligence: How To Live With Uncertainty, by Dylan Evans, Free Press/Simon & Schuster, Inc., 2012; ISBN 9781451610901
  3. (reference not rendered in source)
  4. (reference not rendered in source)
  5. (reference not rendered in source)
  6. "This is a world where a relatively ordinary, technical, information-technology problem such as the so-called millennium bug was interpreted as a threat of apocalyptic proportions, and where a flu epidemic takes on the dramatic weight of the plot of a Hollywood disaster movie. Recently, when the World Health Organisation warned that the human species was threatened by the swine flu, it became evident that it was cultural prejudice rather than sober risk assessment that influenced much of present-day official thinking." Source: Frank Furedi, "Fear is key to irresponsibility", The Australian, Oct. 9 2010 (retrieved April 18, 2014), ; extracted from the speech, "The Precautionary Principle and the Crisis of Causality," September 18, 2010.
  7. Frank Furedi, "Fear is key to irresponsibility", FrankFuredi.com (retrieved February 18, 2016)