Debiasing


Debiasing is the reduction of bias, particularly with respect to judgment and decision making. Biased judgment and decision making is that which systematically deviates from the prescriptions of objective standards such as facts, logic, and rational behavior, or from prescriptive norms. Biased judgment and decision making occurs in consequential domains such as medicine, law, policy, and business, as well as in everyday life. Investors, for example, tend to hold onto falling stocks too long and sell rising stocks too quickly. Employers exhibit considerable discrimination in hiring and employment practices, and many parents continue to believe that vaccinations cause autism even though the evidence for this link has been shown to be falsified.[1] At an individual level, people who exhibit less decision bias have more intact social environments, reduced risk of alcohol and drug use, lower childhood delinquency rates, and superior planning and problem-solving abilities.[2]

Debiasing can occur within the decision maker. For example, a person may learn or adopt better strategies by which to make judgments and decisions.[1][3] Debiasing can also occur as a result of changes in external factors, such as changing the incentives relevant to a decision or the manner in which the decision is made.[4]

There are three general approaches to debiasing judgment and decision making, and to reducing the costly errors with which biased judgment and decision making is associated: changing incentives, nudging, and training. Each approach has strengths and weaknesses. For more details, see Morewedge and colleagues (2015).[1]

General Approaches

Incentives

Changing incentives can be an effective means of debiasing judgment and decision making. This approach derives from economic theories suggesting that people act in their self-interest by seeking to maximize their utility over their lifetime. Many decision making biases may persist simply because they are more costly to eliminate than to ignore.[5] Making people more accountable for their decisions (increasing incentives), for example, can increase the extent to which they invest cognitive resources in making decisions, leading to less biased decision making when people generally have an idea of how a decision should be made.[6] Incentives can also be calibrated to shift preferences toward more beneficial behavior. Price cuts on healthy foods increase their consumption in school cafeterias,[7] and soda taxes appear to reduce soda consumption by the public. People are often willing to use incentives to change their own behavior by means of a commitment device. Shoppers, for example, were willing to forgo a cash-back rebate on healthy food items if they did not increase the percentage of healthy foods in their shopping baskets.[8]

Incentives do not always debias and improve decision making. If people are using a suboptimal strategy by which to make judgments and decisions, incentives can exacerbate bias.[6] Incentives can backfire when they are miscalibrated or are weaker than social norms that were preventing undesirable behavior. Large incentives can also lead people to choke under pressure.[9]

Nudges

Nudges, changes in how information is presented or in the manner by which judgments and decisions are elicited, are another means of debiasing. People may choose healthier foods if they are better able to understand their nutritional contents,[10] and may choose lower-calorie meals if they are explicitly asked whether they would like to downsize their side orders.[11] Other examples of nudges include changing the default option to which people are assigned if they do not choose an alternative, placing a limit on the serving size of soda, or automatically enrolling employees in a retirement savings program.

Training

Training can effectively debias decision makers over the long term.[1][12][13] Training has, to date, received less attention from academics and policy makers than incentives and nudges because initial debiasing training efforts met with mixed success (see Fischhoff, 1982, in Kahneman et al.[14]). Decision makers can be effectively debiased through training in specific domains. For example, experts can be trained to make very accurate decisions when decision making entails recognizing patterns and applying appropriate responses, in domains such as firefighting, chess, and weather forecasting. Evidence of more general debiasing, across domains and different kinds of problems, however, was not found until recently. The lack of more domain-general debiasing was attributed to experts failing to recognize the underlying "deep structure" of problems presented in different formats and domains. Weather forecasters are able to predict rain with high accuracy, for example, but show the same overconfidence in their answers to basic trivia questions as other people. An exception was graduate training in scientific fields heavily reliant on statistics, such as psychology.[15]

Experiments by Morewedge and colleagues (2015) found that interactive computer games and instructional videos can produce long-term debiasing at a general level. In a series of experiments, training with interactive computer games that provided players with personalized feedback, mitigating strategies, and practice reduced six cognitive biases by more than 30% immediately and by more than 20% as long as three months later. The biases reduced were anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness.[1][12]

(Sometimes Effective) Debiasing Strategies

Incentives

  • Paying people for optimal behavior through bonuses or by providing discounts (e.g., to exercise, to take their medication, to trade in fuel inefficient vehicles such as the "cash for clunkers" program).[16]
  • Taxing people for suboptimal behavior (e.g., drinking soda, smoking tobacco, and using marijuana).

Nudges

  • Using (or implying) default options that are optimal for the decision maker or society.
  • Commitment devices that make it more costly to make suboptimal decisions (e.g., Schwartz et al., 2014[8]).
  • Reframing choice options in ways that make important attributes salient. Labeling hamburger meat 25% fat, for example, makes people more sensitive to fat content than labeling it 75% lean.
  • Presenting information in formats that make critical information easier to evaluate, such as displaying nutritional value using a "traffic light" system.[10]

Training

  • Providing people with personalized feedback regarding the direction and degree to which they exhibit bias.[1]
  • Teaching a "consider-the-alternative" strategy, such as considering a plausible alternative cause for an event other than the cause one suspects.[17]
  • Teaching people statistical reasoning and normative rules of which they are unaware.[15]
  • Encouraging people to take the perspective of a person who will experience the consequences of their decision. Participants who were shown an image of their own face "morphed" to resemble how they might look at retirement age were more likely to save money for the future rather than elect to receive it in the present.[18]
