Earlier this year, Tom Stafford wrote a column for the BBC's website about how to combat confirmation bias. In other words, how do we avoid the problem of searching for and relying on data that confirm what we already believe (while dismissing or avoiding data that contradict our pre-existing beliefs)?
Stafford recalls a famous set of experiments by Charles Lord, Lee Ross, and Mark Lepper. In a classic 1979 study, they examined how people's attitudes toward the death penalty changed after they read two contrasting studies - one demonstrating a powerful deterrent effect of the death penalty, the other showing the opposite. Lord, Ross, and Lepper found that people's attitudes polarized after reading the two studies. Why? Participants assimilated the data in a biased way, relying heavily on the information that supported their pre-existing beliefs.
Stafford describes a follow-up experiment that these scholars conducted. In this subsequent research, they compared two strategies for combating confirmation bias: participants received one of two different sets of instructions before examining the data. Stafford summarizes what the researchers discovered:
For their follow-up study, Lord and colleagues re-ran the biased assimilation experiment, but testing two types of instructions for assimilating evidence about the effectiveness of the death penalty as a deterrent for murder. The motivational instructions told participants to be "as objective and unbiased as possible", to consider themselves "as a judge or juror asked to weigh all of the evidence in a fair and impartial manner". The alternative, cognition-focused, instructions were silent on the desired outcome of the participants' consideration, instead focusing only on the strategy to employ: "Ask yourself at each step whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue." So, for example, if presented with a piece of research that suggested the death penalty lowered murder rates, the participants were asked to analyse the study's methodology and imagine the results pointed the opposite way.
They called this the "consider the opposite" strategy, and the results were striking. Instructed to be fair and impartial, participants showed the exact same biases when weighing the evidence as in the original experiment. Pro-death penalty participants thought the evidence supported the death penalty. Anti-death penalty participants thought it supported abolition. Wanting to make unbiased decisions wasn't enough. The "consider the opposite" participants, on the other hand, completely overcame the biased assimilation effect – they weren't driven to rate the studies which agreed with their preconceptions as better than the ones that disagreed, and didn't become more extreme in their views regardless of which evidence they read.