Have you ever noticed that when you present people with facts that are contrary to their deepest held beliefs they always change their minds? Me neither. In fact, people seem to double down on their beliefs in the teeth of overwhelming evidence against them. The reason is related to the worldview perceived to be under threat by the conflicting data.
Creationists, for example, dispute the evidence for evolution in fossils and DNA because they are concerned about secular forces encroaching on religious faith. Anti-vaxxers distrust big pharma and think that money corrupts medicine, which leads them to believe that vaccines cause autism despite the inconvenient truth that the one and only study claiming such a link was retracted and its lead author accused of fraud. The 9/11 truthers focus on minutiae like the melting point of steel in the World Trade Center buildings that caused their collapse because they think the government lies and conducts "false flag" operations to create a New World Order. Climate deniers study tree rings, ice cores and the ppm of greenhouse gases because they are passionate about freedom, especially that of markets and industries to operate unencumbered by restrictive government regulations. Obama birthers desperately dissected the president's long-form birth certificate in search of fraud because they believe that the nation's first African-American president is a socialist bent on destroying the country.
In these examples, proponents' deepest held worldviews were perceived to be threatened by skeptics, making facts the enemy to be slayed. This power of belief over evidence is the result of two factors: cognitive dissonance and the backfire effect. In the classic 1956 book When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a UFO cult when the mother ship failed to arrive at the appointed time. Instead of admitting error, "members of the group sought frantically to convince the world of their beliefs," and they made "a series of desperate attempts to erase dissonance by making prediction after prediction in the hope that one would come true." Festinger called this cognitive dissonance, or the uncomfortable tension that comes from holding two conflicting thoughts simultaneously.
Two social psychologists, Carol Tavris and Elliot Aronson (a former student of Festinger), in their 2007 book Mistakes Were Made (But Not by Me) document thousands of experiments demonstrating how people spin-doctor facts to fit preconceived beliefs to reduce dissonance. Their metaphor of the "pyramid of choice" places two individuals side by side at the apex of the pyramid and shows how quickly they diverge and end up at the bottom opposite corners of the base as they each stake out a position to defend.
In a series of experiments by Dartmouth College professor Brendan Nyhan and University of Exeter professor Jason Reifler, the researchers identify a related factor they call the backfire effect, "in which corrections actually increase misperceptions among the group in question." Why? "Because it threatens their worldview or self-concept." For example, subjects were given fake newspaper articles that confirmed widespread misconceptions, such as that there were weapons of mass destruction in Iraq. When subjects were then given a corrective article that WMD were never found, liberals who opposed the war accepted the new article and rejected the old, whereas conservatives who supported the war did the opposite ... and more: they reported being even more convinced there were WMD after the correction, arguing that this only proved that Saddam Hussein hid or destroyed them. In fact, Nyhan and Reifler note, among many conservatives "the belief that Iraq possessed WMD immediately before the U.S. invasion persisted long after the Bush administration itself concluded otherwise."
If corrective facts only make matters worse, what can we do to convince people of the error of their beliefs? From my experience: 1. keep emotions out of the exchange, 2. discuss, don't attack (no ad hominem and no ad Hitlerum), 3. listen carefully and try to articulate the other position accurately, 4. show respect, 5. acknowledge that you understand why someone might hold that opinion, and 6. try to show how changing facts does not necessarily mean changing worldviews. These strategies may not always work to change people's minds, but now that the nation has just been put through a political fact-check wringer, they may help reduce unnecessary divisiveness.