We’re hard-wired not to change our minds

2016 Olympian Board of Contributors Rachel Burke

What does it take to change someone’s mind about something that’s just not true?

Brendan Nyhan and Jason Reifler have been studying this question for more than a decade, using real-world cases in which thousands of us have persisted in believing things that have been proved false — that Iraq held weapons of mass destruction, that the Affordable Care Act would result in “death panels” denying health care to elderly people, and that Mitt Romney, as CEO of Bain Capital, shipped jobs overseas.

The clearest example may be their work around the popularly held belief that the measles-mumps-rubella vaccine is linked to autism, a claim made by a single, long-discredited study. Nyhan, Reifler and their research partners surveyed over 2,000 parents; most received one of the following: (1) materials from the Centers for Disease Control (CDC) correcting the falsehood; (2) a pamphlet describing the dangers of measles, mumps, and rubella; (3) pictures of children who have these illnesses; or (4) a mother’s firsthand story about how her baby almost died from measles. A control group received no materials.

The results: None of these approaches made parents who were opposed to vaccines more likely to vaccinate their kids. And while the CDC materials did change some parents’ minds about the autism link, those same parents became even less inclined to vaccinate.

Arguing that something is not true — even when the facts are on your side — may make someone more convinced that it is true. Nyhan and Reifler call this the “backfire effect.” And those who are most susceptible to it are the people who are most educated — and invested — in an issue.

Why? Because, as rational as we may think we are, our emotions play a much bigger, and much quicker, role in our decision-making than our reasoning does. In fact, we respond on a feeling level before our conscious mind even kicks in.

Writer Chris Mooney, who explores Nyhan and Reifler’s work, puts it this way: “Our ‘reasoning’ is a means to a predetermined end — winning our ‘case’ — and is shot through with biases … we give greater heed to evidence and arguments that bolster our beliefs … and we try to debunk arguments we don’t agree with.” Mooney paraphrases psychologist Jonathan Haidt: “We may think we’re being scientists, but we’re actually being lawyers.”

Now Nyhan and Reifler are testing the theory that we’re even more inclined to hold on to a false belief if it threatens our sense of self.

This tendency to selectively use information that supports our beliefs is reinforced when we use ideological news sources, like Fox News or MSNBC. And with social media, it’s amplified even more. We cherry-pick quotes or parts of articles from web-based news sources — the ones we trust because they reflect our beliefs — building a foundation on “facts” that support our biases.

Nyhan and Reifler have published recommendations for journalists to avoid reinforcing false beliefs. Ultimately, though, they speculate that “naming and shaming” politicians and sources who promote falsehoods may be more effective: it could build a consensus, among the people most invested in and conversant with social issues, that insists upon the truth.

But what about communication with those closest to us, the friends in our backyards or — perhaps especially — our Facebook friends? With social media as a platform for communication and presentation of self, the personal and political are intertwined: “This is who I am,” we say — or at least who we want to be.

Maria Konnikova, in an article about Nyhan and Reifler’s work, concludes:

Facts and evidence, for one, may not be the answer everyone thinks they are: They simply aren’t that effective … Instead, why not focus on presenting issues in a way that keeps broader notions out of it — messages that are not political, not ideological, not in any way a reflection of who you are?

Perhaps our discussions on Facebook, at coffee shops, and in our homes would be less likely to reach an impasse if we took this advice. And ultimately, if we want to be an informed society, one that relies on reason at least as much as feeling, we will need to learn to listen more, to allow ourselves to be a little more uncomfortable and explore alternative explanations for what we believe is true.

Rachel Burke is a member of The Olympian’s 2016 Board of Contributors. She can be reached via rachelburke1515@gmail.com.