Resisting the Backfire Effect is Futile, but Victory is in Sight
By Nick Adams with Eric Wimsatt
You shout in frustration at people you once considered friends. Your workplace divides into ‘conservative’ and ‘liberal’ factions. Your family argues over which news station is “fake”. You’ve lost all sympathy for people who support the politicians you don’t like. And people you once thought reasonable are touting crazy conspiracy theories.
What’s going on? And is there any way out of this mess?
As social scientists, we see clearly what’s happening. We are experiencing a deluge of misinformation combining with a human cognitive tendency that psychologists call the “Backfire Effect.”
When it comes to unwinding the damage done by misinformation, there are few challenges as formidable as the backfire effect. This is a term, detailed by Brendan Nyhan and Jason Reifler in 2010, that describes people’s psychological tendency to dig in their heels and defend beliefs—including obviously false ones—even more vigorously when they are presented with evidence that contradicts them. It’s so frustrating when you encounter it—usually while arguing with someone who is spewing crazy theories—that it’s enough to make you throw up your hands. If clear evidence won’t change one person’s mind, how can we hope our divided nation will ever share a common view of reality?
We see the backfire effect everywhere. It is the basis for much of the commentary on partisan news outlets and talk shows. And we’ve all experienced it—probably as both perpetrator and frustrated victim. Frighteningly, many theories suggest the reason the backfire effect is so pernicious and ubiquitous is that it is a feature of human cognition—not a bug.
To understand how the backfire effect operates, we first need to understand how beliefs take root.
We believe in each other more than ‘the truth’
Western science, for the last few hundred years, has bought into the notion that there is an ‘objective’ reality we can all understand. Most of us believe that understanding how objective reality works––the laws of physics, the workings of chemistry, biology, etc.–– is beneficial. But, for a social species like ours, a substantial portion of any belief’s value is also determined by its popularity. When we agree with our neighbors, we live in harmony. When we don’t, friction occurs.
Holding and espousing beliefs—especially false ones—can be a very effective way to show your loyalty to a tribe. And belonging to a tribe is important to humans. It’s where we get resources—social, physical, and economic. Failure to align with our group’s beliefs can cause us to be marginalized or cast out, a fate—for humans—that is arguably worse than death. Irrationally optimistic beliefs can also create opportunities within tribes while ‘realistic’ pessimism tends to curdle synchronicities. Even outside of social contexts, erroneously believing every rustle of the wind is a threatening predator can keep a family alive in a dangerous environment.
Moreover, doubt requires energy. For daily survival, we operate at optimal efficiency when we let the sleeping dogs of erroneous belief lie. Constantly examining, interrogating, and validating our belief systems is exhausting. Our brains usually try to ‘satisfice’––to do a sufficient and satisfactory job of keeping us moving. Unless a belief’s truth or falsity is immediately relevant, we have little incentive to revise it.
Beliefs are stubborn for the same reason people are stubborn. Managing our identity—tribe, belief systems, etc.—requires less effort if we don’t change anything we don’t have to. In this, our identities function like other biological and engineered systems. Managing our identity is like managing our thermostat. To keep the proper temperature, you don’t keep your hand on the HVAC controls all the time, making micro-adjustments second by second. You set a desirable temperature range so the system will heat or cool when the temperature is lower or higher than that range. Similarly, with our identities, we develop a sense of the behaviors and beliefs that fit our normal range and remain at peace until we face a situation where our behaviors or beliefs threaten that range. In those moments, we might say we are beside ourselves, outside of ourselves, or simply not ourselves.
Why you are fighting a losing battle on Facebook
Trying to get someone to change their mind via Facebook debate is usually a waste of time. Unless that person has staked their identity on being circumspect and open to new information (as many scientists have), they won’t change their mind. When hard pressed, they’ll change the subject instead, shout their group’s slogan or—as the “backfire effect” predicts—reaffirm their original position with more force. To ask someone to change their mind, unless they consider intellectual flexibility a key component of their identity, is like asking them to manually adjust the heat or air conditioning. It’s a pain. They might enjoy tinkering occasionally, when they have time. But they don’t want it sprung on them, certainly not in public view by someone who is putting them on the spot.
Most of us, too, are a lot happier attempting to adjust someone else’s heater—and we seem more skilled at it. As Lord, Ross, and Lepper found in a 1979 study, people are more likely to find logical and other methodological errors in articles they disagree with. This corollary to confirmation bias, which predisposes us to credulously believe facts in line with our belief systems, rounds out a psychology built to resist change.
Can you argue with the backfire effect?
What then should we do when we are faced with something that is wrong—disproven, made up, or dangerous—on the internet? The question doesn’t stem from mere petty annoyance. Our social-media-amplified misinformation problems have shifted major election outcomes, caused grievous harm, and had potentially historic consequences. The situation is dire. And we can’t just give up.
But fighting human psychology is not the answer either. Reacting with resistance mobilizes the willfulness of those we resist. People just dig in their heels deeper. They believe harder. But maybe there is a way to keep people from forming so many bogus beliefs in the first place. Maybe this is a situation where an ounce of prevention is worth a pound of cure.
History provides some reason for hope. Though it is very hard to quickly change people’s beliefs by arguing with them, education works. For the last few decades throughout democratic societies, public schools have explicitly taught the value of inclusion and discouraged identities built around the hatred of historically disadvantaged populations. This shift to a more humane regard for one another has trickled up to parents, who are motivated to maintain identities seen as good and respectable by their children, and we’ve seen public opinion shift as a result. We’ve even witnessed politicians and religious leaders whose livelihoods and identities depended on not changing their minds come around to support ideas they built their careers opposing. Sea changes like these—and others, like the near abolition of slavery and the winning of women’s suffrage—have required society-wide efforts over many decades. But none of these shifts came about because ‘leaders’ lost public debates and changed their minds.
Now our mass society is closer than ever to providing individual freedom and safety while maintaining the social fabric that allows us all to coordinate and cooperate when we wish—the dream of democratic republicanism. That fabric feels pretty sullied and threadbare right now. But our current challenges are small in historical context. When we faced down slavery, we were up against a coherent (and terrible) set of beliefs that underpinned an entire international economy. Right now, we are up against many thousands of uncoordinated falsehoods, exaggerations, and erroneous inferences pulling people in different directions and polluting what should be our common ground. It’s a crummy situation. But we are bigger than this—far bigger.
At Goodly Labs, we know that solving problems requires understanding them. We can’t expect success simply by joining forces with yesteryear’s victors or applying their previously successful approaches––whether those were lobbying campaigns, street protests, or sit-ins––to today’s problems. When we understand how human psychology works to confirm and protect a person’s beliefs, we realize that debating individual ideas is a losing proposition. In a sense, resistance is futile.
However, a real solution to the problems of misinformation—one that respects and avoids the powerful backfire effect—is ready for deployment. It’s called Public Editor and it could be justifiably described as civilization’s first B.S. detector. We believe it will work because it has been designed with a thorough understanding of how beliefs form, how their value is substantially socially determined, and how difficult beliefs are to uproot once they have taken hold.
Strictly speaking, Public Editor doesn’t try to directly counteract or solve the backfire effect. It does not, for instance, provide ways to directly change people’s minds on social media. Most people’s social psychological systems would immediately shut down any such intervention. Instead, Public Editor works by introducing doubt around erroneous claims in the first moment a newsreader encounters them, preventing false beliefs from taking root in the first place. This approach works because humans typically put new ideas through a review process before we adopt them as beliefs. We compare any new idea with our existing beliefs, revise it to fit, and rehearse it over a number of weeks as we (consciously and unconsciously) contemplate how we would apply it across contexts or share it with others. Research suggests that this integration and rehearsal period is vital to belief adoption, even finding that the backfire effect is stronger a few weeks after a new idea is introduced than in the moments immediately after. Once beliefs are well-integrated, they are very difficult to relinquish.
From a newsreader’s perspective, Public Editor is easy to appreciate. Its browser extension decorates any news article—as you read it—with labels pointing out where authors or sources have committed cognitive errors, argumentative fallacies, inferential mistakes, or errors in language or tone. Reading in this context, you can easily dismiss erroneous information before it becomes lodged in your mind. Public Editor will help you avoid the leaps of logic, rhetorical flourishes, and bogus conclusions that often get propagated through social media.
For people who consider themselves open-minded or who want to be part of the solution, Public Editor provides a tool more effective than argument for spreading the behavioral norms that support discernment. By providing volunteers with tasks and badges and by training and rewarding people for their evaluations of news content, Public Editor inspires us to identify with and perform the virtues of ‘discernment’ and ‘neutrality.’ And, finally, Public Editor offers newsreaders a supportive community to join—an alternative to the political tribes many of us are weary of identifying with.
(If any wealthy foundations, governments, or technology companies reading this can scratch together a few million dollars to deploy it at national scale, we could use the help. So far, it has been invented and developed by a plucky team of student-debt-laden non-profit data science volunteers.)
Prevention is the cure
The “backfire effect” is real and formidable. And there is little use fighting it. But with Public Editor we have a tool to short-circuit it—to prevent faulty ideas from taking root in the first place. It’s a system that runs on volunteer labor, so it also engages people in a community where they can gain social support for developing virtues we need more of in our information-heavy society.
Resist the urge to unfriend the people you’re arguing with. Instead, come help Public Editor stop the lies from spreading to their minds in the first place.