By Nick Adams with Eric Wimsatt
You shout in frustration at people you once considered friends. Your workplace divides into ‘conservative’ and ‘liberal’ factions. Your family argues over which news station is “fake”. You’ve lost all sympathy for people who support the politicians you don’t like. And people you once thought reasonable are touting crazy conspiracy theories.
What’s going on? And is there any way out of this mess?
As social scientists, we see clearly what’s happening. We are experiencing a deluge of misinformation combined with a human cognitive tendency that psychologists call the “backfire effect.”
When it comes to unwinding the damage done by misinformation, there are few challenges as formidable as the backfire effect. This is a term, detailed by Brendan Nyhan and Jason Reifler in 2010, that describes people’s psychological tendency to dig in their heels and defend beliefs—including obviously false ones—even more vigorously when they are presented with evidence that contradicts them. It’s so frustrating when you encounter it—usually while arguing with someone who is spewing crazy theories—that it’s enough to make you throw up your hands. If clear evidence won’t change one person’s mind, how can we hope our divided nation will ever share a common view of reality?
We see the backfire effect everywhere. It is the basis for much of the commentary on partisan news outlets and talk shows. And we’ve all experienced it—probably as both perpetrator and frustrated victim. Frighteningly, many theories suggest the reason the backfire effect is so pernicious and ubiquitous is that it is a feature of human cognition—not a bug.
To understand how the backfire effect operates, we first need to understand how beliefs take root.
We believe in each other more than ‘the truth’
Western science, for the last few hundred years, has bought into the notion that there is an ‘objective’ reality we can all understand. Most of us believe that understanding how objective reality works––the laws of physics, the workings of chemistry, biology, etc.––is beneficial. But, for a social species like ours, a substantial portion of any belief’s value is also determined by its popularity. When we agree with our neighbors, we live in harmony. When we don’t, friction occurs.
Holding and espousing beliefs—especially false ones—can be a very effective way to show your loyalty to a tribe. And belonging to a tribe is important to humans. It’s where we get resources—social, physical, and economic. Failure to align with our group’s beliefs can cause us to be marginalized or cast out, a fate—for humans—that is arguably worse than death. Irrationally optimistic beliefs can also create opportunities within tribes while ‘realistic’ pessimism tends to curdle synchronicities. Even outside of social contexts, erroneously believing every rustle of the wind is a threatening predator can keep a family alive in a dangerous environment.
Moreover, doubt requires energy. For daily survival, we operate at optimal efficiency when we let the sleeping dogs of erroneous belief lie. Constantly examining, interrogating, and validating our belief systems is exhausting. Our brains usually try to ‘satisfice’––to do a sufficient and satisfactory job of keeping us moving. Unless a belief’s truth or falsity is immediately relevant, we have little incentive to revise it.
Beliefs are stubborn for the same reason people are stubborn. Managing our identity—tribe, belief systems, etc.—requires less effort if we don’t change anything we don’t have to. In this, our identities function like other biological and engineered systems. Managing our identity is like managing our thermostat. To keep the proper temperature, you don’t keep your hand on the HVAC controls all the time, making micro-adjustments second by second. You set a desirable temperature range so the system will heat or cool when the temperature is lower or higher than that range. Similarly, with our identities, we develop a sense of the behaviors and beliefs that fit our normal range and remain at peace until we face a situation where our behaviors or beliefs threaten that range. In those moments, we might say we are beside ourselves, outside of ourselves, or simply not ourselves.
Why you are fighting a losing battle on Facebook
Trying to get someone to change their mind via Facebook debate is usually a waste of time. Unless that person has staked their identity on being circumspect and open to new information (as many scientists have), they won’t change their mind. When hard pressed, they’ll change the subject instead, shout their group’s slogan or—as the ‘backfire effect’ predicts—reaffirm their original position with more force. To ask someone to change their mind, unless they consider intellectual flexibility a key component of their identity, is like asking them to manually adjust the heat or air conditioning. It’s a pain. They might enjoy tinkering occasionally, when they have time. But they don’t want it sprung on them, certainly not in public view by someone who is putting them on the spot.
Most of us, too, are a lot happier attempting to adjust someone else’s heater—and we seem more skilled at it. As Lord, Ross, and Lepper found in a 1979 study, people are more likely to find logical and other methodological errors in articles they disagree with. This corollary to confirmation bias, which predisposes us to credulously believe facts in line with our belief systems, rounds out a psychology built to resist change.
Can you argue with the backfire effect?
What then should we do when we are faced with something that is wrong—disproven, made up, or dangerous—on the internet? The question doesn’t stem from mere petty annoyance. Our social-media-amplified misinformation problems have shifted major election outcomes, caused grievous harm, and had potentially historic consequences. The situation is dire. And we can’t just give up.
But fighting human psychology is not the answer either. Reacting with resistance mobilizes the willfulness of those we resist. People just dig in their heels deeper. They believe harder. But maybe there is a way to keep people from forming so many bogus beliefs in the first place. Maybe this is a situation where an ounce of prevention is worth a pound of cure.
History provides some reason for hope. Though it is very hard to quickly change people’s beliefs by arguing with them, education works. For the last few decades throughout democratic societies, public schools have explicitly taught the value of inclusion and discouraged identities built around the hatred of historically disadvantaged populations. This shift to a more humane regard for one another has trickled up to parents, who are motivated to maintain identities seen as good and respectable by their children, and we’ve seen public opinion shift as a result. We’ve even witnessed politicians and religious leaders whose livelihoods and identities depended on not changing their minds come around to support ideas they built their careers opposing. Sea changes like these—and others, like the near abolition of slavery and the winning of women’s suffrage—have required society-wide efforts over many decades. But none of these shifts came about because ‘leaders’ lost public debates and changed their minds.
Now our mass society is closer than ever to providing individual freedom and safety while maintaining the social fabric that allows us all to coordinate and cooperate when we wish—the dream of democratic republicanism. That fabric feels pretty sullied and threadbare right now. But our current challenges are small in historical context. When we faced down slavery, we were up against a coherent (and terrible) set of beliefs that underpinned an entire international economy. Right now, we are up against many thousands of uncoordinated falsehoods, exaggerations, and erroneous inferences pulling people in different directions and polluting what should be our common ground. It’s a crummy situation. But we are bigger than this—far bigger.
At Goodly Labs, we know that solving problems requires understanding them. We can’t expect success simply by joining forces with yesteryear’s victors or applying their previously successful approaches––whether those were lobbying campaigns, street protests, or sit-ins––to today’s problems. When we understand how human psychology works to confirm and protect a person’s beliefs, we realize that debating individual ideas is a losing proposition. In a sense, resistance is futile.
However, a real solution to the problems of misinformation—one that respects and avoids the powerful backfire effect—is ready for deployment. It’s called Public Editor and it could be justifiably described as civilization’s first B.S. detector. We believe it will work because it has been designed with a thorough understanding of how beliefs form, how their value is substantially socially determined, and how difficult beliefs are to uproot once they have taken hold.
Strictly speaking, Public Editor doesn’t try to directly counteract or solve the backfire effect. It does not, for instance, provide ways to directly change people’s minds on social media. Most people’s social psychological systems would immediately shut down any such intervention. Instead, Public Editor works by introducing doubt around erroneous claims in the first moment a newsreader encounters them, preventing false beliefs from taking root in the first place. This approach works because humans typically put new ideas through a review process before we adopt them as beliefs. We compare any new idea with our existing beliefs, revise it to fit, and rehearse it over a number of weeks as we (consciously and unconsciously) contemplate how we would apply it across contexts or share it with others. Research suggests that this integration and rehearsal period is vital to belief adoption, even finding that the backfire effect is stronger a few weeks after a new idea is introduced than in the moments immediately after. Once beliefs are well-integrated, they are very difficult to relinquish.
From a newsreader’s perspective, Public Editor is easy to appreciate. Its browser extension decorates any news article—as you read it—with labels pointing out where authors or sources have committed cognitive errors, argumentative fallacies, inferential mistakes, or errors in language or tone. Reading in this context, you can easily dismiss the erroneous information before it becomes lodged in your mind. Public Editor will help you avoid the leaps of logic, rhetorical flourishes, and bogus conclusions that often get propagated through social media.
For people who consider themselves open-minded or who want to be part of the solution, Public Editor provides a tool more effective than argument for spreading behavioral norms that support discernment. By providing volunteers with tasks and badges and by training and rewarding people for their evaluations of news content, Public Editor inspires us to identify with and perform the virtues of ‘discernment’ and ‘neutrality.’ And, finally, Public Editor offers newsreaders a supportive community to join—an alternative to the political tribes many of us are weary of identifying with.
(If there are wealthy foundations, governments, and technology companies reading this who might be able to scratch together a few million dollars to scale it up for use at a national scale, we could use the help. So far, it has been invented and developed by a plucky team of student-debt-laden non-profit data science volunteers.)
Prevention is the cure
The “backfire effect” is real and formidable. And there is little use fighting it. But with Public Editor we have a tool to short circuit it, to prevent faulty ideas from taking root in the first place. It’s a system that runs on volunteer labor, so it also helps engage people in a community where they can gain social support for developing virtues we need more of in our information-heavy society.
Resist the urge to unfriend the people you’re arguing with. Instead, come help Public Editor stop the lies from spreading to their minds in the first place.
Simulation -- a YouTube channel dedicated to 're-birthing the public intellectual' -- spoke to our own Dr. Nick Adams about everything from the decreasing role of humans in our economy to the role of misinformation in politics to the ways social science and data can predict violence. He explains Public Editor, how it works, and how it might be the world's first BS detector. We cued it up so you can quickly learn about Public Editor. But you might enjoy listening to the entire conversation.
By Nick Adams, Ph.D.
One of the world’s eminent interviewers –– Dave Edmonds –– kicks Public Editor’s tires.
I was repeatedly told that getting an interview with Dave Edmonds was a special thing. He has chopped it up with the world’s leading philosophers and social scientists, and intellectual audiences have been loving his probing sit-downs for years. But no mere verbal missive could have prepared me for the ride that was our hour-long conversation in London.
You should listen to the podcast, boiled down to a potent 16 minutes, here. (Find the little link at the bottom that says ‘Direct download’.)
It was a blast!
Dave is obviously brilliant, as you will hear. His questions often come as shockingly succinct and pregnant descriptions of our two years of work, followed by a deep yet clean incision with his skillful scalpel of doubt.
It took all of my capacity –– built up from years of high school debate, seminar sparring, conference Q&A parrying, board room jiu-jitsu, and zen practice –– to meet him at his level. And it was a total thrill. He forced me to play my ‘A game’ as project evangelist despite significant jet lag, and he extracted for his audience an incredibly tight, dense, and quickly syncopated explanation and exploration of Public Editor and its promise.
So have a listen. If you’re a nerd like me, I bet you’ll like it. And be sure to check out more of Dave Edmonds’ masterful work distilling the best thinking from some of the world’s top minds, here and here.
By Nick Adams, Ph.D.
After a couple years of working with our heads down, Public Editor is show ready — and the power geeks of tech and social science are beginning to take notice.
I was very fortunate, recently, to sit down with Josh Tucker — the NYU professor behind tons of amazing computational social science (including studies identifying Russian bots, field-defining reports of the false news research space, and the discovery that fake news is mostly spread by the baby boomers, just to name a few). In addition to leading NYU’s SMaPP Lab (Social Media and Political Participation Lab), Josh is one of the co-founders of the Monkey Cage blog, a haven for political science geeks (including me) which was incorporated into the Washington Post a few years back.
Talking with Josh is always a joy. His knowledge of, and enthusiasm for, social science is infectious –– he has the air of a kid in a candy shop whenever we go deep into these topics. And his excitement about Public Editor reminded me that we are on the right track.
Here’s a link to our interview. Hope you enjoy it as much as I did!
Public Editor is a new credibility service provider – Goodly Labs’ secret weapon in the war on misinformation! It is a simple, crowdsourced tool anyone will be able to use to annotate and verify the accuracy, bias, and quality of the news. We know you are excited to try it. And it is almost ready. Right now, we are looking for volunteers to help us through the next phase of testing. If you – or someone you know – would like to help, come to Public Editor and sign up.
Public Editor works! We have an alpha version up and running. It’s not ready for prime time, but it’s ready for a select team of concerned citizens to get started annotating articles. Once we have that team, we can start testing the tools, troubleshooting, and taking it to the next level. Click here to see a scored article. (Use Chrome, and click ‘refresh’ if you don’t see the labels at first.)
We need help! Want to do more than sign up to volunteer to annotate? We need all kinds of help. See the ‘Calling All Contributors’ section below for our wish list.
People are taking notice! Public Editor has been getting a lot of attention. The Washington Post did an interview with project lead Nick Adams. And we have been getting lots of engagement on social media. We were best in show at the World Economic Forum event in New York last month, where representatives from governments, platforms, and newsrooms alike expressed their excitement about working with Public Editor to solve misinformation. A lot is brewing right now. We can’t tell you all about it just yet! But stay tuned and we will keep you updated.
Why Do Our Relatives Believe Wild Conspiracy Theories?
Your relatives are not completely to blame for the national conversation taking a turn toward the strange.
By Nick Adams, Ph.D.
In the Spotlight
The Public Editor team is made up of a wonderful group of volunteers, committed employees, contractors, students, researchers, and academics. We want to introduce you to a number of people on the team. And this quarter, we invite you to meet Catharine Wu.
Catharine is well loved on the team, both as a manager, keeping all of our work on schedule, and as a designer and engineer leading Public Editor’s algorithm team and contributing to many of its UX interfaces.
Catharine is a Senior in Applied Mathematics and Data Science at UC Berkeley. She discovered Public Editor when we went looking for undergraduate researchers. We were happy to take her on board based on her experience, and she has become so devoted to this project -- and so essential to us -- that (we hope!) she is planning to stay in Berkeley after she graduates to keep us moving forward.
Catharine reports that her family is not entirely thrilled about this. She comes from Jinzhou, China. So, she is a long way from home. “I am the only child in my immediate family and the only girl in my generation. My family is very close. Everyone misses me so much!”
Thank you, Wu family! Catharine is doing very important work here.
"Public Editor helps me hone my critical thinking skills and makes the world a better place while doing it."
-Scott Peterson, Head of the Morrison Library and the Graduate Services Library at UC Berkeley.
"Public Editor is important to society because it provides a human-centric approach to keep our media sources/government in check. We come together as humans with a common goal of receiving the whole truth."
-Alex Popescu, Applied Math and Data Science major at UC Berkeley
Calling All Contributors
- Community volunteer leads needed
- We need professors and teachers who want to participate in our National Homework Assignment
- We need donations to help us hire operational support as we scale up
- We need donations of money or assistance to produce training and onboarding videos
We hope to be able to announce significant funding soon, so we can pursue:
- Validation studies
- UX optimizations
- Scaling up for a national audience and volunteer community
Share the Love!