
How your brain tricks you into believing you’re the reasonable one


It’s so natural to ignore everything we disagree with


Donald Trump holds a meeting at The New York Times. Photo by Spencer Platt/Getty Images

Few people saw Donald Trump’s victory coming, including Donald Trump. There are easy culprits to blame for the surprise win (skewed polls and fake news come to mind), but the biggest enemy might be our own egos. It’s natural to ignore everything we disagree with, and many of us don’t realize we’re biased until we’re shocked into noticing that things are not as we expected. That’s true regardless of who you ultimately voted for.

Maybe you feel like you’re the only sane one in a world gone crazy. Unfortunately, everyone thinks that, says Lee Ross, a social psychologist at Stanford University who studies bias. Political beliefs aren’t emotionally charged just because of patriotism; they also tie us to our social circles and help establish our personal identity, even if we don’t know exactly how long a Senate term lasts. Sure, sometimes we believe wrong things because we’re lazy and uninformed, but the social factor plays a very large role. Changing our minds could change how we see ourselves and alienate us from our friends. Because we badly want to feel good and accepted, we resist challenging information, often without realizing it.

Our political beliefs can even affect our ability to do math

These inherent human biases are part of why even a little fake news is a big problem. It’s easy to believe because it tells us what we want to hear. Meanwhile, a lot of correct information might not do much good if it contradicts our existing beliefs. It’s not true that misinformation spreads only because of a lack of knowledge, says Brendan Nyhan, a Dartmouth College political scientist. Sometimes knowledgeable people are better at cherry-picking information so that it supports their beliefs. Many of us don’t look for contradictory information in the first place, and we might not believe it when we do find it. “We do this because it is so uncomfortable to be presented with the new and to be challenged that we actively avoid this information,” says Ross. The flip side is true too: it is so tempting to believe anything that supports our position that we become vulnerable to being tricked.

There are several ways we fool ourselves, and they’re predictable. The best-known is “confirmation bias,” when people look for information that confirms what they already believe and ignore everything else. It’s easy to do this because most of our friends agree with us and are doing the same thing. And when we do see information from “the other side,” we dismiss it as bad information; Ross calls this the “hostile media effect.” Basically, regardless of whether you lean right or left, you always think the other side is biased. Think about how people feel about outlets like The National Review and Mother Jones.

Even when we view the same information as everyone else, we trick ourselves into seeing what is most convenient. During the 1968 Democratic Convention, police were filmed beating demonstrators who supported Eugene McCarthy, an anti-war candidate for the Democratic presidential nomination. Video clips were widely circulated as an example of police brutality, but most Americans were not on the demonstrators’ side. A Gallup poll at the time showed that more than half of those surveyed supported the police violence.

In one study, Dan Kahan, a professor of law and psychology at Yale Law School, showed participants actual footage of people protesting the Westboro Baptist Church. He told half the participants that the demonstrators were protesting abortion access, and the other half that they were protesting the military’s “don’t ask, don’t tell” anti-LGBT policy. Afterward, people’s political leanings affected what they remembered seeing. For instance, most of the pro-choice participants saw the protesters as blocking the “clinic.” When the video was described as a military protest, the effect was reversed: this time, most of the conservative respondents saw the protesters blocking the building.

Sometimes we double down on wrong beliefs when we’re presented with new evidence

No matter how independent you think you are (even if you haven’t registered with a political party), you aren’t immune to these biases, because you probably have friends. It doesn’t matter if you don’t know a lot about, for instance, fracking; you almost certainly know what your friends think about it. That’s enough to cause a bias, Kahan says. “I think that’s, in a microcosm, what happens in an election like the one that we just had,” he adds. “People are watching these events but they’re processing the information in ways that align them with their important affinity groups so they can watch the same thing and see radically different results.” And almost any kind of basic reasoning can be affected. In another study, Kahan’s team gave participants some data to interpret. People were told that the data was either about a skin cream or about whether banning concealed handguns affected levels of violence. The data was the same, but when it came to the political question, people’s biases made their math skills worse. Democrats had no problem interpreting the data when it showed that banning guns lowered violence, but they messed up the math when the data suggested that banning guns increased violence. The opposite was true for Republicans.

There’s another tricky problem when it comes to fixing inaccurate beliefs: we’re committed to not changing our minds. So committed, in fact, that when you’re confronted with evidence that proves one of your cherished beliefs is false, that evidence can actually strengthen your false belief. Nyhan first studied this phenomenon, called the “backfire effect,” nearly a decade ago. At the time, there was debate over whether the invasion of Iraq was justified because the country supposedly had weapons of mass destruction. (It eventually became clear that there were none.) But Nyhan found that reading new information that Iraq didn’t have WMD could make people more convinced of the opposite. Thankfully, more recent work shows that the backfire effect isn’t as common as we used to think. We still don’t know exactly what causes it, or how to solve it. But the data does suggest that the backfire effect happens more when the evidence is weaker, when the issue is more controversial, and when it involves a polarizing political figure, which means that the next four years are bound to be interesting.

So what can we do?

As soon as election results were announced, there were calls to “reach out” and listen to each other. This isn’t easy when it’s so natural for us to evaluate things in a biased way. So what’s the best way to do it?

The Southern Poverty Law Center has an extensive guide on how to talk to people you disagree with, covering everyone from joking in-laws to offended guests. The Environmental Defense Fund also has concrete suggestions for approaching the hot-button topic of climate change, including “avoid painting pictures of planetary apocalypse” and, of course, “don’t get angry.”

We should be realistic that some trust needs to exist for a discussion to happen, says Julia Galef, president of the Center for Applied Rationality, a nonprofit that teaches people to apply psychological research to their lives. “Unless you have a pre-existing ground of some sort of trust and some amount of mutual respect for each other, you basically shouldn’t be trying,” she says.

It’s also important to understand that there is a difference between “listening” and “persuading.” People often say that they just want to understand when they really want to change someone’s mind, but they skip the “understanding” part and go directly to an argument. “One of the main things that gets in our way is feeling like understanding something implies condoning it or is supporting it in some way,” Galef says. “But those two concepts don’t have to go together. I can form a really accurate model of why a Nazi did what he did to the level of understanding the incentives he was facing, understanding exactly what caused him to believe that Jews were evil — and still think he’s wrong and dangerous.”

Ross, the Stanford psychologist, suggests that both sides try to present the views of the other in a way that the other side will agree with. This is often called the Ideological Turing Test. In this test, a pro-choice activist, for example, is challenged to explain pro-life positions so convincingly that people cannot tell what her actual beliefs are. This prevents people from straw-manning the other side, and helps each side feel that its concerns are being understood.

When it comes to correcting false information, Nyhan, the political scientist, has found that corrections from people who are similar to you (Democrats correcting a Democrat) are more effective. The SPLC also suggests appealing to the tie that you have with the person, using phrases like “I value our relationship, and these comments are putting a lot of distance, which I don’t want.”

Also, most people like to think of themselves as “good citizens” who rationally evaluate all the information, so reminding them of that self-image can encourage everyone to take a step back.