“It is painful to lose your reality, so be kind, even if you are right.” This tendency to embrace information that supports a point of view and reject what does not is known as “confirmation bias.” There are entire textbooks and many studies on this topic if you’re inclined to read them, but one study from Stanford in 1979 explains it quite well. This is also a big part of why people don’t trust the media.

In this narrative, which the military leaders found much easier to swallow, it was the “presidential family” and a few corrupt civilians close to Rojas—not military officers—who were responsible for the regime’s excesses. Americans are more politically polarized than they’ve been in decades, possibly ever. In other words, you think the world would improve if people changed their minds on a few important topics. We friend people like us on Facebook.

For example, in the study, a message about the overwhelming scientific consensus on climate change included a warning that “some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists.” Exposing people to the fact that this misinformation is out there should make them more resistant to it if they encounter it later.

“So it doesn’t matter that she’s lying.’” (For her part, Gurumayi has denied banishing her brother, and Siddha Yoga is still going strong.)

The same thing happens if you attempt to debunk the facts that the other person has marshaled to support that belief. “Here was a person who was super rational, and believed in science, and was the target of these factless claims, but won anyway,” Manjoo says. “I remember actually consciously making that choice.” If someone you know, like, and trust believes a radical idea, you are more likely to give it merit, weight, or consideration. Deferring to experts might seem like a good start, but Kahan has found that people see experts who agree with them as more legitimate than experts who don’t.

In 1877, the philosopher William Kingdon Clifford wrote an essay titled “The Ethics of Belief,” in which he argued: “It is wrong always, everywhere, and for anyone to believe anything on insufficient evidence.” Lee McIntyre takes a similarly moralistic tone in his 2015 book Respecting Truth: Willful Ignorance in the Internet Age: “The real enemy of truth is not ignorance, doubt, or even disbelief,” he writes. If you had asked me this question–How do you change a mind?–two years ago, I would have given you a different answer.

“Particularized trust destroys generalized trust,” Manjoo wrote in his book. Instead, as author and psychology professor Robert Cialdini explains, Democrats must offer Trump supporters a way to get out of their prior commitment while saving face: “Well, of course you were in a position to make that decision in November because no one knew about X.” But if the changes are going to happen at all, it’ll have to be “on a person-to-person level,” Shaw says. Facts usually don’t change minds because people’s beliefs pre-determine which facts they consider valid or relevant.

Trump counselor Kellyanne Conway famously used the phrase “alternative facts” to describe White House Press Secretary Sean Spicer’s lie that Trump’s inauguration had drawn the “largest audience to ever witness an inauguration—period.” Spicer has also said to reporters, “I think sometimes we can disagree with the facts.”
But what all this does seem to suggest is that, no matter how strong the evidence is, there’s little chance of it changing someone’s mind if they really don’t want to believe what it says.

When we are in the moment, we can easily forget that the goal is to connect with the other side, collaborate with them, befriend them, and integrate them into our tribe. I really like that I got this question now, as I am currently taking an introductory course in psychology. When someone holds a fringe belief, most people around them just laugh and tell them that’s stupid. It’s something that’s been popping up a lot lately thanks to the divisive 2016 presidential election. I am reminded of a tweet I saw recently, which said, “People say a lot of things that are factually false but socially affirmed.”

When you want to change another person’s opinion, your first challenge is to assess the strength of the beliefs that are supporting the opinion that you wish to change. This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as “motivated reasoning.” Motivated reasoning is how people convince themselves, or remain convinced, of what they want to believe—they seek out agreeable information and learn it more easily, and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs. “Instead of thinking about the argument as a battle where you’re trying to win, reframe it in your mind so that you think of it as a partnership, a collaboration in which the two of you together or the group of you together are trying to figure out the right answer,” she writes on the Big Think website. He left “the way most people do: sort of like death by a thousand cuts.” The safest thing to do is probably high-tail it out of there, even if it turns out it was just your buddy messing with you. As proximity increases, so does understanding.
A group of researchers at Dartmouth College wondered the same thing. “And the belief kind of disappears.” And so it spirals. Each time you attack a bad idea, you are feeding the very monster you are trying to destroy. We ridicule (“What an idiot”). Though both Hillary Clinton and Donald Trump were disliked by members of their own parties—with a “Never Trump” movement blooming within the Republican Party—ultimately most people voted along party lines. “So it’s easy to see how we can slide into a sort of cognitive tribalism.” The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. The more facts you muster to support your viewpoint, the more it will harden the other person’s mind against your viewpoint. You are simply fanning the flame of ignorance and stupidity.

Nobody wants their worldview torn apart if loneliness is the outcome. “They knew that someone was just trying to show up Trump or trying to denigrate their identity.” The question behind the question was, “Whose team are you on?” Make a point to befriend people who disagree with you. We had a year of watching with interest as Republicans struggled to resolve this. Feed the good ideas and let bad ideas die of starvation. For example, our opinions on military spending may be fixed—despite the presentation of new facts—until the day our son or daughter decides to enlist. “I must get to know him better.”

In the past couple of years, fake news stories perfectly crafted to appeal to one party or the other have proliferated on social media, convincing people that the Pope had endorsed Trump or that Rage Against the Machine was reuniting for an anti-Trump album. Students who smoked were very eager to tune in to the speech that suggested cigarettes might not cause cancer, whereas nonsmokers were more likely to slam on the button for the antismoking speech.

“I think we need to get to an information environment where sharing is slowed down,” Manjoo says. Democrats are not going to win the 2020 presidential election by convincing Donald Trump supporters that they were wrong to vote for him last November or that they’re responsible for his failures in office. Survival is more important than truth. Fact-checking erroneous statements made by politicians or cranks may also be ineffective. This is hardly the first time there have been partisan publications, or many competing outlets, or even information silos.

“I think once they’ve hit denial, they’re too far gone and there’s not a lot you can do to save them.”

But as a community gets larger, the likelier it is that a person can find someone else who shares their strange belief. Carol Tavris, a social psychologist and co-author of Mistakes Were Made, But Not by Me, says that for Never Trump Republicans, it must have been “uncomfortable to them to feel they could not be wholeheartedly behind their candidate.” They can only be believed when they are repeated. What’s going on here? In these charged situations, people often don’t engage with information as information but as a marker of identity. “This is why we need to teach critical thinking, and this is why we need to push back against false beliefs, because there are some people who are still redeemable, who haven’t made that full slide into denialism yet.”