Facts Don’t Change Minds. Friendship Does

Asad Baloch
12 min read · Feb 27, 2023


Einstein was a big fan — and a master — of thought experiments. Sitting in the solitude of his room, he would let his mind wander through the cosmos, placing himself in situations that were otherwise impossible: how would he see the universe if he were riding a beam of light? What would gravity feel like inside a free-falling elevator? This enviable power of imagination allowed him to make some of the greatest breakthroughs in the history of science. And since I love Einstein, I want you to join me in a little thought experiment.

Imagine you’re a medical emergency responder working with your local city hospital. Your job is mentally taxing — it requires you to be attentive and vigilant at all times, in situations that would give a normal person panic attacks. Injuries, trauma, horrible illnesses, and even death are common. Even though you are trained for the job, this can take a toll on even the most emotionally strong and healthy individual.

The past few months have been difficult, and you think you badly need a vacation. You plan a solo trip to the city you’ve always wanted to visit. You compile a list of places in that city you want to see — the local museum, the Victorian-era library, Disneyland, the bakery that sells the most amazing croissants known to mankind. You want to kayak across that heart-shaped lake everyone talks about, try bungee jumping, and dine at exquisite restaurants. Overall, the trip is going to be amazing.

You've booked all the tickets and done all the paperwork. You are sitting in the airport waiting lounge when you notice that the plane you are about to fly on is a Boeing 737 MAX 8. Shit! These planes have been involved in a string of accidents, making them statistically among the most dangerous passenger aircraft flying today. The route you are flying has no alternative, so you have no choice — you have to board it.

The three-hour flight stretches to eternity. Even though the plane has been extensively inspected by a thousand organizations and cleared to fly, you still have your fears. It's only when the plane comes to a halt on the tarmac that you breathe a sigh of relief. You're alive!

As you make your way out of the airport, you casually get into a taxi, a ride that is, mile for mile, roughly a hundred times more likely to result in your death than the flight you just agonized over, and head to your hotel. The vacation has officially begun!

Okay, that's enough imagination for one day. The point of this thought experiment was to illustrate that there are plenty of things, backed by concrete evidence, that we don't take into account when making decisions in our day-to-day lives. We all know this to some extent, and yet when we're sitting across the table from someone who displays this very tendency, we can't help but get infuriated that we see the facts so clearly while they don't.

Whether it's vaccination, climate change, or any other heavily debated topic, we have all found ourselves on one side of the debate or the other, desperately trying to change the other person's mind. We barrage them with facts and pull out the big guns of logic and reasoning, hoping to make a dent in their thick skull. But almost always, regardless of the nature of the debate, neither party changes its stance. Then you head back home, thinking what a waste of time that was and that you shouldn't have started the debate in the first place. The other person is thinking the same thing. Congrats, you both hate yourselves a little more now.

In this article, I want to talk about why bombarding someone with facts and figures doesn’t shake their beliefs. In fact, in some cases, it does quite the opposite — it hardens their belief that they are right.

Buckle up, and let’s dive in.

The Logic Of False Beliefs

Leo Tolstoy once wrote: “The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of a doubt, what is laid before him.”

As humans, we need a model of the world in order to survive. It helps us navigate through life so that we can pass on our genes to the next generation before we die. If our model of the world does not align with reality, we struggle to take effective and meaningful actions.

The biggest curse of our evolutionary psychology, perhaps, is that reality and facts don’t matter to the human mind, even though they are essential for survival. Instead, what truly matters is a deep desire to belong, to be part of a tribe.

In Atomic Habits, James Clear writes: “Humans are herd animals. We want to fit in, bond with others, and earn the respect and approval of our peers. Such inclinations are essential to our survival. For most of our evolutionary history, our ancestors lived in tribes. Becoming separated from the tribe — or worse, being cast out — was a death sentence.”

Understanding the truth is important, but so is being part of a community. While these two desires usually work well together, they sometimes come into conflict. And when they do, it creates problems.

In many circumstances, social connection is more important than the truth of any particular idea. Steven Pinker wrote about this when he said: “People are embraced or condemned according to their beliefs, so one function of the mind may be to hold beliefs that bring the belief-holder the greatest number of allies, protectors, or disciples, rather than beliefs that are most likely to be true.”

We don’t always believe things because they are correct. Sometimes, we believe them because they make us look good and earn social validation from the people around us.

Kevin Simler talked about this: “If a brain anticipates that it will be rewarded for adopting a particular belief, it’s perfectly happy to do so, and doesn’t much care where the reward comes from — whether it’s pragmatic (better outcomes resulting from better decisions), social (better treatment from one’s peers), or some mix of the two.”

False beliefs are not useful in a factual sense, but they can be useful in a social sense. To borrow a phrase from a tweet I came across, we might call this approach “factually false, but socially accurate.” When forced to choose between the two, people often select family and friends over facts.

This insight explains why we hold our tongues in social settings and let that one obnoxious uncle keep spouting red herrings, or look the other way when our parents say something offensive. But it also reveals the secret to changing minds.

Facts Don’t Change Minds. Friendship Does

We believe in falsehoods because they earn us social validation and acceptance. Convincing someone to change their mind is really the process of convincing them to let go of their tribe. You can't expect to change someone's mind by taking away their community. You have to give them somewhere to go. Nobody wants their worldview torn apart, and nobody will let go of it if loneliness is the price.

You can change someone’s mind by making friends with them and bringing them into your tribe. Now, they can challenge their beliefs without the risk of being abandoned.

The best way to change someone’s mind is to be friends with them

In his book Religion for Atheists, the British philosopher Alain de Botton suggests we simply share meals with those who disagree with us. “Sitting down at a table with a group of strangers has the incomparable and odd benefit of making it a little more difficult to hate them with impunity. Prejudice and ethnic strife feed off abstraction. However, the proximity required by a meal — something about handing dishes around, unfurling napkins at the same moment, even asking a stranger to pass the salt — disrupts our ability to cling to the belief that the outsiders who wear unusual clothes and speak in distinctive accents deserve to be sent home or assaulted. For all the large-scale political solutions which have been proposed to salve ethnic conflict, there are few more effective ways to promote tolerance between suspicious neighbours than to force them to eat supper together.”

It is not difference that breeds prejudice and hostility, but distance. As Lincoln said: “I don’t like that man. I must get to know him better.”

Spectrum of Beliefs

The people who are most likely to change our minds are the people we agree with 98% of the time. This is a common, yet profound, realization. Think about it: if someone you like believes a radical idea, you are more likely to give it merit, weight, and consideration. You already agree with that person in most other areas of life; maybe you should change your mind and agree with them on this one too. But if someone wildly different from you proposes the same idea, you will likely dismiss them without a second thought.

I have a friend who believes in a lot of stupid shit and says a lot of stupid things. Let’s call him Harry. He and I have a mutual friend, one that I’m closer to and more connected with. Let’s call him Ron. Now, Ron once asked me why I don’t try to change Harry’s mind. I just said that it’s impossible for me. In terms of beliefs, he and I are a world apart. The schism between us is just too wide to bridge.

One way to visualize this is by mapping beliefs on a spectrum. Let’s divide this spectrum into 10 units. If you find yourself at Position 7, there is not much sense in trying to convince someone who is at Position 1. This is me and Harry — the gap between us is just too wide.

When you’re at Position 7, your time is better spent convincing someone who is at Positions 6 or 8, gradually pulling them in your direction. This is me and Ron — he can change my mind, and I can change his.

The most heated arguments occur between people at opposite ends of the belief spectrum, while the most frequent learning occurs between people who are close together. The closer you are to someone, the more likely it is that some of their beliefs will bleed into your mind and shape your thinking. The further an idea is from your current position, the more likely you are to reject it.

When it comes to changing minds, it is difficult to leap from one Position to a distant one. You can't jump the spectrum; you have to slide along it.

Your upbringing also plays a crucial role in what you believe as an adult. When we are kids, our parents, teachers, and society drill certain beliefs into us. As we grow up, these ideas become deeply embedded in our psyche — they become a part of our identity, a defining trait of who we are. They get reinforced over time by the social groups we associate with, the media we consume, and the way our brains work. And when we come across evidence that conflicts with these beliefs, our identity is jeopardized. At that point, it is easier to dismiss the evidence than to overhaul our entire identity. Scientists call this phenomenon belief perseverance: maintaining a belief in the face of evidence that firmly contradicts it.

Any idea that is significantly different from our current worldview feels threatening. And the best place to ponder a threatening idea is a non-threatening environment — one where we don't risk alienation if we change our minds. Finding such an environment is difficult, so the best place to start is with books, which I believe are a better vehicle for transforming beliefs than seminars or conversations with experts.

When people are confronted with a set of uncomfortable facts, they double down on their current position because they want to save face and not look stupid. They would rather die than publicly admit they are wrong.

Books resolve this tension. With a book, the conversation takes place within someone’s head. When there is no risk of being judged by others, it is easier to be open-minded and accepting.

Arguments are a personal affront to someone’s identity. Books are like slipping the seed of an idea into someone’s mind and letting it grow on its own.

Why False Ideas Persist

Silence is death for an idea. It doesn't matter whether an idea is true or not; if it is never spoken or written down, it dies with the person who conceived it. Ideas are remembered when they are repeated, and they are believed when they are repeated. This is another reason stupid ideas survive — people continue to talk about them.

When I say people talk about bad ideas, I mean not only the people who subscribe to them, but also the people who complain about them. The cognitive linguist George Lakoff summarized this beautifully: “If you use logic against something, you're strengthening it.”

Before you can criticize a bad idea, you have to reference it. You end up repeating the very thing you hoped people would forget. And of course, people won't forget it, because you keep reminding them how bad it is. The more you repeat an idea, the more likely people are to believe it.

Every time you attack a bad idea, you are feeding the monster you seek to destroy. This is why people like Andrew Tate go viral overnight. When the mainstream media challenges their beliefs, it helps them. It disseminates their bullshit. Trump also uses this strategy to remain relevant — he deliberately says awful, incendiary stuff to stir controversy and rides the wave of media attention that follows. Tate and Trump would have disappeared into oblivion if it weren’t for The New York Times and CNN. Hell for the ideas you deplore is silence.

It is better to spend your time championing good ideas than tearing down bad ones. Feed the good ideas and let the bad ones die of starvation.

Intellectual Soldier

I know what you might be thinking. “This guy is out of his fucking mind. How am I supposed to let all these idiots get away with this?”

Well, it’s a good thing we brought this up. Let me be clear: I’m not saying it is never useful to criticize someone or point out an error. It is the intention behind it that matters.

If you are criticizing bad ideas because you believe the world would be a better place if fewer people believed in them, then your best choice is to remain silent. Remember what we talked about: the more you talk about an idea, the more likely people are to believe it.

If your goal is to change minds, then criticizing the other side is not the best approach. As I said earlier: when people are confronted with their stupid ideas, they get defensive. As Haruki Murakami put it: “Always remember that to argue, and win, is to break down the reality of the person you are arguing against.”

Confronting someone with facts that don’t line up with their worldview can trigger a “backfire effect”, which can end up strengthening their original position and beliefs. People argue to win, not to learn. Accepting that their ideas are false is conceding defeat, and good luck getting anyone to do that.

As Julia Galef put it, people often act like soldiers rather than scouts. Soldiers are always on the intellectual attack, looking to defeat the people they disagree with, the people who might threaten their identity. Scouts, on the other hand, are intellectual explorers, mapping their way through unfamiliar territory alongside others. Curiosity is what distinguishes the two. So don't be a soldier; be a scout.

Summary

In a world of TikTok videos and Instagram reels, where people mindlessly scroll through their social media feeds for morsels of entertainment, it's amazing that you managed to read this entire article. Your ability to focus and concentrate is mind-blowing. Here's what I want you to take away from this:

• We all believe things that violate rational, scientific ways of thinking. We notice this in others, but not in ourselves.

• Facts and reality are essential for survival, but they don't matter much to the human mind. Thanks to our evolutionary psychology, we crave being part of a tribe, and when our desire for belonging clashes with our need to know the facts and understand reality, the former usually triumphs.

• We believe in falsehoods because they earn us social validation and acceptance, and make us look good in the eyes of the people around us.

• Falsehoods are not useful in a factual sense, but they are useful in a social sense.

• Our upbringing plays a crucial role in what we believe in — the beliefs and ideas we are frequently exposed to as we grow become deeply embedded in our psyche. They become a part of our identity, a defining trait of who we are. When these beliefs are challenged, we feel our identity is threatened, which causes us to become defensive and double down on our position.

• People believe in stupid shit because it makes them a part of a tribe. Letting go of those beliefs risks alienation and loneliness, which is why people don’t change their minds so easily.

• The best way to change someone’s mind is by being friends with them and then gradually pulling them in your direction.

• Beliefs lie on a spectrum — the closer you are to the other person, the more likely they are to change your mind, and the farther away you are, the less likely they are to influence your mind.

• Belief perseverance is the idea that we maintain a belief even in the face of all the evidence that contradicts it because accepting that our belief is false threatens our identity.

• Confronting someone over their worldview can harden their belief that they are right. Scientists call this the “backfire effect.”

• The more you talk about — or criticize — an idea, the more likely it is that people will believe it. Silence is the death of ideas you deplore.

Written by Asad Baloch

Helping you become less of a shitty person @TheAsadBaloch on Twitter (now X), Facebook, and Instagram.
