Episode 029:

Changing Minds in a Polarized World

with David McRaney

October 25th, 2023


Episode description

Why do people sometimes become more entrenched in their beliefs when they are challenged? In this episode, David McRaney, science journalist and creator of the You Are Not So Smart podcast, book, and blog, joins us as we dive into the psychology of persuasion and explore how and why people change their minds. Together, we investigate the psychological need for connection and how, under certain circumstances, that need can lead people to gravitate toward extremist communities. We also consider ways to protect ourselves and others from such polarized thinking and practice active open-mindedness. David discusses the challenges inherent in conversing with people who do not share our beliefs and why these discussions often don’t go as planned. We also share a powerful technique to change someone’s mind, including your own.


David McRaney is a science journalist fascinated with brains, minds, and culture. In 2009, he created the blog/podcast, You Are Not So Smart, to gain a better understanding of self-delusion and motivated reasoning. It became an internationally bestselling book shortly after and is now available in 17 languages. On the podcast, David interviews scientists who study the psychology of reasoning, decision-making, and judgment. He also gives lectures around the world about these topics. His second book, You Are Now Less Dumb, was released in 2013, and his third book, How Minds Change, came out in 2022.

David is also on the Ambassador Council for the Alliance for Decision Education.

Charles: We have all been in situations where we realize that we made a mistake and change our decision accordingly. Maybe we think a new job is perfect, until we try the commute for the first time and realize there’s terrible traffic. Or we’re apprehensive about going on a date with someone, until we realize that they share our unusual hobbies.

In these cases, it’s important that once we learn this new information, we update our beliefs and consequently change our mind about our decisions. If we ignore new information and refuse to change our course of action accordingly, the impact can be very significant, causing us to make bad decisions or stick to courses of action that are no longer serving us.

Consider the impact of a doctor refusing to change a patient’s diagnosis in light of new information. Or a police officer who is so sure that someone is the suspect they’re looking for that they ignore important new evidence. Our guest today is David McRaney, science journalist and the creator of the blog, podcast, and bestselling book, You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory is Mostly Fiction, and 46 Other Ways You’re Deluding Yourself, which explores the psychology of decision-making, reasoning, and judgment.

His second book, You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself, was released in 2013, and his third book, How Minds Change: The New Science of Belief, Opinion and Persuasion, came out in 2022. David is also an ambassador here at the Alliance for Decision Education.

David: I am very happy to be part of this fantastic org that does things that I actually believe in, and that’s always the best org to be a part of. So thanks a bunch for having me on the show.

Charles: We sat down with David to talk about his most recent book, How Minds Change, and to learn more about how and why some people change their minds whilst others struggle to do so. We explore how the ability to change our own minds and the minds of those around us can significantly impact the quality of our decisions, which impacts our quality of life.

David: So I wanted this book to be different than my previous work. I wanted to write a book about how minds change. And then eventually, of course, if you’re talking about that, you’ll get into persuasion and other stuff. And I also wanted the book to be on the ground. I wanted this to be: I meet these people,

I talk to these people, I have these experiences, and you come along with me and we build up to an understanding of the issue. And to get started, the real way I got started, and the real way that the book starts, I wanted it to be—let’s go spend time with someone who changed their mind in a really drastic way, and then we can get into the science, and philosophy, and everything else that goes along with how that happened.

And I learned of Charlie Veitch. And I reached out to him and I said, “Can I come spend a week with you?” He said, “Come on out.” So I traveled out to the UK. I went to Manchester, where he lives, and learned all about his life and was astonished right away. Like, he has insights on everything, in every direction, at all times, and just barely pays attention to traffic laws or any other laws.

The story of what happened to him was, he was—in the earlier days of social media and the internet in general—he had spent a lot of his young adult life . . . his father was an oil tanker pilot, and they moved around a lot, and he moved all around the world. And everywhere he moved, he felt ostracized, and he felt out of place and didn’t have a community and was the subject of all sorts of weird, like, marginalization.

And he eventually got to settle into the UK, got his degree, and went into the world of banking and hated it. And he was living that cubicle life that so many movies in the nineties talked about—the Fight Club sort of “how do I escape this drudgery, I am a cog in a machine” thing. He was feeling that.

And he was slowly looking around the internet and finding all sorts of things. He was feeling angry at institutions, angry at people in power, angry at authority, and feeling out of place and didn’t have a community. And he noticed a lot of people were talking about this whole 9/11 thing. You know, 9/11 truthers. And he sort of stumbled into some of those earlier videos that you may recall, like the Loose Change and stuff like that. And he started making these YouTube videos in the early days of YouTube, where he’d run around with a megaphone, and he would just go into public places and sort of bark at people and tell them that they were living this Orwellian nightmare. He’s like, “Look at you doing what they tell you to do.” And it was almost like he was street art, Rage Against the Machine, beat poet stuff to just stir things up. And he was getting a pretty big following on YouTube as a person who would do this sort of thing.

And at some point, he went to the embassy and he barked at them in the same way, and they didn’t like that. They pinned him to the ground, and they had machine guns, and they were dressed like goons, and this was perfect for his audience. They’re like, “See? There’s the deep state just putting you on the ground. They knew it. Eventually you went too far.” That launched him into a sort of stardom with that group.

And eventually he was getting invited on things like Alex Jones’s podcast. The people that got him into this fascination were now inviting him on the programs that he was watching and listening to. And he rose and became a sort of superstar of that world. He got to quit his job, and make YouTube videos for a living, and meet all the major players in the conspiracy theory universe—David Icke, and Alex Jones, and everything.

But this is also the vehicle by which he changed his mind. He was invited on a program called Conspiracy Road Trip. It was a BBC show, and they had a really, really cool premise. The premise of the show was—let’s get a group of conspiracy theorists and then we will do the thing that everyone wished they could do with these people. We will take them to all the actual leading world experts on whatever topic it is that they have a conspiracy about. And so for this particular episode, Charlie and several other truthers—9/11 truthers—they took them to everything. They took them to Ground Zero. They took them to all the ground zeroes, to Pennsylvania. They took them to . . . the architects of the World Trade Center and had them look at the blueprints and talk to them in person. They met demolition experts. They met chemical experts. They met construction experts. They learned how to fly an airplane in a commercial flight simulator and then actually learned how to fly a real single-engine airplane and take it off and land it in the New York area.

They did everything, right? So in each one of these cases, the experts very carefully explained to them why this could not possibly have been an inside job. And, along the way, he started softening a good bit, because he had very specific questions, like, “If the fire can’t melt steel beams, how could it work?”

And so the demolition experts and architects are like, “Well, it doesn’t have to melt the beams. They just have to bend a little bit. And when you have a whole building above a tiny bend, it becomes a bigger bend until it comes down.” Stuff like that. But, at some point, they met the actual widows and widowers of people who had died in 9/11.

And that was the thing that really got him. He hugged a woman who had lost her husband, and it affected him deeply. And when he went to the hotel room where all the other truthers were staying in different rooms—but they would meet up at the end of the day—he was eager to talk to them about that.

And when he joined the conversations, they were saying, “Wow, what a great actress. Can you believe they would hire somebody to do something like that?” All that kind of stuff. The humanity of it didn’t make sense to him. And he started to really feel the doubts at that point and look over all the evidence that had been presented to him and he felt like, you know, “Look, I was wrong.” That’s what he felt. Whereas everybody else, they were doubling down. This always happened on that show. The host of the show remarked how that was what made it a great reality show because you watch it and then, at the end of it, they all believed even more than they did before they went in. But not Charlie.

He goes to Times Square. He films himself. He does a selfie video where he says, “I’m telling you. I went on this thing. It hasn’t come out yet. You’ll see it later in the year, but I’ve changed my mind. And if we’re looking for the truth, I mean, that’s what we have to do sometimes.” And he puts it on his YouTube channel and goes about his day.

But he was absolutely, totally obliterated after that. It started out with a lot of disbelief from people who loved him or people that were fans—people thinking, “What happened to you? How did they get to you? Were you ever really you?” You know, the conspiracy theory stuff. And it escalated, and escalated, and escalated.

As his partner told me, “They were just out for his blood.” They were contacting her on private messages and saying—like, they had a child they were expecting—and they were saying, “It’s going to be the spawn of a demon.” So they just start this whole “I want to destroy everything about you” campaign.

And remember, he hasn’t said anything but, “I was wrong. I changed my mind.” But being a prominent member of the community, there were a lot of implications there. And he’s a great example of the fear that often drives a person who’s fallen into the community of a conspiracy theory, or a cult, or a pseudo cult, or something that’s . . . a deeply polarized fundamentalist, or a radical position politically, even.

At some point, there’s this—maybe not articulated, maybe not salient, but certainly felt—“Oh, no, I better be careful what I say” feeling. And he is an example of—yeah, that’s a rational, reasonable fear. Because he experienced an absolute horrible campaign of reputation destruction. And he obviously completely left the community after that.

Charles: Why was Charlie Veitch different? Did he not have the same fears as others about how the community might treat him once he changed his mind? Or was there another factor that enabled him to change his mind despite his fear? Everyone else who went through the same program doubled down on their existing beliefs. So why did Charlie do the opposite? And if we can identify what made Charlie change his mind, can it teach us how to more effectively change our own minds when we come across new evidence that doesn’t fit our existing beliefs?

David: I start the story there because I felt like there was a mystery in this. Why did the facts work on Charlie and the facts not work on everybody else? And the reason he was able to do that—the way I explain it in the book, with lots of backing from studies and experts on how this usually plays out—is Charlie wasn’t just a member of the truther community. On his rise through conspiracy theory fame, he became a darling of another community called TruthJuice.

And this is a group of people who go for, like, the truth. And it’s a neo-hippie sort of thing. He met his partner there, and he was a member of another robust community at the same time. And so he had this social safety net, where if he did get ostracized, or if he did say something that got him in trouble, he didn’t lose his entire community. And that wasn’t salient for him. That wasn’t something he had ever articulated, but he felt it on some level. And, sure enough, that is what happens in his story.

Charles: To fully understand the conditions under which people are open to changing their minds, first we have to understand why people make up their minds in the first place. So why did Charlie get into this conspiracy theory at all? What draws people to these types of groups?

David: The yearning for a community was already there in the beginning. The identity that he’s looking for is group identity. Like, I would like to be a member of a group of people who respect me and think I’m trustworthy and think talking to me is worth their time, and there’s reciprocal altruism and all these things that go into it.

He was deeply yearning for that. The conspiracy is irrelevant. Like the thing that you glom onto, that you fall into, it’s irrelevant. The thing that’s relevant is this social primate thing that is playing out. There’s a great psychologist, Anni Sternisko, that I talked to for the book. And she calls them motivational allures. A motivational allure that gets you into a conspiracy theory community—for some people it’s to take your identity and demonstrate that it has value. Whereas others have a prejudice or an anxiety or a fear that the conspiracy theory allays. These people have two different reasons for wanting to look deeply into the conspiracy, and they will find the community, and there’s a sloughing off process that happens along the way.

It’s sort of like an email scam. The first step is just that allure. All of us have these allures, all of us would like to not feel bad about ourselves, and all of us have anxieties and fears. And then you find someplace online—and we do that! If you like a TV show, you will go to Reddit and maybe check out that subreddit and love it and join it. We all do that. But in this case, there’s a step-by-step. Some people get sloughed off from it. And there’s another stage where you get very invested in the community, another stage where you feel like your reputation is kind of on the line when you comment and are commented upon. At another point you will reveal who you really are and start posting content about it and really become a member of the community. And at some point you eventually want to meet people in person.

So Anni Sternisko, she compared it to—let’s say you’re on Netflix and you’re clicking through stuff. And one person just loves spaceships and explosions, and they’re like, “Can I find something with spaceships and explosions in it?” And they’re like, “Oh, yeah, Star Wars: The Force Awakens. I usually like harder sci-fi than this, but I haven’t watched it. And it’s all I can find tonight.” And you watch it. And that person’s like, “Yay, that’s pretty good.” But a whole lot of people like that will watch it and go, “Okay, I was right. That wasn’t very good.” So that’s the sloughing that’s taking place.

Another person is, like, they’re really into Adam Driver right now. “He’s dreamy. He’s kind of ugly-pretty. Just from some angles, he’s very attractive. I want to watch more movies with him in it. Oh, yeah. I don’t watch Star Wars, but here’s the last thing left that I haven’t seen.” So they watch Star Wars: The Force Awakens. And, sure enough, that person goes, “No, I don’t like this.” But a certain portion of the people with that allure go, “Oh, I didn’t know—I like Star Wars.”

So you have these two different people with these two different allures. They go to the next level. They go online and find people talking about Star Wars. They get a little deeper. And then the whole thing plays out like I described earlier. They eventually find themselves at a Star Wars convention, you know, cosplaying, starting out with two completely different motivations, but ending up in the same group and having the same group identity. And, at the point that you move from the motivation having something to do with anxieties or identity to your motivation is “I’m a member of this group,” that’s a much stronger motivation. That’s the strongest motivation. And Brooke Harrington, the sociologist, she told me if there was an E = mc² in social science, it would be that the fear of social death is greater than the fear of physical death. Once that’s the drive that’s driving your behavior, everything else is irrelevant.

And you can see that it could have been anything, like, who knows what you might get into that will funnel you into having a group identity at some point? And once you’re in the group identity space, the beliefs, the ideas, the facts, the pages and pages of stuff that gets spread around—whether it’s QAnon or Star Wars: The Force Awakens—it’s irrelevant to what’s actually driving you every day.


“At the point that you move from the motivation having something to do with anxieties or identity to your motivation is ‘I’m a member of this group,’ that’s a much stronger motivation. That’s the strongest motivation.”
David McRaney


Charles: It appears that it’s our desire for social belonging and acceptance that leads us to find community and adopt the beliefs of that community, regardless of whether it’s a conspiracy group or a running club. If we keep this in mind, can it help us to be more wary of the beliefs that our communities hold and potentially to question them? We asked David if he thinks this knowledge has made him less likely to adopt false beliefs and more likely to question his existing ones.

David: I am more aware of it in myself, for sure, with the caveat that a lot of the things that I write about, and talk about, and have covered—knowing is not half the battle. There’s even a psychological term for that. One of the core truths of the psychology of decision-making, and bias, and judgment, and everything is that the more intelligent you are and the better educated you become, the better you get at rationalizing and justifying your beliefs regardless of their accuracy. So you have to be more aware of your motivations and drives. That’s the thing that you need to pay attention to.

So I have tried to—especially since starting How Minds Change (this is why therapy is good)—become more aware of what it is that’s driving me or motivating me. What are my insecurities and my predilections? What are the wounds and what is the humanity that’s yearning within me that can get me into a good place or a terrible place if it’s untended? Especially if you are really good at lying to yourself or looking away from it, that’s where the danger is, right?

And you can notice, I think, if you take a minute, especially if you do morning pages, journal, if you do commit to legitimate therapy, something that is introspective and metacognitive. Like becoming honest with yourself in that way and then noticing when you’re straying away from it, because you will. That’s important.

If you want to avoid this, becoming an expert is not enough. You need to have multiple communities among which you move, in many circles, and enjoy the idea of comparing and contrasting perspectives. And be interested in so-and-so thinks this, but so-and-so disagrees and have this feel for the debate that’s taking place.

Because if you get locked into one perspective on it, the moment your reputation is at stake for being for or against something in the minds of the other people around you, that’s when the bars start forming around you and the cage starts to really get its grip on you.


“Becoming an expert is not enough. You need to have multiple communities among which you move, in many circles, and enjoy the idea of comparing and contrasting perspectives . . . and have this feel for the debate that’s taking place.”
David McRaney


Charles: So, if we want to avoid adopting false beliefs, it seems helpful to make sure that our identities are not tied to one group alone, but rather, we belong to multiple communities, so that we feel safely able to question the beliefs of the group if we want to. Just like Charlie belonged to both the truther and TruthJuice communities, which were founded on very different belief systems.

This advice is helpful when we’re dealing with our own beliefs, but what about when we’re observing others’ behavior? David shared a scenario, familiar to many of us, of watching someone we care about falling for someone problematic and then justifying why the romantic partnership is a good choice for them.

David: Everyone has had that experience of someone who’s really falling for someone that you know they ought not be with, kind of thing. You know, the gift and curse of the fact that, from the objective perspective, you can totally see a person when they’re being motivated by things they ought not to be. But from the subjective perspective, it’s like, “I don’t know. It just feels good. Like, I just like it.”

We’re motivated reasoners, right? So, you know, when someone’s falling in love with someone and you ask them, “Why do you like them?” They will say, “The way they talk, the way they walk.” Maybe very specific, like, “The way they walk across the room, you know, and the music they have introduced me to. Oh, my God.”

And then when that exact same person is breaking up with that exact same person and you ask them what reasons do they have to break up with them, they will often say things like, “The way they talk. It’s so grating. The way they walk across the room. Janky, jangly freak. And the stupid music they put on in the car. I can’t even go anywhere with it.” So the reasons remain the same, but reasons for will become reasons against when the motivation to search for reasons changes.

Charles: What can we do to help others who have adopted false beliefs? How can we help them to re-evaluate and hopefully change their minds?

David: I would say that I’m way more aware of it now, and I’m also more frustrated by how it can’t be attacked head-on. You have to consider more the attitudes, and values, and motivations, and drives that are leading people places than just, “Is the reason you’re falling in with Flat Earth because you are fearful of and have anxiety about institutions, and governments, and authority? Or is it more that you don’t feel validated by your immediate social circle?”

I still want to introduce people to critical thinking and skepticism and all these things that are important. But this other thing, which requires this incredible empathy and emotional heft and labor—as a man, as a Gen Xer, as a person who grew up in the Deep South, and the West, and the United States—there are a lot of barriers culturally to crossing the line and to offering that kind of support to other people. And you have to really put effort into that.

I also want to point out that Charlie—it wasn’t hugging the widow that changed his mind. But that had to happen. It was that he had affirmed his values and identity with another group of people. And there’s robust psychological literature on this.

There’s some great studies into it, too. What they will do in these studies is divide the subjects into two groups. One group of subjects gets their identity affirmed or their values affirmed, and the other group does not. And then something that they’re very tied to or connected to is attacked by evidence that suggests maybe it’s a bad thing, like political ideas or wedge issues and stuff. The people who have had their identity affirmed ahead of time are much more likely to carefully consider the evidence and then update their priors, whereas people who have not had that treatment will push back really hard and feel not just threatened, but insulted, and they get angry.

And, you know, you can affirm identity in all sorts of ways. Like, you tell people, “What are your thoughts on this? Well, that’s great.” There’s many ways to affirm that you are doing a good job of expressing and holding up your values. That’s what had happened to Charlie without his knowledge. Like, he wasn’t aware of it. He had not introspected upon this. But, yeah, TruthJuice did that for him. The things that actually he cared about in this world, and his core values, were being greatly affirmed there. And so he was open to that hug, that embrace. He was open to the demolition expert taking out some Legos and demonstrating how buildings work in a way that the others were not. And that was why those things could affect him and they didn’t affect the others in the same way. They couldn’t afford to change their mind in the way that he could.

I spent time with people who left Westboro Baptist Church and other conspiratorial communities and cults and pseudo cults. And everyone who off-ramped out of one—same story almost every single time. They had someone from another group or another community, an outsider, even people who opposed them, maybe presented themselves as an empathetic, nonjudgmental listener. Or they were invited to, you know, meetups or hangouts in a way where they weren’t confronted with arguments. They were invited to conversations outside of the conversations they were having. And that was what started to construct their off ramps out of the thing.


“People who have had their identity affirmed . . . are much more likely to carefully consider the evidence and then update their priors, whereas people who have not had that treatment will push back really hard and feel not just threatened, but insulted, and they get angry.”
David McRaney


Charles: David suggested that in order to be open to changing our minds, we need psychological safety. We need to feel affirmed in our identities, to have a sense of belonging, and to feel that that belonging is not called into question when we re-evaluate things or change our minds. So if we want to convince someone to change their mind, or at least to re-evaluate an existing belief, how can we do that? How can we help create the right conditions where they feel safe doing so? It might not be as simple as we think, due to the backfire effect.

David: Anyone who’s a fan of the show is probably very aware of the fact-based approach not necessarily working all the time. If you’ve ever attempted to dump a bunch of YouTube links on someone or shake your finger at them and say, “That’s not true. You are wrong. This is what the CDC says about this. This is what—pick your scientist, pick your expert, pick your trustworthy source—says. No, that’s not true.” And then you maybe take out your phone and go, “See, I’m telling you . . .” Often, it doesn’t work very well.

I’ve spent a lot of time with Brendan Nyhan and Jason Reifler and Ethan Porter and Thomas Wood. Nyhan and Reifler are the authors of the original backfire effect research. It’s much more nuanced than it was originally assumed. The original idea of the backfire effect was the more you tell someone they’re wrong, the more deeply they entrench themselves. The original research on the backfire effect has not replicated.

The research that has been done since has found that facts do a pretty good job most of the time. Like, it’s not a bad idea to inform people and give them information. It does move the needle a bit. But the gigantic caveat is you can correct people factually, but it may not affect their attitudes at all. In fact, it may strengthen their attitudes.

Here’s where the backfire effect manifests: when a person is confronted with evidence that they’re incorrect about something, factually or even attitudinally, but mostly factually, they can very readily update their priors in that regard. And it does not affect the reason they chose those facts in the first place.

So that’s what Porter and Wood told me. It was like, remember, the facts were irrelevant to begin with. They were just handy dandy justifications they were using at the time. You delete them and they will just get some new ones to replace them. In fact, they might get a few more this time because they’re engaged in the process and they start building an even stronger case, they think. And this is where backfire comes in.


“I spent time with people who left . . . conspiratorial communities and cults and pseudo cults. And everyone who off-ramped out of one . . . they had someone from another group or another community, an outsider, even people who opposed them, maybe presented themselves as an empathetic, nonjudgmental listener . . . they were invited to conversations outside of the conversations they were having. And that was what started to construct their off ramps out of the thing.”
David McRaney


Charles: It’s likely that it can backfire if you try to bombard someone with facts, because they will just choose new reasons to back up their positions. So what’s a more effective way to get someone to try to change their mind? David explained the difference between topic rebuttal and technique rebuttal, which can give us insight into how to effectively get someone to be more open to reconsidering their perspective.

David: I discussed this in the book by dividing it into two categories of conversation and persuasion. There’s topic rebuttal and technique rebuttal. Topic rebuttal happens in a good-faith environment where people are all playing by the same rules; this is like science and medicine and law and certain academic pursuits. You’re both good faith actors who agree that we’re going to look at the evidence and, where there’s a preponderance of evidence, we’re going to suggest that that’s more likely. But we’re always open to new evidence and we’re going to debate and discuss. That’s wonderful. It’s great. That’s gotten us some of the best things that humans have ever done. I love it. I will never suggest stopping doing this.

When you just walk up to your relative or friend or random stranger or person on the internet, that’s not the same environment. And they will slip into all sorts of things I discuss in the book—reactance, backfire, reputation management, and so on. The better approach is to engage in technique rebuttal. And that is, instead of addressing their actual conclusions and opinions, what you’re going to address is what systems are they using to arrive at them? Instead of appealing to reason, we’re going to appeal to reasoning. And they’re different things. Reason is logic and propositions and everything. Reasoning is the reasons I came up with to justify and rationalize my thoughts, feelings, behaviors, plans, and goals.

Charles: So David highlights two different ways to try to persuade someone or to get them to change their mind. The first, used in science, is topic rebuttal. This is where you discuss why the evidence they have presented is flawed or provide them with new evidence to the contrary. This is what we usually think of when we say persuasion.

However, the alternative is technique rebuttal, which is often more effective when we’re trying to persuade others. Technique rebuttal involves focusing on what their reasoning is, how they arrived at their conclusions, and getting them to delve into that reasoning and break it apart.

This insight is used for deep canvassing and several other persuasive conversational techniques that have been effective in many different domains. David shared more about how he came to see a commonality between a variety of methods of persuasion that have proven effective in different domains.

David: I totally never expected to find all these people, but I’ll describe all of this. I found deep canvassing, street epistemology, in the therapeutic world, I found motivational interviewing and, even since writing the book, I’ve found so many others. Yet all of these different groups, through A/B testing, through thousands of conversations where they threw away what didn’t work and kept what did, they all kind of arrived at pretty much the same technique.

And if they put it in steps, the steps were the same steps in the same order, maybe slightly reworded or rearranged. That blew my mind. When it comes to persuasion that actually gets you somewhere, all of these groups, the first thing that they’re dealing with is they’re avoiding the crash and burn. They’re avoiding the conversation devolving and becoming an argument, becoming an intractable debate. And if you’re trying to avoid that, you’re going to have to deal with how brains work when we’re trying to trade information back and forth. All of it, every bit of it, is going to come into play in a very particular way to destroy a conversation or let it keep going. And they all independently rediscovered it.

Charles: Before we begin a conversation with someone who we’re hoping will change their mind, David reminds us to ask an important ethical question. Why are you trying to change that person’s mind?

David: I was proselytizing the hell out of this with all sorts of people that I knew. I was like, “Look at this. This is crazy.” And it was negotiation experts that I have worked with in the past who looked at it and were concerned that I was excited about trying to change other people’s minds and I wasn’t asking myself, “Why are you doing that?” Because there’s a real assumption in there that you’re right and they’re wrong. You have the moral or ethical high ground, or your attitudes are the good attitudes to have, and there’s the bad ones. And they were like—I really appreciated it—they were like, “Have you considered that you’re not using this on yourself first?” And that was their suggestion, basically. Use this on yourself.

So, if these have steps, step zero would be to ask yourself, “Why do you want to change that person’s mind?” So that’s back to the motivations and drives that I was discussing earlier. Have you thought about what it is that’s compelling you or what is it that you’re trying to validate or salve over by this? What are you hoping to accomplish? In essence, do you have good reasons? And at least do that. Even if you trick yourself into thinking you do have good reasons when you don’t, like, if you don’t do that first step, that step zero, you won’t discover that until too far along the way and you might ruin the conversation.

This is what happens. I talk about it in the book. I was very excited about it. And very early on, I eject from an attempt to change someone’s mind about something because I realized there was very much no good in doing so. So I really encourage you to do that. At least walk into the situation feeling like you’ve done your due diligence before you jump in, because otherwise you’re not doing it ethically as far as I’m concerned. If it’s about fact-based things, hopefully your actual motivation is because we want to believe true things, right? That’s a good reason. And then, when it comes to attitudes, you know, is this a purely subjective thing? If not, is the attitude you’re trying to adjust causing harm in the world? Are you removing some poison from the world? And when it comes to values, same thing.

So please do step zero first. In so doing, there’s intellectual humility that you’re trying to foster that maybe I’m wrong. You know, hopefully—what you should do in all of the techniques that are described in the book—the concept is you’re going to go shoulder to shoulder with the other person and explore why you may disagree on those. And, in so doing, you may both learn that you’re both wrong, or you may adjust each other’s priors to the point that it isn’t even like that anymore. We’re just getting more dimensions to our perspective on the issue.

And we’ve all done this already before, which is, you walk out of a movie—I’m sure you’ve felt this, where you watch a movie and you’re like, “I love this. This is the best movie I’ve ever seen.” And then you walk out in the parking lot, if you see it with another person, and you become astonished to learn that your friends are like, “I hated that.” But you don’t go, “Well, we are never going to be friends again. I reject you and everything you believe in.” You start having a little conversation about it. And you explain the things you like, they explain what they don’t like. Every time I’ve done that, I’ve moved a little bit. I’ve been like, “Well, yeah, you are right about that.” And they’ve moved a little bit, which is what we’re talking about, which is, instead of trying to be right versus wrong, trying to convince the other person, we just move shoulder to shoulder and go home. “I find it very interesting that we would not be in agreement on this. I wonder why?”


“The concept is you’re going to go shoulder to shoulder with the other person and explore why you may disagree . . . you may both learn that you’re both wrong, or you may adjust each other’s priors to the point that it isn’t even like that anymore. We’re just getting more dimensions to our perspective on the issue . . . every time I’ve done that, I’ve moved a little bit . . . and they’ve moved a little bit . . . instead of trying to be right versus wrong, trying to convince the other person, we just move shoulder to shoulder and go home.” David McRaney


Charles: Once you’ve established why you want to get someone to change their mind, or at least start to explore their beliefs more deeply, and you’ve made sure you’re open to changing your own mind on the issue too, then you’re ready to begin. Step one is establishing rapport with your conversation partner.

David: So here are the steps. And there are several of these in the book. If you’re discussing a belief, you should use one system; if you’re discussing an attitude, use another. They’re almost the same, but slightly different.

First thing is step zero and then establish rapport. From a scientific perspective, what we’re talking about is not activating a lot of the triggers that social primates have innately. You want to assure them that they’re not going to be shamed, that they’re not going to be ostracized. There’s no threat of that.

You’re not going to threaten their agency, which is what generates reactance; we’ve all experienced that, too. Establish rapport early on. And part of that is asking for their consent. “Hey, I would like to have a conversation with you about this, where maybe we might, you know, we possibly could see it differently at the end of it? I don’t know. I’m interested in having that conversation with you. Mainly, I just want to know how you feel about it and what you think about it.” And you don’t necessarily want to change their minds. That would be nice if it happens, but you don’t want to shame them in any way. And, you know, that has to be established. Do that naturally. We’re all very good at it. If you’re a person, you can do that.

So establish rapport first, then get consent, and then ask for your fact-based claim and ask them to state it as a claim. You can also ask, “What is a belief that you hold very strongly? What is something that sort of guides your thinking?” Or if it’s about a particular issue, like, “How do you feel about gun control?” Ask for a very specific claim within that, and get it down to the point where you repeat it and summarize it back. It’s important that you summarize it in a way that you’re almost a lawyer for their side. You’re creating an argument that’s even better than the argument they’re presenting. And then, ask over and over again, “Is that the argument?” Like, make sure you get it right.

Charles: To recap, step one is establishing rapport, making sure they feel comfortable and safe talking to you, and asking for consent. Step two is asking them for a claim and then repeating it back to them. For example, this could be anything from, “Casablanca is the best movie of all time” to “The earth is flat.”

David: Whatever definitions they’re using, clarify them. If they say, “Politics,” ask them, “What is that—you know, what is your definition of politics?” Because you don’t want to use your definition because it’s an assumption that you’re having the same conversation when you might not be. Like, they might say, “Politics” and you’re thinking—civics, and ad valorem taxes, and things, and they’re thinking—circle of dinosaur men eating cigars, cutting the world up into smaller and smaller golf courses.

Like, you need to make sure you’re having the same discussion. Then here’s the magical moment. And all of these techniques have this. Most of the time, this entered into the fray because they just wanted to quantify them to study them, but it ended up being the most important part. Ask for a numerical measure of their confidence or their certainty.

And if you’re talking about attitudes, it will be like, where’s it at on the spectrum? I did this recently at Bridgewater College. I had asked someone about gun control. So you’re asking them, like, on a scale from zero to 10, where are you on this issue? If it’s an attitude-based issue, you would say, like, “10 means you’re very in favor of gun control, to the point that if someone were to say the word gun out loud and a cop heard it, that person gets life in prison. So that kind of makes 10 off limits. And then zero is everybody gets an assault rifle in the mail once a week, like, for free. That’s zero. So where would you put yourself on that scale?” That’s a way to talk about attitudes.

For a fact-based claim, it’s more like, “How certain are you from zero to 10?” Like—the earth is flat. Ten is there’s not a single shred of doubt in your mind. Zero . . . you know, you get the idea. So you want to go from zero to 10, and then you ask the person, “What reasons do you have to hold that level of confidence?” You don’t have to use that kind of language I’m using, but you can be more natural about it.

But when I ask, “Where would you put it on a scale?”—this is what always happens in these discussions—you get one of these “Well . . . mmm . . . ” moments. That’s introspection. That’s metacognition. And it’s magical, and mysterious, and strange, because oftentimes you’ve never done it. You just walk around with the attitude that you got and that’s as far as you go with it, right?

But to be asked, like, “Okay, what would I give it? Like, what percentage, what number?” That’s different. And so you gave it a seven, and I want to note that seven is not an eight or a nine or a 10 in this regard. So I’m wondering, why not an eight? Why not a nine? And then we can also go to the other side. Like, how did it get up to seven? Why is it not a five? You’re actually evoking justifications and rationalizations. You’re asking them to employ their reasoning process. And what reasons do you have to have that level of confidence?

And then the next step after that is to say, “What methods are you using to judge the quality of those reasons?” And the conversation stays there until it burns out. That’s usually almost all you have to do, honestly, strangely enough. Because most of us never get the opportunity to introspect in that way, and you just focus on that until the end of the conversation—listening, summarizing, repeating. And stay cordial and acknowledge that you’re opening up the floor to have lots of conversations like this. And maybe your only job here is just to give the person that first taste of, “Oh! I never really have thought about this in any way.”


“Establish rapport first, then get consent, and then ask for your fact-based claim . . . Then here’s the magical moment . . . ask for a numerical measure of their confidence or their certainty.” David McRaney


Charles: So after you have established rapport, and they have told you their claim, you repeat it back to them and confirm that you understand what they’re saying. Then, identify their definitions and make sure you’re using their terminology. Next, you should ask about their confidence in that claim on a scale of zero to 10.

Once they have given you a number, ask them more about their reasoning. Why didn’t they say a higher number? Why didn’t they say a lower one? Ask questions that encourage reflection and be sure to be open-minded and empathetic as you engage with them. The conversation continues in this vein, and at the end you should thank them for their openness.

So, when can you use this method? Is it just for when you want to change someone’s mind in a specific way? And is it always only your conversational partner who changes their mind? David shared more about how broadly this conversational method can be applied.

David: That can be applied to everything that’s inside of us, every opinion we have, every conception we have, every model. There’s an opportunity to generate what almost could be considered your first true opinion about that thing. And in this kind of technique rebuttal stuff, especially when I was with the deep canvassing people, I went through their catalog and it was astonishing to watch people. You could, like, YouTube scroll their opinion. And in the beginning, they’re like, “I’m against it.” And then at the end, they’re like, “Well, you know, when you think about it . . . ” They didn’t have an actual truly mature concept of how they saw the thing.

And, when they did have that mature concept, it couldn’t be the same as it was before the conversation. So it changes. Their mind changes. And the idea that this is going to be a 180—that happens sometimes. Sometimes people really do have this epiphany. They’re like, “Wow, I am not okay with that.”

And I’ve done that with movies. I’ve said, “I love that movie!” And people are like, “What do you like about it?” And I just sit there and talk and talk, and by the end of it, I’m like, “Well, you could YouTube scroll me. Actually, I guess I don’t really like it that much.” So that’s what this is all about. And this is a really powerful way to discuss things that are oftentimes third rail or wedge issues. All of a sudden it’s okay for two people who disagree or have different attitudes about something to discuss it. And, at the end of it, they have an evolved idea of it. And I’m also noticing, “Do I share these values? I haven’t revisited it in a while. I wonder if it holds up.”


“This is a really powerful way to discuss things that are oftentimes third rail or wedge issues. All of a sudden it’s okay for two people who disagree or have different attitudes about something to discuss it. And, at the end of it, they have an evolved idea of it.” David McRaney


Charles: Here at the Alliance for Decision Education, our mission is to improve lives by empowering students with essential skills and dispositions for making better decisions. So we asked David what he thinks the impact on society will be when the Alliance succeeds in its mission to ensure Decision Education is part of every student’s learning experience.

David: The pressure put on a person in K-12, the world they’re in, where you are inundated with more information than any human being has ever been exposed to on a daily basis, and surrounded by social pressures that are endless and unrelenting, and your attention is being sold and begged for at every angle—these tools would have been great for someone in the 1970s to have. Now it’s more relevant and more important than ever to give yourself the tools to be a good information consumer, a good traveler in the epistemic chaos of the modern era. And I can’t think of anything more important. And the impact on society would be—better citizens and, whatever wrongs they see in the world, they’re more empowered to do something about it. And the choices they’re going to make when it comes to economics, and relationships, and jobs, and their role as a citizen, these are all things that, more than ever, require a greater understanding of introspection, metacognition, critical thinking, decision-making, and judgment.


“It’s more relevant and more important than ever to give yourself the tools to be a good information consumer, a good traveler in the epistemic chaos of the modern era . . . I can’t think of anything more important. And the impact on society would be—better citizens and, whatever wrongs they see in the world, they’re more empowered to do something about it.” David McRaney


Charles: We also asked David what he thinks is the most important decision-making tool that he would want to pass down to the next generation of decision makers.

David: I mean, I have so many ways to answer that. Like, there are all these things you should know about yourself, like pluralistic ignorance, and confirmation bias, all that stuff. But there’s this one simple thinking tool that I got from Will Storr, and I’ll just ask you to do this first.

This is your introduction to the whole thing. Just do this on yourself. Ask yourself, “Are you right about everything?” Most people are going to say, “No.” So ask yourself, “Am I right about everything?” And if your answer is no, ask yourself, “What am I wrong about?” And if you’ve already admitted that you aren’t right about everything, that means you must be wrong about some things. And it should freak you out that—when I ask you, “What are you wrong about?”—you can’t tell me. That means there’s some reason why you don’t know what you’re wrong about. And there should be more questions building up. How would you go about learning what you’re wrong about? And how would you go about fixing that? And I would encourage you to ask those two questions yourself and see what happens next.

Charles: That was David McRaney, science journalist, podcaster, and author of several books exploring the psychology of reasoning, decision-making, and judgment. His most recent book, How Minds Change, was published last year.

We’re grateful to have David as a guest on The Decision Education Podcast. To learn more, check out his website, DavidMcRaney.com. He’s @DavidMcRaney on Instagram and LinkedIn. He’s on Twitter @notsmartblog, and his podcast is called You Are Not So Smart.

Published October 25, 2023

