David McRaney on the Science Behind Persuasion

Released Friday, 30th September 2022

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

This

0:02

is Masters in Business with Barry

0:04

Ritholtz on Bloomberg Radio.

0:09

This week on the podcast, I have an extra

0:11

special and fascinating guest.

0:14

His name is David McRaney and

0:16

he is a science journalist and author.

0:20

I first came to know David's

0:22

work through his blog and

0:25

book You Are Not So Smart, which was

0:27

a fun review of all of

0:29

the cognitive foibles and

0:31

behavioral errors we all make. But

0:34

it turns out that David was

0:37

looking at how people

0:39

change their minds, how you persuade people, and

0:42

he thought the answer was found

0:45

in all of these cognitive errors. And

0:47

if you could only alert people to

0:49

the mistakes they were making, whether

0:51

it be fact checks or just showing

0:53

them their biases uh

0:55

and the heuristics they use and the rules of

0:57

thumb they use that were wrong, they

1:00

would come around and see the light. And

1:02

as it turns out, that approach is

1:04

all wrong, and his mea

1:07

culpa is essentially this book

1:09

How Minds Change. It turns out that

1:11

persuading people about

1:13

their fundamental beliefs involves

1:16

a very very specific set of steps,

1:19

starting with they have to want to change.

1:22

They have to be willing to change, which

1:24

only occurs when people come

1:27

to the realization that they

1:29

believe something for perhaps reasons

1:31

that aren't very good, and

1:34

it's a process, it's an exploration.

1:36

It's fascinating the people

1:39

he's met with and discussed, whether

1:41

it's deep canvassing or street epistemology,

1:44

or some of the other methodologies that are used

1:46

to persuade people that some of

1:49

their really controversial political

1:51

beliefs are wrong. He's met with

1:53

various people, everything from

1:55

flat Earthers to anti-vaxxers

1:58

to the folks who have left the

2:00

Westboro Baptist Church, a

2:02

pretty notorious and controversial institution.

2:06

I found this conversation really to be

2:09

tremendous and fascinating, and

2:11

I think you will also. With no

2:13

further ado, my interview with

2:15

David McRaney. Well, I've been a fan

2:17

of your work and I thought when this book

2:20

came out it was a great opportunity to sit

2:22

down and have a conversation with you. Before

2:24

we get to the book, let's talk a little bit about your background.

2:27

You started as a reporter covering everything

2:29

from Hurricane Katrina, test

2:31

rockets for NASA, Halfway

2:33

Home for homeless people with HIV.

2:36

What led you to becoming focused

2:39

on behavior and psychology?

2:42

Well, I thought that's what I was gonna do for a living. I was. I went

2:44

to school to university to study

2:47

psychology. I thought I would be a therapist, got

2:50

that degree. But then as I was doing that, uh,

2:52

there was a sign up on campus that said

2:55

"Opinionated?" in big Helvetica font, and

2:57

I was like, yeah, I am that. What is

2:59

that? And said, you know, come down to the offices

3:01

of the student newspaper. I went down there and said, how does

3:03

this work? They said, just email us stuff.

3:06

You have an opinions piece you want to do. I'm like m and

3:08

I I wrote a really like uh

3:11

sophomore thing about Starbucks on

3:13

campus because it was just about to come into campus. And I

3:15

wrote that and wrote a couple of things. And then there

3:18

was a study that had just recently come out, and who knows if

3:20

it's replicated or stood the test of time, but it

3:22

was when your favorite sports team

3:24

loses, men's sperm counts go down.

3:27

And I thought, our team

3:29

at our school had lost every

3:32

single game that year so far? What

3:34

does that mean for the future progeny

3:37

alumni? That's right? And I thought it would be a

3:39

great headline that would be funny. And the

3:41

headline I wrote was, you know, evidence

3:43

suggests the sperm counts reach record lows on campus.

3:46

And uh one of my professors

3:49

laughed about it and asked the whole class if they had read

3:51

it, but they didn't know that I was in the class. And I was like, oh, wait, this could

3:53

be fun So I switched to journalism,

3:55

and you know, went all the way through the student

3:57

paper and then went into print journalism and TV journalism.

4:00

But once I reached a certain point in

4:02

that world, I wasn't able to write anymore. I was

4:04

doing editing and helping other people, and I just really

4:06

wanted to write something. And it just so happened blogs

4:08

were becoming very popular at that time. Uh

4:11

my dad says, and uh others

4:13

that were like, oh, that's way later. I'm thinking

4:16

back to uh Yahoo's

4:18

GeoCities and that world.

4:20

I mean, I'm the OG when it comes to blogging,

4:23

go way way back. I just happened to be

4:25

there when they blew up to the point of, like, they got book

4:27

deals, and I started a blog called You Are Not So

4:29

Smart about all the cognitive biases and fallacies

4:31

and heuristics that I really enjoyed, and

4:33

I wrote a piece about brand loyalty that went

4:36

viral and the rest is history.

4:38

I was asked to write a book about it, and then

4:41

I was like, oh, I will continue playing in

4:43

this world. But I started. I started the podcast

4:45

to promote the second book because the first book did

4:47

so well. They said, do another really quickly, and I did You

4:49

Are Now Less Dumb. And

4:52

I just so happened to start a podcast right when

4:54

podcasts were becoming a thing. I sent an email to Marc

4:56

Maron because he had the number one podcast. I said, how do you do this?

4:58

And he actually sent me an email with a point-by-point

5:01

list, each with links

5:03

to Amazon items and no

5:05

kidding, and he was very nice. And I got

5:08

all this stuff and started it up, and that

5:10

has now become the centerpiece, because that's

5:12

uh, I was there when it got going. My

5:14

pitch for this podcast was WTF

5:17

meets Charlie Rose, and nobody

5:19

knew what WTF was. I

5:22

mean, they didn't know the acronym, nor did they

5:24

know the podcast. Because you know, you have

5:26

to be a little bit of a comedy junkie

5:28

to have found that in the early days.

5:31

Later on it was ubiquitous. So

5:33

sticking with journalism when you

5:35

were still writing, you seem to have covered

5:37

some really unusual and interesting

5:40

stories. Tell us about one of the more surprising

5:43

things that you covered. I always wanted to do

5:45

feature pieces. That was the world that I loved, and that was always

5:47

in journalism school and you know, uh,

5:49

Frank Sinatra Has a Cold, The Electric Kool-Aid Acid

5:52

test. I just wanted to write features. I wanted to be

5:54

there in person and and like tell

5:56

you explore humanity from the inside out

5:58

my way. And the Halfway

6:00

Home for HIV positive men

6:03

for homeless people in the Deep South. That was a

6:05

real turning point for me because there's uh. I

6:07

had to spend about three weeks on that story, and I visited

6:09

all the different people, went to all the different meetings, and

6:12

homelessness is very invisible in the Deep South. They

6:14

often live, uh in the woods.

6:16

You know, they're hidden from us, and there's

6:19

a lot of people in the Deep South who don't think there is a homeless problem.

6:21

And that was a really interesting way to break that story

6:23

into the public.

6:25

You know, consciousness of no, no, there's a problem

6:27

here. It's just hidden from you in a very particular way,

6:30

and a lot of people aren't even aware there are organizations

6:32

that deal with that, and that really showed

6:35

me this is the world I want to be in and this is the kind of stuff I want to

6:37

do. So I'm picking up a theme in

6:39

both your writing columns and books,

6:41

which is there's a problem you don't

6:43

know about it. It's hidden, and

6:46

here it is. That's the whole thing, Like hidden

6:48

worlds are it for me? Like I grew up in a trailer

6:50

in the woods in the Deep South, and as an only

6:52

child, I was always searching for

6:55

the others. I didn't know how I was going to get there, and

6:57

once I got out and a hand was extended into this space,

7:00

it's all I want to do. I call them tiramisu

7:02

moments because I remember the

7:04

first time I had tiramisu.

7:07

It was when I was still working

7:09

for a TV station. We had a little conference

7:11

where people in my position went and we went there

7:14

and we got tiramisu as a dessert and I remember

7:17

took a bite of it and I was like, oh my god, this is so

7:19

damn good. What what is this, and

7:21

everyone there was like, uh, it's

7:23

tiramisu. And I was like, oh yeah, yeah,

7:25

tiramisu, love the stuff. And

7:27

but that's it. That's what I'm pursuing now. I

7:30

want more of those things. I didn't know. I didn't know. You

7:32

know, that's really quite interesting. So I

7:35

guess it's kind of natural that you evolve

7:37

towards behavior and cognitive

7:40

issues. I was going to ask you what led to it,

7:42

but it seems like that's something you've been driving

7:44

for your whole career. Yeah, it's a unity through humility.

7:46

It's it's we're all absolutely

7:50

stumbling and fumbling in the dark and pretending like

7:52

we know what we're up to, even here in these fantastic,

7:54

you know, bloomberg offices, like uh.

7:57

The thing I want to avoid is the sense that I've

7:59

got it all figured out. And there

8:01

are massive domains in psychology,

8:03

neuroscience, and other social sciences that just start

8:05

from that place and then investigate it.

8:08

And I find that when I discover these

8:10

things that we all share that should give

8:12

us pause, should cause us to

8:14

feel humility. I feel like I'm in

8:16

the right spot, and I want to like dig deeper into

8:18

those places and reveal them so we can all be on the same

8:20

page that way. So blind spots, unknown

8:24

unknowns, things that we are just clearly

8:26

clueless about, and the biases that are there. When

8:28

I started out, things like confirmation

8:31

bias wasn't, you know, as much at the tip

8:33

of the tongue as it is now and survivorship bias,

8:35

things like that. So I noticed in

8:37

this book nothing written about

8:39

Dunning-Kruger, nothing about Cialdini's

8:42

persuasion. Is that

8:44

a different approach to decision making

8:47

and psychology? Like or because

8:49

I always assume there would be a little bit of an overlap

8:51

there, I didn't want to retread anything

8:54

there. There's some foundational stuff that I do

8:56

talk about in the book that feel like you can never not talk

8:59

about, some which go back a century

9:01

and like the introspection illusion has to always

9:03

be uh talked about. We don't

9:05

know the antecedents to our thoughts, feelings, behaviors, but

9:07

we are very good at creating narratives to explain ourselves

9:10

to ourselves. And if you always

9:12

have to mention that in any book about this topic, as far

9:14

as I'm concerned. And so there's a little bit of that. But like

9:17

Dunning-Kruger and uh all

9:19

the other big heavy hitters, I definitely did not want

9:21

to write How to Win Friends and Influence People, Part

9:23

Two because I wanted to come from a very

9:26

different perspective on all this, and I didn't want

9:28

it to be a book specifically about persuasion,

9:30

because I don't even start talking about actual persuasion

9:32

techniques till about page two hundred. Like, I show

9:34

you people who are doing things that could be labeled as persuasion

9:36

techniques, but I don't get into the like the science of it till

9:38

later. You mentioned Dunning-Kruger.

9:41

I I just recently spent some time with

9:43

Old Dunning, Professor David Dunning

9:45

he um a former guest on the

9:47

show. I don't think he's that old. I think, yeah,

9:50

I say old in the chummy, pat-

9:52

you-on-the-back kind of way, right. Uh, I keep

9:54

asking him to come back on the show, but he's working on a new project

9:56

and a new book on Dunning-Kruger. Yeah. Yeah, because

9:58

you know, there have been all these people who want

10:00

to knock it down, and there

10:03

have been attempts, but none have really landed

10:05

a blow. So we helped him out, or he helped us

10:07

out. My good friend Joe Hanson has a YouTube

10:09

channel and uh it does explore different

10:11

science stuff. It's called Be Smart, and we

10:14

were talking about that recently. There was a story about

10:16

a flight where, uh, the pilot went unconscious

10:18

and they landed the airplane, but they got help from

10:20

the tower. And we were talking

10:22

about that and I was like, I feel like I

10:24

could land an airplane based off all my video game

10:27

experience, and Joe said

10:29

he thought he could too, and I said, this has got

10:31

to be Dunning-Kruger, right? And I said

10:33

it would be cool if you did a video

10:35

where you got into like one of those commercial

10:38

flight simulators and they just said,

10:41

yeah, try it, go ahead, land it, knock yourself out. And

10:43

so I got

10:45

him in touch with Dunning, and Dunning was like, I can't

10:47

wait to be part of this project. So

10:49

he did interviews back and forth with Dunning

10:51

before and after, and of course he gets in the simulator

10:54

and they hand him the controls and they say, okay, land it, and

10:56

of course he crashed it. And he crashed it three times.

10:58

That's impressive, you know. Even David

11:01

Dunning tells a wonderful story

11:03

about they never expected the

11:05

research paper Dunning Krueger

11:07

on metacognition to explode.

11:10

And he goes, I never thought about trademarking

11:12

it. He goes, go on, go on Amazon. You'll

11:15

see Dunning Krueger University shirts,

11:17

key chains, all sorts of stuff. He goes, there's

11:19

a million dollars there. I just had no

11:22

experience in that, and I got a little Dunning

11:24

Krugered. I did

11:26

not think about the

11:28

commercial side of it. So there's a quote

11:31

I want to share because it sets up

11:33

everything. Uh, and I'm I'm

11:35

sort of cheating. It's from towards the end of the

11:37

book. We do this because we are

11:39

social primates who gather information

11:41

in a biased manner for the purpose

11:43

of arguing for our individual

11:46

perspectives in a pooled

11:48

information environment within

11:50

a group that deliberates on shared

11:53

plans of action towards

11:55

a collective goal. Kind of sums

11:57

up everything we do, in a way.

12:00

That was a lot of work, with years of work

12:02

within that little paragraph.

12:04

A lot of that comes from something that's called the interactionist model.

12:07

Uh, there's sort of a peanut butter and chocolate that come up

12:09

in this book, because I've spent years

12:11

talking to people through you are not so smart, and I would

12:13

argue that we're flawed and irrational,

12:16

right, And that was there was a big pop psychology

12:18

movement for that about a decade

12:20

ago, things like predictably Irrational

12:23

and uh, even the work of Kahneman

12:25

and Tversky. Like, a lot of the interpretation

12:28

of that was like, oh, look how dumb we are, right, and

12:30

look how easily fooled we are, look how bad we are at probabilities.

12:33

And one of the incepting moments of this

12:35

book was I did a lecture and somebody came

12:38

up to me afterwards. Her father

12:40

had slipped into a conspiracy theory, and she asked, what

12:42

do I do about that? And I told

12:44

her, nothing. But

12:46

I felt gross saying it.

12:48

I felt like I was locking my keys in my car. I felt

12:50

like, I think I know enough

12:53

to tell you that, but I know I don't. And also I

12:55

don't want to be that pessimistic and cynical.

12:57

And at the same time, the attitudes and

12:59

norms on same-sex marriage in the United States had

13:01

flipped, like, very rapidly.

13:04

So those two things together,

13:06

I was like, I want to understand this better.

13:08

So I invited Hugo Mercier onto my podcast,

13:11

and he teamed up with Dan Sperber and they created

13:13

something called the interactionist model, which is a model

13:15

of reasoning. I only wanted to talk to them about, you know, changing minds,

13:17

arguing, and it opened up this whole world

13:19

and through them, I also met with Tom Stafford,

13:22

and there's the interactionist model, and

13:24

there's the truth wins scenario, and those are sort

13:26

of the peanut butter and chocolate of my come-up. And it's because instead

13:28

of looking at people as being flawed and irrational, it looks at us

13:30

as just biased and lazy, which is different. And

13:33

what you were just talking about, that paragraph, is about the

13:35

interactionist model, which is uh, A

13:38

lot of the research that went into all those

13:40

books from about a decade ago. They

13:43

were pulling from studies that were done on individuals

13:45

in isolation. And then when you pool all

13:47

of their conclusions together and you

13:49

treat people as a group of people

13:52

based off that research, we do look kind

13:54

of flawed, right, we do look very irrational.

13:57

But if you take that exact same research and you allow

13:59

people to deliberate in groups, you get much

14:01

different reactions, much different responses, and

14:05

that's been furthered by the work of Tom Stafford.

14:07

He's been taking some of the old stuff from those old

14:09

studies and putting them to groups

14:11

and even creating um social media

14:14

simulacrums that work like Twitter

14:16

and Facebook and stuff, but have a

14:18

totally different context, allows people to deliberate

14:20

and argue in different ways and you get much

14:23

different results. You get better results. A good

14:25

example of that is like, uh, you take something

14:27

from the Cognitive Reflection Test or something like

14:29

that. I'll make it real simple so we

14:31

don't have to, like, do any weird math in our heads.

14:33

Like you're running a race and you pass the person in second

14:35

place, what place are you in? And you know

14:37

the intuitive answer. You start trying to work it out in your

14:39

head. But the answer, if

14:41

you, like, lean back, is, well, I replaced the person in second

14:43

place. I'm in second place. But if you ask

14:45

people individually you get a pretty high response

14:48

rate where they get the wrong answer. But

14:50

if you take that exact same question and you pose

14:52

it to a group of people and I do some lectures

14:54

now and you say, Okay, I'm

14:57

gonna ask this question, keep the answer to yourself. Now,

15:00

does anyone here have the right answer? If you know you have the right answer,

15:02

raise your hand. Somebody raises their hand. I say, okay,

15:04

what's the answer. They give you the answer. Then you say,

15:06

explain your reasoning, and then they explain the reasoning. When

15:09

they give their answer, there will be a grumble in the crowd. When

15:11

they explain the reasoning behind it, the crowd goes,

15:13

Okay. Now, if you took everyone's

15:16

individual answer and pulled it together,

15:18

you'd be like, wow, this group got their

15:20

wrong answer. But if you allow that deliberation

15:23

moment to take place where I explain how reasoning

15:25

to you, you get a group of people who would go from

15:27

eight percent in correct correct. And

15:29

we really set up for that. And the interactionist model is

15:31

all about this the work of Humorocity and Dan Spur really

15:33

have a great book about this, called The Enigma of Reason.

15:36

It's a it's not a light read, it's really sort

15:38

of you know, academic, but it's great because

15:41

they found, looking through the old research and

15:43

their own new research, that we

15:45

have two cognitive systems, one for producing arguments,

15:48

one for evaluating arguments, and the one

15:50

that produces arguments does it very lazily

15:52

and very in a very biased manner. You can think of it like

15:55

you ask where do you want to go eat? And you know,

15:57

I have three or four people after a movie like hanging out

15:59

in the lobby like I wanna I

16:01

want to go here, I want to go here, I want to go here, and UH

16:04

they have biased reasons for that. One person says,

16:06

hey, let's go get sushi, and somebody's like, well,

16:09

my ex works there,

16:11

or someone says, I had sushi yesterday,

16:13

or I don't like sushi. You can't predict

16:16

what the counterarguments are going to be, so you

16:18

present your most biased and lazy argument

16:20

up front, and you let the deliberation

16:22

take place in the pooled UH evaluation

16:25

process. You offload the cognitive labor to that. We're

16:27

all familiar with doing that. Everyone has

16:29

their ideas, You trade back and forth, and we decide on the

16:31

group goal on the plan, which is what this evolved

16:34

to do. But we are also very familiar with the way that plays

16:36

out on the internet, which is not good, because a lot of the context

16:38

is removed and you don't get the same social

16:41

cues coming right, So you get like, let's say, my good

16:43

friend Alistair Croll, who runs conferences.

16:46

He put it to me like this. He's like, on the Internet, when

16:48

you say, uh, I want a grilled cheese

16:50

sandwich, Uh, it's not an argument

16:52

for for who wants grilled cheese sandwiches? Should we

16:54

get grilled cheese sandwiches? Does anyone else agree with me? On

16:57

the Internet, on most of the platforms we use today,

17:00

it's saying I want grilled cheese sandwiches.

17:02

Who wants to go with me to the grilled cheese sandwich room?

17:04

And so everyone who agrees with that position

17:06

and is already like, yeah, that's what I want, they

17:09

get pulled off into a community of people who

17:11

want this and then a whole new set of psychological

17:13

factors goes into play, which is all about being a social

17:15

primate and being in a community. So there's no

17:17

iteration, there's no debate, there's no

17:19

consensus forming as to

17:22

what the best solution to that problem

17:24

is. You just have some salient

17:27

issue and people form what

17:30

looks like madness or what looks like some sort of

17:32

nefarious thing going down. One of

17:34

the things that the Internet gives us is the ability

17:36

to group up very quickly, and we

17:38

are social primates. If we go into a group,

17:40

we start being worried about motivations

17:42

like, I want to be a good member of my group, I want to be

17:45

considered a trustworthy member of my group, and so on, and

17:48

you get a lot of weird stuff we see today that that

17:50

falls into the domain of being polarized

17:52

or being in a system where everyone is. If

17:54

you have a group of people who agree with you in your current position,

17:56

it's very difficult to be argued out of it, because you can always

17:58

fall back on them. And

18:01

so that that's some of the stuff that goes into that paragraph, and it gets

18:03

more complicated from there. But yeah, it's that was

18:05

very illuminating to me, and a lot of the new material

18:07

in this book relates back to it. Not that

18:09

the earlier books were wrong

18:12

or incorrect in any way, but I kind

18:14

of took this as a little bit of a mea culpa

18:17

in terms of, hey, I

18:19

was focusing on one area, but

18:21

really we need to focus on a broader

18:24

area in terms of not just

18:26

why we make these cognitive errors, but

18:29

how you can change somebody's mind who's

18:31

trapped in some heuristic

18:34

or other cognitive problem that

18:36

is leading them the wrong way. I did

18:39

not intend for this to be like some sort of marketing

18:41

phrase or trick, but it's the truth. But in

18:43

writing a book called How Minds Change, I changed

18:45

my mind on a lot of stuff that I was like depending

18:48

on for like my career, and I'm

18:50

happy to do that. It feels really great to be on the other side

18:52

of some of these things and see it more clearly and more, you

18:54

know, more dimensionality to it. So

18:57

let's talk a little bit about the blog

18:59

that led to the books that really put you

19:01

on the map. You are not so

19:03

smart? Um. I love the title of

19:05

this Why you have too many friends

19:07

on Facebook? Why your memory

19:09

is mostly fiction? And forty

19:12

six other ways you're deluding yourself?

19:14

Were there forty-six chapters? Was that

19:16

just a random number? No, that was exactly

19:18

how many things I explore in the book. Yeah,

19:21

that's that's great. So we already discussed what

19:23

led you to this area of research. Why

19:26

did you decide to go from blogging,

19:28

which is easy in short form,

19:30

to writing a book, which anyone

19:32

who has done it will tell you it can be a bit

19:35

of a slog It was. Here's how that happened.

19:37

I was just blogging away back in the early

19:39

days, and maybe I had a thousand people reading

19:41

my stuff. And that was back way

19:43

before medium and Twitter and any other way to get your stuff

19:45

out there. And when

19:47

did you launch You Are Not So Smart? I

19:53

got into an argument with two of my friends about

19:55

what was better, the PlayStation three or the Xbox

19:58

three sixty. We got so mad at

20:00

each other that it was like I might not be able to

20:02

like hang out with them. And this isn't

20:05

a political Trump versus Biden

20:07

debate. This is yeah, but it's

20:09

just the

20:11

same psychology. And I

20:14

couldn't get over, like, why would I get mad about this?

20:16

It's just a box of wires and

20:19

uh. And I,

20:22

since I had a background in psychology,

20:24

I went and I had access to the university

20:27

library. I was I just was like, there's got to be

20:29

some material about this. And I found a bunch of material

20:31

on brand loyalty and identification and

20:33

group identity, and I wrote a little blog

20:35

about it, but I framed it as Apple versus

20:37

PC, because those commercials were out right then. And

20:41

at that time the blog Gizmodo

20:43

had stolen the iPhone prototype.

20:46

I recall that, and like Steve Jobs,

20:48

and they didn't steal it. They found it in a

20:50

bar. Found they found it in a bar,

20:53

and uh, Steve Jobs sent them emails saying,

20:55

give me back my iPhone and they just they

20:58

just went for the hits and it went super

21:00

viral. And I just assumed they had like a Google alert

21:02

for stuff written about Apple stuff.

21:05

And I got an email that said, can we reblog your

21:07

blog post on this? And I was like, yeah,

21:10

sure, And I went from a thousand to two hundred

21:12

and fifty thousand people, and I was like, oh, I should write

21:14

a bunch of stuff like this. And so that week I just started

21:16

going like things in that sort

21:18

of area, and I wrote a lot more things about, like, learned

21:20

helplessness and other issues, and

21:23

I had an audience and it and it was maybe

21:26

four months later. An agent reached out who

21:28

had worked on Freakonomics and said, I think

21:30

this could be a book. And she's still my agent. I actually

21:32

met with her two days ago. If I'm in

21:34

town, I always try to meet with her, because she changed my life.

21:36

Alan Bar amazing human being. And

21:39

uh we turned it into a book, and

21:41

about half of it was already in blog form. I wrote the rest

21:43

of it for the book and that book

21:45

just really took off, like it's still even today.

21:48

It's in, like, nineteen different languages. Every

21:50

once in a while it's number one in a different country. It was recently

21:52

number one in Vietnam. Well that's how I

21:54

went from blog to book world. But then they

21:56

were like, hey, could you write another book? And I

21:58

said, I sure can, and uh,

22:01

I wanted to promote it. And at that time,

22:03

podcasting had just become a thing. I was listening to

22:05

Radio Lab and This American Life,

22:08

and uh, I was like you, I was listening to WTF

22:11

and I said, I can. I want to do something like that,

22:13

and I just started up the podcast

22:15

to promote it. And it just turned out that the podcast

22:18

was really where I could actually explore this stuff,

22:20

and I jumped into it. So so

22:22

there is a quote. I think this might

22:24

be from the back of the book. So I don't

22:26

know if these are your words or a blurb I'm

22:29

stealing, but quote. There's

22:31

a growing body of work coming out of

22:33

psychology and cognitive science

22:36

that says you have no clue why you act

22:38

the way you do, choose the things you choose,

22:41

or think the thoughts you think

22:44

That's the introspection illusion,

22:46

and it's been a real centerpiece of my work

22:48

for a long time. We don't have access

22:50

to the antecedents of our thoughts, feelings, and behaviors,

22:53

but we do have thoughts, feelings, and behaviors that require

22:56

some kind of explanation, and we

22:58

are very good at coming up with these post

23:00

hoc, ad hoc rationalizations

23:02

and justifications for what we're doing, and

23:05

those eventually become a narrative that we live

23:07

by. It becomes sort of the character we portray,

23:10

and we end up being an unreliable narrator in

23:12

the story of our own lives. And so the two

23:14

it's like a one two punch of you're unaware

23:16

of how unaware you are, and that leads

23:18

you to being the unreliable narrator in the story of your

23:20

life. And that's

23:22

fine, Like this is something that is adaptive

23:25

in most situations. But there's when we get into

23:27

some complex stuff like you know, politics,

23:29

running a business, designing an airplane.

23:32

You should know about some of these things because they'll

23:34

get you into some trouble that we never got into, you know,

23:36

a hundred thousand years ago. So a lot of

23:38

this is evolutionary baggage

23:40

that we carry forward. But you touched

23:43

on two of my favorite biases. One

23:45

is the narrative fallacy that we

23:47

create these stories to

23:49

explain what we're doing, as well as

23:52

hindsight bias, where after something

23:54

happens, of course we knew that was going to

23:57

happen, We saw it coming. Tell us

23:59

about those two biases. Well, the narrative fallacy.

24:01

I love this. My good friend Will Storr, who wrote, um,

24:04

a whole book about the enemies

24:06

of science. I love Will so much. And he

24:08

came out with a book not too long ago, The Science

24:11

of Storytelling, and uh, I

24:13

love that domain all the whole hero's

24:15

journey, the Joseph Campbell stuff.

24:18

Right. The science side of that is, most

24:21

storytelling takes place exactly along

24:23

the same lines as retrospection. So your retrospection,

24:25

looking back, prospection looking forward. We

24:27

tend to look back on our own lives. As you know, we're

24:29

the hero, were the protagonist, and

24:32

whatever we're looking at specifically, it's like, Okay, we

24:34

started out in this space, and then we went

24:36

on a exploratory journey, and then we eventually

24:38

came back. Yeah, eventually we came back around

24:40

with that new knowledge and applied it. Yeah,

24:43

yeah, you know that we have the the

24:45

synthesis and the antithesis

24:47

and all those things are how we kind of see

24:50

ourselves. It's how we make sense of our past, because

24:52

we couldn't remember everything we had, that would be horrible, so we

24:54

edit it to be useful

24:56

in that way. That's when when you're watching a movie or

24:58

reading a book and it doesn't seem to work for you, it's

25:00

because it's not really playing nice with that retrospective

25:03

system. But it's also our

25:05

personal narratives seemed to be very nice and tidy

25:07

in that way, and although

25:10

they never are. If you've ever told

25:12

a story about something with someone who was also there

25:14

and they're like, it didn't happen that way.

25:17

My wife says that all the time, I don't know

25:19

what what experience he had, but I

25:21

was there, None of that happened. That's right, And you, uh,

25:24

if without people to check you, what

25:26

does that say? It says that a whole lot of what you

25:28

believe is the story of your life is one of those

25:31

things that if we had a perfect diary of it or a

25:33

recording of it, or someone who was there who could challenge

25:35

you, it wasn't exactly the way you think it was.

25:37

Who was the professor after was it nine

25:39

eleven or some big events had

25:41

everybody write down their notes

25:44

as to what they saw, what they felt, what they were experiencing.

25:47

And then I guess these were freshmen and then by the

25:49

time they become seniors they circle back

25:51

and asked them, now it's three years later, and

25:55

not only do they misremember

25:57

it, but when shown their own notes, they

26:00

disagree with themselves. Yeah. Yeah, that's been repeated a few

26:02

times. I talk about it in How Minds Change. Robert

26:04

Burton did this experiment after the Challenger incident.

26:06

That was that was the big one, right, But the one

26:09

thing in that study that was the signal

26:11

above the noise. And yeah, that's the most

26:13

amazing part of it. They have

26:16

you write down what happened, or what you thought happened.

26:18

They also do it prospection-wise. I think they've

26:20

done it where you tell me what you think is going to happen, and

26:22

you put it into a manila envelope, and then the thing happens,

26:24

you know, whatever event takes place, and then

26:26

you ask people, what did you what did you predict?

26:29

Was going to happen, and they tell you, I predicted exactly what

26:31

happened. You take it out of the manila envelope and it's

26:33

not that and they're like, oh, come on, there's no way.

26:35

But even though that's my handwriting, I

26:37

never would have written that. And that's the weirdest thing in

26:40

the Challenger study, uh,

26:42

when he showed people that their

26:44

memory was absolutely not what they thought it was,

26:47

their first reaction was to say, you're tricking

26:50

me, like, like this is you

26:52

wrote this, like somebody else wrote this. And that

26:54

seems so similar to something called anosognosia.

26:57

And anosognosia is the denial of disorder.

27:00

And you can have like a lesion or a brain injury

27:02

that causes something that's wrong in your body.

27:05

But then on top of that, you have this other thing,

27:07

which is denial of the thing that's wrong with your body. So

27:09

I've seen cases where people have an arm

27:11

that doesn't function properly, and they'll

27:14

ask, like, why can't you lift your arm? Why can't

27:16

you pick up this pencil? And they'll say, oh, what do you

27:18

mean? I can pick that up. But what's going on

27:20

with this arm? Like that's my mom's arm. She's playing a

27:22

joke on me right now. Like the split brain patients

27:25

where they don't understand what they're

27:27

seeing and what they come up with. This is the greatest

27:29

example of what we've been discussing. If

27:31

you have someone who is what they

27:34

call a split-brain patient. The corpus callosum

27:36

connects the two hemispheres. A corpus

27:38

callosotomy is often performed when a

27:40

person has a certain kind of

27:43

seizures that they don't want to cascade. Um,

27:46

you end up with basically two brains,

27:48

and you can use a divider so

27:50

that one eye is going to one hemisphere

27:52

while the other is going to the other. You can show a person an image.

27:54

Let's say you show them a terrible car wreck,

27:57

mangled bodies, and they feel very sick. But

27:59

the portion of the brain you're showing that to is not the portion

28:01

that delivers language. So then you ask the person

28:03

who is feeling sick, why you feel sick right

28:05

now? What's going on? They say, Oh, I had something bad at lunch.

28:07

We will very quickly come up with a narrative

28:10

and our explanation for what we're experiencing, and

28:12

we do so believing that narrative, even

28:14

if that narrative is way far away from what's actually

28:16

taking place. So let's quickly run through

28:18

some of our favorite cognitive biases.

28:21

I'm going to be tested? I hope I remember this. Let's

28:23

go, um. Well, we'll start with an

28:25

easy one. Confirmation bias. Confirmation

28:27

bias. When people write about confirmation bias,

28:29

they usually get it pretty wrong. Uh, here's

28:31

the way it works.

28:34

It's a great way to put it. The

28:36

least sexy term in psychology

28:39

is the makes-sense stopping rule. You'd think they'd

28:41

come up with a better phrase, and that that just

28:43

means when I go looking for an explanation

28:45

of something, when it finally makes sense, I'll stop

28:48

looking for information. Confirmation

28:50

bias is what happens. Here's the way I prefer

28:52

to frame it. Let's say you're at a tent in the woods. You hear a weird

28:54

sound and you

28:57

think, oh, that might be a bear. I should go look. So

28:59

you have a negative affect in your body, you have an anxiety.

29:02

You go out looking for confirmation that that anxiety

29:04

is justified or reasonable, because

29:06

there's a social aspect to it at all times, because

29:08

we can't escape our social selves, and

29:11

so you're looking and maybe you don't find it, you know, maybe

29:13

you don't find evidence that points in that direction. Eventually

29:15

you you modify your behavior based off what

29:17

you see with your flashlight. If you

29:20

do that online, though, in an environment that's an

29:22

information rich environment. You have some sort of

29:24

anxiety and you're looking for justification that

29:26

that anxiety is uh is

29:29

reasonable. You'll find it.

29:31

You'll find something right, and that will confirm

29:33

that you that your search was was good

29:35

and justified and reasonable to other human beings. So

29:38

confirmation bias, very simply, is just: something

29:40

happens that doesn't make sense. You want to disambiguate

29:43

it. It's uncertain. You want to reach some level of certainty.

29:45

So you look for information that based off

29:47

your hunch, your hypothesis,

29:49

and then when you find information that seems to,

29:52

like, confirm your hunch, you stop looking, as if

29:55

you, like, did something, as

29:57

if you solved it. Why don't we as a species,

29:59

look for disconfirming information

30:01

just to validate? In most situations it's not

30:03

adaptive, like confirmation bias is actually the right

30:06

move in most situations, Like if you're looking for your keys,

30:08

you know, yeah, you don't go

30:10

looking for your keys on Mars. You go looking for

30:12

them in your kitchen, right, and like it's

30:14

the faster solution, and most of our most

30:17

of these biases go back to the adaptive

30:19

thing is the thing that costs the least calories and

30:21

and gets you to this solution as quickly as possible,

30:23

so you can go back to trying to find food not getting eaten.

30:26

And in this case, most of the time, most

30:28

of the time confirmation bias serves us well. It's in those

30:31

instances where it really doesn't serve us well that we

30:33

end up with things like you know, climate change or

30:35

what have you. What about ego depletion,

30:38

Oh, man, ego depletion is one of the things that boy that it

30:40

goes back and forth. Uh. The

30:43

original scientists are still like hardcore into

30:45

it. I love it. Uh. Whether or not ego

30:47

depletion is properly like

30:49

defined or categorized, the

30:51

phenomenon does exist. The actual mechanisms

30:54

of it aren't well understood. But when

30:56

you have been faced with a lot of cognitive

30:58

tasks, uh, you

31:01

start to have a hard time completing more

31:03

cognitive tasks in general,

31:06

as well as issues that require willpower

31:08

and discipline. Right, So the more you the more

31:10

you use willpower, the less willpower you have to

31:12

use. It's finite, not not

31:15

unending. And this is not well understood.

31:17

a lot of the like, here's why this is happening,

31:19

explanations have failed to replicate. So we have this phenomenon,

31:22

but we still don't quite understand what mechanism is

31:24

underlying it. Well, let me do one last one.

31:26

The Benjamin Franklin effect. Ah, that's my favorite. The

31:29

Benjamin Franklin effect goes back to you know, a lot of my new

31:31

book is in this domain of justification

31:33

and rationalization. Um, Benjamin

31:35

Franklin had someone who was opposing him

31:38

at every turn. I call him a hater in the in the

31:40

previous book, back when that was a term. Yeah,

31:43

and uh, he just had this political opponent that

31:45

he, uh, he knew was going to cause him real

31:47

problems for the next thing he was going up for. And

31:50

uh, he also knew that this guy had

31:52

a really nice book collection. And everybody also

31:54

knew that Benjamin Franklin had a nice book collection,

31:57

and so he sent him a letter and said,

31:59

there's a book that I've always wanted to read but I can never find;

32:01

I hear you've got a copy of it. Now, who

32:03

knows; it seems from reading the literature that Benjamin

32:06

Franklin totally had this book. And

32:08

uh, but the guy gave him the book as a favor, and he

32:10

was like very honored that Benjamin Franklin asked for it.

32:12

I like to think of Benjamin Franklin just like put it on a shelf

32:15

and then wait, wait, waited a month, and then took

32:17

it back to him. Um, but he said,

32:19

thank you, I'm forever in your debt. You're

32:21

the best. And from that point forward, that guy never

32:23

said another negative thing about Benjamin Franklin. So what

32:26

that comes down to is I just observed my own behavior.

32:28

I did something that produced cognitive dissonance. I

32:30

have a negative attitude toward Benjamin Franklin, but I

32:32

did something that a person with a positive attitude would do. So

32:35

I must either think a strange thing

32:37

about who I am and what I'm doing, or I could just take

32:40

the easy route out and go, I like Benjamin Franklin. And

32:42

that's why I think they call that the Benjamin Franklin effect.

32:45

I find that really just fascinating

32:47

that there are two phrases that I made

32:49

a note of in one of the books

32:52

that I have to ask about extinction

32:54

burst, and I have to ask

32:56

what is wrong with catharsis?

32:59

The extinction burst is a real thing that I love.

33:02

Uh. I see that everywhere I see. I see that all

33:04

in society right now in many different ways.

33:06

The extinction burst is when

33:08

you have a behavior that has been reinforced

33:10

many many times, and you it's

33:13

your body even expects that you're going to perform this behavior,

33:16

and you start doing something like say dieting,

33:18

or you're trying to quit smoking, or

33:20

you're trying to just extinguish the behavior. Right

33:23

at the moment before it fully extinguishes, you

33:25

will have a little hissy fit. You'll have

33:28

a as they say back home, um,

33:30

you'll have a toddler outburst sort of

33:32

thing where you're all of your systems,

33:35

cognitive systems are saying, why don't we really

33:37

really try to do that thing again, because we're

33:39

about to lose it? And the

33:43

they call this an extinction burst. That moment of

33:45

like if you're watching it on a slope, it's sloping

33:47

down down, down, down, and there's a huge spike. And

33:49

that could either be the moment you go back to smoking or

33:52

relapse, or the finish.

33:54

It could be the death rattle. Depends on how you

33:57

deal with your extinction burst. I thought that was fascinating,

33:59

And then catharsis comes up. Why

34:02

is the concept of that cathartic

34:06

surrender or finishing the thing problematic?

34:09

It's related to the extinction burst.

34:11

For a while there, especially

34:13

in like nineteen fifties psychology, there was the idea

34:15

that like just get it out, you know, like like if you're

34:17

angry, go beat up a punching bag or yell

34:21

at people from the safety of your car. Yeah. There

34:23

used to be a thing in like the eighties scream therapy. Yeah,

34:25

I recall the

34:28

unfortunately primal scream. Yeah.

34:30

Unfortunately or fortunately. Uh, the

34:33

evidence, the evidence suggests that what

34:36

this does is reward you for the behavior, and

34:38

uh, you maintain that level of anger and

34:40

anxiety and frustrations. It's self rewarding,

34:43

yea. And so it's uh, there

34:46

are ways to have cathartic experiences,

34:48

but the ones where you reward yourself for being angry

34:50

tend to keep you angry. That makes a lot of

34:52

sense. And last question on you

34:54

are not so smart? Do we ever

34:57

really know things or do we just have

34:59

a feeling of knowing? Is

35:02

an unanswerable question, thankfully, Uh

35:05

from from you? No, No, I feel like I

35:07

feel like I know. That's uh. Here's

35:09

what's important to know about that: certainty

35:11

is an emotion. This is something that gets me in trouble.

35:13

I think in like rationalists and uh, you

35:16

know, circles. But I won't get you in trouble

35:18

here. Well, thank you, because like the the ideas

35:20

like facts not feelings or your you

35:22

know, let's not get emotional, let's not make

35:24

emotional appeals. Uh,

35:27

there's no dividing emotion from cognition.

35:29

Emotion is cognition, and certainty

35:31

is one of those things that lets you bridge the two because

35:34

certainty is the emergent property of networks

35:36

weighting something in one direction or another, and

35:38

you feel like, you know, if you want to do percentage wise,

35:40

you can feel it. Percentage-wise,

35:42

like if I ask you, did you

35:44

have eggs last week on Tuesday?

35:47

And you're like, I think I did, and like, well, like on

35:49

a scale from one to ten, like on a percentage

35:51

wise? On Saturday morning, I went to the diner,

35:54

I had eggs. So that feeling that you're

35:56

getting, there's something generating that

36:00

feeling, right? So the

36:02

feeling of knowing is something that's separate from

36:04

knowing, but as far as subjectively,

36:07

it's the exact same thing. We only get to see this objectively

36:09

in some way, especially in those, like, open up the manila envelope,

36:11

let's see what you actually said. So

36:14

this is a pet peeve of mine, because here

36:16

in finance there is this, for

36:18

lack of a better phrase, meme that

36:21

the markets hate uncertainty,

36:24

and whenever people are talking about what's going to happen

36:26

in the future, well, it's very uncertain,

36:28

to which I say,

36:30

well, the future is always inherently uncertain.

36:33

When things are going along fine and the market's

36:35

going up, we feel okay

36:37

with our uncertainty, so we can lie

36:40

to ourselves about it very very easily, exactly.

36:42

But when everything is terrible, the markets are

36:44

down, the Fed's raising rates, inflation,

36:46

Oh, the market hates uncertainty. Now

36:49

at the uncertainty level, you didn't know

36:51

the future before, you don't know the future now, but

36:53

you can no longer lie to yourself that you

36:56

have a sense of what's going on. This

36:58

is, by the way, a very outlier view,

37:00

because everybody loves the uncertainty line. Well,

37:02

I despise it. I'm happy to sit

37:05

here surrounded by all these people and take

37:07

the position of, uh, you're very wrong. Uh,

37:10

they are less smart.

37:12

There is no such thing as certainty. This

37:14

is, you know, from the scientific or psychological

37:17

even philosophical domain. Everything is probabilistic

37:19

and we can like hedge our bets, but the

37:22

concept of certainty is way outside the domain of

37:24

any of these topics. Yeah, and and well, we'll

37:26

talk about Bertrand Russell later. But it's

37:29

a quote from your book that always

37:31

makes me think, well, let's let's talk about

37:33

it now, because it's such an interesting observation

37:36

quote. The observer when he

37:38

seems to himself to be observing

37:41

a stone is really, if physics

37:43

is to be believed, observing the

37:45

effects of the stone upon himself. Right,

37:48

Isn't that awesome? That is right from

37:51

this book How Minds Change by

37:53

the man here. It's a good book. I

37:55

got that from interviewing the late Lee

37:58

Ross who created the term naive

38:00

realism. It's another phrase I love,

38:02

and this this is a way to kind of get

38:04

into naive realism. Naive realism is the assumption

38:06

that you're getting a sort of a video camera

38:08

view of the world through your eyeballs, and that

38:11

you're storing your memories in some sort of database

38:13

like a hard drive and uh,

38:16

and that when I ask your opinion on

38:18

say, uh, immigration or gun control,

38:20

that whatever you tell me came from you

38:22

went down to the bowels of your castle to your scrolls

38:25

and pulled out the scrolls by candle light and

38:27

read them all. And then one day came up from that and emerged

38:30

from the staircase and raised your finger and said, ha, this

38:32

is what I think about gun control.

38:35

And what's

38:37

invisible in the process or what becomes

38:39

invisible when we're tasked with explaining ourselves

38:41

is, uh, all the rationalization

38:44

and justification and all the interpretation

38:46

that you've done, and all the elaboration, these all these psychological

38:49

terms, and that you Uh.

38:51

This concept of naive realism is that you

38:54

see reality for what it is and other people

38:56

are mistaken when you get into moments of conflict

38:59

The thing that

39:01

Bertrand Russell said is so nice because he is

39:03

alluding to the fact that all reality's virtual

39:06

reality, that the subjective experience is

39:08

very limited, what the German psychologists

39:11

called an umwelt. The thing related

39:14

to naive realism that was so

39:16

surprising in the book. And we keep alluding

39:18

to evolution in various things. I

39:21

did not realize that the

39:23

optic nerve does not perceive

39:25

the world in three D. It's

39:27

only two dimensional. And

39:30

okay, so we have two eyes, so we

39:32

were able to create an illusion

39:34

of depth of a third dimension,

39:36

but the human eye does not see the

39:38

world in full three dimensions. Yeah. I just, while

39:41

I'm visiting New York, I spent time with

39:43

Pascal who's in the book, and he's the one

39:45

who, like, ran me through all this. That's

39:48

amazing, isn't it. It's uh, the retina.

39:50

I mean, you know, obviously at microscopic levels

39:52

it's three dimensional, but for the purpose of the vision,

39:54

it's a two dimensional sheet. And

39:56

so we create within

39:59

consciousness the third dimension. But it's an illusion,

40:01

just like every color is an illusion. It's a very realistic

40:04

illusion, but it's an illusion, right, And

40:06

and that's why paintings can look nice because

40:08

you play with the rules of illusions that

40:10

to create depth. And even

40:14

people who who gained vision late in

40:16

life, the

40:18

understanding depth and three dimensionality is something

40:20

that takes a lot of experience. You have to learn how to

40:22

do it. And they oftentimes they'll in experiments

40:25

with people who just gained vision late in life, they'll

40:27

like put a telephone and run like

40:29

far away from them, and they'll they'll try to reach out

40:32

to it. It's like thirty ft away because you have

40:34

to learn depth. That's something that we learned over time.

40:36

We did, you know, as children. You don't remember it,

40:38

you don't really think about it. So let's talk

40:40

about How Minds Change. And I want to start

40:43

by asking how did a flat

40:45

earther inspire this

40:47

book? They actually came in a little later

40:50

in the process. I was there

40:52

was a documentary on Netflix.

40:55

You may have seen it, Behind the Curve, uh,

40:58

and the producers of that were fans

41:00

of my podcast, and then they grabbed a couple

41:02

of my guests for the show and everything, and I thought it would

41:04

be you know, I would love to help promote something. I didn't know

41:06

this, but someone told me I was in the credits and

41:08

I looked in the credits it was like, you know David, thanks

41:11

to David McRaney, And I was like, oh wow. So I

41:13

emailed them and said, hey, you want to come

41:15

on my podcast, we'll talk about your documentary.

41:17

Because if I had gotten a chance to make a Netflix show, it would have been

41:19

very similar, because it seems like

41:21

it's about flat Earth, but it's actually about motivated

41:24

reasoning and identity

41:26

and community and things like that. And community

41:28

is a huge part of it. Group identity,

41:30

and um that

41:32

after that episode, uh, there's

41:35

a group in Sweden that puts on something like South by

41:37

Southwest called the gather Festival. They asked, Hey,

41:40

we got this crazy idea. What if you

41:42

go to Sweden and we'll get h Mark

41:44

Sargent, who was sort of the spokesperson for the

41:46

flat Earth community, and we'll put

41:48

you on stage. And I know you're writing a book about how minds

41:50

change, you can try to try out some of those techniques

41:53

on them. And I was like, oh, that sounds awesome, So um

41:56

I did. I went and I met Mark,

41:58

and uh, I found him very nice, a very

42:00

lovely man. And I did try something

42:02

at the point where where I met him. I was about halfway

42:04

through and that wasn't great at the techniques, but I did an

42:06

okay job. That's that's towards the

42:08

end of the book where you actually describe um.

42:11

He said it was one of the best conversations he

42:13

ever had. You don't call him an idiot,

42:15

you know challenges views. You're really

42:18

asking how did you come to

42:20

these sorts of perspectives

42:23

to get him to focus on his

42:25

own process. That's the whole idea of the techniques

42:27

I learned about in the book. Uh, we're writing

42:29

this book, I met many different organizations: deep canvassers,

42:31

street epistemologists, people who work in motivational

42:34

interviewing and therapeutic practices, UH,

42:36

professional negotiation and conflict resolution.

42:39

UH people work in those spaces. And

42:41

what really astounded me was when I would bring the

42:43

stuff that I was witnessing to scientists

42:47

or experts. They there was this underlying

42:49

literature that made sense. But none of these

42:51

groups had ever heard of this literature for

42:53

the most part, and they definitely hadn't heard of each other.

42:55

But they did a lot of A/B testing, thousands of conversations,

42:58

throwing away what didn't work, keeping what did, and they would

43:00

arrive at this is how you ought to do this, and

43:02

they would also be similar: they had it

43:05

in steps, and the steps would be in the same order.

43:08

And I sorted think of it like, you know, if you wanted to

43:10

build an airplane, the first airplane ever built, no

43:12

matter where it was built or who did it, it's gonna look

43:14

kind of like an airplane. It's gonna have wings, and

43:16

it's going to be lighter than because you're dealing

43:18

with the physics that you have to

43:20

contend with when it comes to the kind of conversation

43:23

dynamics that actually persuade people or

43:25

move people or illuminate them. They

43:27

have to work with the way brains make sense of the world and

43:29

all of the evolutionary path that pressures

43:32

all of that. And so these independent groups discovered

43:34

all that independent of each other and

43:36

of the science that supports them. And Mark Sargent,

43:38

like when I first met him, I shook his hand and said, look, I'm

43:41

not going to like make fun of you or anything. Goes Oh, that's fine,

43:43

make fun of me all you want. And he he took out his phone and showed

43:45

me the commercial he'd done for LifeLock, where he's

43:47

like, if I can do it, anybody can do it, and he's

43:50

totally okay with it. But that's not

43:52

what I did. And then when I sat down with him, one of the essences,

43:54

I know we'll get to it, but it's like, you don't want to face

43:56

off and I need to win, you need to lose. I'm

43:59

not even trying to debate you. What I want to do is

44:01

get shoulder to shoulder with you and say, isn't

44:03

interesting that we disagree? I wonder why you

44:05

want to partner up with me and try to investigate that mystery

44:07

together. And in so doing I opened up a space

44:10

to let him meta cognate and run through how

44:12

did I arrive at this? And that's what I did

44:14

with him on stage, and you know, we learned all sorts of things,

44:16

like he he used to be a ringer for a video

44:18

game company, so that so that's where his conspiratorial

44:21

stuff first came from. Oh so of course he

44:23

wasn't just a guy showing these contests

44:25

weren't fair. They and

44:27

it's always a name, they had

44:30

somebody skewing the outcome. Going

44:32

through his whole history, it was really clear how he got

44:34

motivated into this. But the thing that really kicked

44:36

in was, you know, flat Earth is a

44:39

pretty big group of people. They have conventions, they have dating

44:41

apps, and once he became

44:43

a spokesperson for it, and he's traveling around the world,

44:45

is going to Sweden, Like now he's not traveling

44:47

around the world, he's traveling up

44:49

across the surface. That's right.

44:52

He is traversing the geography

44:55

Cartesian plane of planet, that's

44:57

right. And so that that was a really the

44:59

sun flat also, that's always my question.

45:02

If the Earth is flat, is the Sun a sphere?

45:04

why would some celestial bodies be

45:07

There are schisms within the flat earth community.

45:09

There are many different models of flat Earth. The

45:11

one that Mark Sargent is part

45:14

of. They see the Earth that's sort of it's almost

45:16

like a snow globe. It's flat, but there's

45:18

a dome, there's a there's makes perfect

45:20

sense to and perfectly the Sun

45:22

and the Moon are celestial objects

45:25

that are orbs. And when you ask my good question

45:27

was like, okay, well then it seems manufactured.

45:29

Who made it? Gods or aliens? And he goes—and I remember

45:31

leaning in and saying it doesn't matter. Isn't

45:35

it the same thing? Well, you know,

45:37

the Greeks figured out five

45:39

thousand years ago that the Earth was round

45:41

by just looking at the shadow of the sun

45:44

cast at the same time in different

45:46

cities of different latitudes. But

45:49

five thousand years of progress, just put that

45:51

aside. Look, you wouldn't

45:53

believe the number of ways that has

45:56

been explained away in you know, flat

45:58

earth world; there are plenty of explanations

46:00

for why that's, you know, part of the

46:02

big conspiracy. My favorite part of

46:04

the flat earth community was flat

46:07

Earth meets Dunning-Kruger, with the guy who

46:09

built a rocket to go up

46:12

in order to prove that the Earth was flat.

46:14

We don't know what he saw because

46:17

he crashed and died. Do you recall

46:19

this? It was, like, three summers ago. I can tell—

46:22

I know exactly what the response would be: see,

46:24

they took him out. And

46:26

they took him out. So you mentioned

46:29

several different groups, The Street

46:31

Epistemology and the

46:34

deep canvassers were really fascinating

46:36

my book. Right, So a

46:38

quick background: a well-funded

46:41

group in California was trying

46:43

to convince people to support

46:46

the Marriage Equality Act, which

46:48

ultimately ends up failing in California

46:50

by three or four percent, and they had done

46:53

thousands of home visits, knocked

46:55

on the door, Hey, we want to talk to you about about

46:57

this act and why we think you should support it. And

47:00

the failure of that was a real

47:02

moment of clarity, and they said, we have to

47:05

rejigger everything we're doing because this is totally

47:07

ineffective. And the methodology

47:09

they came up with was standing shoulder to shoulder

47:11

and let's figure out why we think. Let's

47:14

explore why we think so differently. You

47:17

know, in politics and single issues,

47:19

if you can move somebody a tenth

47:21

of a percent, it's huge. Their

47:24

impact is a hundred times that—it's

47:26

ten percent. It's

47:29

astonishing. Tell us a little bit about

47:31

what this group does that's so

47:33

effective when they're supporting a specific

47:35

issue. Yeah, the background you gave exactly

47:38

what happened. They wanted to understand how they

47:40

lost, and they went door to door asking

47:42

They came up with this idea. This Dave Fleischer, who runs

47:44

the Leadership Lab at UCLA

47:47

or USC, and the LGBT

47:49

Center of Los Angeles, and they're extremely

47:51

well funded, millions of millions of dollars and the

47:53

biggest LGBT organization of its

47:55

kind in the world, and the Leadership Lab was their

47:57

political action wing. And as they were doing this canvassing

47:59

thing and they lost in Prop eight, he wanted

48:02

to know, how could that be because this seems

48:04

to be an area where we definitely wouldn't lose this. And

48:06

so he said, what if we just wouldn't ask people? And

48:08

so they did the exact same thing again. So this time they knock

48:10

on doors—they went to areas they knew

48:12

that they had lost in—and say, help us

48:15

understand, and if that somebody had voted against it,

48:17

they asked, why did you vote against it? And

48:20

they had these listening brigades; about fifty to seventy-

48:22

five people would go out and knock on doors, and

48:25

to their astonishment, people wanted to talk.

48:27

When they started asking, like, this is a non adversarial

48:30

thing, it's just hear them out. And

48:32

when they did that, these conversations would

48:35

go to forty minutes and they started

48:37

thinking, well, we need to record these, and they

48:39

started recording them, and somewhere along the way,

48:41

about three or four times people talk

48:43

themselves out of their position when you just stood there

48:45

and listened. You're not nudging

48:48

them, and you're not challenging them, You're just letting

48:50

them be heard. And so they wanted to know what

48:53

did we do there, what happened in that conversation

48:55

that led to that? So they started reviewing

48:57

that those specific conversations and taking

49:00

bits and pieces and testing out: was it this, was it

49:02

that, was it this, was it that. And eventually,

49:04

when I met them, they had done seventeen thousand

49:06

of these conversations and recorded on the video

49:09

and they had A/B tested their way to a technique

49:12

that was so powerful that while I went

49:14

there several times and went door to door with them

49:16

everything, but every time I went there would be scientists

49:18

there, there'd be activists from around the world there

49:20

because they were like, what have you done? What have

49:22

you discovered? And it's very

49:24

powerful. And over the course of writing the book, the research

49:27

was done a couple different times on them, and they found the

49:29

numbers you're talking about: ten or twelve percent

49:31

success, right, And the method

49:33

is very similar. You only really need to know two of

49:35

the steps, but it's about ten steps if you wanted

49:38

to do the full thing. The most important

49:40

aspect of this is non-judgmental listening

49:42

and holding space.

49:44

You're gonna hold space to let the other person explore

49:47

how they arrived at their current position. In

49:49

other words, you're going to help them

49:51

be very self reflective

49:54

and figure out their thought problems.

49:56

It's probably good to give you a foundation of what

49:59

motivated reasoning is right here. So you

50:01

know, when somebody's falling in love with someone and you ask them

50:03

like, why do you like them? Why are you going

50:05

to date this person? And they'll say something like the way

50:07

they talked, the way they walked, where they cut their food, the uh,

50:10

the music they're introducing me to. When that same

50:12

person is breaking up with that same person, you ask why

50:14

you're breaking up with them, they'll say things like, well,

50:17

the way they talk, the way they walked, where they cut their food, the

50:19

dumb music they made me listen to. So reasons for

50:21

will become reasons against when the motivation to search

50:24

for reasons that will rationalize

50:26

and justify your position changes.

50:28

As you've said all throughout our conversation, we're often are

50:30

very unaware of that and if someone comes along and

50:32

gives you an opportunity to self-reflect in a way where

50:34

you will go through your reasoning process, you will often

50:37

start to feel moments of dissonance

50:40

and question yourself. And as long

50:42

as the other party is allowing you

50:44

to save face. And it's just non judgment and listening.

50:46

That's a big component of this and their technique.

50:48

They'll open up and say, okay, we're talking about that that

50:50

same sex marriage or transgender bathroom laws or

50:53

something. They're very political organizations, so the sort of the topics

50:55

they cover, they'll ask a person this

50:57

is the biggest part of everything, and I

50:59

urge everyone who wants to try this out on yourself and other people:

51:01

You can just do it on a movie like the last movie that you watch.

51:04

Let's what's last movie you watched? The

51:06

Adam Project? Okay, the Adam Project.

51:09

Did you like it? Yeah? Ryan Reynolds

51:13

boom. It's so easy to say I liked it.

51:15

Okay. Now I ask on a scale

51:18

from zero to ten, like if you're a movie reviewer, what would

51:20

you give it? Six? Seven? Okay?

51:22

Why why does six feel like the right number? Um,

51:25

it's not a great movie. It's not The Godfather,

51:27

but it was entertaining and silly and fun.

51:29

You like The Godfather, you know that's a

51:31

ten. Yeah. What do you think The Godfather

51:34

has that this movie doesn't? It's much

51:36

more sophisticated. It tells a much

51:38

more interesting tale. The

51:41

characters are much more fleshed out. They're

51:43

more interesting. Um, the

51:45

violence is is gripping, whereas

51:48

the violence in this is sort of cartoony. So

51:50

we're gonna step out of that conversation when we come back to it.

51:52

But now what this is what I'm doing.

51:54

I'm listening to you. I'm not judging you, and

51:56

I'm giving you a chance to actually explore the

51:59

reasoning, and your values are

52:01

starting to come up and things that are unique to you and things

52:03

you like about the world are a lot of times

52:05

this is the first time a person has ever even experienced that.

52:08

And this is a moment for you to start to understand yourself

52:10

in a certain way. And a conversation about a political

52:12

issue you might start pulling in things about where

52:14

this actually started—you know, the first time you ever heard

52:16

about this thing—and whether it became, you see, received

52:18

wisdom or you're being influenced by others, and

52:21

then all that comes into it. It's very easy

52:23

for you to extract that emotion and tell me what

52:25

you felt. I liked it, I didn't like it. When I ask

52:27

you to rationalize and justify it for me and come up

52:29

and go through your own personal reasoning process,

52:31

not my reasoning process. This is a

52:33

unique experience for a lot of people. And then the other thing I can

52:35

do is say, you gave it a six—how

52:38

come not a four? You

52:41

know, under five I would think is something

52:43

I didn't especially like. You know,

52:45

I smiled and laughed throughout it, and it kept

52:47

me entertained for ninety minutes. That's

52:49

and my nephews, that's all I'm looking at, you see.

52:52

And so we're getting deeper and deeper

52:54

into the things that you look for in entertainment,

52:56

but we're talking about a political issue. This

52:58

comes out of motivational interviewing, and they weren't even

53:00

aware of this, the deep canvassing people. Therapists

53:03

who dealt with people who

53:05

would come in for, say, alcoholism or drug addiction,

53:08

and you know, they already are at a state

53:10

of ambivalence. They want to do it, and they

53:12

all and they don't want to do it. That's why they've come for help.

53:14

But a psychologist would often engage in

53:16

something called the righting reflex. They say, okay, well,

53:18

here's what you're doing wrong, here's what you need to do. And

53:22

you will feel something called reactance, which

53:24

is that unhand me, you fools, feeling that

53:26

I'm telling you what to think, I'm shaming

53:28

you. And when you push away from it, you

53:30

will start creating arguments to keep pursuing

53:33

the thing. And they

53:35

this was such a debacle that they developed

53:38

something called motivational interviewing, where I would start

53:40

trying to evoke from you counter-

53:42

arguments. And I can do that very simply with the scale, because

53:44

when I ask you why not a four, the

53:47

only thing you can really produce for me are your

53:49

reasons why you wouldn't go away from the six,

53:51

which is also kind of going towards a seven.
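To make the shape of that move easier to see, here is a minimal sketch of the rating-scale opener described above. The function name and wording are illustrative assumptions, not the canvassers' actual script; what matters is the order of the questions, especially the "why not lower?" prompt.

```python
# A minimal, illustrative sketch of the scaled-question opener described above.
# Wording and names are assumptions for demonstration only.

def scaled_opener(topic: str, rating: int, lower_anchor: int) -> list[str]:
    """Return the question sequence in the order described in the conversation."""
    return [
        f"On a scale from zero to ten, where are you at on {topic}?",
        f"Why does {rating} feel like the right number?",
        # Asking "why not lower?" evokes the person's own reasons *against*
        # moving down the scale -- which is also reasoning toward moving up.
        f"How come you didn't say {lower_anchor}?",
    ]

for question in scaled_opener("the ballot measure", rating=6, lower_anchor=4):
    print(question)
```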

53:54

And in a political discussion, that's how they'll

53:56

open it up. They'll say, we're talking about transgender

53:58

bathroom laws. Here's the position I'm talking about; it

54:01

is coming up for a vote. I'm wondering where you're at on that, on

54:03

a scale of zero to ten. They'll tell them, and

54:05

then they'll ask, why that number? And this is a moment.

54:08

We may stay there for twenty minutes, we go through

54:10

how you arrived at this number. And then

54:13

in that, the deep canvassers do something different from

54:15

the other groups. They ask the person

54:17

if they've had a personal experience with this issue.

54:20

And on the LGBT same

54:22

sex marriage issue, what seemed

54:24

to have come up time and again was hey,

54:27

is there anybody gay in your family? Do you want

54:29

them to find love? Do you want them to find happiness?

54:32

And suddenly when it becomes personal, the

54:34

political issue gets inverted. That's right. You start

54:37

really realizing how much of this is abstraction, how

54:39

much of this is received wisdom, how much of this is political

54:41

signaling, our group identity. Interestingly, and

54:44

not every time, but many times people who will have

54:46

a personal experience related to the issue, and that

54:48

personal experience will create massive

54:50

amounts of cognitive dissonance on the position I just gave

54:52

you. There's a phrase which I

54:55

was going to mention later, but I have to share

54:57

it: excruciating disequilibrium.

55:00

Is that how

55:02

you ultimately get to a point where either

55:05

somebody changes their perspective or or

55:07

something breaks. This is how we change our minds on

55:09

everything. Like, we're always changing our minds at all times;

55:11

everything is provisional until—

55:13

yeah, and we're totally not aware

55:16

of it most of the time. But this comes to the work of a lot of

55:18

psychologists, but I focus in on Piaget,

55:20

because there's two mechanisms, assimilation and accommodation.

55:23

Assimilation is uh when something's ambiguous

55:26

or uncertain you when you interpret in a way that says

55:28

basically, everything I thought and felt and believed

55:30

before, I still think, feel, and believe now. This

55:32

is just modified a little bit;

55:35

you assimilated it into your current model of reality.

55:37

Accommodation, on the other hand, is when there's so many anomalies

55:40

build up or this is so counterattitudinal

55:42

or counterfactual to what you currently have

55:44

in your model of reality—they called it a

55:46

schema—that you must accommodate it. You can think

55:48

of it like a child sees a dog for the first

55:50

time and they're like, what is that? You

55:53

say, it's a dog in their minds?

55:55

Something categorical, something like it's got four

55:57

legs, walks on four legs, it's not wearing

55:59

any clothes, it's furry, it has a

56:01

tail, it's non human dog.

56:04

And then if they see like a a orange

56:06

dog or a you know, a speckled

56:08

dog, they can just assimilate that

56:11

as different versions of the thing they already understand.

56:13

When they see a horse, they might point

56:15

at it and go big dog, and

56:18

they're really is an attempt to assimilate,

56:20

Like I'm interpreting it and

56:23

look, it's got four legs, it walks on four legs,

56:25

it's non human, it's not wearing clothes. What's

56:28

going on here? And he said, no, no no, no, that's not a dog, that's

56:30

a horse. This requires

56:32

an accommodation moment because you need to create a category

56:34

that both horses and dogs can fit within—an overarching

56:37

category. And we're doing that all the time.
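As a rough, toy illustration of those two mechanisms (a sketch of the idea, not anything from the book): assimilation fits a new observation into an existing category, while accommodation restructures the categories when the fit breaks. The category names and features below are made up for the example.

```python
# Toy sketch of assimilation vs. accommodation (illustrative only).
categories = {"dog": {"four legs", "furry", "tail", "non-human"}}

def classify(observation: set) -> str:
    # Assimilation: interpret the new thing as a version of something known.
    matches = [name for name, feats in categories.items() if observation >= feats]
    if not matches:
        return "unknown"
    # Prefer the most specific matching category.
    return max(matches, key=lambda name: len(categories[name]))

def correct(observation: set, actual_label: str) -> None:
    # Accommodation: the "no, that's a horse" moment forces a new category.
    if classify(observation) != actual_label:
        categories[actual_label] = set(observation)

horse = {"four legs", "furry", "tail", "non-human", "hooves", "huge"}
print(classify(horse))   # "dog" -- the child's "big dog" (assimilation)
correct(horse, "horse")  # schema restructured to accommodate the anomaly
print(classify(horse))   # now "horse"
```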

56:39

Like, there are moments where I

56:41

think of things like that have happened. Uh, politically,

56:43

I don't know how political you want to get, but let's think

56:45

about the insurrection, right? For

56:47

a lot of people, I have positive attitudes toward a certain

56:50

political persuasion, and people

56:52

within that positive attitude space did something

56:54

I don't like. So I have these two feelings: I

56:56

feel negatively and I feel positively about what has

56:58

happened. You could accommodate

57:01

and say, well, it looks like people

57:04

who share my political views sometimes do bad things

57:06

and I need to like have a more complex view of things.

57:09

Or you could assimilate, which is often

57:11

how we get into conspiratorial thinking, and say,

57:13

well, I'm looking at this. What if they didn't do that at all?

57:16

What if those were actors? What if those were people

57:18

who are pretending to be people that agree with

57:20

me? So how do you explain from that? Here's

57:22

the fascinating thing. There was widespread

57:26

disapproval, especially from

57:28

Republican leadership, and then

57:30

very quickly, within

57:32

a sixty days, maybe even less thirty

57:35

days, that faded and then

57:37

it was just a bunch of tourists passing through

57:39

UH through Congress. So was

57:41

it just strictly that sort of tribal

57:45

thing that we needed everybody

57:48

to manage? People just reverted

57:50

back to their tribalism because there was some

57:52

consensus for a brief period

57:54

and then it went straight back to partisan

57:57

politics. There was a long stretch,

57:59

and there always is where you're you're being pulled

58:01

in every direction. Uh, you know, I don't

58:03

want to make a blanket statement. Most people are pretty rational

58:05

about what happened there. But there's a certain portion

58:07

of the population that went very conspiratorial

58:10

with it, and there is a

58:12

deep crisis of how to

58:14

make sense of the world and where should I put my allegiances

58:17

and where my values expressed. And

58:19

what we would rather do is assimilate

58:22

if we can get away with it, because that allows us to maintain

58:24

our current model and move forward. And

58:26

if we can find an elite who says, no, it's

58:28

okay to think what you think. In fact, I agree with you.

58:31

If I can find peers who will who will support

58:33

me in that, if I can find groups having conversations

58:35

on the Internet who let me do this, I'll assimilate

58:38

and I'll stay within it, and as they say in psychology,

58:40

my social network will reassert its influence.

58:42

So one of the interesting things about the shift

58:45

in UH same sex marriage

58:47

opinion the US is how

58:49

sudden it was, and when we compare it to things

58:52

like abortion rights, Vietnam,

58:54

race voting, even marijuana, all

58:56

those things seem to have taken much

58:59

longer or why is that?

59:01

That was actually the first question I had. I thought that

59:03

that's what the book was going to be about. There's

59:05

a dozen different answers of that question.

59:08

That was sort of a confluence of psychological mechanisms.

59:10

The most influential part of his contact,

59:13

right. There's an idea in psychology

59:15

called pluralistic ignorance, where you know,

59:17

you ask a lot of people have will have a certain

59:19

feeling inside of them, attitude or value,

59:22

and they'll feel like they're the only person within their community

59:24

who has that feeling, and unless you

59:26

surface the norm in some way, they won't

59:28

be aware that there are so many other people who feel the same way

59:30

they feel. Surface the norm,

59:32

surface the norm, as they put it. When

59:34

I was asking political scientists about um

59:37

the shift and attitudes about same sex marriage, they

59:39

kept telling me this was the fastest recorded shift in public

59:42

opinion since we've been recording this since the twenties

59:44

and since then. There was an attitude shift

59:46

on COVID-nineteen, which I put in the book,

59:48

that was a little bit faster, but in this case

59:51

In which direction? Towards vaccination. Vaccination,

59:53

yeah, which is kind of interesting because there

59:56

was an anti-vaxxer movement pre-COVID

59:59

that was really kind fringe, and

1:00:01

I went to one of their conventions for the book, but it's not in the book.

1:00:03

It was part of the cut material, the Lancet article

1:00:05

on the MMR—

1:00:08

or RMR, I can never remember which—the measles,

1:00:10

rubella, mumps vaccine,

1:00:13

which was subsequently completely debunked.

1:00:15

But what ended up happening is that group

1:00:18

seems to gain a little bit of

1:00:20

momentum, the anti vaxers,

1:00:23

and yet even around the world

1:00:25

most countries are vaccinated,

1:00:29

most wealthy developed countries with access

1:00:31

to the vaccine. The US is a laggard:

1:00:34

fewer vaccinations,

1:00:37

fewer boosters, and the most per capita

1:00:39

deaths of any advanced

1:00:42

economy, which kind of raises

1:00:44

the question how much of an impact did

1:00:46

the anti-vaxxers have, even

1:00:48

though a lot of people eventually came around and got

1:00:50

the vaccine. The reason I like to talk about flat

1:00:52

earthers so much is because the same psychological mechanisms

1:00:55

are at play in everything else that we like to talk about. But

1:00:57

most people assume they would never

1:00:59

be a flat earther. Right. But you don't necessarily

1:01:02

get that uniformity

1:01:04

when it comes to things like same sex marriage or

1:01:06

any political issue, anything that

1:01:09

becomes politically charged. And

1:01:11

I use flat earthers so much because they're pretty

1:01:13

much neutral and people can feel like they have some distance

1:01:16

from it, and the mechanisms. You can see those mechanisms at play,

1:01:18

and then I can say, and that's also in this

1:01:20

and you can see how it works with

1:01:22

same sex marriage. It's almost impossible

1:01:25

to believe this as a person talking to a microphone

1:01:27

right now, in this modern moment, but it wasn't

1:01:30

very long ago that people argued

1:01:33

about this as vehemently as they argue about

1:01:35

like immigration and gun control and

1:01:37

everything else that's that's a wedge issue today.

1:01:39

And there were articles that would be

1:01:41

that would come out, but like this is something we'll

1:01:43

never get over, you shouldn't talk about this at Thanksgiving,

1:01:46

those kinds of things, right? And then over the

1:01:49

course of about twelve years, but very

1:01:51

rapidly, of course, with three or four years, we went from

1:01:54

more than sixty percent of the country opposed to sixty percent

1:01:56

of the country in favor, around

1:01:58

two thousand twelve-ish. And

1:02:00

it seemed like, how could this

1:02:02

possibly have happened? Where did it come from? And I wanted to understand

1:02:05

that too, because I thought if I could take most

1:02:07

of the country and put them in a time machine and

1:02:09

send them back a decade, would they argue with themselves

1:02:11

and what happened in between these two moments, and if they were going to

1:02:14

change their mind about this, what was preventing

1:02:16

them from changing their mind the whole time. One

1:02:18

answer to that is that a lot of things that have changed when

1:02:20

it comes to like social issues, people

1:02:23

were separated from one another in social contexts,

1:02:25

whereas with same sex marriage and other LGBTQ

1:02:28

issues, coming out was a very huge

1:02:30

part of that. Any movement that urged people to reveal

1:02:33

their identities and to live openly allowed

1:02:36

people the opportunity to go, well,

1:02:38

oh my god, I have a family member like this. I have a person

1:02:40

who I care about who's being affected by this issue.

1:02:42

I have people—my plumber, my

1:02:45

hairdresser, my brother—

1:02:47

this whole world. And

1:02:49

then that contact was part of

1:02:51

that, right, I think that is

1:02:53

the key to this being so

1:02:55

stealthy. Why nobody saw it Because

1:02:58

you go from I know

1:03:00

a guy who's gay, or I know a

1:03:02

woman who's gay too. I know lots

1:03:05

of people who are gay, and over that ensuing

1:03:08

decade and the decade before, at

1:03:10

least from my perspective, it felt

1:03:13

like lots of people both

1:03:15

private and public personas were

1:03:17

coming out as gay. And you know, you had

1:03:20

Ellen come out, which was a big deal, and

1:03:22

you had Will and Grace on TV. It

1:03:24

seemed like it was just, you know, the

1:03:26

momentum was building and it was an

1:03:28

exchange like that. And you talked about in the book

1:03:30

where the cascade is

1:03:33

waiting for the network to be ready.

1:03:35

That's exactly where I was headed

1:03:37

—thanks for kicking me over to it. The

1:03:39

culture is being influenced by the social change,

1:03:42

and then the social change in turn influences

1:03:44

the culture, and this back and forth is what creates

1:03:46

a staggered acceleration of

1:03:48

the social change. Right, But what's

1:03:51

deep within that is understood in network science as

1:03:53

cascades. And the best way I could

1:03:55

like quickly explain what a cascade is is, Uh,

1:03:57

have you ever been to a party and everything seems to be going

1:04:00

okay, and then all of a sudden everybody leaves and

1:04:02

you're like, what happened? Especially if you're the host. Or

1:04:04

have you ever like waited to get into a restaurant,

1:04:06

or if you remember back in a university

1:04:09

setting, you're waiting to get into a classroom

1:04:11

and uh, there's just a big line of people, and

1:04:13

then the door opens up and you could have

1:04:15

gone in at any time. These

1:04:18

are examples of cascades, up cascades and down cascades.

1:04:20

So in a school setting or a restaurant

1:04:22

setting, you're waiting in line. The first person that shows

1:04:25

up, they have an internal signal because

1:04:27

they have no information; the door's closed. So

1:04:29

maybe in the past they tried to go to a classroom

1:04:32

and they open the door and everybody turned and looked at them, and they felt

1:04:34

real weird about it. Maybe they just have a

1:04:36

certain kind of social anxiety. There are all sorts

1:04:38

of nature nurture things to give them an internal

1:04:40

signal that says I should wait and see what's

1:04:42

going on. So they take out their phone, they're playing with it.

1:04:44

The second person that shows up, they don't just have

1:04:47

an internal signal. They have one human being who

1:04:49

seems to be waiting and maybe they know something I don't,

1:04:51

So whatever internal signal they have is magnified

1:04:54

by that. They start to wait. Once

1:04:56

you show up at a door and there are two people waiting and

1:04:58

you don't, you're you're pretty

1:05:00

sure you're gonna wait too. Once there are three people waiting

1:05:02

at a door, it's almost inevitable you're gonna

1:05:04

get a line of people waiting because they assume they're part of

1:05:07

something, and everybody knows something they don't, and

1:05:09

you're now in a cascade. The only thing that will break

1:05:11

the cascade is new information entering the system.

1:05:13

The door opens up and like the professor's like, why are

1:05:15

you waiting? Or somebody who looks at their watch

1:05:17

and it's like, I figure we should have been

1:05:19

in here by now. Or you

1:05:21

could have a real rabble-rouser, or you could have a subversive

1:05:23

element. Somebody who's a punk. They have a low threshold

1:05:25

for conformity, you know, they're they're like, I don't care what people

1:05:27

think of me. I'm gonna open the door, and that person

1:05:29

will lead everybody in. So what you end up

1:05:32

with here are thresholds of conformity. Uh.

1:05:34

Some people need only a few people around

1:05:36

them to do something before they do it. Some people need a lot. And

1:05:38

any population is going to have a large

1:05:41

mix of people who have different thresholds of conformity.

1:05:43

If you think of it like an old chemistry

1:05:46

molecule with like balls and sticks connect

1:05:48

to it, each person is a node and each

1:05:50

node has a different threshold of conformity,

1:05:52

and that threshold of conformity is influenced by how many

1:05:54

people they know, so how many sticks are connected to balls

1:05:57

around them, and you end up

1:05:59

with clumps and clusters of people who have different

1:06:01

thresholds as a cluster. Let's

1:06:03

say you're at a party and somebody

1:06:05

wants to go because, you know, they're tired of being

1:06:07

there, they have work in the morning or whatever. But there are other

1:06:09

people in the group who are like I would like to go, but

1:06:12

like, I can't just be the first person that leaves. So

1:06:14

the person who has a reason to leave, or they just

1:06:17

don't care what other people think they leave the party.

1:06:19

That that encourages the next group of people who needed

1:06:21

one more person to back them up to leave.

1:06:23

Now, there are people who actually did want to stay

1:06:26

at the party, but hey, if everybody

1:06:28

else is leaving, their threshold of conformity just got met—

1:06:30

they're like, I should probably go. And then now you

1:06:32

have the people who are like, really, we're gonna stay all night,

1:06:34

and, I guess I'm the last person here, and either they spend

1:06:36

the night on your couch or they leave and you're like, oh my god, what happened

1:06:39

to my party? This is cascades.
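The threshold mechanism he's describing can be sketched in a few lines of code. This is a toy simulation in the spirit of the classic threshold-cascade models from network science, not anything taken from the book; the tiny "line of people at a door" network and all names are made-up assumptions for illustration.

```python
# Toy threshold-cascade sketch: each person adopts once enough of their
# neighbors have adopted (their personal threshold of conformity).

def simulate_cascade(thresholds, seeds, neighbors):
    """Spread adoption until no one else crosses their threshold."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, thresh in thresholds.items():
            if node in adopted or not neighbors[node]:
                continue
            frac = sum(n in adopted for n in neighbors[node]) / len(neighbors[node])
            if frac >= thresh:  # enough people around me are already doing it
                adopted.add(node)
                changed = True
    return adopted

# Five people in a line at a door; "a" is the low-threshold rabble-rouser.
neighbors = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c", "e"], "e": ["d"]}
thresholds = {"a": 0.0, "b": 0.5, "c": 0.5, "d": 0.5, "e": 0.5}
print(simulate_cascade(thresholds, seeds={"a"}, neighbors=neighbors))
# -> {'a', 'b', 'c', 'd', 'e'}: one low-threshold person tips the whole chain
```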

1:06:41

It's a very fascinating part of human

1:06:43

psychology because we're talking about massive groups

1:06:45

of people, and you have a nation of people, you'll have

1:06:48

massive clusters of people that will

1:06:50

have different thresholds, and you'll often have one

1:06:53

in that group that they call a percolating

1:06:55

local cluster. Anyone listening, who's

1:06:57

in this world? I hope you're happy that I found

1:06:59

your stuff. This stuff was totally unfamiliar

1:07:01

to me. The stuff goes into like diffusion

1:07:03

science and people studying how rocks sink

1:07:05

and floating local

1:07:08

climate. So here's the best thing, uh,

1:07:11

I've ever seen to explain this. You're

1:07:14

driving down—this is Duncan Watts.

1:07:16

Yes, Duncan Watts, the great

1:07:19

sociologist—Everything Is Obvious. He

1:07:21

is. He gave this example to me and

1:07:24

thank him forever for it. You can imagine

1:07:26

a road that people are driving down in the middle

1:07:28

of a forest. There's uh, someone

1:07:30

who smokes a cigarette on the way to wherever they're

1:07:32

going, and they throw a cigarette out pretty much every time at

1:07:35

a certain spot in the forest. And they've been doing this for years and

1:07:37

nothing ever happens, and then one day,

1:07:39

uh, they tossed that cigarette out and it causes

1:07:42

a seven county fire. Now, if

1:07:45

you look at this from a sort of a great Man theory of history,

1:07:47

or you're looking for people who are innovators, if

1:07:49

you look at the whole old tipping point models and

1:07:52

things like that, you're looking for the mavens and the connectors

1:07:54

and everything. Well, it turns out the science doesn't really support

1:07:56

that very well. Has nothing to do with any

1:07:58

individual being more connected or more powerful

1:08:00

or more savvy than anybody else. What it has to do with

1:08:03

is the susceptibility of the system to anybody

1:08:05

throwing out a cigarette, meaning how dry

1:08:07

or drought-stricken that system happens to be,

1:08:10

what's going on with dry leaves,

1:08:12

just the vulnerability of that. The

1:08:15

vulnerability—that's exactly how they're

1:08:17

phrasing it. The vulnerability of that particular aspect

1:08:20

of the network at that particular moment was quite vulnerable

1:08:22

to any nudge, any impact

1:08:24

any strike. And the thing that

1:08:26

really struck me about his example

1:08:28

was it could have been a cigarette he tossed out, it

1:08:31

could have been a lightning bolt, it could have been a nuclear

1:08:33

bomb. It didn't matter how powerful it

1:08:35

was, and it didn't matter how connected the person was. You don't

1:08:37

think of it in terms of connection. In the science

1:08:40

of connectivity and everything, it doesn't matter—what matters is that

1:08:42

the cluster was vulnerable at that point. And

1:08:44

any complex system is going to be like the surface

1:08:46

of the ocean. There are is constantly

1:08:49

moving around. So if you think of that molecule

1:08:51

model of human connectivity, it's constantly morphing

1:08:53

and changing as people their relationships

1:08:55

change and they move from one group to another. So

1:08:58

the point that's vulnerable is always

1:09:00

moving. So how do you affect

1:09:02

great change like same sex marriage

1:09:04

or any other social issue that we've seen in the past. You

1:09:07

have to strike at the system relentlessly,

1:09:10

and if you're an individual, you need to get

1:09:12

as many people in your group to

1:09:14

strike together, and because eventually

1:09:17

you're going to be the lit match in the dry forest.

1:09:19

That's the idea, and you have to let luck

1:09:22

be a big part of it, because you're trying to find the percolating

1:09:24

local cluster that will create the cascade

1:09:26

that will cascade all along the network

1:09:28

because different thresholds of conformity are

1:09:30

moving in and out of the networks that you're affecting. If you

1:09:33

look through any of the history of people

1:09:35

who have effected great social change, especially in the

1:09:37

history of the United States, they

1:09:39

had figured out some system by which to get

1:09:42

a lot of people together to strike at the system

1:09:44

relentlessly, and they were indefatigable.

1:09:46

And that was the most important aspect of the whole thing.

1:09:49

And there are all sorts of other ways to nudge and move around,

1:09:51

but that seems to be the essence of it. And that

1:09:54

example from COVID

1:09:56

nineteen—that was the fastest social

1:09:58

change ever recorded—they used this. What

1:10:01

they did is—it was people who were very

1:10:04

hesitant to get vaccinated; this was in the UK.

1:10:06

People in certain religious communities were very hesitant

1:10:09

because of their past with the you know, the government

1:10:11

of the UK, and they didn't want to necessarily allow

1:10:13

these government entities they didn't understand very

1:10:15

well to take a needle and put something they didn't understand very

1:10:17

well in their bodies. So organizations

1:10:20

got together with mosques and said

1:10:23

here's the sites where we'll have the vaccinations. And they

1:10:26

they got the elites within that religious

1:10:28

community to be the first to get vaccinated.

1:10:30

And so what you end up with is you have this wave

1:10:32

effect of the least hesitant

1:10:35

among the most hesitant. So these are people with a

1:10:37

threshold of conformity where they will go, well, all

1:10:39

I need is one person I trust to do this, they

1:10:42

get vaccinated. Well, that's a new wave of

1:10:44

people who are vaccinated. So that next level of

1:10:46

hesitancy says, well, this number of people

1:10:48

that I trust have been vaccinated, I'll get vaccinated.

1:10:50

So now you have that next level of hesitancy, and, as

1:10:53

they say, they told two friends. And

1:10:55

you eventually wave your way through the cascades

1:10:58

so that when you get to that middle hump that's very hard

1:11:00

to get over, you have so many people vaccinated around

1:11:02

you, it seems kind of weird that you wouldn't be And

1:11:04

it's okay—some of the holdouts may

1:11:06

take forever, like the last people to buy the fax machine

1:11:08

or whatever, but they're in a world full of people

1:11:11

that already have, and that's what we're aiming for. And so there

1:11:13

are ways to catalyze the cascade

1:11:15

effects, but you have to you have to think of it in terms

1:11:18

of the diffusion model, which in this regard

1:11:20

is not that old-fashioned

1:11:23

early-adopter-versus-holdout model. It's

1:11:25

waves of conformity via thresholds of

1:11:27

conformity, where you want to build up by

1:11:29

saying this group influences this group together,

1:11:32

they become a new unit, and so on and so on. Quite

1:11:34

quite intriguing. So let's

1:11:36

talk a little bit about this

1:11:39

evolutionary baggage that we have. It

1:11:41

seems that so much

1:11:44

of our decision making UH

1:11:47

is affected by mechanisms

1:11:49

and processes which worked

1:11:52

great on the savannah, but in

1:11:54

a modern world don't

1:11:56

really seem to help us and sometimes hurt us.

1:11:58

Yeah, yeah, the I

1:12:00

mean, that's been a big part of all of my work of the all

1:12:03

these things are adaptive, as that's the phrase you want to use,

1:12:05

like, in all things being equal,

1:12:07

this is probably the best thing to do. But

1:12:10

we get in certain situations where that are unique

1:12:12

to modern life, and it turns out that it can get us in trouble.

1:12:15

So that's the the baggage you're talking

1:12:17

about. Is one of those things where most of the time it

1:12:19

serves us well, but in very specific situations

1:12:21

it goes the other way. Huh. Really

1:12:24

intriguing there. There's there

1:12:26

is some specific evolutionary

1:12:29

or adaptive issues

1:12:31

that come up. Why

1:12:33

do humans argue?

1:12:36

And why is that really a social

1:12:38

dynamic that we all do

1:12:41

when we all evolved to do? I? You

1:12:43

know, this is one of my favorite things that

1:12:45

that change the way I see the world. In

1:12:47

researching the book The Um,

1:12:49

a lot of this also goes back to the interactionist model

1:12:52

that Mercier and Sperber helped put together.

1:12:54

UM, why would we argue? Well,

1:12:57

the human beings have this

1:12:59

nice, uh, complex and dense

1:13:01

communication system that eventually became

1:13:03

language, and we depend

1:13:06

very much on the signals from other units

1:13:08

in our social network to help us understand

1:13:10

what's going on, to make plans, to

1:13:12

settle on goals, shared

1:13:15

goals, to decide to just do stuff, and

1:13:17

so we do a lot of deliberating and arguing

1:13:19

that space. The problem is,

1:13:22

uh, imagine it like, UM, there

1:13:24

are three people, three proto humans are on a

1:13:26

hill and they're all looking in different directions,

1:13:29

and none of them can

1:13:31

see what the other two can see, so you

1:13:34

would benefit from some sort of worldview

1:13:36

that is the combination of all three perspectives.

1:13:39

So if I do trust

1:13:42

these people, I know them pretty well, and we're

1:13:44

talking about going to a certain place in the in the forest

1:13:47

together or something. One persons

1:13:49

for it, one persons against it, I'll know

1:13:51

that the person who's for it is young. It's

1:13:54

the first time out. They don't know much about the world.

1:13:56

They're eager to show what they can do. That's

1:13:58

why they're like that. The other one is

1:14:00

hesitant. They were in a bear attack two years ago, and

1:14:02

they're, I don't know, maybe

1:14:04

a little over-scared. So

1:14:07

I have a pretty good idea of how to modulate

1:14:09

my trust when it comes to the deliberation

1:14:11

process. The more

1:14:13

people involved in that process, the more complex it gets,

1:14:15

and the more I have to worry about. People

1:14:17

could be misleading me. They could

1:14:20

be wrong through no fault

1:14:22

of their own, or they could be purposely

1:14:24

misleading me because they want to get an advantage

1:14:26

over me. So um they use

1:14:28

the phrase we have a built in epistemic

1:14:30

vigilance for when people might be misleading

1:14:32

us. And that serves

1:14:34

us well too. The only problem is that can lead

1:14:37

to something they call a trust bottleneck. And

1:14:39

a trust bottleneck is when someone

1:14:41

does actually in our group come up with a very innovative

1:14:44

idea. Maybe it's a some

1:14:46

sort of invention. They've created some sort of new way of

1:14:48

doing something. They have an idea about going to

1:14:50

a new territory where they're there's good things for

1:14:52

us to go do there. But it's there's a

1:14:54

risk and reward in it and this, but this

1:14:56

person really is right. If we get into

1:14:59

an argumentation process that's too epistemically

1:15:01

vigilant, then we will end up not doing

1:15:03

the thing that could benefit the group, And so we have this

1:15:05

trust bottleneck that could cause groups

1:15:08

to stagnate. So we developed

1:15:10

another evolutionary mechanism

1:15:12

to get past trust bottlenecks, and that is arguing

1:15:14

itself. The argumentation process

1:15:17

is how we get through the trust bottleneck created by epistemic

1:15:19

vigilance and go ahead. Ye, So I was gonna

1:15:21

ask, why are we so good at picking

1:15:23

other people's arguments apart and

1:15:25

so terrible at objectively

1:15:28

evaluating our own? There's—uh,

1:15:31

this reminds me of something in psychology called the Solomon paradox.

1:15:33

I think it's in business too, where we're

1:15:36

really good at giving out advice, it's very hard for us

1:15:38

to actually employ it in our own lives. Like, when

1:15:40

somebody has a problem, they tell you and you're like,

1:15:42

here's what you ought to do, But then when you have that exact same problem,

1:15:45

you don't do that thing. Um. There's

1:15:47

some really cool research recently where they have people put on VR

1:15:49

headsets and they walk into a room

1:15:51

in virtual reality and see, uh, Freud

1:15:53

sitting there and uh Freud

1:15:56

says, tell me about your problems, and they sit down and they explained

1:15:58

the problem they're they're having, and then they

1:16:00

run the second time, but the second time, you are

1:16:02

Freud and uh

1:16:05

you see yourself walk in. It's all been recorded.

1:16:07

They even have an avatar with your face and you

1:16:09

hear the audio of yourself

1:16:11

telling you, as Freud, what your problems are. And

1:16:14

they have around a sixty eight percent success

1:16:16

rate of the person having a breakthrough, Oh

1:16:18

I see what I ought to do now that they couldn't do on their

1:16:20

own. They need to get into this dynamic that we're talking

1:16:22

about, meaning looking at it

1:16:24

through a different path. They had to get to that

1:16:27

evaluation phase. So um,

1:16:29

we have two cognitive mechanisms to really simplify

1:16:32

this. One for the production of arguments,

1:16:34

for the production of justifications

1:16:36

and rationalizations, reasons

1:16:39

why we are doing something. It's important

1:16:41

that in psychology reason is not the big-R Reason

1:16:43

of philosophy with propositional logic and

1:16:45

all that. It's just coming up with reasons

1:16:47

for what you think totally rationalization,

1:16:50

rationalization and justification and in some cases

1:16:52

just explanation and why

1:16:55

why do we do this? Well, the interactionist

1:16:58

model says it's because we're always imagining the

1:17:00

audience that's going to be receiving the information. That's

1:17:02

why you in your shower you're thinking of

1:17:04

how you're gonna really stick it to that person

1:17:06

on Reddit that you've been arguing with all day? Right? Why?

1:17:09

Because that's the that's how that's

1:17:11

how we produce reasons. But we also do it alone. Like if

1:17:13

I'm imagining I want to buy something on Amazon or

1:17:15

I want to take a trip somewhere, You'll start rationalizing

1:17:18

and justifying it to yourself. And when you when

1:17:20

you want a piece of cake, you will come up with a

1:17:22

justification for getting the cake, right? Like, I didn't eat

1:17:24

anything today, or I exercised yesterday,

1:17:26

or whatever it is you need to do. You want to do it, but

1:17:28

you need a justification for it. There's

1:17:31

a humongous body of evidence that we don't even

1:17:33

make the decisions that are best. We only make the decision that's

1:17:35

easiest to justify. And um,

1:17:37

Mercier and Sperber did all these great experiments,

1:17:40

and in one of them, they had

1:17:42

people solve these

1:17:44

word problems, and then they would mix

1:17:46

the answers up and have people evaluate

1:17:48

other people who have been looking at the word problem. But

1:17:50

of course the trick is when one of the answers is

1:17:52

their own, and they would find that when people

1:17:55

were thinking that they were evaluating other people's

1:17:58

arguments, they'd find the holes in

1:18:00

their own like thinking and their own reasoning,

1:18:02

But if they thought they were looking at their own arguments, they'd

1:18:04

usually miss it. So it's an effective

1:18:07

trick. Maybe trick is the wrong

1:18:09

word, but it's an effective technique to

1:18:11

get people to objectively self

1:18:14

analyze: to make them believe they're

1:18:16

criticizing someone else's argument. Right. So

1:18:19

what seems to be the function

1:18:21

here. Why this is so adaptive is

1:18:23

that—and it doesn't

1:18:26

even need to be a group selection process; it's

1:18:28

just simply how the math works out. If

1:18:31

you have a lot of different people with a lot of different

1:18:33

experiences, and they have a lot of different value

1:18:35

sets, and they have a lot of different skill sets, and you're

1:18:37

facing a problem, you're trying to come up with a solution to it,

1:18:39

or you have a goal you want to reach it, you

1:18:41

will be much more effective as a group if everybody

1:18:44

presents their biased individual

1:18:46

perspective and they

1:18:48

don't put a lot of cognitive effort into the production

1:18:50

of it, make it easy, cheap, and

1:18:52

biased. Then you offload the cognitive

1:18:55

labor to that evaluation process—that Twelve

1:18:57

Angry Men experience—where everyone

1:18:59

looks at each other's arguments and goes, okay, this

1:19:01

that, this, that, this that, and over time

1:19:03

that developed these two mechanisms that we have. Uh,

1:19:05

that's why, as individuals—and that's

1:19:08

one of the biggest problems on the Internet—we

1:19:10

do a lot of our deliberating these days

1:19:12

in contexts that incentivize

1:19:14

the production of arguments, but don't really give

1:19:17

us much opportunity to go through that evaluation

1:19:19

together. There's a phrase you had

1:19:21

in the book that that caught my eye. Debate

1:19:24

leads those who are wrong to change

1:19:26

their minds, and as a group, you want

1:19:28

to get to the best decision, the best outcome.

1:19:31

On the Internet, it's not as much

1:19:34

a real collaborative discussion

1:19:36

argument debate as it is just people yelling

1:19:39

past each other. Yeah, but it feels like, you

1:19:41

know, it looks like a real debate,

1:19:43

but it's not. Yeah, I feel like I'm doing that, you know,

1:19:45

I feel like I'm participating in some sort of

1:19:48

marketplace of ideas. It seems like

1:19:50

I'm doing that. But the way the

1:19:52

platforms are currently set up, for the most part,

1:19:54

it's just people yelling people like

1:19:56

writing on a piece of paper what they think, feel, and believe

1:19:58

in, dumping it onto a big pile, and then

1:20:00

other people running through the pile. And, like,

1:20:03

it's not like Twelve Angry Men. We're

1:20:05

not actually sitting in a circle and

1:20:07

—or, you know, it's not like a dinner party, where—

1:20:10

I'm sure you've had dinner parties or had guests

1:20:12

over who have really wildly different political

1:20:14

views than you, and you didn't, like, get into Twitter

1:20:16

mode with them. You talked it out in some

1:20:18

way. That is that that aspect is something

1:20:20

we've yet to tweak the system to allow us in certain

1:20:23

contexts. There was a very amusing cartoon.

1:20:25

I don't remember whose it was,

1:20:28

but the line was,

1:20:30

what did you do when the United States was overthrown

1:20:33

in the early twenty-first century? Oh,

1:20:35

I tweeted my disapproval. And

1:20:38

it's just, you know—what

1:20:40

is a hundred forty characters?

1:20:43

It's just it scrolls by

1:20:45

instantly. It's not really that

1:20:48

sort of engaged discussion. I don't mean

1:20:50

to be, like, you know—I don't mean to poo-poo

1:20:52

social media. It's great for what it is.

1:20:54

But it also is what it is. Like, it's

1:20:57

been a great tool for giving

1:20:59

voice to people who haven't been part of the conversation

1:21:01

in a long time. It's a great way to gauge

1:21:04

what are people thinking and feeling. But if we want

1:21:06

to do the deliberation thing, the argumentation

1:21:08

thing that moves things around, it's not so

1:21:10

great at that yet and the question

1:21:13

is, will it ever be? Um.

1:21:16

So you mentioned Twelve Angry Men.

1:21:18

This is a great line in your

1:21:20

book. All culture is

1:21:22

Twelve Angry Men at scale. Yeah,

1:21:25

go into some detail about that. You know, it

1:21:27

plays right off what we were just discussing. Like,

1:21:30

everything we've ever achieved of note as a species

1:21:32

came out of a lot of people disagreeing

1:21:35

and then, like, sorting it out. And,

1:21:37

um, we've been great at

1:21:39

creating some institutions that do

1:21:41

this on purpose, like science when it's

1:21:44

done well is a group of people debating

1:21:46

and arguing with each other and trying to tear each other's ideas apart. But

1:21:48

there's a good faith in science and

1:21:51

elsewhere that you may not get on

1:21:53

Reddit or Twitter. It's so crucial,

1:21:55

creating the rules of the game that we all play

1:21:57

by. And if

1:22:00

I meet you on the street or I meet

1:22:02

you on the internet, like we may not be in a good

1:22:04

faith environment where we're going to play by those rules.

1:22:07

That's why it was so nice to create these systems

1:22:09

of argumentation like law and

1:22:11

medicine and academia. And

1:22:14

most of the people that we... I'm very against

1:22:17

the Great Man theory of things, where you imagine

1:22:19

single inventors coming up with amazing insights

1:22:21

Like, no one ever does anything in isolation like that.

1:22:24

And a lot of even the people we've lauded throughout history,

1:22:26

they either had someone that they bounced ideas with

1:22:29

across and against, or they

1:22:32

collaborated with, or they

1:22:34

were absolutely assaulted over and over again by

1:22:36

people who disagreed with them, and they had to refine their arguments

1:22:38

in the presence of all of that. And

1:22:41

that's why I talk about culture being Twelve Angry Men at scale.

1:22:43

Like, once any society

1:22:46

finally figures out a way to institutionalize those things,

1:22:48

that's when you get those massive leaps, both

1:22:51

in the social domain and the political domain and the scientific

1:22:53

and technological domains. So,

1:22:55

given all of these things we've been talking

1:22:57

about, from tribalism

1:23:00

to identity, how do we get

1:23:02

people to actually change their mind?

1:23:04

What are the three key things people

1:23:07

need to have happen

1:23:09

to them in order to get a

1:23:11

major shift in their position? Well, you

1:23:13

know it would be difficult, I think, to pick just three things,

1:23:16

but I can think of a couple of things that fit in here. I

1:23:18

think one thing I want people to understand

1:23:20

is that all persuasion is self-persuasion.

1:23:23

Mostly, when it comes to changing people's minds,

1:23:25

what you're trying to do is alert them to the fact that they could

1:23:28

change their mind. So with a little bit of Socratic

1:23:30

process, you're guiding

1:23:33

them to something and if they're not willing,

1:23:35

then they're never going to change. And, you know,

1:23:38

we all talk a lot about how facts don't seem to work

1:23:40

so well. That's only because,

1:23:42

usually, when you start arguing with somebody over an issue,

1:23:45

you want to present them with something. You'll say, like, hey, read this book,

1:23:47

hey, watch this YouTube video, hey, go to this website,

1:23:50

and you're like, that should do it. But how's that?

1:23:52

Has that ever happened to you? Like, never has anyone

1:23:54

sent me a YouTube video and I've been like, oh, okay,

1:23:56

I never knew that, though. That changed

1:23:59

my mind, said nobody. And that's the idea

1:24:01

of it: there's a reasoning process.

1:24:03

There's a chain of processing involved in reasoning

1:24:05

where you are probably unaware that you went

1:24:07

through all this and it landed on a particular conclusion

1:24:09

because it made sense to you. It matched

1:24:11

your values and your attitudes and your beliefs on the matter,

1:24:14

and you have to afford the other person

1:24:16

the opportunity to go through that same process. You

1:24:18

can't meet them at the level of the conclusion, because what ends

1:24:21

up happening is you just start tossing these

1:24:23

facts that support your position at each

1:24:25

other instead of having a conversation

1:24:27

in which we're looking at the issue together. Right. So

1:24:29

that's one thing: you can't copy and paste

1:24:31

your reasoning into another person. And when you're trying

1:24:33

to argue just based off facts and links and stuff,

1:24:36

that's really what you're suggesting they ought to do. So

1:24:39

all persuasion is self-persuasion. I have to open up

1:24:41

a space for you to explore your own reasoning, and

1:24:43

I have to open up a space for you to entertain different

1:24:46

perspectives and to think about where your stuff comes from.

1:24:48

That's what we did earlier in the conversation. Secondly,

1:24:50

you have to recognize that we're

1:24:53

social creatures, so people are influenced

1:24:55

by the signaling and the expectations

1:24:57

of the people around them. If you say

1:25:00

anything to that person that could be interpreted as you ought

1:25:02

to be ashamed of what you think, feel, or believe, the conversation's

1:25:04

over at that point. No one is willing

1:25:07

to be ostracized. The great sociologist

1:25:09

Brooke Harrington told me that if there were an E equals MC squared

1:25:11

of social science, it would be this: the fear of social

1:25:14

death is greater than the fear of

1:25:16

physical death. That's literally a quote I have written down

1:25:18

because I thought it was so, so good. And

1:25:21

she ran me through a hundred examples where this is true, from

1:25:23

war to excommunication, go

1:25:26

down the list. Social

1:25:28

death is actual death in most of

1:25:30

history. And I don't care who you are, what kind of profession

1:25:32

you're in. You're worried about what other people around

1:25:35

you and in that profession think about you, and you're modulating

1:25:37

your behavior to go with it, and you're modulating your beliefs, attitudes,

1:25:39

and values. And when

1:25:41

it comes down to it, if the situation

1:25:43

requires it, you'll put your reputation

1:25:46

on the lifeboat and you will let your body

1:25:48

sink to the bottom of the ocean if that's the situation you're

1:25:50

put in. And dueling and all

1:25:52

those things we do,

1:25:55

I talk all about it in the book. Dueling lasted a long time,

1:25:57

was really peculiar, but it was just this system of

1:25:59

control. And if I'm

1:26:01

trying to discuss an issue with you and I've

1:26:04

put you in that state of mind, you

1:26:06

know, what you're gonna do is react.

1:26:08

You're gonna push back against me. Then I'm

1:26:10

going to feel that feeling. I'm going to push back against

1:26:12

you, and then you push back harder, I push back harder, and

1:26:14

we end up in that stupid phrase of well,

1:26:17

let's agree to disagree. Well, we already agreed to disagree.

1:26:19

That's why we sat down here, right? What we're really saying

1:26:21

is, stop talking to me. And that's what that is:

1:26:24

a nice way of agreeing to stop arguing

1:26:26

with the data. We're agreeing to never actually

1:26:28

advance this issue and never talk to each other again. So

1:26:31

never open up the conversation with anything

1:26:34

that could be interpreted as you ought to be ashamed, even

1:26:36

if they should be ashamed of what they're feeling and thinking.

1:26:38

If you're hoping to persuade them, you have to not do that.

1:26:41

And so be aware that they're a social

1:26:43

primate. You're a social primate. Never try to copy

1:26:45

and paste your reasoning into the other person. And

1:26:48

the most important part is that you have

1:26:50

to get out of the debate frame.

1:26:53

Don't create a dynamic where

1:26:55

I want to win, I want you to lose. I want to show that I'm right

1:26:57

and you're wrong. This

1:26:59

is the most crucial

1:27:02

thing. If you take nothing else away from this, take this: think

1:27:04

of it like, hmm, I find you a reasonable,

1:27:07

rational, interesting human being, and it's odd that I

1:27:09

disagree with you on this. I wonder why

1:27:11

I disagree with you. Our disagreement

1:27:14

is a mystery. What if we teamed up to solve

1:27:16

a mystery together: why do we disagree? And

1:27:18

now we're taking all these things that are adaptive

1:27:21

and using them in a way that could actually get us

1:27:23

further along and settle it. And what might actually

1:27:25

happen is we both realize we're both wrong.

1:27:28

We get to Venn diagram ourselves.

1:27:30

So you go from facing off to shoulder to shoulder,

1:27:32

and there are many different ways to go about

1:27:34

it, but once you get in that dynamic, you're much more

1:27:36

likely to persuade each other

1:27:38

of something and move the attitude of the room.

1:27:41

Quite fascinating. So let's jump to our

1:27:43

speed round. I'm going to ask all

1:27:45

these questions in thirty seconds or less.

1:27:47

I'm gonna do my best. These

1:27:49

are what we ask all of our guests, starting

1:27:52

with what are you streaming

1:27:54

or listening to? Tell us what Netflix,

1:27:57

Amazon Prime, or podcasts have kept

1:27:59

you entertained the past couple of years? Cool, very quickly:

1:28:01

My favorite podcast has always been and still is

1:28:03

Decoder Ring. I recommend it to everybody.

1:28:05

I love it. Willa Paskin is amazing. Best

1:28:07

show I've streamed recently is definitely Severance.

1:28:09

Everybody should have seen Severance by now. Also

1:28:12

The Rehearsal. You can see the kind of stuff that I like to watch.

1:28:15

Someone just recommended The Rehearsal and said

1:28:17

it reminded them of Severance

1:28:19

and how out there it is. Yeah, watch that. And then

1:28:21

like, I'm among those people that believe video

1:28:23

games are the highest form of art. You should definitely

1:28:26

play Death Stranding. And

1:28:28

I replayed BioShock recently because I interviewed Douglas

1:28:30

Rushkoff and we were talking about BioShock, and it

1:28:32

still holds up. Who are some of your mentors

1:28:34

who helped you develop your

1:28:37

view of psychology and cognitive

1:28:39

issues and, you know, persuasion?

1:28:42

Gene Edwards, the first psychology

1:28:44

professor that took me aside and said,

1:28:46

let's be friends and really talk about this. I owe a lot

1:28:49

to her. You know, of people who I've met in real

1:28:51

life, however, James Burke is the most

1:28:53

influential person in my life. I loved his

1:28:55

show from years ago. I think it was BBC,

1:28:57

The Day the Universe Changed? The Day the Universe Changed, and

1:28:59

Connections. And Connections. Only

1:29:02

for people listening to this: I worked with Johnson

1:29:04

and, uh, James Burke all

1:29:06

throughout COVID to develop a new Connections series,

1:29:09

really, and I can't say anything else

1:29:11

about it, but it will be coming out within the next year. Very

1:29:13

exciting. I love his stuff. What are

1:29:15

some of your favorite books? And what are you reading

1:29:17

right now? Let me say, as far as authors,

1:29:20

I love John Jeremiah Sullivan,

1:29:22

Charlie LeDuff, Michael

1:29:25

Perry, Larry Brown. All of these are

1:29:27

either people who are in Southern Gothic literature

1:29:30

or the Southern Gothic literature version of journalism.

1:29:32

I can't get enough of that stuff. Our last two questions,

1:29:34

what sort of advice would you give to a recent

1:29:36

college grad who is interested

1:29:39

in a career in either journalism

1:29:41

or psychology or anything related

1:29:44

to your field? I'll give

1:29:46

you two solid pieces of advice that aren't

1:29:48

just high-minded, 'that sounds nice, and they can

1:29:50

put it on a postcard' things. This is what you gotta do. Number

1:29:52

one: email the people that you admire

1:29:54

or the people you'd like to interview. I had about

1:29:56

a seventy percent success rate when

1:29:59

I was first starting out. People will at least email

1:30:01

you back and say I can't talk, but you'd be surprised

1:30:03

how many people are willing to talk to you. Just do

1:30:05

that. And then, on the back end, make content

1:30:07

out of that and give it away for free until you build up

1:30:09

an audience. We now live in an environment

1:30:12

we've been living in it for about twenty years now, where

1:30:14

the people who are going to offer their hand to get you on

1:30:16

stage, they care about whether or not you have an audience. Yet

1:30:18

you can build that audience without anybody's permission

1:30:21

right now, and you can do that by making content

1:30:23

on TikTok or YouTube, putting it out on Medium,

1:30:25

wherever you put your stuff. So do those two

1:30:27

things back to back: email the people you want, and

1:30:29

then make content out of those emails and give

1:30:32

it away for free until you have an audience. Develop your

1:30:34

voice. Love that idea. Final question,

1:30:37

what do you know about the world of psychology,

1:30:40

changing minds and persuasion today

1:30:42

that you wish you knew twenty or so

1:30:44

years ago when you were first getting started? No

1:30:47

one's unreachable. No one's unpersuadable.

1:30:50

There's no such thing. And I think of it

1:30:52

more like this: if you try to reach the moon with a ladder,

1:30:55

you'll fail, and if you assume

1:30:57

from that that the moon is unreachable, then you really learn

1:30:59

nothing, right? And that's what I actually had thought

1:31:01

for a long time. And it turns out the frustration I was feeling

1:31:04

toward other people should have been directed at myself for not

1:31:06

trying to understand, well, why is this

1:31:08

not working the way I thought it should work? The assumption

1:31:11

that they're stupid or they're misled or

1:31:13

they're nefarious in some way,

1:31:16

that was a real misconception

1:31:19

on my part, the misconception that people are

1:31:21

just absolutely unreachable and unpersuadable.

1:31:25

I have, through the work of this book, changed

1:31:27

my mind. Thank you, David, for being so generous

1:31:29

with your time. We have been speaking

1:31:32

with David McRaney, the award

1:31:34

winning science journalist and author

1:31:36

of the book How Minds Change

1:31:38

The Surprising Science of Belief,

1:31:41

Opinion, and Persuasion. If

1:31:44

you enjoyed this conversation, be sure and check

1:31:46

out any of our four hundred previous discussions

1:31:48

over the past eight years. You can

1:31:50

find those at iTunes, Spotify,

1:31:53

YouTube, wherever you feed your podcast

1:31:55

fix. You can sign up for my

1:31:58

daily reading list at ritholtz dot

1:32:00

com. Follow me on Twitter at ritholtz. I

1:32:03

would be remiss if I did not thank the crack team

1:32:05

that helps put these conversations together each

1:32:07

week. Justin Milner was my audio

1:32:09

engineer. Atika Valbrun

1:32:12

is my project manager. Paris

1:32:14

Wald is my producer. Sean

1:32:16

Russo is my head of research.

1:32:19

I'm Barry Ritholtz. You've been listening

1:32:21

to Masters in Business on Bloomberg

1:32:23

Radio.
