How You Can Predict The Future Better Than World-Famous Experts - The Art & Science of Risk with Dan Gardner

Released Thursday, 26th October 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:06

Welcome to the science of success with

0:08

your host,

0:09

Matt Bodnar. Welcome

0:12

to the science of success. I'm your host,

0:15

Matt Bodnar. I'm an entrepreneur and investor

0:17

in Nashville, Tennessee, and I'm obsessed with

0:19

the mindset of success and the psychology

0:22

of performance. I've read hundreds of books, conducted

0:25

countless hours of research and study, and

0:27

I'm going to take you on a journey into the human

0:29

mind and what makes peak performers tick

0:32

with a focus on always having our discussions rooted

0:34

in psychological research and scientific

0:36

fact, not opinion. In

0:38

this episode, we discuss the radical mismatch

0:40

between your intuitive sense of risk and

0:42

the actual risks you face. We look

0:45

at why most experts and forecasters

0:47

are less accurate than dart throwing monkeys.

0:50

We talk about how to simply and dramatically

0:52

reduce the risk of most of the major

0:55

dangers in your life. We explore the

0:57

results from the good judgment project, which

0:59

is a study of more than 20,000 forecasts.

1:02

We talk about what super forecasters

1:05

are, how they beat prediction markets, how

1:07

they beat intelligence

1:08

analysts with classified information and

1:10

software algorithms to make the best possible

1:13

forecasts and much more with

1:15

Dan Gardner.

1:17

The science of success continues to grow with

1:19

more than 650,000 downloads, listeners in over

1:22

a hundred countries, hitting number one New and Noteworthy

1:25

and more. A lot of our listeners are curious

1:27

how to organize and remember all this information.

1:29

I get listener emails all the time asking

1:31

me, Matt, how do you keep track of everything?

1:34

How do you keep track of these interviews, podcasts,

1:36

books that you read, studies that you read, all

1:38

this incredible information. I've developed

1:41

a system from reading hundreds of books, from doing

1:43

all this research, from interviewing these incredible experts,

1:45

and I put it all in a

1:47

free PDF that you can get. All

1:49

you have to do is text the word smarter,

1:51

that's S-M-A-R-T-E-R to the number 44222.

1:55

It's a free guide we created called how to

1:57

organize and remember everything.

3:23

started

4:00

to study psychology heavily and career

4:04

ever since. And it's really been an

4:06

interesting experience because when you change

4:08

your understanding of how people

4:11

think, how they perceive, how they decide,

4:14

you change your understanding of people generally.

4:17

And it was a real watershed

4:19

in my life. So what is risk

4:22

perception psychology? I'm really curious. Oh,

4:24

well, basically, it's a field of psychology

4:26

that goes back to the 1970s when,

4:29

as you may know, there was large

4:31

and growing controversy about the safety of nuclear

4:34

power. The nuclear engineers

4:36

would say, you know, look at our data, it's

4:39

okay, it's safe, don't worry about

4:41

it. And the public was worried about it regardless.

4:44

And it didn't matter how many numbers

4:46

they were shown, they got more and more worried. And

4:49

so that was the point at which psychologists

4:51

got involved to say, well, how do

4:53

people make these judgments about risk? If

4:55

they're not making it on the basis of available

4:57

data, how are they making these

4:59

judgments? Why are they so much more worried

5:02

than the nuclear engineers say they should be? And the

5:05

bottom line on that is that risk

5:07

perception is in large part

5:10

intuitive. It's felt. If

5:12

you feel that something

5:14

is a threat, you will take it seriously.

5:17

If you don't feel that, you

5:19

won't. And generally

5:22

speaking, that applies to

5:24

any risk. And sometimes

5:26

that works. Sometimes our intuitive

5:29

understanding of risk or intuitive sense of risk

5:31

is very accurate, and will keep us out of danger.

5:33

And sometimes it is horribly

5:35

inaccurate, and it will not help us

5:38

whatsoever. A simple example is after

5:40

9-11, of course, we also saw the jet

5:43

flying to the tower, we saw what

5:46

happened afterward, and all sorts of folks

5:48

became terrified of flying,

5:50

thinking that they would be the next victims of deadly

5:52

hijackings. But they still

5:54

had to get around. So what did they do? Well, they started

5:57

driving instead because that didn't

5:59

feel

9:59

to think a lot about their

10:02

thinking. Psychologists call that metacognition.

10:04

They think about their thinking. So they

10:07

tend to be the sorts of people who say, okay, this is what

10:09

I think, here's my conclusion, but

10:11

does it really make sense? Is

10:14

it really supported by evidence? Am

10:16

I looking at the evidence in an unbiased fashion?

10:18

Have I overlooked other possible

10:21

explanations? And as I say, when

10:23

you look at people with good judgment, you find that

10:25

they have that introspection in spades. My

10:27

favorite illustration of that is George Soros.

10:30

George Soros is, of course, controversial today

10:33

because of politics, but just forget that. Remember

10:35

that George Soros from the 1950s to the

10:38

1980s was an incredibly successful investor,

10:41

and particularly during the 1970s, that was impressive

10:43

because, of course, that was a terrible time to be an investor,

10:45

and yet he was very successful during that

10:47

time. And the interesting thing

10:50

is when George Soros was asked, you know,

10:52

George, why are you so good? And when you've

10:54

made billions and billions of dollars, you're perfectly

10:56

entitled to say it's because I'm smarter than all you

10:58

people. But he never said anything at

11:00

all like that. His answer was

11:03

always the same thing. He always

11:05

said, I am absolutely aware

11:08

that I am going to make mistakes.

11:10

And so I'm constantly looking at my

11:12

own thinking to try to find the mistakes

11:15

that I know must be there. And

11:17

as a result,

11:18

I catch and correct more of my mistakes

11:21

than does the other guy.

11:23

And so it's that sort of very intellectually

11:25

humble message, which he says is

11:27

the source of his success. And

11:30

frankly, I think you can, as I say, I think

11:32

you can find that sort of deep introspection

11:35

in every single person who has

11:37

demonstrable good judgment.

11:39

So on the topic of good

11:41

judgment, I think that's a good segue into

11:43

kind of the whole discussion about forecasting.

11:46

Let's start out, I'd love to hear the story

11:49

or kind of the analogy of monkeys throwing

11:51

darts. Yeah,

11:54

we call that the unfortunate punch line. My

11:56

co-author Philip Tetlock is a very eminent psychologist

11:59

who was until recently at the

11:59

University of California, Berkeley, and is now at the University

12:02

of Pennsylvania at the Wharton School of Business. And

12:04

Phil, back in the 1980s,

12:06

became interested in expert

12:09

political judgments. You know, you have very

12:11

smart people who are observing world affairs

12:13

and they say, okay, I think I understand it and I

12:15

think I know what's going to happen next. And

12:18

they make the forecast. And Phil decided,

12:20

well, are they any good? And

12:22

when you look at the available evidence,

12:24

what you quickly realize is that, well, lots

12:27

of people have lots of opinions about

12:29

expert forecasts. That's all they are.

12:31

They hadn't been properly

12:34

scientifically tested. And so Phil said

12:36

to himself, well, how should they

12:39

be tested? How can we do this? And

12:41

he developed a methodology for testing

12:43

the accuracy of expert forecasts. And

12:46

then he launched, what was at the time,

12:48

one of the biggest research programs on expert political

12:51

forecasting ever undertaken. He had

12:53

over 280 experts. You

12:55

know, people like economists, political scientists,

12:58

journalists, intelligence analysts. He

13:00

had those folks make a huge

13:03

number of predictions about geopolitical

13:05

events over many different time

13:08

frames. And then he waited for

13:10

time to pass so that he

13:12

could judge the accuracy of the forecast.

13:16

And then he brought together all the

13:18

data, crunched all the data and boiled it all

13:20

down. And there are vast numbers of findings

13:22

that came out of this enormous research, which was

13:24

published in a book called Expert Political Judgment

13:27

in 2005. And

13:29

one conclusion that came out of this research was that

13:31

the average expert was about as

13:33

accurate as random guessing, or if you

13:36

want to be pejorative, the average expert

13:38

was about as accurate as a dart throwing chimpanzee. And

13:40

some people really latched

13:43

on to that conclusion. They really enjoyed that.

13:45

These are the sorts of people who like to sneer at so-called

13:47

experts. And there

13:49

are other people who like to say that, you know,

13:52

it's impossible to predict the future. And

13:54

they always cite this as being evidence

13:56

of that demonstrably

13:58

fallacious conclusion. again

16:00

to tell them what is going

16:03

on, right, to make forecasts. And

16:06

that sort of expert, they like to keep their

16:08

analysis simple. They don't like to clutter

16:10

it up with a whole bunch of different perspectives and

16:12

information. And they like to push

16:14

the analysis until it delivers

16:17

a nice clear answer. And of course,

16:19

if you push the analysis until it delivers

16:22

a clear answer, more often than not you're

16:24

going to be very confident in your conclusion.

16:27

You're going to be more likely to say that something is

16:29

certain or that something is impossible.

16:32

The other type of expert is the fox. And

16:34

as the ancient Greek poet has it, the fox knows many

16:37

things. What that means in this context is the

16:39

fox doesn't have one big analytical idea.

16:42

The fox will use multiple analytical

16:44

ideas. You know, in this case, the fox may

16:46

use one idea. In another case, the fox

16:48

may use a different idea. And foxes are

16:51

also very comfortable with going

16:53

and consulting other views. So

16:56

here I have my analysis. I

16:58

come to a conclusion. But you

17:00

have an analysis. I want to hear your analysis. And

17:02

if you've got a different way of thinking, a different analysis,

17:05

a different method, then I definitely

17:07

want to hear that. And they want to hear from multiple

17:10

information sources. They want to hear

17:12

different perspectives. And they drag those

17:14

perspectives together and try

17:16

and make sense of all these disparate

17:19

sources of information and different perspectives. Now

17:21

if you do that, you will necessarily end

17:23

up with an analysis that is not

17:26

so elegant as the hedgehog's analysis.

17:28

It'll be complex. And it will

17:30

be uncertain, right? You'll probably end up

17:32

with more situations where you have, you know,

17:34

say you have seven factors that point in one direction

17:37

and five factors that point in another direction.

17:39

And then you'll say, well, you know, on balance,

17:41

I think it's maybe 65% it will

17:44

happen. So they'll be more likely to

17:46

say that sort of thing than they will

17:48

be to say it's certain to happen

17:50

or it's impossible, right? So

17:53

they end up being much less confident than

17:55

the hedgehogs. Well the conclusion

17:57

of Phil's research was that the hedgehogs

18:00

were disastrous when it came to making

18:03

accurate forecasts. As I said, they

18:05

were less accurate than the dart-throwing chimpanzee.

18:07

The Foxes had the style of thinking

18:10

that was more likely to produce an

18:12

accurate forecast. But, and here's

18:15

the punchline, the real punchline for

18:17

Phil's research is that he also

18:19

showed there was an inverse

18:21

correlation between fame and accuracy.

18:24

Meaning the more famous the expert was,

18:26

the less accurate his forecasting was.

18:29

Which sounds absolutely perverse when

18:31

you think about it because of course you would think that

18:33

the media would flock to the accurate forecaster

18:36

and ignore the inaccurate forecaster. But

18:39

in fact it makes perfect sense.

18:43

Creating visual content is an essential

18:45

part of what I do. But the creative process

18:47

hasn't always been easy. Ever

18:49

since I found Canva for Teams, it's

18:51

been easy to collaborate and design

18:54

with my team, which makes the whole process so

18:57

much more creative and fun. And that's

18:59

why I'm so excited that Canva

19:01

is once again the sponsor for this episode.

19:04

Canva for Teams is a design platform that

19:06

makes it easy for anyone to create stunning

19:09

content in any format, from social media

19:11

posts to videos, presentations and

19:13

websites. The endless templates and

19:15

premium fonts, photos, graphics and

19:18

videos add personality and edge

19:20

to my team's content. With features

19:23

designed for brand consistency, Canva

19:25

for Teams makes it easy to maintain your

19:27

aesthetic, add logos, fonts and

19:29

colors to anything you create. Canva

19:32

for Teams streamlines how we do

19:34

social media too here at the Science of Success. We

19:37

can plan, create and share

19:39

social media content directly to all

19:41

of our channels from one place and even

19:43

schedule posts ahead of time. Canva

19:45

for Teams has a video editor that's so

19:48

easy to use with tons of filters, animations,

19:50

transitions and so much more that brings

19:53

our content to life. And

19:55

Canva for Teams enables us to take presentations to

19:57

the next level and make them look

19:59

perfect, professionally designed with their powerful

20:01

templates and their remote control

20:03

features. You can collaborate

20:06

with Canva for Teams. Right now, you

20:09

can get a free

20:11

45-day extended trial

20:14

when you go to canva.me

20:16

slash success. That's

20:19

C-A-N-V-A dot M-E

20:22

slash success for a

20:25

free 45-day extended trial,

20:27

canva.me slash success.

20:31

If you do anything at all with graphic

20:34

design, social media, anything

20:36

like that, seriously

20:38

check this out, 45 days of Canva

20:40

for Teams for free. You got to check

20:42

it out. It's awesome. And

20:45

now back to the show.

20:47

Yes,

20:47

because remember

20:49

that the hedgehog tells you a simple,

20:52

clear story that

20:55

comes to a definite conclusion. It will

20:57

happen or it won't happen, a confident

21:00

conclusion. Whereas the

21:02

fox expert says,

21:05

well, there are some factors pointing in one

21:07

direction, there are other factors pointing in another direction.

21:09

There's a lot of uncertainty here, but I think it's

21:12

more likely than not that it will happen.

21:14

And if you know anything about the psychology

21:16

of uncertainty, we really

21:19

just don't like uncertainty. So

21:21

when you go to an expert and you get that

21:24

fox-like answer that says, well,

21:27

on the balance of probabilities, that's psychologically

21:30

unsatisfying. Whereas the hedgehog

21:33

is giving you what you psychologically crave,

21:36

which is a nice, simple, clear story with

21:38

a strong, clear conclusion. And

21:41

as a result, we find that the

21:43

media goes to exactly

21:45

the type of expert who is most likely to

21:47

be wrong.

21:48

That's a really important and really

21:51

unfortunate finding. And I wish

21:53

it were as famous

21:55

as Phil's finding about the average

21:58

expert being about

21:59

as accurate as the dart throwing chimpanzee

22:02

because it is just so much more important. But

22:04

unfortunately, there it is. So that

22:06

was sort of the culmination of Phil's first

22:09

enormous research program.

22:11

I think it's such an important finding that

22:13

the smartest people, the most accurate forecasters,

22:16

as you call them, the foxes,

22:18

are often kind of the most humble and the least

22:21

you know, confident and certain about what's

22:23

actually going to happen.

22:25

Yep. This is, this is, again,

22:27

this is, if you were asking me about sort

22:29

of the universals of good judgment,

22:31

I think one of the universals is a quality

22:34

that I call intellectual humility.

22:37

And I emphasize intellectual humility because

22:40

it's not just humility. You know, this isn't

22:42

about somebody wringing his or her hands

22:44

and saying, I'm not worthy. I'm no good. You

22:46

know, by intellectual humility, I mean, it's

22:49

almost like a worldview in which

22:51

you say, look, reality

22:54

is immense, complex,

22:57

fundamentally uncertain in many ways.

22:59

For us to understand even

23:01

a little bit of it, let alone to predict what's

23:04

going to come next, is a constant

23:06

struggle. And what's more, we're fallible

23:08

people and people make mistakes.

23:12

So I just know that I'm going

23:14

to have to work really hard and I'm still going to make

23:16

mistakes, but I can

23:18

in fact slowly try to comprehend

23:21

a little bit and try to do a little bit better. That

23:23

attitude is absolutely fundamental

23:26

for a couple of reasons. Number one, it

23:28

says you're going to have to work really hard

23:31

at this, right? Comprehending

23:33

reality, let alone forecasting is not easy.

23:36

Expect to work hard if you want to do it well

23:38

and accurately. Number two

23:41

is it encourages introspection. You

23:43

remember I mentioned earlier, the introspection

23:45

is universal among people of good judgment.

23:48

Well, if you're intellectually humble and

23:50

you know you're going to make mistakes, you're

23:52

going to be constantly thinking about your thinking so that

23:55

you can try and find those errors. Okay.

23:58

So that sort of introspection flows

24:00

naturally out of intellectual humility.

24:03

And the third element that flows out of

24:05

intellectual humility is this. If

24:07

you have this idea that, you know, the universe

24:09

is vast and complex, and we can never be sure,

24:11

then you know, that certainty

24:15

is an illusion, you should not

24:17

be chasing certainty because human

24:19

beings just can't manage that. So

24:22

what does that mean? That means don't think

24:24

of making a forecast in terms of it will

24:27

happen or it won't happen. Don't

24:29

think in terms of it's 100% or 0%.

24:31

Think in terms of 1 to 99%. It's all a question

24:33

of degrees

24:38

of maybe, right? And the finer grains

24:41

you can distinguish between degrees of maybe the

24:43

better. So what I've just described is

24:45

something called probabilistic thinking. And

24:47

it too is very, very fundamental

24:50

to people with good judgment. And unfortunately,

24:53

it's very unnatural. It's

24:56

not how people normally think.

24:58

In fact, how people normally think is

25:01

we sometimes call it a three-setting mental

25:03

dial. You know, you ask yourself,

25:05

is this thing going to happen? And you say, it

25:08

will happen, or it won't happen. Or

25:10

if you really force me to acknowledge uncertainty,

25:12

because I really don't like uncertainty, I will

25:14

say maybe. That's the third setting

25:17

of my mental dial. So there's only those three

25:19

crude settings. Whereas probabilistic

25:22

thinking says, no, no, throw out those two

25:24

settings, you know, it will happen or it won't

25:26

happen. It's all degrees of maybe.

25:29

So as I say that this is not natural,

25:31

this is not how people ordinarily

25:33

think, but people can

25:36

learn to do it. And they can learn

25:38

to make it a habit. Scientists

25:40

think as probabilistic thinkers, good

25:43

scientists do anyway. And the

25:45

super forecasters that we discovered

25:47

in Phil's second research program,

25:50

people with demonstrably excellent forecasting

25:52

skill, they are real

25:55

probabilistic thinkers, and it is a habit

25:57

with them. I mean, I spoke with one super forecaster.

26:00

And, you know, just in a casual conversation,

26:02

I said, you know, do you read much?

26:05

And he said, oh, yeah, I read lots. I said, well, do you read

26:07

fiction or nonfiction? He said, I read both. I

26:09

said, well, what proportion of the two

26:11

would you say that you read? And

26:14

he said, oh, it's about 70, 30. And then he caught

26:16

himself and thought carefully. And

26:18

he said, no, it's closer to 65, 35. Right.

26:22

And this was just a casual conversation. People

26:24

just don't think with that degree

26:27

of fine-grained maybeness.

26:31

But people who learn to think in probabilistic

26:33

terms, they can make it habitual and they can

26:36

think that carefully. And by

26:39

the way, the data is very clear that that is, in fact,

26:41

one of the reasons why these superforecasters

26:43

are super.
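For readers who want to see what fine-grained "degrees of maybe" look like when they are actually scored, here is a minimal sketch in Python. The conversation doesn't walk through the scoring math; the standard metric in forecasting research is the Brier score, and the two example forecasters and their numbers below are hypothetical, chosen only to illustrate the idea.

```python
# A minimal, illustrative sketch (not from the conversation) of scoring
# probabilistic forecasts with the Brier score: the mean squared difference
# between the stated probability and what actually happened (1 if the event
# occurred, 0 if it did not). Lower is better.

def brier_score(forecasts, outcomes):
    """forecasts: probabilities in [0, 1]; outcomes: 1 if the event occurred, else 0."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical example: two forecasters judge the same five events.
outcomes = [1, 0, 1, 1, 0]
dial    = [0.5, 0.5, 0.5, 0.5, 0.5]    # the "three-setting dial" stuck on "maybe"
grained = [0.8, 0.1, 0.65, 0.9, 0.3]   # fine-grained "degrees of maybe"

print(brier_score(dial, outcomes))     # 0.25
print(brier_score(grained, outcomes))  # 0.0545 (better calibrated, so a lower score)
```

The point of the sketch is simply that always answering "maybe" gets penalized relative to well-calibrated probabilities, which is why distinguishing degrees of maybe pays off when forecasts are judged over many questions.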

26:44

Before we dig into that, because I do want to talk about

26:47

how we can kind of train ourselves to

26:49

think more probabilistically and how we can

26:52

learn from some of these superforecasters. Touching

26:55

back on the idea of why people dislike uncertainty

26:57

so much, can you share kind of the anecdote

26:59

about cancer diagnosis?

27:01

Oh, sure. You know, look, when

27:03

I say that people dislike

27:05

uncertainty, you know, people get it. Okay.

27:08

I dislike uncertainty. I would prefer to have hard

27:10

facts. It is or it isn't. Okay.

27:13

I don't think they quite appreciate just how

27:16

profoundly aversive uncertainty

27:19

really is, how psychologically aversive

27:21

it really is. And let me illustrate in

27:23

fact with two illustrations. One

27:26

is a scientific study that was conducted in

27:28

Holland where they asked volunteers to

27:31

experience electric shocks. And

27:33

some of the volunteers were

27:35

told, you are about to receive 20

27:38

strong electric shocks in a sequence.

27:41

And then they were wired up to be monitored

27:43

for the physiological evidence

27:46

of fear, which is elevated heart rate, elevated

27:48

respiration rate, perspiration, of course.

27:50

And then other volunteers

27:52

were told, you will receive 17

27:55

mild electric shocks interspersed

27:58

randomly with three strong electric shocks.

28:00

And they too were monitored

28:02

for the evidence of fear. Now,

28:04

objectively, the first group obviously

28:07

received much more pain, much more painful

28:09

shocks,

28:10

but guess who experienced more fear? It

28:12

was the second group. And why? Because

28:15

they never could know whether

28:17

the next shock would be strong

28:20

or mild. And that uncertainty

28:22

caused much more fear than the pain

28:25

itself. So that sort

28:27

of aversion to uncertainty is very

28:29

powerful stuff and you will

28:32

see it in doctor's offices. In

28:34

fact, any doctor will

28:36

tell you a version of the story I'm about to say.

28:39

The patient comes in, the doctor

28:42

has reason to suspect that the patient has

28:44

cancer, tells the patient this, says,

28:46

but we can't be sure, we have to do more

28:48

tests and then we'll see. So

28:51

they do the tests and then the patient waits

28:53

and any person who's ever been through that will

28:55

tell you that the waiting is hell.

28:58

And then one day you go back to the

29:00

doctor's office, you sit down and sometimes

29:02

unfortunately the doctor has to say, I'm

29:04

afraid to tell you that the tests

29:06

confirm that you have cancer. And

29:09

almost universally what patients

29:11

report feeling at that

29:14

moment is relief. They

29:16

feel better and they almost always

29:18

say the same thing. At least

29:20

I know, at least I know. So that's

29:22

how powerful uncertainty is

29:25

that the possibility of a

29:27

bad thing happening can be a greater

29:29

psychological burden on us

29:32

than is the certainty that the

29:34

bad thing is happening. And

29:36

so if that's the case,

29:39

if uncertainty is so

29:41

horrible to us and we just want to

29:43

get rid of it, it's really no surprise

29:46

then that we will turn to sources

29:49

that promise to get rid of uncertainty

29:52

even when it's not rational to do

29:54

so.

29:55

So now let's dig into kind

29:57

of the idea of super forecasting.

30:00

And let's start with what is a super

30:02

forecaster?

30:03

Yeah, it's a bit of a grandiose term, I have to

30:05

admit, but it actually has humble

30:07

origins. A number of years ago,

30:10

the Office of the Director of National Intelligence

30:12

in the United States, that's the office

30:14

that oversees all 16 intelligence

30:18

agencies, including the CIA in the United

30:20

States. A number of officials in

30:22

that office decided that they

30:25

had to get more serious about

30:28

analyzing the forecasting

30:30

that the intelligence community does, because

30:32

I don't know if you're aware, but the intelligence community

30:35

actually spends a lot of its time, not just spying,

30:37

but also analyzing information

30:40

to try and figure out what's going to happen

30:42

next. So, you know,

30:44

if Russia is saber-rattling, they're going to

30:46

make a forecast. Will Russia try to seize the

30:49

Crimea? You know, they'll try to make forecasts

30:51

about all sorts of geopolitical events, including

30:53

economic events, like, you know, what's going

30:56

to happen to the Chinese economy in the fourth quarter, that

30:58

sort of thing. And so the officials

31:00

within the ODNI decided they had to

31:02

get better at this, and one of the ways that they decided they

31:04

would get better at this is to sponsor what

31:06

became called a forecasting

31:08

tournament. And what that meant was very

31:11

simply, it sounds like a game, but it's not a game, it's an

31:13

enormous research program. And what

31:15

they did was they went to leading researchers

31:18

in forecasting, and they said, you set

31:20

up a team to make forecasts,

31:22

and we'll ask questions, and there'll be the real

31:25

world questions that we have

31:27

to answer all the time. And we'll ask

31:29

them in real time. So as they arise,

31:31

you know, if an insurrection breaks out in

31:33

Syria, we'll ask something about how that

31:36

will proceed. And so you have to forecast

31:38

it, and then we'll let time pass, and then we

31:40

will judge whether your forecasts are accurate

31:43

or not. And we'll do this for lots

31:45

and lots and lots of questions. And

31:47

you guys, you researchers, you

31:49

can use any methods you want. And

31:52

then at the end of this process,

31:54

we will be able to analyze the accuracy

31:56

of all these forecasts. We

31:58

will see which methods work, which methods

32:01

don't, and then try to learn how

32:03

we can improve what we're doing. Very

32:05

sensible stuff you would say. So they,

32:07

as I said, they went out to leading researchers.

32:10

Ultimately, they ended up with five university-based

32:13

research teams in this forecasting

32:15

tournament. One of the research

32:17

teams was led by my co-author, Philip

32:19

Tetlock, and that team was called

32:22

the Good Judgment Project. To give you an

32:24

idea of the scale of this undertaking,

32:26

the Good Judgment Project, which

32:28

as I say, was only one of five teams, it

32:31

involved volunteers. They went

32:33

out and they recruited and, you know,

32:35

through blogs and whatnot and said, you know, basically,

32:38

do you want to spend a little free time

32:40

making geopolitical forecasts? Then

32:42

sign up here. And so they

32:45

got huge numbers of volunteers.

32:47

At any one time, there were 2,800 to 3,000 people involved with

32:51

the Good Judgment Project. Over the course

32:53

of the four-year tournament, there were more

32:56

than 20,000 people involved. So

32:58

it gives you an idea of the scale of this. And

33:00

the bottom line result,

33:03

I mean, there were many, many results that came out of this

33:05

because as you can imagine, the data are voluminous.

33:07

But the bottom line result was, one,

33:10

the Good Judgment Project won, hands

33:13

down. Number two, the

33:15

Good Judgment Project discovered that

33:18

there was a small percentage, between 1

33:20

and 2 percent, of the forecasters,

33:23

the volunteer forecasters, who were

33:25

truly excellent forecasters.

33:27

They were consistently good. And I

33:30

say consistently good because that's very important

33:32

to bear in mind. Anybody can get lucky

33:34

once or twice or three times, but if you're

33:36

consistently good, you can be pretty sure

33:38

that you're looking at skill, not luck. And

33:41

to give you an idea of how good they were, well, at

33:43

the start of the tournament, the

33:46

ODNI set performance

33:48

benchmarks, which all the researchers thought were way

33:50

too ambitious. Nobody can beat these. Well, the super

33:52

forecasters went past the performance

33:54

benchmarks. They beat prediction

33:56

markets, which economists would say shouldn't

33:59

be possible. They even beat intelligence

34:02

analysts who had access to classified

34:04

information, which is particularly amazing Because

34:07

remember these are ordinary folks. So

34:10

these super forecasters when they went to

34:12

make their forecasts Basically,

34:14

they had to use just whatever information

34:17

they could dig up with Google And

34:19

yet they were able to beat even people who had

34:21

access to all that juicy classified information

34:24

So this is really impressive stuff and then

34:26

the question is well, why are they so good? And

34:29

so we can quickly dispatch a number of things

34:31

that you might think would explain this Number

34:34

one, you might think that they're using some kind

34:36

of arcane math, right? They're using

34:38

big data Algorithms some

34:40

craziness that you know ordinary folks

34:43

can't understand. No, they

34:45

didn't. In fact, they

34:48

were very numerate people,

34:50

by the way, I should emphasize

34:52

that point, they are well above average in

34:54

numeracy, but to the extent that they use

34:57

math in making their judgments, it

34:59

was like high school math. It was nothing particularly

35:01

dramatic Another thing that you might

35:03

say would make the difference. Well, maybe they're just

35:06

geniuses, right? They're just so

35:08

off the charts intelligent that you know,

35:10

they're just super and no that's not the case either

35:13

They were tested on IQ. They were given IQ

35:15

tests and again, they

35:17

scored well above average. These are not just, you

35:20

know, randomly selected folks off the street. But

35:22

they're not sort of, you know, Mensa-level geniuses.

35:25

They're not so incredibly intelligent that

35:28

You know ordinary folks can't relate to them

35:31

And so it's very clear the conclusion that

35:33

you can draw from this is basically it's less

35:35

what they have than how they use it,

35:38

and the third element that you might think is specialist

35:41

knowledge, right? You might think, well, okay,

35:43

these are experts in some field,

35:45

in the fields that they're trying to forecast. And

35:48

no, I can tell you categorically they were not experts

35:50

in the field. They're very informed people,

35:52

right? These are people who agreed to

35:55

make geopolitical forecasts in their spare time.

35:57

It's no surprise that they're you know, they're

35:59

smart

35:59

They follow the news, they follow international

36:02

news, they're

36:03

interested in this stuff, they're very informed, but

36:05

they're not specialists.

36:07

And we know this for the very simple

36:09

reason that they were asked about all sorts of

36:11

different questions in all sorts of different fields and nobody is

36:13

an expert in every field. So they're

36:15

not any of those things. So then the

36:18

question is, well, what elevates them? What makes

36:20

them different? And

36:22

I wish there were like one or two simple

36:24

answers, you know, a couple of clear, crisp bullet

36:27

points that answers everything. But

36:29

that's not the case, as is so often the case,

36:31

the reality is complex. There's

36:34

quite a list of things that make them different. Number

36:37

one, they're intellectually curious. I think that's

36:39

very, very important. It's no surprise. These are people

36:41

who like to learn and

36:42

are constantly picking up bits and pieces of information.

36:45

And no surprise when you spend a lot of time picking up

36:47

bits and pieces of information, eventually you will have

36:49

quite a number of dots in your intellectual

36:52

arsenal for you to connect.

36:54

Two, these are people who score

36:57

very high in what psychologists call

36:59

need for cognition, which simply

37:01

means that they like to think. They really enjoy

37:03

thinking. They're the kinds of people who do

37:06

puzzles for fun. And the harder the puzzle

37:08

is, the more fun it is, which is

37:10

very important because when you

37:12

look at how they actually make their forecasts, it's

37:15

a lot of hard mental effort.

37:18

And so enjoying that hard mental effort sure

37:20

helps. Three, they're actively

37:22

open minded. That's another term from

37:24

psychology. Open minded is pretty

37:27

obvious. That means, you know, OK, I've

37:29

got my perspective, but I want to hear your

37:31

perspective. I want to hear somebody else's perspective.

37:34

I want to hear different ways of thinking about this

37:36

problem. And then they're going to gather

37:38

all these different perspectives together and try

37:40

to synthesize them into their own view.

37:42

Now, that's the open mind department. Of course,

37:45

there's an old saying about open mindedness. Don't be so

37:47

open minded that your brain falls out. Well,

37:50

these folks, that's where the active

37:53

part of active open-mindedness comes in. And

37:55

these folks were very active in their open mindedness,

37:58

meaning that as they're listening to

38:00

all these other perspectives and gathering

38:02

these other perspectives, they're thinking critically

38:04

about them. They're saying, does that really make

38:06

sense? Is that actually supported by evidence?

38:09

Is that logical? So they're doing

38:11

that constantly when they draw these

38:13

perspectives together and synthesize them into

38:15

their own view, which again, I would emphasize

38:17

that that sounds like a heck of a lot of work.

38:20

It is, it is. Unfortunately, as

38:22

I said, they like hard thinking. And

38:25

fundamentally also, they're

38:27

intellectually humble. I mentioned intellectual humility

38:30

earlier, that is absolutely true here.

38:33

And all the things that flow from that are true.

38:35

You know, they are hard

38:37

mental workers. They are deeply

38:39

introspective people, constantly looking at

38:42

their thinking trying to find the mistakes, trying

38:44

to correct it and improve it. And they're probabilistic

38:47

thinkers that also flows from intellectual humility.

38:49

And so another one, another

38:52

element I would also add is simply this.

38:55

If you ask, you know, well, how do they actually approach

38:57

a problem? How do they actually make

38:59

a judgment? One of the critical

39:02

differences between a superforecaster and most

39:04

ordinary folks is

39:06

rather than simply vaguely mulling

39:08

over information, you know, stroking

39:11

your chin until somehow an answer

39:13

emerges and you don't know how.

39:16

That is a terrible way to make a forecast,

39:18

by the way. What they do is

39:20

that they methodically unpack

39:22

the question. So they take a big

39:25

question and they unpack it and

39:27

make a whole series of smaller questions. And then

39:29

they unpack those and they make a series of smaller questions

39:32

and they methodically examine them.

39:34

Each one,

39:34

step by step, by step,

39:37

by step. Again, this

39:39

is a very laborious method.

39:42

A lot of hard mental work goes

39:44

into it,

39:45

but it's demonstrably effective.

39:48

There's a famous physicist named

39:51

Enrico Fermi,

39:52

one of the fathers of the atomic bomb, who became

39:54

famous for his ability to estimate things

39:57

accurately. And he actually taught

39:59

this method.

39:59

Fermi estimates basically

40:02

involve unpacking questions so

40:04

that you methodically tackle them one after

40:06

the other after another. People

40:08

who work in physics or engineering

40:11

will be familiar with this. Fermi estimates

40:13

are actually taught in those departments.

40:16

In fact,

40:17

to engineers, this is almost like

40:19

second nature, this idea of unpacking the problem

40:21

and methodically tackling it that way.

40:23

It's

40:26

probably not a coincidence that a

40:28

disproportionate number of the super forecasters

40:31

have engineering backgrounds. So

40:34

software engineers, computer

40:36

programmers, whatever people

40:39

with engineering backgrounds sort of get this
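To make the "unpack the question" idea concrete for readers, here is an illustrative Fermi-style decomposition in Python. The piano-tuner question and every number in it are rough assumptions chosen for the example, not figures from the conversation; the point is the method of breaking one big unknown into smaller, easier guesses.

```python
# An illustrative Fermi-style unpacking (all inputs are rough assumptions):
# estimate how many piano tuners work in Chicago by decomposing one big
# unknown into smaller quantities that are easier to guess.

population = 2_700_000                 # people in Chicago, roughly
people_per_household = 2.5
share_of_households_with_piano = 0.05  # guess: 1 in 20 households
tunings_per_piano_per_year = 1
tunings_per_tuner_per_day = 4
working_days_per_year = 250

pianos = population / people_per_household * share_of_households_with_piano
tunings_needed_per_year = pianos * tunings_per_piano_per_year
tunings_one_tuner_can_do = tunings_per_tuner_per_day * working_days_per_year

print(round(tunings_needed_per_year / tunings_one_tuner_can_do))  # roughly 50 tuners
```

Each intermediate guess can be wrong, but the errors tend to partially cancel, and every step can be checked and improved separately, which is the discipline the superforecasters apply to geopolitical questions.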

40:41

That was fascinating I think one of the most

40:44

important things you said is that it's not easy and it

40:46

takes a lot of hard work to make

40:49

effective decisions or in this particular context

40:52

effective forecasts. One

40:54

of the things that I always say is that there's no

40:56

kind of get-rich-quick strategy to becoming

40:58

a better thinker. It takes a lot

41:00

of time, energy, reading

41:03

and introspection to really build kind

41:05

of a robust thought process to improve

41:08

your own ability to think and make better decisions

41:11

That's absolutely correct and

41:13

it also touches on a further factor

41:15

which I didn't mention, which is

41:18

certainly one of the most important, which is

41:21

that these are people who have what psychologists

41:23

call the growth mindset, which

41:25

is that they believe that

41:27

if they think hard and they work

41:30

hard and they practice their

41:32

forecasting skill and they

41:34

look at the results of their Forecasts and

41:36

they think about how they got them right or how

41:38

they got them wrong and then they try again. That

41:41

they will improve their forecasting skill

41:43

just as you would improve any skill that

41:45

you practice carefully with good

41:47

feedback over time.

41:49

You might say but isn't that perfectly

41:51

obvious? Everybody understands

41:54

that in order for you to improve

41:56

a skill you have to practice it and

41:58

the more you practice the better

41:59

you get. And unfortunately, that's

42:02

just not true. There's a psychologist

42:04

named Carol Dweck, who has done an

42:06

enormous amount of research in this field, and she talked about

42:08

two mindsets. One is the growth

42:10

mindset that I just described. But

42:12

the other mindset is the fixed mindset,

42:15

which is basically the idea that we're

42:18

all born with abilities and talents

42:20

and skills. And that's

42:22

all we've got. So if

42:24

I try something and I fail, I'm

42:27

not going to try it again, because

42:30

I have demonstrated the limits of my

42:32

abilities, and it would be foolish of me to waste

42:34

time trying to improve those abilities.

42:37

And so that's why it's very, very critical.

42:39

And we see this clearly in the super forecasters,

42:41

they have a very strong growth mindset. And

42:44

more importantly, they put it into

42:46

action. So they were making

42:48

their forecasts, they were doing

42:50

postmortems trying to figure out what went right,

42:52

what went wrong and why they were trying

42:55

to improve on the next round trying

42:57

to improve on the next round. And they did,

42:59

there was demonstrable improvement. And

43:01

so it's very clear that

43:04

underlying all of this is

43:06

you have to have some belief in

43:09

the ability to grow, or you

43:11

won't engage in the hard work

43:13

that's necessary to grow.

43:15

And longtime listeners to the show will know that

43:18

on here we're huge fans of Carol Dweck

43:21

and the book Mindset. And we actually

43:23

have a whole episode on kind of the difference

43:25

between the growth mindset and the fixed mindset

43:28

and breaking out all those things. So I'll

43:30

include links to both of those things in the show notes

43:32

for people to kind of be able to dig down

43:34

and really understand those concepts, for those who

43:37

may not have heard the previous episodes we have about

43:39

that kind of stuff. But yeah, I totally agree.

43:41

I'm a huge fan of the growth mindset. And

43:43

I think it's critically important.

43:46

Yeah, and there's no question that in

43:48

Phil Tetlock's research, super forecasting

43:50

research, the data very clearly demonstrate

43:53

that.

43:54

So for somebody who's listening, what

43:56

are some sort of small concrete steps

43:58

they could take right now to kind of implement

44:00

some of the best practices of super forecasters

44:03

to improve their own thinking?

44:04

Well, the first thing I would say is

44:07

adopt as an axiom because of

44:09

course as humans we all have to have axioms

44:12

in our thinking. Adopt as an axiom

44:15

that nothing is certain. And

44:16

you can say that in the

44:18

abstract, but it's a lot harder

44:21

to apply in our lives because

44:23

if you stop and you think about

44:25

your own thinking you'll begin to realize that you

44:27

use the language of certainty constantly

44:30

which is normally fine. You know

44:32

I'm sure in this conversation I've

44:34

used certainly and

44:36

that sort of thing. But remember

44:39

at a minimum that any time that you

44:41

say certain or refer

44:43

to certainty, there's an asterisk:

44:46

almost, right? The asterisk means almost,

44:49

because in fact in reality

44:51

literally nothing is certain, not

44:54

even death and taxes.

44:55

And once you start to think in those terms

44:57

and you make that an axiom you can start to make

44:59

it a habit to say okay it's

45:02

not certain, how likely is

45:04

it. Think in terms of probability

45:06

and you know it's often said that the

45:09

ability to distinguish between

45:11

you know a 48 percent

45:14

probability and a 52 percent probability

45:16

or even a 45 and a 55 percent

45:18

probability

45:19

it sounds like a modest thing but

45:22

if you can do that consistently that's

45:25

the difference between going bankrupt and making a fortune

45:27

in certain environments such as Las Vegas

45:30

or Wall Street.

45:31

And so learning to think, to

45:33

make it habitual to think in terms of probability

45:36

is I think step number one.
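As a rough illustration of Dan's point about consistently telling 52 percent from 48 percent, here is a small Python sketch. The betting setup, the 2 percent stake, and the number of bets are all assumptions chosen for the example, not anything from the conversation; it simply shows how a small, consistently applied edge compounds in one direction and a small disadvantage compounds in the other.

```python
import math

# Illustrative sketch: repeated even-money bets, staking 2% of the bankroll
# each time. The expected log-growth per bet compounds over many bets.

def expected_bankroll(win_prob, n_bets=1000, stake=0.02, start=1.0):
    log_growth = win_prob * math.log(1 + stake) + (1 - win_prob) * math.log(1 - stake)
    return start * math.exp(n_bets * log_growth)

print(round(expected_bankroll(0.52), 2))  # ~1.82: a small edge compounds into a gain
print(round(expected_bankroll(0.48), 2))  # ~0.37: the same-sized disadvantage grinds you down
```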

45:38

And for listeners who want to find you

45:41

or the book what's the best place for people to find

45:43

you online?

45:44

Oh, probably dangardner.ca,

45:46

that's dan gardner, g-a-r-d-n-e-r, dot ca

45:50

for Canada.

45:51

And for listeners who might have missed it

45:54

earlier the book that we've primarily been talking about is

45:56

Superforecasting. Highly recommend

45:58

it as you can tell from this interview.

45:59

Dan is incredibly sharp about

46:02

all of these different topics. Dan, for somebody

46:04

who's listening, obviously, they should check

46:06

out Superforecasting. What are some other resources

46:08

you'd recommend if they want to learn more

46:10

about how to make better decisions and how

46:12

to make better forecasts?

46:14

Oh, that's an easy question. The very first book,

46:16

in fact, I would recommend it before my own books,

46:18

which is something

46:20

authors aren't supposed to do, but here goes.

46:23

The very first book folks should read is

46:26

Daniel Kahneman's book, Thinking Fast

46:28

and Slow. Kahneman is, of course,

46:30

a Nobel Prize winning psychologist who is

46:33

one of the seminal figures of our time.

46:36

Fortunately, he finally got around to,

46:38

long after I read all of his papers and

46:40

learned the hard way, he finally got around to

46:42

writing a popular book.

46:45

Thinking Fast and Slow is absolutely

46:48

essential reading. Anybody who makes

46:51

decisions, whether

46:53

it's in business or in government or

46:55

in the military or anywhere else, anybody who

46:57

makes decisions that matter should read Thinking

46:59

Fast and Slow.

47:00

I totally agree. It's one of my favorite books and I think

47:03

one of the deepest, most information-rich

47:05

books about psychology that's

47:08

on the market today. Absolutely.

47:10

Well, Dan, this has been a great conversation

47:12

and filled with a lot of fascinating

47:15

insights. So thank you very much for being on

47:17

the show.

47:18

Thank you. It was a lot of

47:19

fun. Thank you so much for listening to The Science

47:22

of Success. Listeners like you are why

47:24

we do this podcast. The emails

47:26

and stories we receive from listeners around

47:28

the globe bring us joy and fuel our

47:30

mission to unleash human potential.

47:33

I love hearing from listeners. If you want to reach out,

47:35

share your story, or just say hi, shoot

47:37

me an email. My email is matt@scienceofsuccess.co.

47:41

That's M-A-T-T at scienceofsuccess.co.

47:46

I would love to hear from you and I read and respond

47:48

to every single listener email. The

47:50

greatest compliment you can give us is a referral to

47:52

a friend either live or online. If

47:55

you enjoyed this episode, please leave

47:57

us an awesome review and subscribe on iTunes.

47:59

That helps me out a lot and helps more and more people discover

48:01

the science of success. I get a ton

48:04

of listeners asking me, Matt, how do you organize

48:06

and remember all this information? Because of

48:08

that, we created an amazing free guide

48:11

for all of our listeners. You can get it by texting

48:13

the word smarter, that's S-M-A-R-T-E-R,

48:16

to the number 44222, or

48:18

by going to scienceofsuccess.co, that's

48:21

scienceofsuccess.co, and joining

48:23

our email list. If you wanna get all

48:25

this incredible information, links, transcripts,

48:28

everything we talked about, and much more, make

48:31

sure to check out our show notes, which are also

48:33

on the website at scienceofsuccess.co.

48:36

Hit the show notes button at the top, you can get

48:38

the show notes from this interview and every single interview

48:40

we've done. We've got transcripts, we've got everything

48:42

you need in there. Be sure to check that out for all

48:44

the resources that we discussed in this episode.

48:47

Thanks again, and we'll see you on the next

48:49

episode of the Science of Success.

48:52

♪ Look at me, look at me, look at me,

48:54

look at me, look at me ♪
