The Data Detective

Bonus · Released Tuesday, 2nd February 2021

Episode Transcript


0:15

Pushkin. Hello,

0:18

Cautionary Tales listeners, Tim Harford

0:21

here, I have some good news followed

0:23

by a treat. The

0:26

good news is that, after long months

0:28

in the making, the new mega

0:30

season of Cautionary Tales is about

0:33

to appear right here on this feed.

0:36

Fourteen episodes of fiasco

0:38

and catastrophe, of nerdy insights

0:41

and heroic failures, and occasionally,

0:43

not too often, a happy ending.

0:46

There are murderers, idiots and heroes.

0:49

There are fraudsters and fighters and whistleblowers.

0:52

There are gamblers and gamers and geeks

0:54

galore, all played by a stellar

0:57

cast of actors, so stellar

0:59

in fact, that I'm still pinching myself

1:01

and I'm looking forward to revealing their names very

1:03

soon. I loved writing

1:05

this series, and I really hope that you're going

1:08

to love listening to it, starting weekly on

1:10

the twenty sixth of February, and

1:15

now the treat. Loyal listeners

1:17

may know that my new book, The Data

1:20

Detective has just been released in the US

1:22

and Canada. My publishers, Riverhead

1:24

Books, have kindly agreed to let me share

1:26

with you the final chapter of

1:28

the audiobook, in which I reveal

1:31

the golden rule of thinking about numbers

1:33

in the news. I've been so

1:35

pleased with The Data Detective. The international

1:38

edition was called How to Make the World Add Up

1:40

and was a number one business bestseller

1:43

in the UK. The Data

1:45

Detective is a book about how to think clearly

1:47

about the world by being wiser about

1:49

statistics and wiser about ourselves

1:52

and our cognitive biases. In

1:54

it, I offer ten simple rules

1:56

to help you be calmer and smarter

1:59

as you scroll through social media or

2:01

scan the headlines, and plenty

2:03

of stories too. The book is

2:05

available wherever books are sold, and

2:08

so is the audiobook, read by yours

2:10

truly. I hope you like the audiobook

2:12

extract you're about to hear, and if you do, look

2:15

out for The Data Detective book, ebook

2:17

and audiobook, and please spread

2:20

the word. The

2:26

Golden Rule: Be

2:29

Curious. I

2:32

can think of nothing an audience won't

2:34

understand. The only problem

2:36

is to interest them. Once they're interested,

2:39

they understand anything in

2:41

the world. Orson

2:43

Welles. I've

2:46

laid down ten statistical

2:48

commandments in this book. First,

2:51

we should learn to stop and notice

2:53

our emotional reaction to a claim, rather

2:56

than accepting or rejecting it because

2:58

of how it makes us feel. Second,

3:01

we should look for ways to combine the

3:03

bird's eye statistical perspective

3:05

with the worm's eye view from personal

3:08

experience. Third,

3:10

we should look at the labels on the data we're

3:12

being given and ask if we understand

3:15

what's really being described. Fourth,

3:18

we should look for comparisons and context,

3:21

putting any claim into perspective. Fifth,

3:25

we should look behind the statistics,

3:27

at where they came from and what other

3:29

data might have vanished into obscurity.

3:32

Sixth, we should ask who is

3:34

missing from the data we're being shown, and

3:37

whether our conclusions might differ if they

3:39

were included. Seventh,

3:42

we should ask tough questions about algorithms

3:44

and the big data sets that drive them,

3:47

recognizing that without intelligent

3:49

openness, they cannot be trusted.

3:52

Eighth, we should pay more attention

3:54

to the bedrock of official statistics

3:57

and the sometimes heroic statisticians

4:00

who protect it. Ninth,

4:02

we should look under the surface of any

4:05

beautiful graph or chart, and

4:07

tenth, we should keep an open

4:09

mind, asking how we might be mistaken

4:12

and whether the facts have changed. I

4:16

realize that having ten commandments

4:19

is something of a cliche, and

4:21

in truth, they're not commandments so much

4:23

as rules of thumb or habits

4:25

of mind that I've acquired the hard way as

4:27

I've gone along. You might

4:30

find them worth a try yourself when you come

4:32

across a statistical claim of particular

4:34

interest to you. Of course, I don't

4:36

expect you to run personally through the

4:38

checklist with every claim you see in

4:40

the media. Who has the time for that?

4:44

But they can be useful in forming a preliminary

4:46

assessment of a news source. Is

4:49

the journalist making an effort to

4:51

define terms, provide context,

4:54

assess sources? The

4:56

less these habits of mind are

4:58

in evidence, the louder

5:00

the alarm bell should ring. Ten

5:04

rules of thumb is still a lot for

5:06

anyone to remember, so perhaps

5:09

I should try to make things simpler. I

5:11

realize that these suggestions have

5:13

a common thread, a

5:16

golden rule. If you like, be

5:19

curious, look deeper,

5:22

and ask questions. It

5:24

is a lot to ask, but I hope that it's

5:27

not too much. At

5:29

the start of this book, I begged you not

5:31

to abandon the idea that we can understand

5:33

the world by looking at it with the help

5:36

of statistics. I believe we can and

5:38

should be able to trust that numbers

5:40

can give us answers to important questions.

5:44

But of course, nullius in verba:

5:46

we shouldn't trust without also

5:49

asking questions. The philosopher

5:51

Onora O'Neill once declared that well

5:54

placed trust grows out of active

5:57

inquiry rather than

5:59

blind acceptance. That

6:01

seems right. If we want to be able

6:03

to trust the world around us, we need to show

6:06

an interest and ask a few basic questions.

6:09

And despite all the confusions

6:11

of the modern world, it has never

6:13

been easier to find answers

6:15

to those questions. Curiosity,

6:19

it turns out, can be a

6:21

remarkably powerful thing. About

6:26

a decade ago, a Yale University

6:29

researcher Dan Kahan showed

6:31

students some footage of a protest

6:33

outside an unidentified building.

6:36

Some of the students were told that it was a pro

6:39

life demonstration outside an

6:41

abortion clinic. Others

6:43

were informed that it was a gay rights demonstration

6:46

outside an army recruitment office.

6:49

The students were asked some factual

6:51

questions. Was it a peaceful protest?

6:54

Did the protesters try to intimidate people

6:56

passing by? Did they scream or

6:58

shout? Did they block the entrance

7:00

to the building? The

7:03

answers people gave depended

7:05

on the political identities they embraced.

7:08

Conservative students who believed they were looking

7:10

at a demonstration against abortion, saw

7:13

no problems with a protest, no abuse,

7:15

no violence, no obstruction. Students

7:19

on the left who thought they were looking at

7:21

a gay rights protest reached the

7:23

same conclusion the protesters had

7:25

conducted themselves with dignity and

7:27

restraint. But right

7:30

wing students who thought they were looking at a gay

7:32

rights demonstration reached a

7:34

very different conclusion, as did

7:36

left wing students who believed they were watching

7:38

an anti abortion protest. Both

7:41

these groups concluded that the protesters

7:43

had been aggressive, intimidating,

7:46

and obstructive. Kahan

7:49

was studying a problem we met in the first

7:51

chapter. The way our political

7:54

and cultural identities, our desire

7:56

to belong to a community of like minded,

7:58

right thinking people can,

8:00

on certain hot button issues, lead us

8:03

to reach the conclusions we wish to

8:05

reach. Depressingly, not

8:08

only do we reach politically comfortable

8:10

conclusions when parsing complex

8:13

statistical claims on issues such

8:15

as climate change, we reach politically

8:17

comfortable conclusions regardless

8:19

of the evidence of our own eyes. As

8:23

we saw earlier, expertise is

8:26

no guarantee against this kind

8:28

of motivated reasoning. Republicans

8:30

and Democrats with high levels of scientific

8:33

literacy are further apart

8:35

on climate change than those with little

8:37

scientific education. The

8:39

same disheartening pattern holds

8:41

from nuclear power to gun control

8:44

to fracking. The more scientifically

8:46

literate opponents are, the more

8:49

they disagree. The same

8:51

is true for numeracy. The greater

8:54

the proficiency, the more acute

8:56

the polarization, notes Kahan.

9:00

After a long and fruitless search

9:02

for an antidote to tribalism, Kahan

9:05

could be forgiven for becoming jaded. Yet

9:08

a few years ago, to his surprise,

9:11

Kahan and his colleagues stumbled upon

9:13

a trait that some people have and

9:16

that other people can be encouraged to

9:18

develop, which inoculates

9:20

us against this toxic polarization

9:24

on the most politically polluted

9:26

tribal questions, where intelligence

9:29

and education fail, this

9:31

trait does not. And

9:34

if you're desperately, burningly

9:36

curious to know what it is, congratulations,

9:40

you may be inoculated already. Curiosity

9:45

breaks the relentless pattern. Specifically,

9:48

Kahan identified scientific

9:50

curiosity that's different

9:53

from scientific literacy. The two

9:55

qualities are correlated, of course, but

9:57

there are curious people who know rather

9:59

little about science yet, and

10:02

highly trained people with little appetite

10:04

to learn more. More

10:07

scientifically curious Republicans

10:09

aren't further apart from Democrats

10:11

on these polarized issues. If

10:13

anything, they're slightly closer

10:15

together. It's important not

10:18

to exaggerate the effect. Curious

10:20

Republicans and Democrats still disagree

10:23

on issues such as climate change, but

10:25

the more curious they are, the

10:27

more they converge on what we might call

10:30

an evidence based view of the

10:32

issues in question. Or

10:34

to put it another way, the more

10:36

curious we are, the less

10:38

our tribalism seems to matter. There

10:42

is little correlation between scientific

10:44

curiosity and political affiliation.

10:47

Happily, there are plenty of curious

10:49

people across the political spectrum.

10:52

Although the discovery surprised Kahan,

10:55

it makes sense. As we've

10:57

seen, one of our most stubborn defenses

11:00

against changing our minds is that we're good

11:02

at filtering out or dismissing unwelcome

11:04

information. A curious

11:07

person, however, enjoys being surprised

11:09

and hungers for the unexpected. He

11:12

or she will not be filtering out surprising

11:15

news because it's far too intriguing.

11:19

The scientifically curious people Kahan's

11:21

team studied were originally identified

11:24

with simple questions buried in a

11:26

marketing survey so that people weren't

11:28

conscious that their curiosity was being

11:30

measured. One question, for

11:32

example, was how often do you read

11:35

science books? Scientifically

11:37

curious people are more interested in watching

11:39

a documentary about space travel or

11:41

penguins than a basketball game

11:43

or a celebrity gossip show. And

11:46

they didn't just answer survey questions

11:48

differently, they also made different

11:50

choices in the psychology lab. In

11:52

one experiment, participants were

11:55

shown a range of headlines about climate

11:57

change and invited to pick the

11:59

most interesting article to

12:01

read. There were four headlines.

12:05

Two suggested climate skepticism

12:07

and two did not. Two were

12:09

framed as surprising, and

12:12

two were not. One

12:15

scientists find still more evidence

12:18

that global warming actually slowed

12:20

in last decade. Skeptical,

12:23

unsurprising. Two

12:26

scientists report surprising

12:28

evidence: Arctic ice melting

12:30

even faster than expected. Surprising

12:34

and not skeptical. Three

12:38

scientists report surprising

12:40

evidence: ice increasing

12:42

in Antarctic, not currently

12:44

contributing to sea level rise. Skeptical

12:47

and surprising. Four

12:52

scientists find still more evidence

12:54

linking global warming to extreme

12:57

weather. Neither surprising

12:59

nor skeptical. Typically,

13:02

we'd expect people to reach for the article

13:05

that pandered to their prejudices. The

13:07

Democrats would tend to favor a headline

13:09

that took global warming seriously,

13:12

while Republicans would prefer something

13:14

with a skeptical tone. Scientifically

13:17

curious people, Republicans

13:19

or Democrats, were different.

13:22

They were happy to grab an article which ran

13:25

counter to their preconceptions

13:27

as long as it seemed surprising and

13:29

fresh, and once you're

13:31

actually reading the article, there's

13:34

always a chance that it might teach you something.

13:37

A surprising statistical claim

13:39

is a challenge to our existing worldview.

13:42

It may provoke an emotional response,

13:45

even a fearful one. Neuroscientific

13:48

studies suggest that the brain responds

13:51

in much the same anxious way to

13:53

facts which threaten our preconceptions

13:56

as it does to wild animals which

13:58

threaten our lives. Yet,

14:00

for someone in a curious frame of

14:03

mind, in contrast, a surprising

14:05

claim need not provoke anxiety.

14:08

It can be an engaging mystery or

14:11

a puzzle to solve. You're

14:17

listening to an excerpt of The Data

14:20

Detective courtesy of Penguin

14:22

Random House Audio. The Data

14:24

Detective is a brand new book written

14:27

and narrated by me, Tim Harford,

14:29

and we'll be back with more after this

14:31

message. A

14:36

curious person might at

14:38

this point have some questions.

14:41

When I met Dan Kahan, the question

14:43

that was most urgent in my mind

14:46

was: can we cultivate

14:48

curiosity? Can we become

14:50

more curious? And can we inspire

14:53

curiosity in others? There

14:55

are reasons to believe that the answers

14:57

are yes. One reason,

15:00

says Kahan, is that his measure

15:02

of curiosity suggests that incremental

15:05

change is possible. When he measures

15:07

scientific curiosity, he doesn't

15:10

find a lump of stubbornly incurious

15:12

people at one end of the spectrum

15:15

and a lump of voraciously curious

15:17

people at the other, with a yawning gap

15:19

in the middle. Instead, curiosity

15:22

follows a continuous bell curve. Most

15:25

people are either moderately incurious

15:28

or moderately curious. This

15:30

doesn't prove that curiosity can

15:32

be cultivated. Perhaps that bell

15:35

curve is cast in iron. Yet

15:37

it does at least hold out some hope

15:40

that people can be nudged a little further towards

15:43

the curious end of that curve, because

15:45

no radical leap is required.

15:49

A second reason is that curiosity is

15:51

often situational. In the

15:53

right place at the right time, curiosity

15:55

will smolder in any of us. Indeed,

15:59

Kahan's discovery that an individual's

16:01

scientific curiosity persisted

16:03

over time was a surprise

16:05

to some psychologists. They had

16:07

believed, with some evidence, that there was

16:09

no such thing as a curious

16:11

person, just a situation

16:14

that inspired curiosity.

16:16

In fact, it does now seem that people can

16:19

tend to be curious or incurious.

16:22

That does not alter the fact that curiosity

16:24

can be fueled or dampened by

16:27

context. We all

16:30

have it in us to be curious or

16:32

not about different things

16:35

at different times. One

16:37

thing that provokes curiosity is

16:40

the sense of a gap in our knowledge to

16:42

be filled. George Loewenstein,

16:45

a behavioral economist, framed

16:47

this idea in what has become known as

16:49

the information gap theory

16:52

of curiosity. As Loewenstein

16:54

puts it, curiosity starts to

16:56

glow when there's a gap between what

16:58

we know and what we want to know. There's

17:01

a sweet spot for curiosity.

17:03

If we know nothing, we ask no questions.

17:06

If we know everything, we ask no questions

17:09

either. Curiosity is fueled

17:11

once we know enough to know that we do

17:14

not know. Alas,

17:16

all too often we don't even think about what

17:18

we don't know. There's a beautiful

17:20

little experiment about our incuriosity,

17:23

conducted by the psychologists Leonid

17:25

Rozenblit and Frank Keil. They

17:28

gave their experimental subjects a simple

17:30

task to look through a list of

17:33

everyday objects, such as a flush lavatory,

17:35

a zip fastener, and a bicycle,

17:38

and to rate their understanding of each

17:40

object on a scale of one to seven.

17:44

After people had written down their ratings,

17:46

the researchers would gently launch a

17:48

devastating ambush.

17:51

They asked the subjects

17:53

to elaborate. Here's

17:55

a pen and paper, they would say. Please

17:57

write out your explanation of a flush

18:00

lavatory in as much detail as you can;

18:02

by all means include diagrams.

18:05

It turns out that this task wasn't

18:07

as easy as people had thought. People

18:10

stumbled, struggling to explain

18:12

the details of everyday mechanisms. They

18:15

had assumed that those details would

18:17

readily spring to mind, and

18:19

they did not. And to

18:21

their credit, most experimental subjects

18:24

realized that they'd been lying to themselves.

18:27

They had felt they understood zip fasteners

18:29

and lavatories, but when invited to

18:31

elaborate, they realized they didn't

18:33

understand at all. When

18:35

people were asked to reconsider their

18:38

previous one to seven rating, they

18:40

marked themselves down, acknowledging

18:42

that their knowledge had been shallower than they'd

18:44

realized. Rozenblit

18:46

and Keil called this the

18:49

illusion of explanatory

18:51

depth. The illusion

18:53

of explanatory depth is a curiosity

18:56

killer and a trap. If

18:59

we think we already understand, why

19:01

go deeper? Why ask questions?

19:04

It is striking that it was so easy

19:06

to get people to pull back from their earlier

19:09

confidence. All it took was

19:11

to get them to reflect on the gaps in

19:13

their knowledge, and, as Loewenstein

19:16

argued, gaps in knowledge fuel

19:19

curiosity. There

19:21

is more at stake here than zip fasteners.

19:24

Another team of researchers, led by

19:26

Philip Fernbach and Steve Sloman,

19:29

authors of The Knowledge Illusion,

19:32

adapted the flush lavatory question

19:34

to ask about policies such as

19:36

a cap and trade system for carbon emissions,

19:39

a flat tax, or a proposal

19:41

to impose unilateral sanctions on

19:43

Iran. The researchers

19:46

importantly didn't ask people

19:48

whether or not they were in favor of or

19:50

against these policies. There's

19:52

plenty of prior evidence that such questions

19:55

would lead people to dig in.

19:58

Instead, Fernbach and his colleagues

20:00

just asked them the same simple question:

20:03

Please rate your understanding

20:05

on a scale of one to seven. Then

20:09

the same devastating follow

20:11

up: please elaborate. Tell

20:14

us exactly what unilateral sanctions

20:16

are and how a flat tax works, and

20:19

the same thing happened. People

20:21

said, yes, they basically

20:23

understood these policies fairly well. Then

20:26

when prompted to explain, the

20:29

illusion was dispelled. They

20:31

realized that perhaps they didn't

20:34

really understand at all. More

20:37

striking was that when the illusion

20:39

faded, political polarization

20:42

also started to fade. People

20:45

who would have instinctively described

20:47

their political opponents as wicked

20:50

and who would have gone to the barricades to defend

20:52

their own ideas tended to be

20:54

less strident when forced to admit

20:56

to themselves that they didn't fully

20:59

understand what it was that they were so

21:01

passionate about in the first

21:03

place. The experiment

21:06

influenced actions as well as words.

21:09

Research found that people became less likely

21:11

to give money to lobby groups or other organizations

21:14

which supported the positions they had once

21:17

favored. It's a rather

21:19

beautiful discovery in a

21:21

world where so many people seem to

21:23

hold extreme views with strident

21:26

certainty. You can deflate

21:28

somebody's overconfidence and moderate

21:30

their politics simply

21:32

by asking them to explain the details.

21:36

Next time you're in a politically heated argument,

21:39

try asking your interlocutor

21:41

not to justify herself, but

21:43

simply to explain the policy in question.

21:47

She wants to introduce a universal basic

21:49

income, or a flat tax, or a points

21:51

based immigration system, or Medicare

21:54

for All. Okay, that's

21:56

interesting. So what

21:59

exactly does she mean by that? She

22:02

may learn something as she tries to

22:04

explain. So may you

22:07

and you may both find that you understand

22:09

a little less and agree

22:12

a little more than you had assumed.

22:17

Figuring out the workings of a flush

22:19

lavatory, or understanding what a cap

22:21

and trade scheme really is, can

22:23

require some effort. One

22:25

way to encourage that effort is to embarrass

22:28

somebody by innocently inviting an

22:30

overconfident answer on a scale

22:32

of one to seven. But another

22:35

kinder way is to engage their

22:37

interest. As Orson

22:40

Welles said, once people are

22:42

interested, they can understand

22:44

anything in the world. How

22:47

to engage people's interest is neither

22:49

a new problem nor an intractable

22:52

one. Novelists, screenwriters,

22:54

and comedians have been figuring out

22:57

this craft for as long as they've existed.

22:59

They know that we love mysteries, are drawn

23:02

in by sympathetic characters, enjoy

23:04

the arc of a good story, and will

23:06

stick around for anything that makes us laugh,

23:09

and scientific evidence suggests

23:12

that Orson Welles was absolutely

23:14

right. For example, studies

23:16

in which people were asked to read narratives

23:19

and non narrative texts found

23:21

that they zipped through the narrative at twice

23:24

the speed and recalled twice as

23:26

much information later. As

23:28

for humor, consider the case of the

23:30

comedian Stephen Colbert's civics

23:33

lesson. Before his current

23:35

role as the host of The Late Show, Colbert

23:38

presented The Colbert Report in

23:41

character as a blowhard right

23:43

wing commentator. In March

23:45

twenty eleven, Colbert began a

23:47

long running joke in which he explored

23:50

the role of money in US politics.

23:53

He decided that he needed to set up

23:55

a political action committee, a

23:57

PAC, to raise funds in case

24:00

he decided to run for president. I

24:03

clearly need a PAC, but I have no idea

24:05

what PACs do, he explained to

24:07

a friendly expert on air. Over

24:10

the course of the next few weeks, Colbert

24:13

had PACs and super PACs

24:16

and 501(c)(4)s

24:18

explained to him from where they

24:20

could accept donations up to what limits,

24:23

with what transparency requirements,

24:25

and to spend on what. He

24:27

was to discover that the right combination of fundraising

24:30

structures could be used to raise

24:32

almost any amount of money for

24:34

almost any purpose with almost

24:37

no disclosure. Clearly,

24:40

(c)(4)s have created an unprecedented,

24:42

unaccountable, untraceable cash

24:44

tsunami that will infect every

24:47

corner of the next election, he mused,

24:50

and I feel like an idiot for not having one.

24:54

Colbert later learned how to dissolve

24:56

his fundraising structures and keep

24:58

the money without notifying

25:01

the taxman by repeatedly

25:04

returning to the topic and in

25:06

character demanding advice as to

25:08

how to abuse the electoral rules, Colbert

25:11

explored campaign finance in far

25:14

more depth than any news report

25:16

could have dreamed of doing. Did

25:19

all of this actually improve viewers'

25:21

knowledge of the issue? It seems

25:24

so. A team including

25:26

Kathleen Hall Jamieson, who also

25:28

worked with Dan Kahan on the scientific

25:31

curiosity research, used

25:33

the Colbert storyline to investigate

25:35

how much people learned amid the laughter.

25:38

They found that watching The Colbert Report

25:40

was correlated with increased knowledge about

25:43

super PACs and 501(c)(4)

25:45

groups: how they worked, what they

25:47

could legally do. Reading a

25:49

newspaper or listening to talk radio

25:51

also helped, but the effect

25:53

of the Colbert Report was much bigger.

25:56

One day a week of watching Colbert

25:59

taught people as much about campaign finance

26:02

as four days a week reading a newspaper,

26:04

for example, or five

26:07

extra years of schooling. Of

26:10

course, this is a measure of correlation,

26:13

not causation. It's possible

26:15

that the people who were already interested

26:17

in super PACs tuned in to Colbert

26:20

to hear him wisecrack about them,

26:23

or perhaps politics junkies

26:25

know about super PACs and also

26:27

love watching Colbert. But

26:30

I suspect the show did cause the

26:32

growing understanding because Colbert

26:34

really did go deep into the details, and

26:37

large audiences stuck with him

26:40

because he was funny. You

26:42

don't have to be one of America's best loved

26:45

comedians to pull off this trick. The

26:48

NPR podcast Planet Money

26:50

once shed light on the details of the global

26:52

economy by designing, manufacturing,

26:55

and importing several thousand

26:58

T shirts. This allowed

27:00

a long running storyline investigating

27:02

cotton farming, the role of automation

27:05

in textiles, how African

27:07

communities make new fashion out of

27:09

donated American T shirts, the

27:11

logistics of the shipping industry, and

27:14

strange details such as the fact

27:16

that the men's shirts, which were made in Bangladesh,

27:19

attract a tariff of sixteen point

27:21

five percent, whereas the women's

27:23

shirts, made in Colombia,

27:26

are duty free. These

27:28

examples should be models for communication

27:31

precisely because they inspire

27:33

curiosity. How does

27:35

money influence politics is

27:38

not an especially engaging question.

27:40

But if I were running for president,

27:43

how would I raise lots of money with few conditions

27:46

and no scrutiny is much more

27:48

intriguing. Those

27:50

of us in the business of communicating ideas

27:53

need to go beyond the fact check and the

27:55

statistical smackdown. Facts

27:58

are valuable things, and so

28:00

is fact checking. But if we

28:02

really want people to understand complex

28:04

issues, we need to engage their

28:06

curiosity. If people are

28:08

curious, they will learn.

28:12

I found this in my own work with a

28:14

team who make More or Less for

28:16

the BBC. The program

28:19

is often regarded affectionately as

28:21

a MythBuster. I feel

28:23

that our best work is when we use statistics

28:26

to illuminate the truth, rather

28:28

than to debunk a stream of falsehoods.

28:31

We try to bring people along with us as we

28:33

explore the world around us with the help

28:35

of reliable numbers. What's

28:38

false is interesting, but

28:41

not as interesting as what's true.

28:45

After the referendum of twenty sixteen,

28:47

in which my fellow British voters decided

28:50

to leave the European Union, the economics

28:53

profession engaged in some soul searching.

28:56

Most technical experts thought

28:58

that leaving the EU was a bad idea: costly,

29:02

complex, and unlikely

29:04

to deliver many of the promised benefits

29:07

or solve the country's most pressing

29:09

problems. Yet,

29:11

as one infamous sound bite put

29:13

it, the people in this country have

29:16

had enough of experts. Few

29:18

people seemed to care what economists

29:21

had to say on the subject, and to

29:23

our credit, I think professional

29:25

economists wanted to understand what

29:28

we had done wrong and whether we might

29:30

do better in future. Later,

29:32

at a conference about the profession

29:35

and the public, the great and the

29:37

good of the British economics community pondered

29:39

the problem. They discussed solutions.

29:42

We needed to be more chatty and approachable

29:44

on Twitter, suggested one analysis.

29:47

We needed to express ourselves clearly

29:49

and without jargon, offered many

29:51

speakers, not unreasonably. My

29:55

own perspective was slightly different. I

29:58

argued that we were operating in a politically

30:00

polarized environment in which

30:03

almost any opinion we might offer

30:05

would be fiercely contested by partisans.

30:09

Economists deal with controversial

30:11

issues such as inequality, taxation,

30:14

public spending, climate change,

30:16

trade, immigration, and

30:19

of course Brexit. In

30:22

such a febrile environment, Speaking

30:24

slowly and clearly will

30:27

only get you so far. To

30:29

communicate complex ideas, we

30:31

needed to spark people's

30:33

curiosity, even inspire

30:36

a sense of wonder. The great

30:38

science communicators, after all, people

30:41

such as Stephen Hawking and David

30:43

Attenborough, do not win over

30:46

people simply by using small

30:48

words, crisply spoken. They

30:51

stoke the flames of our curiosity,

30:54

making us burn with desire to learn

30:56

more. If we economists

30:58

want people to understand economics, we

31:01

must first engage their interest. What

31:04

is true of economists is equally true

31:07

for scientists, social scientists,

31:09

historians, statisticians, or

31:11

anyone else with complex ideas to

31:13

convey. Whether the topic

31:16

is the evolution of black holes or

31:18

the emergence of Black Lives Matter, the

31:21

possibility of precognition or

31:23

the necessity of preregistration. The

31:26

details matter, and

31:28

presented in the right way, they should

31:30

always have the capacity to fascinate

31:32

us, to awaken

31:36

our sense of wonder. I say to my fellow

31:38

nerd communicators, ignite

31:40

the spark of curiosity and give

31:42

it some fuel using the time

31:44

honored methods of storytelling, character,

31:47

suspense, and humor. But

31:50

let's not rely on the journalists and the

31:53

scientists and the other communicators

31:55

of complex ideas. We

31:57

have to be responsible for our own

31:59

sense of curiosity. As

32:01

the saying goes, only boring

32:04

people get bored. The world

32:06

is so much more interesting

32:09

if we take an active interest in

32:11

it. The cure for

32:13

boredom is curiosity, goes

32:16

an old saying. There is no cure

32:18

for curiosity. Just

32:21

so. Once we start to peer

32:23

beneath the surface of things, become

32:26

aware of the gaps in our knowledge, and

32:28

treat each question as the path

32:30

to a better question, we find

32:32

that curiosity is habit forming.

32:36

Sometimes we need to think like Darrell

32:38

Huff. There is a place in life

32:40

for the mean minded, hard nosed

32:42

skepticism that asks where's

32:44

the trick? Why is this lying bastard

32:47

lying to me? But while

32:49

'I don't believe it' is sometimes

32:52

the right starting point when

32:54

confronted with a surprising statistical

32:56

claim, it is a lazy

32:58

and depressing place to finish,

33:02

and I hope you won't finish there. I

33:04

hope that I have persuaded you that

33:06

we should make more room both for the

33:09

novelty seeking curiosity that says,

33:11

tell me more, and the dogged

33:13

curiosity that drove Austin Bradford

33:16

Hill and Richard Doll to ask

33:18

why so many people were

33:20

dying of lung cancer and whether

33:23

cigarettes might be to blame. If

33:26

we want to make the world add up, we

33:29

need to ask questions, open

33:31

minded, genuine questions,

33:35

and once we start

33:37

asking them, we may find

33:39

it delightfully difficult to

33:42

stop. That

33:46

was an extract from my new book, The

33:48

Data Detective. The International

33:51

edition is How to Make the World Add Up. Thanks

33:53

for listening, and keep on listening,

33:55

because Cautionary Tales is back on

33:58

the twenty sixth of February.

