Episode 4: Respect the Polygon

Released Tuesday, 19th April 2022

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:15

Pushkin, what's

0:20

your biggest fear? I'd

0:24

say the biggest fear is something

0:27

a mistake that I would make that

0:29

would damage my credibility to where

0:32

people would not listen to me when there's a tornado

0:34

down. James

0:36

Spann, meteorologist, maybe

0:39

Alabama's best known person aside from

0:41

some football coaches. He's

0:43

all over TV talking about the weather, especially

0:46

when the weather might kill you. So this is

0:48

a tornado emergency for the cities of

0:50

Tuscaloosa and Northport and the campus

0:52

of the University of Alabama.

0:55

James is one of those people who's never really had a

0:57

job because he found his calling.

1:01

He once stayed on the air as he watched a tornado

1:03

make straight for his own home, pleading

1:05

with people to see the risk. If

1:07

you're just joining us, this is James Spann

1:10

with Taylor Serrallo. I'm mainly checking

1:12

on my wife.

1:15

She's okay and she's in the tornado shelter. Okay,

1:17

go ahead, Taylor. I'm sorry. I was

1:20

put on this planet to mitigate loss of life

1:22

when there are tornadoes flying around here, and

1:25

I have to be very careful in what I

1:27

say and what I do, not just

1:30

on the air, but on social

1:32

media. And in real

1:34

life. To

1:37

build trust with his audience, James goes to incredible

1:40

lengths. He's published a children's

1:42

book called Benny and Chipper Prepared

1:45

Not Scared. He spends time

1:47

in dollar stores talking to people because

1:50

the people who shop in dollar stores are also

1:52

the people who live in trailer homes, the sort

1:54

of homes that tornadoes obliterate. He

1:57

memorizes the names of Alabamians who've

1:59

died in storms, people he

2:01

might have saved. There's lots of them.

2:04

On a single day back in April twenty eleven,

2:07

a line of tornadoes in Alabama killed

2:09

two hundred and fifty three people. I

2:11

know their stories, I know their family

2:14

members. I've talked to many of

2:16

them, and it's very motivating for me. And

2:18

that's my main job in life. It's to make

2:20

the warning process better with severe

2:23

weather. He's doing all he

2:25

can to warn people, yet

2:28

people still don't understand what he's saying.

2:33

I'm Michael Lewis. Welcome

2:35

back to Against the Rules, where

2:38

we explore unfairness in American

2:40

life by looking at what's happened

2:42

to various characters in American life. This

2:45

season is all about experts.

2:49

Today, we're going to explore the strange thing

2:51

that's happened to experts. Not all experts,

2:53

a certain percentage of them, the

2:55

experts who think and speak in

2:57

probabilities, who use data

3:00

to forecast the likelihood of this or that

3:02

coming to pass. The

3:04

experts who can never be perfectly certain,

3:06

and who risk our wrath because we love thinking

3:09

in absolutes. James

3:22

Spann has been making and explaining weather forecasts

3:25

for the better part of half a century. In

3:27

that time, it's kind of incredible how much has

3:30

changed. So here's a nineteen

3:32

seventy eight forecast: partly sunny

3:34

tomorrow with a chance of showers and a high of eighty.

3:37

That's it. So today, under

3:39

the same circumstances, I'd say we'll

3:42

have a pretty good bit of sunshine between nine

3:44

and eleven o'clock. After eleven o'clock rain

3:47

is likely between eleven and one. The chance of any

3:49

one spot getting wet during that two hour window

3:51

is about seventy five percent. It's going to rain

3:53

about a half inch in most places. There could be some

3:55

thunder. Most of that should be out of here by two

3:58

thirty. After three o'clock, you're good to go. The sun breaks

4:00

back out. The temperature should peak around eighty

4:02

at one o'clock, then falling back into the seventies

4:04

by four o'clock. That's the difference in what

4:06

we can do now compared to nineteen seventy eight.

4:08

It's the difference between daylight and darkness. If

4:11

you go back to the beginning of your career, were

4:13

you encouraged to speak to the audience that

4:15

way, like, we don't know that much about this,

4:18

this could be wrong. Oh no,

4:20

no, no, they didn't want you to say that. I mean, you couldn't,

4:22

you know. Back in the seventies, this was when

4:24

TV news was coming of age and Eyewitness

4:27

News, you know, and they wanted

4:29

to be this godlike figure, you

4:31

know on television. I was

4:33

scared to communicate uncertainty because that wasn't

4:35

encouraged. We were the news,

4:38

the evening news, the Ron Burgundy

4:40

newscast. Weather forecasts

4:42

are inherently uncertain, the

4:44

where, the when, the how much. With

4:47

the current data we have, the best you can do

4:49

is judge the odds. But

4:51

the odds have gotten much more accurate

4:53

over time. Back when James Spann

4:55

was a young meteorologist, he knew very

4:57

little but tried to sound like he knew

4:59

a lot. Now that he knows a

5:02

lot, he works hard to explain what he

5:04

doesn't know. You're

5:06

giving the audience more information

5:08

and more nuanced, honest information. So it's more demanding

5:10

on the audience, right

5:13

it is. And you know I

5:16

hear this all the time. I just want to know if it's going to rain tomorrow,

5:19

and they want a yes or no. They want

5:21

that deterministic forecast, deterministic

5:25

as in perfectly predictable, which

5:27

is something the weather still isn't. When

5:30

James Spann started out, the ten day forecast

5:32

was no better than just guessing. Now it's

5:34

a lot better. But maybe the

5:36

most obvious improvement, the one people

5:39

really should notice, has been in forecasters

5:41

understanding of the kind of weather that kills people.

5:45

In nineteen seventy eight, we were using nineteen fifty

5:47

seven era radar and

5:50

the old black and white printouts of

5:52

radar. It looked like somebody barfed on a piece of paper,

5:54

and so warnings. In nineteen seventy eight, let's

5:57

say we had a tornado down. We didn't

5:59

really know where it was, we had an idea, so

6:01

warnings were issued for an entire county.

6:03

Tornadoes, even the big tornadoes are

6:05

small and counties are huge. So

6:08

here you are warning an entire county

6:10

to get into your safe place and do something where

6:13

most people didn't need to do anything. Whereas

6:16

today we know literally within

6:18

maybe a few city blocks of where the tornado is

6:20

located. Well, so if I'm a

6:22

consumer of tornado warnings,

6:25

I get a much more precise warning,

6:27

and do I get a more advanced

6:29

warning? Am I likely to get

6:31

more time to prepare for this thing? Yes,

6:34

you do. Average lead time here is about

6:36

twelve to fifteen minutes, and the average

6:38

lead time back in the seventies was zero

6:41

to three minutes. So we've

6:43

come a long way, and we don't use counties anymore.

6:45

We use small, small, small segments of counties.

6:47

Geometric shapes, polygons. Anybody

6:50

that knows James Spann, I've said this over and over.

6:52

Respect the polygon, and if you're in it, you do

6:54

something. Respect the polygon.

6:57

And if you're in the polygon, you respect the polygon.
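
(An aside for the technically curious: "if you're in it" is, in software terms, a point-in-polygon test. Below is a minimal sketch in Python using the standard ray-casting method, with made-up coordinates; it illustrates the kind of check that decides whether a location sits inside a warning polygon, and is not any actual warning system's code.)

```python
# A minimal point-in-polygon sketch (ray casting), illustrating the check
# behind "if you're in the polygon, you do something". Coordinates are
# invented; this is not any warning system's actual code.

def point_in_polygon(lat, lon, polygon):
    """Return True if (lat, lon) falls inside `polygon`, a list of
    (lat, lon) vertices. Cast a ray eastward from the point and count
    edge crossings: an odd count means the point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Only edges that straddle the point's latitude can be crossed.
        if (lat1 > lat) != (lat2 > lat):
            # Longitude at which the edge crosses that latitude.
            cross_lon = lon1 + (lat - lat1) / (lat2 - lat1) * (lon2 - lon1)
            if lon < cross_lon:
                inside = not inside
    return inside

# A hypothetical warning polygon and two locations (all coordinates made up).
warning = [(33.10, -87.70), (33.35, -87.55), (33.28, -87.20), (33.05, -87.35)]
print(point_in_polygon(33.21, -87.54, warning))  # True: you're in it, do something
print(point_in_polygon(32.90, -87.90, warning))  # False: outside the polygon
```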

6:59

Respect the polygon. Every

7:02

storm today will mean business. Respect

7:05

the polygon. A James Spann superfan did a remix

7:07

of his famous phrase, respect the

7:10

polygon. I

7:15

love this, of course, but it also raises a

7:17

question, why respect

7:20

the Polygon instead of just respect

7:22

what I say. It's weird. If

7:26

the James Spann back in nineteen seventy

7:28

eight had been as accurate as James

7:31

Spann is now, he'd have

7:33

endured hail storms of gratitude,

7:36

hurricanes of appreciation, tornadoes

7:39

of awe. But that's not

7:41

the weather he now lives with. Hello

7:45

friends, this is James Spann. It's time

7:47

to read some mean tweets.

7:49

And thanks to all of you for sending in the mean tweets.

7:52

I really appreciate it, from my heart. You

7:55

cost the people in the state millions of dollars

7:57

by your [bleep]

7:59

poor [bleep] forecasts. I

8:03

woke up today expecting snow. I blame

8:05

you, James. I got my dogs all excited

8:07

for nothing. James,

8:10

either you're the worst meteorologist I've

8:13

ever laid my eyes on,

8:16

or you have the worst luck at predicting the weather.

8:18

I think it's time to step down, brother. The

8:22

only difference between James Spann

8:24

and every other meteorologist is that

8:26

James reads his mean tweets on the air.

8:29

Just to show you where we stand, my producer

8:32

called up some weather tweeters. Here's

8:34

the kinds of things that people have to say.

8:37

Weather forecasting is the only

8:40

job you can have where you can

8:42

be wrong fifty percent of the time and still

8:44

make thousands of dollars. If

8:46

we were wrong fifty percent in our

8:49

jobs, we probably would be fired.

8:51

I know nothing about meteorologists, but

8:53

I know that you know they always wrong.

8:56

I'm one of those people that actually vainly

8:58

looks at the weather forecast because nine

9:01

times out of ten it's different from

9:03

the forecast. As technology

9:05

improves, they don't improve. They continue

9:08

to be... it's mind

9:10

boggling. You're

9:13

going to get the hate, not necessarily because

9:15

of your missed weather forecast, but just because

9:17

of who you are. You're a you're

9:19

a weather person, and you know you're

9:22

a stooge. You're a you

9:24

don't deserve to be on the planet. You shouldn't

9:26

be breathing air. People

9:29

have that attitude towards weather people. Oh

9:31

listen. So I cut off a basketball

9:33

game on Christmas Day in twenty fifteen,

9:35

and we had a tornado coming up on the southwest

9:38

part of the city here. It could have killed a lot of people,

9:41

So we had to cover up about twenty, twenty five minutes

9:43

of that game, and nobody lost their life.

9:46

The warning system worked beautifully but

9:48

this is Christmas Day. Joy to the world, peace

9:50

on earth, goodwill toward men. The first

9:52

email I got, you know what it said. It said you

9:55

should have been aborted by a coat

9:57

hanger. So this

9:59

is the stuff I deal with. I

10:01

mean, I'm amazed he still goes on the air.

10:04

His forecast just keep getting better

10:06

and better, but the job of being a meteorologist

10:09

just keeps getting worse and worse. But

10:11

I tell these young people, you know, you better have

10:13

a thick skin when you get out of here and you get

10:15

your first job, because they're going to come after you

10:17

when you foul that first forecast up. Back

10:22

in an earlier season of this show, I talked about

10:24

the problem of referees and a strange

10:27

phenomenon. A lot of refs

10:29

are getting better at their jobs. They

10:31

have new tools, they're better trained, they

10:33

get better feedback, they are less likely to repeat

10:35

mistakes. I mean, there's just no

10:37

way that the refs in pro sports are less

10:40

accurate than they were forty years ago when

10:42

there was no replay, less training and all

10:44

the refs got hired from the same old boys' club.

10:47

But they didn't used to need police escorts

10:49

from the arenas. Now they do.

10:54

In December of twenty twenty one, tornadoes

10:57

ripped through Kentucky. Weather

10:59

experts gave people lots of warning,

11:02

So this is just an explosive severe

11:05

weather set up, and that's the outlook

11:07

that we have heading our way, especially after

11:09

midnight through about eight o'clock in the morning. We

11:12

definitely need to stay aware of the weather

11:14

game. Meteorologists like this guy

11:16

on WHAS-TV in Louisville

11:19

were better than they'd ever been. Make

11:21

sure you have a way to wake up if

11:23

a warning is issued like this one

11:25

that we have. That night in Kentucky, at

11:27

least seventy seven people were killed, more

11:30

people than have ever died from a weather event

11:32

in the state's history. All

11:34

people had to do to survive was listen

11:36

to the experts, and still a

11:38

lot of them didn't. I

11:48

think that having data is a really

11:51

recent phenomenon. Rebecca

11:53

Goldin, math professor at George

11:55

Mason University. We didn't have data

11:58

about how things were, we didn't record

12:00

what happened previously. Then it's

12:02

only really recently that we think maybe

12:04

our lived experiences could

12:06

be in part based on something

12:09

probabilistic. Like a lot

12:11

of people who are good at math, Rebecca

12:13

noticed the confusion and wrongheadedness

12:15

of people who weren't. She also

12:17

noticed that even when statistics and these new

12:20

big piles of data were properly

12:22

explained, people didn't really

12:24

grasp their meaning. People have a hard

12:26

time being convinced by data. It's

12:28

just that they don't think that their experience

12:31

is in line with that data, and so

12:33

they dismiss it, or they

12:36

have other experiences that tell them that

12:38

there are reasons to be skeptical of the source

12:40

of that data or the source of the

12:42

statements that are relying on the data.

12:44

The problem isn't just in the quality of the information

12:47

we have access to. It's in

12:49

the way we make sense of the world. There's

12:52

a very large segment of the population

12:54

who really struggle with basic mathematics.

12:58

So people are making mistakes because

13:00

they don't think in probabilities. I

13:02

think that's right. Rebecca

13:04

actually helped to start an organization called

13:07

STATS to expose the statistical

13:09

mistakes made by journalists. She

13:12

thought that if statistics were conveyed more accurately

13:14

to the public, the public would see the

13:16

world more clearly. Eventually,

13:19

she decided she was wasting her time because

13:21

there was this bigger problem:

13:24

how people comprehend statistics,

13:26

even when they're accurate. Why

13:29

is it that people don't think in probabilities,

13:32

like the world's probabilistic. Why

13:34

are our minds so

13:36

deterministic? It's kind of a philosophical

13:39

question. I think we're hardwired to believe.

13:41

I think it helps us make

13:44

decisions without being stressed about those

13:46

decisions. It helps us act

13:48

with certainty and make decisions

13:51

so we don't

13:53

hesitate too much and think too

13:55

hard. In the savannah, we don't

13:57

say that's probably a lion, right,

14:00

we just run. But

14:05

to just run is less and less

14:07

a viable way to move through the world,

14:10

because this relatively new thing called

14:12

data has given us a far shrewder

14:14

alternative. Everywhere you turn,

14:17

you find someone analyzing data to

14:19

generate the same sort of probabilistic

14:21

understanding of the world that weather people

14:23

do. I

14:25

kind of come from this world

14:28

of like, you know, kind of quants

14:30

and like baseball geeks and like poker

14:33

players. That's Nate Silver.

14:35

He got swept up in the nineteen nineties by the

14:37

statistical revolution in baseball. And

14:40

I think I'm kind of like one of the

14:42

relatively few people who's kind of escaped, so to speak,

14:45

from that world into, like, mainstream society.

14:48

Back in two thousand and seven, Nate

14:50

quit forecasting the future of young baseball

14:52

players. He began to forecast

14:55

elections instead. It's sort

14:57

of what you're doing is actually accepting the possibility that maybe

14:59

you can predict something. That's right.

15:02

But yeah, it's like kind of like saying,

15:04

hey, look, we built an audience for this in

15:08

baseball, and so politics

15:11

is still in the Stone Age, and so there

15:13

must be kind of an audience for this in politics too.

15:16

When you turn your attention to politics,

15:19

at what point are you aware that

15:22

the expertise in political

15:24

forecasting is sort of limited,

15:28

that there's kind of an opportunity. I

15:30

mean I had an intuition about

15:33

that from the very beginning in politics,

15:35

I mean, the campaigns have to be fairly

15:38

smart and data driven about who they were targeting. But

15:40

like, but the media was all about kind of

15:43

narratives. It

15:45

was really quite bad in two thousand and eight. Right.

15:47

It's really like a bunch of like, you know,

15:50

old white men getting together and kind of deciding

15:52

based on, you know, what

15:54

their friends think, kind of what the narratives

15:56

should be. In

15:59

the presidential primaries of two thousand and

16:01

eight, Nate Silver gave an upstart

16:04

senator named Barack Obama a

16:06

much better chance than most everyone else did. In

16:09

the general election. He nailed not just the outcome,

16:12

but the result in every state, plus

16:15

the precise number of votes Obama received

16:17

in the Electoral College. People

16:19

paid more attention to what Nate had done

16:22

than how he had done it. He'd

16:24

simply used polling data rather than his gut

16:26

or some anecdote about some Iowa farmer.

16:30

The polling data might not be perfect, but

16:32

it was better than every other source of information,

16:36

And he never made outright predictions.

16:38

He issued political forecasts like weather

16:40

forecasts, with probabilities

16:42

attached to them. Going into election

16:45

day of two thousand and eight, he'd given Obama

16:47

a ninety point nine percent chance of winning.

16:50

I mean, the ironic thing about it is like

16:53

there was always a chance that we would be

16:55

wrong, you know what I mean, and probably

16:57

never heard from in politics again,

16:59

potentially. Instead,

17:02

Nate became basically overnight the

17:05

country's leading political forecaster because

17:08

his expertise was superior to the

17:10

storytelling it replaced. Nate

17:14

Silver is his name, fortune

17:16

telling is his game. He's

17:19

a celebrity statistician.

17:22

Please welcome Nate Silver. That's

17:26

right, Nate Silver's the Good Will Hunting

17:28

of political prognostication.

17:30

There's a difference between weather forecasting

17:32

or sports statistics and politics, a

17:35

difference more of degree than kind, but

17:37

still a difference. The people who

17:39

celebrated Nate Silver really

17:42

really didn't understand how to judge him.

17:45

His better insights into polling data had

17:47

allowed him to see that Obama was basically always

17:49

doing better than political pundits thought he was

17:51

doing, but there was still no

17:53

law that said Obama had to win. Polling

17:57

data is a bit like the data that card counters

17:59

get in blackjack. It's a lot

18:01

better than having no data at all. It

18:03

helps you to predict what comes next, but

18:06

even card counters lose lots of hands.

18:09

And here we go, ladies and gentlemen, welcome to

18:12

Decision Night in America. Here at NBC's

18:14

Democracy Plaza, which brings us to two thousand sixteen.

18:20

Nate Silver now had an enormous

18:22

following. Once again, the pundits

18:25

gave Hillary Clinton better odds than the polls.

18:28

On election day, Nate gave Donald Trump

18:30

a roughly thirty percent chance of winning.

18:33

At the time, that was a radical call.

18:35

Few traditional pundits thought Trump had

18:38

that much of a shot. Yeah, I guess the question,

18:40

guys, are we post Nate Silver, are

18:42

we polled out? Well, they've been wrong, not

18:45

only just wrong, they're just they're superfluous.

18:47

And at the point where they just that's when you kind of begin

18:49

to realize that, like, the

18:52

way you define success and the way other people

18:55

look at your forecast as being successful are

18:58

very different. And also because it wasn't just that, like we

19:00

got criticized after twenty sixteen for having

19:03

quote unquote been wrong, it was also in

19:05

the run up to twenty sixteen people were actually mad at us for not

19:08

being confident enough in Clinton's

19:10

chances. Right, Nate never

19:13

claimed to have some mystical ability to call

19:15

a presidential election, and

19:17

assigning probabilities is not the

19:19

same as taking sides. Yelling

19:22

at him for saying that Donald Trump had a thirty

19:24

percent chance of winning was like being

19:26

mad at the weather man for saying there was a thirty

19:28

percent chance of rain and

19:30

then getting mad all over again after

19:33

it rains. I'm gonna get myself in a

19:35

little bit of trouble for saying this, right,

19:37

But like people like me really

19:40

care about being

19:42

right quote unquote for

19:45

the intrinsic value of like making a good forecast

19:48

as opposed to like influencing the

19:51

narrative, if you will. Okay,

19:56

so I would love because it would

19:58

educate me. How do you evaluate

20:01

a probabilistic

20:04

forecast? What's the right way for

20:06

people to judge Nate Silver's

20:08

expertise? The right way

20:10

is if you take a whole bunch

20:12

of forecasts that we've made and look

20:15

at how they've done collectively. Right, So,

20:17

let's say you made one hundred forecasts where

20:20

the favorite had a seventy percent chance of winning. Look

20:22

at that group of forecasts, and was

20:24

it true that the favorite actually

20:27

won about seventy percent of the time? Right?

20:29

The flip side of that is that like it

20:32

does mean that, like, you can

20:34

tell very little from

20:36

any one prediction. I mean, unless you're like, unless

20:38

you're very very close to

20:41

one hundred or zero percent, right, then

20:44

one prediction alone won't tell you that much.
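
(To make Nate's yardstick concrete, here is a small sketch of the calibration check he describes, with invented forecasts and outcomes: group forecasts by the probability they assigned, then see how often the predicted outcome actually happened in each group. A well-calibrated forecaster's seventy percent calls come true about seventy percent of the time.)

```python
# A toy calibration check: bucket probabilistic forecasts by the probability
# they assigned, then compare that probability to the observed hit rate.
# The forecasts and outcomes below are invented for illustration.

from collections import defaultdict

# Each record: (probability given to the favorite, did the favorite win?)
forecasts = [
    (0.7, True), (0.7, True), (0.7, False), (0.7, True), (0.7, False),
    (0.7, True), (0.7, True), (0.7, True), (0.7, False), (0.7, True),
    (0.9, True), (0.9, True), (0.9, True), (0.9, True), (0.9, False),
]

buckets = defaultdict(list)
for prob, won in forecasts:
    buckets[prob].append(won)

for prob in sorted(buckets):
    outcomes = buckets[prob]
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"said {prob:.0%} -> came true {hit_rate:.0%} of {len(outcomes)} times")

# Any single forecast, right or wrong, tells you very little; only the
# collection of forecasts reveals whether the probabilities were honest.
```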

20:48

Experts have gotten better, but they've

20:50

also gotten harder to judge, so

20:53

hard that you need an expert to judge them.

20:55

And that's a problem, right? I mean, who's

20:57

going to go to the trouble of evaluating hundreds

21:00

of Nate Silver's forecasts. And while

21:02

it's true that he's made thousands of election forecasts,

21:05

he hasn't made thousands of forecasts for presidential

21:07

elections. Most people don't even

21:09

think about elections or forecasts or anything

21:12

else the way Nate Silver does. Most

21:14

people don't even speak his language. I

21:16

actually think that the

21:18

word uncertainty is used in English

21:21

in a very different way than uncertainty

21:24

is used in statistics. Rebecca

21:27

Goldin again. So when we talk

21:29

about uncertainty in statistics, we

21:31

might say something about a confidence interval,

21:34

or we might use a p-value.

21:36

I'm not really sure you want this on your podcast,

21:38

Like, maybe that's a little bit too technical.

21:42

It might be better to

21:46

I'm trying to think of how it might be better

21:49

to talk about uncertainty for your... Well,

21:51

this is the root of the matter. So, because

21:54

it's not just my podcast listeners who are

21:56

a cut above the average human being,

21:58

it's like, how the American public understands

22:01

uncertainty? How do you

22:03

convey it? I think the best way to talk

22:06

about it is to actually put

22:08

it with specific numbers, Like

22:11

instead of talking percentages, let's

22:13

talk about numbers. Instead

22:16

of ten percent, say one in ten,

22:18

that kind of thing. But there's a

22:20

much bigger problem behind all this,

22:23

an emotional problem. It

22:26

comes from us wanting certainty in situations

22:28

where certainty just doesn't exist. If

22:31

the weatherman says there's an eighty percent chance

22:34

of rain and it doesn't rain,

22:36

the people on the receiving end of the forecast don't

22:39

say, oh, that was one of the twenty percent, that

22:41

was one of the times when it wasn't going to rain. They say the expert

22:43

doesn't know what he's talking about. So the

22:46

inability to think in terms

22:48

of probabilities also becomes

22:50

an inability to evaluate the experts.

22:53

There's a huge amount of

22:57

inability to evaluate who is an expert,

23:00

and yeah, it costs lives, it really

23:02

does. A

23:04

new kind of expert appears on the scene, an

23:07

expert who works with these new big piles

23:09

of data, an expert who thinks

23:12

in probabilities, an expert

23:14

who admits to being uncertain. These

23:17

new experts are clearly better than the

23:19

experts they replaced, and

23:22

yet people treat them as if they're worse and

23:24

neglect their advice, even when

23:26

their lives depend on it. We

23:29

who depend on the experts still want

23:31

them to have a definitive answer. Either

23:34

it will rain or it won't. Trump

23:36

either will win or he won't. But

23:39

that's not the nature of the world we live

23:41

in, and we're having some trouble

23:44

accepting that fact. The

23:52

more you look for it, the more you see this

23:54

problem. We've been talking about the

23:56

problem of the experts getting better yet being treated

23:59

as if they've gotten worse, a problem

24:01

that leads to a lot of mystifying behavior,

24:04

like what's gone on in the past two years inside

24:07

the American healthcare system.

24:10

I definitely saw a lot of this coming.

24:13

Alison Fearing is a nurse at Rush

24:15

Copley Medical Center in Aurora, Illinois.

24:18

I can't tell you how many times I've

24:21

had somebody come in and say I

24:23

have this because I read X, Y

24:25

and Z on WebMD, so I know

24:27

that that's what's going on, and it's like, well, there's a lot

24:29

more that goes into it that we need

24:32

to work up further, because there are

24:34

also other things that this could be and

24:36

we won't know this until we diagnose it

24:38

with lab work or a

24:40

CAT scan or whatever. Has this

24:42

been going on your whole career? I

24:45

would say it's definitely gotten worse

24:48

over the past five

24:50

years or so. I think prior

24:53

to that there was a bit of it, but definitely

24:55

not to the extent that there is now.

24:59

Modern medicine is one of the great miracles

25:01

of our age. If you went to a doctor in

25:03

the nineteenth century, he was more likely to kill

25:05

you than cure you. Now

25:07

he's vastly more likely to cure you,

25:10

and the odds of that get better with each

25:12

passing day. Do you

25:14

think there's anything that, if

25:16

it happened to me that required me to go to the emergency

25:18

room, that I'd be better off

25:20

forty years ago than now. Honestly,

25:23

no, I really can't think of a single thing, just

25:25

because we have so much technology. At

25:29

first glance, she's not really like James

25:31

Spann or Nate Silver. Doctors

25:34

and nurses don't usually speak in

25:36

probabilities, but her

25:38

expertise is essentially probabilistic.

25:41

Behind her is a world of medical science

25:43

that's calculating the odds all the time,

25:47

the odds that you have this disease as opposed

25:49

to that one, the odds that this

25:51

treatment will work versus that one. Every

25:54

year, Gallup publishes polls that show nursing

25:56

as America's most trusted profession. But

25:59

every year the number of people who say they trust

26:01

nurses and doctors, that

26:03

number keeps falling. Alison

26:06

sees it in the number of patients who argue with

26:08

a diagnosis or treatment. I

26:11

for one, wouldn't like not ever

26:13

like go to my mechanic with my car and be

26:15

like, oh, I know it's you know whatever,

26:18

because I know nothing about cars, Like I know

26:20

nothing, and so it's just really

26:23

wild for me to see that in

26:25

healthcare, because a

26:27

human body is so significantly more

26:30

complicated than a car. If I went

26:32

on Facebook and I said, if you go jump

26:34

off the Bay Bridge, you'll fly and it'll be the greatest

26:36

experience of your life, there aren't a whole lot of people

26:38

gonna go jump off the Bay Bridge. What

26:41

is it about this kind of information that

26:44

causes people to respond to it? The

26:46

answer popped into my head as soon as I'd asked

26:48

the question, which I realize raises a

26:51

question about the question. There

26:53

are exactly zero examples of

26:55

people jumping off the Bay Bridge and flying

26:57

to safety because there's no

26:59

uncertainty involved. However,

27:02

there are plenty of examples of doctors and nurses

27:04

being wrong because medical

27:06

expertise is a series of probabilistic judgments.

27:09

The experts are using huge piles of data

27:11

to judge the odds, the odds that the vaccine

27:14

will make you ill or keep you safe,

27:17

which brings us to our most recent national

27:20

crisis. I

27:22

have been hit while trying

27:24

to perform a COVID swab on somebody

27:27

who is very clearly dying and like crashing

27:29

fast, and we need to do everything fast, and we say

27:32

what we're doing. Hey, I'm Ali,

27:34

I'm your nurse. Today, I'm going to be swabbing your nose real

27:36

quick for a COVID swab and getting

27:38

batted at. You know, told that this is

27:40

not COVID, that this is just bronchitis, and

27:43

just give me treatment for bronchitis. This isn't

27:45

what's going on. And it's

27:47

like watching somebody basically

27:51

break down right in front of you and watching them

27:53

choose to basically

27:56

not help themselves. You've

27:58

heard some version of these stories, and

28:01

you likely have an opinion about them.

28:03

But what haunts Alison is

28:05

one particular case. He was

28:07

a member of our local police department and

28:10

you know, worked as a cop. I

28:12

mean, he was like the epitome

28:14

of health. He had no pre-existing issues,

28:16

nothing else going on, and

28:19

unfortunately he was unvaccinated and

28:22

came in very very ill. I

28:25

took some time to call his wife and explain

28:27

to her what was going on, what

28:30

the game plan was, what we knew so far,

28:33

and the first thing that she had said to me was

28:35

just so you guys know, he does not want to be intubated.

28:38

He knows what happens when people go on the

28:40

ventilator, and he knows that hospitals are

28:42

killing people with this. So this is a

28:44

police officer who thinks hospitals are

28:46

killing people. Yes, I just

28:48

had to take a step back and be like, if

28:51

only you could see the

28:53

tears in your husband's eyes right now and see

28:56

how absolutely terrified

28:59

he is right now, and understand

29:01

that this is not something that we're

29:05

wanting to do. Alison was

29:07

faced with a man with a severe case of COVID

29:09

who had refused the vaccine. Now,

29:12

the man's wife wouldn't let the hospital

29:14

improve his odds of living. We had

29:16

an extra thirty minutes or so before I

29:18

had to take him up to the ICU, and

29:21

so I thought, Okay, you know, I

29:23

know how this is going to go. I've seen how this

29:25

has gone. Alison's

29:29

patient wasn't insane, not in

29:31

the way a person would be if they jumped off the Bay Bridge

29:33

thinking they were going to fly when there are zero

29:35

chances of that happening. The patient

29:38

had refused a vaccine. There are

29:40

actual true stories of people getting sick

29:42

from the vaccine, and look at all

29:44

those unvaccinated people who were totally

29:46

fine. The wife was

29:48

refusing to allow the treatment most likely

29:50

to save his life. Well, there

29:52

are actual true cases of people being put

29:54

on ventilators when they'd have been better off if they hadn't.

29:58

In a probabilistic world, improbable

30:01

things do happen. We

30:03

hear stories of the unlikely thing coming

30:05

true or not coming to pass, and

30:08

they stick in our minds. But

30:11

so does Alison's story.

30:13

I went into his room and you know, gowned

30:16

up and everything, and asked, hey, like,

30:18

we have a few minutes, do you want to try FaceTiming your

30:20

family? And so we get his phone

30:23

out, we FaceTime his wife, and a

30:25

couple minutes later, she says, hey,

30:27

do you want to say hi to the boys? The boys want to say hi

30:29

to you. And she brings on his young

30:31

children and I mean they were like three

30:34

and five years old, if I had to guess. In

30:40

my heart, I knew this might be the

30:42

very last time that they ever get to see

30:45

their dad. And they

30:47

start saying, Daddy, Daddy, we love you,

30:49

we love you. And then one says, why

30:51

aren't you saying it back, dad? And I

30:53

had to pan the camera over to myself

30:56

and say, oh, no, he's saying it back. You

30:58

just can't hear him. The machines are really loud

31:00

in here, but your daddy loves you, and

31:02

he's saying it back too, I promise. And

31:05

we got off the phone call and I got

31:07

him up to the ICU, and I had to take a good ten

31:09

minutes to go to the bathroom and just cry.

31:12

What happened to him? He unfortunately

31:15

passed away a week later. We're

31:20

not wired to see the odds. We're

31:23

not wired to accept the expertise that

31:25

falls out of a giant pile of data,

31:28

but our minds still long for the simple

31:30

answer rooted in our personal

31:33

experience or some story

31:35

we've heard, even

31:37

when the simple answer kills us. We

31:41

don't naturally respect the polygon, but

31:44

really we should. Really, we

31:46

should. Against

31:52

the Rules is written and hosted by me Michael

31:54

Lewis and produced by Catherine Girardo

31:56

and Lydia Jeancott. Julia

31:58

Barton is our editor, with additional

32:01

editing by Audrey Dilling. Beth

32:03

Johnson is our fact checker, and Mia

32:05

Lobell executive produces. Our

32:08

music is by John Evans and

32:10

Matthias Bossi of Stellwagon

32:12

Symphonette. We record our

32:14

show at Berkeley Advanced Media Studios,

32:17

expertly helmed by Topher Ruth.

32:20

Thanks also to Jacob Weisberg, Heather

32:23

Fain, John Snars, Carly

32:25

Migliori, Christina Sullivan,

32:27

Nicole Morano, Royston

32:29

Deserve, Daniella Lacan, Mary

32:32

Beth Smith, and Jason Gambrell.

32:36

And an extra special thanks to Sam Sharpels

32:39

for letting us use his amazing Respect

32:41

the Polygon remix. Against

32:44

the Rules is a production of Pushkin Industries.

32:47

Keep in touch, sign up for Pushkin's

32:49

newsletter at pushkin dot fm,

32:52

or follow at Pushkin Pods. To

32:54

find more Pushkin podcasts, listen

32:56

on the iHeartRadio app, Apple Podcasts,

32:59

or wherever you listen to podcasts.
