Truth Seeking in a Dark Age | Phil Harper on DarkHorse

Released Saturday, 1st June 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:02

Hey folks, welcome to a joint presentation

0:04

of the Dark Horse Podcast and The

0:07

Digger. I am sitting today with Phil

0:09

Harper and that is a near impossibility

0:11

except for the fact that I'm in

0:13

the UK and I was here to

0:15

do a couple of events with the

0:17

World Council for Health, Tess Lawrie's organization.

0:19

One of them was last night and

0:21

Phil, you were present at that event

0:24

and we thought it would be a

0:26

good idea to sit down and compare

0:28

notes. You and I, of course met

0:30

at the Better Way Conference in this

0:32

very venue where we are sitting now.

0:34

That would have been two years ago.

0:36

Yep. Two years ago in this exact

0:39

city where it all started to feel like

0:41

a real thing for the first time, not

0:43

something that was happening on computer screens. People

0:46

came out, and it was more real than we

0:48

thought. That's quite right. Actually, it was the

0:50

first time for me meeting a lot of

0:52

people that I had been interacting with and

0:55

had been partnering with, in fact, over COVID —

0:57

the COVID dissident scene. So it was a

1:00

a powerful moment. Anyway, it's interesting

1:02

to be two years on from

1:04

that, and to see where we

1:06

are. How should

1:08

we? How should we start, then? Well, I wanted

1:10

to talk with you. I'm super glad that we've been

1:12

able to do that. Your

1:15

talk last night I think is as good

1:17

a place as any to kickstart. The name

1:20

of the talk was something like Dark Age

1:22

or Enlightenment: the hyper-novelty crisis. And

1:24

you posed this question which was interesting to

1:26

me because it split the room. At

1:29

the beginning, I would say, right about down the middle. Which

1:32

was surprising. Now, this question of

1:34

are we entering a

1:36

dark age, or are we perhaps leaving one

1:39

and reaching an age of enlightenment —

1:41

I didn't quite get your steer on that.

1:43

Not that you should lock yourself into a

1:45

position before we kick-start this thing, but

1:47

I didn't see your hand get raised, Bret.

1:50

Now I deliberately didn't raise it because

1:52

I didn't mean to be aloof, but when you're the

1:54

speaker, you don't want to

1:56

influence people because, you know, some of

1:58

them, you know, have

2:00

appreciated something you said from afar, and if

2:03

you put up your hand they don't want

2:05

to feel like they're wrong. So anyway

2:07

I kept my opinion to myself. On the

2:09

other hand, ah. It

2:11

is interesting. There was a time when

2:13

I thought I was the only person

2:16

who believed that we were in a

2:18

dark age. and then I met Steve

2:20

Patterson and I met him in part

2:22

because he was talking openly about the

2:24

fact that he believed we were in

2:27

a dark age. So I reached out

2:29

to him and it turned out that

2:31

not only did he agree on that

2:33

formulation, but he placed the beginning of

2:35

that dark age far earlier than I would

2:38

have. So I still don't know whether

2:40

I believe my formulation or his. but

2:42

I definitely believe this is a cryptic

2:44

dark age and that that is a

2:46

frightening prospect because in

2:49

normal human circumstances a dark age might

2:51

be a terrible thing to live through,

2:53

but consider what would happen

2:56

in our dark age. The power of

2:58

the tools at our disposal in combination

3:00

with the retreat of enlightenment values is

3:03

tremendously perilous, and frankly, I don't think

3:05

if we don't recognize that, I don't

3:07

think we get out of it. Yes,

3:09

and I think it is hard to

3:12

get a handle on that.

3:14

I spoke to Steve as well and it's

3:16

strange that we're covering the subject matter kind

3:19

of twice — this may be

3:21

going out after that or before that, I

3:23

will say. But yes, he mentioned round

3:26

about the nineteen-twenties as the start of this dark age.

3:28

But, like you say, if that's the case —

3:30

if you were to ask a

3:32

lay member of the public how they felt about

3:34

that, is that the case? Their

3:38

instinct would be that we've lived through

3:40

an enormous technological change over one hundred

3:42

years or more. Are those

3:45

two things irreconcilable, really? Well, no,

3:47

I don't know that they are

3:49

irreconcilable, because there's a difference between

3:52

a technological change and enlightenment. In

3:54

other words, you know, Nineteen Eighty

3:56

Four paints, albeit from a more technologically

3:58

primitive place in history, a

4:01

technologically enabled dark age.

4:03

So technology is not

4:06

synonymous with enlightenment. But

4:09

nonetheless, I do — or

4:12

I think, having encountered Steve

4:14

Patterson's perspective and now pondered quite a

4:16

bit what I really think the beginning

4:18

of the dark Age might have been,

4:21

I think the answer's going to be

4:23

something like this: A

4:25

dark age does not necessarily dawn

4:27

across civilization all at once. Maybe

4:29

you only call it a dark

4:31

age once it has, but it

4:33

dawns sort of field by field

4:35

and in my field, I

4:38

don't think the dark age

4:40

set in until nineteen seventy-

4:42

six, or thereabouts. And

4:44

so it's quite possible that

4:47

Steve is right in general, and

4:49

that my experience is exceptional, because I

4:51

was in a field that was lively

4:54

until shortly after I was born. So

4:56

I was sort of a you know,

4:58

I grew up in the wake of

5:00

this period of discovery that was real

5:03

and in fact I knew the people

5:05

who had in large measure brought about

5:07

the discoveries of that era. So to

5:09

me I wouldn't have detected the Dark

5:12

Age because you know effectively in my

5:14

part of the world we were learning things, right?

5:16

So that's a different experience.

5:19

But if we, if we imagine that

5:21

it's a bit more granular than

5:23

we would ordinarily say — that it only

5:25

looks like a dark age across civilization

5:28

in retrospect, but as it dawns it's

5:30

piecemeal — then that would make a lot

5:32

of sense of how you would reconcile

5:34

both the technologically vibrant aspects of the

5:37

period that Steve claims is entirely dark-

5:39

age level and the difference in his

5:41

opinion and my opinion about when exactly

5:43

that began. So what is holding

5:46

it back, then? What is this 'it', as it

5:48

were? Because

5:50

let me put some examples out,

5:53

because last night I said this

5:55

phrase a lot — that

5:57

it's an amorphous blob of a

6:00

concept, in my view, sometimes too unspecific —

6:02

or maybe it is in fact a much

6:04

more unambiguous thing that's holding progress

6:07

back. And I think that seems broadly

6:09

right, that wherever you look you often

6:11

find there seems to be some scandal

6:14

at the heart of so many disciplines.

6:16

But you mentioned nineteen seventy-six or

6:18

thereabouts — what exactly is

6:21

it, do you think? What was it

6:23

down to? Well, I think —

6:25

you know, and again, I know my

6:28

field better than others — but

6:30

I think several things have conspired

6:34

against us. This episode is sponsored

6:36

by Paleo Valley. Paleo Valley makes

6:36

a huge range of products, from

6:39

supplements like fish roe and organ

6:41

complex, to grass-fed bone broth protein

6:43

and superfood bars. Everything we've tried

6:48

from them has been terrific. I've spoken before

6:50

about their beef sticks, which are one hundred

6:52

percent grass fed and finished, organic, and

6:54

naturally fermented. Today I'm going to talk

6:57

about their Superfood Golden Milk. Golden

6:59

milk, also known as turmeric milk, is a

7:01

delicious, nutritious hot drink rich in turmeric,

7:03

usually made in a base of either milk

7:05

or coconut milk. Turmeric

7:08

is a flowering plant in the ginger family

7:10

and grows across much of tropical Asia. Just

7:13

as with ginger, the rhizome of

7:15

turmeric has been used culinarily

7:17

and medicinally across cultures for a very

7:19

long time. Modern research supports ancient

7:21

traditions, and we now know that

7:23

turmeric is an antioxidant and anti

7:26

inflammatory, among many other beneficial

7:28

mechanisms. A

7:30

particularly delicious way to get turmeric in your

7:32

diet is through golden milk. Enter

7:34

Paleo Valley Superfood Golden Milk. Paleo

7:37

Valley's delicious product has turmeric, of

7:40

course, and also ginger, cinnamon, black pepper,

7:42

coconut milk powder and a little bit

7:44

of monk fruit to add sweetness, along

7:46

with several species of mushroom: lion's mane,

7:48

reishi, shiitake, and cordyceps. It's

7:51

gluten free, grain free, soy free,

7:53

non-GMO, and it's delicious.

7:56

Paleo Valley doesn't cut corners either. They

7:59

source only the highest quality ingredients, and

8:01

they use the whole ingredient unlike many

8:03

competitor products. Their Superfood Golden Milk has

8:05

whole turmeric, not just curcumin, the component

8:08

of turmeric, and whole certified

8:10

organic mushrooms, not just mycelium. Golden

8:13

Milk is understood to reduce

8:15

inflammation, enhance cognitive function, support

8:17

immune function, improve digestion, and

8:19

increase endurance. Paleo

8:22

Valley is passionate not only about human

8:24

health but environmental restoration and animal welfare

8:26

as well. They're a

8:28

family-owned company. Try Paleo Valley's

8:31

Superfood Golden Milk today. You'll be

8:33

so glad you did. Head over

8:35

to paleovalley.com slash dark

8:37

horse for 15% off

8:39

your first order. One

8:43

of them is that there

8:45

was a shift from

8:47

the age of enlightenment

8:50

in which the discovery in

8:52

which the truth seeking was

8:54

done by people who

8:56

were rich for the most

8:58

part. Which I don't consider a good

9:00

thing. In fact it has substantial downsides

9:02

but it did have one important

9:05

upside which is that it

9:07

removes the readily

9:10

corruptible nature of the

9:13

people doing the discovery. Because they have no reliance

9:17

on funds and whatnot. It's not

9:19

about keeping a job and competing

9:21

for tenure. It's about immortality. I

9:23

guess what I would say is

9:26

wanting to discover something important enough

9:28

that your name gets remembered is

9:31

a little bit petty but it's

9:33

awfully close to a proxy for something

9:36

that would get you to strive to

9:38

actually see what's going on. Yeah, it's

9:40

very interesting actually because that brings things

9:42

into a very real space of how

9:45

things even get funded. That's very relatable.

9:48

That model wherever you

9:50

find it is deeply, deeply problematic

9:53

because where the money is is where the insights

9:56

are. You see this in the

9:58

media industry — you'll know this all too well with

10:00

what's happened with the Dark Horse podcast. You

10:02

can see inside the mainstream media industry, it's

10:06

very difficult to hold a view in

10:08

those spaces that's going to get you ejected

10:10

from the

10:13

theater. Right. And you

10:15

are constantly asking the question, you know,

10:18

even where there are places that you're

10:20

reasonably confident you know where things are

10:22

headed, there's a question of, well,

10:24

I'm going to hate to put it in these

10:27

terms, but is that the hill I want to

10:29

die on? Do I want to look at the

10:31

whole basket of things I'm interested in talking about?

10:33

Or do I want to talk about this one

10:35

topic where I am far enough out of the

10:38

mainstream that if I mention it, that will be

10:40

used to portray me as a

10:42

crank? And you know, what

10:44

do I really care if I'm

10:46

vindicated, you know, 50 years

10:49

after my death? And let me

10:51

point out, I do care

10:53

that the vindications come soon, not

10:56

because I think it's so important,

10:58

you know, personally, but because we are

11:01

in a battle over something. The question

11:03

of who we are in a battle

11:05

against is a totally valid one, but

11:07

we have to win. And so if

11:09

the vindications are very delayed, that means

11:12

in the meantime, we are not making

11:14

progress against the thing that threatens us.

11:17

So, so I do

11:19

think, you know, in answer to your question, one thing that's

11:21

going on is that the

11:23

fact that truth seeking has become a

11:26

career makes it extremely

11:28

vulnerable to corruption.

11:31

And if you've lived inside

11:33

of science, you either

11:37

rationalize the corruption around you, or

11:39

you discover that it's overwhelming, which drives

11:42

most people out. Well, it

11:44

would become like in any other job

11:46

at that stage, like, I mean, to

11:48

use a really crude example, who could

11:51

feel, you know, if you were

11:53

washing cars every day, there's a

11:55

story you would tell yourself about that. It's

11:58

completely different when someone is deciding

12:00

they're nobly gonna go and tell the truth.

12:03

Because once you see it as a job, as

12:05

a means to an end, it's

12:07

just a way to get through the

12:09

day. And there's a whole range of

12:11

gradients that you can use in that

12:13

story, all the way up from teaching

12:15

kids at kindergarten to brain surgery to

12:18

hopefully not flying a plane. Where it really matters.

12:22

But in the information sphere, in the

12:24

places where we're relying on expertise, like

12:26

what is happening in this space, in

12:29

a sense, there was

12:31

an academic in the UK, he was a

12:34

Canadian or American academic, and he died recently.

12:37

His name was, I'll have

12:39

to remember — Graeber — but he wrote this

12:41

book called Bullshit Jobs. Oh, his name

12:43

is David Graeber. Right. Is that right?

12:46

So, right. Yeah. This idea that everybody

12:48

just has this job

12:51

that doesn't really fulfill them and it's kind

12:53

of bullshit. And they privately know, but

12:56

publicly, they hold a very different perspective about that

12:58

position. Because they have

13:00

to play the game. That's much easier to understand in

13:02

really, in jobs that people don't

13:04

really want to do. Because

13:06

you just get on with it. Who wants

13:09

to, you know, be grinding all day long. But

13:11

in the intellectual space, I think that

13:13

same kind of thing actually happens. That

13:16

people find themselves in these positions, they

13:18

quietly, privately might get themselves thinking, you

13:21

know what, I'm not actually 100% sure with this, but

13:23

I don't have any avenue to go down to change the

13:25

course. I'm

13:28

stuck. I'm focused on that

13:30

because the weeding

13:32

process is

13:34

profound for reasons that

13:37

are arcane and

13:40

largely uninteresting. The

13:43

way the university solves its

13:45

problem, the university is fueled

13:47

by grant money. And

13:50

that grant money comes in when

13:53

the principal investigator can

13:56

spend as much of their

13:58

time writing grants as

14:00

possible, which means that the natural order

14:03

of things is for the principal

14:05

investigator to accumulate a laboratory of

14:07

people who do the actual work,

14:09

not only of the laboratory itself

14:12

but of teaching. And the

14:15

result is that the system takes

14:17

these people in as labor

14:19

that doesn't have to be paid

14:22

a full wage, because technically they're

14:24

students: they're seeking a degree and

14:26

they're being subsidized through teaching. Which

14:28

means the PI knows which

14:30

of their students are going to

14:32

get a job. The

14:34

students don't know. The students live under

14:36

this false rubric of imagining: yeah, if I

14:39

really, you know, stick to it, I'll

14:41

find something, I'll get a job, you

14:43

know. And by the time they get

14:45

to the end of graduate school, they're

14:47

bitter because they've discovered the truth. They've

14:49

discovered how much fakery there is, they've

14:51

discovered that you know that they are

14:53

being taken advantage of. And most people,

14:55

at the point that they realize that

14:58

they've wasted the better part of a decade

15:00

pursuing a degree that was used to pay

15:02

them in lieu of money, leave

15:06

and they're never seen again. They go

15:08

into industry or something. So

15:10

the point is, who's left over? Well,

15:13

the cutthroat people who were willing to

15:15

claw their way to the top are

15:17

left over, and the true believers are

15:19

left over. But it's not the people

15:21

you would want to be doing the

15:24

work. And so it's not as surprising

15:26

that the work is cruddy as the

15:28

public thinks it is. The work

15:30

is cruddy because the system is

15:32

broken and the process that the

15:35

system is stewarding is so delicate.

15:38

At. The difference between science that

15:40

is done well enough that it

15:42

actually produces the correct product

15:44

which is inside and science that

15:47

is very close to that process

15:49

but is distorted and produces sometimes

15:52

the upside down results that

15:54

difference is small were rivals. It's

15:56

like. And. ring and an enormous

15:58

you know ben nuclear fusion Not

16:01

as much as you might think. Well, but you

16:03

might know this. Nuclear fusion

16:05

is both capable

16:08

of releasing fantastic amounts of

16:10

energy from mundane starting

16:12

materials, but it is also

16:15

fantastically difficult to maintain the conditions in

16:17

which that energy is released so that

16:19

it becomes a self-sustaining process, which is

16:21

why we don't have fusion generated

16:24

electricity coming out of the wall yet and haven't.

16:26

It's always 20 years out, right? The

16:29

reason for that is that the conditions necessary

16:31

to make this marvelous process work are very

16:33

difficult to maintain, and I would argue science

16:35

is like that too. Yes,

16:38

the conditions for it are not always perfect. I guess

16:40

that's why you would always find these periods in history

16:42

where you have this massive explosion of activity

16:44

and you think in such a short

16:47

period of time something amazing happened. Because

16:49

the conditions for that were right. And

16:51

we've, you're arguing, and Steve is arguing

16:53

or making the case that something

16:56

changed. So times are

16:58

different. But you're saying maybe 1976,

17:00

Steve's saying maybe 1920, something

17:03

entered the frame. So if

17:06

people could, I mean, this is where, are

17:09

we entering a dark age or coming out

17:11

of one? I'm actually optimistic about

17:13

this in the sense that these tools that have been afforded

17:15

to us kind of free

17:17

people to some degree, not perfectly,

17:19

from some of those forces inside

17:22

the previous institutions of learning. And

17:25

you can now, if an audience or

17:27

an interested group come with you, the shackles

17:31

of all of that weird system of trying

17:33

to impress person A and looking

17:36

for money to fund the research, all of

17:38

that can kind of to some degree disappear.

17:40

But it's not super conducive to like the

17:43

heavy sciences where you need a budget to

17:45

do your work. Well,

17:47

I agree with you, but people really

17:49

have to break their addiction

17:52

to the proxy of

17:54

authority. If you

17:57

are of the belief, as I think most

18:01

normal people are until some

18:03

point, that in general the

18:06

authorities and the institutions in which they

18:08

live are basically right about the big

18:10

stuff — they make some really embarrassing errors,

18:12

but it's self-correcting over time — so basically

18:14

if you say, look, I'm one of

18:17

the enlightened people, but I don't have the

18:19

time or the capacity to study all

18:21

this stuff independently myself, so more or

18:23

less, with the acknowledgement that there is

18:25

a great deal of noise in the

18:27

system, the signal-to-noise ratio is

18:29

such that I'm going to trust the institutions and

18:31

the experts and I'm going to be

18:33

right a lot more often than I

18:36

am wrong. If you think that that's true,

18:38

you've got a problem, because something has

18:40

actually corrupted the institutional framework, so every

18:42

institution is now broken. And as soon

18:45

as you get out of that and

18:47

you say, actually, you know what, I

18:50

do need to trust other people because I can't do

18:52

all this work myself, it's impossible — who

18:54

do I trust? Well, I'm going

18:56

to trust people who

18:59

have predictive power, who see

19:02

things ahead. If I

19:04

track people's track record, then

19:06

I know something about whether or

19:08

not what they say is insightful,

19:10

and I wanna watch what they

19:12

do when they get it wrong.

19:15

If I know that they're capable

19:17

of saying, okay, here's what I miscalculated,

19:19

here's where I was mistaken, here's what I now believe, and

19:21

I'm now going to adjust my over-

19:23

arching model in order to restore its

19:26

capacity to predict more than other people's

19:28

models — those people,

19:30

it doesn't really matter what degree they have,

19:32

and I literally don't care if you're

19:34

self-taught, all the better. The

19:36

question is, does your model predict things

19:38

other people's model doesn't? That's the

19:41

real test, and a degree is

19:43

no longer a proxy for it. I

19:45

mean, the problem with that at a societal

19:47

level is that not everybody can do

19:49

it, and so you end up

19:51

in a situation where the

19:53

requirement for people who have expertise just

19:55

remains, and then you would become

19:57

a lone, sole voice in the dark saying

20:00

you ought not to trust

20:02

everything you hear from these institutions,

20:04

but people basically have no choice.

20:06

And I think — I've been

20:08

speaking to people about this, and I

20:10

mentioned it to Steve as well. I

20:13

rarely talk about this outside

20:15

of my work because the

20:18

social danger of speaking about

20:20

them. The lack

20:22

of trust — once you hold

20:24

these very critical views and you start to

20:26

talk about them out in society, it is

20:29

basically what I'd call an antisocial

20:31

position. Whereas when you're

20:33

in a sort

20:35

of semi-professional space, you can speak

20:37

to people and say, hey, did you

20:39

see this study that's just uncovered this?

20:41

And you'll often be talking with people

20:43

who are going through this process of

20:46

really deconstructing themselves and reconstructing

20:49

themselves anew. It's painful

20:51

and it's difficult. And

20:54

out there in the realms of

20:56

the real world, I'm super interested to

20:58

think, what strategies do we even adopt

21:00

that work at a societal level?

21:02

Because we can't say to everyone,

21:04

guess what, you're going to have

21:07

to go into full-time research. I

21:09

think it's also something that, you

21:11

know, Chomsky spoke about. He was — it's

21:13

interesting — he was one of the first

21:15

people to open me up to something

21:18

being wrong. And I've gone on a

21:20

journey from reading and engaging in the

21:22

works of Noam Chomsky to suddenly being

21:24

interested in someone like yourself, and if you

21:26

were to read those two people,

21:29

they are worlds apart. Yes.

21:31

But Noam Chomsky said, you know, actually,

21:33

if you wanted to really understand the

21:35

truth and get to the truth and

21:37

have some investment in the truth as

21:39

a citizen, it's a full-time research

21:41

project. You won't be able to live

21:44

your life, sort of. Yeah. But sure, I

21:46

mean, I hear what you're saying. I

21:48

don't disagree with any of it, but

21:51

you're using the wrong metrics. You're really

21:53

mixing two things. And this is part

21:55

of why Dark Age is not hyperbole.

21:57

You are saying, well,

22:00

if we really just abandon the idea

22:02

of listening to the experts, we

22:04

are lost at sea. We are

22:07

lost at sea. Yeah. And for

22:09

those of us who believe we

22:11

are in a dark age, the

22:13

answer is: yeah, that's the

22:15

message, we are lost at sea

22:17

already. Right. And the only argument

22:19

in favor of the experts would

22:21

be that listening to them is

22:23

actually better. Net

22:26

net, it's better for you to listen

22:28

to them than to ignore them. And

22:30

that's not a simple calculation. It sounds like

22:32

they can be right more than they're

22:35

wrong, but if what they're wrong about

22:37

results in you hurting yourself to

22:40

a degree that overrides the benefit

22:42

of what they do know, then

22:44

ignoring them is the right thing

22:46

to do, and seeking people who

22:48

are demonstrably expert in something is

22:50

a much better plan. So I'm

22:52

not telling you that

22:55

it is clever in a dark age

22:57

to abandon the university and listen to

22:59

people on podcasts instead, right, and do

23:01

your own research. I'm a huge fan

23:04

of it, but it is not a

23:06

substitute for functional institutions. But I

23:08

would also point out that's not my

23:10

fault. I've been pointing out the institutional

23:12

problem for decades and nobody was listening

23:15

until COVID. So we are —

23:18

the dark age, it's not a

23:20

metaphor. A dark age — it's the same

23:22

way with the Great Depression in the

23:25

US, right? Those of us who

23:27

are too young to have seen it

23:29

have this image of it as this

23:31

sort of sepia-toned realm. It

23:33

wasn't sepia-toned. The Great

23:35

Depression was in full color

23:37

for the people who lived it. So,

23:40

'this doesn't look like a dark

23:42

age'? Guess what, it wouldn't, right? And it

23:44

is one. And

23:46

that doesn't mean — I do want

23:49

to push back slightly on your

23:52

point about, ah — and

23:55

mine too, I use this term advisedly —

23:57

normies. And I

23:59

sometimes refer to myself as one. It's an

24:01

online meme, isn't it? And it is a good approximation.

24:03

It's a funny term. It's a

24:05

bit like muggles really in Harry Potter. Yeah, it's

24:08

a little bit like muggles, except I think if

24:10

you have a good model of it, the point

24:12

is there are topics on which all of us

24:14

are one. Oh, for sure. And then there are

24:16

topics in which we're not and that that is

24:19

actually useful for generating insight into what it's like

24:21

when you confront a normie with something like the

24:23

terrifying reality of what happened to us over COVID.

24:27

But I would point out this. During

24:30

COVID, and by COVID,

24:32

I mean the crisis, I don't mean the pandemic because

24:34

there wasn't one. I do believe there was a virus.

24:36

I do not believe there was a pandemic. Not

24:40

by any reasonable definition in any case. I

24:44

live in Portland in that time. Heather

24:47

and I began to experiment with something.

24:50

I think I started it because it's

24:52

sort of my nature anyway. But when

24:54

people would say normie stuff about

24:57

what was taking place, I

25:00

would make a point of not

25:04

agreeing and saying actually,

25:06

not how I see it. Yeah. And

25:09

I would give a rough approximation of

25:11

the place that I had ended

25:13

up. And it was

25:15

shocking how frequently the person on

25:17

the other end of that would

25:19

be liberated by knowing that you

25:21

were not signed up for the

25:23

mainstream narrative and they would just

25:25

start confessing their own doubts. Right.

25:28

So I think that's hovering out

25:31

there much more commonly than we

25:33

believe on more topics than we believe. It's

25:36

a huge gamble to take socially, though, I

25:38

think. I think we spoke at one time

25:40

about the difference between speaking about being

25:42

treated and the idea that people could be

25:44

treated during the COVID pandemic. Again, I know

25:46

there's a lot wrapped up in that now

25:48

in hindsight. I don't know

25:51

where I sit on that discussion or debate or whether I even

25:53

have a, you know, I'm not

25:55

super interested in how that winds up. But there

25:58

was this idea of the fear of discussing that, and

26:00

the fear of the V word here. It's

26:02

like the most overwhelming taboo. And so when

26:04

you get into those spaces of

26:07

challenging those things, it's a deeply,

26:09

deeply antisocial thing to do in

26:11

normal circles. And I think everybody

26:14

can feel that pressure. But

26:16

what you mentioned about the relief that

26:18

someone then takes when someone has said it,

26:21

I think it's very, very true. And to come back to

26:23

that analogy we used earlier about being

26:26

lost at sea. If

26:28

in this analogy we

26:31

collectively are lost at sea, we still

26:33

have leaders, let's say we have captains

26:35

and boat staff in this analogy. There

26:39

will be a time in this analogy where

26:41

the captain still is of the view that

26:43

we're not lost. Yeah.

26:46

And you may have

26:49

to take over the helm. But

26:52

what you want, the huge relief comes when

26:54

you are nudging and itching at this

26:56

captain, you're saying, look, there's the smoke

26:58

coming out of the engine room. The

27:00

sun is on the wrong horizon. The

27:02

star is on the wrong side. All

27:04

these signs are showing that something is

27:06

wrong. The captain isn't listening. I can

27:08

sense the relief that finally comes when

27:11

the captain sits down with the normal

27:13

people on the boat and says, we're

27:16

lost. That

27:18

is, I think, what people are trying

27:20

to itch towards. And I'm less sure

27:22

sometimes whether we will get that. Because

27:26

this catharsis that everybody wants, and I

27:28

really, truthfully put myself in that group,

27:30

I kind of want some

27:33

kind of acknowledgment from

27:35

someone important that something

27:37

went wrong over

27:39

the last three years. Absolutely. All

27:42

we will instead get is politics,

27:45

very slight adjustments here

27:47

and there, nudging. It

27:50

just doesn't look like we're actually going to get it. And

27:52

so to jump

27:54

from one crude analogy of being lost at

27:56

sea desperately wanting the captain to admit that

27:59

to another And I know this

28:01

is crude I really feel like

28:04

it is an analogy here about the

28:06

collapse of the Soviet Union I know

28:08

that sounds completely extreme, but as that

28:10

was happening imagine it from the inside

28:13

right people on the inside had two

28:15

personas There was a public persona

28:17

that had to play the game and

28:20

continue on as if everything was working

28:22

fine And there was a private persona

28:24

that even to oneself may have been

28:26

more private than you'd like to admit

28:29

I'm not saying that they would get home

28:31

and be like, oh my goodness, it's all gone

28:33

wrong. It's a private, growing feeling that something

28:35

is wrong and the two are in

28:37

tension Now those are the

28:39

people in our current setup that I'm interested

28:42

to speak with Because you know

28:44

we're now at this they thing again this

28:46

this idea of an acute Central

28:48

point in this enemy system. I just don't

28:50

see it like that. I see lots and

28:52

lots of people Trying

28:55

to get by in an

28:57

incomplete, imperfect system, who,

28:59

I think, I hope, I believe,

29:01

I have an intuition that there

29:03

is a quiet, growing skepticism inside

29:06

of them that something is

29:08

wrong. And maybe, much

29:10

like with what happened with the collapse of

29:12

the Soviet Union it had collapsed

29:15

a long time prior to

29:17

actually officially ending — you know, when it was

29:19

like, okay, this simply doesn't work because I

29:21

can't get shoes, I've just sold my child's

29:23

coat to pay for some potatoes — and it

29:26

is only so far that one's lying

29:29

eyes can continue to cover the fact

29:31

that something is very deeply wrong. Where

29:34

we are on that line, I

29:36

don't know, but I feel like we're certainly on

29:38

a trajectory on it I

29:40

love this point, right, because

29:43

we really need somebody excellent

29:47

with history to tell us, you

29:50

know, what was late-

29:52

stage Pravda like? Yeah. Right, as

29:54

Pravda continued to maintain the party

29:56

line and that collapsed at

29:59

the level of the

30:01

populace, what

30:03

was it like to be watching the television

30:05

with other people who were increasingly ready to

30:08

admit that it was preposterous? That

30:11

is bound to have been important. I would

30:13

actually say I have been to the Czech

30:15

Republic for the first time in my life

30:18

recently, this year. I

30:22

loved the Czech people. I

30:26

knew I would like them — I like people generally — but

30:28

the Czech people were so fantastic.

30:31

Part of what made them fantastic was that

30:35

they viewed themselves in

30:37

realistic terms. They had

30:40

lived through this painful era that they

30:42

were in no position to do anything

30:44

about. But the one thing they

30:46

could do something about was not buy

30:48

it. It was

30:50

like this little enclave of

30:52

people who knew better. But

30:55

yeah, they played the game. But I think they

30:57

were better at being honest with each other, or

30:59

at least some of them were. So

31:02

it creates this culture. I

31:04

would point to something. I mentioned it last night at the

31:06

event, but I really think

31:08

it's very important. In-person

31:12

relationships, especially

31:15

profound in-person relationships that

31:18

are truly honest, are

31:22

the antidote to this bullshit. The

31:25

ability to look one person in the eye,

31:27

right? Maybe it's your spouse. Hopefully

31:29

it is. But the ability to

31:31

look one person in the eye and

31:34

say something like, honey,

31:39

I can't tell exactly what's wrong with this story,

31:41

but I'm pretty sure they're lying to us. And

31:44

we are going to have to think about how to live through this

31:46

era. And

31:48

the problem is that I see a lot

31:51

of young people who have

31:55

reached the false sophistication

31:58

of thinking that a romantic relationship

32:00

is something that might be nice to

32:03

have or maybe is something that's more

32:05

costly than it's worth, and they're ready

32:07

to dispense with it. And my feeling

32:09

is you haven't even really understood what

32:12

its primary significance is. Certainly it's a hell

32:14

of a lot of work, but its

32:17

primary significance is that you

32:19

are actually, in some ways,

32:22

fusing your persona with somebody, and

32:24

hopefully you've picked somebody who is

32:26

worthy. I say this having

32:29

just met your wife — I'm thinking

32:31

you've done marvelously —

32:33

but that

32:36

is a very important immunity

32:38

to have, far more important than any

32:40

vaccine you might ever get,

32:43

right? And so that can be

32:45

cultivated in communities, but you know it's

32:47

one thing to have the television telling

32:49

you something insane. It's another thing to

32:51

be sitting next to somebody on the

32:53

couch who also knows it's insane. So

32:56

I think the relief that comes

32:58

from that shared reality that is, you

33:00

know, too tightly knit to be invaded

33:02

by propaganda is really important, and

33:04

much more so in anything that might

33:06

arguably be a dark age. And there

33:08

was a discussion last night that perhaps

33:10

that's why — and I

33:13

don't know where I sit on the

33:15

'why' — maybe that's why it came

33:17

under attack. The idea that — both

33:20

culturally, and literally at times, I

33:22

think, the

33:24

family unit came under attack. Culturally, I

33:26

say, because certainly

33:29

in my circle — my

33:32

social circle — it's much less common

33:34

now to see a person my age

33:36

settling down, starting a family. It's

33:38

getting delayed and delayed and delayed

33:40

further down the line. There's some

33:42

awareness of all of this in the group;

33:44

whether anyone has a solid view

33:46

about why that's happening, or whether —

33:49

more importantly — whether it's important,

33:51

is another thing. But

33:53

it's... we're

33:55

in this thing that people discuss openly,

33:57

and I'm not quite sure anyone

34:00

could really put a handle on where

34:03

the center of gravity is until much further down the line. It

34:05

might be 10, 15, 20 years

34:07

until we look back on this period. Yeah, well,

34:09

you know, this goes back to your question

34:11

about they. I am unembarrassed

34:15

about referring to a they I cannot identify,

34:18

and I don't think we should be embarrassed.

34:22

I'm coming from the perspective of an

34:25

evolutionary biologist. In fact, you

34:28

probably don't know — I'm a tropical biologist.

34:31

Now, to be a tropical biologist means that

34:34

you walk into the habitat

34:37

that you study, and

34:39

if you're any good at all, you know we don't know

34:42

the first thing about it, right?

34:44

Like really almost nothing. Tropical habitat is

34:47

highly complex, arguably the most complex thing

34:49

in the known universe if we include

34:51

that these things contain people and therefore

34:54

populations and shared cognition and all

34:56

of that. But in any case,

34:58

you look at a tropical forest and it

35:00

is not incorrect to

35:03

discuss what

35:06

creatures might have absorbed

35:08

a particular nutrient, right,

35:10

or a particular wavelength of

35:13

light that hit the canopy.

35:15

It is not incorrect to discuss

35:17

the herbivory that is happening

35:20

as the result of creatures you have not yet

35:22

found, right? These things are all

35:24

part of rigorous discussion, right? You

35:27

can see that the action is there before

35:29

you know what did the action. In our

35:31

case, if all our

35:33

antagonists must do is cloak their

35:36

identity and then forbid us to

35:38

talk about they in anything other

35:40

than precise terms, they win. So

35:42

we have to be able to discuss them,

35:44

but we have to be careful not to

35:46

impose a belief in which

35:50

we imagine. When

35:53

there's a range of possibilities to

35:55

explain some phenomenon, we can't leap

35:57

to the conclusion that it was an

35:59

individual deciding to do something when it

36:01

could be an emergent phenomenon. You

36:04

have to be agnostic about that. Yeah, I

36:07

agree, and I do. That could bring us to

36:09

the next stage of this discussion, if we say

36:11

that without capable captains we're lost at sea. How

36:14

long we've been lost at sea is debatable.

36:16

But there's some realizing now, because of

36:18

the crisis that was COVID,

36:20

that a lot of people

36:22

recognized: hey, there's a

36:24

problem here in medical literature; oh hey, there's

36:27

a problem here in medical science; hey, there's

36:29

a problem here in public health. And as

36:31

you zoomed out, at

36:34

what point — when, you know,

36:36

you take the bird's-eye view — there

36:38

is a problem near enough everywhere. Actually,

36:40

what you found in medical literature is

36:42

in physics. What you found in medical literature

36:44

is in mathematics. What you found in

36:46

medical literature is in biology, and

36:48

so on. And it simply

36:50

feels — ah, I'm not too far from

36:52

the trees here, funnily enough.

36:56

And it's not my first

36:58

rodeo on these kinds of topics, right? So we

37:00

know we're lost at sea. But the other part of

37:02

your discussion last night was that we are not

37:04

just in that context. We're

37:07

also entering into the hyper-

37:09

novelty crisis, and the

37:11

two are quite difficult things

37:13

to reconcile, where the pace

37:15

of change at the tail

37:17

end of this arguable dark

37:19

age is accelerating

37:22

enormously, to such a degree that

37:24

you made the case that,

37:26

whilst we have

37:28

created a buffering

37:30

and slowing cultural layer on top

37:33

of humanity that helps us

37:35

with these things, it is

37:37

perhaps now not up to this attack

37:40

from a rapidly changing world. Let's

37:42

jump into that. What — like, what

37:45

is this hyper-novelty

37:47

crisis? Okay,

37:49

I want to put one more thing on

37:51

the table before we get to that. Okay, let's

37:53

do that. And it really goes back to

37:55

your initial question about how did we get

37:58

into something dark-age-like. So

38:00

we talked a little bit about the perverse

38:02

incentives that come from the

38:05

career-based nature of truth-seeking

38:07

now. But here's

38:09

this other phenomenon, which is really more organic.

38:14

Very often, something important

38:16

is discovered in one place

38:18

first, and then we suffer

38:20

from the fact that forever after, it is

38:23

overly identified with that thing, rather than identified

38:25

with the full scope of places that it

38:27

applies. At the moment, the one

38:29

that I'm focused on is

38:31

diminishing returns, which was first discovered

38:33

in economics, where

38:35

it's called diminishing marginal returns, and

38:38

it is treated as an economic law. Now,

38:41

my claim is that it

38:43

is actually a law of complex adaptive

38:45

systems. Any time a system

38:48

has an objective and

38:50

is truly complex rather

38:53

than complicated, diminishing returns

38:55

will always apply. So

38:58

what that means is

39:00

that you get a curve. Here, maybe

39:02

I'll just draw it

39:04

so people know what we're talking about.

39:12

So this is

39:15

your return on investment

39:17

over time, and

39:20

we could make some very precise arguments about the conditions.
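One minimal way to write the curve being described here, offered as an illustration rather than anything stated in the episode, is a saturating return function:

    R(t) = R_max (1 - e^{-k t}),    dR/dt = k R_max e^{-k t}

The marginal return dR/dt is largest at the start (the steep "bargain" face described below) and decays toward zero as the plateau is reached; R_max and k are placeholder parameters, not quantities from the conversation.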

39:22

And I would just argue that generally you should expect

39:24

this, because it's a law of nature that it will

39:27

emerge when the two conditions I've mentioned

39:29

are present. What this

39:31

means in a field is very

39:33

unfortunate. Let's just say

39:36

field X. Field

39:38

X is stuck. Somebody

39:42

comes up with something

39:44

that unsticks field X, an

39:46

idea that contains

39:48

enough truth that suddenly

39:50

you have this amazing burst of

39:53

productivity because people now suddenly again

39:55

know how to study the question.

39:58

That puts you... on this steep

40:02

face. That

40:04

steep face is a bargain, right?

40:06

That's a bargain zone, right? Where

40:08

you're getting really high payback. Your

40:10

return on investment is incredibly high,

40:12

right? Now the problem

40:14

is when you have a competitive academic

40:16

environment in that phase, right? Suddenly

40:19

the school of thought has brought, you

40:22

know, manna from heaven, right? They've made the

40:24

rain, and it is

40:26

treated as if it is true because

40:28

how could it not be true? Look

40:30

at how much productivity it's creating, right?

40:32

But it isn't true. At best it's

40:34

approximate. And the point at

40:37

which you're going to plateau and

40:39

pay ever higher costs for ever

40:41

smaller returns is coming. But by

40:43

the time you get there, maybe

40:45

you're two generations later, okay? You're

40:48

two generations, you're two academic generations later,

40:51

the founding generation is dead. The people

40:53

who remember that they made certain assumptions,

40:55

who remember what it was like to

40:57

be stuck and then to be unstuck.

41:00

And then they're stuck again, but those

41:02

people are gone or they've become

41:05

irrelevant in some sense. And

41:07

the disciples don't know

41:09

that the assumptions were

41:11

actually made with

41:14

some awareness that they might not be quite

41:16

true. And so the point

41:18

is now the field, there's no second school

41:20

of thought because anytime anybody tries a second

41:23

school of thought, they're derided as a

41:26

gatekeeper. Well, it's gate

41:29

kept, but it might also be just

41:31

that the people who are descended from

41:33

the generation that made such progress actually

41:37

believe that alternative schools of thought are just

41:39

wrong rather than saying, actually, let's wait till

41:41

our school of thought peters out and then

41:43

we'll look for the next school of thought.

41:45

That would be the rational thing to do,

41:47

right? There should always be a second school of thought. And when

41:50

the one school of thought runs its course, then the point is,

41:52

well, you know, let's dust some

41:54

stuff off and let's go back to the

41:56

stuff that this school of thought can't answer.

41:58

And let's jumpstart productivity again. And so you

42:00

would climb a sequence of

42:02

diminishing returns curves. That would be the right

42:05

thing to do, but instead the

42:07

fields get mired in their assumptions and

42:11

my claim will be every stuck field has broken

42:13

assumptions. And if you could actually spot which assumptions

42:15

are wrong, you can figure out what the field

42:18

is supposed to do. Very easy

42:20

to beat a stuck field. It's not

42:22

easy to get credit for it. It's not easy to

42:24

get the field to recognize it, but it's very easy

42:26

to outthink a field that is, you know, spinning

42:29

its wheels in a ditch. It's a

42:31

social problem rather than an academic problem

42:33

very often. It's a complex systems problem

42:36

mapped onto a career environment in which

42:38

it just so happens that the people

42:40

who are ascendant get the right to

42:42

kill off those who are not making

42:44

productivity at the same rate. So which

42:46

examples can we use? Let's throw some

42:48

people under the bus here. Well,

42:51

I would say in my field there

42:53

are assumptions, you know, I'll point to one.

42:57

The idea that fitness is essentially

43:00

synonymous with reproductive success.

43:03

That evolutionarily what creatures are trying to do is leave

43:05

a lot of offspring. Right.

43:07

Okay. Now on the one hand, most

43:11

of the time that's a great proxy. A

43:15

great proxy for evolutionary success for

43:17

what we call fitness, but they're

43:19

not synonymous. Okay. Creatures

43:22

that leave a lot of offspring can

43:25

leave a lot of offspring because they have

43:27

an advantage and that advantage can be a

43:31

kind of superiority, which is going

43:33

to drive its inferior competitors to

43:35

extinction, and whatever the advantage is

43:37

then goes to what we call fixation.

43:41

Or it can be that they cheated and

43:44

that they have produced a lot

43:46

of offspring by adapting too much,

43:48

let's say, to short-term circumstances at

43:51

the expense that their great grand

43:53

offspring will not be capable of

43:55

facing the conditions that they're going

43:57

to face. Like, say, a

43:59

bacterium inside a Petri dish rapidly expands —

44:01

hey, I did really well in the

44:03

context of this dish — then it hits the outer

44:05

limits of the dish. It now has no

44:08

evolutionary advantage to expanding that quickly because it's

44:10

not figured out how do we get out

44:12

of this dish. Or even better than that —

44:14

okay, I like the example, it's

44:16

the right style of thinking. The problem is

44:18

the dish is artificial. Right?

44:22

What if we take the case of

44:24

the bacterium that has the advantage that

44:27

it decides to dispense

44:29

with the capacity to tolerate

44:31

an antibiotic and remember that

44:34

antibiotics are not a human

44:36

product. Antibiotics are germ warfare

44:38

agents produced by fungi and

44:40

bacteria to fight each other.

44:43

So the evolutionary capacity

44:45

to endure antibiotics exists but in

44:47

an era where there's no antibiotic

44:49

present they are a needless cost.

44:52

So a bacterium can get a

44:54

reproductive advantage by dispensing with that

44:56

capability. It's just energy it doesn't need

44:58

to expend. You know

45:01

you're carrying an instruction set for

45:03

a molecule you don't need.

45:05

So if you get

45:07

rid of it you can reproduce a bit

45:09

faster. Right? Looks clever until the antibiotic shows

45:11

up again. And then it's the opposite of

45:14

clever. So anyway there you know

45:16

in bacteria I would argue that there is

45:18

actually an elegant solution to this that we

45:20

don't see very readily which

45:22

is that bacteria

45:24

are not single-celled organisms the way we think they

45:27

are. That they are fascinatingly

45:30

colonial and

45:32

that what happens is there's

45:35

a library at the back

45:37

that remembers how to deal with the

45:40

antibiotic and the plasmids that carry that

45:42

information can be exported to the front

45:44

line. Right. So that's a way of

45:46

getting the best of both worlds basically reducing

45:48

the cost of the trade-off. the

46:00

time evolutionarily. And if

46:03

you say that reproductive

46:05

success is synonymous with fitness,

46:08

you will misunderstand every case in which

46:10

they diverge. Right? So

46:14

anyway, that's one place that you can

46:16

take the assumption where when somebody says

46:18

reproductive success, people hear fitness

46:20

and when somebody says fitness, they hear reproductive

46:22

success and they don't even realize that they've

46:24

made an equivalency that is

46:27

only approximate. My

46:30

argument is that happens everywhere all the time. And the

46:32

real, if I were running the

46:34

world, the academic world,

46:37

I would say, look, you should never

46:39

have any field kill off the

46:42

second best school of thought. So what do you

46:44

think stops that? Why does

46:46

that not crack through? What forces

46:49

are in place to stop that other

46:51

than being a professor in exile? Like

46:54

one component of this is

46:56

it ego? I don't mean

47:00

in this sort of disparaging social term. I

47:02

mean that it's sometimes hard for people

47:04

to let go of things they've invested a lot of

47:06

time in. And so it

47:08

just has this inertia, it won't move. Well,

47:12

how do you break through that? Here's the problem. The

47:15

thing I've described with bacteria

47:18

is the exact analog for what goes

47:21

on academically. The university

47:23

that preserves a school of

47:25

thought that is underperforming, that

47:28

university underperforms. The university that

47:30

puts everything in to the school of thought

47:32

that's paying the dividends at the highest rate,

47:35

it's winning in the short term. But

47:37

the point is what that does is

47:40

it causes that school of thought to

47:42

dominate every university. Nobody's banking on some

47:44

other way of thinking. And

47:46

so the point is, by the point these

47:48

things get stuck, literally nobody remembers that there's

47:50

any other way of thinking about it. Right.

47:52

Yeah. So we're too far down the path

47:54

with one particular method of thought. Right. Now

47:57

compare that to the, you know. The

48:00

enlightenment mentality where you had gentleman

48:02

scientists and again, everything's wrong with

48:04

that. It was just guys, it

48:06

was just rich guys almost exclusively.

48:09

Wallace is an interesting exception to that, but put him

48:12

aside for a second.

48:15

You had people competing

48:17

for immortality rather than

48:19

grants. Competing

48:22

for, you know, literally in that

48:24

regard, right, statues. I'm willing to

48:26

accept that. A little bit gross —

48:28

and

48:30

I mean grotesque — but the

48:33

point is, if you're doing that,

48:35

then you're actually thinking, what

48:38

are my colleagues doing? Because that's

48:40

the way I'm gonna get the statue, right?

48:43

I'm going to figure out what

48:45

everybody's got wrong and so that

48:47

desire to spot the error of

48:49

your field is profoundly important and

48:51

at the moment fields just have

48:53

veto power over beliefs like that. So

48:55

it's one thing to discover

48:58

these problems, as many people

49:00

in this COVID dissident

49:02

space have found — there are

49:04

people coming out of the woodwork,

49:07

as it were, now slowly gathering

49:09

a position that they want to take on this

49:11

issue as they see where maybe the chips will

49:14

fall. It's

49:16

one thing to create the insight,

49:18

to say, you know, the house

49:21

of cards upon which your position is constructed

49:23

is falling. It's

49:25

a lot to get people to

49:27

acknowledge that — or to realize that they're quite happy to

49:29

just stay in this vacuous space where

49:31

nothing really makes sense and isn't open to

49:33

discussion. But then you actually have to get the

49:36

public to come with you, right? This

49:38

new paradigm is opening up, come with

49:40

us, let's go — that's the part that's

49:42

difficult. I mean, obviously discovering

49:44

that the thinking is wrong, or that

49:47

we can make progress if we

49:49

just change our mode of thought — of

49:51

course that requires work, but it strikes me

49:53

that, once you've made that leap, I've

49:55

noticed... is this bringing anybody with you?

49:57

That's the question. Well, and

50:00

I do think there's another

50:02

element here which is the...

50:04

I want to describe it with the

50:08

correct level of breadth. One way

50:10

to see it is any time

50:12

that the truth

50:14

is blurred

50:17

with what is

50:19

morally right. The

50:22

truth may in fact inform what is morally right

50:24

and does but they are not

50:26

the same thing. Things can be true and

50:28

morally awkward. Yes. But the

50:30

problem is we have fields like let's

50:32

say climate science. The

50:36

idea that humanity's

50:38

fate rests on the

50:41

truth produced about what is

50:43

decidedly a complex

50:45

system and that

50:47

you have truth tellers and then you

50:49

have those who are muddying the water with

50:51

nonsense and those who are

50:53

muddying the water with nonsense must

50:55

be silenced because the fate of humanity rests

50:58

on us doing the right thing in this

51:00

place that only a select

51:02

number of people are even capable of having

51:04

the appropriate discussion. Yeah. Right? Well

51:08

the problem is...

51:10

Okay. What happens if

51:14

eight discoveries in 10 point

51:16

to anthropogenic

51:20

climate change? Uh-huh. But

51:23

two in ten don't. Uh-huh. Well

51:27

the right thing to happen is to let

51:29

the chips fall where they may and figure

51:31

out all of the competing things and some

51:34

of them are actually you know making anthropogenic

51:37

climate change less significant than people thought. Some

51:39

of them are making it more and to

51:41

figure out you know to build the model

51:43

so it's actually predictive. Yeah. That's

51:46

not what happens. What happens is

51:48

the moral imperative

51:52

Then causes you to purge anybody

51:54

who finds awkward things, who even

51:57

studies questions that have any chance

51:59

of coming up with something else.

52:01

So you get an almost surely

52:03

religious perspective here, which leaves people

52:05

like me saying, hey, I'm perfectly ready

52:07

to discover that humanity is screwing

52:09

up the climate, but I believe

52:11

it less and less because I

52:13

know that, A, I

52:16

know there are a great many things

52:18

we are not discussing that have profound

52:20

influence on the climate, and B, I

52:22

know that if the climate science itself

52:24

said actually, it's less of a problem

52:26

than we thought that the field would

52:28

shut it down entirely. Yeah, I

52:30

think that's a very very good

52:32

insight. I think it's a very strong

52:34

insight, because the moral component, as

52:37

I've discovered, a lot of

52:39

it is in politics. It

52:41

is there, and it clouds reality

52:44

in such a way that you

52:46

can't always find a route through

52:50

the moral quandary.

52:52

The trouble of finding and

52:54

acknowledging a difficult reality

52:57

may require charisma

52:59

as well, something sorely

53:02

lacking in the political space.

53:04

It really does, because we

53:06

see these difficult questions require

53:08

Us to believe and invest in a leader

53:11

who can get you through very challenging

53:13

and difficult things. You know, for that you

53:15

need, you need a Churchill

53:17

or a Kennedy, who can, who can give

53:21

off enough strength, yeah, and character,

53:23

such that you are willing

53:25

to entertain the possibility that things

53:27

are not as you thought. Yes.

53:29

And the climate change one is a

53:32

very good one, because, you

53:34

know, I wrote down here,

53:36

we've grown up in

53:38

a stable state. Really, in my life

53:40

span, a relatively prosperous nation that hasn't

53:42

changed much. I have two wonderful

53:45

parents and, like, a nice stable

53:47

family, my whole life. My

53:50

political upbringing is that stability

53:53

is the norm. So.

53:56

This challenge comes along and you have all

53:58

these kinds of well-meaning, well-educated,

54:00

well-brought up, nice people,

54:03

and they want to do right by that challenge. Who

54:06

wouldn't? If this is our first

54:09

great challenge as a people, as it were,

54:11

and our, you know, our grandfathers and ancestors

54:14

have suffered through two world wars, and let's

54:16

pick up the mantle and really solve this

54:19

as best that we can. I think you're

54:21

right that this

54:23

perspective can actually very seriously

54:25

cloud our picture of reality

54:27

to such a degree

54:30

that it does force us back towards

54:32

this place that we started the conversation

54:34

on as being lost at sea, because

54:36

nobody would want to acknowledge or has

54:38

the capacity to acknowledge, well what if

54:40

it isn't that? There's

54:42

a huge cost attached to us doing

54:44

all of these things if that's not

54:46

the case. And I'll give a real

54:48

example of this. I saw

54:50

a comment on a YouTube video years ago

54:52

and I since saw it get used a

54:54

lot. Now I want

54:57

to preface this story by divesting

55:00

myself of a strong position on this. I

55:03

watched Al Gore's An Inconvenient Truth.

55:05

I've grown up being very

55:07

sympathetic to these ideas, and

55:10

still broadly am. But since

55:13

this crisis has happened, perhaps like you,

55:15

I have started to wander back towards

55:17

the contrarians in this space and give

55:19

them a second look. Now

55:21

to come back to the story, the comment said, oh

55:24

you guys who

55:26

would argue against the action we might take on

55:28

climate change, so you think we're just going to

55:30

clean up what the whole planet, clean up all

55:32

of the air, sort everything out, and then it

55:34

turns out that anthropogenic climate change is not really

55:36

a thing. Well so what? We

55:38

cleaned everything up. It's one of my favorite

55:41

cartoons, and in fact I will have Zach

55:43

dig it up and put it in here.

55:45

What if the climate crisis is a hoax

55:47

and we've fixed

55:53

the world for nothing or something

55:55

like that? But this position is

55:57

also, I believe, particularly

56:00

strong, because if

56:03

it's not true, we're already in a

56:05

situation where the idea, the reality of

56:07

this, is having a material impact on

56:11

culture, a massive material impact. I don't think

56:13

that people are

56:16

ready to acknowledge it, not that they even want to

56:18

or would like to, because from that

56:20

perspective the future is completely on life

56:22

support. Your children are going to have a

56:25

very difficult time. Crops are going to fail.

56:27

We're going to have mass migration. They

56:30

paint this incredibly doomy picture

56:32

that has material consequences if

56:34

it's wrong, regardless of everything

56:37

else. We actually have a

56:40

responsibility, I believe, to

56:42

put ideology to one side. I

56:44

often now try and find myself purging

56:46

myself of ideology in the sense

56:49

that not what do

56:51

I want to be true, not what have I

56:53

shown to be true, not what

56:56

does my philosophy say to be true. But

56:59

is this true? Because the truth

57:02

really does matter, because if you are trying

57:04

to start a family or you want to

57:06

have a relationship to start a family, why

57:08

will people do that if they're being told

57:11

again and again and again, guess what? Ten

57:13

years from now, the whole world is going to be gone. Just

57:17

the other day, a friend of mine shared

57:19

an article with me, and this is

57:21

probably quite a common experience for people

57:23

of my age, two years to save the

57:25

climate. It's completely normal to share these

57:27

kinds of perspectives. But it triggered something

57:29

in me. I thought, I've

57:32

heard this before. I heard this in

57:34

2002. I heard it in 2004.

57:37

I heard it in 2006. You can

57:39

actually do a very interesting Google search. If

57:42

you type in 'before', you

57:45

can time-limit where the searches happen, and

57:48

you say, climate, and, save

57:50

the world, you'll

57:52

find, again and again and again, we've been told

57:54

the same message since I was a child, that

57:56

we have 10 years, 20 years, 30 years.

58:00

A track record like that does not

58:02

mean, therefore, that this position that the

58:05

climate is changing is wrong. I don't

58:07

believe that. What it means is you're

58:09

starting to see a picture in which we've

58:12

been raised inside a weird hall of

58:14

mirrors that has material

58:17

impact on our lives. We

58:19

have the responsibility to be

58:21

accurate. And how we

58:24

actually get people to acknowledge this without

58:28

upsetting them and without cracking too many

58:30

eggs, because like you've said, there is

58:32

this very strong central

58:34

moral drive that makes this thing

58:36

an unquestionable

58:40

reality in everybody's lives. And it's

58:42

deeply impolite to be the contrarian

58:44

on this matter. Right. It is

58:47

morally offensive to

58:49

be analytically in disagreement. And

58:52

that is an intolerable breach

58:54

of a firewall that must exist. This

58:56

is it. I see this

58:59

pattern everywhere. It seems to come up

59:01

in so many fields. How dare you

59:03

ask such a question? How dare you

59:05

ask such a question? And so do

59:08

you feel that pressure? Oh, 100%. I mean, I ignore it, but

59:11

you were in the

59:13

cradle of that pressure, I guess. That was in a

59:15

way, that's what's brought you to public prominence, that you

59:17

were the first person to get shot out of a

59:19

cannon very publicly over something

59:21

like this. Well, I

59:24

will say, I'm not

59:26

going to drag the audience back through it

59:28

again, but I do have

59:30

a story. Actually,

59:32

I have a couple of them, but I have one

59:34

story in my professional history

59:38

as a scientist in which

59:40

the world turned upside down

59:43

right in front of my eyes. And

59:46

it alerted me to the fact that

59:48

what I had believed all along about

59:50

the practice of science was just simply

59:52

incorrect. So this

59:55

is the story of what happened with

59:57

my, what I thought was just an

59:59

abstract discovery about something interesting

1:00:01

about evolution, senescence, and cancer.

1:00:05

The problem is that I accidentally stumbled into a

1:00:07

realm where there was a

1:00:10

tremendous amount of money and prestige at stake,

1:00:14

a Nobel Prize, profound

1:00:19

career opportunities. And

1:00:23

because I stumbled into that realm that

1:00:26

was medically important, or potentially

1:00:28

medically important, I watched scientists stare

1:00:30

down the truth, even

1:00:32

though human disease was going

1:00:36

to be the consequence of the game they were playing. What

1:00:38

is this, then? Well, all right. The

1:00:42

idea is the following thing. Why

1:00:46

we grow feeble and inefficient with age is

1:00:48

a little bit of a mystery. Because we

1:00:50

are capable of producing new cells from old

1:00:52

cells. So I want to ask a dumb

1:00:54

question about this. I'm glad you brought this

1:00:56

up. We're having a

1:00:58

baby. My wife is pregnant. And I'm aware that during

1:01:00

that process, the mother is

1:01:02

creating stem cells. In fact, one

1:01:04

way or another, stem cells are being made in the body.

1:01:07

And the eggs for the

1:01:10

baby, if it's a female, are being made.

1:01:13

So there's this ability to

1:01:15

recreate a perfect biological organism inside

1:01:17

of an aging human. And

1:01:19

it's like, this is possible for

1:01:22

the baby. For reasons unknown,

1:01:24

it's not possible for you. Now,

1:01:28

there are two levels of reason. One

1:01:32

level of reason is that you

1:01:34

are caught between hazards.

1:01:38

You are caught, if you have

1:01:40

the capacity to produce young

1:01:43

tissue in any organ that

1:01:45

needed it at all times, and then you just

1:01:47

perpetually did that, you

1:01:51

would be overrun by

1:01:54

tumors before you would ever get a

1:01:56

chance to reproduce. OK? That

1:01:58

capacity to repair your tissue comes

1:02:00

with the risk of generating a killer

1:02:02

and so the body has a very

1:02:04

elegant solution to this problem which involves

1:02:06

a limitation on the total number of

1:02:09

cells that each of your cell lines

1:02:11

can produce. There are a couple of exceptions,

1:02:13

but in general any somatic cell line,

1:02:15

any cell in your body somewhere, has

1:02:17

a limit in terms of how many offspring

1:02:20

cells it can produce, and that means that if

1:02:22

you get an unfortunate mutation that makes a

1:02:24

cell deaf to the signals that it should

1:02:27

stop growing, then it grows to a

1:02:29

size and then arrests. There was a

1:02:31

thing, I'm not sure about this, where there was

1:02:33

a treatment for DNA repair, and I'll

1:02:36

ask the layman question here, in the space

1:02:38

of, about extending these particular parts of

1:02:40

the DNA, right? And it was like,

1:02:43

okay, this is working great and is reversing

1:02:45

aging, or slowing aging, more or less.

1:02:47

Or am I wrong? Oh yeah, totally predictable,

1:02:49

and every time, we've had

1:02:51

this story before, right? So anyway, that

1:02:54

was what I was working on. Great.

1:02:56

So, um, I spotted this tension

1:02:58

between these two things and it happened

1:03:00

to resolve a very old question in

1:03:02

my field. I shouted Eureka, or

1:03:04

I was going to until I realized

1:03:06

that there was one gaping flaw in

1:03:08

the whole story. The gaping

1:03:11

flaw in the whole story was that mice

1:03:13

were well known to have very

1:03:15

long telomeres, and

1:03:17

so that suggested something was off

1:03:19

because why didn't they have very

1:03:21

long lives, right? So,

1:03:24

this is the part of, of the,

1:03:26

this is a particular part, essentially, of

1:03:28

the DNA strand, is that right? It

1:03:31

is the ends of each chromosome,

1:03:33

and it's basically, it's written in

1:03:35

the language of DNA. It is a

1:03:37

series that repeats, and it

1:03:40

does not produce a protein,

1:03:42

the way the rest of the genome

1:03:44

that we think of in classical terms

1:03:46

produces proteins as a result of a

1:03:49

sequence. It's just a number of repeats,

1:03:51

and the more repeats there are, the

1:03:53

more times the cell can reproduce

1:03:55

itself.
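[A toy sketch of the division-limit idea being described here, not from the conversation itself. The repeat counts and the stop-signal threshold below are made-up numbers, and real telomere biology is far messier; the point is only that a finite repeat counter caps how far any one cell line can run, even a mutant line that ignores growth signals.]

```python
# Illustrative only: a somatic cell line whose telomere "repeat counter"
# goes down by one at each division. All numbers are invented.
from dataclasses import dataclass

@dataclass
class CellLine:
    telomere_repeats: int                  # divisions remaining for this line
    ignores_stop_signals: bool = False     # e.g. an unlucky growth mutation

    def grow(self):
        """Double the population until told to stop or the counter runs out."""
        cells = 1
        while self.telomere_repeats > 0:
            if not self.ignores_stop_signals and cells >= 1000:
                return cells, "stopped by normal growth signals"
            cells *= 2
            self.telomere_repeats -= 1
        # Even a mutant that ignores stop signals arrests here: it grows to a
        # size and then stalls, which is the protection being described.
        return cells, "arrested by the division limit"

print(CellLine(telomere_repeats=40).grow())
print(CellLine(telomere_repeats=40, ignores_stop_signals=True).grow())
```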

1:03:59

And so anyway, I had a lovely model of how every tissue in

1:04:01

the body could have its

1:04:04

unique number of replacement

1:04:06

cells set in utero. Anyway, all

1:04:08

beautiful. Except the mice didn't fit,

1:04:10

and there's no way, you know,

1:04:13

mice are mammals, they should have

1:04:15

effectively the same system we do,

1:04:17

with some alterations for size perhaps.

1:04:20

There's no way their telomeres should be

1:04:22

ten times as long as people's given

1:04:24

this model, so it seemed like the model had to

1:04:27

be wrong. Well,

1:04:29

to make a long story short, it turned out the

1:04:36

laboratory mice. And

1:04:38

what happened was the colonies,

1:04:41

and in fact there are many fewer colonies than

1:04:43

you think, in fact all of the mice in

1:04:45

the US come from the Jax

1:04:47

Lab in Bar Harbor, Maine. And

1:04:49

anyway, the cell lines, or rather

1:04:51

the mouse lines, in

1:04:53

these colonies had evolved

1:04:55

to the conditions in the colony rather

1:04:58

than in nature and the conditions in

1:05:00

the colony. The mice are thrown out

1:05:02

after eight months of breeding, right?

1:05:05

Because that keeps the

1:05:07

rate of... What

1:05:12

that does, it eliminates the possibility

1:05:14

for cancer. This is a hypothesis,

1:05:16

but I believe it is

1:05:18

the only hypothesis for the long

1:05:20

telomeres of laboratory mice that remained

1:05:22

standing. So in any case, the

1:05:25

mice in these colonies have evolved these ultra-

1:05:27

long telomeres because the threat

1:05:29

of them succumbing to cancer before they

1:05:31

are thrown out as breeders is eliminated

1:05:33

by this protocol. That

1:05:36

creates model mice that if you allow them

1:05:38

to live to old age, they all die

1:05:40

of cancer. They really don't die of old age,

1:05:42

and they're very bizarre. So you know, people

1:05:44

will tell you that mice lie, that

1:05:46

they're bad model organisms, and this is a

1:05:49

big part of why they lie. This is

1:05:51

one of these scandals sitting at the heart

1:05:53

of it, a decent-sized one, and we've learned

1:05:55

things. Well, I believe the lab

1:05:57

rats are a problem too, that they

1:06:00

are. In fact we've seen the

1:06:02

same pattern in other rodents,

1:06:04

you've seen it in chickens that

1:06:06

are bred for food, too, but

1:06:09

To make a long story short,

1:06:11

these

1:06:13

animals are hyper-prone to cancer.

1:06:15

They are likely

1:06:18

To be capable of enduring toxic

1:06:20

insult in a way that no

1:06:22

human being can, because they have

1:06:24

effectively an infinite capacity to produce

1:06:26

new tissue, so any toxin that doesn't

1:06:28

outright kill them can be

1:06:31

endured. And worse than that, because

1:06:34

they're all dying of cancer, if

1:06:37

you give them something truly toxic, it

1:06:39

functions like chemotherapy. So it may actually

1:06:41

make them live longer. And the problem is

1:06:43

we use these damn things in drug safety testing.

1:06:46

Right. It's like this one

1:06:48

issue is off-stream, in the

1:06:50

peripheries, and people almost don't

1:06:52

want to know. It's also extraordinarily complex,

1:06:54

I think, to get people to really

1:06:57

wrap their head around that issue. So

1:06:59

when you try to shout fire in a,

1:07:01

in the theater, effectively, people,

1:07:03

number one, they don't understand the word

1:07:05

fire. Yeah. So how would you go

1:07:07

up there and say, everything downstream

1:07:09

of this thing here, we have a

1:07:11

big problem, because the data we're

1:07:13

getting back from these experiments is

1:07:16

flawed. By virtue of X

1:07:19

and Y and Z it is systematically flawed

1:07:21

in a way that not only

1:07:21

distorts everything we think we've learned from

1:07:23

these mice about mammals. You know, these

1:07:25

are our primary model of mammalian

1:07:27

physiology, and the upshot is they're

1:07:30

broken, and who knows what has to

1:07:32

be re-examined from that

1:07:34

whole block of work. The problem is we

1:07:36

also, because we use them for drug

1:07:39

safety testing and because they are biased

1:07:41

in the direction of making toxins

1:07:43

look not only non-toxic but sometimes

1:07:47

beneficial to health, because

1:07:50

toxins function as chemotherapy,

1:07:52

they are tailor-made to

1:07:54

produce problems like we saw

1:07:57

with Vioxx, Seldane,

1:07:59

fen-phen, all of these drugs

1:08:01

that we think are safe that turn out to

1:08:03

do damage, especially to the heart. But they're not

1:08:05

even doing damage to the heart. The heart is

1:08:07

where we see the damage because the heart is

1:08:09

a special organ. They're doing body-wide damage. This

1:08:12

is, again, a hypothesis, but I believe it's the

1:08:14

only one standing. So as

1:08:17

a young bright-eyed, bushy-tailed graduate

1:08:19

student, I

1:08:22

found this thing, and I thought that at

1:08:24

the point that I published it, that

1:08:27

people were going to say, oh my God, and they were going

1:08:29

to fix the problem right away, not

1:08:32

because they were wonderful people, but because it didn't

1:08:34

make any sense not to fix it. Any

1:08:37

year that it remained this way was going to

1:08:39

be a year in which we published wrong things

1:08:41

that were going to embarrass us later, in which

1:08:43

we allowed drugs to come onto the market that

1:08:45

would... The material impact on people's lives. I mean,

1:08:47

life and death. Yes, life and death. Right?

1:08:49

So I thought that they

1:08:51

would have to fix it, and that ain't

1:08:54

what happened. As far as I know, the... Years ago,

1:08:56

no. It's 20 years ago. 20 years ago. But

1:08:59

here's the thing. I

1:09:01

wasn't just some lone graduate student

1:09:03

screaming in the wilderness. Right? My

1:09:07

advisor was a member of the National Academy

1:09:09

of Sciences. He was one of

1:09:11

the greats of his generation, and he backed

1:09:13

this thing. Right? George

1:09:16

Williams, the literal author

1:09:18

of the evolutionary theory

1:09:20

of senescence, wrote

1:09:23

a letter to Nature Magazine

1:09:25

saying, take this seriously. They

1:09:27

rejected it without review. So

1:09:30

my point is, forget all of that

1:09:32

at a scientific level. What

1:09:35

I learned from that experience,

1:09:37

from watching Nature Magazine stare

1:09:39

down my advisor and

1:09:42

George Williams, the literal author of the evolutionary

1:09:44

theory of senescence, was that

1:09:46

the system that I thought existed didn't. Yeah.

1:09:49

Now, what that means is when we got

1:09:51

to things like COVID, I

1:09:54

had already seen human

1:09:57

life put at risk. You

1:09:59

know, had you drunk the Kool-Aid? I had already, I

1:10:01

had un-drunk the Kool-Aid. I had seen what happened

1:10:03

to the people who drank the Kool-Aid. Yes, and

1:10:06

Anyway, it put me in a

1:10:09

position where I'm better able to

1:10:11

you know I will more quickly see that

1:10:13

we are actually adrift at sea and the captain is delusional

1:10:17

Because I've seen it up close, you

1:10:19

know, it happened in front of me. It's a

1:10:21

really good example It's a real and it's a

1:10:23

problem that exists right now that the mice that

1:10:25

we're using to do these laboratory experiments to roll

1:10:27

out these drugs are flawed.

1:10:32

put simply they're providing answers

1:10:35

to the to the medical establishment that are

1:10:37

incorrect But here's the next part of

1:10:39

it which which should be even more concerning the incorrect

1:10:41

in a particular way that is useful to the industry

1:10:44

and so I Have

1:10:46

this analogy of how the the medical

1:10:49

publishing realm works in time With

1:10:52

regard to things like pharmaceutical products that end up on

1:10:54

the market Have

1:10:56

you ever done a Ouija board

1:10:59

in the occult? No. Okay. So in my

1:11:01

view, it's an interesting game. It's an interesting

1:11:04

trick I'm interested in magic and these things

1:11:06

I'm not interested in the occult demonic side

1:11:08

by any stretch if that concerns people I'm

1:11:11

curious as to why this thing works at all In

1:11:15

case you don't know if you've never done one

1:11:17

that the idea behind the game is that every

1:11:19

everybody places their hand on this It's

1:11:21

kind of movable friction-free object and

1:11:24

underneath is a series of letters

1:11:26

that so this story tells

1:11:28

us the Spirits will

1:11:30

come through and channel some message.

1:11:33

Yeah that arrives, right? The

1:11:36

rational part of me that understands what I think

1:11:38

is happening here is that

1:11:41

very slight movements in

1:11:43

everybody's hands produces this outcome

1:11:47

and What's great about

1:11:49

the trick is that broadly if it's

1:11:51

if it's performed well

1:11:53

in this group Everybody

1:11:56

walks away like, wow. Was that

1:11:58

not incredible, right? But no

1:12:00

one is really sure, even the people

1:12:02

who think they might be driving it,

1:12:05

was it me that did that? Right, did I drive that?

1:12:08

This is how I feel like

1:12:10

the medical industry actually works. So in

1:12:13

your example here, you

1:12:16

have this mouse and

1:12:18

we need the mouse because the mouse

1:12:20

has given us particular answers that

1:12:23

we need to sell these particular products.

1:12:27

The data that gets these products on

1:12:29

the market is somewhat dependent on the

1:12:31

long telomeres in these mice, so we

1:12:33

can't comfortably acknowledge the long telomeres in

1:12:35

the mouse. At which point

1:12:37

in the Ouija board example, they

1:12:39

are an ever-so-slight, completely plausibly

1:12:42

deniable nudge

1:12:44

away. And nature can say,

1:12:46

oh well, we've probably got some problem

1:12:48

in this and we're not sure it's right

1:12:51

for this magazine. They

1:12:53

have complete plausible deniability

1:12:55

that it's actually because

1:12:57

we subconsciously recognise that

1:13:00

to remove this domino

1:13:02

from the table creates a cascading

1:13:05

effect that ruins everything. And

1:13:07

so it's just a slight nudge

1:13:09

away. Like in the firing squad is the

1:13:12

same idea, right? If everybody

1:13:14

pulls the trigger, who pulled the trigger?

1:13:16

No one really knows. And

1:13:20

you can walk away from that situation about

1:13:22

the mouse thinking the they

1:13:25

component of it is

1:13:28

very specific. It was

1:13:30

because the scientific realm

1:13:32

don't want a new idea or this

1:13:34

or that, the other. You can come

1:13:36

up with any story you want and

1:13:38

you can plausibly believe it, but we'll

1:13:40

never be any the wiser as to

1:13:42

where the centre of this thing is,

1:13:44

because it's operating like this odd Ouija

1:13:46

board with 100 different people with their

1:13:48

hand on the thing. And it just

1:13:50

on its own produces these outcomes and

1:13:52

we're not really in the driving seat

1:13:54

of any of it. And you

1:13:57

can be in the crowd as that's happening and thinking,

1:13:59

I think... I know what's

1:14:01

happening here, but I can't plausibly

1:14:03

explain it to anybody. Well,

1:14:06

I love your analogy. I'm a little worried

1:14:08

that people will take it incorrectly, partially because

1:14:10

they're motivated

1:14:12

to hear Ouija board and

1:14:14

laugh. But

1:14:17

I get exactly what you're saying. And

1:14:19

it is, I think, very

1:14:21

much like you're describing and worse than you thought

1:14:24

because the... Because

1:14:26

you can also have an anti-social actor within

1:14:28

the group. It accommodates that

1:14:30

entirely because I've seen in the

1:14:32

game version of this, you

1:14:35

can really scare someone with that

1:14:38

because you can think, well, if I spell out a

1:14:40

name, such and such knows. And

1:14:44

you have complete plausible deniability that you

1:14:46

just nudged it that way. No one

1:14:48

would ever know. Right.

1:14:50

And if you think about this,

1:14:52

because science

1:14:55

is effectively a gentleman's

1:14:58

sport, the mechanisms

1:15:00

of enforcement to keep... It's

1:15:03

basically the honor system. And

1:15:05

so experiments can very easily... If

1:15:07

the PI knows where

1:15:09

their bread is buttered, it is very

1:15:11

easy to push an experiment in a

1:15:14

direction that just so happens to do

1:15:16

good things for your career. And

1:15:19

this results in preposterous nonsense, which

1:15:21

frankly I

1:15:24

saw coming. I didn't think inherently

1:15:26

psychology was the place to spot it.

1:15:28

But the replication crisis was completely obvious

1:15:31

that in a world of P-values, that

1:15:33

there was a way in which people

1:15:35

who were playing this career game would

1:15:37

even inadvertently create a world of beliefs

1:15:40

that just weren't true, in

1:15:42

part because you don't see the experiments they

1:15:44

didn't publish. Given

1:15:47

what P-values are, this was inevitably

1:15:49

going to create this issue.
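[A small sketch of the mechanism being described here, not from the conversation: if only results clearing p < 0.05 get written up, the published record fills with false positives even when nobody cheats. The sample sizes, threshold, and the simple z-approximation below are illustrative assumptions, not anyone's actual protocol.]

```python
# Illustrative only: publication bias with p-values, under a true null effect.
import random
from math import erf, sqrt
from statistics import mean, stdev

def two_sided_p(a, b):
    # Rough z-test approximation of a two-sample comparison (fine for a sketch).
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    z = abs(mean(a) - mean(b)) / se
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

random.seed(0)
published = 0
for _ in range(1000):
    # Control and "treated" groups drawn from the same distribution:
    # there is no real effect to find.
    control = [random.gauss(0, 1) for _ in range(20)]
    treated = [random.gauss(0, 1) for _ in range(20)]
    if two_sided_p(control, treated) < 0.05:
        published += 1        # the rest stay in the file drawer, unseen

print(f"{published} of 1000 pure-noise studies cleared p < 0.05")
# Roughly 1 in 20 null results clears the bar, so a literature assembled only
# from what passes the filter will contain 'findings' that were never real.
```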

1:15:54

A couple other things I wanted to come back to. Oh

1:15:56

yes, it's worse than you think because it wasn't just

1:15:58

pharma. Now I don't know. I

1:16:00

will probably never know. I don't

1:16:02

think pharma knew that the mice were working for them until

1:16:06

my work. The

1:16:08

point that they discovered that of course they

1:16:10

would become advocates for not fixing this problem

1:16:13

because frankly I don't know how many drugs

1:16:15

actually would pass a proper safety test if

1:16:17

the mice weren't broken in this particular way.

1:16:20

It would be a fraction of

1:16:22

what are on the market. Nonetheless,

1:16:26

so I believe that they have started protecting

1:16:28

these broken mice. I don't

1:16:30

think they were doing it before but there's

1:16:32

a bunch of other people

1:16:35

who aren't pharma-based who have a perverse

1:16:37

incentive here too. Right? If

1:16:39

you've built a career on

1:16:42

papers that were distorted by these broken

1:16:44

mice, you don't want your counter

1:16:46

reset by the fact that you have to

1:16:48

redo the experiments. That's one thing. You've

1:16:51

got a world of mice in which some

1:16:53

gene or other has been knocked out in

1:16:55

order to create a pathology so you can

1:16:57

study it. You know, do you want to

1:17:00

really have to redo mice that are built

1:17:02

on a broken background? No. So

1:17:05

the point is, the outsiders, us,

1:17:07

have every interest in this being fixed because

1:17:09

I want to know what freaking drugs in

1:17:12

my medicine cabinet are actually tolerably safe, right?

1:17:15

But from the point of view of the people using

1:17:17

the mice, there is a

1:17:19

pretty broad, perverse

1:17:22

incentive to keep things pretty much as they

1:17:24

are. And then here's the worst

1:17:27

one of all. And

1:17:30

I won't go into the social details of my

1:17:32

interactions with my

1:17:34

bettors in science over this. But

1:17:38

at one point I was collaborating,

1:17:41

I will just say, I was

1:17:43

collaborating with a woman named Carol Greider who has now

1:17:45

gone on to earn a Nobel Prize. She

1:17:47

earned it for the discovery of the

1:17:50

telomerase enzyme with

1:17:52

her advisor, Elizabeth Blackburn. That

1:17:54

had nothing to do with me and it

1:17:56

was a profound discovery. So anyway, I'm not raising an objection. about

1:18:00

getting a Nobel Prize for that. But anyway,

1:18:03

I had partnered with her before she

1:18:05

got her prize and

1:18:10

she had done what I couldn't

1:18:12

do. She took my hypothesis that

1:18:15

laboratory mice have long telomeres but that that will

1:18:17

not be true for wild mice. And

1:18:20

she and her graduate student Mike

1:18:23

Heeman ran

1:18:25

an experiment and they discovered that in fact

1:18:27

this was true and they came back to

1:18:29

me very excited telling me, oh my god

1:18:31

the hypothesis is true, this is wonderful. And

1:18:33

I thought, well, hey this

1:18:35

is great. I'm a graduate student

1:18:37

who's just made an important discovery,

1:18:39

yes partnered with laboratory scientists but

1:18:41

couldn't be better really. Right?

1:18:44

And I expected her to publish it and at

1:18:46

the point that I went to publish my evolutionary

1:18:48

work I wanted to cite her

1:18:50

laboratory work because my evolutionary

1:18:53

work depended on this result which was

1:18:55

a test of my hypothesis. And

1:18:58

I contacted her and I said, Carol

1:19:02

where are you going to publish that so I

1:19:04

can cite it? And there are ways of citing

1:19:06

papers that aren't out yet, right,

1:19:08

in press. And

1:19:10

she said something to me that I was too naive at the

1:19:12

time to understand what she meant. She

1:19:14

said something and it just rings in my ears now. She said,

1:19:18

actually we're not going to publish it we're going to keep that

1:19:20

in house. You've

1:19:23

just made a profoundly important

1:19:26

discovery. With huge consequences.

1:19:28

Immense consequences and you're not going

1:19:30

to publish it? That is unlike

1:19:33

anything I've been led to imagine about the way science

1:19:36

functions. Yeah. But you see why she did it? Well,

1:19:39

yes. So this is

1:19:41

the human element of this. It's

1:19:43

still a form of personal gain.

1:19:46

It's not necessarily just financial. And

1:19:48

again it's something Asim Malhotra has

1:19:50

said and I'm sure everybody in

1:19:52

this space has seen and heard

1:19:54

that clip that he became aware

1:19:56

that there was almost rock-solid data

1:19:58

to demonstrate a link

1:20:00

between the mRNA vaccination and this

1:20:04

effect which is now actually broadly acknowledged. Yeah.

1:20:06

But at the time, they had this clear signal,

1:20:08

we can just go and publish, and it's

1:20:10

ready. And it's kind of like the

1:20:12

equivalent of crystal clear 4K,

1:20:15

HD data, you know?

1:20:18

And they wouldn't publish because they recognize to

1:20:20

publish this, it will be damaging for me

1:20:22

personally. It's hard to map those

1:20:24

kinds of things onto people. For

1:20:26

me personally, I really struggled with that, because

1:20:29

I want to see the best in people.

1:20:31

I want to see

1:20:33

someone trapped in

1:20:35

some odd system they can't get out

1:20:37

of. I want to see that it's

1:20:39

the systems that's creating these incentives.

1:20:42

It's more difficult for me to come

1:20:44

to terms with the fact that sometimes it's

1:20:46

people doing this. Yeah, but I

1:20:50

get the loose parallel. This one is,

1:20:53

it's a more fascinating case. It's not. It's

1:20:55

nailed on. Here's the

1:20:57

thing. If you are

1:21:00

in possession of the information that

1:21:03

the mice are broken in

1:21:05

a very particular way, you

1:21:08

now have the ability to

1:21:10

predict the outcome of experiments that

1:21:12

other people will never see coming.

1:21:14

This is this asymmetry of knowledge

1:21:16

that you can then exploit. Right.

1:21:18

It's insider information. Yeah, like insider

1:21:20

trading. You don't think of insider information

1:21:23

in science. In my naive graduate

1:21:25

student mind, you have made

1:21:27

a great discovery. That's what this is

1:21:29

about. Of course you

1:21:31

would publish it, because it would be insane

1:21:33

not to collect your winnings. In

1:21:36

her mind, I think the

1:21:38

point was, well, okay, I can publish one

1:21:40

paper, and then everybody has the information. Or

1:21:44

we can keep it in-house. How

1:21:46

many different papers can you publish

1:21:48

where you predict something amazing that

1:21:50

other people don't see coming, and

1:21:52

it's like you have a crystal

1:21:54

ball. You're that smart. Wow, yeah.

1:21:56

So she did ultimately publish it.

1:22:00

didn't mention me at all, literally

1:22:03

did not acknowledge where the hypothesis has

1:22:05

come from. But anyway, so the point

1:22:07

is, if you're an

1:22:09

outsider, you

1:22:12

just don't understand what game

1:22:14

exists inside this career

1:22:16

environment. And you don't

1:22:18

understand that you really

1:22:20

need your scientists to

1:22:22

be, you're not asking them not to

1:22:24

be human, but you are asking

1:22:26

them to put their desire to

1:22:29

do good in the world aside

1:22:32

and to trust in the fact that discovering

1:22:34

what is true is the contribution

1:22:36

of science. What to do about it

1:22:38

is a separate process. And I'm not

1:22:41

arguing that scientists shouldn't be able to

1:22:43

say, here's what we find, here's what

1:22:45

it implies, and that should inform policy.

1:22:48

But the problem is, any

1:22:51

time that the same field

1:22:54

is a place that you are seeking the

1:22:56

truth about something, whether it's climate

1:22:58

or psychology or

1:23:00

medicine, where the

1:23:03

science and the therapy are

1:23:06

housed in the same, under the same banner,

1:23:08

you've got a problem. Right?

1:23:10

Yes. I mean, let's take psychology. This is

1:23:12

the easiest one of these to see. My

1:23:16

claim as a biologist who has

1:23:18

thought a great deal about human beings is

1:23:21

that we know very little

1:23:23

about how to connect the

1:23:26

phenomenology of human psychology

1:23:28

to neurobiology.

1:23:32

We've got some gross anatomy that we

1:23:34

can say. We know how neurons work

1:23:36

quite well. But

1:23:39

the point is, if you think that

1:23:42

psychology is a story in

1:23:45

which we treat a patient

1:23:47

based on our understanding of

1:23:49

the underlying neurobiology and neurochemistry,

1:23:53

you've signed up for a fiction. And

1:23:55

so what I would hope for is a

1:23:57

world in which we have actually two different names. for

1:24:00

the study of how the

1:24:03

mind works and

1:24:06

the study of how we treat people who have

1:24:08

issues. Those are

1:24:11

not close enough together for them to be

1:24:13

the same field. Which

1:24:17

means that would be greatly liberating

1:24:20

for the therapeutic

1:24:22

side, clinical psychology, to

1:24:25

treat people on the basis that here is how

1:24:27

a patient walks through your

1:24:29

door, here's the pattern of dysfunction in their

1:24:31

life or in their mind, and here is

1:24:33

how we interact with them in order to

1:24:35

get them to see something they're not seeing

1:24:38

or to get over a neurosis

1:24:40

or whatever. That's

1:24:42

one thing. Then separately,

1:24:44

we study how the

1:24:47

phenomenology of the neuron is

1:24:49

scaled up into some sort

1:24:51

of processing that we are

1:24:53

only at the very earliest

1:24:55

stages of understanding. Instead,

1:24:58

by treating them as the same thing, you

1:25:01

open the door of therapeutic

1:25:03

interactions with patients to the

1:25:06

illusion that we know enough

1:25:08

to mess with the neurochemistry,

1:25:12

which has been a disaster. Of

1:25:15

course, you can map on top

1:25:18

of that the bonus incentive. It

1:25:21

wants you to find a particular answer. What

1:25:23

we found is X and

1:25:25

Y and Z, there's almost this ever-present looming

1:25:27

force in the background that's like, well, maybe

1:25:30

this would work, guys. This could work.

1:25:32

It's this encouraging voice in the space

1:25:34

that wants to say, well, we found

1:25:36

these things, maybe this can work, and

1:25:38

they can push that stuff through. Now,

1:25:40

it gets to

1:25:42

a stage, many, many years in, that for

1:25:46

lay people to question it, I don't even like to

1:25:48

do it. Even now, it's making me uncomfortable,

1:25:50

because I know that as we make this

1:25:52

video, there'll be people out there who've received

1:25:55

those therapeutics. There'll be people out there who've

1:25:57

had those diagnoses, who've been told that they

1:25:59

have this particular thing, I don't know,

1:26:02

but how you navigate

1:26:04

through that space, especially when you get

1:26:06

to the point of people receiving medication

1:26:09

from a credible space that I certainly am

1:26:11

not able to challenge, all

1:26:14

I can do is say, I've seen how

1:26:16

these decisions get made. Ultimately, we see the

1:26:18

very sort of front end of it, that

1:26:20

this prescription is what's been given to you.

1:26:23

But everything behind the gate, behind the door,

1:26:25

when you really start to look, is very

1:26:27

concerning. There's tons of weird stuff that goes

1:26:29

on. And it makes you wonder,

1:26:31

how did this therapeutic get from there

1:26:35

to being labeled as useful for

1:26:37

this condition, and ultimately

1:26:39

into the prescriptions of millions of

1:26:41

people? Right.

1:26:43

The work, it's not higher

1:26:46

quality than what we saw with COVID

1:26:48

therapeutics. No. And once

1:26:50

you've seen that things that look very much

1:26:52

like science, they take the exact

1:26:54

form of science, they're published. They

1:26:57

look quite good to the casual observer, but

1:26:59

they don't stand up to scrutiny when you

1:27:01

get into the nuts and bolts of how

1:27:03

the experiments were done. Once

1:27:05

you've seen that, and it's like,

1:27:08

oh, there's this illusion of science

1:27:10

that gets published in the top places,

1:27:13

and actually there's a fair amount of

1:27:15

work on the corruption of those journals

1:27:17

and how it in fact happens. Once

1:27:20

you know that, then the point is,

1:27:22

okay, well, what are the chances that

1:27:24

this doctor who's telling me that this

1:27:27

pill is going to address this cognitive

1:27:29

malfunction is A,

1:27:32

perversely incentivized to tell

1:27:34

me that, B, has a

1:27:36

detailed enough knowledge to understand what

1:27:38

the downsides might be. It's

1:27:42

a preposterous story at some level, even if

1:27:44

there are people who are occasionally helped by

1:27:46

these things. Yeah, definitely. But it

1:27:49

always comes back to the same place where

1:27:51

I could personally always find myself, which is,

1:27:54

dare I pick this rock up again?

1:27:57

Dare I look under there, see what'll crawl out. Because

1:28:00

it has, again, like with the climate

1:28:02

thing, it's like, broadly, I'm on side.

1:28:05

I'm on side that we should sort our future

1:28:07

out. We should crack

1:28:10

on, lower the carbon in the atmosphere,

1:28:13

et cetera. But

1:28:15

having learned what I've learned over the last three or

1:28:17

four years, I'm now in a position where it's like,

1:28:19

do I want to go down that path to really

1:28:22

figure out whether all of

1:28:24

that is subject to the same problems that

1:28:26

I've identified? And I'm

1:28:28

quite content to discuss in this

1:28:30

one domain. As

1:28:33

I'm getting older, I'm getting to the point of being

1:28:35

more comfortable to say, it's very likely we're going to

1:28:37

find the same thing. And

1:28:40

there's a kind of ontological shock that comes with

1:28:43

that, that makes people

1:28:45

uncomfortable. It makes other

1:28:48

people you speak with uncomfortable, not in this space. People

1:28:52

relish the opportunity to discuss about it in this

1:28:55

space. But it's such a weird thing to get

1:28:57

into with people. And I do want to come

1:28:59

back to something, which was, I mentioned

1:29:02

about the—there's a moral quandary attached to

1:29:04

telling people that the future is going

1:29:06

to be completely messed up via climate

1:29:08

change and this sort of overwhelmingly

1:29:11

negative message about that, and

1:29:13

how that has real implications if it's not

1:29:15

true. We ought to

1:29:17

hold ourselves to the same standard, I

1:29:19

believe, in this new, weird space that's

1:29:21

emerged. And I think

1:29:23

we sometimes paint a picture that may

1:29:26

well be more

1:29:28

conducive to clicks and views. And

1:29:32

are we subject to the same dynamics, do you

1:29:34

think? Are we painting a fair

1:29:36

picture of what might come? Well here's the

1:29:39

problem. I

1:29:41

spend a lot of time watching

1:29:47

myself to make sure that I

1:29:49

don't fall into that. I'm

1:29:51

not aiming that at you. No, I understand

1:29:53

that. But look, I think it's a real enough

1:29:56

problem that I have watched myself to say, all

1:29:58

right, in what ways might this migrate? into my

1:30:00

way of thinking. And

1:30:02

I will say I have a very

1:30:05

remote relationship

1:30:08

with the feedback on what I put

1:30:10

into the world and how many people

1:30:12

see it. I do not

1:30:15

check in general. Occasionally, I encounter on one

1:30:17

platform or another how many views a particular

1:30:20

video has had. But

1:30:22

by not monitoring

1:30:24

that, it is

1:30:26

harder to fall into that trap of noticing what things

1:30:28

it is that cause people to drop off,

1:30:31

what things it is that cause people to

1:30:33

sign up. So

1:30:35

anyway, I do think it is important to

1:30:37

break that feedback. I also think it is

1:30:39

important to cultivate in

1:30:42

oneself, just

1:30:44

as I've advocated for a mindset

1:30:46

like the gentlemen scientists of old

1:30:48

who are interested in being right

1:30:50

in the end, it

1:30:53

is important not to

1:30:55

monitor the ebb and flow over

1:30:58

these things. But I will say that there is

1:31:00

a, and you've

1:31:03

done a very good job here, the

1:31:07

fact that I don't trust the

1:31:09

climate scientists at all.

1:31:11

And I will say I don't trust

1:31:13

the climate scientists because they are

1:31:16

obsessed with models. Yes.

1:31:18

Now, I know a

1:31:21

bit about models. And I would say models

1:31:24

are arguably useful

1:31:26

in generating hypotheses. They cannot be used

1:31:28

to test them in a complex system.

1:31:31

I believe the models are another place

1:31:33

where my analogy of a Ouija board

1:31:35

is broadly correct. It is. There are

1:31:37

so many inputs to them. It's like

1:31:40

we can just gently, plausibly nudge this.

1:31:42

Right. In certain ways,

1:31:44

and we can walk away and no one would

1:31:46

ever know. And maybe even I don't know that

1:31:48

I did that. Exactly. You don't know. You

1:31:51

get a lot of true believers who are deploying

1:31:53

models. And the point is somewhere in the

1:31:56

back of their mind is the idea, if they're

1:31:58

still in the field: I probably

1:32:00

believe the world is in immediate

1:32:02

peril. It is dependent

1:32:04

on people waking up to a message. I

1:32:07

am part of a great movement of those

1:32:09

trying to awaken the world to this grave

1:32:11

danger. It's very

1:32:13

late. And

1:32:15

therefore, as long as I err in

1:32:18

a direction which causes people to wake

1:32:20

up and do the right thing, how

1:32:22

much harm is there in that? They

1:32:24

probably think that. Now, my feeling

1:32:26

is, wow, is there a lot of harm in

1:32:28

that, because you are

1:32:30

talking about intervening

1:32:33

in a complex system where you

1:32:35

are almost certain to create massive

1:32:38

unintended consequences. You just can't predict.

1:32:40

You can't predict. And we know

1:32:42

that one of the consequences that

1:32:44

is being queued up has to

1:32:46

do with the elimination of basic

1:32:48

civil rights and sovereignty. They

1:32:51

are declaring an emergency that

1:32:56

means we are no longer entitled

1:32:58

to the rights that are the

1:33:00

foundation of life in the West.

1:33:04

Like, for example, the right to travel, the

1:33:07

right to have a car, drive

1:33:09

where you would like to drive. And

1:33:13

the way all of those things are being discussed

1:33:15

in a moralistic turn, that's the part that I

1:33:17

think is curious and interesting. Because we are back

1:33:19

at this idea of a they at this stage.

1:33:21

But it's the

1:33:23

moral core of that which drives that forward.

1:33:26

And it puts you into a position of

1:33:28

pariah again to say, well, can

1:33:30

we just hang on one moment? And

1:33:32

the moment that's done, the same dynamic we've sort

1:33:35

of spoken about all through this podcast,

1:33:37

it forces people out of

1:33:39

the public square. And

1:33:42

you don't really make any progress from outside of the

1:33:44

public square. You are on your own. Now,

1:33:47

to your point about what happens

1:33:49

to those who are in the

1:33:51

contrarian space. And I don't

1:33:54

think of myself as a contrarian, though,

1:33:56

increasingly, because the institutions are failing across

1:33:58

the board, one ends up in opposition

1:34:00

to all of them, which looks like contrarians.

1:34:03

But the risk that I

1:34:06

see is that

1:34:11

when you correctly glean

1:34:13

that the truth-seeking apparatus

1:34:17

is malfunctioning in

1:34:19

a particular direction of climate

1:34:21

alarmism, you

1:34:24

do not have a corrective. So one

1:34:28

thing I see amongst my friends on

1:34:30

the right is

1:34:33

that they do not hear a

1:34:35

distinction between concern over the environment

1:34:37

and concern over climate. These are

1:34:40

synonymous. It's like the failure

1:34:42

to distinguish between reproductive success

1:34:44

and fitness. And as

1:34:47

a biologist and somebody who has

1:34:49

been passionate about nature since I

1:34:51

was a kid, I can

1:34:54

tell you the

1:34:56

earth is in trouble. I

1:34:58

am not convinced it's in climate

1:35:00

trouble. I'm convinced there is climate

1:35:03

change. I know that there is

1:35:05

a basic non-model-based issue with the

1:35:07

Arrhenius equation that carbon does in

1:35:09

principle cause the earth

1:35:11

to trap some more heat.
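[A brief note, not from the conversation: the basic relationship being referenced is logarithmic. A commonly quoted simplified expression, from Myhre et al.'s update of Arrhenius's result, gives the extra radiative forcing from raising CO2 from a concentration C0 to C as roughly ΔF ≈ 5.35 ln(C / C0) W/m², so each doubling of CO2 adds about 3.7 W/m² of forcing; how much warming that forcing ultimately produces depends on feedbacks, which is where the contested modeling begins.]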

1:35:14

But again, this is a complex system

1:35:16

in which that reality, which

1:35:18

I believe is likely to be true, is

1:35:21

embedded in a lot of factors and

1:35:23

feedback loops. And it's very hard to

1:35:25

determine what's going on. And humans that decide

1:35:28

to take control of it are almost certain to screw

1:35:30

it up, even if it were true that there was

1:35:32

some sort of an emergency. But the

1:35:35

emergency about what

1:35:37

we are doing to habitat

1:35:42

and to creatures is

1:35:45

real and utterly obvious

1:35:48

if you are observing them

1:35:50

directly. I agree. And

1:35:52

these are one-way processes

1:35:55

to the extent that you lose species, you do

1:35:57

not regain them. I

1:36:00

have through a lot of careful

1:36:02

thought arrived at the idea that

1:36:04

it is perfectly reasonable to

1:36:06

imagine that the

1:36:09

way to steward the planet is

1:36:11

to maximize the degree

1:36:13

to which we preserve things that are

1:36:15

valuable to people. I

1:36:18

would love to be a person who could say

1:36:20

actually the creatures of the world are entitled to

1:36:22

be here as well and we have to steward

1:36:24

the world with that in mind. The

1:36:27

problem with that is does

1:36:29

it apply to malaria? Right,

1:36:31

I don't think it does. Well,

1:36:35

that's a huge and ethical, difficult

1:36:37

question. I think it's a consequence

1:36:39

of living in a

1:36:42

complex system because like we've identified when you

1:36:44

stick your oar into the water there are

1:36:46

consequences you can't measure. And

1:36:48

that's where this idea of engaging with

1:36:51

the truth, trying to find the truth

1:36:53

is extremely hard because once you've found

1:36:55

it you actually have to be able

1:36:57

to contend with the consequences of what

1:37:00

it actually means. So broadly then people

1:37:02

might say, okay, well, I'll

1:37:05

not acknowledge it or deal with it. It's kind

1:37:07

of a huge cosmic trolley problem. It's

1:37:09

easier to just ignore because let's say

1:37:11

for example in some bizarre argument that,

1:37:14

okay, you can solve malaria tomorrow, great,

1:37:16

you take it. What you didn't see

1:37:18

was that in doing that X

1:37:21

and Y and Z happened and we don't have

1:37:23

that information. So now you

1:37:26

but then you can think, okay,

1:37:28

well, now we're stuck in complete

1:37:30

paralysis because to take

1:37:32

that logic to its

1:37:34

extreme, this idea that, okay,

1:37:37

I grew up in a stable state, I grew

1:37:39

up in a stable economy, I've grown up in

1:37:41

a country and in a culture that believes the

1:37:43

institutions, teachers, scientists and

1:37:46

clever, wise people can have

1:37:48

positive influences in our lives.

1:37:51

You question that reality and think

1:37:53

actually many of the interventions we're

1:37:56

making are having these massive downstream

1:37:58

consequences that we're only just beginning

1:38:00

to identify, there's a danger, I

1:38:03

believe, to revert back toward a

1:38:05

Buddhist philosophy of

1:38:09

complete indifference, which is to say, I'm not

1:38:11

going to try and alter or change or

1:38:13

shape the world at all, broadly. I know

1:38:15

this is a real kind of hashing

1:38:20

of Buddhist philosophy. But broadly,

1:38:22

the philosophy that I think people are starting

1:38:24

to pick up is that I can change

1:38:26

myself. What's important is

1:38:28

me, myself, my family, and everything

1:38:30

external to that is whatever's going

1:38:32

on, and you become indifferent

1:38:34

about everything. Somewhere between

1:38:37

those two very huge polarities, we

1:38:39

have to find a path of,

1:38:42

yes, we can know something

1:38:45

to some degree. We can

1:38:47

know enough that we can intervene in a

1:38:49

way that we think and hope will become

1:38:52

positive. We

1:38:55

don't have a choice to revert back

1:38:57

to just washing our hands of everything,

1:38:59

which you mentioned is like

1:39:01

our friends on the right. I really share

1:39:04

that view. You say that the

1:39:07

complexity of trying to solve the

1:39:10

environmental question is often lost, because

1:39:12

the whole conversation becomes, well, any

1:39:15

attempt to change or fix

1:39:18

this is wrapped up with

1:39:20

the worst possible intervention, the

1:39:22

worst possible policies and laws.

1:39:25

And how you can bridge

1:39:28

those two very contrasting and difficult

1:39:31

perspectives to say that we can't

1:39:33

actually make a difference and still

1:39:35

feel confident enough to act at all?

1:39:37

Can we just be stuck in paralysis,

1:39:41

fearful of having any influence on Earth in

1:39:43

case you might screw something up that we

1:39:45

didn't realize? Well, you

1:39:48

know, I've got a set of principles to

1:39:50

deal with this, but I want to go

1:39:52

back to the narrow question of should we

1:39:54

steward the world for the

1:39:56

purpose of human well-being, or should we steward the world

1:39:58

for the purpose of preserving

1:40:03

every element of the

1:40:05

biota, with malaria

1:40:07

being the obvious test

1:40:09

case. Now, my

1:40:11

claim is neither of these solves the problem

1:40:13

completely of what you do. Would

1:40:16

eliminating malaria have

1:40:18

downstream consequences that are negative?

1:40:22

It might, but I'm willing

1:40:24

to risk it given the harm of malaria. Would

1:40:27

eliminating anopheles mosquitoes have

1:40:29

downstream consequences that are negative? Almost

1:40:33

certainly. They're playing a role in

1:40:35

the environment. Would it still be worth it

1:40:37

from the point of view of the well-being of humans

1:40:39

if we project indefinitely into the future? I

1:40:41

would guess so, but I would certainly want

1:40:43

to gather people who don't have a perverse

1:40:46

incentive to talk about whether or not there

1:40:48

is any negative consequence that is as large

1:40:51

as the ongoing costs that

1:40:53

we pay over malaria, including

1:40:55

what happens if

1:40:57

people who are being weeded from these

1:40:59

populations are suddenly not weeded. It

1:41:02

is possible to create an even worse

1:41:04

problem as

1:41:06

a result of the fact that this process has

1:41:08

been altered. So all of those things need to

1:41:10

be discussed. But nonetheless, I would just simply argue

1:41:13

that if

1:41:15

you simplify to the, to me

1:41:18

counterintuitive, idea that

1:41:22

the stewardship should be about human well-being,

1:41:24

I would argue it actually recovers the

1:41:26

ability to protect all of the stuff

1:41:28

that must be protected,

1:41:30

because future generations have

1:41:32

a right to a world in which orcas

1:41:35

continue to exist. It

1:41:37

allows you to protect all of the

1:41:40

things that are necessary, but it does

1:41:42

not require you to treat them with

1:41:44

a sentimentality that will cause you to

1:41:46

do harm because some species

1:41:49

somewhere will be disrupted by this

1:41:52

important thing. to

1:42:01

sort of end on, I suppose. It's

1:42:04

actually tractable, is kind of my point.

1:42:06

Whereas the alternative, where it may be

1:42:08

sentimentally nicer, is

1:42:11

not tractable. You're suddenly

1:42:13

defending malaria's right to exist, and

1:42:16

my feeling is anything that leads you down that road is

1:42:19

bound to be a mistake. This

1:42:21

view you have is contingent upon

1:42:24

what kind of vision we begin to

1:42:26

build for what humans should

1:42:28

become. Because one vision is expressed

1:42:30

most beautifully, I think, in the

1:42:32

song by Radiohead, Fake Plastic Trees,

1:42:35

a fake plastic watering can,

1:42:37

or a fake Chinese rubber plant. You

1:42:39

can follow that vision to its logical conclusion.

1:42:43

As cynical as that song sounds, I

1:42:45

do believe it summarizes and shows us one

1:42:48

particular vision for humanity. You

1:42:51

would then say, well, we're cultivating an

1:42:53

Earth to facilitate that kind of a

1:42:55

human. It can lead us to ruin. But

1:42:58

there is another vision, which is

1:43:00

amorphous and strange in my mind, but I can see

1:43:02

it. You raised the idea of orcas, and I can

1:43:05

already see this in the ground, in the ocean, forests

1:43:08

and rainforests. It's a very different vision for

1:43:11

humanity. We

1:43:13

still have a pretty big job

1:43:16

on, I think, making sure we

1:43:18

convince enough people that this is where we

1:43:20

ought to go. Because

1:43:22

if we don't, there's a real danger

1:43:24

we would retreat back towards this

1:43:27

stratified internet-addled

1:43:29

singleton existence, which as cynical as it sounds,

1:43:32

we can all see it happening. It's happening,

1:43:34

isn't it? Well, if

1:43:37

there was one button I could

1:43:39

push, one alteration I could make that I think would

1:43:42

bring us in the direction of reason, it

1:43:46

would be to break the

1:43:48

connection between the idea of

1:43:50

sustainability and any ideology.

1:43:54

My feeling is that actually any ideology

1:43:56

that falls down on, and I'm not

1:43:59

arguing that you can

1:44:01

easily operationalize sustainability, but

1:44:04

that loosely speaking, we

1:44:06

are obligated morally to

1:44:09

not leave a lesser world

1:44:11

than we inherited. Which is... the subjective thing

1:44:13

is the danger. To be cynical about

1:44:15

it, people will argue about what

1:44:18

kind of world. Some people would look at

1:44:22

Las Vegas and say

1:44:24

this is the vision. Yeah, but

1:44:27

here's my point. I don't want to fight about Las

1:44:29

Vegas. Okay, what

1:44:31

I want to point out is you

1:44:33

don't have the right to decide that

1:44:35

future generations don't need an Amazon.

1:44:40

You don't know what future generations

1:44:42

will discover is in

1:44:44

the Amazon that they need in

1:44:46

terms of medicines. You don't know

1:44:48

what psychological impact it has. Have

1:44:51

we been driven crazy in part because

1:44:53

we've made light pollution

1:44:56

and that has caused people to become

1:44:58

solipsistic because they can't see the vastness

1:45:01

of even our arm of the galaxy? I

1:45:05

think there's an argument to be made. So

1:45:07

my point is nobody has

1:45:09

the right to make that decision for

1:45:11

future generations. What you do

1:45:14

with the minutia, I don't know. I

1:45:16

don't want us to be paralyzed by

1:45:18

the inability to do anything because it

1:45:20

might conceivably have a consequence or something.

1:45:22

But I do think the

1:45:24

big important stuff that we can see, we

1:45:27

have an obligation to preserve it even if

1:45:30

we don't think it's so important to people

1:45:32

who can't speak for themselves because they're five

1:45:34

generations out. And

1:45:36

what we are doing presently is we

1:45:38

are liquidating

1:45:40

the well-being of the planet.

1:45:43

Things are being lost that will

1:45:45

not come back. And whether

1:45:47

you are a believer that

1:45:50

these species were specially created

1:45:52

by an individual

1:45:54

who had a purpose for us, or

1:45:56

you think these are the product of a... mindless

1:46:00

process that just happens to

1:46:02

have created endless forms most

1:46:04

beautiful? It doesn't matter. Preserving

1:46:07

it is the sensible thing to do. Not

1:46:09

preserving it, you know, it's not that you

1:46:11

can't move a grain of sand, but to

1:46:13

the extent that you are going to poison

1:46:16

a landscape, right?

1:46:18

Because that landscape isn't really important to

1:46:21

anybody. You

1:46:23

are robbing future generations of discovering

1:46:25

that that was actually valuable

1:46:27

in some way. And it's hard

1:46:30

for me to imagine how that isn't a

1:46:33

morally secure foundation even if

1:46:35

there are questions about how

1:46:37

you would operationalize it. So

1:46:39

are you optimistic, to finish?

1:46:41

Are you optimistic about getting

1:46:44

towards that? Well, how do

1:46:46

you... what's your feeling right now? Well, look,

1:46:48

I should tell you that I'm

1:46:50

optimistic because it would do

1:46:52

some good, but I'm not going to tell you

1:46:54

that. I'm going to tell you the truth. The

1:46:56

truth is, I believe we

1:46:59

are in grave danger. A

1:47:01

dark age is one thing if you

1:47:03

have primitive technology, it's quite another if

1:47:05

you have nuclear technology and beyond. So

1:47:08

we have to wake up to that danger. I believe

1:47:11

we have an obligation to preserve what we

1:47:13

have and that we are falling down on

1:47:15

that obligation. I believe it

1:47:17

is very late, but I do not know that

1:47:19

it is too late, which is

1:47:21

why I'm doing what I do. I

1:47:24

want people to wake up. I will

1:47:26

say on the bright side,

1:47:28

in order to get,

1:47:32

there's an evolutionary metaphor for this, but in order

1:47:34

to get from a low peak to a higher

1:47:36

peak, a better state of being, one

1:47:38

has to go through an adaptive valley. The

1:47:41

fact that things are very dark at

1:47:43

the moment may just simply be necessary

1:47:45

in order to break us out of

1:47:47

our paralysis and get us to discover

1:47:49

what we are supposed to be doing

1:47:51

next. So I

1:47:53

don't find the fact of the darkness

1:47:56

itself to be indicative that we won't

1:47:59

solve the problem, but I'm

1:48:01

very concerned about the mechanisms

1:48:04

that we have at our disposal

1:48:06

to even recognize that we

1:48:08

have a shared problem. All of

1:48:10

us on Earth are tied together in

1:48:13

a way that there's no conceivable divorce.

1:48:15

So what we collectively

1:48:18

understand to be our hazards

1:48:23

and opportunities matters, whether

1:48:25

we like it or not. We can retreat to

1:48:27

our corners and leave it to fate, but that's

1:48:29

not a wise thing to do. So I

1:48:35

think we could fix things and I think

1:48:37

we could make things substantially better

1:48:39

than they have been. I think we could

1:48:41

frankly defend the ideals of the West, which

1:48:43

is what I believe we should be doing.

1:48:45

We should be globalizing those ideals. We

1:48:48

have a global West in

1:48:50

which all of us understand that human

1:48:54

life is precious, that what makes it

1:48:56

precious is the liberty to be distinct

1:48:59

from each other, and that

1:49:01

that creates the dynamism of humans,

1:49:04

that we are not all identical and that

1:49:06

requires freedom. The purpose is not to

1:49:09

enjoy or endure or whatever else it

1:49:11

might be. That liberty is fundamental and

1:49:14

that anything that is pushing us in the

1:49:16

direction of eliminating liberty to deal with an

1:49:19

emergency is bound to

1:49:21

be wrong, if not an outright con.

1:49:25

So anyway, I'm hopeful in the sense that

1:49:29

I believe there

1:49:31

is still time and that people's growing

1:49:33

awareness means that there

1:49:35

is hope for us to fix it, but

1:49:38

I am also concerned that

1:49:42

that dark age is resulting

1:49:44

in people embracing kinds of

1:49:50

mysticism that actually will not end up productive

1:49:55

in the end. So when you say you want

1:49:57

people to wake up, like let's see if we

1:49:59

can be pragmatic and practical about

1:50:01

that for the people who

1:50:04

maybe are already on side. They

1:50:06

have friends, they have families, because

1:50:08

they live in a culture that exists.

1:50:10

Like what do people feasibly do?

1:50:12

Like my instinct has

1:50:15

been, and I'm newer to this, is

1:50:17

that my strongly held

1:50:19

views may actually

1:50:22

be wrong. It's

1:50:24

very possible that something that's motivating

1:50:26

me quite deeply and upsetting me

1:50:28

quite deeply is on closer

1:50:32

inspection not serving me at all. That can

1:50:34

come down to even things like class, perspective,

1:50:39

ambitions, the ideas

1:50:41

you have about the shape of the world that you're

1:50:43

living in. Is that part of it, recognizing

1:50:45

that you might be lost

1:50:48

at sea and then you can make progress

1:50:50

from there? I think that's it. I mean,

1:50:52

I love your analogy. We

1:50:54

are lost at sea. The

1:50:56

captain does not admit it. The moment

1:50:59

at which hope will grow

1:51:01

is the moment at which

1:51:06

we do sit down with the captain and

1:51:09

say, look mate,

1:51:12

you're not doing us any favors by telling

1:51:14

us everything is fine. Recognizing

1:51:17

the peril we're in is really the

1:51:19

first step to figuring out, okay, what

1:51:23

do we do first? Where do we

1:51:25

go from here? How

1:51:27

many lifeboats are there exactly? Is there

1:51:29

anybody coming if we board those lifeboats?

1:51:31

These are important questions. Look, I think

1:51:34

some people are

1:51:40

not ready under any circumstances. They could literally be

1:51:42

on a sinking ship and they wouldn't be ready

1:51:44

to hear it. Which is much like the analogy

1:51:46

of the famous case, isn't it? People went down

1:51:48

with the violins, rearranging the

1:51:50

deck chairs, as they say. Yeah, rearranging

1:51:52

the deck chairs. But the point is,

1:51:54

look, ultimately,

1:51:57

if you can't handle the possibility

1:52:00

that we might be on a

1:52:02

sinking ship, that's okay, but

1:52:04

you don't belong in the conversation about what to do

1:52:06

about it, right? People in

1:52:09

denial about that are not helpful in

1:52:11

light of the obvious peril. So

1:52:14

what we need to do is have a conversation with

1:52:16

those who are ready to have it, right? There

1:52:19

is hope, but it will be squandered if

1:52:21

we are constantly debating whether or not there's

1:52:24

a problem. There's obviously a problem. And,

1:52:29

you know, as has

1:52:31

been the case with the COVID dissidents,

1:52:34

the discovery that there were other people

1:52:36

who had not lost their minds, who

1:52:38

were seeing the same issues, who had

1:52:40

different pieces of the toolkit we needed

1:52:42

in order to navigate it, was a

1:52:48

tremendously hopeful discovery. Yeah,

1:52:51

it was. And I think in that analogy,

1:52:53

if this helps anyone at all listening or

1:52:55

watching, people got off the ship

1:52:58

way before they got off. People said this

1:53:00

is actually just not working. And in many places

1:53:02

across the world, they just cracked on

1:53:04

and did their own thing. From

1:53:07

a certain perspective, it's still

1:53:09

so all-encompassing for so

1:53:11

many people. But if

1:53:14

you were traveling around the world at that time, as I was,

1:53:16

the whole different range of ways

1:53:19

in which people experienced that global

1:53:21

pandemic and the whole

1:53:23

rainbow of different perspectives that people

1:53:25

have, made you realize, oh, there

1:53:28

are so many different

1:53:30

perspectives you can have about being on this

1:53:33

vessel. You can hop off at any time.

1:53:35

Well, actually, and that brings us to the

1:53:37

really important thing. Right. We can see in

1:53:40

what the World Health Organization is doing that

1:53:42

there is a desire to make it impossible

1:53:44

to navigate your own course. They made it

1:53:46

difficult. They're looking to make it

1:53:48

impossible to navigate your own course in

1:53:50

the next emergency that they declare. And

1:53:52

that we must oppose

1:53:55

absolutely vigorously and we must win.

1:53:57

Yeah, I expect incursions on freedom

1:54:00

of speech, but if

1:54:02

you can't recognize that we are somewhere

1:54:04

lost at sea... Yeah, well, freedom of

1:54:06

speech, mandates

1:54:08

of any kind. And I'm not arguing,

1:54:10

you know, I see as well as

1:54:13

anybody the game theory that

1:54:15

in a perfect world would

1:54:18

have you govern

1:54:21

your way through a pandemic, but

1:54:23

we don't live in a perfect world.

1:54:26

We're not going to live in a perfect world. And in this

1:54:28

world, you have to leave people the freedom

1:54:30

to say, no, actually, I don't

1:54:32

accept what you're telling me. And

1:54:35

I have a right not to be

1:54:37

injected. I have a right not

1:54:40

to be locked into my home, not

1:54:42

to be robbed of my ability to

1:54:47

make facial expressions at another human being.

1:54:50

These things are beyond whatever

1:54:52

model you've deployed. And

1:54:55

you know what? People get sick

1:54:57

and they die. And you don't have the power

1:54:59

to stop that from happening. And you don't have

1:55:01

the power. You don't have the right to declare

1:55:04

some disease so important that you get

1:55:06

to override the natural processes of science

1:55:09

and deliberation. If you've got a point

1:55:11

to make, you have one tool at

1:55:13

your disposal, and that is persuasion. So

1:55:15

get on it. I agree.

1:55:19

I like that. I think we can leave that there. But what do

1:55:21

you think? I think it's great. Nice. Yeah,

1:55:24

been a real pleasure seeing you

1:55:26

again. And it's been a great conversation. Yeah,

1:55:28

thanks, Bret. All right. Thanks

1:55:30

for joining us. And see you next time.
