How AI Will Transform Medicine, Entertainment, and Your Everyday Life | Marc Andreessen - PT 2

Released Wednesday, 23rd October 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

The time for holiday hosting is upon

0:02

us, so make your second bathroom second

0:05

to none with Home Depot.com's best savings

0:07

of the season. Right now, enjoy up

0:09

to 40% off select online bath. Find

0:11

the latest on-trend styles of vanities, faucets,

0:14

showers, tubs, toilets, and more, all at

0:16

prices that will let your budget relax

0:18

right along with you and your beautifully

0:21

renovated bath. Get up to 40%

0:23

off select online bath plus free delivery

0:25

at The Home Depot. Subject to

0:27

availability, see homedepot.com/delivery for details. Think

0:30

you know what makes a great episode of Impact Theory? I

0:33

want your help. I'm giving

0:35

you the reins and it only

0:37

takes two minutes. Look, with a

0:39

little bit of your help, I

0:41

can make this show exponentially better.

0:43

You know I am relentless about

0:45

optimization and that includes this show.

0:47

Here's the deal. We've set up

0:49

a super quick survey at gum.fm

0:51

slash impact. It'll take you just

0:53

two minutes, but those two minutes

0:55

could be so useful to making

0:57

this even better for you guys.

0:59

Your feedback is gonna help shape

1:01

the future of Impact Theory, the

1:03

guests that blow your mind, the

1:05

topics that transform your life, even

1:07

the ads that actually provide value.

1:09

It all starts with your input.

1:11

This is your chance to influence

1:14

the show and help make sure

1:16

that it's serving you in the

1:18

best way possible. Remember, exponential growth

1:20

comes from brutally honest feedback and

1:22

then us taking decisive action on

1:24

that feedback. So here's the mission.

1:26

Head to gum.fm slash impact and

1:28

give your feedback right now. We

1:30

promise to make very good use

1:32

of it. Again, gum.fm slash impact.

1:34

That's GUM.F-M slash impact and make

1:36

sure that your voice is heard.

1:38

I'm Tom Bilyeu and this is Impact

1:41

Theory. Let's dive right back in to

1:43

part two with Mark Andreessen. Why

1:46

do so many people in society want censorship right

1:48

now? Well, they want censorship if

1:50

it's on their side,

1:52

right? So, you know, so my version of this is, so

1:54

when I, you know, I told you I grew up in

1:56

the, I wasn't really part of it, but I grew up

1:58

in the middle of this sort of great evangelical awakening in

2:00

the 70s and 80s. And at that time, the sort of

2:02

Christian conservatives in the US were the

2:05

forces for censorship. And so the classic thing was

2:07

it would be like religious groups that would try

2:09

to censor movies or books. And

2:11

then it was the coastal liberals who would be arguing

2:13

in favor of free speech. And so it would be

2:16

that, and famously like the press, the Pentagon Papers, they

2:18

had all these stories about how great free speech was

2:20

and libraries were sacrosanct, that they have free speech and they

2:22

weren't going to censor things. So

2:25

the censorship pressure was coming from the right in that

2:27

era. And my analysis of

2:29

that is that's because at that time,

2:31

the right was culturally ascendant. American society

2:33

was much more overtly religious at that

2:36

time. And the Christian conservatives were very,

2:38

very powerful from a cultural standpoint. They

2:41

got to write the textbooks and all these things. And

2:44

so because they were winning culturally,

2:46

they wanted to lock down speech so that

2:49

they would continue to win. And the left

2:51

was the counterculture. Classically, the left, the hippies,

2:53

the 60s, 70s, 80s, the left was the

2:55

counterculture. And the press and so forth

2:58

was the counterculture. And they wanted to challenge

3:00

the dominant frame. And they wanted

3:02

to disrupt the system. And so they were pro-free speech. And

3:04

then 30 years

3:06

later, it's inverted where

3:08

the left owns the universities. They own the book

3:10

publishers. They own the media. They own the press.

3:12

They own the newspapers. They own most of the TV

3:15

stations. They own the internet

3:17

companies. They own the

3:19

sort of these commanding heights of society

3:21

and culture. And so

3:23

now that they won, and now that they're in

3:25

charge, they want to lock down discourse. And then

3:28

the right has become, it's inverted. The right has

3:30

now become the counterculture. And so the

3:32

censorship pressure comes from the left, and then the right wants to

3:34

open things back up. If the right

3:36

becomes culturally ascendant again, I would expect that polarity

3:38

to shift. Once again,

3:41

it'll flip. Whoever's in charge will not want

3:43

free speech, and whoever's the rebel will want

3:45

free speech. The principled position

3:47

is I want free speech regardless. It's just

3:49

very few people sign up for the principle,

3:51

because most people are part of the tribe.

3:54

But yeah, I'm as I say, I'm an old fashioned

3:56

Gen X libertarian. Like I actually believe in the principle. Yeah,

4:01

no, me too. For me,

4:03

free speech is important because

4:06

part of thinking is speaking out loud,

4:09

having your ideas challenged. Also,

4:11

facts have a half-life,

4:15

and so all the things that we

4:17

believe, man, a

4:19

lot of them call it

4:21

30, 40 years down the road. We

4:24

don't believe them anymore. We've realized we had an

4:26

approximation of the truth, but not the real truth.

4:28

The example I always use on people is Newtonian

4:30

physics versus relativity. It's like, hey,

4:32

when we had Newtonian physics, we thought

4:35

everything worked. We thought we understood

4:37

it, and then we get to relativity. And up,

4:39

actually, you couldn't have had GPS with Newtonian physics,

4:41

so this was an update that was absolutely necessary.

4:44

And as we mentioned earlier, we still aren't

4:46

at ground truth. So we know that we're

4:48

going to be revising that even further. And

4:50

if you really internalize,

4:52

every time we get closer to ground truth,

4:54

it unlocks things for us, then it's like,

4:56

okay, I just want my ideas to be

4:58

challenged. And so because I

5:01

teach young entrepreneurs a lot, I'm like, look,

5:03

you've got to recognize that skills have utility.

5:05

And so the reason you want your idea

5:07

challenged is you can actually develop a better

5:09

skill once you realize, oh, I was wrong

5:11

about x, y, z thing. I can now

5:13

be right, and that actually has utility in

5:16

the real world, lets me do something I

5:18

couldn't do previously. And so when you lock

5:20

that down, now all of a sudden people

5:22

get stuck. You get stuck because you're not

5:24

able to have the best

5:27

arguments thrown at your own idea. It's

5:29

that that one is pretty traumatic to

5:31

me. Now, speaking from

5:34

a position of utility, Elon

5:38

is somebody that has really demonstrated

5:40

an obscene ability to get things

5:43

done. You have bet on a lot

5:45

of entrepreneurs in your career, you've obviously

5:47

been very good at picking the best

5:49

of the best. What

5:51

is it that Elon does

5:53

either in worldview or action that

5:55

makes him so effective? Yeah,

5:58

this is in my mind, this is the single biggest question.

6:00

I'm really glad you asked it because it's the single biggest

6:02

question in the world right now. Like here,

6:04

it's the single biggest question in my world, right? Which is like,

6:07

okay, how is it that he does what he does? And

6:09

I would say like, I don't, you know, there are people who have

6:11

worked with him for a lot longer who probably understand this better, but

6:13

I've had, you know, an up close kind of look at it for

6:15

the last several years now and have

6:18

come to really, I think, really respect it. And I think

6:20

understand at least parts of it. Look,

6:24

it's the, a lot of it. There

6:27

was that famous text exchange and actually he's a friend

6:29

of mine, a wonderful guy, Parag, who

6:31

was running Twitter at the time when Elon first kind of

6:33

tangled with it. And,

6:36

and he's probably a wonderful guy and he had literally just

6:38

become CEO like a month earlier or something. And so he

6:41

was just putting his plans in place when kind

6:43

of everything, you know, the hurricane hit, but you know, there

6:45

was an exchange where Parag is talking about whatever. And it's

6:47

the famous text exchange where Elon's like, all right, fuck it.

6:49

I'm not having this conversation anymore. And then he's like, he

6:52

said, you know, what have you gotten done this week? And

6:56

what I realized when I read that was like

6:58

that, that is the Elon method. Like the Elon

7:00

method boiled all the way down is what have

7:02

you gotten done this week? Right. And that's very

7:04

important because anybody who has ever been in

7:06

a large company trying to do anything big, the

7:09

big things happen over the course of years, you

7:11

know, decades, years, months,

7:14

things don't happen in weeks. Like, you

7:16

know, companies have like five year plans,

7:19

right? Like, you know, cars take

7:21

like seven years to design, right? Like rockets

7:23

take like a decade. Fighter

7:25

jets take like 25 years. Big

7:28

software systems take five, 10 years. You

7:31

know, any large scale effort anywhere in the economy,

7:33

we've just all gotten used to this idea that

7:35

things just take years and years and years. And

7:37

then you've got like processes and procedures and plans

7:39

and this, you know, documentation and, you

7:42

know, rules and structure and strategies and

7:44

like frameworks and PowerPoint presentations coming out

7:46

of your ears. You know, Amazon's

7:48

big breakthrough was

7:50

to just go from having PowerPoint presentations to having like

7:52

15 page written documents that everybody reads

7:54

at the start of a meeting, which actually is an improvement off of

7:56

a PowerPoint presentation. But like, you know, there was that it was like,

7:59

no, I mean, I'm not doing any of that. Like I'm not doing

8:01

any of that. We're not doing any of that. Basically

8:04

it's we're gonna like staff these companies almost

8:06

entirely with engineers.

8:08

I myself, Elon, am an engineer. I

8:11

am going to understand every aspect of every technical system that

8:13

we're working on. I am going to be able to be

8:16

in all the meetings on everything from rocket design to database

8:18

design at Twitter and everything else. I'm

8:20

gonna only talk to the engineers. If I can possibly

8:22

avoid it, I'm never gonna talk to anybody who's not

8:25

an engineer. I'm gonna talk to the person who's

8:28

directly relevant to the project. I'm not going through layers. I'm going

8:30

all the way down to the company to just talk to the

8:32

person who's in charge of this thing. And

8:35

then basically what he does is he goes to each of

8:37

his companies each week. He identifies whatever is the bottleneck at

8:39

that company this week. And then he works with the engineers

8:41

and he fixes it that week. So

8:46

what happens is his companies

8:48

move so much faster than

8:50

everybody else's. Like it's just like, it's

8:52

like tortoise and rabbit. Like they just

8:55

move so much faster. They're so much

8:57

leaner. They don't have all these layers. They

8:59

don't have all these like systems and controls and processes and

9:01

all this stuff. And

9:04

but what they have is like many of the

9:06

best engineers in the world who just absolutely love

9:08

working with a CEO who understands the substance of

9:10

what the product is and

9:12

then is willing to actually work with them hands-on. I

9:14

mean, I've been in meetings with him at X where

9:16

he's in there with like 24 year old engineers and

9:18

they'll just like walk through fire for him. Right,

9:21

because he's like their idol and he's able to have a

9:23

pure conversation with them and he cares about the work that

9:26

they're doing. And if they succeed at it, he is going

9:28

to love them for it. And if they fail at it,

9:30

he's going to be very disappointed in them. And

9:32

it's just a completely different relationship than the CEO of

9:35

one of these big tech companies has. It's just completely

9:37

different. He does a,

9:39

I went to see him one night when

9:42

he took over X and I was sitting in

9:44

the conference room. So I went, okay, so it's like 10 o'clock.

9:46

It's a classic illustration. So it's 10 o'clock at night. And

9:49

he's like, yeah, meet me at Twitter at 10 o'clock at night. I'm

9:52

like, fine. So I

9:54

drive up and I go in and I go to the

9:56

conference room and it's Elon on his

9:58

iPhone doing email. And there's a dog

10:00

on the floor. And I'm like, oh. And

10:02

in retrospect, I was just like, oh, is that your

10:04

dog? And he looks at me completely deadpan. He's like,

10:06

I've never seen that dog before in my life. What?

10:12

I'm like, what, is it just, like, the company dog? He bursts

10:14

out laughing, because of course it's his dog. And

10:16

then he's like, all right, I want to talk. But

10:19

he's like, I need 15 minutes. And he's like, by the way, you

10:21

can sit and hang out if you want. I just have to

10:23

take a call. And he

10:25

gets on Zoom. And he's on Zoom with

10:27

rocket engineers for the Falcon rocket, the next

10:29

generation rocket in Texas. And it's

10:31

whatever, I don't know, 12 o'clock their time, midnight their

10:33

time. And it's just him on his iPhone on

10:35

a Zoom call designing the next rocket,

10:37

which is probably the rocket that we just

10:39

saw work. And

10:43

he's fully conversant, completely conversant in that. And

10:46

he and the engineers fix whatever the problem is that week with the

10:48

rocket. And he's like, all right, now we're going to go fix the

10:50

database here at Twitter. And

10:53

so it's just rinse and repeat, rinse and repeat, rinse and

10:55

repeat. Do that every single week. I

10:58

once offered him a place where I thought he

11:01

might want to take it. I was like, I know you're under a lot of

11:03

pressure. You can go to this place for a week if you want. Because he

11:05

famously doesn't own any houses. He sold all his houses. He

11:07

doesn't own any houses. So he stays at friends' houses. So

11:09

I was like, you can go use my house for a week. And if

11:11

you need a vacation, you can go use my house for a week. I

11:13

got back, five minutes later, one line: I don't take vacations. I'm

11:21

going to frame and bronze that email. And

11:24

so this is what he does. And he just

11:26

does this at an incredible high rate of speed. He doesn't tolerate

11:28

anything that stands in the way of it. By

11:32

the way, this is the same thing that drives everybody

11:34

crazy. And so this was the whole thing on the,

11:36

this is this whole thing he's in this big fight

11:38

with, with regulators on Starship launches. Which

11:40

is like, a normal rocket company would

11:42

take whatever, a decade or 20 years to

11:44

design a new rocket. He's going to put out the prototype as fast

11:46

as he can. He's going to launch it and see what happens. It's

11:50

going to explode in mid-air. My

11:53

nine-year-old and I love watching the SpaceX rocket

11:55

explosion compilation videos on YouTube. They're

11:57

hysterical because they just show these larger and larger

11:59

and larger rockets launching and exploding in midair.

12:02

And his competitors, all the while SpaceX

12:04

was on its way up, his competitors were like, he's crazy,

12:06

he can't make rockets work, see, they're all exploding. And

12:09

what he was doing was he was iterating on the

12:11

rocket design so much faster than they were. And so

12:13

he would run through five rocket generations of

12:15

which four would fail, but he would learn so much

12:17

that the fifth one would work, and he would go

12:19

through the five generations faster than his rocket competitors could

12:22

do one generation. And he's

12:24

just like, fuck it, I don't care. Like, of course some

12:26

rockets are gonna explode. Nobody's gonna get hurt, it's totally fine.

12:29

But a big company can't tolerate that, because it's like headline

12:31

news and everybody's gonna get mad. And

12:34

so anyway, it's just like this completely, it's a

12:36

base level reality, he calls it first principles. You

12:38

just, you get straight to base level reality, you

12:40

get straight to substance. You spend no time on

12:42

anything other than substance. And so anyway, like if

12:45

you, like me, if you're an engineer and you kind

12:47

of see this, I'm an engineer by training,

12:49

and so if you kind of see this, you're like, oh my

12:51

God, this is like obviously the way that everything should be run.

12:53

But if you see it from the outside, it just looks so

12:56

wild compared to all of these

12:58

other large systems and rules that we've

13:00

all gotten used to, and

13:03

therein lies the conflict. What

13:06

do you, so there's a lot of engineers in

13:08

the world and none of them are having the

13:10

kind of success that Elon is having. How much

13:12

credit do you give to the bundle of

13:15

traits that he

13:17

must have? You've already talked about several of

13:20

them, just getting to first principles thinking, moving

13:22

very quickly. But there's also something that seems,

13:24

I don't know, never met him, but by

13:26

things that I have read, one of the

13:28

early biographies, there's just a

13:30

level of, this is not emotional

13:32

for me at all. It's the,

13:35

your assistant asks for, this was in the

13:37

original, one of the original biographies on him,

13:40

assistant asked for higher pay or something, he's like, take a

13:42

vacation for four weeks, I'm gonna do your job and

13:44

see how hard it is. If it's hard, cool, I'll

13:46

give you a raise, and if it's not, you're gone.

13:49

And she'd been with him for like 15 years or something

13:51

crazy, and she comes back and he's like, yeah, it wasn't

13:53

that hard. Bye. And

13:56

people were gobsmacked by that.

13:59

And I was like, yeah,

14:01

I get it. I get it. Is

14:05

there something to that? Like,

14:07

if that were your friend and your friend treated

14:09

you like that, it would not feel good. But

14:13

in terms of proportion of his

14:15

success, his ability to just completely divorce emotion

14:17

and just say, this is either right or

14:19

wrong for the project. Yeah.

14:22

Look, I think there's a lot to that. By

14:25

the way, I think Steve Jobs had a lot of that. It's

14:27

just, I mean, there's a lot

14:29

of ways to look at it. And people can have lots

14:31

of views on this, of course. But substance,

14:35

I would say dichotomy, substance versus style,

14:39

or substance versus protocol. It's

14:44

so easy to slide into a

14:46

way of thinking and being in

14:48

which you are thinking

14:51

abstractly about things. You are

14:53

following, we talked about the goal. You're following

14:56

rules that were established years ago. Most

15:00

big companies, so all companies started as startups. And

15:02

then basically what happens is, generally what happens

15:05

is, they either fail or they succeed. If

15:08

they fail, they go away. If they succeed, what happens

15:10

is they succeed by going through basically scandal after scandal,

15:12

crisis after crisis after crisis. I always

15:14

describe it as like a process of falling upstairs.

15:17

You're just constantly falling and smashing your face into the

15:19

stairs. But you're gaining altitude as you go. And it's

15:21

just like these companies are just constant internal crisis. And

15:26

the normal response, and by the way, it's the thing that everybody

15:28

in business is trained to do. It's what

15:30

they train you to do at Harvard Business School and Stanford Business

15:32

School, all the books and all the stuff, all the CEO coaches.

15:34

It's like, oh, you go through a crisis, you fix the crisis,

15:36

and then you put in place a set of rules to make

15:38

sure that crisis never happens again. It's

15:41

like the legal thing we were talking about. It's

15:43

like, OK, that by itself would be fine. But

15:46

you do that 20 times over 20 years, and

15:48

you have buried a company in bureaucracy to the

15:51

point where it just basically, right? At

15:54

that point, it's a company primarily that exists

15:56

to follow rules. By the way,

15:58

rules that were in many cases defined by

16:00

people who aren't even there at the company anymore. And so

16:02

nobody at the company today actually even understands why they were

16:04

there. Tobi Lütke has a version of this, the

16:07

guy who runs Shopify, who's an amazing CEO. He

16:09

has a version of this, which is it's like every

16:11

whatever year or something, or every six months, he just

16:13

cancels, he requires all standing meetings to be canceled, taken

16:16

off people's calendars. And so all management

16:18

reviews, one-on-ones planning meetings, like everything just

16:20

gets taken off. And then he says,

16:22

we only put the meetings back on where people are howling in pain,

16:25

because they don't have them, right? But his point is

16:27

you have to do that over and over and over

16:29

again, because if you don't, everybody's calendar

16:31

just accretes meetings, and then everybody's sitting in meetings

16:34

all day long, and nobody's doing anything, right? And

16:36

of course, anybody listening to this who works at

16:38

a big company knows exactly what I'm talking about,

16:40

because that's the day-to-day life, which is, oh my

16:42

God. You know, I worked

16:45

at IBM, I've seen the other

16:47

side of this. So my first real professional experience was I

16:49

was an intern at IBM in 1989 and 1990, when

16:52

they were on top of the world, they in as late as

16:54

1985, IBM was 80% of

16:56

the market capitalization of the entire tech industry. They

16:59

were a giant, they were like FAANG

17:01

combined into one company, they were like totally dominant.

17:03

I was there 1989, 90, right before they

17:06

basically fell off a cliff and

17:08

caved in. And so it had been 70 years

17:11

of success, they never had a layoff. Everybody

17:13

had lifetime employment. There were entire buildings full of people

17:15

there who did not have actual jobs, because you couldn't

17:18

actually fire people. Oh God. Oh, I'll tell the story.

17:20

So I got to IBM, and my manager's kind

17:22

of showing me around, and as I'm in this giant

17:24

division in Austin, building these sort of, at the time,

17:26

what were called workstations, these supercomputers, basically. And

17:28

he's like, yeah, he's like, look, here's how it works.

17:30

He's like, we're the development, we're development, and we have

17:32

the development building, and we have like 6,000 people doing

17:34

development of the product. And then

17:37

they have what's called, they call marketing,

17:39

but everybody else calls sales, which is the people who go

17:41

sell the product. And then he's like, and

17:43

then that building over there is the planning department. And

17:46

I was like, oh, I get it. In development, we come up with

17:48

ideas, and then we work with the planning department to have the plans

17:50

to be able to do it. And he's like, no, we never talk

17:52

to them. We will never

17:54

visit that building, because that's the department that we assign people

17:56

to when we can't fire them. Right?

18:00

Woof. Right, by the way, this

18:02

is how, of course, public school systems work. You know,

18:04

the New York public school system famously has, I think,

18:06

what they call the rubber room, which

18:08

is the place they send the teachers who are so

18:10

terrible, they can't put them in a classroom, but they

18:12

can't fire them. And so they just have them sit

18:15

and they do crossword puzzles all day. Right?

18:17

It's the longshoremen who are sitting at home, right? So

18:19

anyway, so big companies develop their own version of this.

18:23

And it just, and it accretes.

18:25

By the time I got to IBM, two

18:27

things. Number one, there was an app that they had

18:30

that showed me the number of reporting, the number of

18:32

manager layers to get me the CEO. So if I'd

18:34

stayed at IBM and I want to become the CEO,

18:36

how many layers would I have to climb? And I

18:38

was 12 layers below the CEO. Right?

18:41

Which meant that my boss's boss's boss's boss's

18:44

boss's boss, who was

18:46

like the big cheese, was still six

18:48

layers down from the CEO. So

18:51

there was that. But the other part of it was they had

18:53

a formal process of decision making they called concurrence. And

18:56

concurrence was if you're going to make a decision at IBM

18:58

in those days, you had to make a formal list of

19:00

every person in the company who was going to be affected

19:02

by the decision, like every manager, every function. And

19:05

for any sort of product related decision, that was like 35 names

19:07

on the checklist. And it was like, you know, the sales heads

19:09

of all the different regions and all this stuff. And

19:13

to make that, to be able to get to a yes on

19:15

the decision, you had to get concurrence from every single person on

19:17

that list. Any one person on that

19:19

list could say the

19:21

term was a non-concur. A

19:23

non-concur was the internal term and

19:25

non-concur meant veto. And so you needed

19:28

35 people to agree and any one person could veto a

19:30

decision. Right. And so

19:32

decision making just simply stopped. And this is why

19:34

the company fell apart, is because

19:36

they couldn't adapt because they couldn't make decisions. Right.

19:39

They literally couldn't act. And they had four hundred

19:41

and forty thousand employees. Right. Oh,

19:43

my God. So on

19:45

a time-adjusted basis

19:47

for the growth of the market, it was like equivalent

19:49

of today would be a million or a million-two

19:51

employees, something like that. So

19:54

it's like a nation state. This is the other thing is at IBM

19:56

in those days, I mean in those

19:58

days you could work there for years, and

20:00

you could never, you could work there for years and

20:02

you would never meet anybody either at work or in

20:04

your social life who didn't work for IBM. Right,

20:08

because it was so big, right? And so all

20:10

of your friends, so everybody worked at the same

20:12

company. The metric, the thing I always look at

20:14

when I visit big companies is I always look

20:16

for the signs, the signs in

20:18

the parking area and the signs in the buildings because

20:21

everybody who works at the company knows where everything is.

20:24

And so they don't rely on signs. And

20:26

so when you go to a big company and there's no signs for

20:28

what the buildings do, it's a sure sign that they're losing touch with

20:30

the market because it means they don't get visitors. Interesting.

20:33

Right, because they're completely insular.

20:37

More to come, we'll be back in a bit. There's

20:41

a reason why I tell aspiring

20:43

entrepreneurs about a particular tool. It

20:45

is as close as you're gonna

20:47

come to a success shortcut. Wanna

20:49

know what it is? It's Shopify.

20:52

Shopify is the backbone of millions

20:54

of successful businesses for a reason.

20:56

Take companies like Alo, Allbirds, or

20:58

Skims. They've got killer products and

21:00

highly effective marketing, but their secret

21:02

weapon, it's Shopify. Here's why. Shopify

21:05

has the number one checkout on

21:07

the planet, period. And with Shop Pay,

21:09

they're boosting conversions by up to

21:11

50%, I can tell you with

21:13

absolute certainty, businesses that want to

21:15

grow, grow with Shopify. Upgrade your

21:18

business today and get the same

21:20

checkout as Allbirds. Here is the

21:22

simple action plan. Sign up for

21:24

a $1 per

21:26

month trial period

21:28

at shopify.com/impact, all

21:31

lowercase. Just go

21:33

to shopify.com/impact to

21:35

upgrade your selling

21:37

today. Again, that's

21:39

shopify.com/impact. Impact Theory is

21:42

sponsored by BetterHelp. You've heard of fight

21:44

or flight, but did you know there's

21:46

a third option when confronting fear? It's

21:48

the secret to achieving all of your

21:50

goals. It's called face it.

21:53

Facing your fears head on is

21:55

where real growth happens. That's where

21:57

BetterHelp comes in. They're making therapy.

22:17

So if you're ready to face

22:19

your fears and unlock your highest

22:21

potential, give better help a try,

22:24

overcome your fears with better help.

22:26

Visit betterhelp.com/impact theory today to get

22:28

10% off your first month.

22:30

That's BetterHelp. H-

22:32

E-L-P.com/impact

22:34

theory. A great logo

22:37

can make or break your business. Imagine

22:39

having a world-class design at your fingertips,

22:41

24 seven, ready to create

22:43

custom polished branding for your company

22:45

at a moment's notice. professional

23:06

logos for your business in seconds,

23:09

but it doesn't stop there. design.com

23:11

is your complete branding warehouse. Create

23:13

websites, social media posts, and ads

23:15

with templates that automatically use your

23:17

logo and color scheme. It lets

23:20

you focus on growing your business

23:22

and not get lost in the

23:24

design details. Ready to level up

23:26

your branding? Head to design.com/impact theory

23:29

and get up to 88% off.

23:33

That's design.com/impact

23:36

theory. We're back. Let's dive

23:38

right in. This is the

23:40

natural trajectory for all these companies just to end up

23:42

in this state. And

23:45

that's the polar opposite of the

23:47

Elon method. Like that's the barbell.

23:50

Now, to your point with your example

23:52

and the assistant, the

23:55

big question is why aren't there more Elons? And how do you make more Elons?

23:57

The second question is, can you have a partial

24:00

Elon. And so one

24:02

of the ways I describe this is, is

24:04

there a unit of metric which is milli-Elons,

24:08

right, like millimeters, right? So

24:11

could you have like 900 milli-Elons? Could

24:14

you have 90% of Elon? But

24:17

maybe not 100%, or could you have the 50% version, or

24:20

the 10% version, or maybe just the one

24:22

milli-Elon, maybe somebody who's just a little bit more

24:24

like that. It's your question,

24:26

do you need the whole package, or

24:29

can people learn these techniques and be this way, even if

24:31

they're not Elon, even if they don't have his natural capacities,

24:33

and even if they're not willing to go all the way

24:35

to where he goes, can they go partway there?

24:37

And I actually think that's an open question today. And

24:40

there are, I would say there are shockingly few CEOs, I

24:42

know who are even asking that question, they're trying to figure

24:44

it out. Now,

24:47

in theory, in theory, in theory, if you've

24:49

got one, in theory, you should be able

24:51

to have a thousand. There's a lot of

24:53

smart people in the world. So

24:55

here's the other example: what would it do for a civilization

24:57

if we had a thousand of them? Yeah,

25:01

I mean, at the rate that he's producing now,

25:03

a lot. Yes. A lot. And

25:06

what would happen in our civilization if every single industry

25:08

had an Elon? Right?

25:11

And so like- Yeah, I mean, it's legitimately

25:13

insane. Yeah, so that possibility

25:15

exists, like you can see it, right? You

25:18

know, so I find

25:21

that very exciting, very optimistic. I

25:24

don't know if it'll go, but I

25:26

think that's one of the really big questions in front of us right now. Yeah,

25:30

no doubt. It'd be interesting to see if anybody can

25:32

pull those principles out in

25:34

a way that's metabolizable by other

25:37

entrepreneurs. The economy, did

25:40

we just dodge a recession? Does

25:43

debt make the recession inevitable? And we just kicked the

25:45

can a little bit down the road. What's your health

25:47

check on the economy right now? Yeah,

25:49

so the way I think, okay, so let me give you a couple of

25:51

things on this. So number one, how

25:53

do we differentiate between the United States and America? I

25:57

think they're two different concepts. Say

25:59

more. I think the United

26:01

States is this system. It's the formal governance system.

26:04

So it's the government and all the stuff we've been talking about. It's

26:06

all the rules and all the processes and

26:08

all the procedures, and we all complain about, you know, we all

26:10

have our various complaints about it. And, you know, whoever we

26:12

are in the political spectrum, we've got all kinds of complaints about

26:14

the government But then

26:17

there's America and for me America is the people Right

26:20

and you know, they're part and parcel the government the people

26:22

are kind of part and parcel of a country But like

26:24

they are different They're not they're not the same thing And

26:27

you know, we happen to be a very large country with

26:29

a very large number of very smart talented, you know driven

26:31

capable people And then I

26:33

you know, I'd also say my mental model America is

26:35

like we're just like a giant sprawling mess. Like,

26:38

you know, we're just, we're just like chaos, and

26:40

we have been you know for our entire 250 year

26:42

existence. Like, we're the place people come when they're just

26:44

like too ornery to start out where they were. You

26:47

know, they just can't tolerate it. And so we, you know,

26:49

we get the most disagreeable people from all over the world who

26:51

come here because they get to, you know, they get to basically be wild,

26:53

they get to do things that they wouldn't normally get to, you know. And

26:56

I, of course, I benefit from that because, you know, we get

26:58

so many of the good founders from all over the world

27:00

who come here to do it, because they don't think they

27:02

can do it in the countries where they grew up and

27:04

so we're we're we are America

27:07

is a country of like tremendously

27:09

talented, driven, capable, ambitious people, from

27:12

all over by the way from all over the world who

27:14

have aggregated here, and their descendants over many generations. And you

27:16

know, we've just, we've selected ourselves into the best. We've dealt

27:18

the best possible hand in terms of the quality of our

27:20

people. Like, you know, it's just extraordinary what

27:22

this country is capable of. And then

27:24

most of what the country does is not done by the

27:26

United States, it's not by the government. Most of it's done

27:29

by the people, most of it's done by America.

27:32

And you know, it's the old line

27:34

of the business of America is business, which

27:36

is this whole line from the 50s. It's just like

27:38

most of what most people do every day is they go to

27:40

work and they try to do things. You know,

27:42

they try to do things, they try to contribute. They

27:45

try to take care of their family. They try to, you know, build

27:47

their companies. They try to do a good job, you know, they try

27:49

to build good products. They try to

27:51

take care of customers. And so, you know, most

27:53

of what people do every day is actually really productive and really

27:56

helpful. And then we're just the

27:58

best. Ranked by that, we're just

28:00

the best, we're the best country. Like we have

28:02

the best combination, you know, we have this sort

28:04

of rule of law of like an advanced society,

28:08

but we have less rules than like the European

28:10

countries, for example. And then we have

28:12

like all the energy of a new country, right,

28:15

because of all the immigration and because of all the

28:17

talented people that we have. And so we're, you know,

28:19

we're kind of the, we're kind of at the sweet

28:21

spot of sort of a combination, you know, big country,

28:23

small country, old country, new country. Like we're kind of

28:26

in that, we're kind of in that sweet spot. And

28:28

so I go through that to

28:30

just say like America wants to grow, right?

28:32

The America, the country, the people, we want to

28:34

grow, we want to succeed, we want to build

28:37

great things. We want to build businesses, we want

28:39

to have economic growth, we want to have, you

28:41

know, we want to just

28:43

like shock the world with all these amazing inventions. Like

28:45

we want to do all these things. We

28:47

are held back in all kinds of ways

28:49

by the United States, but America wants to

28:51

do that. And so basically if

28:53

the government isn't too much on our throats,

28:56

the economy will naturally just grow forever. It'll

28:58

just grow in perpetuity and America will remain the best

29:01

bet, you know, globally. It'll just be the, you know,

29:03

it will remain the best market to invest

29:05

in. It'll remain, it'll produce the best, you know, the

29:08

largest number of high quality new companies and so forth. And

29:11

so the American, the American economy wants to grow. And that's

29:13

what's happened, which is, you know, we came out of COVID

29:15

and if you just like plot a chart of, you

29:18

know, American economic growth versus, you know, Europe and other

29:20

countries, it's just, you know, there we are, we're off

29:22

to the races and, you know, Germany is like, you

29:24

know, starting to shrink, you

29:27

know, and the UK, a bunch of other countries like have

29:29

severe problems. They're not able to reignite growth. The

29:31

new UK Labour government, the

29:33

Labour government just had a growth conference this week because it's now

29:36

hit such a crisis point in the UK. They don't know how

29:38

to get economic growth. And

29:40

so yeah, our economy wants to grow. It wants to do

29:42

fine. Yeah, we probably did, we probably did dodge a recession

29:44

and that's just cause the productive energies of the American people

29:46

just, you know, kicked in. You

29:49

know, it's all completely unpredictable from here, but

29:51

like, you know, fundamentally I feel really good about, I

29:54

feel really good about America. I

29:56

feel really good about the people and I feel really good about the engine

29:58

that we have. I believe, and I

30:00

forget who said it. I actually think you know because I've

30:02

heard you talk about this but inside of all of us

30:04

is a God-shaped hole and that hole

30:07

right now I think is

30:09

having a resurgence of people really

30:11

trying to re-embrace religion

30:15

from an interesting angle that's probably outside of

30:17

today's purview what we're gonna talk about but

30:19

they have a need to fill that and you're gonna

30:21

get the question of the soul so what's gonna happen

30:23

is you're gonna get somebody like me who doesn't have

30:25

kids and I'm gonna raise an

30:27

AI child that is embodied because why not

30:29

I can rush through the terrible twos I

30:32

can pause when they're seven years old for a couple

30:34

years and just enjoy that whatever I can if I

30:36

want to go to a movie with my wife I

30:39

can literally put them in the kitchen and shut them

30:41

down like it's just all of the upside and none

30:43

of the downside and then all of the sudden other

30:45

people can be like yeah that's dope and

30:48

people are either going to be in

30:50

relationships with robots romantically or

30:52

they're gonna be in a romantic relationship

30:54

with a human but they're gonna raise

30:57

AI kids and you will literally at

30:59

least for pockets because

31:01

there will be like the Amish or whatever, there will

31:03

be this sort of super producers who keep their

31:05

fertility high because cultural value says yes there

31:08

will be some that won't and

31:10

so those cultures will hit

31:12

an existential crisis based on that which

31:14

I think will cause the religious element

31:16

to really push and say you know this

31:19

is an abomination before God and we

31:21

just absolutely cannot do it so

31:24

that's where I feel like huh there's

31:26

gonna be this weird tension and then

31:28

if people are getting augmented with neural

31:30

link and obviously I'm talking these are

31:32

20-year time horizons maybe 30 maybe

31:34

50 but this is gonna play out

31:36

for somebody in the not too distant future in my estimation

31:39

and just to put one more thing in the mix you

31:42

know very well that

31:44

in backroom conversations in the government

31:46

people are asking questions should we

31:49

be prepared to do

31:51

airstrikes on data centers because we

31:53

are so worried about AI breaking

31:55

free so there's already

31:57

this ambient anxiety about it You've

32:00

got me talking like a sci-fi

32:02

writer, but it's a pretty plausible

32:04

scenario. How

32:09

do we stop that from happening? Or what

32:11

is the automatic in the human mind kill

32:13

switch that will stop that from happening? So

32:17

let's just start by saying there's a lot in there and I would love to talk

32:19

about every part of it. And by the way, we

32:21

should go as deep as you want with me anyway

32:23

on the religion stuff and so forth. Cause I agree

32:25

with a lot of the setup to the question. So

32:30

let's see how to come at this. So look, to start

32:32

with, I would say we have a crisis of meaning already,

32:34

right? And so you talk about like pop, you know, it's

32:36

talking about fertility, right? You know, Elon's been talking about this

32:38

a lot lately, but like fertility rates are

32:40

crashing all over the world. Right?

32:43

And it's actually really striking what's happening, right? Which

32:45

is it's happening across cultures, right? And so normally

32:48

when there's like something happening in America or whatever,

32:50

Europe or Japan or something, like you generally analyze

32:52

and you're like, okay, what's happening in American culture

32:54

that's causing this? Or what's happening in Japanese culture

32:56

that causes this? But like it's happening in all

32:59

those cultures simultaneously. Is it population's crack

33:01

growth is crashing here. It's crashing in Europe is

33:03

crashing to Korea. It's crashing in Japan. It's crashing

33:05

in China. And by the way,

33:07

like, you know, China, Japan and Korea have very

33:09

different cultures than we do. And they have very

33:11

different cultures between each other. Like they're really different.

33:13

Like the Japanese and Koreans are like really different.

33:16

And yet it's happening in all these sort of advanced

33:18

societies. And so I guess I would

33:20

say it's like, that's sort of a preexisting condition. You

33:23

know, we just have that. And so

33:25

that, and that's sort of a fundamental, you know, fundamental

33:27

question. We have this, you know, this question of meaning,

33:30

right? Which, you know, the God shaped hole, which is, you know,

33:32

a process that kicked off, you know, probably, you

33:34

know, basically like 150 years ago that, you know, has been,

33:36

has been playing out and, you know, people have been grappling

33:38

with that for a long time. And, you know, as, you

33:40

know, we've gone through various phases of religious revivals, you know,

33:43

boom, bust cycles with religions over the last, over the

33:45

last hundred years. When I was growing up in the Midwest

33:47

in the seventies and eighties during the, one of the great

33:49

awakenings. So the, you know, sort of comeback of evangelical

33:51

Christianity and, you know, kind of born again, they're born again,

33:54

you know, kind of phenomenon. So I remember it well. You

33:56

know, I've seen, I've seen that happen. Yeah.

33:59

So like. I think that that's all

34:01

true. That's all super important. And

34:04

then look, tech

34:07

obviously changes culture. By the

34:09

way, culture changes tech. It's a positive

34:11

feedback loop, different cultures, reactive tech in

34:13

different ways. Let's see where to

34:15

take it. I think the counter argument,

34:17

maybe the leash to put on it, I guess maybe I should start

34:19

with, if you don't mind me asking, do you have kids yet? I

34:23

don't, no. Yeah. So one of

34:25

the things that I say, one

34:27

of the things I find in my conversations with my friends

34:29

who don't have kids and then have kids

34:32

that I went through. And it's almost

34:34

like a little bit, I have

34:36

these conversations with my friends. I

34:38

work in tech and a lot of people don't have kids or they wait for a long

34:40

time. And I have this conversation where it's

34:42

like, the people with kids sound like pod people. They

34:46

sound like they got the brain fungus in

34:48

the last of us or something. It's like, oh,

34:51

you don't understand. When you have a kid, everything

34:53

changes. And my

34:55

friends are like, what happened to you?

34:57

What's wrong? You sound like you're in

34:59

a cult. And I'm like, no, no. And it's literally like,

35:01

that was me before I had my first kid. Was

35:04

like, oh, I just, whatever. I wanna live my life. I

35:06

don't know whether I want this additional responsibility. But basically, I

35:08

think this is true, it's almost a universal thing. If

35:10

you talk to parents, when you have your first kid and

35:12

you look in the kid's eyes for the first time, and

35:15

literally what you see is,

35:17

look, in the best case scenario, we've

35:20

got a blend, literally a blending of DNA. And

35:23

the person you love most in the world is

35:25

combined with you. And then the baby shows up

35:27

with these eyes and the eyes look back at

35:29

you. And it's like looking at yourself. And it's

35:31

like looking at the person you love the most

35:33

in the world. And it's like looking at

35:35

this new soul all at the same time. And

35:37

it's like a psychological reset. And

35:42

so that's like such a, it

35:44

seems so universal that parents understand

35:47

that and non-parents don't. In

35:50

fact, I have friends who are like, I don't know that I

35:52

wanna have kids. Cause it sounds like it changes your psychology so

35:54

much. Like I'm worried it's gonna ruin everything I like about my

35:56

life today. And I'm like, no, no, it makes everything better. And

35:58

they're like, but you have to. spend all your time with the

36:00

kid. And I'm like, yes, but it's the thing I want to do

36:03

most in the world. My friends are like, well, that's not what I

36:05

want, because I want to work all the time. And I'm like, you're

36:07

missing out. And it's like, you're brainwashed. So

36:09

that's a thing. I

36:13

mean, look, I fully believe people are going to have

36:15

AI pets, AI friends. They're going

36:17

to have AI, all kinds of relationships, AIs.

36:19

They'll have some form of proxy children. I

36:21

totally buy that. By the way,

36:23

that will probably be based on their information. One of

36:25

the things I think, for example, your AI kid is

36:29

probably going to be a version of you, basically trained on

36:31

your own training data. That's

36:33

terrifying. Well, so the concept actually that's starting

36:35

to take off in the tech world right now is what's

36:37

called the digital twin. So it's not the

36:39

digital kid, it's the digital twin. But the idea is, look,

36:42

for example, I haven't done this yet, but I might do

36:44

this, which is I'm not available 24-7. But

36:47

if I feed a language model everything I've ever written

36:49

and everything I've ever said, then maybe if somebody we

36:51

work with wants to ask me a question and

36:53

it's the middle of the night, they can ask my digital twin,

36:55

and they'll get back a representative answer to what I would say.

36:59

That's starting to happen. So yeah, I think

37:01

a lot of that stuff's going to happen. But

37:03

the primal relationship that you have with another

37:05

human being, and that could be another human

37:07

being you're related to, or by the way,

37:09

just another human being that you're not related

37:12

to, there's a level. We are very, very,

37:14

very deeply wired to have those relationships be

37:16

the center of our universe. And

37:18

again, like I said, there's a big issue here, which is

37:20

people aren't having kids. And

37:22

so that's not getting transmitted, and there's very big questions

37:24

that kind of flow out of that. But

37:28

it's just different. It's just

37:31

flat out different. When you have your first kid, and

37:33

certainly you should have like a dozen

37:35

kids, they'd be

37:37

great. I'm pretty

37:40

sure if we tape a show after that, like

37:42

two years later, you're going to be like, oh yeah, I don't know what I was thinking. This

37:45

is just so different. And maybe I

37:47

could- So do you think that's the kill switch? Well,

37:50

let me broaden out the answer, which

37:52

is fundamentally technology, AI, all this. It

37:54

has implications on lots of things for sure, but one of the things

37:56

that it does is it makes us richer. Like

37:59

it makes our society richer. it makes material comfort

38:01

a lot better, makes it a lot

38:03

easier, by the way, to provide for kids and family, be able

38:05

to have a higher level of material welfare. There's

38:09

this line of critique of new technology, which is

38:11

like, well, material welfare is not sufficient because it

38:13

still leaves this God-shaped hole. But the way I

38:15

think about it is, at higher levels of material

38:17

comfort, we have a better shot at figuring out

38:19

the answer to the God-shaped hole. If

38:23

you're going to be confronted with existential questions

38:25

about religion and philosophy and how to live

38:27

your life, would you rather

38:29

do that with material deprivation or

38:32

with material plenty? It's

38:34

really easy for people to say that they

38:37

would prefer to, it's like, would you rather

38:39

be a monk with a straw mat on the floor

38:42

eating bread and water, trying to figure out the meaning of

38:44

life, or would you rather be you with a

38:46

nice, like fluffy bed and

38:49

air conditioning and like artisanal

38:51

cheese from Whole Foods? I

38:54

love that that's the one you pick. You'd

38:57

much rather be you. Of course,

38:59

I'm going to have a much better chance at figuring out

39:01

the important questions in life if I'm not worried about where

39:03

my next meal comes from, if I'm not worried whether the

39:05

power is going to go out, if I'm not worried that

39:07

it's going to freeze to death overnight, if I'm not worried

39:09

that my kid is not going to have access to a

39:11

neonatal incubator, if I don't have to worry about where my income

39:13

is coming from. Of course, with material plenty,

39:15

I'm going to have a lot more capacity to

39:18

answer the deep questions. I think that's

39:20

going to be the unanticipated payoff, which is

39:23

as technology and as AI makes the world

39:25

materially better off. I

39:27

believe it increases our ability to address these

39:29

big questions, not decrease it. Hold

39:32

tight, we're going to take a quick break. In

39:36

business, flying blind is a

39:38

recipe for absolute catastrophic disaster.

39:40

Yet, most companies are doing

39:42

exactly that, making decisions based

39:44

on intuition or

39:47

outdated information. If that

39:49

sounds familiar, let me introduce you

39:51

to NetSuite by Oracle. It's like

39:53

giving your company X-ray vision, letting

39:55

you see every aspect of your

39:57

business in real time. It puts

39:59

your accounting, finances, inventory, and

40:01

HR into one seamless,

40:03

beautiful system. No more juggling multiple

40:06

tools or guessing at the big

40:08

picture. This means you make decisions

40:10

based on facts, not hunches. Success

40:13

in business isn't about predicting

40:15

the future. It's about being

40:17

ready for whatever comes. NetSuite

40:19

gives you that readiness. Download

40:21

the CFO's guide to AI

40:24

and machine learning at netsuite.com/theory.

40:26

The guide is free to

40:28

you at netsuite.com/theory. Again,

40:31

that's netsuite.com/theory.

40:33

The nutrition secrets of Olympic athletes and

40:36

Fortune 500 CEOs are no

40:38

longer off-limits to all of us. Here's

40:40

how you can tap into that power.

40:43

When I see a company collaborating with experts

40:45

I trust, I take notice and that's

40:47

why I'm excited to tell you guys

40:49

about Momentus. Momentus has partnered

40:51

with Andrew Huberman and his lab to

40:53

develop a suite of supplements that target

40:56

the core pillars of health. Sleep, cognition,

40:59

focus, physical performance, hormone support,

41:01

and longevity. In an industry

41:03

full of hype and empty

41:05

promises, Momentus is different. Every

41:08

product is NSF and Informed

41:11

Sports Certified. That means that what's

41:13

on the label is actually all

41:15

that's in the bottle. No guesswork,

41:17

no fillers, just pure high-quality ingredients.

41:19

Whether you're an athlete or an

41:21

entrepreneur or just someone committed to

41:23

living your best life, Momentus is

41:25

worth your attention. Here's your action

41:27

plan. Go to livemomentus.com

41:29

and use code impact

41:31

for 20% off. That's

41:35

livemomentus.com with

41:37

code impact. AI

41:39

might be the most important

41:41

new computer technology ever. It's

41:44

storming every industry and literally billions

41:46

of dollars are being invested. So

41:48

buckle up. The problem is that

41:50

AI needs a lot of speed

41:52

and processing power. So how do

41:55

you compete with costs that are

41:57

spiraling out of control? It's

41:59

time to upgrade to the next

42:01

generation of the cloud, Oracle

42:03

Cloud Infrastructure or OCI. OCI

42:06

is a single platform for

42:09

your infrastructure, database, application development

42:11

and AI needs. OCI

42:14

has four to eight times

42:16

the bandwidth of other clouds,

42:18

offers one consistent price instead

42:20

of variable regional pricing. And

42:22

of course, nobody does data

42:25

better than Oracle. So now you

42:27

can train your AI models at twice the

42:29

speed and less than half the cost of

42:31

other clouds. If you

42:33

wanna do more and

42:36

spend less like Uber,

42:38

8x8 and Databricks Mosaic,

42:40

take a free test

42:43

drive of OCI at

42:45

oracle.com/theory. Again, that's oracle.com/theory,

42:48

oracle.com slash

42:51

theory. All

42:53

right, let's pick up where we left off. Yeah,

42:55

so I'll agree with you there, but

42:58

there's one division that I'm gonna make, which

43:00

is the reason

43:02

that religion is so impactful is

43:04

because it addresses every

43:07

intellectual, every person

43:09

on the intellectual spectrum. So

43:11

when I went through

43:13

a phase where I was trying to explain to

43:15

people, hey, think like this, act like this, it

43:18

will make your life better, these ideas just radically

43:20

changed me. And I found

43:22

that largely because as people age,

43:25

they're just not able to be as intellectually

43:27

nimble, but you also run into the reality

43:29

that some people do not have the intellectual

43:31

horsepower. Whenever I talk about this, I wanna

43:33

remind people, it's entirely possible I fall below

43:35

the line. I'm perfectly willing to accept that,

43:37

but you have to understand that there are

43:39

dumb people that cannot process some of these

43:41

ideas. And so religion becomes this catch-all for,

43:44

hey, this is how you live a good

43:46

life. And it will speak to highly intelligent

43:48

people and it will speak to people who

43:50

are just gonna follow the Ten Commandments. I

43:52

mean, the Ten Commandments are basically the Bible's

43:55

TL;DR, right? So it's like, hey, don't worry about

43:57

reading that, just here are the 10 things, go do

43:59

these 10 things and you're gonna be fine, done

44:02

in a story format, so it really speaks to

44:04

people. So I don't

44:06

think this sort of intellectual approach to hey, this is

44:08

why AI is gonna be great for you, and in

44:10

the future it's gonna solve all these problems. What's going

44:12

to happen as a punctuated moment? I think on a

44:14

long enough timeline, this is all great and it's wonderful

44:17

and it brings about an age of abundance. So,

44:19

but I'm talking about the punctuated moment where

44:21

people start losing their jobs and

44:24

they don't wanna make the transition. People

44:27

get the sort of warmth and

44:29

comfort from religion, they're being drawn

44:31

back into it. I don't know if

44:33

the data will support this exact statement, but this

44:35

feels accurate, that people are

44:37

coming back into religion in sort of really

44:43

large numbers, like higher numbers than

44:45

in recent memory. I'm not saying ever in

44:47

human history, but locally, time-wise.

44:49

And so we've got this massive influx into

44:52

religion right now. You've got this massive thing

44:54

that's gonna disrupt all the things that religion

44:57

is gonna talk about, taking

45:00

care of people, the soul, a

45:02

connection to God, the afterlife, all

45:04

these things that AI

45:07

and robotics are going to

45:09

challenge. And now I think

45:11

you have this collision of people that

45:13

aren't able to navigate intellectually the nuance,

45:16

it becomes problematic and I think that is

45:18

gonna have to be addressed. Now let's take

45:20

the super boring version of this and it

45:22

just plays out as regulatory capture and

45:25

the government's just like, nah, my constituents don't want

45:27

it. It gets mired, it gets super bogged down

45:30

and now everything gets caught up in red tape

45:32

and the thing that I can already feel happening

45:34

now where there's just so much regulation that it's

45:36

hard to move forward at the rate we could, say, back

45:38

when I was a kid, that

45:41

gets exacerbated. That's my sort of

45:43

mundane vision of how this

45:45

plays out, but I don't see a world in

45:48

which it

45:51

just all happens in a sunny

45:53

rosy way. Do you? I

45:58

don't know, it's complicated. So look, I'm

46:00

a techno optimist, not a techno utopian. And

46:03

so I start by saying a couple of things, which

46:05

is, I don't think technology answers all

46:08

these questions. And so I

46:10

don't think technology, or for that matter economic growth, gives answers

46:12

to most people for meaning. And

46:15

so I don't think any of this is a

46:17

substitute for religion. And so from

46:19

that standpoint, I maybe have a little bit of humility just

46:22

on the scope of the importance of what we do out

46:24

here. So, and like I said, I

46:26

think even in a world of technological

46:28

abundance and economic abundance, material welfare, I

46:30

think the big questions of meaning are

46:32

still open questions. And so, like

46:35

I will hesitate to make sweeping claims on

46:37

that. Yeah,

46:41

I guess I've just, maybe the other way to

46:43

come at this, maybe the other way to think

46:45

about this is to talk more about the

46:47

religion side. So my take on religion, I completely

46:49

buy religious revivals and I think we're actually in

46:51

quite a religious time right now,

46:53

which we should talk about. For

46:56

example, politics have become a

46:58

branch of religion. We've

47:01

invented a whole series of secular religions in the

47:03

last 150 years and we continue

47:05

to do that. And so the sort of form and shape

47:07

of religion keeps playing out even if they don't have sort

47:11

of supposedly supernatural kind of elements to them. And

47:14

I'm completely open to the idea of, like I

47:16

said, I lived through a fundamentalist religious revival. I'm

47:18

completely open to more of those. Those clearly are

47:20

happening at various places in the world. Yeah,

47:24

so I

47:26

will certainly grant all that. That

47:28

said, like

47:31

we moderns and postmoderns, like

47:33

we don't relate to religion the way

47:36

that people did back before our times.

47:39

So like the further you go back in history,

47:41

and for sure this was true like 150 years

47:43

ago and back, the

47:48

relationship that people had with religion was different

47:50

than they have it today. And

47:53

I'm going to go way down the

47:55

rabbit hole in this, but basically for

47:57

most of recorded human history, religion was

48:00

not an a la carte thing. It was something

48:02

that was a very deep part of who you

48:04

were as a person. And specifically,

48:06

they had a concept

48:08

of peoplehood. There was a people

48:10

and the people would have shared genetics, all

48:13

be related to each other, the people would have

48:15

shared culture, the people would have

48:17

a shared place, right, you know, their own

48:19

land. And then they would

48:21

have religion, and those concepts were all

48:24

conjoined. There's this great book

48:26

called The Ancient City that goes through basically the prehistory

48:28

of Western civilization. It goes through basically what are

48:30

called the old Indo-European religions and cultures, you know,

48:32

that sort of ultimately resulted in the Greeks and the

48:35

Romans and then Christianity. So it sort of goes

48:37

all the way back to the beginning of basically like

48:39

how Western societies formed. And it

48:41

was basically a three part structure, it was family, it

48:43

was tribe, and then it was city. And

48:46

then these concepts of shared

48:48

kinship, genetics, shared culture, shared religion

48:50

and shared geography were all conjoined.

48:53

And if you told somebody in that era

48:55

that, you know, oh, you can switch religions,

48:58

they would have considered you completely insane. Because

49:02

being of that religion with those

49:04

gods was precisely tied to these

49:06

other factors of culture, genetics, and

49:09

place. Of course, in our society,

49:11

we have completely disconnected those things. You know, if I

49:13

go out in public today, and I'm like,

49:15

no, I'm a part of a peoplehood where I have

49:17

shared genetics, culture, religion, and place, and I'm going to

49:20

have, you know, an ethnostate for German, Dutch, you know,

49:22

people in the Midwest, like, you know, obviously, I get

49:24

instantly tagged as a white supremacist, and like I get,

49:26

you know, shunned and ostracized from society, by the way,

49:28

I'm not proposing that, I don't want that, just for the

49:30

record. Right. And so we

49:34

live in a different time, we have abstracted

49:36

religion away from those other things. And kind

49:38

of to your point, actually, as

49:40

a consequence of that, we can now choose our

49:42

own religion, right. And as a modern

49:44

Westerner, you or I are completely free tomorrow to become

49:47

a, you know, Catholic or a Baptist or Jewish

49:49

or Muslim or whatever we want, or by the

49:51

way, to make up our own religions, and by the

49:53

way, proselytize and go try to get followers. And

49:56

you know, we call those cults, and people do that all the time, and

49:58

we, you know, I would argue we live in a world of cults, and

50:00

we've got all these new cults out here in California, and some

50:02

of them are, by the way, super involved in AI, so it's

50:04

a thing. So, but

50:07

religion has become a la carte. It's like the old

50:09

choose your own adventure books you might've had when you were

50:11

a kid. You can basically design the religion that you want.

50:14

And so, on the one hand,

50:16

you would say, oh, well then this is going to

50:18

be a time of tremendous invention of religious concepts and

50:21

religious behaviors. And by the way, I believe that's true. I

50:25

do think that's happening. On the

50:27

other hand, is this like, okay, is religion

50:29

going to control our lives in the way

50:31

that it did back when that concept was

50:34

conjoined with genetics, culture, and place? It's

50:38

hard, like, we just don't take religion that

50:40

seriously anymore. We could choose

50:42

to take it seriously again if we want to, but just

50:44

observationally, we don't. And

50:47

when it becomes inconvenient, we

50:49

change, right? I'll

50:51

be, I'm going to run something by you. Tell me

50:53

how this lands. I know you have a broad historical

50:55

context, so also being a student

50:57

of history, I hesitate to say this, but I

51:01

have a hypothesis that the

51:03

religious impulse plays out at the same

51:05

volume no matter what. It just becomes

51:07

a question of what is

51:09

the religious impulse aimed at? So for instance,

51:11

as a game developer, I

51:14

am constantly awestruck by how

51:16

toxic the communities can become.

51:19

And so I sat down one day and I

51:21

was like, what on earth is going on here?

51:23

And I realized this is the religious impulse that's

51:25

being met by a video

51:27

game. So you are communing with the other players.

51:29

You are committing a ton of your time to

51:32

this. You are giving yourself over to this

51:34

game. You care about the lore. You care about

51:36

the time that you've invested into it. I mean,

51:38

this is a level of belonging

51:40

to a game and

51:43

a game community that you would only

51:45

have gotten historically as a part of

51:47

either a town, a family, or a

51:49

religion. And so it meets that criteria.

51:51

And so when you have this sense

51:53

of tremendous belonging and

51:56

you, as the game developer, go in and

51:58

mess with their thing, The easiest way

52:00

to explain it is imagine I could

52:02

go in and mess with the rules of

52:04

football without consulting anybody. And tomorrow you roll

52:06

up and it's just different. And

52:09

now the player that you loved is no longer a good player

52:11

and you don't really like it anymore, it doesn't speak to your skill

52:13

set. People would be outraged. Like my dad

52:15

was into this team, my dad was into this game and

52:17

I was raised on it and now I'm here and

52:20

you changed it and you're trash. And that's

52:22

basically what happens. Now, if

52:24

I'm right that that's riding on the

52:26

neurological architecture

52:29

that makes religion so powerful.

52:33

It's like, hey, that volume is still dialed to 11.

52:35

Now hopefully nobody's going to

52:37

go kill in the name of their

52:39

favorite video game. But I think that's

52:41

a narrative question and not an

52:44

architectural question. So if I were to

52:46

get people to believe that investing

52:48

in this video game, like a cult,

52:51

somehow meant something about you and society. And we

52:53

were all fighting for the, you know,

52:55

insert politics here, and you get how suddenly

52:58

with the right narrative, whoa, like

53:00

people will go. And that's another area I

53:02

think politics right now is triggering

53:04

the religious impulse. So I don't

53:06

think the volume is dialed down. Even if we

53:08

quote unquote don't take religion as seriously, I

53:10

think the outcome is going to be the same because this

53:12

is the architecture of the human mind. Yeah.

53:16

So I a hundred percent agree with everything you

53:18

said. I just interpret the consequences of it differently,

53:20

which is imagine

53:22

telling an

53:25

Athenian Greek or a Roman or a Christian

53:27

in 300 AD or a Christian for that

53:29

matter in 1800 AD that your religion now comes

53:33

as a video game. They

53:36

would have thought you completely lost your mind. Right?

53:38

Like, wait a minute. Like you've

53:41

now taken that entire religious impulse, which is every bit

53:43

as strong as it was. And you've

53:45

like now applied it to a video game. Like you're

53:47

like, you have completely disconnected

53:49

the importance of religion from reality from

53:53

like actual physical reality. Like it no longer is

53:55

relevant to you in terms of like the shape

53:57

and form of any aspect of like your actual,

53:59

anything, any traditional concept of community, city

54:01

environment, anything like that, family, by the way, does

54:04

it guide your decisions about like, you know, things

54:07

like reproduction, children, you

54:09

know, are you indoctrinating your kids? By the

54:11

way, maybe you are, maybe you're indoctrinating your kids in World of Warcraft,

54:13

but like indoctrinating your kids in World of Warcraft is like, that's

54:16

not the same as like indoctrinating your kids in

54:18

Catholicism, like that's World

54:21

of Warcraft. It

54:23

may be equally intense, but it's not as comprehensive

54:26

an impact on the worldview of how people live

54:28

their lives. And so I

54:30

just, I agree with you, but I just think that

54:32

leads to like tremendous amounts of displacement.

54:36

But then also let me say, I really agree with your

54:38

last point, which is the politics point, which I think is

54:40

something that is extremely important because it, you know, especially sitting

54:42

here today, three weeks before, you know, a very big election. Something

54:46

that I often point to when I talk to people about

54:48

this is, if

54:51

you look at the charts of, you know, the

54:53

big general population surveys of, would you be comfortable

54:55

with your kid marrying somebody of a different X?

54:59

You know, there's the famous chart of

55:01

a different race and, you know, whatever, 60, 80

55:03

years ago, that was like 90% uncomfortable, today it's

55:05

like 10% and

55:07

falling. Somebody of a different,

55:09

and then another one would be somebody

55:12

of a different religion. And if you had polled people 80 years ago,

55:15

when they polled people on this, like Catholics, Jews,

55:17

Protestants, all were like, no way, you know,

55:19

you're not marrying outside the faith. And today,

55:21

at least like in the US, very

55:23

few people care. And so like that chart is like way

55:25

down. The chart of, do you care

55:28

if your kid marries somebody of

55:30

the other political party? That chart is up

55:32

and to the right. And

55:35

so to me, that maps exactly to what you

55:37

said, which is, yeah, so politics has become our

55:39

religion. There was actually a very, very

55:42

important thinker, writer in the 20th century,

55:44

Eric Voegelin. And

55:46

he was, he's the best writer I found

55:48

on this topic. And

55:51

he basically started his work actually in the 30s and

55:53

40s. And he was basically

55:55

trying to explain at the time, the rise of both

55:57

communism and fascism. And

55:59

he's like, wow, you know, these people are crazy.

56:01

Like these people are really extreme. And then he's

56:03

like, all right, like what is leading, you know,

56:06

Bolsheviks on the one hand and like, you know,

56:08

Nazis on the other hand, to be like this,

56:10

you know, sort of fevered and enthusiastic about these,

56:12

like incredible, you know, these incredibly high, you know,

56:14

kind of impact social movements with all these consequences.

56:18

And so he basically developed a theory very consistent with what

56:20

you said, which is, you know, which he called, I think,

56:22

you know, political religions. And he

56:24

did the mapping and basically said, like, these are

56:27

direct, these are in fact direct stand-ins for

56:29

religion and Christianity, actually, both Christianity and

56:31

Nazism, sorry, both communism and Nazism were legendarily

56:33

very hostile to Christianity, you know, precisely for

56:35

that reason, because Christianity was the threat,

56:38

they were, you know, quite literally trying to

56:40

displace the, you know, the dominant religion in

56:42

Europe at that time. And

56:45

so, you know, again, like exactly, you're

56:47

right, I think the impulse is with

56:49

us. I think many, you know,

56:51

both Republicans and Democrats in the US today exhibit

56:54

that exact that exact same kind of religious behavior

56:56

around their politics. You

56:58

know, on the one hand, it can sound, I think, patronizing

57:00

to say that, because, you know, people think that their politics

57:02

are all carefully thought through, they don't think they're doing it.

57:05

But, you know, politics are important to people in the

57:07

same way that religion is and was important to

57:09

people. And so, you know, they're certainly acting, you know,

57:11

like it, and they certainly point in their politics

57:13

to how political choices are going to affect how people

57:15

live, which is very consistent with the view of a

57:18

religion. Yeah, and so I

57:20

think they're displacing that religious energy into politics, I

57:22

think if they displace that religious energy in a

57:24

video game cults, like that's probably

57:26

an improvement. Maybe,

57:30

maybe. It's certainly more benign, I

57:32

think, for the reasons that you

57:34

said earlier. So what

57:36

does the religious impulse done well look like?

57:39

So there's obviously just funneling it into a

57:41

traditional religion that's lasted for thousands of years,

57:43

probably going to be fine. But

57:46

given that a lot of people are not doing that,

57:48

how can you do that well? Yeah,

57:51

so the anthropological view of religion, I think, is

57:53

it's about group formation and cohesion, right?

57:55

And this is the role that The Ancient City talks about,

57:57

like this is the role that religion played. So the

58:00

original form of this, in sort of prehistory, the

58:04

original form of this was we've

58:06

got the family, which is like

58:08

basically the extended family up through

58:10

cousins. And by the way, cousin marriage, you marry your

58:12

cousins. And so you try to keep the family in

58:14

the family. And then

58:16

the family has its gods. And

58:19

then over time, the families aggregated, the clans

58:21

aggregated up into tribes, which consisted of multiple

58:23

families. And then the tribe would have its

58:26

gods. And then the tribes

58:28

would aggregate up into the cities and the cities would

58:30

have their gods. And so as

58:32

a member of a city, you had three tiers

58:34

of gods that you basically were required to basically

58:37

to worship and to honor. And you literally

58:39

had it with the hearth, you had the fire, the permanent

58:41

fire, and you had to keep the fire lit and you

58:44

had to sacrifice to the gods and so forth. And

58:47

then the original morality of it

58:49

was if you meet somebody from

58:51

another family, tribe, or city, they

58:56

worship different gods, right? They

58:59

have their own gods. And so your gods

59:01

are inherently at war with their gods and your

59:03

moral obligation is to kill them on sight. That's

59:07

aggressive. Right, which literally, right,

59:09

it was literally like kill them on

59:11

sight. So had

59:13

you told people from that era, from

59:16

those many centuries, no,

59:18

you're supposed to be tolerant of people from other religions, they would

59:20

have said, are you out of your mind? They're a threat. If

59:22

we don't kill them, they're gonna kill us. We

59:25

kill them on sight. And so it was like, you know, it's

59:27

like the concept of human rights is like a 180-degree inversion from

59:29

like the original form of society. By the

59:31

way, a big improvement, I think, but

59:33

a very, very big inversion. And

59:36

so like at sort of the most

59:38

fundamental level, so why don't I go

59:40

through that? At the most fundamental level, what's religion for?

59:42

It's for group cohesion. Why did

59:44

it work that way? It's because that's what maximally bonded

59:46

the family, the tribe and the city together at a

59:48

time when physical survival was very much up for grabs,

59:51

right? Like, is the family, the tribe, the city gonna make

59:53

it through the year? TBD. Is

59:55

there gonna be a famine, a flood, a mudslide, you know,

59:58

a volcano eruption? Is another tribe gonna come over and kill

1:00:00

you? Are you going to run out of food? Those are all

1:00:02

very important questions. The entire

1:00:04

tribe, city, had to really pull together

1:00:06

for physical survival. And so religion

1:00:08

was like the bonding element that pulled together a

1:00:10

group. And I would argue, fast forward to today,

1:00:12

that's exactly the behavior you see in video games.

1:00:15

Which is, when someone is a member of a video game cult, they're

1:00:19

not just an individual. They're not acting as an individual. Inevitably,

1:00:21

they're acting as a member of a group. And

1:00:24

it's group cohesion. And then I also

1:00:27

apply the Jonathan Haidt theory here, coming

1:00:30

from psychology, which is, he

1:00:32

has this great line he talks about in the book, The

1:00:34

Righteous Mind, where he says, he uses the word morality, but

1:00:36

you can basically, equivalently, I think, use the word religion.

1:00:40

He said, morality binds and blinds. Which

1:00:43

is to say, a shared morality or a

1:00:45

shared religion, it binds people together into a

1:00:47

group. It identifies us versus them, friend versus

1:00:49

foe, in the way that it did

1:00:51

also in prehistory. And then he said, and this is really important,

1:00:53

the other part is it blinds. It

1:00:55

sets up a knowledge framework, a

1:00:58

perception framework by which you emphasize confirming

1:01:00

information that's good for your group, and

1:01:02

you dismiss disconfirming information that's bad for

1:01:04

your group, and you literally

1:01:06

become blind. Right? And

1:01:09

to the point, and you see this today with

1:01:11

Republicans and Democrats, where generally the more passionate the

1:01:13

Republican or Democrat, the less able

1:01:15

they are to articulate the other side's point

1:01:17

of view correctly. Right?

1:01:21

The less able they are to steelman the other side's

1:01:24

view, which means that they're literally giving up, in psychological terms,

1:01:26

they're giving up what's called theory of mind. They're giving up

1:01:28

the ability to understand what it's like in somebody else's shoes

1:01:30

because it's more important to be a member of the group

1:01:32

than it is to be able to understand the other. Anyway,

1:01:36

so this is all very much in support of what

1:01:38

you're saying, like these are very fundamental primal behaviors. I

1:01:42

think that they're very important today in

1:01:44

our society as much as ever,

1:01:46

which you see in the politics. And then,

1:01:48

I think they're gonna be equally important hundreds of years from

1:01:51

now. Hopefully this

1:01:53

impulse gets channeled in productive

1:01:56

directions. Yeah. Yeah,

1:01:59

we'll see. So, Kai-Fu Lee has

1:02:01

talked about how we could experience

1:02:03

up to 50% job displacement.

1:02:07

It's not like there won't be

1:02:09

new jobs, but you're gonna have

1:02:11

a very substantive percentage of people

1:02:13

that are either just temperamentally or

1:02:15

age-wise unwilling to make a change.

1:02:18

Societally, how do we handle that? Yeah,

1:02:20

so I don't think that's true at all. So

1:02:23

I just, yeah. Same one? Yeah, so

1:02:25

that's the classic. In economics, that's what's

1:02:27

called the lump of labor fallacy. And

1:02:29

by the way, Kai-Fu is a very bright guy, so

1:02:32

he may well be right on this. But what any

1:02:34

economist will tell you is it's a fallacy, and it's

1:02:36

actually a fallacy at the heart of Marxism, at

1:02:39

the heart of socialism. And it's a very intuitive fallacy.

1:02:41

It's one that people fall into very easily. It's

1:02:44

called the lump of labor fallacy because, and there's

1:02:46

like a big Wikipedia page on this, people

1:02:48

can read. The lump of

1:02:50

labor fallacy basically is there's a certain amount of labor being

1:02:52

done in the world today, right? And

1:02:55

that labor is either going to be done by people or

1:02:57

it's gonna be done by machines. And

1:02:59

if it's done by people, then they're gonna make money by doing it,

1:03:01

be able to provide for themselves, and if it's done by machines, then the

1:03:03

people are gonna become unemployed and they're gonna be screwed. And

1:03:07

what's interesting about this fallacy is this has

1:03:09

been a fallacy that literally has been in

1:03:11

place in basically political thought and

1:03:14

sort of Marxist economic thought, socialist economic thought for

1:03:16

like 300 years. The

1:03:18

Marxists really kind of packaged it up and

1:03:21

turned it into a religion actually. But

1:03:24

this is kind of the pervasive thing. This

1:03:26

was sort of the immediate kind of concern,

1:03:28

panic at the very beginning of the Industrial

1:03:30

Revolution, which was you were gonna

1:03:32

have machines that were gonna substitute for human labor, that were gonna

1:03:34

immiserate everybody. This actually is

1:03:36

sort of embedded in a lot of myths and

1:03:38

legends that we kind of have

1:03:40

in our kind of cultural DNA. There's

1:03:43

a famous, I don't know if you've heard about it,

1:03:45

there used to be, or is a famous ballad song

1:03:47

of the myth of this figure, John Henry. And

1:03:51

kids are often taught this song, it's

1:03:54

John Henry, the steel driving man. And

1:03:56

the idea was it's the guy, this

1:03:59

would be like when the... railroads are getting built. Like,

1:04:01

so this is like the guy who's like using a

1:04:03

hammer to drive spikes into the rail bed to put

1:04:05

railroad tracks down, which used to be something people did

1:04:07

by hand. And it was this

1:04:09

thing where, you know, John Henry's like

1:04:11

the famous guy who can drive in the most spikes. And

1:04:14

then one day the foreman shows up with a machine that

1:04:16

drives in the spikes, and they

1:04:18

have a contest where John Henry competes with the machine,

1:04:20

to see who can drive in the most spikes. And it

1:04:22

turns out John Henry wins the contest and then drops

1:04:24

dead from a heart attack. Some

1:04:27

kind of symbolic, you know, the last gasp of

1:04:30

human effort before the machines take over. And

1:04:32

that dates back to like, I don't know, like 1870, right? So

1:04:35

that's like 150 years ago, people had this fear. And

1:04:38

then basically what we've had is we've had 300 years of

1:04:41

modern technology, industrialization, automation,

1:04:43

computerization, literally three centuries

1:04:45

now. And sitting here today,

1:04:47

there are more jobs than ever in the world

1:04:49

and at higher

1:04:51

wages for people, right? And

1:04:53

so in practice, what's happened is we now have

1:04:55

three centuries of evidence that basically that's a fallacy.

1:04:57

That's actually not what happens. What happens actually is

1:05:00

the opposite, which is technology creates

1:05:02

far more jobs than it destroys and creates

1:05:04

jobs that are better, right? At higher levels of income.

1:05:07

And so we adopt those jobs. Like are there

1:05:10

going to be people that just get left behind?

1:05:13

There will be some, and look, there is some of that. And

1:05:16

I should also back up for a second and say conversations

1:05:19

about this topic, it's very easy to come

1:05:21

across, in my experience, talking about myself, it's

1:05:24

very easy to come across as judgmental and

1:05:26

patronizing. Because it's very easy to come

1:05:28

across as basically saying, so like one of the things that I will

1:05:30

claim, and what we're

1:05:32

about to talk about is that there are some jobs that are better

1:05:34

than other jobs. Some

1:05:36

jobs are just better jobs. They're like,

1:05:39

they're physically less taxing, they

1:05:41

pay better, whatever. But there may

1:05:43

be a bar to be able to get those jobs or people

1:05:45

may not want to do those jobs. And

1:05:47

so people can get

1:05:49

very resentful with the idea that they have to give up what they

1:05:51

have in order for the prospect of something that might be better but

1:05:53

maybe they don't want it. And who

1:05:55

are these experts on TV or on the internet to tell them if

1:05:57

they should think in these terms? So I should start by

1:06:00

saying, look, people are going to have a

1:06:02

lot of reactions. People always have, look, a

1:06:04

lot of our politics for the same 300

1:06:06

years have been around this process of

1:06:08

industrial change and then therefore job

1:06:10

change. And it's like the rise of unions.

1:06:13

And there's all these things that happen in our politics as a

1:06:15

consequence of these fights. And so I

1:06:18

should just start by saying, you need to be

1:06:20

able to talk clinically about this because you do need to be able to

1:06:22

talk about the big issues. I do recognize that

1:06:24

it's very easy to come across as patronizing. I also recognize

1:06:26

that people are going to have different points of view on

1:06:28

this. Some people are going to struggle, for

1:06:31

sure. Look, when the car came along,

1:06:33

blacksmiths were not happy. Because all

1:06:35

of a sudden you don't need as many horses. They were

1:06:37

not happy. Now, many blacksmiths became car

1:06:39

mechanics, but many blacksmiths maybe didn't want to

1:06:41

become car mechanics and got very upset and

1:06:43

resentful about that.

1:06:45

Yes, all of the above is going to happen. Having

1:06:48

said that, the basic mechanism of

1:06:50

introducing new technology into an economy

1:06:53

is not job destruction. The basic

1:06:55

mechanism is job creation, net job

1:06:57

creation, overwhelming the job destruction. And

1:07:00

the reason for that has to do with this concept of productivity

1:07:02

growth. And so the

1:07:04

concept of productivity growth is very important. So the

1:07:06

concept of productivity growth is the economic measure of

1:07:08

the impact of technology in

1:07:10

an economy. And basically what it means is the

1:07:13

ability to generate more output with less input. And

1:07:16

so, to use the John Henry example, can

1:07:20

I put more spikes in the rail bed

1:07:22

to build railroad tracks faster at the

1:07:24

same cost level? Can

1:07:27

I build more cars at lower prices?

1:07:29

Can I provide, can I

1:07:31

make more video games, more video game levels at

1:07:33

lower prices? In any industry, there's

1:07:35

always this question of, how much am I producing today?

1:07:37

And then can I produce more output at lower cost?

1:07:40

And it's what every business logically wants to do. They

1:07:42

want to expand output, and they want to reduce costs.

1:07:45

And so productivity growth is the metric

1:07:47

by which economists track the impact of

1:07:49

technology on the economy. And

1:07:52

this is very important. The faster

1:07:54

the rate of productivity growth, the faster the rate of

1:07:56

economic growth, the faster

1:07:58

the rate of productivity growth, the more prices of

1:08:00

current goods and services in the economy fall,

1:08:03

because if you're able to produce more with less, then

1:08:06

prices come down. And

1:08:08

so just take food as an example. Food today is far cheaper

1:08:10

than it was 200 years ago, because

1:08:12

of all the automation. And so to buy an

1:08:14

avocado 200 years ago would have

1:08:16

cost the modern day equivalent of $100. And

1:08:20

now it's $1. And

1:08:23

so productivity growth leads to declines

1:08:25

in prices. Declines in prices lead

1:08:28

to an increase in spending power, because

1:08:31

if as a consumer I pay less

1:08:33

for the things I'm already buying because

1:08:35

of productivity growth, then spending power is

1:08:37

being unlocked. Without me even getting

1:08:39

a raise, I have new spending power. And

1:08:42

then that new spending power leads

1:08:44

to the creation of new products, services,

1:08:46

and industries and jobs to fulfill

1:08:49

that all of a sudden I can spend on. So

1:08:51

what I'm describing is this is the basic mechanism

1:08:54

of technological adaptation of an economy. And it's a

1:08:56

basic mechanism of economic growth.
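
[Editor's note: to make the chain Marc describes here concrete, the following is a minimal, illustrative sketch in Python. It is not from the conversation; the budget figure and the simple pricing assumption, where a productivity gain g lets the same basket be sold at roughly 1/(1+g) of the old price, are editorial assumptions purely for illustration.]

# Illustrative sketch only (editorial, not from the episode): productivity
# growth -> lower prices for the same basket -> freed-up spending power.
def freed_spending_power(budget: float, productivity_gain: float) -> float:
    """How much of a fixed budget is freed when existing goods get cheaper.

    Assumption: a productivity gain g means the same basket can be produced
    and sold at roughly 1 / (1 + g) of its old price.
    """
    new_cost_of_same_basket = budget / (1.0 + productivity_gain)
    return budget - new_cost_of_same_basket

if __name__ == "__main__":
    household_budget = 1000.0  # assumed spend today on existing goods
    for gain in (0.05, 0.25, 1.00):  # 5%, 25%, 100% productivity growth
        freed = freed_spending_power(household_budget, gain)
        print(f"productivity gain {gain:>5.0%}: "
              f"${freed:,.2f} freed for new goods, services, and jobs")

[Under these assumptions, a 25% productivity gain on a $1,000 budget frees about $200 of spending power, which is the new demand for new products, services, and jobs described above.]
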

1:08:59

And theories like the one that

1:09:01

you mentioned, theories by which

1:09:03

the introduction of technology has an immiserating

1:09:05

effect as compared to a cornucopian effect

1:09:08

historically have not played out well because

1:09:10

that's not actually how this works, which

1:09:12

is why the socialists are perpetually disappointed. It's

1:09:15

like every socialist is super pissed all the time

1:09:17

because capitalism works so well. It's

1:09:19

really annoying that we live in

1:09:21

a time of material plenty after all of this runaway

1:09:23

capitalism. It's Boris

1:09:25

Yeltsin in the American supermarket in 1991, just

1:09:28

completely shocked at how much food there is.

1:09:31

It's just like shit, they lied to us. The

1:09:33

communists lied to us about how to do

1:09:35

this. Anyway, so

1:09:38

we can go into any aspect of this you

1:09:40

want to in detail. But basically, I'm completely convinced

1:09:42

that's exactly what's going to happen here. If

1:09:44

AI works the way that we're imagining what's going to happen is

1:09:47

productivity growth is going to take off. Prices

1:09:50

of current goods and services are going to fall. Volume

1:09:53

is going to expand. More people in the

1:09:55

world are going to be able to buy all the things that they want to

1:09:57

buy. But also, it's going to unlock a

1:09:59

lot of new spending power, and that spending power is then

1:10:01

going to create demand for new industries, right? It's

1:10:03

going to unlock demand that we're

1:10:05

going to be able to satisfy by

1:10:07

producing and buying many new things. And,

1:10:09

you know, our future digital children, AI children, a hundred

1:10:12

years from now, sitting here like this, are going to

1:10:14

have a podcast saying, can you believe

1:10:16

that our human parents had this fallacy where they didn't think

1:10:18

that this is going to turn out this way? Cause like

1:10:20

it always did. And it did again. And

1:10:22

so anyway, so that's why I'm so optimistic about this. I

1:10:25

love it. For people that don't know you,

1:10:27

um, he wrote a document basically saying technology

1:10:29

is going to save us all, that he

1:10:32

went through in detail on a lot of

1:10:34

these points, very counterintuitive coming out of the

1:10:36

Bay Area for sure. Um, one

1:10:39

thing, I wouldn't say it's going

1:10:41

to save us all. So I would say I'm an optimist, not

1:10:43

a utopian. And so, and this is very important, it goes

1:10:45

back to where we started, which is, everything

1:10:48

I just described does not answer all of life's deep questions,

1:10:51

right? Like it's not enough to just

1:10:53

have material welfare. Like I'm a hundred percent on

1:10:55

that. But like having a

1:10:57

material welfare is better than not having material welfare.

1:11:00

Right. And it's the best starting point to be able

1:11:02

to answer the big questions. And so I just wanted

1:11:04

to qualify that. I am

1:11:06

actually myself not proposing a new religion.

1:11:09

Mark, this has been incredible. Where can people

1:11:11

follow along with you? Oh,

1:11:13

good. Uh, so I am on Twitter, now

1:11:15

called X. Um, I am on there as

1:11:17

PMARCA. Um,

1:11:19

that is probably one of my main presences. And then

1:11:22

I have a Substack, um, which is linked to

1:11:24

from the Twitter account. Um,

1:11:26

and then we have a YouTube channel. Um,

1:11:28

and my partner Ben and I have a

1:11:30

YouTube show, uh, that we do intermittently. Um,

1:11:33

but we get good feedback on. So, uh, maybe we can

1:11:35

link to that. Awesome

1:11:38

guys. I can definitely vouch for his content.

1:11:40

It is, uh, amazing. I hope

1:11:42

you guys will check it out. Speaking of things

1:11:44

that I hope you will do, if you have

1:11:46

not already, be sure to subscribe and until next

1:11:48

time, my friends, be legendary. Take care. Peace. Stop

1:11:51

struggling with a lack of focus and energy

1:11:54

while trying to reach your peak performance, take

1:11:56

it from me. If you want to reach another level,

1:11:58

you need to hit your body with all

1:12:01

the nutrients it needs every single day

1:12:03

to really maximize your performance. And there's

1:12:05

no better way to make sure that

1:12:07

you get all the micronutrients that you

1:12:10

need than with AG1. AG1

1:12:12

is a foundational nutritional supplement that

1:12:14

truly supports your body's universal needs

1:12:17

like gut optimization, stress management, and

1:12:19

immune support. If you're a longtime

1:12:21

listener of the show you know

1:12:24

that I rarely use supplements with

1:12:26

the exception of vitamin D3 and

1:12:28

AG1. Just one scoop of AG1

1:12:30

supports your whole body with 75 high-quality

1:12:33

vitamins, minerals, and whole food source

1:12:35

nutrients to support optimal health of

1:12:38

your brain, body, and gut. If

1:12:40

you're looking for a simple effective

1:12:42

investment in your health try AG1

1:12:44

and get five free AG1 travel

1:12:46

packs and a free one-year supply

1:12:49

of vitamin D with your first

1:12:51

purchase. Just go to drinkag1.com/impact.

1:12:55

That's

1:12:57

drinkag1.com

1:13:00

slash impact. Give it a try. What's

1:13:02

up guys? Tom Bilyeu here. I am

1:13:05

beyond excited to tell you about

1:13:07

my new podcast, Tom Bilyeu's Mindset

1:13:09

Playbook. We are opening the

1:13:11

vault to share some of the most

1:13:13

impactful Impact Theory episodes of all time.

1:13:15

These are the very conversations that took

1:13:18

me from one subscriber to where I'm

1:13:20

at today. This is the

1:13:22

show where you'll find my personal deep

1:13:24

dives into mindset, business, and health

1:13:27

topics. You're gonna hear from legends like

1:13:29

Andrew Huberman, Ray Dalio, Mel Robbins, and

1:13:31

Dave Asprey all sharing insights that are

1:13:33

gonna help you achieve the goals that you've

1:13:36

set out to achieve. If

1:13:38

you're serious about leveling up your life I

1:13:40

urge you to go in, binge listen to

1:13:42

these episodes so that you can get the

1:13:44

skill set that you need to hit whatever

1:13:46

level of success you're pursuing. Go

1:13:48

check it out. Tune in to Tom Bilyeu's

1:13:50

Mindset Playbook. Trust me, your future self will

1:13:53

thank you.
