How AI Will Transform Medicine, Entertainment, and Your Everyday Life | Marc Andreessen - PT 1

Released Tuesday, 22nd October 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

Ford Pro FinSimple offers flexible

0:02

financing solutions for all kinds of

0:04

businesses. Whether you're an electrician or

0:08

run an organic farm. Because

0:12

we know that your business demands financing that

0:14

works when you need it. Like when your

0:16

landscaping company lands a new account. Wherever

0:21

you see your business headed, Ford

0:23

Pro FinSimple can help you

0:25

pursue it with financing solutions today.

0:27

Get started at fordpro.com/financing. The

0:30

time for holiday hosting is upon

0:32

us, so make your second bathroom

0:35

second to none with Home Depot.com's

0:37

best savings of the season. Right

0:39

now, enjoy up to 40% off

0:41

select online bath. Find the latest

0:43

on-trend styles of vanities, faucets, showers,

0:45

tubs, toilets, and more, all

0:47

at prices that will let your budget relax

0:49

right along with you and your beautifully renovated

0:51

bath. Get up to 40% off

0:53

select online bath plus free delivery at

0:55

the Home Depot. Subject to availability

0:58

See HomeDepot.com/delivery for details. I

1:00

think the A.I. censorship wars are going to be a

1:02

thousand times more intense and a thousand times more important.

1:05

My guest today is someone who doesn't just keep

1:07

up with innovation. He creates it. The

1:10

incredible Marc Andreessen. Trust me, when someone

1:12

like Marc, who has spent his entire career

1:14

betting on the future, says this is

1:17

the next major disruption. You need

1:19

to listen. From a political

1:21

standpoint we should hope that we have rapid

1:23

technology progress because if we have rapid technology

1:25

progress we'll have rapid economic growth. Do people

1:27

care? And are people going to

1:29

be able to stand up for this? And I

1:31

think that's what's required. It's going to displace a

1:33

lot of jobs. Some of those people will redistribute

1:35

themselves by acquiring new skills. Other people will not.

1:38

This isn't something to think about tomorrow. You've

1:40

got to be prepared today. So let's dive

1:42

right in. I bring you Marc Andreessen. Marc

1:48

Andreessen, welcome to the podcast. Awesome.

1:51

Thank you for having me.

1:53

My pleasure. Now you've had an

1:55

insane amount of success betting on

1:58

where industries are going. So

2:00

let me ask you, what is the most

2:02

radical disruption that you see coming in the

2:04

near future with AI? You

2:07

know, I just say like we're convinced AI

2:09

is one of those sort of moments of

2:11

fundamental change. And you

2:13

know, in our in the tech industry, you

2:15

know, these come along every couple of decades,

2:17

but they're not frequent. And

2:21

you know, this one is up there with a microprocessor

2:23

and the computer and the internet for sure, and maybe

2:25

bigger. And so for

2:28

us in the tech industry, this is a

2:30

this is, I think, a very, very, very profound,

2:32

powerful moment. And of course, you're already

2:34

seeing, you know, a lot of a lot

2:36

of the effects that are already playing out. But

2:39

you know, this technology is this technology is going

2:41

to change a lot of things. And it's going

2:43

to be, I think, very, very exciting. So

2:47

for people that don't know, you have

2:49

a fundamentally optimistic view of AI, of

2:51

technology in general. Do

2:54

you have like from an investment strategy? Do

2:56

you guys have a thesis on what

2:59

industry you think is going to be most advantaged

3:01

by AI that you're trying to get into? Yeah,

3:04

there are many. So we're involved in many. I

3:06

would say there's obvious slam dunk ones. And so

3:09

I would say healthcare is

3:11

a slam dunk one. I actually just I

3:13

actually just happened to have lunch with Demis

3:15

Hassabis, who just won the Nobel Prize in

3:17

chemistry for his work on protein folding and

3:19

Not a bad lunch date. Yeah, exactly.

3:22

And and he was knighted this year also.

3:24

So he's also a sir now. You

3:28

know, he and his colleagues basically have this transformative

3:30

approach that they believe is going to

3:32

lead to dramatic breakthroughs in the development of medicine in the

3:34

years ahead, powered by AI. So,

3:37

you know, health care is an obvious one.

3:39

Entertainment is one that I think is

3:41

it's going to be extremely exciting. What happens from here? And

3:43

again, that's that's already starting to play out. And

3:46

you know, you're already seeing which is sort of incredible

3:49

creativity being applied to

3:51

that. And so, you know, maybe you could kind of maybe bookend it by

3:53

saying those, because it's kind of the most serious one and the most fun

3:56

one. But then look,

3:58

there's there's there's lots and lots of

4:00

other stuff probably. the single biggest question

4:02

I'm asking right now is robotics. There's

4:04

been the promise of robotics

4:07

saturating our society and

4:09

everybody having robots in

4:12

the home and everybody having robots to

4:14

do everything, manual labor, and wash the dishes

4:16

and pack the suitcase and clean the toilet,

4:18

and conceivably everything,

4:21

manual labor, free people from

4:23

manual labor. And that's been a promise going back. In

4:26

science fiction, it's been a promise for 120 years. Until

4:28

recently, we

4:31

were no closer than we were maybe back then, but

4:34

you're starting to see very dramatic

4:36

breakthroughs. And

4:38

I think you had drones

4:40

that now work, autonomous drones are now

4:42

a standard thing, self-flying,

4:44

self-piloting drones, you now have self-driving cars that

4:47

are now a thing, and they now work

4:49

really well. And I

4:51

think it may be humanoid robots and all

4:53

kinds of other forms of robots. We

4:56

have two Chinese robot dogs at home. What? You

5:03

actually have them at your house? Yeah,

5:06

yeah. So everybody's probably seen all the demos. Remember, there's

5:08

this company, Boston Dynamics, that has all these... They always

5:10

have these great demos. You see these videos of these

5:12

robot dogs running around. But they cost like $50,000, $100,000,

5:16

and that company never really brought them to

5:18

market. So it never really worked outside

5:20

of it as a demo. But there

5:22

are now Chinese companies that have these things down to $1,500. Yeah.

5:27

And they're great. They run

5:29

around. They actually run quite quickly.

5:31

They can outrun you. They do flips.

5:34

They stand on their hind legs. They climb stairs. There's

5:38

a version of it that has wheels that

5:40

they can go like 30 miles an hour. Then

5:43

that one can also climb stairs. It locks the wheels

5:45

and it's perfectly fine climbing stairs. Wow.

5:48

So those are really starting to work.

5:50

And then humanoids are coming fast. And

5:52

Elon just had his demo day for

5:54

the Tesla robot. That was unreal. Yeah.

5:57

And so those are starting to work. Weren't those still,

5:59

like... those were still teleoperated. There's

6:01

still people in the background with VR headsets that are

6:03

kind of steering those and guiding those and helping those.

6:06

But that's also how you train these robots is you kind

6:08

of have, they kind of watch what people do and then

6:10

you train. So I think we might be like actually reasonably

6:12

close on robotics, which would actually have a,

6:14

you know, would have a very big impact. And so

6:16

yeah, maybe you could call out those three categories as

6:18

obvious ones to focus on. What

6:21

kind of timeline do you have for robotics? When are we

6:23

gonna start having that first round of people buying them and

6:25

having them in their home? I know Elon's pegged it at

6:27

20 to 30 grand. When?

6:31

Yeah, so the big breakthrough, so

6:34

self-piloting drones were a very big

6:36

breakthrough. And, you know, the dominant

6:39

ones on those in the global market are

6:41

this company DJI, which is this big company

6:43

in China, you know, but those now work

6:45

really well. And then there's American companies. We

6:47

have American companies that have, you know, I

6:49

think even better technology that aren't quite

6:51

the same size yet, but are really good. And

6:54

so, and that's a big deal. Like, so you can have,

6:56

you know, we have drones now that can like fly between

6:58

tree branches. They can fly,

7:00

you know, indoors, you know, they can fly, you know, completely

7:02

autonomously through like, by the way, underground tunnels. And

7:05

so those work really well. And then like I said, like

7:07

self-driving cars, you know, the Waymo, you

7:09

know, cars now are great. And, you

7:12

know, people who use them have fantastic experiences. And

7:14

then the Tesla self-driving capability is getting

7:16

really good. And

7:19

so like, so I go through those to say,

7:21

those are both robots, you know, flying robot, driving

7:23

robot. And so a walking robot,

7:25

all of a sudden, it's not so crazy. Exact

7:29

timing, I don't know, you know, I

7:31

swag five years, but, you know,

7:33

could be two, could be eight,

7:37

I don't know, optimistically three or four. You

7:42

know, the promise, you know, there's

7:44

many, many possible form factors for these

7:46

things, right? Designs, the theory of humanoid

7:48

robots, which I believe is, the

7:50

great thing about humanoid robots is there's just, there's

7:52

so much of the physical world that assumes that

7:55

there's a person present, right? So person standing in

7:57

an assembly line, person driving a car, person driving

7:59

a tractor, a person picking vegetables in a field. There's

8:05

just all these systems that we have

8:08

that just assume there's a person. And so if you build a robot

8:10

in the shape of a person, in theory, it can kind of fill

8:12

in and do all that work. And

8:16

so that should be a very big market. And

8:19

obviously people should be very comfortable with that.

8:22

They'll dovetail really well into kind of

8:25

normal society. But I also think there'll

8:27

be a lot of other, you can package

8:29

these things up however you want. And so there will be

8:31

lots of other kinds of, there already are obviously

8:33

lots of robots in the world, but there will be more and more

8:35

of different kinds. And

8:37

what are the hard parts? What are the hurdles

8:40

they still have to overcome that's gonna cause it

8:42

to be three, four, possibly eight years from now?

8:45

Yeah, so there's basically, I would say three

8:47

big categories. So there's the physical sort of

8:49

controls, the actual physical kind of

8:51

body and its ability to kind of control itself. And

8:55

that's where if you look at like Elon's demonstration the other night,

8:57

you kind of see how fast that stuff's moving. Because

9:00

if you watch like his progression, and the other companies doing

9:02

it, they're getting much better. And

9:05

so that's just moving right along. Then

9:07

there's battery power, which is

9:09

probably still a fundamental limit, because

9:13

it's a question of like, how long can you actually like

9:15

power one of these things before it has to recharge or

9:17

do a battery swap? And that's still a

9:19

bit of an issue and it's hard to make progress on batteries,

9:21

but a lot of people are working on it. And

9:25

then software is the big challenge. And

9:28

you know, that's where we would get more involved. And

9:31

so there's sort of all this software. And

9:33

so think about it like these robots have sensors,

9:35

they've got visual sensors, they've actually got like the

9:37

robot dogs have what's called LIDAR, which is sort

9:39

of the light version of radar, which

9:42

is the same thing as in the Waymo cars. And

9:44

so they've got sensors, they can kind of gather

9:46

input, they've got sound, they can gather input from

9:49

kind of all around them. Actually, they can gather input from their

9:51

environment better than a human can, because they can see 360 degrees and

9:55

they can do depth sensing and so forth in ways that we can't.

9:58

So they get all the raw data, you have to

10:00

actually process that data. You have to form it into

10:02

a model of the world. You have to then, the

10:04

robot has to have a plan for what it does,

10:07

right? And then it has to understand the consequences of

10:09

the plan. Right? And

10:12

so, you know, I'm setting the coffee down on the

10:14

table. You know, I can't

10:16

set it down on somebody's hand. So I have to set

10:18

it down near the hand, but not on the hand. I

10:21

have to keep it level, because if I tip it, you know, I'm going

10:23

to scald somebody. Right? So like, and then

10:25

by the way, while I, the robot, am setting the

10:27

coffee down, the person has moved, right?

10:29

And so I have to adapt to it, right? Or,

10:32

you know, same thing, walking through a crowd, like I can't, you know, you

10:34

can't have robots running into people. And so you have

10:36

to have this kind of- Do you know how they're

10:39

approaching that problem? So if I think about when

10:41

I saw the robots interacting with the people at the

10:43

party, is there an underlying

10:45

goal for the robot to be likable? And

10:47

is it like, hey, get to know people,

10:50

try to charm them? What is

10:52

the plan that they're giving to the

10:54

robot that it's moving towards? Yeah,

10:57

so, I mean, in general, if you're a

10:59

company, you generally want it basically completely benign,

11:01

right? Because it actually

11:05

lines up nicely with

11:05

the profit incentive. You know,

11:07

you want friendly, approachable, you know, think,

11:10

you know, products that make people happy, products that

11:12

make people comfortable, you know, products that

11:14

aren't threatening or intimidating, and aren't, you know, aren't

11:16

hurting people. And so you put a really, really

11:18

big focus on fitting into the environment. You put

11:20

a really big focus on avoiding anything that would,

11:22

you know, harm a human being, you

11:25

know, you put a very big focus on, you know, the robot

11:27

should, you know, happily,

11:29

you know, step into traffic or whatever, if

11:32

it's gonna save somebody's life. And

11:35

so, you know, you want that. And then, yeah, I think, you

11:37

know, generally you want it to be, you know, sort of approachable,

11:39

safe, harmless, you know, those are the kinds of terms that get used a

11:41

lot, you know, friendly.

11:44

Now look, this is the other thing is, there

11:46

used to be this like really hard challenge, which is how

11:48

are you gonna control these things? How are you gonna talk

11:50

to them? You know, if you watch

11:52

Star Wars, they communicate in beeps and boops. You

11:55

know, if you watch Star Trek and you're watching, you know, Commander

11:58

Data, you know, he's talking in English, you

12:00

know. Up until two years ago, we thought it would have

12:02

to be beeps and boops. But now

12:04

we have large language models and we have

12:06

these voice AI interfaces like OpenAI just released

12:08

their advanced voice mode. And it's a full,

12:11

it's like talking to the Starship computer on

12:13

the Starship Enterprise or a human,

12:15

it's just like talking to a person. And so all

12:17

of a sudden you can give these robots voices, they

12:19

can talk, they can listen, they

12:21

can explain quantum physics to you, they can sing you

12:23

a lullaby, they can forecast the presidential election. They can

12:25

now do whatever you want. And

12:29

so that's the other part of it is that you're going to

12:31

really be able to talk and interact with them. The

12:34

first one I saw the Boston Dynamics guys did this

12:36

hysterical demo where they wired up one of these early

12:38

language models a couple of years ago to their robot

12:40

dog. And they gave it a super

12:43

plummy English butler voice. And

12:46

so it's like this mechanical robot dog stomping around,

12:48

but it's talking to you like it's like you're

12:50

Bruce Wayne and it's Alfred or something. They ask

12:54

the robot dog, what do you see? And it does the very

12:56

plummy accent: oh, I see a lovely pile of rocks. And

12:59

so yeah, you're going to, by the way,

13:01

there's going to be enormous creativity. There's this

13:03

startup we're not involved in, but I like

13:05

the guys a lot called Curio in

13:08

Redwood City that basically has a plushie. So

13:11

they have a stuffed animal. And

13:13

it's basically designed for little kids. And

13:15

it's a voice UI. And it's back

13:18

ended by a large language model. And

13:20

it doesn't move, it's just a plushie

13:22

with a voice box. But

13:25

it will happily sit and tell kids jokes and

13:27

teach them all about whatever they want

13:29

to learn about and talk to them about whatever's on their

13:31

mind. And they have

13:33

a really elegant setup where the parent can

13:35

both control how the toy actually, like what it's

13:37

willing to talk about. So you can, as a

13:40

parent, you can define the topics that are like

13:42

go zones versus no go zones. So

13:44

you could kind of say, let it talk to the

13:47

kid about science, but not politics, for example. And

13:49

then you get, as a parent, you get a real time

13:51

transcript of the interaction. So like your kid's up in the

13:53

bedroom talking to the thing and you actually get to see

13:56

the conversation. Right. And it's funny, when

13:58

you watch this with like little kids, they, I just think

14:00

this is like the most natural, normal thing in the world.
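The go-zone/no-go-zone setup described above, parent-defined allowed and blocked topics plus a real-time transcript of every exchange, can be sketched in a few lines of Python. This is purely illustrative: every class, method, and reply string here is invented, and it does not reflect Curio's actual implementation.

```python
# Hypothetical sketch of a parent-controlled topic filter for a kids'
# voice toy: "go zones" are allowed topics, "no go zones" are blocked,
# and every exchange is logged so the parent can review it live.

class TopicFilteredToy:
    def __init__(self, go_zones, no_go_zones):
        self.go_zones = set(go_zones)        # topics the parent allows
        self.no_go_zones = set(no_go_zones)  # topics the parent blocks
        self.transcript = []                 # parent-visible log of every turn

    def classify(self, message):
        # Stand-in for a real topic classifier (in practice, an LLM call).
        for topic in self.go_zones | self.no_go_zones:
            if topic in message.lower():
                return topic
        return "general"

    def respond(self, child_message):
        topic = self.classify(child_message)
        if topic in self.no_go_zones:
            reply = "Let's talk about something else!"
        else:
            # In a real product this would be the language model's answer.
            reply = f"Sure, let's talk about {topic}!"
        # Both sides of every exchange are logged for the parent.
        self.transcript.append(("child", child_message))
        self.transcript.append(("toy", reply))
        return reply

toy = TopicFilteredToy(go_zones={"science"}, no_go_zones={"politics"})
print(toy.respond("Can you tell me about science experiments?"))
print(toy.respond("What do you think about politics?"))
print(len(toy.transcript))  # 4: two turns, both sides logged
```

The key design point from the conversation is that the filter and the log live outside the model itself, under the parent's control, rather than relying on the model to police its own topics.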

14:03

Right. I've talked in the past, I have a nine year

14:05

old, and I brought home ChatGPT when it first

14:07

shipped, you know, two years ago. I guess he

14:09

was seven. And he has

14:11

a laptop that he does some of his

14:13

school stuff on. And so I set up ChatGPT on

14:15

his laptop and I sat him down. I was so proud of

14:17

myself 'cause I'm like, I don't

14:19

know. It's like I'm coming down from the

14:21

mountain to deliver like the gift of fire to my child. Like

14:24

I'm giving him like the super technology that's going to be with

14:26

him his whole life. It's going to answer any question and help

14:28

him with all of his work. And it's like the most amazing

14:30

gift of technology I could give him. And I showed him

14:32

ChatGPT and I said, you know, you type in any question you

14:34

want and then it answers the question. And

14:36

he looked at me and he said, you know, "So?" Right.

14:38

And I was like, what do you mean? So like, this

14:40

is like the breakthrough. This is like, this is the thing,

14:42

this is like the thing for 80 years we've all been

14:45

working on and it finally works. And he's

14:47

like, what else would you use a computer for? Like,

14:49

so funny, like obviously it answers your questions. Right.

14:51

and so like, I think kids are going to, kids

14:53

are, I mean, it's already happening. Kids are going to

14:56

pick this up incredibly fast. It's going to

14:58

be, you know, super normal. Anyway,

15:00

so we have, we

15:02

have a chance to design, you know, we can design

15:05

technology to be as friendly and helpful and

15:07

accommodating and supportive as we can possibly imagine. And

15:09

I think that maybe the commercial products will all get

15:11

built that way for sure. Yeah,

15:13

to me, that's where the biggest disruption is going to

15:16

be. When I think about AI, I think I'm

15:18

as optimistic as you in terms of the

15:20

things that it will do for us. Its

15:23

intellect. You're going to be able to throw, you

15:26

know, God knows how many new PhD

15:28

level people and maybe one day even more

15:30

at all these incredible problems. All right. That's

15:32

going to be utterly fantastic. But then I

15:35

think about your dog

15:37

becomes a robot dog,

15:40

becomes furry and fluffy and

15:42

wonderful, but it also talks to your kids and

15:44

helps raise them and you have this lens into

15:46

it. And then all of a sudden it's, well,

15:48

it's not just the dog. It's, I've got an

15:50

AI girlfriend. She's not really a girlfriend, not like

15:52

that. Well, but then I, you know, I've been

15:54

talking to her for three years and now robot

15:56

body comes online and I want to put that

15:58

AI into the robot body. And

16:00

all of a sudden, I think

16:02

that there's gonna be a

16:04

pretty fascinating, to

16:07

try to keep it positive here, a fascinating schism

16:09

that'll happen in society. So five years ago, I

16:12

wrote a comic book about this, about

16:14

what I think is gonna happen. And I think

16:16

there's gonna be a bifurcation in society, and I

16:19

really think this is actually going to happen. How

16:22

big and how dramatic, that remains to be

16:24

seen. But I think you're gonna get a

16:26

subset of society that says, nope, not

16:28

doing this. It's like the opening line

16:30

in Dune, that thou shalt not make

16:32

a mind, an artificial mind

16:35

mirroring human intelligence, or whatever the exact line

16:37

is. And I think

16:39

people will eschew AI, they will eschew

16:42

Neuralink and things like that, and there'll

16:44

be sort of this new, puritanical vein

16:48

of humanity, and then you're gonna get other

16:50

people, like me, that embrace the technology. I

16:52

may not be an early adopter of Neuralink,

16:54

but if it truly gets safe, and it

16:56

allows me to upgrade my abilities, man, I

16:59

will do that in a heartbeat. And so

17:01

then it becomes a question of how much

17:03

friction will there be between those two sides?

17:05

But those seem inevitable. Do

17:08

you think I'm crazy about that, or do you

17:10

see that same inevitability? And if so, how does

17:12

it play out? Hold tight,

17:14

we're gonna take a quick break. Impact

17:17

Theory is sponsored by BetterHelp. You've heard

17:20

of fight or flight, but did you

17:22

know there's a third option when confronting

17:24

fear? It's the secret to achieving all

17:26

of your goals. It's called

17:28

face it. So

17:54

if you're ready to face your

17:56

fears and unlock your highest potential,

17:58

give BetterHelp a try. Overcome

18:01

your fears with BetterHelp. Visit

18:03

betterhelp.com/impact theory today to get 10%

18:06

off your first month. That's BetterHelp,

18:09

H-E-L-P.com/impact

18:11

theory. A great logo

18:13

can make or break your business. Imagine

18:15

having a world-class designer at your fingertips,

18:17

24 seven, ready to create

18:20

custom polished branding for your company

18:22

at a moment's notice. Creating

18:35

a professional set of branding is hard. And

18:37

that's where design.com comes in. design.com

18:39

generates thousands of custom professional logos

18:42

for your business in seconds. But

18:44

it doesn't stop there. design.com is

18:46

your complete branding warehouse. Create websites,

18:49

social media posts, and ads with

18:51

templates that automatically use your logo

18:54

and color scheme. It lets you

18:56

focus on growing your business and

18:58

not get lost in the design

19:01

details. Ready to level up your

19:03

branding? Head to design.com/Impact Theory and

19:05

get up to 88% off. That's

19:10

design.com/Impact Theory.

19:13

There's a reason why I tell

19:15

aspiring entrepreneurs about a particular tool.

19:18

It is as close as you're

19:20

going to come to a success

19:22

shortcut. Want to know what it

19:24

is? It's Shopify. Shopify is the

19:26

backbone of millions of successful businesses

19:28

for a reason. Take companies like

19:30

Alo, Allbirds, or Skims. They've got

19:32

killer products and highly effective marketing,

19:34

but their secret weapon? It's Shopify.

19:36

Here's why. Shopify has the number

19:38

one checkout on the planet, period.

19:40

And with Shop Pay, they're boosting conversions

19:43

by up to 50%. I can

19:45

tell you

19:47

with absolute certainty, businesses that want

19:49

to grow, grow with Shopify. Upgrade

19:51

your business today and get the

19:53

same checkout as Allbirds. Here is

19:55

the simple action plan. Sign

19:58

up for a $1

20:00

per month trial period

20:03

at shopify.com/impact, all lowercase.

20:05

Just go to shopify.com/impact

20:08

to upgrade your selling

20:10

today. Again, that's shopify.com/impact.

20:14

All right, let's pick up where we left off. I

20:17

mean, I think it's certainly a plausible scenario. I think it's

20:19

certainly logical. I, you know, it certainly can play out that

20:21

way. I guess my

20:24

model of human behavior is different. So I'm skeptical.

20:26

I'm skeptical that that is what will happen. And,

20:28

you know, I would just start by saying that

20:30

there is a schism

20:33

like that in our society today. And they are the Amish.

20:36

Yeah. And I actually grew up, you

20:38

know, there were Amish where I grew up. And, you

20:40

know, so the good news with the Amish is they

20:42

have a defined quality of life. They

20:45

have, you know, a whole value system sort of, you

20:47

know, involves, you know, rejecting technology for some, by the

20:49

way, for some very deeply thought through reasons. And,

20:52

you know, they're, you know, by all accounts, you know,

20:54

in many cases very happy. And by the way, they're

20:56

also very fertile, you know, so they're, you know, they're

20:58

having lots of kids. And so there's, you know, there's

21:00

actually, I think, quite a bit to admire about

21:03

what they do. You know, like having said that, I would

21:05

just say two things. One is they're a very, very, very,

21:07

very, very small percent of the population. And

21:10

so there's not a lot of people who volunteer to become

21:12

Amish. And then the other thing that happens, if you track

21:14

them in detail, what actually happens is they don't reject technology.

21:16

They just adopt it on a lag. Right?

21:20

And basically the lag is about 30 years.

21:24

And there's been a bunch of articles about this

21:26

over the last decade. For

21:29

example, they're now adopting PCs, personal computers. Really?

21:31

Yeah, yeah, yeah. Well, because it's so- I thought they

21:33

were still without electricity. No, no, no, no,

21:35

they've got electricity. I mean, you know, they can, they try to

21:37

control it, but they definitely, this is a great example. They definitely

21:39

have it, right? And then they have

21:42

teleph- they now have landline telephones. So

21:44

there's just a point where

21:46

you know, things just get to be practical. So,

21:49

you know, the PC, so the PC thing, apparently the

21:51

articles that I've read, basically what it is, is the

21:53

personal computer, like, you know, they run these

21:55

small businesses. They'll have like a, you know, a

21:58

furniture business, for example, that makes, you know, these amazing things. Well,

22:01

it's just a lot easier to run a furniture store

22:03

if you've got a personal computer to do the ledger

22:05

and the inventory on it, right. And it's just at

22:07

a certain point, they figure out a theory under which

22:09

that's okay, they still don't connect it to the internet.

22:12

You know, but they do that, you know, they have

22:14

their personal computer, by the way, and then you

22:16

just kind of say, inevitably, the next step is they're going to

22:18

want to sell their furniture online. And so it's just a matter

22:21

of time until they figure out a way to bring in a

22:23

internet connection, right. And so one of the really, really fascinating things

22:25

about AI is it went from being something that was sort of

22:27

speculative and weird, three years ago to

22:29

something that is now actually quite common already in

22:31

use. And

22:34

this is quite a profound and powerful thing that I think we'll

22:36

probably talk a lot about today, which

22:39

is, number one, that AI is already in wide

22:41

use. And so the number of users on systems

22:44

like ChatGPT and Midjourney and whatever are already in

22:46

the hundreds of millions and growing very fast. And

22:49

lots and lots of people are using these things

22:51

and they use them in their everyday life, they use them for

22:53

work, they may or may not admit to their boss, they're using

22:55

them for work, but they're definitely using them for work. You

22:57

know, students are using them in school, if you've got like, you

23:00

know, teenage kids, like any classroom in America is

23:02

grappling with this question of like, you know, is the kid bringing

23:04

in an essay that ChatGPT wrote, you

23:06

know, but they're helping with homework and they're doing all

23:08

kinds of stuff, and the usage numbers on these services

23:11

kind of reflect, you know, already broad-based adoption. And

23:13

then there's a really powerful thing underneath that that's

23:15

really important, which is

23:17

the most powerful AI systems in the world are the

23:19

ones that you get on the internet for free, or

23:23

maximum 20 bucks a month. And

23:25

very specifically, you know, I have the capability if I want

23:28

to, you know, I could go spend a million dollars to

23:30

just have like the best AI, I could go spend a

23:32

million dollars a year, if I go spend a million dollars

23:34

a year today, I do not get a better AI than

23:36

you get when you sign up for ChatGPT. It's

23:39

literally not available, I can't do it. The

23:42

best AI in the world is the thing, it's

23:45

on ChatGPT, or by the way, Google Gemini,

23:47

or Microsoft Bing, or, you know, Anthropic's Claude, you

23:49

know, there's, you know, Grok, the

23:51

xAI one, or Mistral, which is, you

23:53

know, one of our companies, or Llama from Meta. There's like seven

23:55

of these now, that are like available either

23:57

for free or at most for 20 bucks a month. And

24:00

they're the best in the world. And

24:02

so it's actually quite striking, shocking, which is a lot

24:05

of people have the mental model of, oh, well, the

24:07

best technology must be basically hoarded by a few people

24:09

who are then gonna lord it over the rest of

24:11

us and are gonna make all the money on it,

24:14

right? It's kind of the, you know, kind

24:16

of always the fear on these things. The reality

24:18

is like this technology is democratizing faster than the

24:21

computer did, faster than the internet did. It's available

24:23

to everybody right out of the chute. By

24:25

the way, it's getting built, you know, Apple's building it into the

24:28

iPhone. It's just, you know, now it's just Apple Intelligence. And

24:31

so this technology basically has gone from not

24:34

present in our society to like almost universal

24:36

in one step. And

24:38

I just, you know, it may be that people choose

24:40

to voluntarily give it up, but I, in

24:42

my life, I have not yet seen people who sort of

24:44

voluntarily renounce something that they get used to. So yeah,

24:47

it would be a first if it happened. All

24:50

right, I hear that. And you're the right

24:52

person for me to have this conversation. I

24:55

love when dogs bark the loudest because they're

24:57

on a leash. So you're gonna be my

24:59

leash. I'm gonna paint a

25:01

scenario knowing that you're gonna pull me back from

25:03

the brink. 'Cause I'm fundamentally a techno-optimist and

25:06

I'm definitely somebody that will embrace this technology as

25:08

fast as humanly possible. We're deploying it here in

25:10

my company as rapidly as we

25:12

can. I will literally, if it's proven safe,

25:14

get Neuralink the whole nine. So

25:17

here's what I think plays out. This

25:20

is as close to the sort of

25:22

realistic mess that I think we'll go

25:24

through. The long arc of history bends

25:27

towards justice, but history does not care

25:29

about any single generation. And

25:31

I think that the thing we will all

25:33

have to get very politically comfortable with is

25:35

the fact that yes, AI is going to

25:37

displace jobs wildly as we move

25:39

towards something absolutely wonderful and spectacular, but

25:42

it's going to displace a lot of

25:44

jobs. Some of those people will redistribute

25:46

themselves by acquiring new skills. Other people

25:48

will not, and it

25:50

won't be a great time for them.

25:52

And their families will rally around them

25:55

as the material wealth is

25:57

unlocked, as spending power becomes more abundant,

25:59

all of that. The younger people that

26:01

are more intellectually nimble will

26:03

take advantage of that to care for

26:05

people. But there's gonna be this

26:08

conflict on the left and the right as to,

26:10

hey, shouldn't we just give these people UBI or

26:12

whatever to take care of the people that are

26:14

going to struggle because they are going to struggle.

26:16

And if people don't have a mental defense, if

26:18

they don't have a narrative that they can understand

26:20

about how we weather that storm, I

26:23

think they'll make very bizarre economic choices.

26:25

As you were talking, you're talking about

26:27

deflation and people

26:29

ought to wonder how on earth, given

26:31

all the technological advances we've had over

26:33

the last 300 years, how

26:35

is inflation still going up? This seems crazy. And

26:38

the reason that inflation goes up,

26:40

despite the massive deflation that technology

26:43

brings, is that the government gobbles

26:45

it up by printing money. And

26:48

oh boy, do I have a personal

26:50

bone to pick. I have no idea

26:52

your take on the economy and how

26:54

it intersects. So I'll plant

26:57

my flag and let you react. I

26:59

think that you

27:02

only need to look at the M2 money supply chart

27:04

to see, I mean, it's just absolutely

27:06

outrageous how much

27:08

more money has been poured into the

27:10

system, completely artificially just generated out

27:13

of thin air. And

27:15

that is the inflation, when we say inflation, that's

27:17

what we're talking about, the inflation of the money

27:19

supply. In doing that, the government doesn't have to

27:22

get your vote on something. I

27:25

refer to, and I do not wanna put

27:27

words in your mouth, but I refer to

27:29

that as the government steals from you, and

27:31

then they force you to play the stock

27:33

market as one stand-in for investments in order

27:35

to beat inflation caused

27:37

by them printing money and stealing from you.

27:40

And I think that's deranging. And I think

27:42

that the government has a moral obligation to

27:44

give people a non-inflatable currency in which people

27:46

can at least park their wealth so that

27:48

the average person who does not wanna play

27:50

the stock market can just save. A

27:54

guy that is a janitor and he's just trying to

27:56

get by and take care of

27:58

his family should be able to sock away.

28:00

money and not have its value eroded over

28:02

time through very

28:04

conscious and poor, in my opinion,

28:07

policies. I'm curious to get

28:09

your take on that.

28:12

If I tell you that in any given time

28:14

you could have more or less technology change, and

28:17

then that change would show up in economic

28:19

statistics. The way that economists measure it is

28:22

with productivity growth, which is a thing they

28:24

measure. It's an actual number. And

28:27

so, if a society has 1%

28:30

productivity growth, that's super low; if it has 4% productivity

28:32

growth per year, that's super high. Let's call that the

28:34

normal range. And if you could ever

28:36

get to 8% or 10% productivity growth, you'd

28:38

have cornucopia, technological utopia, it'd be amazing. Everything

28:41

would get super cheap and abundant, super

28:43

fast. But modern societies go somewhere between 1% and

28:45

4%. Would

28:47

you say that we live in a time

28:50

today in which productivity growth, and therefore

28:52

technological change, is running high or low? I

28:55

think we are about to unleash

28:58

a ton of that productivity. But right now,

29:01

I think that the government is siphoning off

29:04

so much of that productivity that you get this

29:06

schism between the young and the old. So the

29:08

old, I think, are doing very well, and the

29:10

young are getting absolutely clobbered, and

29:12

so they don't feel it. But if AI does

29:14

what we think it's going to do, then yes,

29:16

I think that we will finally

29:19

be able to unlock a lot of that. But

29:21

just take the distribution part of it out, because we'll come

29:23

back to that. Just

29:25

the rate of technology change. Do

29:29

we live right now in a time of great

29:31

technology change or low technology change? The

29:34

only great technology change is in AI,

29:36

so low. Okay. And then

29:38

you'll probably get the next answer right, which is,

29:40

did we have faster technology change between 1930 and

29:42

1970 than we do today or slower? Much

29:48

faster. Much faster. Yeah. So those are the correct

29:50

answers. And so the metric on

29:52

what's happened, and this is actually quite

29:54

important, is that productivity growth and therefore

29:56

technological change in the economy was much

29:58

faster in the decades that preceded the

30:00

1970s. Actually, by the way, the turning point was the year

30:03

I was born. It was 1971. In 1971- WTF happened? It was you, Marc.

30:05

Yeah, so there's a website called wtfhappenedin1971.com and

30:12

it's just it's like literally hundreds of charts of

30:15

basically this discontinuous change on all kinds of economic

30:17

and social markers that kind of kicked in the

30:19

year I was born. I do believe it is

30:21

entirely my fault. I will confess

30:23

to that. But yeah, one of the things

30:25

that happened was right around that time, productivity

30:27

growth downshifted. It was running at like

30:29

2%, 3%, 4% and then it's sort of

30:31

been 1% to 2% ever since. And it ebbs and

30:33

flows a little bit with the economic cycle, but like it's been

30:36

quite low for the last 60 years. Part of it dovetails

30:40

to the political thing you were saying. There's

30:42

a lot of questions as to why it's been

30:44

so low. Economists actually talk about something called

30:47

the productivity paradox, because it

30:49

was really weird because the computer emerged in the 1970s. And so

30:51

all the economists in

30:53

the 1970s said the computer is going to lead to cornucopia. It's

30:55

going to lead to enormous productivity

30:57

growth. Of course it is. You got Moore's law and

30:59

it's all this software and all this

31:02

inventory, just-in-time manufacturing. And you're going to have,

31:04

by the way, robots. And so you're

31:06

going to have this for sure, you're going to have

31:08

a massive takeoff in productivity growth. And actually what happened

31:10

was productivity growth actually downshifted. And so

31:13

all of our expectations

31:15

for how society works are

31:17

actually geared towards low productivity growth and

31:20

low economic growth from a historical standpoint.

31:24

The importance of that is really key to

31:26

the next thing that you said, which is

31:28

the psychological effect of being in a low

31:30

growth environment is zero-sum politics. Right.
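As an aside, the arithmetic behind those growth rates is just compounding. A quick illustrative sketch (the 1%, 4%, and 8% figures come from the conversation above; the 40-year horizon is an assumption for a working lifetime):

```python
# Compound each productivity growth rate over a 40-year working life
# to see how far apart the outcomes end up. Illustrative only.
for rate in (0.01, 0.04, 0.08):
    multiple = (1 + rate) ** 40
    print(f"{rate:.0%}/yr for 40 years -> {multiple:.1f}x output per worker")
```

At 1% per year, output per worker grows about 1.5x over a career; at 8%, it grows more than twentyfold, which is the "cornucopia" case.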

31:33

Logically, right. Because if we're in a high growth

31:35

environment, if technology productivity growth is

31:38

running at 4% or God willing

31:40

someday more, and if economic growth is running at

31:42

4% or more, the economy will

31:44

be doing so well. It will be spewing

31:46

money in all directions. Everything

31:48

will be going crazy. Everything will be, every business will

31:50

be flush. Every consumer will feel fantastic. Jobs are being

31:52

created all over the place. Everybody's kids for sure are

31:55

going to live better lives than their parents did. It's

31:57

going to be great. By the way, the 1990s were

32:00

that, right? There was this kind of five-year stretch in

32:02

the 1990s where economic growth really took off. And

32:05

you probably remember, you probably like,

32:07

it was fantastic, right? Everybody felt

32:10

fucking awesome, right? And

32:12

so this is one of the kind of weird, this is

32:14

why like a lot of the fears around the impact of

32:16

technology, I think are really misguided when it comes to all

32:18

these economic and political topics, which is from

32:21

a political standpoint, we should hope that we

32:23

have rapid technology progress. Because if we have

32:25

rapid technology progress, we'll have rapid economic growth.

32:28

If we have rapid economic growth, we'll have

32:30

positive-sum politics, right? For me

32:32

to be better off in a high growth environment,

32:34

I can go be better off,

32:36

I can go exercise my skills and talents and get

32:38

new jobs and switch jobs and switch careers and do

32:41

all kinds of things. And I have a path and

32:43

a future for myself and my children, that does not

32:45

require taking away from other people. In

32:48

a low growth environment, all

32:50

of the economics and all the politics go zero sum,

32:52

because the only way for me to do better is

32:54

I have to take away from you, right?

32:57

Or, to your point, and I completely agree with

32:59

you, what happens is the government just inflates, and

33:01

they inflate because they want to basically buy

33:03

votes, they want to basically spend on programs and they

33:05

want to buy votes. And

33:08

so this is sort of what I would say, which

33:10

is like, if you want zero-sum,

33:12

smash-mouth,

33:15

destructive politics, with

33:17

the government playing a bigger and bigger role, you

33:19

want low technological development, you want a slow pace

33:21

of technological development. If you want positive sum politics,

33:23

where people are thrilled and excited about the future

33:25

and about their own opportunity, and they don't have

33:27

to feel like they have to take away from

33:30

somebody else. And they don't need handouts from the

33:32

government because they're doing so well, you want rapid

33:34

productivity growth, right? And so, like I'm saying,

33:36

it's the opposite of the fear that everybody

33:38

thinks that they have. I

33:41

have many other thoughts on your question. But yeah, let me let me pause

33:43

there and see which part you wanted to get to. Oh,

33:46

inflation. Or

33:49

yeah, so inflation. Yeah. So look, I would just

33:51

say two things on inflation; it's actually pretty

33:53

interesting. So there's an overall concept

33:55

of inflation, which, as you said, is growth

33:57

of the money supply. But

33:59

the way that plays out in

34:01

the economy is, and they actually analyze it

34:03

this way, basically the way they

34:06

think about it, it's the basket of

34:08

overall prices of everything in the economy. And the

34:10

government agency that calculates the rate of inflation uses

34:12

a basket of sort of equivalent products over time

34:14

to try to get a sense of what's actually

34:16

happening with prices. And

34:18

so there's both the money supply aspect of inflation and

34:21

the government printing press and all that, and that's totally

34:23

true. But what's actually

34:25

happened inside that is actually because

34:28

of differences in technology and regulation, you actually

34:30

have a really, historically unprecedented difference in

34:32

how different industries are actually inflating or

34:34

deflating. And there's a

34:36

chart that we can maybe post for your listeners that basically

34:38

shows three really big important

34:41

sectors of the economy, which are healthcare,

34:43

education, and housing, where the prices are

34:45

skyrocketing. And by the way,

34:47

everybody feels this. This is just like, okay, you want to go

34:49

buy a starter home, or you want to get good healthcare, or

34:52

you want to get your kid into good school. The

34:54

prices are going crazy. You

34:56

see this in housing prices, of course. Another

34:58

version of this is higher education: a four-year

35:01

college degree at a private university now costs

35:03

$400,000 and is on its

35:05

way to a million dollars. That's

35:08

crazy. Completely crazy.
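The $400,000-to-$1,000,000 trajectory implies roughly 9-10% annual price growth. A hedged sketch (the ~9.6% rate is an assumption reverse-engineered from those two figures, not a quoted statistic):

```python
# How many years does a $400,000 degree take to pass $1,000,000
# at an assumed ~9.6% annual price increase? Illustrative only.
price = 400_000
years = 0
while price < 1_000_000:
    price *= 1.096   # assumed annual increase
    years += 1
print(f"~{years} years to cross $1,000,000")
```

Under that assumed rate, the crossing happens in about a decade, which matches the extrapolation discussed later in the conversation.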

35:10

So the price of higher ed is just

35:12

skyrocketing. The price of higher

35:14

education, bachelor's degrees, master's degrees, is rising far

35:16

faster than the rate of inflation. And

35:19

healthcare costs are rising faster than the rate of inflation,

35:21

and housing prices are rising faster than the rate of

35:23

inflation. But then you have

35:26

all these other sectors, and these are

35:28

sectors like video games, entertainment, consumer electronics,

35:30

by the way, food, cars, which

35:34

is good, retail, consumer

35:36

products generally. Those

35:38

prices are crashing. And so

35:41

the things that you can buy today versus

35:43

20, 30, 40 years ago for the same

35:45

dollar in those categories, take obvious examples, music.

35:48

To buy music 30 years

35:50

ago, you had to go spend $15 to buy a CD and

35:52

get 10 songs, out of which you maybe wanted two of the

35:54

songs. Today, $10 buys you Spotify

35:56

for a month, and you have 10 million

35:58

songs on demand, and you can listen to them

36:00

24/7 and it's fantastic, right? And so the

36:03

price of music has crashed, right?

36:05

And so the price of housing, education, and healthcare

36:05

has skyrocketed, and the price of everything else is crashing.

36:10

What explains that? Well, the

36:12

sectors where prices are crashing, number one, they

36:14

have rapid technological change, which is driving down

36:16

prices because of productivity growth and

36:18

they're not regulated, right? Nobody in

36:21

the government is price-fixing music, right?

36:24

Whereas housing, education, and

36:26

healthcare are incredibly highly

36:28

regulated and centrally controlled by the

36:30

government, right? And they have

36:33

fixed supply dictated by the government and

36:36

they have a very slow rate of technological adoption,

36:39

right? It's almost impossible to get new

36:41

technology into the healthcare system, into the education system

36:43

or into housing. Like robots are

36:45

not building houses. Like it's not happening, right?

36:48

Like it's just not happening. And

40:01

so what we have actually in the economy is a

40:03

divergence. I call these sort of the slow sectors versus

40:05

the fast sectors. The sectors for

40:07

which prices are skyrocketing because of slow technology

40:09

change and too much government regulation and

40:12

the sectors where prices are crashing because

40:14

of rapid technological advances and

40:17

lack of government regulation. When

40:20

you chart these out, you could just like

40:22

extrapolate the lines. So where this

40:24

is heading: if

40:26

the current trends continue within a decade, a four-year college

40:28

degree is going to cost a million dollars, right?

40:32

And a flat screen TV that covers your entire wall is going to cost $100,

40:34

right? And

40:38

at some point you might want to ask the question like, isn't that

40:40

backwards, right? Like isn't

40:43

that what we all want? You know, this is where I get very

40:45

emotional about this. It's like, okay, define the American dream, right?

40:47

The American dream, and by the way, for that you could

40:49

probably substitute this, you know, the dream in many other countries,

40:51

but let's just say the American dream, the American dream. I

40:53

want to buy, I want to be, I want to buy a house for my family. I

40:56

want to be able to send my kids to a great school, and then I want

40:58

my family to be able to get great healthcare, right?

41:01

Like those are like the three high order bits, and those

41:03

are the things where we have wired, our system is wired

41:05

right now to drive the prices of those things to the

41:07

moon, right? And

41:09

then good news, iPhones

41:11

and cars and digital music are

41:14

plentiful, but they're not healthcare, education,

41:16

and housing.

41:19

And this is the other thing that's driving inflation, right?

41:21

Because then what happens is the

41:24

fast sectors of the economy with prices are

41:26

crashing, they're shrinking as a percentage of the

41:28

economy, right? Because prices are falling

41:30

so fast. And then because

41:32

prices are growing so fast for healthcare,

41:35

education and housing, they're becoming larger and

41:37

larger parts of the economy. And

41:39

so the economy writ large in people's pocketbooks

41:42

and how you spend your money, it's being

41:44

eaten by these sectors that have slow technology

41:46

growth and therefore

41:50

rapidly rising prices. By

41:52

the way, once again, if you want to fix this problem, what's

41:54

the way to fix this problem? You inject a lot more technology

41:56

into those three sectors, right? You

41:58

would want completely automated, AI-driven healthcare, you

42:01

would want AI education, every

42:03

kid having an AI tutor, teacher,

42:05

and you would want robots building houses. If

42:09

you wedged full modern technology into those

42:12

three sectors, you could crash prices, which

42:14

would also crash inflation and

42:16

would cause everybody to be far better off. Once again,

42:18

it's this thing where you think you don't want the

42:20

technology to change. You actually very, very, very much want

42:22

the technology to change. If we don't get the technology

42:24

to change, our politics for the next 30 years are

42:26

going to be so crazily vicious. Right. Because we're all

42:28

going to be fighting over the shrinking pie. And

42:31

we're just going to hate how we have to

42:33

live. So let me pause there. Do

42:35

you think the benefits of AI will

42:37

be so overwhelming that there's just no

42:39

way for politicians to hide the ball?

42:41

Or will there

42:44

be enough narrative and story and being

42:46

able to leverage the resentment that exists

42:49

right now to continue

42:51

to forestall that, continue to grow government,

42:53

keep it strong, keep it big? Yes,

42:56

let me give you a micro answer and a macro answer. So

42:58

the micro answer. So did you see the dock workers' strike that

43:00

just happened? Yeah,

43:02

so the dock workers just went on strike and

43:05

they demanded this huge raise, and they

43:07

demanded no more technology at the docks. They

43:11

have this dichotomy of an argument. They

43:13

say our jobs are like so

43:15

backbreaking and arduous and physically harmful to our

43:17

workers that like we need to be appreciated

43:19

a lot more. And we

43:21

want you to completely ban the introduction of

43:23

automation that would basically automate those jobs

43:25

so that our workers don't have to do them. Right. And

43:27

they kind of make both sides of this argument like at

43:29

the same time, because they're completely contradictory. But

43:32

that's not their responsibility to resolve it. But

43:34

the dock workers go on strike. They

43:37

were literally asking for no more new technology at the docks

43:40

to preserve the jobs. It

43:42

turned out through that I discovered I just had never

43:44

looked at that industry before. It turns

43:46

out there are 25,000 dock workers in the US, except

43:49

that's not right. There's actually 50,000 dock workers in

43:51

the US. There's 25,000 dock workers who actually

43:53

work on the docks, and then there's 25,000 dock

43:55

workers who don't work, who just sit

43:57

at home and collect paychecks because of prior union

43:59

agreements banning automation. What? Yes.

44:02

Whoa. Yes. Because

44:04

in previous bargaining rounds, they

44:07

cut deals where if there were

44:09

introduction of like, for example, machines

44:11

to unload containers from ships, that

44:13

those jobs would not go away.

44:16

And so those jobs have not gone away. There's

44:19

nothing for the- That's crazy. That is

44:21

malpractice. Well, so this is the thing. So

44:23

this is the thing. Okay, so this is the classic thing

44:25

on all these things. Is that good or bad? Well, it

44:27

depends who you are. There's a political science

44:30

concept here, concentrated benefits and diffuse harms. And

44:32

so for those 50,000 dock workers,

44:35

this is great. For the

44:37

rest of us, it just makes everything we buy more expensive,

44:40

right? Because it makes working the docks

44:42

more expensive, right? 'Cause it's got all this

44:44

deadweight loss on, you know, on

44:46

ships, which is a big part of the cost

44:48

of like, all the food we buy is more

44:50

expensive as a consequence of these kinds of arrangements.
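The concentrated-benefits, diffuse-harms asymmetry is easy to see with rough numbers. A back-of-the-envelope sketch (the 50,000 workers and the nickel per trip come from the conversation; the population and trip frequency are illustrative assumptions):

```python
# Diffuse harm: a tiny surcharge spread across everyone.
# Concentrated benefit: the same total, split among few.
workers = 50_000            # dock workers (from the conversation)
shoppers = 330_000_000      # rough US population (assumption)
trips_per_year = 100        # supermarket trips per person (assumption)
cost_per_trip = 0.05        # extra nickel per trip (from the conversation)

total = shoppers * trips_per_year * cost_per_trip
print(f"total diffuse cost: ${total:,.0f} per year")
print(f"per organized worker: ${total / workers:,.0f} per year")
```

Under these assumed numbers, each shopper loses about $5 a year and barely notices, while each organized worker gains tens of thousands of dollars a year, which is why the small group lobbies and the large group doesn't.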

44:52

But you know, you and I pay another, you

44:55

know, five cents every time we go to the

44:57

supermarket as a consequence of this versus the 50,000

44:59

people who are organized in a union, right, and

45:01

are able to negotiate on their behalf, right? So,

45:03

right: concentrated benefits to

45:05

the dock workers, diffuse harms to the rest of

45:07

the economy. And every time you get a special

45:09

interest group in the economy pleading for, you know,

45:11

this kind of employment protection, that's

45:13

what's happening, right? They're basically trying to create a

45:16

cartel, an employment cartel that benefits the people in

45:18

the cartel at the expense of everybody else. So

45:21

here's the macro version of that:

45:24

30% of the jobs in the United States today require

45:26

some form of occupational licensing. You

45:29

can't just get the job. You have to

45:31

have some form of certification that you're qualified

45:33

for the job. This

45:35

has been pushed to extraordinary lengths. In the United

45:38

States, in California, you need, I think it's now,

45:40

like 900-plus hours of

45:42

professional training to be a hairdresser, right?

45:46

Yes, correct. You

45:48

need, what? Yes, you cannot just

45:50

like start cutting people's hair for money. No,

45:53

no, no, no, no, no, no, no, that's illegal. You

45:56

need to have a, whatever, a cosmetology certificate. To

45:58

get the cosmetology certificate, you have to go to

46:00

hairdressing school. To do that, by the way, you have

46:02

to get admitted to hairdressing school. It has to be

46:04

a certified hairdressing school. By the way,

46:07

guess who controls how many hairdressing schools there can be? Oh,

46:10

this is my favorite part. Let me give you my favorite example of

46:12

this. So the university system. So federal

46:14

student loans, there's federal student loans for you

46:16

to go to college. Normal

46:20

people, you can't afford to go to college if you can't get federal student

46:22

loans. So you can't be a college

46:24

or university in the US without having access

46:26

to the federal student loan program. It's not possible. But

46:30

to be a college or university that is able to give

46:32

out federal student loans, it has to be accredited.

46:35

Guess who accredits colleges and

46:37

universities? The existing colleges

46:40

and universities. Yeah, saw that

46:42

one coming. Guess how many new colleges

46:44

and universities they're accrediting? Like, none?

46:47

Yeah, yes. Zero, right? And

46:49

so 30% of jobs in the

46:51

country right now require some form of license or accreditation. By

46:54

the way, this is all doctors. And

46:56

by the way, I think that's good. You probably want doctors to be accredited.

47:00

But it's also nurses, nurse practitioners, and

47:03

then it's not just lawyers, it's also paralegals. And

47:06

then it's all kind of general

47:08

contractors. It's like, and then on

47:11

and on and on, including, depending on which state you're

47:13

in, including hairdressers and many other jobs where you would

47:15

not think this is required. By the way, or another

47:17

version of this is teacher. To be

47:19

a teacher in a lot of places in the US now, you need an education

47:21

degree, right? Is there any evidence

47:23

that teachers with an education degree are better teachers

47:25

than teachers without an education degree? I don't think

47:27

so. By the way, the education schools are completely

47:29

bananas crazy. They're

47:33

the most crazy of the academic departments at these

47:35

crazy universities, right? But again, it's a

47:38

cartel structure. Of course, K through 12 education is

47:40

not just a cartel, it's a government monopoly, right?

47:42

So you have to get actually hired into the, well,

47:46

actually, this is the other great part. You

47:48

have higher ed is like this, K

47:51

through 12 is like this, and there's other branches of

47:53

the federal workforce and state workforce that are like this.

47:56

Or actually police, police are like this. You

47:59

have quite a few people in the economy today

48:01

who both are government employees, they have civil

48:03

service protections because they're government

48:06

employees, which means in practice they can't be fired. But

48:09

they're also members of what are called public sector unions. Right?

48:13

So they both have to get hired by the government

48:15

with whatever criteria they set, and they have to get

48:18

admitted into the public sector union, and they have the

48:20

employment protections of both, right, of

48:22

both the civil service and the public sector

48:24

unions. And this is why, by

48:26

the way, you can't fire them. Bad teachers can't get fired,

48:28

right, because you hit all these things. So

48:31

just so the point of that, the point of that

48:33

is AI cannot change that quickly in

48:35

this system. AI

48:37

cannot become a lawyer. It's

48:40

not legally allowed to. It can't become a

48:42

doctor. It can't replace the dock worker. It

48:44

can't cut your hair. It can't build your

48:46

house. It's not legally allowed

48:48

to. Right. And so a very

48:51

large, it goes actually to the Gulliver thing,

48:53

a very large percentage of the economy as

48:55

we experience it literally cannot be automated. It's

48:57

illegal to do so. And so

48:59

I actually think what's going to

49:01

happen is the economic impact of AI is actually

49:03

going to be very muted compared to what people

49:05

are hoping or fearing,

49:08

because it's literally not legal to do that.

49:11

It's crazy. So if everything that you just walked

49:14

us through is true in terms of when you have

49:16

high growth, everybody's feeling good, more

49:18

technology equals more growth. AI is poised

49:20

to bring that growth, but you have

49:22

this trepidation. And so people will not,

49:25

it's not just that, but you have

49:27

trepidation around it. So the

49:29

fact that the government tends towards this

49:32

justified existence, create a new regulatory body,

49:34

slow things down, everything just grinds to

49:36

a halt for people that don't know

49:39

the story of Gulliver's travels. You have

49:41

this guy that encounters these tiny little

49:43

Lilliputians. And despite him being, you

49:45

know, whatever a thousand times bigger than they are, they

49:47

just end up tying him down with all these tiny

49:49

little strings. And it's an analogy

49:52

that Elon certainly has used a lot. What

49:54

do you think about his idea of going

49:56

in and creating an efficiency

49:59

program inside the government to try

50:01

to free up some of these

50:03

strings so that the economy can

50:05

get going again. Yeah,

50:07

that's right. So I'll give you a couple of books

50:09

if people want to read about this. So one is

50:11

the Supreme Court Justice Neil Gorsuch just wrote a book.

50:14

I think it's called, I forget the name, it's like Over Ruled or

50:16

something like that. But he basically

50:18

lays out the data on the number of laws in

50:21

the country. And by the way, this is

50:23

another one of these WTF happened in 1971 things, which is

50:25

starting in the 1970s, the number of laws and regulations in

50:28

the US just took off like a rocket. Basically

50:30

what happens is the lawyers took over everything.

50:33

By the way, a big part of that is in politics,

50:35

basically almost everybody now who's in elected office is a lawyer.

50:39

Right. And so basically the lawyers just kind of swept

50:41

into control of everything. And so if you and also

50:43

Senator Mike Lee has also done a lot of work

50:45

on this, if you can just count the number of

50:47

laws, and then you can also count the number of

50:49

regulations, which if anything is even worse, because they're not

50:51

even laws. They're just like a bureaucrat

50:53

who's decided something, right. And the number

50:55

of regulations is just like skyrocketed. So

50:58

he goes through it in the book. And then

51:00

there's another book called Three Felonies a Day. And

51:02

it goes through in detail that technically odds are

51:04

you, I and every other American citizen are committing

51:06

at least three felonies every day. There

51:09

are so many. And we just don't know it. We don't

51:12

know it. We don't know it. And the reason is

51:14

because there are so many penalties, there are so many felonies

51:16

on the books, and the felony

51:19

laws are so sweeping in

51:22

what they cover. Now, most of those

51:24

never get detected or prosecuted. But if prosecutors want to

51:26

come at you, they can figure out ways to, this

51:28

is what people with lots of experience in legal systems

51:30

tell you, if the feds want to get you, they're

51:32

going to figure out a way to do it because

51:34

you're almost certainly tripping something. And

51:37

so yeah, so I completely agree with Elon on

51:39

the nature of the problem. And

51:43

again, this is sort of

51:45

this weird, it's this concentrated benefit, diffuse harm

51:47

thing, which is like each law or regulation

51:49

in isolation seems like a good idea. And

51:53

each law or regulation has somebody advocating for it, because

51:55

they're going to benefit from it. And

51:57

they, you know, and typically, there's like some level of

51:59

self interest. somebody's trying to get something for themselves, and

52:01

then they sort of have a cover story of consumer

52:03

benefit or something. And then they

52:06

get these things passed, right? And they operate in Washington,

52:08

and they're in the state house, and they get these

52:10

things passed. And each one of them

52:12

on its own is not a big deal, but you run

52:14

that process at scale over 60 years, and that's when

52:16

you end up with the Gulliver scenario, which is

52:19

you're just drowning in laws and regulations. And

52:21

again, I'd take back to what I said before,

52:23

that's why the prices of health care, education, and

52:26

housing have skyrocketed, is because that's where the laws

52:28

and regulations in the economy are concentrated. All right,

52:30

so let's talk about then the

52:32

next four years. So if

52:35

Elon were to find himself in that

52:37

position, do you

52:39

think that we could meaningfully

52:42

strip away red tape to the point

52:44

that those, that scenario you painted where

52:46

those three things we care about so

52:48

much, where the prices begin to crash,

52:50

or is that just unrealistic, full stop?

52:53

Is it unrealistic in four years? How

52:56

much can we do? So it

52:58

could be done for sure. There is actually a case

53:00

of it actually happening in the world right now, which

53:02

sitting here today looks very good, which is Argentina. And

53:06

so Javier Milei, who's the new president

53:08

of Argentina, has

53:10

passed, I don't know the exact details, but I

53:12

think his first big reform package, which was a

53:14

real fight for him to pass. Fundamentally, it

53:16

took

53:18

regulations out of, I think, 800 different sectors of

53:20

the Argentinian economy in

53:22

one package. And they have a follow-up package they're working

53:24

on, like the next 2,000

53:27

or something. So he's

53:29

trying to do exactly what you just described. He's trying to

53:31

just basically, he's just like, Milei

53:34

is a staunch libertarian,

53:37

anti-socialist, anti-communist. He

53:39

has this great line, which he used the other day, which

53:41

I love so much. So

53:43

Margaret Thatcher had the famous line about socialism, which is

53:45

she said, the thing about spending other

53:47

people's money is eventually you run out. Milei has

53:50

a better term, which he says, anybody

53:52

can be a prostitute with other people's

53:54

asses. That

53:59

guy is a gangster. Hilarious. Which is freaking amazing.

54:01

Anyway, so yeah, no, so he's trying to strip

54:03

as much regulation out as possible and the thesis

54:05

of it is precisely this. It's like, okay, you

54:08

strip out regulation, you remove government control, you liberate

54:10

the people. You liberate the people

54:12

to be able to exchange, you know, to go into

54:14

voluntary trade and exchange, to be able to actually conduct

54:16

business with each other without the government interfering with it

54:18

all the time. And then as a consequence, you get

54:20

like far higher rates of economic growth, far higher rates

54:22

of prosperity, but you know, and so it's, this is

54:24

a big experiment. And of course, Argentina has been a

54:26

case study for a hundred years of doing this the

54:29

wrong way. And he's now administering a form of

54:31

shock therapy to basically see if he could do

54:33

it the right way. And by the way, sitting

54:35

here today, you know, in very short order inflation

54:37

in Argentina has, you know, they've had a persistent

54:39

inflation problem for a very long time. He's completely

54:41

nuked inflation and economic growth has kicked

54:44

in and job growth has kicked in. Now

54:46

he is fighting like, you know, he has enemies, right?

54:48

He is fighting like crazy, both in the

54:51

political system and riots in the streets, you

54:53

know, from people who are trying to stop this.

54:55

And so anyway, so that goes to the, to,

54:57

you know, to our situation, which is yes, the

54:59

theory is totally sound, right? Like everything

55:02

that Elon is describing should absolutely happen. You

55:05

know, this should absolutely be done. By the way, I

55:07

think basically everybody knows this should be done. Like, again,

55:10

concentrated benefits, diffuse harms, even people who benefit from

55:12

some aspect of this are suffering from it in every

55:14

other area of their lives. Right. And

55:16

so this is what Mille always points out

55:18

is the system in aggregate is making everybody poorer. Like

55:21

it is leading to all these like bad,

55:23

as you said, it's leading, for example, to

55:25

intergenerational conflict that's just like unnecessary and very

55:27

destructive. And so it's just like, let's

55:29

just stop this form of self harm. But to do that,

55:31

the reason I say this is, every single

55:33

regulation has somebody behind it who doesn't want it

55:35

to go away. Right. Because

55:38

it benefits somebody, right? It benefits

55:40

the, you know, the dock workers who are sitting at

55:42

home, right? It benefits somebody, right? It's

55:44

all the, all the little cartels and monopolies and

55:46

oligopolies and little conspiracies in the economy. Like, you

55:48

know, they are in business

55:50

because they're protected by the government. And when

55:52

you strip these regulations away, you expose them

55:54

to competition, and they really don't like that.

55:57

And so there will be a backlash

55:59

from the system, from all

56:01

of the, you know, special interest

56:03

groups, which in aggregate will rebel in

56:05

great numbers. And then, you know,

56:07

Look, the key fight ultimately

56:09

is the civil service itself, you

56:12

know, the actual government employees. Right. And

56:14

so, you know for example, you know

56:16

How about a reform where like there's actual performance

56:18

metrics for government employees and low performers get fired

56:21

Brother, please, if you want to get me in

56:23

a cult, start a cult about that. I'm

56:25

here for that. I'll wear

56:27

whatever crazy outfit. I am here

56:30

for that one. Yeah Yeah,

56:32

let me ask, going back to Milei,

56:34

are the layoffs causing any sort of economic

56:36

downturn because one criticism I've heard of Elon is

56:38

hey if you come in and you do this

56:40

and you slash it not only is it cruel

56:42

But you're gonna tank the economy. You're gonna have

56:44

so many people without a job Yeah,

56:47

yeah, so that so yeah, so this this happens and by

56:49

the way this happened actually in in the in the late

56:51

70s early 80s There was actually a version of

56:53

this which is inflation in the US actually got completely out of control

56:57

And you know, everything was kind of going

56:59

sideways, but inflation went crazy. I think inflation spiked at

57:01

like 15%, which was

57:06

super destructive, right? Like really ruinously

57:08

bad. Like

57:10

it destroys everything. It destroys savings, it destroys the ability

57:12

for businesses to plan. It just basically

57:15

damages everything. And the

57:17

way you crack the back of inflation is you

57:19

raise interest rates. And you deliberately cool the economy

57:21

in order to bring down the demand for money

57:23

and then inflation falls And

57:26

so Paul Volcker who was the chairman of the Federal Reserve

57:28

who was a famous guy. He's like the six-foot-eight

57:30

giant guy with the cigar. And

57:32

he was the head of the Federal Reserve, and he lived

57:34

in the undergraduate dorms at, I think,

57:36

Georgetown, and took the taxi to

57:38

work. So he was in contact with regular people

57:40

every day even though he was like the head

57:42

of the Federal Reserve in his three-piece

57:44

suit. And whenever he

57:46

testified to Congress, if you see the old photos, he's just

57:48

constantly got these dank clouds of cigar smoke around him all

57:51

the time. So he was one of these old-school

57:53

figures. And he raised interest rates in 1981, I think, to

57:55

20%. Whoa.

58:00

which basically crushed the economy. It basically crushed demand

58:02

in the economy. It meant that nobody could borrow money,

58:04

nobody could buy a house, nobody could start a business.

58:06

It was very devastating in that moment. But

58:09

he wrote a book about this and he said at no point,

58:11

when he would walk down the street, and people would recognize him,

58:13

this is in DC, and he'd be walking down the street or

58:15

he'd be in the cab, he said nobody

58:18

was ever mad at him because what

58:20

they said was inflation is so bad. We know that inflation is

58:22

bad. We know that you have to do what you're doing, that it takes

58:24

drastic steps to do it. We know if you do it,

58:26

you're gonna fix the inflation problem and things are gonna go back

58:28

to being good again. And so we

58:30

support you, stick with it. And so

58:32

he had the people on his side, and

58:35

Milei has the same thing in Argentina right now. He

58:37

has very high level of support from the population because

58:39

they've seen the other experiment for too long. They've

58:42

been through a society with too

58:44

much regulation, too much corruption and

58:46

too much inflation for a long time. And they're just

58:48

like, look, the people are behind him. You've

58:50

seen it in the polls and you see it in the voting. They're just

58:52

like, all right, we're gonna try plan B. And

58:55

so what you need is you need a politics of

58:57

plan B. You need a majority

58:59

of the population to basically say, look, whatever the

59:01

pros and cons of the old system were, they're

59:03

not working and we need fundamental

59:06

change. And then obviously you need leadership that's gonna

59:08

be willing to implement that. But if the people

59:10

are behind it, then

59:12

you can actually do that. And so the fact that it worked under

59:14

Volcker and

59:17

the fact that it's working under Milei is very promising.

59:19

Those are two great examples of how it can work. You

59:23

know, we don't yet have that, but we

59:25

could. Very,

59:28

very interesting. When

59:32

I start thinking about how we build back, we

59:34

get the economy going, we take off the Gulliver

59:36

strings. One of the things that I would wanna

59:38

see is, one

59:40

of the things I think we need to see is

59:43

a return to prizing freedom of

59:45

speech. Because if we can't

59:47

debate these ideas, if people can't get in there

59:49

and mix it up and say, okay, I think

59:51

this is the way, no, that's terrible. We should be

59:53

doing it this way. But nothing being verboten, like

59:55

actually being able to discuss these ideas, that feels

59:57

like a critical need. What's

1:00:00

your take, especially coming off the heels

1:00:02

of talking so much about AI, what's

1:00:05

your take on censorship? Where

1:00:08

are we culturally and what's AI's

1:00:10

role gonna be in either breaking

1:00:12

us free from censorship or using

1:00:14

that to really tighten down? Yep,

1:00:17

yep. So I should start with I am classic Gen

1:00:19

X, I am 100% pro free speech. Make

1:00:22

that two of us. I am 100% pro free speech. By

1:00:25

the way, the first, you may know this, the First Amendment guarantees

1:00:28

that the government, at least in theory, is not supposed to censor

1:00:30

us, although that's been happening a bit

1:00:32

lately. Just a smidge. Just

1:00:35

a smidge. But there's also the case

1:00:37

law around the First Amendment that actually defines illegal

1:00:39

speech. And there are a bunch of forms of

1:00:41

illegal speech and it's things like child porn and

1:00:43

it's incitement to violence, it's terrorist recruitment, right?

1:00:45

And so there's actually like carve outs for that

1:00:48

stuff. And so like

1:00:50

my philosophy is basically US law is

1:00:52

actually very good on this. And

1:00:54

US law isn't just US law;

1:00:57

this has been litigated culturally in the US as

1:01:00

well as legally for 250 years, going back to the Bill

1:01:02

of Rights. We and our

1:01:04

predecessors in the US went through a long process

1:01:06

to get to where the First Amendment is. I

1:01:09

think it therefore represents more than just a law. I think

1:01:11

it's also a statement of culture and

1:01:14

a statement of values. And

1:01:16

I've always been an advocate that like the code for

1:01:19

internet freedom of speech should basically be that, it should

1:01:21

be the First Amendment with only limited carve outs for

1:01:23

things that are truly dangerous, truly

1:01:25

destructive. Like I don't want terrorist

1:01:28

recruitment any more than anybody else, but

1:01:30

like, should people be able to talk about their politics online without

1:01:32

getting censored 100%, right? Full

1:01:35

range of expression, 100%, of course. Like

1:01:37

it's the American way, of course. And

1:01:40

so I'm 100% on that. You

1:01:42

know, probably as much as I do about the

1:01:44

last decade, which I've seen up close, which is

1:01:46

generally things went very bad. The

1:01:49

internet companies ran into a variety

1:01:51

of externally imposed and self-inflicted situations

1:01:54

where there ended up being a pervasive censorship machine

1:01:57

for a long time. The most dramatic change of

1:01:59

that is Twitter before and after Elon buying it.

1:02:02

By the way, we're a proud member of the syndicate that bought it with

1:02:04

Elon. I'm completely thrilled by what he's

1:02:06

done there. Thank you for your service,

1:02:08

by the way. To me, it's

1:02:10

just so much better. I just can't believe

1:02:13

that that was controversial. It's crazy.

1:02:15

Yeah. As you know, it was a big

1:02:17

change. It was an absolutely dramatic change. We're

1:02:19

also, by the way, the

1:02:21

main outside investor in Substack, which

1:02:24

I think has also done a spectacular job at

1:02:26

navigating through this and basically has come out the

1:02:28

other side of ... They're a small company, so

1:02:30

when the pressure gets brought to bear on a

1:02:32

small company, it can really have an impact. But

1:02:34

the team there has, I think, done a fantastic

1:02:36

job navigating to a real freedom of speech position.

1:02:38

And as a consequence, Substack now has the full

1:02:40

range of views on all kinds of topics in

1:02:43

a really good way. So the good news is we have

1:02:45

two case studies where this has gone really well. The

1:02:48

other ones are more difficult. Here's

1:02:51

what I would say is I think the internet

1:02:53

social media censorship wars were the preamble to the

1:02:55

AI censorship wars. I

1:02:57

think the AI censorship wars are going to be a

1:02:59

thousand times more intense and a thousand times more important.

1:03:02

Yes, 100%. And

1:03:05

the reason for that is the internet

1:03:07

social media is important because it's

1:03:09

what we all say to each other, but AI is going

1:03:11

to be, I think, the software layer that controls everything. It's

1:03:15

going to be the software layer that basically tells us everything.

1:03:17

It's going to be the software layer that teaches our kids.

1:03:19

It's going to be the software layer that we talk to every day.

1:03:23

And as I think you know,

1:03:25

there's already AI censorship. A lot of these

1:03:28

LLMs are very

1:03:31

slanted. And

1:03:34

by the way, it's very easy to see because you

1:03:36

can go on them today and you just ask them

1:03:38

two questions about two opposing political candidates, and they give

1:03:40

you completely different answers. For one candidate, they're like, I'd be happy

1:03:42

to tell you all about his positions. And the other

1:03:44

candidate, they're like, oh, he's a hate figure. I won't

1:03:46

talk about him. And it's like, wait a minute. Right.

1:03:48

Like half the country's voting for one, half the country's

1:03:50

voting for the other. Who are you

1:03:52

as an AI company to basically censor like that? And

1:03:55

so look, the

1:03:59

AI censorship conflict is already underway. The

1:04:01

information war around AI is

1:04:03

already underway. And by the way, the

1:04:05

same people who were pushing so hard for social media

1:04:07

censorship have now shifted their focus to AI censorship. By

1:04:10

the way, a lot of the actual censors themselves who used

1:04:12

to work at companies like Twitter now work for the

1:04:14

AI companies. So there's been

1:04:16

like a direct, you know, just, you know, lessons learned

1:04:18

and now applying it at a larger scale. And so

1:04:20

I think that, yeah, no, look, I think it's going

1:04:22

to be a giant fight. I think it's just starting.

1:04:24

I think it's, you know, maybe the most important, I

1:04:27

think it's maybe the most important political fight of the next

1:04:29

30 years. Tell me why. Well,

1:04:32

because everything is downstream. Everything

1:04:34

is downstream from being able to discuss and argue and

1:04:37

be able to, you know, be able to communicate. And

1:04:40

so if you can't have,

1:04:42

if you cannot have open discussions about important topics, you

1:04:44

can't get good answers. Let

1:04:48

me give you an angle on this.

1:04:50

I'm, I am pretty sure we will

1:04:52

agree about this. The thing about AI

1:04:54

censorship that scares me isn't just the,

1:04:57

that person is a bad person. And so I'm

1:04:59

not going to tell you about them. It

1:05:02

is that you can control the

1:05:04

entire world through framing just

1:05:07

how you frame something

1:05:10

and everything has a frame. And

1:05:14

when you have humans with

1:05:16

a desire to convert the, or

1:05:20

indoctrinate rather than

1:05:23

seek truth, then now

1:05:26

the only thing I can guarantee is, okay,

1:05:28

the, the AI is responding to me from

1:05:30

within a frame. They are using that to

1:05:33

nudge my thinking in a direction. And it

1:05:35

becomes a form of mind control. And,

1:05:38

dear listener,

1:05:40

if you've ever seen an incredible debater, I

1:05:42

promise you what you love about them is

1:05:44

they can reject the frame and then put

1:05:47

their own frame on it. And now they're

1:05:49

arguing from a position of power. Most

1:05:51

people can't do it. Most people don't even realize somebody just put

1:05:53

them in a frame and they

1:05:56

don't realize how constraining that frame is. And

1:05:59

that's what really freaks me out is everything else

1:06:01

felt more like it was out in the open.

1:06:03

Like even when it was still Twitter and Twitter

1:06:05

was being censored like crazy, everybody was like, bro,

1:06:07

this is so obvious. Like, look, you post about

1:06:10

this, poof gone. I post about this, it's gonna

1:06:12

explode. So when the Twitter files came out, I

1:06:14

don't think anybody was like, wait, what? Everyone

1:06:17

was like, yeah, that's exactly how it felt.

1:06:20

This will be a game of frames,

1:06:23

and it really does come down to, it's hard

1:06:26

for humans to determine what is true. We

1:06:28

were talking earlier about why is technology stalled

1:06:30

out? The reason technology stalled out in my

1:06:32

humble opinion is physics broke somewhere around, call

1:06:34

it 50, 60 years ago. It

1:06:37

just got hung up and we haven't

1:06:40

been decoding the real world, that's truth.

1:06:42

Now, once you're able to make contact

1:06:44

with that ground level truth, new

1:06:46

things are open to you. And

1:06:49

so that's my big concern with

1:06:51

AI is that we

1:06:53

will not be getting informed by

1:06:55

what is making contact with ground

1:06:57

truth. We're gonna be having the

1:07:00

frame set and we're gonna be

1:07:02

taught as kids, as adults, as

1:07:04

everybody based on the frame that

1:07:06

matches somebody's ideology. And that scares

1:07:08

the life out of me. Yeah,

1:07:11

it should, I agree with that. Of

1:07:14

all the radical things that Elon is doing, maybe the most

1:07:16

radical is that he's declared that his goal, and I would

1:07:18

say we're investors in it with him, but

1:07:20

his goal for xAI is what he calls

1:07:22

maximally truth seeking. And

1:07:25

if you've listened to him on this, what you know is he actually

1:07:27

means two different things by that. I mean,

1:07:29

they're the same thing ultimately, but two different angles. One

1:07:32

is maximum truth seeking in terms of actually understanding

1:07:34

the universe. And so to your point, actually learning

1:07:36

more about physics. But he also

1:07:38

means maximally truth seeking in terms of social

1:07:40

and political affairs. And

1:07:43

so being able to actually speak openly about, having

1:07:45

AI actually be fair and truth seeking when it comes

1:07:47

to politics. And of course, that's, you know, that

1:07:52

is possibly the most radical thing anybody could do, is build

1:07:54

a legitimately truth seeking AI. And

1:07:57

at least he has declared the determination to do that.

1:08:00

So, yeah, there's a version of the world where he

1:08:02

succeeds and that becomes

1:08:04

the new benchmark. And by the way,

1:08:06

open source AI plays a big role here because

1:08:08

people can field open source AI to do this without

1:08:11

permission. And so

1:08:13

there's a version of the world where AI becomes

1:08:16

an ally in trying to understand ground truth and

1:08:18

trying to enable all the actual discussions and debates

1:08:20

that need to happen. And then there's a version

1:08:22

of the world in which, yeah, it's Orwellian

1:08:25

thought control. My line on it is

1:08:27

1984: the novel 1984 was not written as

1:08:29

an instruction manual, right? Like that

1:08:32

was not the goal, right? It

1:08:35

was supposed to be a dystopian future that we were trying

1:08:37

to avoid. And so the idea that the machines are telling

1:08:39

us what to think and that they're slanted and biased by the

1:08:41

people who build them, yeah, I find

1:08:43

it to be completely unacceptable. But there is

1:08:45

a, I mean, look, we have that today. Most of

1:08:47

the AI's in the world today are like that. And

1:08:50

there is a very big danger of that. And by the way, those companies,

1:08:53

the

1:08:56

people who are the most upset about internet freedom of

1:08:58

speech, I think,

1:09:00

justifiably aim a lot of criticism at the

1:09:02

companies. And I think that is valid in many

1:09:04

cases. But I would just

1:09:07

also tell you, these companies are under intense pressure. And

1:09:10

there's tons of activists that are very

1:09:12

powerful, that are basically bearing down

1:09:14

on these companies all the time, but then also the government

1:09:16

directly. And one of the

1:09:18

things that has really kicked in in the last 10

1:09:20

years is governments both here and in Europe and other

1:09:22

places, basically seeking to censor and

1:09:24

control, even in ways

1:09:27

that I think are just like obviously illegal by their

1:09:29

own laws. And that

1:09:31

pressure remains very strong. And

1:09:34

I think if anything, that pressure probably is gonna

1:09:36

intensify. And so this for

1:09:38

me is in the category of, yes, these are

1:09:40

the right concerns. And then ultimately, this is a

1:09:42

democratic, a lowercase D democratic

1:09:44

question, which is, do

1:09:47

people care? And are people gonna be willing to

1:09:49

stand up for this? And I think that's what's required. That's

1:09:52

it for part one, everybody. But the conversation

1:09:54

with Marc Andreessen is definitely not over. Part

1:09:56

two is packed with even more mind blowing

1:09:58

insights. So be sure to come back tomorrow.

1:10:00

I will see you then. Stop

1:10:06

struggling with a lack of focus and energy

1:10:09

while trying to reach your peak performance. Take

1:10:11

it from me, if you want

1:10:14

to reach another level, you need

1:10:16

to hit your body with all

1:10:18

the nutrients it needs every single

1:10:21

day to really maximize your performance.

1:10:23

And there's no better way to

1:10:25

make sure that you get all

1:10:28

the micronutrients that you need than

1:10:30

with AG1. AG1 is a foundational

1:10:32

nutritional supplement that truly supports your

1:10:35

body's universal needs like gut optimization,

1:10:37

stress management, and immune support. If

1:10:39

you're a long-time listener of the

1:10:42

show, you know that I rarely

1:10:44

use supplements with the exception of

1:10:46

vitamin D3 and AG1. Just one

1:10:48

scoop of AG1 supports your whole

1:10:51

body with 75 high-quality vitamins, minerals,

1:10:53

and whole food source nutrients to

1:10:55

support optimal health of your brain,

1:10:58

body, and gut. If you're looking

1:11:00

for a simple, effective investment in

1:11:02

your health, try AG1 and get

1:11:05

five free AG1 travel packs and

1:11:07

a free one-year supply of vitamin

1:11:09

D with your first purchase. Just

1:11:12

go to www.drinkag1.com/impact. That's www.drinkag1.com slash

1:11:15

impact. Give it a try. What's

1:11:18

up, guys? Tom Bilyeu here. I

1:11:20

am beyond excited to tell you

1:11:22

about my new podcast, Tom Bilyeu's

1:11:24

Mindset Playbook. We are opening

1:11:26

the vault to share some of the

1:11:28

most impactful Impact Theory episodes of all

1:11:30

time. These are the very conversations that

1:11:32

took me from one subscriber to where

1:11:34

I'm at today. This is

1:11:37

the show where you'll find my personal

1:11:39

deep dives into mindset, business, and

1:11:41

health topics. You're going to hear from

1:11:43

legends like Andrew Huberman, Ray Dalio, Mel

1:11:45

Robbins, and Dave Asprey all sharing insights

1:11:47

that are going to help you achieve

1:11:49

the goals that you've set out to

1:11:51

achieve. If you're serious about

1:11:53

leveling up your life, I urge you to

1:11:55

go in, binge listen to these episodes so

1:11:58

that you can get the skill set that

1:12:00

you need. to hit whatever level of success

1:12:02

you're pursuing. Go check it

1:12:04

out. Tune in to Tom Bilyeu's Mindset

1:12:06

Playbook. Trust me, your future self will

1:12:08

thank you.
