AI CEO Alexandr Wang

Released Tuesday, 18th February 2025

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

Calling all Call of Duty

0:02

fans, the iconic map Verdansk

0:04

returns to Call of Duty

0:07

Warzone. Starting April 3rd,

0:09

you'll be able to drop

0:11

back into Verdansk. Experience all

0:14

the chaos and relive the

0:16

thrill you've been missing. Not

0:18

only will you get the

0:21

classic Battle Royale experience we

0:23

all know and love, but

0:26

Verdansk is back with gameplay

0:28

updates and the return of

0:30

Verdansk era weaponry. That's right.

0:33

You'll experience Verdansk like never

0:35

before. Smoother movement, stunning visuals,

0:37

and new mechanics. Whether you're

0:40

dropping in solo or teaming

0:42

up with your squad, it's

0:44

time to come home to

0:47

Verdansk. So download Call of

0:49

Duty Warzone for free and

0:52

drop into Verdansk on

0:54

April 3rd. Rated M

0:56

for Mature. I want to

0:59

let you know we've restocked

1:01

all your favorites on the

1:03

merch site. Everything is in

1:06

stock. You can shop at

1:08

theovonstore.com and thank

1:10

you so much for the

1:13

support I have some new

1:15

tour dates to tell you

1:18

about I'll be in Chicago

1:20

Illinois on April 24th at

1:22

the Wintrust Arena, Fort

1:25

Wayne Indiana on April 26th

1:27

at the Allen County War

1:29

Memorial Coliseum and Miami Florida

1:32

on May 10th at the

1:34

Kaseya Center. Get your tickets

1:36

early starting Wednesday, February 19th

1:39

at 10 a.m. with pre-sale

1:41

code, Rat King. We also

1:44

have tickets remaining in East

1:46

Lansing, Victoria, B.C. in the

1:48

Canada, College Station, Belton, Texas,

1:51

Oxford, Mississippi, Tuscaloosa, Alabama, Nashville,

1:53

Tennessee, Winnipeg, in the Canada

1:55

and Calgary, in the Canada.

1:58

All tickets at theovon.com/T-O-U-R. Today's

2:00

guest is from Los Alamos,

2:02

New Mexico. He's a leader

2:04

in the world of business

2:07

and technology. He's an entrepreneur.

2:09

He started Scale AI. A

2:11

company recently valued at

2:13

$14 billion. He started

2:16

it when he was only 19.

2:18

And at 24, he became the

2:20

youngest self-made billionaire in the

2:23

world. We talk about his company,

2:25

the future of AI, and the

2:27

role it plays in our human

2:29

existence. This was super educational for me.

2:31

I hope it is for you.

2:33

I'm grateful for his time. Today's

2:36

guest is Mr. Alexandr Wang.

2:54

Alexandr Wang, man, thanks

2:56

for hanging out, bro. Yeah, thanks

2:58

for having me. Yeah, good to

3:00

see you, dude. Last time was at

3:02

the inauguration. Yeah, what'd you think of

3:05

that? Like, what were your thoughts after

3:07

you left? Because you and I ended

3:09

up, we like left out of there

3:12

and then we got lunch together, which

3:14

was kind of crazy. It was, there

3:16

was a, were you there the whole

3:19

weekend? No, I just got there, the

3:21

day, the morning of the inauguration, the

3:23

new administration is really excited about it

3:25

and wants to make sure that we

3:28

get it right as a country. So that was

3:30

all great, but it was kind of a crazy,

3:32

the whole like event, like everything was pretty

3:34

crazy. I don't know, what'd you think? I

3:36

mean, when I saw Conor McGregor show up,

3:38

that's when I was like, shit is, where

3:40

are we? It felt like the boost bizarre.

3:43

I mean, you were there Sam Altman was

3:45

there. I was just like, it was like,

3:47

what happened here? And part of that's because

3:49

Alex Bruesewitz, you know, totally, yeah. I mean,

3:51

it's all because of him that we all

3:54

went. But just the fact that he would

3:56

bring these people together, it kind of makes

3:58

you question. Crossover, it's like a. It's like

4:00

in those like TV shows when you

4:02

have those crossover episodes. Yeah. Oh yeah,

4:05

when like the Harlem Globetrotters and

4:07

it would be like them versus like

4:09

the care bears or whatever would show

4:11

up. Exactly. Yeah, this seems. Yeah, some

4:13

cross-pollination. Yeah, what'd you think when Conor

4:16

showed up? Was that strange to see

4:18

him? Like was there somebody you thought

4:20

that was strange, like unique to see

4:22

there for you? Well. Yeah, I mean

4:24

you, Conor, like the Pauls, I don't

4:27

know, the whole, the whole thing was

4:29

pretty crazy. I see Sam all the

4:31

time and I see, I see some

4:33

of the tech people all the time.

4:35

I mean, it was funny crossover and

4:38

it was obviously like, so many people

4:40

flew in to be able to go

4:42

to like the outdoor inauguration, right? And

4:44

so, I mean, there were so many

4:46

people in the city, we ran into

4:49

them obviously, but like there were so

4:51

many people in DC, in DC, who

4:53

were just around for the inauguration for

4:55

the. Yeah I didn't even think about

4:57

that all there was a couple hundred

5:00

thousand people probably that it's got kind

5:02

of displaced in a way and suddenly

5:04

there's in bars or whatever just like

5:06

buying like there was like people I

5:08

saw walking just adult onesies and shit

5:11

that said trump or on the best

5:13

crazy shit yeah um Yeah, thanks for

5:15

coming in man. I'm excited to learn

5:17

about AI. I know that that's a

5:19

world that you're in. You're in the

5:22

tech universe. And you're from New Mexico,

5:24

right? Yeah. And so when you grew

5:26

up there, did you have like a

5:28

strong sense of like science and technology?

5:30

Was that like part of your world?

5:33

Like did your parents like lead you

5:35

into that? Like what was just some

5:37

of your youth like there? Did you

5:39

watch Oppenheimer? Yeah. Yeah. So the so

5:41

all the New Mexico shots in Oppenheimer,

5:44

that's exactly where I grew up. Oh,

5:46

damn. So it was like originally where

5:48

all the scientists came together and and

5:50

did all the work in the atomic

5:52

bomb. And there's still this huge lab.

5:55

That's basically the everybody I knew effectively

5:57

like their parents worked at the lab

5:59

or were some affiliate with the lab.

6:01

like Nookville over there, huh? Nookville, yeah.

6:03

Is it scary like that? It was,

6:06

you know, at a level of mystery,

6:08

like is your prom like last night

6:10

under the stars or something like? There's

6:12

this one museum, well the funny thing

6:14

is like, first you hear, you hear

6:17

this, you learn the story of. the

6:19

atomic bomb and the Manhattan project, like

6:21

basically every year growing up, because there's

6:23

always like a day or a few

6:26

days where, you know, there's a substitute

6:28

teacher and they just, they just play

6:30

the videos. So you're just like, yeah,

6:32

an alcoholic or something. Yeah, yeah. And

6:34

recreational use or use that term. That's

6:37

crazy that that just like every year

6:39

you guys have like, yeah, just like

6:41

blast it Thursdays or whatever. And it's

6:43

just, yeah, you're just learning, you're learning

6:45

again about the Manhattan project. there's a

6:48

little there's a little museum in town

6:50

which is like you walk through and

6:52

there's like a life-size replica of the

6:54

nukes so it's it's pretty wild yeah

6:56

and where did they drop the atomic

6:59

bombs on they dropped them on Asia

7:01

right Hiroshima they yeah they dropped them

7:03

on in Japan yeah Hiroshima Nagasaki is

7:05

it crazy being like semi-Asian, part

7:07

Asian Yeah, my parents are Chinese. Oh

7:10

nice man. Is it crazy being Asian

7:12

and then having that happen to Asian

7:14

people with a bomb like? Is that

7:16

a weird thing there or it's nothing?

7:18

No I don't think so. I mean

7:21

I think the thing is like, you

7:23

know, so there weren't, I didn't grow

7:25

up with very many Asians, because in

7:27

that town it was, you know, it's

7:29

in New Mexico, there's very few Asians

7:32

in New Mexico. So. I was one

7:34

of the only Asian kids in my

7:36

class growing up and so I didn't

7:38

think that much about it honestly. But

7:40

then like, but it is super weird,

7:43

you know, you grow up and you

7:45

learn about this very advanced technology that

7:47

had this like really, really big impact

7:49

on the world. And I think that

7:51

shaped. Yeah, it's like the scientific Jon

7:54

Jones over there really. He's New Mexican,

7:56

isn't he? Lives in Albuquerque, yeah. He

7:58

does? I think he does, yeah. Oh,

8:00

sweet man. Yeah, yeah. Oh, yeah, so

8:02

you, so you're there, there's this, there's

8:05

this, there's this. Energy always there of

8:07

this creation, so probably the possibility of

8:09

creation maybe was always in the air.

8:11

I'm just wondering like how did you

8:13

get formed kind of, you know, like

8:16

what's your origin story kind of. It

8:18

was super scientific because, you know, there

8:20

were all these, there were all these

8:22

presentations around what were the new kinds

8:24

of science that were going on at

8:27

the lab. So there's all these chemistry

8:29

experiments and these different like earth

8:31

science experiments and physics experiments. And

8:33

my mom studied like plasma. and

8:35

like how plasma you know worked

8:37

inside of stars and stuff like

8:40

that so it was just the

8:42

wildest stuff and you would talk

8:44

to people's parent people like I talked

8:46

to my classmates or I talked

8:48

to their parents about what they're working

8:50

on is always some crazy science

8:53

thing so that was wow that was

8:55

really cool because everybody in that town

8:57

basically is they're all working on

8:59

some kind of crazy scientific

9:01

thing and so you kind of I mean I

9:03

feel like I grew up like I grew up Feeling

9:06

like you know anything was possible in

9:08

that way like yeah because the rest

9:10

of us in other communities shitty communities

9:12

or whatever we're just making that volcano

9:15

or whatever you know what I'm saying

9:17

we're doing like grassroots level bullshit you

9:19

know dang that's got to be wild

9:21

so every you see somebody just sneak

9:23

down an alley and buying a bit

9:26

of uranium and shit like that in

9:28

your neighborhood that's got to be kind

9:30

of unique. I remember there's someone from

9:32

our town who did a science fair

9:34

project called um It's called like great

9:37

balls of plasma and for the science,

9:39

literally for the science fair, this was

9:41

in like high school, for the

9:43

science fair, they made like huge

9:45

balls of plasma in their garage. And

9:47

I was like, what the heck? Like

9:50

this is, we're all just doing this

9:52

in high school? Damn. So do you

9:54

feel competitive or do you just feel

9:56

like hyper capable? Like did you feel

9:58

like kind of advanced? just in your

10:01

studies like when you were young

10:03

like you were in class like are

10:05

some of this stuffs kind of

10:07

coming easy to me? I ended up

10:09

what what I did is there

10:11

was a I ended up getting really

10:14

competitive about math in particular and

10:16

so so my sport was math which

10:18

is kind of crazy that algebra lazy

10:20

son I know I feel you

10:22

and it was because when I was

10:25

in middle school when I was

10:27

in sixth grade there was this one

10:29

math competition where if you if

10:31

you got top four in the state,

10:33

in New Mexico, then you would get

10:36

an all expense paid trip to

10:38

Disney World. And I remember as a

10:40

sixth grader, like, that was the

10:42

most motivating thing I could possibly imagine

10:44

was like an all expense paid

10:46

trip to Disney World. Yeah. And then,

10:49

did you win it? I got I

10:51

want I got fourth place so

10:53

I snuck in there, snuck in, I

10:55

and then I went to and

10:57

then we went to Florida to Disney

11:00

World and I hadn't traveled like

11:02

I didn't travel around too much growing

11:04

up like I mostly was in

11:06

New Mexico and we did some road

11:08

trips around the southwest. So I remember

11:11

getting to Florida and it was

11:13

extremely humid. It was like I never

11:15

felt what humidity felt like when

11:17

I landed in Florida. I was like

11:19

oh this feels bad. And then

11:21

I... Yeah, it definitely does. Yeah, it's

11:24

funny because I grew up in humidity, dude.

11:26

Like you would, like, you try

11:28

to shake somebody's hand, you couldn't even

11:30

land it because it was just

11:32

not enough viscosity, just too much lube

11:35

in a handshake. You'd sit there

11:37

for 10 minutes trying to land a

11:39

handshake, you know? Everybody was always dripping,

11:41

you know, you'd get really humid

11:43

over there. So then I became, so

11:46

then I became a mathlete. That

11:48

was like a big part of my

11:50

identity was being a mathlete and

11:52

I would. And you have to wear

11:54

a vest, like what do you guys

11:57

do? Is there an out, is

11:59

there a uniform for that or? Well

12:01

you, you need a, you have

12:03

a calculator. Okay. Everyone had their favorite.

12:05

Okay, you got that Draco on

12:07

you, baby. I feel you keep strapped.

12:10

Yeah, you stay strapped at that thing.

12:12

And then, but outside of that,

12:14

not really. I mean, like, everyone of

12:16

their favorite, like, is pencil. You

12:18

know, you're pencil, your calculator, that was

12:21

your gear. And, but yeah, no,

12:23

I was, I was like a nationally

12:25

ranked mathlete, is there... Wow, that's crazy

12:27

dude. So you go to what

12:29

other competitions you go to with that?

12:32

There's competitions, yeah. There's like, there's

12:34

like state level competitions, there's national level

12:36

competitions, there's like these summer camps,

12:38

math camp, I went to math camp

12:40

a bunch of times. uh... where you

12:43

were you convene with like-minded mathletes

12:45

okay wow fucking wizards in the park

12:47

uh... couple dudes fucking fighting over

12:49

a common denominator in the lobby that's

12:51

crazy bro but then you're but

12:53

just like any other sport like you're

12:56

like it's competitive yeah you gotta like

12:58

you gotta win and uh... and

13:00

so so you're chummy with everyone but

13:02

you're also like Like who's going

13:04

to win, you know? Yeah. No, dude,

13:07

competition's amazing, man. That's one thing,

13:09

too, that's so nice. I think about

13:11

when you're young, is like, if you're

13:14

involved in a sport or a

13:16

group or whatever it was, just that

13:18

chance to be competitive, you know?

13:20

Yeah. What were, like, some of the

13:22

competitions at the math thing, like,

13:24

what's a math competition, like, when you

13:27

get there? But once you get to

13:29

high school, it's just, you just

13:31

take a test. And you just, it's

13:33

like, the biggest one in America

13:35

is called the USMO. Bring it up.

13:38

USMO? USAMO? USAMO. OK. And it's

13:40

like, people all around the country take

13:42

this test. And there's a- United States

13:44

of America Mathematical Olympiad. Yeah, there

13:46

you go. Okay, a participant must be

13:49

a US citizen or a legal

13:51

resident of the US. Okay, go on.

13:53

And then you, and then it's

13:55

a nine hour test. It's split over

13:57

two days, it's nine hours total,

14:00

it's four and a half

14:02

hours each day, and you

14:04

have six problems. So it's kind

14:06

of nerve racking, and you get in

14:08

there, you have four and a

14:10

half hours the first day, four and

14:13

a half hours the second day. Can

14:15

you cheat in between the days?

14:17

No, you because you only get three

14:19

questions the first day and then

14:21

you only get three questions the next

14:24

day and I remember the first

14:26

time I took it I was in

14:28

I think I was in eighth

14:30

grade first time I took it and

14:32

I like I was like so nervous

14:35

and I just I like brought

14:37

a thermos full of coffee and I

14:39

drank so much coffee that like

14:41

those four and a half hours felt

14:43

like like year it was like

14:45

I was so jittery the whole time

14:48

it was uh it was crazy damn

14:50

you out there rattling brother oh

14:52

integers make me rattle brother I feel

14:54

you know that dude so that's

14:56

pretty so so so you're competitive and

14:59

so you get out of there

15:01

and you're obviously I guess um admired

15:03

in the world of math probably or

15:05

that's like a thing that like

15:07

a, that's like a, that's a

15:10

feather in your cap so that helps

15:12

you get into MIT MIT right So

15:14

then, yeah, in MIT, on the

15:16

MIT admissions, they ask for your like

15:18

competitive math scores. So if you're, so,

15:21

so you knew a lot of

15:23

kids going there probably because it was

15:25

tons of kids. Yeah, yeah. Tons

15:27

of kids. It was like a reunion.

15:29

It was like math camp reunion.

15:31

Damn. Damn. Because I was wondering where

15:34

all y'all were at, dude, because we

15:36

were doing some other shit. And

15:38

so then, yeah, I was at MIT.

15:40

And then MIT is. like a

15:42

really intense school because they you know

15:45

the classes they don't fuck around

15:47

with they just really like they load

15:49

you up with tons of work and

15:51

most of the time you have

15:53

you you like they load you up

15:56

with like huge amounts of work

15:58

and they you know you you you

16:00

don't really know what's going on

16:02

initially and so you're just kind of

16:04

like just trying to make it through

16:07

but you know there's this motto

16:09

that that MIT has called IHTFP which

16:11

among the students stands for I

16:13

hate this fucking place oh yeah it's

16:15

heavy huh yeah but then but

16:17

then the school when you go they

16:20

tell you oh it stands for I've

16:22

truly found paradise so oh so

16:24

a couple differences of opinion of opinion

16:27

Yeah. Yeah. Damn. So it's so

16:29

it's really severe there. There's a lot

16:31

of work that loads you down

16:33

out of the gate. You do you

16:35

do a lot of work. But it's

16:38

it's kind of awesome. I mean,

16:40

because because I think the students really

16:42

band together like you like instead

16:44

of like I think instead of it

16:46

being competitive really, MIT is much

16:48

more about like everybody kind of like

16:51

coming together, working homework together and just

16:53

kind of like making through together.

16:55

What's that social life there? Like

16:57

are you dating? Are you going

16:59

to parties? What's that like for

17:01

you at that time? It's a

17:03

bunch of parties because people like

17:05

MIT, there's a lot of people

17:07

who like tinkering with gadgets. So

17:09

like tinkering with like, you know,

17:11

lights and big speakers and DJ

17:13

sets and all this stuff. So

17:15

actually the parties are pretty good

17:17

because they're like the production. value

17:19

is high, the production quality is

17:21

high. Damn! And what about when

17:24

the science kids come through, the

17:26

lab dogs, are they bringing, are

17:28

people making like unique drugs, was

17:30

there like designer drugs that are

17:32

being created by actual people, like

17:34

just, because, you know what I'm

17:36

saying, like my friends in college,

17:38

none of us would know how

17:40

to do that, but there may

17:42

be somebody at a... smart at

17:44

a more prestigious school that would

17:46

know how was that a thing

17:48

even or is that just there's

17:50

one there was one part of

17:52

the campus called East Campus where

17:54

it was like it was more

17:56

fringe and so there was like

17:58

at one point in the school

18:00

year they would in their courtyard

18:02

they would build a gigantic catapult

18:04

like a huge catapult like a

18:07

trebuchet yeah what's the difference? I

18:09

don't know what the... Yeah, let's

18:11

get the difference here because people

18:13

need to know this anyway. People

18:15

have for decades now, people have

18:17

been. A trebuchet has like a

18:19

rope attached to the end of

18:21

it that flings it where a catapult

18:23

just launches it. No, it was

18:25

a catapult then. Okay. Like a

18:27

big, because there's a big, like,

18:29

like ice cream scooper, looking thing

18:31

that would like, that would fling

18:33

it. And whether it's flinging Adderall

18:35

under the rest of the rest

18:37

of the campus. They would fling

18:39

stuff into the rest of the

18:41

Now that I'm thinking about, I

18:43

don't know what the, yeah, no,

18:45

these giant, this giant like catapult

18:47

things. Yeah. And so this was

18:49

like a event that would go

18:52

on and people would kind of

18:54

rave there? What are you saying?

18:56

Yeah, they would do this, they

18:58

would do other things. There were

19:00

like, they were into, they would

19:02

like build a lot of stuff

19:04

over there. And there would be.

19:06

Like people that ended up at

19:08

Burning Man later on. Yes, yes,

19:10

that was the burning, that was

19:12

core Burning Man, like there was

19:14

a, there was a satanic ritual

19:16

floor. Oh yeah. Yeah, like a

19:18

lot of, a lot of, like

19:20

it's fringe. Cool, it's cool. Right.

19:22

But, yeah, so there's all these

19:24

parties. We bragged at MIT that,

19:26

uh. you know, people from all

19:28

the neighboring schools, because Boston is

19:30

a huge college town, like tons

19:32

of tons. Boston's an amazing city.

19:34

Yeah, but no, MIT was fun,

19:37

but I was only there, I

19:39

was at MIT for one year.

19:41

Right, and you dropped out, is

19:43

that safe to say? Yeah. Okay,

19:45

you dropped out, and then you,

19:47

so you got into AI, into

19:49

the AI world, is that kind

19:51

of a safe bridge to say?

19:53

Yeah, yeah, yeah. Okay, and I

19:55

want to ask this, because, because

19:57

I know, because I know Mark

19:59

Zuckerberg, because, you know, forward thinking

20:01

that sort of world. Is there

20:03

something in college that you felt

20:05

like didn't nurture you or did

20:07

you just feel like this isn't

20:09

the place for me? Do you

20:11

feel like college doesn't nurture a

20:13

certain type of thinker or was

20:15

it just a personal choice? I

20:17

think for me it was like,

20:20

I was just feeling really impatient

20:22

and I don't really know why

20:24

really, but I remember like I

20:26

remember I was in school the

20:28

first year, it was really fun

20:30

and I really enjoyed it. But

20:32

then I remember, you know, this,

20:34

in the year when I was

20:36

at MIT was one of the

20:38

first, like, it was like one

20:40

of the early big moments in

20:42

AI, because it was, I don't

20:44

remember this, but there was an

20:46

AI that beat the world champion

20:48

at go. This was in 2015,

20:50

which is when I was in college.

20:52

It's like a big checkerboard with

20:54

like white and black stones. It's

20:56

like, uh, and it was, uh,

20:58

yeah, this, this game, AlphaGo

21:00

versus Lee Sedol. So AlphaGo

21:02

versus Lee Sedol, also known as

21:05

the DeepMind Challenge Match, was

21:07

a five-game Go match between

21:09

top Go player Lee Sedol and

21:11

AlphaGo, a computer Go program

21:13

developed by DeepMind, played in

21:15

Seoul, South Korea, between the 9th

21:17

and 15th of March 2016. Alpha...

21:19

That's confusing how that's written. It

21:21

is very confusing on. You think

21:23

that... We got a... AlphaGo

21:25

won all but the fourth game.

21:27

All games were won by resignation.

21:29

The match has been compared with

21:31

the historic chess match between Deep

21:33

Blue and Garry Kasparov. Huh. The

21:35

winner of the match was stated

21:37

to win $1 million. Since AlphaGo

21:39

won, Google DeepMind stated

21:41

that the prize would be donated

21:43

to charities, including UNICEF and USAID.

21:45

That's just a joke. That's just...

21:47

But Lee received $150,000 for playing.

21:50

So this was a big moment

21:52

because this had never kind of

21:54

happened before? Never happened, yeah. And

21:56

it was a big moment for

21:58

AI. It was like, oh wow,

22:00

this stuff is like, it's really

22:02

happening. And so then this happened

22:04

in March, and I guess, yeah,

22:06

I dropped out, starting my company

22:08

in May. So I guess two

22:10

months after this, I was, yeah,

22:12

that's what the game looks like.

22:14

Oh, I've I've played this game

22:16

before. It's honestly. It's a really

22:18

like I'm not very good at

22:20

the game. It's a little more

22:22

fun than playing. Yeah, so unless

22:24

you're like, you know, in a

22:26

like Renaissance Fair Board games or

22:28

whatever. Yeah, yeah. Okay, so now

22:30

we got you, you're loose, dude,

22:33

you're out of the school, and

22:35

you're in the world, you see,

22:37

did that match, did realizing that

22:39

kind of like, spur you to

22:41

wanna leave school? Or was that

22:43

just something that happened around the

22:45

same time? It kind of did,

22:47

basically, it was, it was, I

22:49

remember feeling like, oh wow, AI,

22:51

AI, it's happening. And this was

22:53

back in 2016, so like eight,

22:55

eight, nine years ago. Okay. And

22:57

then I felt like I had

22:59

to, you know, basically that, that

23:01

inspired me to start my company.

23:03

And I moved, I basically went

23:05

straight from, I remember, I flew

23:07

straight from Boston to San Francisco

23:09

and then started the company basically.

23:11

And that scale AI. Scale AI.

23:13

Okay. And so. Did you, had

23:15

you, had you been following AI,

23:18

like what are your kind of,

23:20

are you just like, knew like

23:22

this is where it's going, like

23:24

you just felt, there was something,

23:26

an instinct that you trusted, or

23:28

like, because that's a big thing

23:30

to do. I was stuck, so

23:32

I took all the AI classes

23:34

at MIT. Okay, so you already

23:36

learned a lot about it. Yep.

23:38

And then there was one class

23:40

where you had to, like, on

23:42

all the classes you had to

23:44

do side projects or final projects

23:46

of some kind. I wanted to

23:48

build a camera inside my refrigerator

23:50

that would tell me when my

23:52

roommates were stealing my food. Wang

23:54

boy, catching him! Wang boy! Wow!

23:56

Wow! And uh... But then, so

23:58

I worked on that and then

24:00

it was, there was one moment

24:03

where I was like, there was

24:05

a moment that clicked where I

24:07

was trying to build this thing

24:09

and then there was one step

24:11

that was like too easy. I

24:13

was like, whoa. that just worked

24:15

right there and then that happened

24:17

and then the the go match

24:19

happened and I was like this

24:21

this stuff is happening and so

24:23

I did you ever market those

24:25

refrigerates you ever actually create that

24:27

I didn't market them no I

24:29

could totally see that bro there's

24:31

a refrigerator every dorm has it

24:33

where there's a camera built in

24:35

and you just get you're in

24:37

you get a notification on your

24:39

phone you know you're like damn

24:41

add-nons got my hummus you know

24:43

but you got video of him

24:46

right there dude that's a great

24:48

idea yeah I love that was

24:50

that was that was college me

24:52

yeah okay so Alexandr Wang he's

24:54

free in the world now he's

24:56

headed to San Francisco he's AI'd

24:58

up he feels the energy he's

25:00

motivated by some of the classes

25:02

he took, and he's

25:04

motivated by

25:06

seeing that AI starting

25:08

to actually overtake humans, right, or

25:10

be able to compete with actual

25:12

human thinking with their chess match.

25:14

Yeah, I would, the way I

25:16

would think about it, or the

25:18

way I thought about the time

25:20

was like, this is, this is

25:22

becoming, you know, people have been

25:24

talking about AI for decades. Like,

25:26

it's kind of been always been

25:28

one of these things that people

25:31

have, have said, oh, it's gonna

25:33

happen, but it never really was

25:35

happening. And it was, you know,

25:37

it was really about artificial intelligence.

25:39

So I've always been like, um...

25:41

No, you have real intelligence, not

25:43

artificial, real intelligence. I don't, I

25:45

mean, I think it's a, it's

25:47

probably a mix, but I see

25:49

what you're saying, you know? I

25:51

do do, you know what I

25:53

thought of the other day, it

25:55

was like, what if they had

25:57

like a Mexican version, it was

25:59

like, hey, I... I don't know,

26:01

that's a good joke, but thank

26:03

you, this is a nice laugh.

26:05

This show is sponsored by

26:07

BetterHelp. What are some of

26:09

your relationship green flags? That's a

26:11

good question. What are things you

26:13

notice in your relationship that are

26:16

like, yes, this is positive, this

26:18

is good, we're headed in the

26:20

right direction? We often hear about

26:22

the red flags we should avoid,

26:24

but what if we focus more

26:26

on looking for green flags in

26:28

friends and partners? If you're not

26:30

sure what they look like, therapy

26:32

can help you identify green flags,

26:34

actively practice them in your relationships,

26:36

and embody the green flag energy

26:38

yourself. I've personally benefited from therapy.

26:40

over the years. I think one

26:42

of the things that's helped me

26:44

with is just noticing some of

26:46

my patterns, noticing when I let

26:48

something small, turn into something big,

26:50

and being able to cut it

26:52

off at the pass so it

26:54

doesn't happen anymore. BetterHelp can

26:56

help you. BetterHelp is fully

26:59

online, making therapy affordable and convenient,

27:01

serving over 5 million people worldwide.

27:03

Discover your relationship's green flags with

27:05

BetterHelp. Visit betterhelp.com/theo to

27:07

get 10% off your first month.

27:09

That's betterhelp.com/theo. Are you

27:11

struggling to keep your website and

27:13

digital marketing up-to-date while running your

27:15

business? Now you can relax and

27:17

binge on unlimited web design services

27:19

with Modiphy. Enjoy unlimited support, a

27:21

quick turnaround and your own designated

27:23

designer. Modiphy has done my website

27:25

for years and they're incredible. Sign

27:27

up with Modiphy today. They make

27:29

life easier and business better. Visit

27:31

modiphy.com/theo. That's M-O-D-I-P-H-Y.com/theo for 50% off.

27:33

The last website you'll ever need.

27:35

That's modiphy.com/theo. What is AI? Yeah,

27:37

so so AI is all about,

27:39

you know, basically programming computers to

27:41

be able to start thinking like

27:44

humans. So, you know, traditional computer

27:46

programming, you know, it's pretty, it's

27:48

pretty bare bones. It's not very

27:50

smart. And so AI is all

27:52

about can you have, can you

27:54

build algorithms that start to be

27:56

able to. think like people and

27:58

and and replicate some of the

28:00

Like our brains are these incredible

28:02

incredible things, you know, and that's

28:04

that's evolution. That's just biology that

28:06

created our brains and so it's

28:08

all about how can we how

28:10

can we build something similar to

28:12

that or replicate it using using

28:14

computers and machines and so the

28:16

whole you know the the whole

28:18

modern AI era really started around

28:20

an area called computer vision, but

28:22

it was like how can we

28:24

first get computers to see like

28:27

humans do. So one of the

28:29

very first AI projects was this

28:31

thing called ImageNet. ImageNet and later

28:33

AlexNet. And it was basically, can

28:35

you get computer programs like given

28:37

a photo to tell you what's

28:39

in that photo? I see. So

28:41

just like a human would, like

28:43

if you showed them this. Yeah.

28:45

And you're starting to train them

28:47

to have a perspective. Yeah, train

28:49

them to, well actually, originally like,

28:51

you could, like, let's say you

28:53

took a photo of this, of

28:55

this bottle, a machine wouldn't even

28:57

be able to tell you what's

28:59

in the photo. It would just

29:01

know what the pixels were, but

29:03

it wouldn't be able to tell

29:05

you like, oh, there's a bottle

29:07

in that, in that photo. So,

29:09

you know, one of the first

29:12

AI breakthroughs was when YouTube built

29:14

an algorithm that could tell when

29:16

there were cats in their videos.
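A minimal sketch, not from the episode, of what "given a photo, tell you what's in it" looks like in practice today, assuming a pretrained ImageNet classifier from torchvision; the file name photo.jpg is just an illustrative stand-in:

```python
# Illustrative sketch only: classify one photo with a pretrained ImageNet model.
# Assumes torch, torchvision, and Pillow are installed; "photo.jpg" is any local image.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.DEFAULT           # weights trained on ImageNet
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()                    # resize, crop, normalize

img = preprocess(Image.open("photo.jpg")).unsqueeze(0)   # add a batch dimension
with torch.no_grad():
    probs = model(img).softmax(dim=1)[0]

values, indices = probs.topk(3)                      # three most likely labels
for p, idx in zip(values.tolist(), indices.tolist()):
    print(f"{weights.meta['categories'][idx]}: {p:.1%}")
```

For the water-bottle example from the conversation, the output would be a handful of labels with confidence scores, which is the same "what's in this photo" task that ImageNet and AlexNet popularized.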

29:18

And that was like. this, you

29:20

know, in like 2012 or 2011,

29:22

this was like this mind-blowing breakthrough

29:24

that you could like, figure out,

29:26

you could like, use an algorithm

29:28

to figure out when there was

29:30

a cat inside a, inside a

29:32

video. And so, AI, it started

29:34

pretty simply with just how do

29:36

we replicate, you know, vision, like

29:38

how do you replicate like our,

29:40

basically the fact that our eyes

29:42

and our brains can process all

29:44

this imagery coming in. And that

29:46

really led to, I think one

29:48

of the first major major, use

29:50

cases of AI, which is self-driving

29:52

cars. So this was, when I

29:54

started the company in 2016. self-driving

29:57

cars were all the rage because

29:59

you know it was you know

30:01

there were all these like skunkworks

30:03

projects when you started scale AI

30:05

you mean yeah when you started

30:07

scale yeah when you when you

30:09

kind of got into AI yep

30:11

at that time self-driving cars are

30:13

the most popular things yeah yeah

30:15

that was all the rage and

30:17

so it was all about can

30:19

you know can you start building

30:21

algorithms that can drive a car

30:23

like a human would and do

30:25

it safely and do it you

30:27

know more efficiently and that way

30:29

major areas. And then now, you know,

30:31

the whole, the whole, the whole industry has

30:34

moved so fast, but then all of a

30:36

sudden we got chat GPT and we got,

30:38

you know, more advanced stuff more recently that,

30:40

that is able to talk like a human

30:43

or sort of think like a human. And

30:45

so it's really come pretty far recently, but

30:47

all of it is about how do you

30:49

build algorithms, how do you

30:51

use machines to be able to think

30:54

like a person. Okay. And is it

30:56

a pro, like say if I opened

30:58

a door, like, like, oh, we keep

31:00

the AI in there, is it a

31:02

computer, is it a program, is it

31:05

a hard drive? Like, what, like, what

31:07

is that? Yeah, yeah. So

31:09

there's two parts, there's two

31:11

parts to it. So the

31:13

first part is you need

31:15

really advanced chips. So you need

31:17

like, Like these, they're called GPUs or

31:20

sometimes called TPUs or, you know, there's

31:22

a lot of different words for it,

31:24

but you need like the most advanced

31:27

computer chips in the world. Okay. And

31:29

they, how big is each one, do

31:31

you know? Uh, like, or can you

31:34

measure it like that? They, I mean,

31:36

the biggest ones are actually like, like,

31:38

the whole chips are like, you know,

31:40

they're like a wafer kind of thing,

31:43

but then you, you, you put a

31:45

lot of them all of them all together.

31:47

these like these chips and the

31:50

biggest ones are are yeah exactly

31:52

that's a little one that's a

31:54

little one there's really big ones

31:56

okay but these are the so

31:59

this is the These are the brain

32:01

cells of it. Yep. These are the

32:03

brains behind it. Yeah. So then, yeah,

32:05

exactly. These are some, those are some

32:08

big ones. So then. So you have

32:10

to have a lot of these chips.

32:12

So you need a ton of these

32:15

chips. And that's, that's kind of the,

32:17

like, that's the physical presence. And then,

32:19

and then by the way, they take

32:21

a huge amount of energy. They're really,

32:24

because they have, you have to do

32:26

a lot of calculations. basically data centers

32:28

buildings full of like tons and tons

32:31

of those chips just in giant rows

32:33

like how big are we talking warehouses?

32:35

Yeah the biggest I mean like like

32:38

Elon's data center Colossus is like I

32:40

mean it's probably more than a million

32:42

it's definitely more than a million square

32:44

feet I mean it's like just huge

32:47

really yeah look up Colossus I've never

32:49

known this yeah yeah that yeah that

32:51

you see that building with the sunset

32:54

second row yeah there you know there

32:56

you know Or the one to the

32:58

left. Oh no. Yeah, that one. Yeah,

33:00

look, it's like a huge ass building.

33:03

What? It's huge, and all that's just

33:05

filled with chips. Have you ever been

33:07

in there? I haven't been in that

33:10

one, but I've been in some of

33:12

these, and it's just, yeah. And this

33:14

is what it looks like inside? Yeah,

33:17

basically. Yeah. So it's just rows and

33:19

rows of chips. No plants or anything.

33:21

No plants. It gets hot in there.

33:23

I bet. So the first part is

33:26

just that the that's the physical presence

33:28

and then the second part are the

33:30

algorithms so then you have like on

33:33

top of those chips you have you

33:35

have software that's running and the the

33:37

algorithms are just all are like what

33:39

you actually are telling what's the math

33:42

that you're telling to happen on the

33:44

chips and those algorithms are you know

33:46

some of the you know most complicated

33:49

algorithms that humans have ever come up

33:51

with and that's kind of the that's

33:53

kind of the software part or that's

33:56

kind of the that's the part that

33:58

like you know exists on the internet

34:00

or you can download or whatnot, and

34:02

then it has to be run on

34:05

these like huge warehouses of giant of

34:07

giant chips. Okay. So when someone goes

34:09

to like scale AI or chat GPT,

34:12

these are all AI interfaces or what

34:14

are they? Yeah, so so yeah, like

34:16

chat GPT is a is a way

34:18

to be able to talk to basically

34:21

you can talk to the algorithm. So

34:23

you can start interacting directly with the

34:25

algorithm, you can see how the algorithm

34:28

is thinking. So you could say to

34:30

this, can you describe the weather today?

34:32

Exactly. Yeah. And if you said that

34:35

to five different AI companies, or AI

34:37

companies, basically, or AI algorithms, different AI

34:39

systems, yeah. So if you said that's

34:41

five different AI systems, you might get

34:44

a little bit of a varied answer.

34:46

a little bit yeah okay you'll get

34:48

because they all are trying to have

34:51

their own style and have their own

34:53

vibe to it interesting and then what

34:55

we do what what scale AI is

34:58

all about is we've kind of built

35:00

the almost like the Uber of AI

35:02

okay so a lot of what we're

35:04

trying to do is how do we

35:07

so how do we help produce data

35:09

that is improving these algorithms and just

35:11

like how Uber there's okay you're losing

35:14

me there a little Yeah, yeah. It's

35:16

okay. But if I slow down, if

35:18

you're losing me, explain that to me

35:20

a little bit clear for me. Yeah.

35:23

So, so, okay, so with these algorithms,

35:25

one key ingredient for these algorithms is

35:27

data. So, okay, so you have the

35:30

chips and everything that are storing all

35:32

the information. Yep. They're storing the data.

35:34

And then you have the algorithms that

35:37

are helping mediate between the user and

35:39

the data. Yeah, so basically you kind

35:41

of have, yeah, you have three, you

35:43

have three key pieces. Okay. So you

35:46

have the, you have the, you have

35:48

the, you have the data, which is

35:50

like, just tons and tons of data

35:53

that, that's where the algorithms are learning

35:55

the patterns from. Okay. So these algorithms,

35:57

they aren't just like, they don't just

35:59

learn to talk randomly. They learn it

36:02

from learning to talk from how humans

36:04

talk, right? So you need tons and

36:06

tons of data. And then you have

36:09

the algorithms which learn from all that

36:11

data, and then they run on top

36:13

of the chips. Got it.
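A minimal sketch, not Scale AI's code, of how those three pieces fit together in a toy training loop: the data the model learns patterns from, the algorithm doing the learning, and the chip (a GPU, if one is available) that the math runs on. Every value here is illustrative:

```python
# Illustrative sketch only: data + algorithm + chips in one tiny training loop.
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"   # the "chips" (GPU or CPU)

# The "data": toy examples the algorithm learns patterns from.
xs = torch.randn(1024, 16)
ys = (xs.sum(dim=1, keepdim=True) > 0).float()

# The "algorithm": a small neural network plus the rule for updating it.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):                     # "learning" = repeatedly adjusting weights
    optimizer.zero_grad()
    loss = loss_fn(model(xs.to(device)), ys.to(device))
    loss.backward()
    optimizer.step()

print(f"final loss: {loss.item():.3f}")     # lower loss = patterns learned from the data
```

The real systems differ mainly in scale: vast amounts of data instead of a toy array, and warehouses of GPUs instead of one device string.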

36:16

one of the big challenges in the

36:18

industry is, okay, how are you going

36:20

to produce all this data? And so

36:22

this is how are you going to

36:25

get data for your? Yeah, system, like

36:27

how do you farm the best data?

36:29

How do you, exactly, how do you

36:32

build, how do you build all that

36:34

data? And how do you do that

36:36

in the most effective way? How do

36:38

you build new data? So clean data,

36:41

because what if you get a bunch

36:43

of data in there, there's just a

36:45

bunch of advertisements in bullshit, will that

36:48

affect the output? Yeah, that definitely affects

36:50

the output. So data, so data is,

36:52

you know, some people say like, like,

36:55

data is the new oil, it's how

36:57

the algorithms are learning everything they're learning.

36:59

Like anything that the algorithms know or

37:01

learn or say or do, all that

37:04

has to come from the data that

37:06

goes into it. Okay, so if I

37:08

ask the, if I ask a system,

37:11

an AI system, a question, or ask

37:13

it to help me with something, help

37:15

me to design something or to curate

37:17

an idea, it's gonna use the data

37:20

that it has within it to. respond

37:22

to me and help me and help

37:24

give me an answer that I can

37:27

use. And it's only, and the data

37:29

it has in it is only based

37:31

upon the data that is put into

37:34

it. Exactly. Yeah, yeah. Yeah. So then,

37:36

so yeah, it's kind of, so then

37:38

we don't, you know, we don't spend

37:40

enough time talking about where it is,

37:43

you know, how are you going to

37:45

get this data? How are you going

37:47

to keep making new data? So the

37:50

angle that we took at scale was

37:52

to kind of, turn this into an

37:54

opportunity for people. So we're, you know,

37:56

we're kind of like the Uber for

37:59

AI. So just like how Uber you

38:01

have, you know, riders and drivers, for

38:03

us, we have, you know, we have

38:06

the AI. systems, you know, the algorithms

38:08

that need data, and then we have

38:10

a community of people, a network of

38:13

people who help produce the data that

38:15

go into the system. They get paid

38:17

to do that. Oh, so they're almost

38:19

data farming, like creating good data? Creating

38:22

good data, exactly. And it's huge. It's

38:24

like, so we do this through our

38:26

platform called Outlier. people, contributors, we call

38:29

them, contributors on Outlier, earned about 500

38:31

million dollars total across everybody in the

38:33

U.S. that's across 9,000 different towns. And

38:36

so it created a lot of jobs,

38:38

a lot of jobs, okay. And so

38:40

what would, okay, so scale was your

38:42

company, it's an AI system. Yep. Is

38:45

that right? So we, yeah, I mean,

38:47

yeah, Scale AI is an AI system. Yep.

38:49

And then Outlier. is a separate company

38:52

that works with it. Yep. And that

38:54

is where you are hiring people. We,

38:56

yeah, we, we basically. To pull in

38:58

data. Yeah, we, we build this platform

39:01

that anybody, you know, a lot of

39:03

people, frankly, all around the world, but

39:05

Americans too, can log on and, and

39:08

help build data that goes into the

39:10

algorithms and get paid to do so.

39:12

So how does a user do that?

39:15

Like what is an example of somebody

39:17

who's helping build data for an AI

39:19

database? Yeah, let's say you're a nurse.

39:21

Like you're a nurse with like tons

39:24

of tons of experience. So you know

39:26

a lot about how to take care

39:28

of people and take care of people

39:31

who are sick or have issues and

39:33

whatnot. And so you could log on

39:35

to the system and our platform and

39:37

you could see that the algorithm. is,

39:40

you know, let's say you ask the

39:42

algorithm like, hey, I have a, you

39:44

know, I have a pain in my,

39:47

in my stomach, what should I do?

39:49

And you notice that the algorithms. says

39:51

the wrong thing. Like the algorithm says,

39:54

oh, just, you know, hang out and,

39:56

you know, it'll go away. And you

39:58

know, as a nurse, like, that's wrong.

40:00

Like, I, you know, you have to,

40:03

you have to go to the emergency

40:05

room because you might have a appendicitis

40:07

or you might have, you know, you

40:10

might have something really bad. And so

40:12

you would, as a nurse, you would

40:14

go in and you'd basically, this continual

40:16

process of and there's versions of that

40:19

for whatever your expertise is or whatever

40:21

you know you know more about then

40:23

anything everything everything exactly so and so

40:26

people get paid for that yeah they

40:28

get paid and how do you know

40:30

if their information is valuable or not

40:33

well we so we don't want spam

40:35

obviously so we we we have a

40:37

lot of systems to make sure that

40:39

people aren't spamming and that like you're

40:42

saying it's not it's not you know

40:44

it's not garbage in that's going into

40:46

the into the algorithms so we have

40:49

you know we have kind of like

40:51

people check the work of other people

40:53

to make sure that the AI systems

40:55

are really good and we have some

40:58

like automated systems that that check this

41:00

stuff but uh but for the most

41:02

part it's it's like it's really broad

41:05

like we want experts in anything everything

41:07

shellfish train tracks whatever yeah everything. Childhood,

41:09

death or whatever. Yeah, totally. Stars, galaxies,

41:12

whatever. Yeah, animals. Yeah. Yeah. So, wow.
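A minimal sketch, not Outlier's actual schema, of what one expert correction like the nurse example might look like as a piece of training data, plus the kind of simple cross-check described above where other contributors review the work. Every field name here is a made-up illustration:

```python
# Illustrative sketch only: one "expert corrects the model" record and a naive
# reviewer-agreement check. Field names are hypothetical, not Outlier's real schema.
from dataclasses import dataclass

@dataclass
class CorrectionRecord:
    prompt: str                  # what the user asked the model
    model_answer: str            # what the model originally said
    expert_answer: str           # what the expert says it should have said
    expert_domain: str           # e.g. "nursing"
    reviewer_votes: list[bool]   # other contributors: is this correction good?

record = CorrectionRecord(
    prompt="I have a pain in my stomach, what should I do?",
    model_answer="Just hang out, it'll go away.",
    expert_answer="Get checked promptly; sudden severe pain can be appendicitis.",
    expert_domain="nursing",
    reviewer_votes=[True, True, False, True],
)

def accepted(rec: CorrectionRecord, threshold: float = 0.75) -> bool:
    """Keep the record only if enough reviewers agree the correction is good."""
    return sum(rec.reviewer_votes) / len(rec.reviewer_votes) >= threshold

print(accepted(record))   # True -> this record would flow into the training data
```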

41:14

So it's kind of like your data

41:16

is almost like an ocean or a

41:18

body of water and you, different places

41:21

are going to be able to keep

41:23

their body of water cleaner or dirtier

41:25

and different infections could get in, different

41:28

spyware, all types of stuff. So, and

41:30

if you have really a clean body

41:32

of water, then you're going to be

41:34

able to offer a clean data or

41:37

a certain type of data to people

41:39

who are using your AI platform. Exactly.

41:41

Does that make sense or not? Yeah,

41:44

yeah, totally. And our job, like, how

41:46

do we make sure that this body

41:48

of water is as clean as possible

41:51

and... we fill it up as much

41:53

as possible that it has as much

41:55

information about everything across the globe. Wow,

41:57

so is there almost a race for

42:00

information right now in a weird way

42:02

or no? Is that not it? There

42:04

a little bit. Yeah, I think that

42:07

there's a well, there's a race for.

42:09

Like how are different AI systems competing

42:11

against each other? And sorry to interrupt

42:13

you. Yeah, no, no. So there's, it

42:16

goes back to the three things I

42:18

mentioned. So there's kind of like three

42:20

dimensions that they're all competing as one

42:23

other. chips. So who can, who is

42:25

the most advanced chips, who is the

42:27

biggest buildings of chips, like who is

42:30

the most chips that they're utilizing, data,

42:32

so the kind of body of water,

42:34

whose body of water is better, cleaner,

42:36

you know, healthiest, biggest, etc. And then

42:39

the last is algorithms. So who's, and

42:41

this is where the scientists really come

42:43

in. It's like, okay, who's coming up

42:46

with the cleverest algorithms or who has

42:48

like a trick on an algorithm that

42:50

somebody else doesn't have, like who's doing

42:53

that to basically make the AI learn

42:55

better off of the data that it

42:57

has. Wow, God, I'm in the future

42:59

right now. That's so wild. Man, it's

43:02

just, it's so crazy. And I think

43:04

AI scares people because the future scares

43:06

people, right? It's like, that's one of

43:09

the scariest things sometimes is the future.

43:11

So I think a lot of times

43:13

you associate, because a lot of some

43:15

people mention AI, there's a little bit

43:18

of fear it seems like from people,

43:20

it seems like, from people. Yeah. There's

43:22

fear that it's gonna take jobs, there's

43:25

fear that it's gonna take over our

43:27

ability to think for ourselves, right? And

43:29

not a lack of knowledge because you

43:32

didn't want to know just because you

43:34

don't know. Or that you're dumb, but

43:36

just because you don't know. What are

43:38

positive things that we're going to see

43:41

with AI, right? I want to start

43:43

there. Yeah. So I think first, like,

43:45

we don't, the AI industry, we don't

43:48

do the best job explaining this. And

43:50

I think some. Sometimes we make it

43:52

seem all sci-fi and genuinely we're part

43:54

of the problem in making it seem

43:57

scary, right? But one thing, for example,

43:59

is like, I think AI is actually

44:01

gonna create a ton of jobs and

44:04

that story is not told enough, but

44:06

you know. these jobs that we're producing

44:08

are this this sort of this opportunity

44:11

that we're providing on our platform outlier

44:13

like that's only gonna grow as AI

44:15

grows so because you have to have

44:17

new data you have to have new

44:20

data and the only place you can

44:22

get new data is from people will

44:24

at a certain point would the system

44:27

be able to create it can probably

44:29

matriculate data or matriculate is that with

44:31

birthing or what is that yeah it

44:33

just moves through like Okay, it can

44:36

probably like quantify or in and give

44:38

you answers but it can AI create

44:40

new data? No, so I think well

44:43

it can do a little bit of

44:45

that so it can AI can help

44:47

itself create its own data and and

44:50

help itself a little bit but ultimately

44:52

most of the progress is going to

44:54

come from you know people were able

44:56

to help. really the model get better

44:59

and smarter and more capable at all

45:01

these all these different areas. Yeah I

45:03

didn't see that part of it I

45:06

didn't understand that we are the ones

45:08

who are giving it information and since

45:10

we're going to continue to learn I

45:12

would assume that we would be able

45:15

to help it to help it learn.

45:17

Yeah and the world's going to keep

45:19

changing and we're going to need to

45:22

be able to keep teaching the algorithms

45:24

keep teaching the world's about how the

45:26

world's changing. So you know this is

45:29

actually a a big thing that I

45:31

think most people don't understand. The people

45:33

who are getting the opportunity now are

45:35

earning money from it, see it. But

45:38

as AI grows, there's actually going to

45:40

be tons of jobs created along the

45:42

way and tons of opportunity for people

45:45

to help improve AI systems or control

45:47

AI systems or overall sort of be

45:49

a part of the technology, not just

45:51

sort of disenfranchised by it. Okay, so

45:54

like yeah, so what were what are

45:56

you, do you feel like are other

45:58

ways like if you had to look

46:01

into the future a little bit, right?

46:03

So you have the fact that people

46:05

are going to be able to add

46:08

more data, right? And add nuances to

46:10

data, right? And probably humanize data a

46:12

little bit. Yeah, totally. And then you're

46:14

also going to have what I think

46:17

you're going to have a lot of,

46:19

a lot of jobs around, you know,

46:21

doing all these little things throughout the

46:24

world, who's gonna keep watch of those

46:26

AI? And who's gonna make sure that

46:28

those AI aren't doing something that we

46:30

don't want them to do? So almost

46:33

like managing the AIs and keeping watch

46:35

over all the AI systems, that's gonna

46:37

be another thing that we're gonna have

46:40

to do. And then. And then it's

46:42

just somebody to kind of guide the

46:44

river a little bit. Yeah, at certain

46:47

point, guide the stream, stay in there,

46:49

watch, make sure that answers are correct,

46:51

make sure the information is honest. Yeah,

46:53

yeah, like, like I think, for example,

46:56

you know, we're not gonna just have

46:58

AIs going around and, you know, you

47:00

know, buying stuff and doing crazy things

47:03

and like, you know, we're gonna, we're

47:05

gonna keep it controlled, right? Like as

47:07

a society, I think we're gonna keep

47:10

it controlled as a technology. And I

47:12

think there's gonna be a lot of

47:14

jobs for people to make sure that

47:16

the AI doesn't go out and do

47:19

crazy things that we don't want it

47:21

to do. Right so we want to

47:23

be so you're gonna need managers you're

47:26

gonna need facilitators. Yeah exactly what are

47:28

things that AI will alleviate like what

47:30

are things that like will it eventually

47:32

be able to have enough information like

47:35

our data where where it can like

47:37

cure diseases and stuff like that like

47:39

is that a realistic thing? Yeah that's

47:42

super real like cancer even yeah cancer

47:44

yeah heart disease like all these all

47:46

these diseases Yeah. Hands are on his

47:49

heels. But, but. around us here I

47:51

think there's still some smoke in the

47:53

air no but seriously I think that

47:55

AI one thing that we've seen which

47:58

is this is kind of wild but

48:00

AI understands like molecules and biology better

48:02

than humans do actually because it's like

48:05

like there's this there's this thing in

48:07

AI where you know it used to

48:09

take a like a PhD biologist like

48:11

you know five years to do something

48:14

that the AI can can just do

48:16

in you know a few minutes right

48:18

and that's because the like just the

48:21

way that molecules and biology and all

48:23

that works is something that that AI

48:25

happens to be really good at and

48:28

that's gonna help us ultimately cure diseases

48:30

find you know pharmaceuticals or other treatments

48:32

for these diseases and ultimately help humans

48:34

live longer. Because that's very data driven,

48:37

right? Like it's very specific, it's very

48:39

mathematic. Exactly. Yeah, yeah. So it's going

48:41

to be a huge tool for us

48:44

to cure disease, for us to help

48:46

educate people, for us to, you know,

48:48

there's a lot of really exciting uses

48:50

for AI. But I think the kind

48:53

of, I think the thing that, um,

48:55

will touch most people in their lives

48:57

is it's really gonna be a like

49:00

a tool that will help you You

49:02

know make all of your sort of

49:04

Make all your dreams kind of become

49:07

reality if that makes sense. So so

49:09

I think one of the things that

49:11

AI is gonna be really awesome for

49:13

is like you know today if I

49:16

I have like a million ideas, right?

49:18

I have like you know thousands and

49:20

thousands of ideas and I only have

49:23

so much time so I can only

49:25

really do you know a few of

49:27

them at a time. And most of

49:29

the ideas just go to die, right?

49:32

Yeah, they do, huh? Yeah, it's a

49:34

bummer. It's a huge bummer. Yeah. And

49:36

I think a lot of people, you

49:39

know, for whatever reason, they may have

49:41

some of the best ideas ever, but

49:43

they just, you know, they're too busy

49:46

or they have like other shit going

49:48

on in their lives, they can't make

49:50

those ideas happen. And it's great with

49:52

sometimes when people are able to make

49:55

the leaps and make them happen and

49:57

devote themselves to their dreams, but that

49:59

doesn't happen enough today. And one of

50:02

the things that AI is going to

50:04

help us do, is it's going to

50:06

help us turn these ideas into reality

50:08

much more easily. So, you know, you

50:11

can, like, you know, you're making a

50:13

movie. Let's say you have another movie

50:15

idea. ultimately I think you'll be able

50:18

to tell an AI hey I have

50:20

this idea for a movie what could

50:22

that look like you know maybe draft

50:25

up a script also who are the

50:27

people who can help you know fund

50:29

this idea like who those people be

50:31

can help reach out to them and

50:34

then like you know who should we

50:36

cast in it basically help make the

50:38

whole thing a you know instead of

50:41

those like daunting thing that these big

50:43

projects are usually so daunting you really

50:45

don't know where to get started you

50:47

kind of need a person to help

50:50

you get through them instead of that

50:52

AI will help you get through it

50:54

and like help do a lot of

50:57

the sort of less glamorous work

50:59

to make them a reality. Wow. So

51:01

I could say for example like like

51:04

AI I would like to shoot maybe

51:06

I'm thinking about creating an idea or

51:08

shooting a film in this area or

51:10

it's like this it's gonna take place

51:13

in this type of a place I

51:15

could give it like an outline of

51:17

the characters like what they look like

51:20

their ages and some description could you

51:22

help give me? Possible potential actors or

51:24

something within a certain price range that

51:27

I can maybe cast for that Yeah,

51:29

could you help give me like locations

51:31

around the country that would fit that

51:33

backdrop? Yep. Could you? List me all

51:36

the talent agencies that I could reach

51:38

out to and you can kind of

51:40

just put those things in and then

51:43

you would have sort of

51:45

a bit of a guidebook at that

51:47

point that would make your yeah, what

51:49

before is something that felt extremely daunting

51:52

Yeah, shit in two minutes, you know,

51:54

and then you put it in the

51:56

AI gives you information back in a

51:59

few minutes and maybe you're... And I

52:01

think I think over time it'll also

52:03

be able to start doing a lot

52:06

of the legwork for you. So it'll

52:08

be able to reach out to people

52:10

for you. It'll be able to you

52:12

know figure out the logistics. It'll be

52:15

able to to book things for you.

52:17

Like it'll be able to basically help

52:19

do all the legwork to make it

52:21

make you know whatever it is into

52:23

a reality. So very much an assistant

52:25

in the AI world, as agents, but

52:27

you know it's just... It'll be something

52:30

that will help you, you know, humans

52:32

are going to be in control.
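
A minimal sketch of the kind of assistant being described here, where the person approves every step before anything happens. This is purely illustrative: the planner, the steps, and every name below are invented for the example and are not Scale's product or any real API.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Step:
    description: str                 # what the assistant proposes to do
    action: Callable[[], str]        # stand-in for a real tool call (email, booking, etc.)

def plan_steps(idea: str) -> List[Step]:
    # Stand-in planner: a real assistant would ask a language model to draft these.
    return [
        Step("Shortlist beach towns that fit the group and dates",
             lambda: "shortlist drafted (placeholder)"),
        Step("Draft an outreach message for the group chat",
             lambda: "message drafted (placeholder)"),
        Step("Note refundable lodging options to hold",
             lambda: "lodging options noted (placeholder)"),
    ]

def run_with_approval(idea: str) -> None:
    # The human stays in control: nothing runs without an explicit yes.
    print(f"Idea: {idea}")
    for step in plan_steps(idea):
        answer = input(f"Proposed: {step.description}. Approve? [y/n] ")
        if answer.strip().lower() == "y":
            print("  ->", step.action())
        else:
            print("  -> skipped")

if __name__ == "__main__":
    run_with_approval("Fantasy football draft trip for 10 people near a beach")

The approval prompt is the same point being made in the conversation: the AI drafts and does the legwork, the person decides.
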

52:34

Humans are ultimately going to be

52:36

telling the AIs what they want it to do.

52:38

And then, you know, hey, what do we

52:40

want these AIs to do? It's ultimately going

52:42

to be to like help us execute

52:44

on and accomplish all these ideas that

52:46

we have. So a genius could be three

52:49

or four X. If somebody's like a

52:51

genius in something in some realm or

52:53

space of thought, you could three or

52:55

four X them because they like multiply

52:57

their output from their own brain, because

52:59

they could have something really helping them

53:01

get done a lot of the like

53:03

the early work on things and maybe

53:05

some of the most severe work. Yeah totally.

53:07

Like I think one thing that I always

53:09

feel like kind of sucks is that if you

53:11

have a director you really like they're only

53:14

going to make a movie once every couple

53:16

of years. So even if you have a

53:18

director that you like think is amazing. You

53:20

know they just it's hard for them

53:22

to make that many movies like because

53:25

it just takes so much time and

53:27

effort and you know there's somebody bottlenecks

53:29

and stuff so in a future with you

53:31

know more advanced AI systems they could they

53:33

could just churn them out. and they can

53:36

make so many of their ideas into reality.

53:38

And I think that's true not only in

53:40

creative areas, but it's kind of true across

53:42

the board. Like, you know, you can start

53:44

new businesses more easily. You can, you know,

53:46

you can make various creative projects that you

53:49

have happened more easily. You can make like,

53:51

you can, you can finally plan that event

53:53

that you and your friends have been talking

53:55

about for, you know, years and years. So,

53:57

it can just like, really help you, you

53:59

know. we think about was like giving humans

54:02

more agency giving humans more sort of

54:04

sovereignty and just and just enabling humans

54:06

to get way more done. Yeah that's

54:08

a great way I like some of

54:10

this thought because yeah I could be

54:13

like my fantasy football group and I

54:15

we do a we do a draft

54:17

every year in a different location you

54:19

know and shout out PA that's our

54:22

fantasy league Jayrod everybody that's in it

54:24

for the past 17 years we've flown

54:26

to a city each year and done

54:28

a draft in person right. But I

54:30

could say hey We have 10 guys

54:33

we want to go to a nice

54:35

place. We want there to be a

54:37

beach This is the age group of

54:39

our group You know these would kind

54:42

of be the nights and you know

54:44

just to really give me like a

54:46

Just a nice plan of hey, here's

54:48

ten possibilities right something like that and

54:50

then even more like with a movie

54:53

This is what I were you'd run

54:55

into say if I'd be like okay.

54:57

I have two main characters and this

54:59

is kind of what I would like

55:01

to happen. Could you help me with

55:04

a first act? of a three-act movie.

55:06

How do you know everybody just doesn't

55:08

get the same movie then? Like that's

55:10

what I would start to worry that

55:13

everything that you have at Creative is

55:15

all gonna be very similar. Yeah, so

55:17

then I think this is where it

55:19

comes into like, this is where human

55:21

creativity is gonna matter, because it's gonna

55:24

be about, then it's about like, okay,

55:26

what is the, how am I, you

55:28

know, How am I directing the AI

55:30

system? Like what are my, what are

55:33

the tricks I have to make the

55:35

AI give me something that's different and

55:37

unique? And that's not different at all

55:39

from, you know, how creative stuff works

55:41

today, like even on social media or

55:44

anywhere, you know this. Like, yeah, you

55:46

still had your own spice to it.

55:48

You need, yeah, you need an angle,

55:50

you always need to like have something

55:52

that's gonna make it like different and

55:55

interesting and new. So that's not gonna

55:57

change. Like, humans are always gonna have

55:59

to figure out,

56:01

based on where culture is

56:04

at, based on where, you know, what

56:06

the dialogue is, what the, what the

56:08

discourse is, all that kind of stuff.

56:10

What is my fresh take? That's where,

56:12

that's really one of the key things

56:15

that humans are going to have to

56:17

keep doing no matter what. Right. And

56:19

so. Some of the, like a lot

56:21

of films and books, a lot of

56:24

it is, there's just like a problem,

56:26

there's maybe information you learn, there's

56:28

a red herring, and then there's a

56:30

solution. Like, that's a lot of stories,

56:32

right? So, if something just gave you

56:35

the basis and then you go through

56:37

and make everything your own, because a

56:39

lot of things, we don't, there's only

56:41

so many templates for things. So say

56:43

for example, say you might need to

56:46

hire people at a company then that

56:48

would help direct your AI, like somebody

56:50

who's good at managing AI, and giving

56:52

it the best prompts or the best

56:55

way to ask it questions, to get

56:57

the perfect feedback for your company. So

56:59

those would be new jobs, those would

57:01

be actual new people you would need.
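
As a rough, hypothetical illustration of what "giving it the best prompts" could mean in practice, here is a small Python sketch that builds a creative brief around the human's own angle. The field names and wording are invented for this example, not any particular tool's format.

def build_brief(premise: str, fresh_take: str, constraints: list) -> str:
    # Combine a generic premise with the human's angle and hard constraints,
    # so the same tool still produces different results for different people.
    lines = [
        f"Premise: {premise}",
        f"Angle that must come through: {fresh_take}",
        "Hard constraints:",
    ]
    lines += [f"- {c}" for c in constraints]
    lines.append("Draft a first-act outline (3-5 beats) that leans into the angle above.")
    return "\n".join(lines)

if __name__ == "__main__":
    print(build_brief(
        premise="Two strangers swap lives for a week",
        fresh_take="Told entirely through the voicemails they leave each other",
        constraints=["One small coastal town", "No narrator", "Under 90 minutes"],
    ))

The "fresh take" slot is the part the conversation says stays human; the rest is just plumbing.
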

57:03

Yeah, so tons of new jobs. Well,

57:06

first I think just like helping to.

57:08

you know, kind of what we're talking

57:10

about before. These jobs around helping to

57:12

improve the data and contribute to the

57:15

AI systems, that's just going to keep

57:17

growing for a long, long time. And

57:19

then as AI gets better and it

57:21

gets used in more areas, then there

57:23

are going to be a lot of

57:26

jobs that pop up just exactly as

57:28

you're saying, which is how do you,

57:30

what's the best way to use this

57:32

as a tool, what's the best way

57:34

to leverage it to, you know, actually

57:37

make some of these applications or you

57:39

know, whatever you want to build a

57:41

reality. And then there's going to be

57:43

folks who need to, once those are

57:46

built, like, how do you keep improving

57:48

it? How do you keep making it

57:50

better? How do you keep making it

57:52

better? How do you keep, how do

57:54

you keep it fresh? How do you

57:57

keep it, keep it going? And then

57:59

how do you also make sure it

58:01

doesn't do anything bad, right? Like how

58:03

do you make sure that the AI

58:06

doesn't accidentally spam a million people, a

58:08

million people or whatever it's. like operating

58:10

in a good way. Fahim Anwar.

58:12

I was watching him bring up a

58:14

picture of Fahim. This is one of

58:17

the funniest. This guy is the

58:19

most creative comedian in America. Undeniable. He

58:21

is so funny. He had, and everybody

58:23

would say, he's, he's one of the

58:26

few comedians that everybody goes in there

58:28

to watch him perform. He had a

58:30

bit the other night, he talks about

58:32

he got into a Waymo, right? Yeah.

58:34

A car, and I see so many

58:37

Waymo's now, which are cars that are

58:39

just, nobody's in them, you know, but

58:41

they're going, right? He had this bit,

58:43

he's like he got in a Waymo and

58:45

it started complaining about its life to

58:48

him. He's like, I've had a shitty

58:50

week, like the car's just talking to

58:52

him and he's like, now I, no

58:54

matter what, I still have to talk

58:57

to my shitty driver. If you get

58:59

a chance to see him though, that

59:01

guy is, he's fascinating. But like, so

59:03

what companies right now should kind of

59:05

look to hire an AI guy? Because

59:08

we've been thinking about it like, like,

59:10

We had Katt Williams on last

59:12

week and we're like, hey, can you

59:14

create visuals of Suge Knight and Katt Williams

59:17

riding bicycles down Sunset Boulevard, right? And

59:19

this is the one they sent back

59:21

a little while later. There you go.

59:23

Christopher Follino is the guy's name, F-O-I.

59:25

And this is just, this is right

59:28

where I say, by the comedy store

59:30

on Sunset Boulevard. There you go. I

59:32

mean, this looks like it's out of

59:34

a movie kind of. Yeah, totally. I

59:36

mean, and the guy did that in

59:39

a little, in just a little bit

59:41

of time. What types of people would

59:43

you get to, I mean, this is,

59:45

that's you. Yeah. If I was healthier

59:48

if I had healthier gums too. But

59:50

what about this? What kind of, like

59:52

what companies right now, what job spaces,

59:54

because we want to get an AI

59:56

guy, right? But I don't really, my

59:59

brain is like, well, what do they

1:00:01

do? How do I, how would, I

1:00:03

keep them, I keep them busy? you

1:00:05

know I could get them to make

1:00:08

some animations and ideas but what type

1:00:10

of people need an AI person right

1:00:12

now do you feel like? I kind

1:00:14

of think about AI it's kind of

1:00:16

like the internet where you know eventually

1:00:19

everybody's going to need to figure out

1:00:21

how to utilize it, how

1:00:23

to best use it

1:00:25

for their industry or whatever they do

1:00:27

how to make them more efficient

1:00:30

like it's something that I think everybody

1:00:32

is going to need

1:00:34

to adopt at some point so you

1:00:36

know might as well start earlier because

1:00:39

eventually just like how you know every

1:00:41

Basically, every company has to figure out

1:00:43

how to use the internet well and

1:00:45

how to be smart about, you know,

1:00:47

the internet and digital stuff. Every company

1:00:50

is going to have to be smart

1:00:52

about AI, how to use AI, how

1:00:54

to make, how to have a unique

1:00:56

twist on it so that, you know,

1:00:59

their stuff stands out relative to other

1:01:01

people's. That's going to be really important.

1:01:03

But so we see, I mean, in

1:01:05

our work, you know, we work with

1:01:07

all these. all these big companies in

1:01:10

America and we see it everywhere from

1:01:12

you know we worked at Time magazine

1:01:14

on some stuff and then we worked

1:01:16

with Toyota on some stuff for their

1:01:19

cars and we worked with you know

1:01:21

large pharmaceutical companies for the biology stuff

1:01:23

we were talking about large hospitals for

1:01:25

you know helping to treat patients like

1:01:27

really it's across the board and I

1:01:30

think that goes for you know, obviously

1:01:32

these like really big businesses, but also

1:01:34

for smaller businesses, you know, there's always

1:01:36

interesting ways to utilize it to

1:01:38

provide a better product or a better

1:01:41

experience or better content for, you know,

1:01:43

whoever you need to do that for.

1:01:45

Yeah, because I guess right now we're

1:01:47

like there's certain moments like hey, let's

1:01:50

animate this moment or see what AI

1:01:52

would look like it adds some visual

1:01:54

effects to some of our episodes So

1:01:56

that's something we like to do to

1:01:58

just be fun and creative I would

1:02:01

like to maybe create like some sort

1:02:03

of an animated character We already have

1:02:05

a great animator and we want to

1:02:07

keep that but to have an AI

1:02:10

space where it's like, you know, because

1:02:12

they have something they had a little

1:02:14

cat the other day or something and

1:02:16

he was going to war And I

1:02:18

was like damn dude, this is it

1:02:21

was AI, you know, totally and they

1:02:23

had a baby was getting in a

1:02:25

taxi And I was like this shit

1:02:27

is illegal, you know, I don't know

1:02:29

if it's illegal or not, but it

1:02:32

seems You know, it doesn't seem it,

1:02:34

you know, it's definitely a tool for

1:02:36

storytellers, right? Like it's right, it's it'll

1:02:38

help people with a creative vision or

1:02:41

or one war cats. Yeah, that's a

1:02:43

story right there special forces cats Would

1:02:45

YouTube recognize this? Damn, brother, if they

1:02:47

show up, dude. Wow. This honestly, it

1:02:49

looks, it looks bad ass. And it

1:02:52

looks like cats have been waiting to

1:02:54

do this a long time. Yeah. That's

1:02:56

the crazy shit about cats. You look

1:02:58

in their eyes. Now their outfits finally

1:03:01

match the look in their eyes. That's

1:03:03

what it feels like, dude. That's, uh,

1:03:05

that's, Fremen, Fremen cats. Oh my gosh,

1:03:07

yeah. So that's gonna get alarming, dude.

1:03:09

That's gonna get hella alarming. That's a

1:03:12

movie right there. It is, but it's

1:03:14

almost like you could make, like, that's

1:03:16

what I wanna get. I wanna get

1:03:18

somebody to help us think, hey, don't

1:03:20

make these little segments that we can

1:03:23

take our creativity. And instead of me

1:03:25

thinking, man, I gotta write this huge

1:03:27

crazy script for just kind of small

1:03:29

things, you know, little moments. Yeah. Can

1:03:32

you help me make this happen,

1:03:34

make it happen? AI will just be something

1:03:36

that we turn to to help make

1:03:38

our dreams happen like we help make

1:03:40

our ideas and our dreams and our

1:03:43

you know whatever we want to do

1:03:45

happen more easily. Got it. That's, like,

1:03:47

at the core, what it'll be. Yeah.

1:03:49

Who's the current leader in

1:03:52

AI development? Like, is it America, is

1:03:54

it China, is it Israel, is

1:03:56

it trying to think of another super

1:03:58

power... Russia maybe? Or, yeah, who is

1:04:00

it Taiwan I know has a lot

1:04:03

of the chips yeah and they manufacture

1:04:05

a lot of the chips over there

1:04:07

and does it matter what country leads

1:04:09

in AI or does it just matter

1:04:11

the company like scale or another AI

1:04:14

company. So yeah, so today America

1:04:16

is in the lead, but China

1:04:18

as a country is sort of hot

1:04:20

on our tails. Like there was all

1:04:23

that news about Deep Seek a couple

1:04:25

weeks ago. And Deep Seek still in

1:04:27

most places around the world is the

1:04:29

number one most downloaded app. You know,

1:04:31

it's downloaded a ton. you know everywhere

1:04:34

around the world frankly and is a

1:04:36

Chinese AI system right and so it's

1:04:38

starting to rival a lot of the

1:04:40

American AI systems also because it's free

1:04:43

and you know it uh it kind

1:04:45

of like shocked the world so right

1:04:47

now if you kind of look at

1:04:49

it, US and China are a

1:04:51

little bit neck and neck maybe the

1:04:54

US is like a little bit

1:04:56

ahead and you kind of like look

1:04:58

at it. If you go back to each

1:05:00

of the three pieces that I talked

1:05:03

about, so the chips and the computational

1:05:05

power, the data, and the algorithms, if

1:05:07

you were to rack and stack US

1:05:09

versus China on each one of those,

1:05:11

we're probably, we're ahead on the computational

1:05:14

power because the United States is the

1:05:16

leader at developing the chips and most

1:05:18

of the most advanced chips are American

1:05:20

chips. They probably beat us out on

1:05:22

data. because China, they've been investing into

1:05:25

data for a very long time as

1:05:27

a country. And then on algorithms, we're

1:05:29

basically neck and neck. So it's a

1:05:31

pretty tight race. And to your question

1:05:34

about, does it matter? Or what does

1:05:36

this mean? I think it's actually going

1:05:38

to be one of the most important,

1:05:40

you know. questions or most important races

1:05:42

over time is, is it US or

1:05:45

Chinese AI that wins? Because, you know,

1:05:47

AI is more than just being a

1:05:49

tool that, that, you know, we can

1:05:51

all use to make our, you know,

1:05:54

build whatever we want to or make

1:05:56

whatever ideas we want to happen. It's

1:05:58

also, you know, it's a, it's a

1:06:00

cultural staple, right, you know, if you

1:06:02

talk to an AI, that AI is

1:06:05

kind of a reflection of our culture

1:06:07

and our values and all that stuff.

1:06:09

So in America, we value free speech

1:06:11

and, you know, the AIs are, you

1:06:13

know, are built to support that. Whereas

1:06:16

in, in China, there's, there isn't free

1:06:18

speech. And so, you know, if, if

1:06:20

the Chinese AIs are the ones that

1:06:22

take over the world, then all these

1:06:25

Chinese ideologies are going to become exported

1:06:27

all around the world. And so first

1:06:29

is there's a couple dimensions here that

1:06:31

I think matter. So first is just

1:06:33

the cultural element, which is like, do

1:06:36

we want kind of democracy and free

1:06:38

speech to be the cultural AI that

1:06:40

wins? Or do we want sort of

1:06:42

the more, you know, frankly, totalitarian AIs

1:06:45

in China to be the ones that

1:06:47

win? And then there's sort of the

1:06:49

There's like the, you know, you start

1:06:51

getting to economically. So AI is going

1:06:53

to be something that helps all the

1:06:56

companies in the United States thrive. And

1:06:58

so if the US AI wins, then we're

1:07:00

going to, you know, the economy will

1:07:02

grow faster. We're going to have more

1:07:04

and more opportunity. You know, the country

1:07:07

will still be better and better and

1:07:09

better and better. And the economy will

1:07:11

keep growing. Whereas if Chinese AI

1:07:13

wins, then Chinese economy is going to

1:07:16

grow way faster than the American economy.

1:07:18

That's the economic piece. And then lastly, there's

1:07:20

the, there's kind of the warfare piece,

1:07:22

right? And, you know, AI, we haven't

1:07:24

really talked about it, but has clear

1:07:27

potential to be used as a military

1:07:29

technology. And we don't want, you know,

1:07:31

we don't want another country to have,

1:07:33

because they have better AI to have

1:07:36

a much stronger military than

1:07:38

America's. So, like, how would they do

1:07:40

that? How would they have a better

1:07:42

AI, or how would they use it to

1:07:44

have a better military? Yeah, how would

1:07:47

they use it to have a better

1:07:49

military? Like why is that kind of

1:07:51

a concern or potential concern? Yeah, so

1:07:53

one of the things that's been happening

1:07:55

over the past, you know, decade for

1:07:58

sure is lots of hacking, cyber hacking

1:08:00

going on. So, you know, in America,

1:08:02

even recently, we had this huge cyber

1:08:04

hack called salt typhoon, where the Chinese

1:08:07

hacked our telecommunications companies, so hacked

1:08:09

the phone companies. Damn, they did? Yeah,

1:08:11

and they got it and they got

1:08:13

all sorts of crazy data as a

1:08:15

result of that. Oh, they know I'm

1:08:18

a pervert, I'll tell you that. Yeah,

1:08:20

look at this, this happened in 2020.

1:08:22

Salt typhoon is widely understood to be

1:08:24

operated by China's Ministry of State Security,

1:08:27

its foreign intelligence service and secret police.

1:08:29

Chinese embassy denied all allegations saying it

1:08:31

was unfounded and irresponsible smears and slanders. High

1:08:33

profile cyber espionage. In 2024, U.S. officials

1:08:35

announced that hackers affiliated with Salt typhoon

1:08:38

had accessed the computer systems of nine

1:08:40

U.S. telecommunications companies later acknowledged to include

1:08:42

Verizon, AT&T, T-Mobile, Spectrum, Lumen, Consolidated Communications,

1:08:44

and Windstream. The hackers were able to

1:08:47

access metadata of users' calls and text

1:08:49

messages. Fuck! Oh, we're fucked, I am.

1:08:51

You seem good. Including date and time

1:08:53

stamps, source and destination IP addresses. Ah

1:08:55

shit, keep going, keep going. And phone

1:08:58

numbers of over a million users, most

1:09:00

of which were located in Washington DC,

1:09:02

good. Light them up, dude. In some

1:09:04

cases, the hackers are able to obtain

1:09:06

audio recordings of telephone calls made by

1:09:09

high profile individuals. Such individuals reportedly included

1:09:11

staff of the Kamala Harris 2024 presidential campaign

1:09:13

as well as phones belonging to Donald

1:09:15

Trump and JD Vance. According to Deputy

1:09:18

National Security Advisor Anne Neuberger, a large

1:09:20

number of the individuals whose data was

1:09:22

directly accessed were government targets of interest.

1:09:24

Wow. Yeah, that's free. So do you

1:09:26

think this also that that whole thing

1:09:29

could be not real and it's just

1:09:31

a story that was created? That

1:09:33

seems pretty real, because there's real,

1:09:35

like, I mean, there's like 20

1:09:37

stories where the Chinese have hacked

1:09:39

American systems. Like they hacked, this

1:09:41

was, this must have been close

1:09:43

to 10 years ago now, but

1:09:46

the Chinese hacked the database in

1:09:48

America that stored all of the

1:09:50

clearances. So they hacked in, they

1:09:52

managed to hack into knowing who

1:09:54

are literally. all of the Americans

1:09:56

who have security clearance. Oh, security

1:09:58

clearance. Yeah, so I thought you

1:10:00

know what was on sale. I

1:10:02

was like, who cares? That's great

1:10:04

though. Damn, oh damn. So they

1:10:06

knew everybody who had...

1:10:08

who knew

1:10:10

information, who knew secrets. Yeah. So

1:10:12

once they knew that, then they

1:10:14

know, well, that's a great point

1:10:16

of operation to go then. Well,

1:10:18

now let's get, find their data.

1:10:20

They can just hack all of

1:10:22

them. Yeah.

1:10:24

So,

1:10:26

so already, China

1:10:28

is hacking the shit out of

1:10:30

America. That is definitely not an

1:10:32

understatement. Yeah, it's exciting kind of.

1:10:34

I mean, it's unfortunate, but it's

1:10:36

also exciting. I like some espionage,

1:10:38

you know, I can't sleep unless

1:10:40

somebody's fucking really going through it.

1:10:42

Um, but then, but then, but

1:10:44

that's a real AI thing. So

1:10:46

AI can be used to do

1:10:48

that because you can, like, prompt

1:10:50

it to go and do things

1:10:52

like that. Yeah, yeah, there's a

1:10:54

bunch of a bunch of recent

1:10:56

demonstrations where AI, just like how

1:10:58

in Go, how AI beat the

1:11:00

world's best Go players, AI is starting

1:11:02

to beat the world's best cyber

1:11:04

hackers and the world's best. Yeah,

1:11:06

so it's a, I don't know

1:11:08

if you saw Mr. Robot. I

1:11:10

didn't, but I didn't with Magnus

1:11:12

Carlsen. No, that's cool. Pretty cool.

1:11:14

So, but no, Mr. Robot, is

1:11:16

it good? It's just it it

1:11:18

shows like all this hacking stuff

1:11:20

and like you know cool it

1:11:22

makes it seem really cool. Okay

1:11:24

cool I'm gonna check it out

1:11:26

but but yeah no hacking like

1:11:28

like you're gonna have AI that

1:11:30

are hacking everything in America and

1:11:32

this is one place where this

1:11:34

is like US versus China will

1:11:36

really come to life, which

1:11:38

is who has better AI that's

1:11:40

better at defending against the hacks

1:11:42

from the other guy as well

1:11:44

as hacking the other

1:11:46

guy's systems. That's gonna be... That'll

1:11:48

just start happening. That's basically starting

1:11:50

to happen right now. Or it's,

1:11:52

you know, cyber warfare has been

1:11:55

happening. And then AI cyber warfare

1:11:57

is going to start happening. Yeah,

1:11:59

basically as soon as, you know,

1:12:01

as AI gets better. Yeah, we

1:12:03

had Craig Newmark, who created Craigslist,

1:12:05

on. Yeah. And he was

1:12:07

talking about how, what if they

1:12:09

hacked, like, everybody's Teslas to all

1:12:11

just drive off a cliff one

1:12:13

day or they hacked everybody's ovens

1:12:15

to go up to 400 degrees

1:12:17

in the middle of the night

1:12:19

while you're sleeping and then fires

1:12:21

started where like just things like

1:12:23

that that you don't start to

1:12:25

think about that once something is

1:12:27

connected to the grid or something

1:12:29

like that are connected through routers

1:12:31

and Wi-Fi that that could be

1:12:33

feasible. Yeah, no, there's a

1:12:35

lot of

1:12:37

things they could do that won't

1:12:39

even like seem like that big

1:12:41

a deal at

1:12:43

the time, but could be a really,

1:12:45

really big deal.

1:12:47

So for example, let's say the

1:12:49

Chinese, like, they just took out

1:12:51

all the military, like communication systems

1:12:53

and all the military software systems,

1:12:55

like took out the satellites, took

1:12:57

all that for like 10 minutes.

1:12:59

And in those 10 minutes, they

1:13:01

like, you know, invaded somewhere or

1:13:03

they like did some crazy thing.

1:13:05

Like, they can just...

1:13:07

the thing about this stuff

1:13:09

is, like, you know,

1:13:11

as the world becomes more connected,

1:13:13

it also enables you know different

1:13:15

kinds of warfare so cyber warfare

1:13:17

is really big also like the

1:13:19

you know information warfare is another

1:13:21

big one so what does information

1:13:23

warfare mean so this is information

1:13:25

warfare is all about you know

1:13:27

in a place what is what

1:13:29

are like the stories this is

1:13:31

kind of gets to like you

1:13:33

know the like propaganda or you

1:13:35

know these conspiracy theories like what

1:13:37

are the stories that in a

1:13:39

place that we're trying to make

1:13:41

happen or not make happen and

1:13:43

we know that China does a

1:13:45

bunch of information warfare, called IW

1:13:47

sometimes, but they

1:13:49

have whole operations.

1:13:51

this is actually the craziest part

1:13:53

They have, like...

1:13:55

the Chinese military at various points

1:13:57

has hired millions and millions of

1:13:59

people who are supposed to be

1:14:01

on like various like chat groups

1:14:04

and WhatsApp groups and

1:14:06

WeChat groups and whatnot, and just

1:14:08

spread the right kind of stories

1:14:10

that'll make it such that they

1:14:12

can make their political aims happen.

1:14:14

So for example, when China wanted

1:14:16

to start, like, kind of like,

1:14:18

I don't know what the word

1:14:20

is, like, when China wanted Hong

1:14:22

Kong to become a part of

1:14:24

China again, which happened not too...

1:14:26

you know, pretty

1:14:28

recently. The PRC? Right, to the

1:14:30

PRC, exactly. When they wanted to...

1:14:32

is that what you're looking for?

1:14:34

yeah they would they would yeah

1:14:36

exactly they would use a lot

1:14:38

of propaganda and that's information warfare

1:14:40

to be able to just make

1:14:42

it such that that all happen

1:14:44

much more easily. Well, it's unbelievable.

1:14:46

I'll see stories even about, I'll

1:14:48

be going through TikTok and see

1:14:50

a story come up about something

1:14:52

in my life that is not

1:14:54

even true, insane, some of it

1:14:56

looks fun, but never was a

1:14:58

part of my existence. And then

1:15:00

you'll see hundreds of people have

1:15:02

said something about it. And I'll

1:15:04

have friends that'll ask me about

1:15:06

it. I'm like, that's just, uh,

1:15:08

they're created just to delude us.

1:15:10

Yeah, I don't know if delude

1:15:12

is the word, is it? Trick

1:15:14

us or make us think something.

1:15:16

Just to fucking Halloween us out.

1:15:18

Wow, so there was a lot

1:15:20

of interesting things. You know what's

1:15:22

crazy man? Some things makes life

1:15:24

scary, but then it also makes

1:15:26

it interesting in a...

1:15:28

in a fun way. How do

1:15:30

we, how much do we have

1:15:32

to fear? Say if a certain

1:15:34

country or a certain company owns

1:15:36

an AI right in that country

1:15:38

and that company If they're Chinese

1:15:40

if they have a certain religious

1:15:42

belief, or they have information that

1:15:44

they want to adjust

1:15:46

history How much would a company

1:15:48

be able to like say they

1:15:50

keep certain data out of their

1:15:52

information system? And then after

1:15:54

a while, if that's

1:15:56

a company that kind of takes

1:15:58

the lead in AI or one

1:16:00

of the main ones, then the

1:16:02

truth could disappear. Is that

1:16:05

true? That if somebody loaded it

1:16:07

just with the data that wasn't

1:16:09

factual, that we could start to

1:16:11

not have the truth? Is that,

1:16:14

does that make any sense or

1:16:16

no? Yeah, totally. I think this

1:16:18

is, this is something that, it's

1:16:21

definitely the right thing to worry

1:16:23

about. So, so first off, if

1:16:25

you ask any Chinese AI system

1:16:27

that comes out of China. If

1:16:29

you ask any of them about

1:16:31

a question about President Xi,

1:16:33

the leader of the Chinese

1:16:36

government, or you ask them

1:16:38

any question about Tiananmen Square,

1:16:40

or all these like key

1:16:42

historical or cultural things

1:16:44

relevant to China, it'll say

1:16:46

it can't talk about them. Because

1:16:48

there's regulation in China that if

1:16:50

you talk about some of these

1:16:52

things like... you're going to get

1:16:54

shut down, you're going to have

1:16:57

a really bad day. There's like

1:16:59

cases where the Chinese government disappears

1:17:01

people, which we don't know what

1:17:03

happens to them, but they do

1:17:05

disappear. So there's, this is part

1:17:08

of the thing that's worrying, especially

1:17:10

about China versus US, even before

1:17:12

you get into any of the

1:17:15

military stuff that we're talking about.

1:17:17

It's just like the Chinese, Chinese

1:17:19

AI systems are censored and are

1:17:21

going to, you know, be You

1:17:23

know, they're going to, they're

1:17:25

going to erase certain historical

1:17:28

elements or they're going to be. Yeah,

1:17:30

look at that. This is Deep Seek.

1:17:32

You know, you ask it, is President

1:17:34

Xi of China a good guy? Sorry,

1:17:36

that's beyond my scope. Let's talk about

1:17:38

something else. Oh, let's talk about something.

1:17:41

Not only does it say it's beyond

1:17:43

my scope, it says let's talk about

1:17:45

something else, huh? Wow, get this Yao

1:17:47

Ming Jersey, homie. But and people always

1:17:49

people also have to remember about China

1:17:51

that that's their whole government,

1:17:53

their whole system is like that so

1:17:56

some of them people are like China

1:17:58

does this but that's how they're built

1:18:00

right they're built to like only give

1:18:02

out certain information of their people and

1:18:04

to have communism right yeah yeah so

1:18:07

I mean but that could also happen

1:18:09

with American companies right we can have

1:18:11

an American company that owns it and

1:18:14

they only want certain information in there

1:18:16

that could happen anywhere like China that's

1:18:18

probably gonna be because that's their MO

1:18:21

sort of yeah in China it's regulated

1:18:23

so basically, like, the government has

1:18:25

control, they have control. So,

1:18:28

like, there are these stories

1:18:30

about how there were Chinese news sites.

1:18:32

News sites? News sites. Yeah. And they

1:18:35

would once a Chinese news site, um,

1:18:37

accidentally ran an article about, uh, President

1:18:39

Xi, how he kind of looks like

1:18:42

Winnie the Pooh. Oh, yeah. They let

1:18:44

that bring him up. Oh, a hundred

1:18:46

acre wood gang son. I was out

1:18:49

there boy. I was out there, bro.

1:18:51

Christopher Robin, dude. Get him up. Oh,

1:18:53

he does. Yeah, that's awesome. Yeah, but

1:18:56

if you talk about this in China,

1:18:58

you like are risking your life. So

1:19:00

what happened, what happened when this happened,

1:19:03

this happened on a, on a news

1:19:05

site in China, and then the, uh,

1:19:07

the, the CEO of that company, like,

1:19:10

they shut down the whole app, was

1:19:12

shut down for like a week, in

1:19:14

the aftermath of that, and then the,

1:19:17

the CEO disappeared for a week, and,

1:19:19

we don't know what happened to him,

1:19:21

but then. As soon as he came

1:19:24

back, he was like, there's like this

1:19:26

weird video where he was like, you

1:19:28

know, super apologetic. I mean,

1:19:31

it's, it's kind of, it's pretty scary.

1:19:33

Wow. So, um, so in China, it's

1:19:35

like, this is, the government has control,

1:19:38

you know, you don't have AI system

1:19:40

companies, AI, any companies that can, that

1:19:42

can talk about this stuff. Right, so

1:19:45

it's heavily regulated there, where it's not,

1:19:47

that's not the, that's not the case

1:19:49

here. And this is, I think we

1:19:52

have to be diligent and make sure

1:19:54

this continues to be the case, but.

1:19:56

And here's an example right here just

1:19:59

to interrupt you, but so we get

1:20:01

it. Typed in, does Winnie the

1:20:03

Pooh look like any world leaders? And

1:20:06

that's on the Chinese version. And it

1:20:08

says, I am sorry, I can answer

1:20:10

to that question. I'm an AI assistant

1:20:13

design to provide helpful and harmless responses.

1:20:15

Whereas the chat GBT says, Winnie the

1:20:17

Pooh has often been compared to world

1:20:20

leaders, particularly Gijai Ping, present in China,

1:20:22

boy. Wow. So that's funny. But it's

1:20:24

just funny, yeah, one. So it just

1:20:27

shows you how that can easily happen.

1:20:29

And this is kind of a, this

1:20:31

is like a, a relatively innocuous example,

1:20:34

but... What's innocuous mean? Like, it's relatively

1:20:36

harmless. Like, this isn't, I mean- Right,

1:20:38

this is harmless. Yeah, this is harmless.

1:20:41

But there's stuff where like, like in

1:20:43

China today, they have large scale, effectively

1:20:45

concentration camps and re-education camps for the

1:20:48

ethnic minority in China, the Uyghurs. And

1:20:50

that's something that- The Uyghurs? The Uyghurs?

1:20:52

Yeah. Hell yeah, boy. Shout out Brian

1:20:54

Purvis, dude. They're recognized as the titular

1:20:57

nationality of the, um, of a region

1:20:59

in Northwest China. And they're sending

1:21:01

them to rehabilitation camps to change their

1:21:04

views and information? Yeah, yeah, so look

1:21:06

at this, this, uh... Look at this

1:21:08

guy. Persecution of the Uyghurs in China.

1:21:11

Since 2014, the government of the PRC,

1:21:13

People's Republic of China, has committed a

1:21:15

series of ongoing human rights abuses against

1:21:18

the Uyghurs and other Turkic Muslim minorities

1:21:20

in Xinjiang, which has often been characterized

1:21:22

as persecution or as genocide. Wow. They

1:21:25

got their own Gaza rocking over there.

1:21:27

It's pretty bad. It's unfortunate bad. It's

1:21:29

really sad. Mass detention, government policies, and

1:21:32

forced labor and they're just trying to

1:21:34

change the way that they think and

1:21:36

view stuff so it's basically Yeah, it's

1:21:39

just like erasing their culture, you know

1:21:41

pulling them into China. It's awful. Every

1:21:43

place has done this over the years

1:21:46

and that's the craziest thing about history.

1:21:48

It's like every place is guilty of

1:21:50

this same thing. Totally. And it just,

1:21:53

it's unfortunate. So it's hard to point

1:21:55

fingers, you know, you can point them.

1:21:57

but you have to point one at

1:22:00

your own people as well. But that's

1:22:02

a thing where if you ask, like,

1:22:04

if you ask a Chinese AI, it's

1:22:07

not gonna tell you about that. And

1:22:09

it won't come clean about that. Whereas,

1:22:11

thankfully in America, at least when we

1:22:14

see people or groups of people or

1:22:16

countries doing bad things, we can call

1:22:18

it out, we can talk about it,

1:22:21

we can make sure it doesn't happen

1:22:23

in the future. So that's part of

1:22:25

the. That's one of the things that's

1:22:28

that could happen that could happen it's

1:22:30

like you you could have I mean

1:22:32

it's kind of dystopian but you know

1:22:35

I think there's a real case where

1:22:37

let's say the Chinese AI is the

1:22:39

winning AI like we're all using Chinese

1:22:42

AI and then all of a sudden

1:22:44

we're like we were shut out from

1:22:46

information about what are like awful things

1:22:49

happening in the world or what awful

1:22:51

things the government's doing like we might

1:22:53

just not be able to know about

1:22:56

what's going on. And you know what's

1:22:58

weirdly, and I hate to say this,

1:23:00

maybe it's silly, I don't know, it

1:23:03

might be a blessing and a curse,

1:23:05

and sometimes, because sometimes it's like you're...

1:23:07

So overwhelming. Yeah, you're so inundated with

1:23:10

the overwhelmingness of what's often is not

1:23:12

the best stuff. Sometimes you get a

1:23:14

lot of humor stuff too. in social

1:23:17

media reels, but you can get scrolling

1:23:19

and get caught in some. Doom

1:23:21

scrolling. Yeah, and it starts to feed

1:23:24

you, that's the sickness of it. Yeah.

1:23:26

It's like, hey, we, this isn't, we

1:23:28

know this information probably to make you

1:23:31

feel good. They're not thinking about it

1:23:33

like that, they're just a machine, but

1:23:35

you know it doesn't, it adds stress

1:23:38

to your, it makes you agitated towards

1:23:40

a group or ethnicity or something, or

1:23:42

yourself even, and then you continue to.

1:23:45

It continues to feed it to you.

1:23:47

Do you fear that that could happen

1:23:49

to AI from our government? Like have

1:23:52

you been approached by the government to

1:23:54

try and, can you work with the

1:23:56

government some, right? We work with the

1:23:58

government, yeah. We work a lot with

1:24:01

the government, yeah. We work a lot

1:24:03

with the government to make sure that

1:24:05

they're using these AIs, and they're actually

1:24:08

like, you know, to my point on,

1:24:10

we don't want China to get the

1:24:12

jump on us, on AI used for

1:24:15

all these nefarious purposes; it's advancing faster.

1:24:17

Is that one of your biggest employers?

1:24:19

Or is that employer? Employee? Is that

1:24:22

one of your biggest customers? One of

1:24:24

your biggest customers? They're a big one.

1:24:26

Yeah, they're a big one. Not our

1:24:29

biggest, but they're an important one. I

1:24:31

mean, I grew up in a government

1:24:33

lab town. So it's also part of

1:24:36

your existence really. You've known about the

1:24:38

relationship between government and technology. Yeah, totally.

1:24:40

But no, I don't think, I mean,

1:24:43

I... Dude, you should be a superhero

1:24:45

almost, dude. It's kind of crazy. Math,

1:24:47

you know? Yeah. Goes a long way.

1:24:50

Oh, hell yeah, dude. Divide these nuts,

1:24:52

dude. That's what I tell 'em. I

1:24:54

just asked Deep Seek, who are the

1:24:57

Uyghurs? And at first, it spit out

1:24:59

like a Wikipedia response that said they

1:25:01

were a people, and there's been, like, persecution

1:25:04

from China that's debated, and it refreshed,

1:25:06

and it refreshed. Yeah man. Do you,

1:25:08

has the government tried to say that

1:25:11

we need to make sure that, like

1:25:13

could that happen in our country or

1:25:15

the government also curtails what's? It hasn't

1:25:18

happened yet. Obviously like, you know, we

1:25:20

have to, we have to make sure

1:25:22

that we uphold all our values, right,

1:25:25

and that we maintain free speech and

1:25:27

we maintain free press and all these

1:25:29

things. But as right now, no, I

1:25:32

don't think, I don't think that's a

1:25:34

risk in the, in the United States.

1:25:36

Awesome. You hear about, like, chipmakers, Nvidia,

1:25:39

all the time. Taiwan, that place

1:25:41

is just a hotbed for chips. Why

1:25:43

is it a hotbed for chips? Yeah,

1:25:46

so one of the biggest companies

1:25:48

in the world is this company called

1:25:50

Taiwan Semiconductor. Yeah, TSMC.

1:25:53

Yeah. So there. I mean, it's like

1:25:55

a trillion dollar company based in Taiwan.

1:25:57

And that is where almost all

1:26:00

of the high-end chips for AI that

1:26:02

you know, we're kind

1:26:04

of talking about, all of them are manufactured

1:26:07

there. They have the

1:26:09

most advanced, think about them as factories,

1:26:11

like the most advanced chip factories in

1:26:14

the world, they're called fabs or fabricators,

1:26:16

but basically these huge factories that are

1:26:18

like, you know, there's all sorts of

1:26:21

crazy stuff, so they have the most

1:26:23

expensive machines in the world, machines

1:26:25

that cost hundreds of millions of dollars

1:26:28

in there, they have, they build them

1:26:30

because, so that... You know, the chips,

1:26:32

they have to be made at the

1:26:35

like finest levels and very very precisely.

1:26:37

Yeah, you need small hands. Probably, huh?

1:26:39

Well, that and there's like these machines

1:26:42

that, um, that at the nanometer level.

1:26:44

make like little marks and edges on

1:26:46

top of they have those they have

1:26:49

those yeah those are super expensive machines

1:26:51

so that so the it's it's crazy

1:26:53

yes but the but it's so it's

1:26:56

the machinery is so precise that even

1:26:58

if there's like a little bit of

1:27:00

like seismic movement a little earthquake or

1:27:02

a little bit of movement it can

1:27:05

fuck up the whole machine so they

1:27:07

have to build the buildings, build the

1:27:09

factories in a way such that there's

1:27:12

like, like the whole building doesn't move,

1:27:14

even if there's like a little earthquake

1:27:16

or a little shake from the earth.

1:27:19

So it's like, it's this crazy, crazy

1:27:21

engineering. And so that's, so these, all

1:27:23

these giant factories are in Taiwan and

1:27:26

that's where basically like 100% of all

1:27:28

the advanced AI chips are made. So

1:27:30

that's why Taiwan matters so much. Got

1:27:33

it. But then the reason it's a hotbed

1:27:35

is that the People's Republic of China

1:27:37

has a... They used to own Taiwan,

1:27:40

right? Yeah, I mean, true or not,

1:27:42

I might make that up. There's a

1:27:44

complicated relationship between Taiwan and China where,

1:27:47

you know, if you ask people in

1:27:49

Taiwan, they want to be independent, they

1:27:51

want to be their own country, but

1:27:54

the People's Republic of China has a

1:27:56

sort of a reunification plan that they

1:27:58

want to bring Taiwan... back into their

1:28:01

country and be back a part of

1:28:03

China. So it's kind of... It's kind

1:28:05

of like, potentially, you know, thankfully, there's

1:28:08

no war yet, but there's a risk.

1:28:10

Still talking to your ex. Yeah, exactly.

1:28:12

There's a risk it becomes like Russia,

1:28:15

Ukraine, or, you know, one of these

1:28:17

really, really bad situations. So, so that's

1:28:19

what's scary. What's scary is that, that,

1:28:22

A, China wants to, you know, either

1:28:24

invade or bring Taiwan back into its

1:28:26

country. And there have been. you know,

1:28:29

President Xi has has ordered his military

1:28:31

to get ready to do that before

1:28:33

2027. Now, we don't know what's going

1:28:36

to happen, but you know, if an

1:28:38

extremely powerful world leader says to get

1:28:40

something ready by 2027, you kind of

1:28:43

read between the lines a little bit.

1:28:45

And, and that's part of it is,

1:28:47

is, obviously, you know, we don't want

1:28:50

to enable them to take over this

1:28:52

island. But then the other thing that's

1:28:54

scary is China may view it as

1:28:57

a way to just like win on

1:28:59

AI because if they take over the

1:29:01

island with all of these

1:29:04

giant factories all the chips baby they'll

1:29:06

get all the chips Frido Lamborghini baby

1:29:08

they be running it all they be

1:29:11

running it all yeah so yeah that's

1:29:13

why Taiwan is yeah because you kind

1:29:15

of hear about it in the whispers

1:29:18

of like a potential place where there

1:29:20

could be like a conflict yeah and

1:29:22

there's there's all these reports about how

1:29:25

China's, they're stacking up tons and tons

1:29:27

of military right on their coast, to,

1:29:29

you know, that's pointed directly at Taiwan,

1:29:32

and it's, it's pretty close, Taiwan's pretty

1:29:34

close to China, like it's,

1:29:36

it's, uh, it's not so far away,

1:29:39

so, that's, yeah, it's spooky. We're so

1:29:41

blessed to have a place where at

1:29:43

least we can sleep in peace, even

1:29:46

if we're uncomfortable at times in our

1:29:48

brains, you know, to not have that

1:29:50

constant threat. Yeah, totally. Yeah, so you

1:29:53

don't think, you

1:29:55

don't worry that the government will regulate

1:29:57

right now. It's not a concern at

1:29:59

the moment? Regulate AI? Yeah, in America?

1:30:02

No, I don't think so. I think

1:30:04

we're focused on how do we make

1:30:06

sure that America wins, how do we

1:30:09

make sure that the United States comes

1:30:11

out on top, and that we enable

1:30:13

innovation to keep happening? Would you think

1:30:16

they could regulate the amount of chips

1:30:18

that you're allowed to have? So this

1:30:20

is a hot topic, globally actually. Damn!

1:30:23

Finally, dude! 560 interviews, we got a

1:30:25

good question. This is one of the

1:30:27

hottest topics in DC right now, is

1:30:30

what are we going to do about

1:30:32

how many chips other people are allowed

1:30:34

to have? Because almost all the chips

1:30:37

are American chips. So they all are

1:30:39

American chips. And technically... We own most

1:30:41

of them? Yeah, exactly. But China owns

1:30:44

most of them, too? There, China has

1:30:46

their own chip industry,

1:30:48

but it's behind ours. Okay, got it.

1:30:51

Yeah. So the United States has

1:30:53

the most advanced chips, and

1:30:55

all these chips are the envy of

1:30:58

the world. Everybody in the world wants

1:31:00

our chips. And the one of the

1:31:02

big questions is, you know,

1:31:05

does the government allow a lot of

1:31:07

these chips to go overseas to

1:31:09

China or parts of Asia or the

1:31:12

Middle East or wherever or do we

1:31:14

want to make sure they stay in

1:31:16

America and make sure that we win

1:31:19

in America. And this is a super

1:31:21

duper. You know, they're called export controls.

1:31:23

Yeah. Because it's a possibility to run

1:31:26

it all. Yeah, exactly. Who's got the

1:31:28

chips? What do you think about it?

1:31:30

It's a complicated, complicated thing because basically,

1:31:33

you know, one argument is we shouldn't

1:31:35

be throwing our weight around in this

1:31:37

way. You know, maybe it's fine,

1:31:40

it's a free market like if other

1:31:42

people want our chips they should be

1:31:44

able to get our chips and that

1:31:47

way you know the world is running

1:31:49

on American chips, that can be

1:31:51

good in some ways, and it helps

1:31:54

make sure that, you know, helps bolster

1:31:56

our economy, our industry. But the other

1:31:58

way to look at it is, hey,

1:32:01

AI is really, really important that America

1:32:03

wins at, and we don't want to,

1:32:05

like, let's not give other people any

1:32:08

advantages, or let's make sure that we

1:32:10

win, and then we can figure out

1:32:12

what we're going to do with all

1:32:15

the chips. So you can see both

1:32:17

sides of it, right? 50 different arguments

1:32:19

on both sides of the

1:32:22

conversation. But you know, where I go,

1:32:24

where I come from on it is

1:32:26

like, let's make sure America wins and

1:32:29

let's start from there and then figure

1:32:31

out what we need to do to

1:32:33

win. Are there uses of AI that

1:32:36

you feel like cross the line kind

1:32:38

of? I definitely think like, well, I

1:32:40

worry a lot about this kind of

1:32:43

like. you know, maybe brainwashing kind of

1:32:45

thing. Like I don't want, I don't

1:32:47

want AIs that are specifically programmed to

1:32:50

make me think a certain thing or

1:32:52

persuade me to do a certain thing.

1:32:54

And that could happen. That could happen.

1:32:57

Yeah. So I'm really worried about this

1:32:59

kind of like deception and persuasion from

1:33:01

AIs. Like I don't want AIs that

1:33:03

are lying to me or that are

1:33:06

sort of like, that are kind of

1:33:08

like. nudging me or persuading me to

1:33:10

do things that I don't want to

1:33:13

do or I shouldn't be doing? That's

1:33:15

what I worry about. Because it could

1:33:17

happen. We don't realize how easily we're

1:33:20

influenced. Little things that influence us. Yeah.

1:33:22

And even just a turning of a

1:33:24

phrase or a little bit of this

1:33:27

or pointing you to a couple of

1:33:29

links in your life could lead you

1:33:31

down a whole world. It's kind of,

1:33:34

it's pretty fascinating. So people that could,

1:33:36

people that had... how do you keep

1:33:38

your AI clean? How do you guys

1:33:41

keep your AI clean? Well, this is

1:33:43

where it goes back to, A, the

1:33:45

data. Okay. So you got to make

1:33:48

sure that that data to your point

1:33:50

the large body of water is as

1:33:52

clean as as as as pristine as

1:33:55

possible. You got lifeguards on it. Yeah,

1:33:57

you got lifeguards. We got filters. We

1:33:59

got, we got. Game wardens. Yeah. So,

1:34:02

so the big part of it is

1:34:04

about the data. And the second part

1:34:06

is I think we have to just,

1:34:09

we have to constantly be testing the

1:34:11

AI system. So we have to, we

1:34:13

have to like, we constantly are running

1:34:16

tests on AI to see. Hey, is

1:34:18

there, are they unsafe in some way?

1:34:20

You know, one of the, one of

1:34:23

the tests that we run a lot

1:34:25

is, and this is like, you know,

1:34:27

across the industry, is like, are AIs

1:34:30

helping people do really nefarious things and

1:34:32

we're making sure that they don't. So,

1:34:34

you know, if somebody asks an AI,

1:34:37

Hey, help me make a bomb or

1:34:39

help me make like a, like COVID,

1:34:41

like 2.0 or whatnot, that the AI

1:34:44

is not helping you do that. So

1:34:46

we run a lot of tests to

1:34:48

make sure that it doesn't help in

1:34:51

those areas and then we make sure

1:34:53

that the data is really clean so

1:34:55

that there's no sort of like little

1:34:58

bit or piece of that that makes

1:35:00

its way to the model. With Outlier,

1:35:02

that's your program? Yep. With Outlier,

1:35:05

what type of

1:35:07

people are applying for those jobs? Can

1:35:09

people just log on and start to

1:35:12

and submit applications? Like how does that

1:35:14

work to become an information sorcerer? Yeah,

1:35:16

we call them contributors. Okay, information contributor.

1:35:19

Everybody's kind of contributing to the AIs,

1:35:21

everyone's contributing to the data that goes

1:35:23

into the AIs. It's kind of like,

1:35:26

I almost think about like the next

1:35:28

generation of Wikipedia, right, right? Yeah, look

1:35:30

at this, and we're hiring people all

1:35:33

around the world, so people in all

1:35:35

sorts of different languages. Dude, that's crazy,

1:35:37

man. Yeah, yeah. Well, it turns out,

1:35:40

by the way, most of the AIs

1:35:42

don't really speak other languages that well.

1:35:44

They're much, much better at English and...

1:35:47

So we want to make sure that

1:35:49

they speak all these languages well and

1:35:51

that there's these opportunities. But yeah, an

1:35:54

outlier, so anybody around the world can

1:35:56

log in and sort of, there's a

1:35:58

little bit of a, like orientation almost

1:36:01

on how to best, like what you're

1:36:03

supposed to do, how you're supposed to

1:36:05

do it, what expertise should you be

1:36:07

bringing, all that kind of stuff. And

1:36:10

then, and then you can just start

1:36:12

contributing to the AI models and you

1:36:14

get paid to do it. Wow. It's

1:36:17

pretty fascinating, man. And it's gonna be,

1:36:19

I mean, I really think, I legitimately

1:36:21

think jobs from AI are gonna be

1:36:24

the fastest growing jobs in the world

1:36:26

for the years to come. You do.

1:36:28

Yeah. Like what so jobs where people

1:36:31

are able to contribute information jobs where

1:36:33

people are able to like what like

1:36:35

what would the examples of those be

1:36:38

just some of the ones you've already

1:36:40

listed? Yeah all the ones we've been

1:36:42

talking about right like contributing to the

1:36:45

AIs helping to utilize the AIs and

1:36:47

helping to shape the AIs.

1:36:49

Like helping organizations or

1:36:52

companies or people use the AIs, that'll

1:36:54

be a really fast growing job, helping

1:36:56

to manage the AIs and make sure

1:36:59

they're on the straight and narrow. Where

1:37:01

would a young person go right now

1:37:03

or someone who's getting to college or

1:37:06

has, doesn't even want to go to

1:37:08

college, but this is the world they

1:37:10

want to get into and be one

1:37:13

of those people, what do they do

1:37:15

right now? Yeah, well this is all

1:37:17

happening so fast, right? Like, Outlier, we

1:37:20

only started a few years ago. So

1:37:22

all of this is happening so, so

1:37:24

quickly, but what we want to do, ultimately,

1:37:27

is make it easy for anybody in

1:37:29

the world to, you know, gain the

1:37:31

skills they need, to be able to

1:37:34

do this work well, to learn what

1:37:36

it means, what it does, and ultimately

1:37:38

be in a position where they can,

1:37:41

they can, you know, help build AIs

1:37:43

and then and then keep improving that

1:37:45

and gaining mastery and getting better and

1:37:48

better at it. But like where do

1:37:50

they go to school? Is there a

1:37:52

class they should take online? Like

1:37:55

how does someone start to become, you

1:37:57

know, just get a head start on

1:37:59

what could potentially be probably a lot

1:38:02

of job opportunities I'm guessing yeah like

1:38:04

in the AI space right yeah is

1:38:06

it just is it just engineers like

1:38:09

is it just mathematicians like no it's

1:38:11

everybody because yeah as we were talking

1:38:13

about like AI needs to get smarter

1:38:16

about literally everything so are

1:38:18

there colleges offering courses? Like, is there,

1:38:20

do you know like is there specific

1:38:22

places where people can because that's another

1:38:24

thing I think it's like I'm gonna

1:38:26

work in AI I'm gonna work in

1:38:28

AI, like, what do I... We would definitely love

1:38:30

to help build them. So I guess if

1:38:32

any colleges are listening, you know, anyone help figure

1:38:34

out these programs, we'd love to help.

1:38:37

That'd be pretty cool if you had your own

1:38:39

kind of like course, not yet to teach at

1:38:41

all that, you know, but you were like a

1:38:43

partner of it somehow? Yeah, I mean, I think

1:38:45

we'd love to basically teach everybody in

1:38:47

America how to best contribute to the

1:38:49

AIs, how to best, basically

1:38:51

take advantage of the fact that this is

1:38:54

going to be one of the

1:38:56

fastest growing industries, there's going to

1:38:58

be tons of opportunities, they're going

1:39:00

to be shaped a little bit

1:39:02

different from, you know, the jobs

1:39:04

that exist today, but, you know, it's

1:39:07

not going to be that hard for

1:39:09

everybody to learn and figure out how

1:39:11

to participate in it. What are some

1:39:13

jobs that could that could be at

1:39:15

risk because of AI, right? Because you

1:39:18

start thinking that, like, yeah, before I

1:39:20

was talking about there's this general fear

1:39:22

of like everything could be, right? It'll

1:39:24

be, we'll be doing like a different thing. So because

1:39:26

a lot of our fans are probably just blue-collar

1:39:28

listeners like like just people that work in

1:39:31

Like you're not gonna you're still gonna need

1:39:33

a plumber. You're still gonna need an electrician. You're

1:39:35

still gonna need anything where you have to

1:39:37

physically do something you're probably still gonna need

1:39:40

Yeah for sure and then even stuff where

1:39:42

you're like let's say you're mostly just working

1:39:44

on a laptop and you know Even for

1:39:46

those jobs, like, it'll just change. Like, instead

1:39:49

of being, instead of my job being, like,

1:39:51

hey, I have to do the work, I

1:39:53

literally do the work on a laptop, it'll

1:39:55

almost be like, everybody gets

1:39:58

promoted to being a manager. I'm

1:40:00

going to be managing like a little

1:40:02

pot of 10 AI agents that are

1:40:04

doing the work that I used to

1:40:06

do, but I need to make sure

1:40:09

that all of them are doing it

1:40:11

right and that they're not making any

1:40:13

mistakes and that, you know, if they're

1:40:16

making mistakes, I'm helping them, you know,

1:40:18

get around those mistakes. Like, like, it's

1:40:20

just going to, the way I think

1:40:22

about it is that, like, yeah, like,

1:40:25

literally, over time, everybody will just be

1:40:27

upgraded to being a manager.
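As a rough illustration of that "manager of a pot of AI agents" picture, here is a toy Python sketch: one person farms tasks out to several agents and reviews every draft before anything is accepted. The agent call and the review step are placeholder assumptions made up for the example, not any particular product.

```python
# Toy sketch of one person overseeing a small pool of AI agents.
# agent_draft() and review_ok() are placeholders for a real agent call
# and a real human review step.

from dataclasses import dataclass

@dataclass
class Draft:
    agent_id: int
    task: str
    text: str

def agent_draft(agent_id: int, task: str) -> Draft:
    """Placeholder for an AI agent producing a first draft of the task."""
    return Draft(agent_id=agent_id, task=task, text=f"draft for: {task}")

def review_ok(draft: Draft) -> bool:
    """Placeholder for the human manager checking the work for mistakes."""
    return bool(draft.text.strip())

def manage(tasks: list[str], num_agents: int = 10) -> list[Draft]:
    approved = []
    for i, task in enumerate(tasks):
        draft = agent_draft(i % num_agents, task)  # hand the task to an agent
        if review_ok(draft):                       # the manager signs off
            approved.append(draft)
        # otherwise the manager would send it back with feedback
    return approved

if __name__ == "__main__":
    print(len(manage(["draft an email", "summarize a report"])), "drafts approved")
```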

1:40:29

Yeah, because I think that what's the other thing

1:40:31

that's gonna happen is, the economy

1:40:34

is just gonna grow so much like

1:40:36

there's gonna be there's gonna be so

1:40:38

much like there will be like industries

1:40:40

are gonna pop off in crazy crazy

1:40:43

ways and so You know the limit

1:40:45

is gonna be how many AIs can

1:40:47

you have and then you're gonna be

1:40:49

limited for in terms of the number

1:40:52

of AIs you have by the number

1:40:54

of managers that you have so it's

1:40:56

gonna... It's good. Because you need air

1:40:58

traffic controllers, you need as many of

1:41:01

as you can have. Yeah, well that,

1:41:03

that definitely. But, right, but I mean

1:41:05

in any field, you're going to need

1:41:07

like just more managing, need more people

1:41:10

to oversee and make sure that these

1:41:12

that different things are happening because some

1:41:14

of the smaller tasks will just be

1:41:16

outsourced. Yeah, and just so much more

1:41:19

stuff is going to be happening, right.

1:41:21

And that's kind of right. Because yeah,

1:41:23

once these things are all kind of

1:41:26

taken care of, more

1:41:28

things can happen

1:41:30

at this second level. Yep. That's a

1:41:32

good point. You don't think about that.

1:41:35

Once some of the things at the

1:41:37

first level of a of certain businesses

1:41:39

are handled, yep, more easily by AI,

1:41:41

then you're going to be able to

1:41:44

have more people operating at a higher

1:41:46

level. Yeah, totally. It's kind of like,

1:41:48

it's kind of like always the history

1:41:50

of technology, like when, when we started

1:41:53

developing technology that, that started making farming

1:41:55

a lot more efficient, all of a

1:41:57

sudden, you know, People could do a

1:41:59

lot of other things other than farming

1:42:02

and then you know all of a

1:42:04

sudden we have big entertainment industries and

1:42:06

make financial industries and you know barbecue

1:42:08

cookoffs man I tell you that second

1:42:11

some of those guys got the weekend

1:42:13

off they was grilling shit that I

1:42:15

knew but yeah so it's all about

1:42:17

like us yeah everybody leveling up to

1:42:20

be managers and then also everybody you

1:42:22

know just way more ideas are gonna

1:42:24

start happening like way more ideas are

1:42:26

gonna start becoming a reality and so

1:42:29

It'll be, I don't know, pretty exciting

1:42:31

like I think it's just like a

1:42:33

lot more stuff is gonna happen Yeah,

1:42:36

what, what companies do you

1:42:38

see, like, are there companies where... You're

1:42:40

the youngest billionaire in the world ever?

1:42:42

No. Is that true? Is that a

1:42:45

weird statement? We can take it out

1:42:47

if it is. I'm not trying to

1:42:49

talk about your money. I mean you

1:42:51

you but you are is that true?

1:42:54

According to like some publications, but I

1:42:56

don't know as a young entrepreneur, right

1:42:58

and, you know, a self-made billionaire, and

1:43:00

we can take the word billionaire out

1:43:03

later if you decide you don't want

1:43:05

it in. I just don't know how

1:43:07

certain people feel about that and the

1:43:09

founder of Scale AI where do you

1:43:12

invest like are you investing your money

1:43:14

in certain places like certain fields that

1:43:16

you see continuing to do well? So

1:43:18

most of almost all of what I'm

1:43:21

focused on is like how do we

1:43:23

make it super successful, and make

1:43:25

sure that we You know, one of

1:43:27

the things I think is really important

1:43:30

is like, how do we make sure

1:43:32

we create as much opportunity through AI

1:43:34

as possible? Like, how do we make

1:43:36

sure it's as much jobs? How do

1:43:39

we make sure that everything that we're

1:43:41

talking about actually is what happens? Because

1:43:43

I think, no, someone's gonna have to

1:43:46

really work to make sure all these

1:43:48

jobs show up and all this stuff

1:43:50

actually happens the way we're talking about.

1:43:52

So there's gonna be new industries that

1:43:55

are gonna pop up. Like, who could have

1:43:57

really predicted that podcasting was going to

1:43:59

be this huge thing and this huge

1:44:01

cultural movement. Yes, true. But it is

1:44:04

one and it's amazing. It's like awesome

1:44:06

and that's going to happen in

1:44:08

like little ways and all sorts of

1:44:10

different industries. And that's going to be,

1:44:13

it's going to be really exciting. What

1:44:15

are some of the things that excite

1:44:17

you about technology right now? Like what

1:44:19

are, where do you see like AI

1:44:22

in technology in five years, 10 years?

1:44:24

Yeah. So some of the areas I

1:44:26

think are really exciting. So one is

1:44:28

definitely everything to do with health care

1:44:31

and biology. That's moving really, really fast.

1:44:33

And kind of as we're talking about

1:44:35

like, legitimately in our lifetimes we could

1:44:37

see cancer being cured, we could see

1:44:40

heart disease being cured, like we could

1:44:42

see some really crazy leaps and advancements

1:44:44

in that area, which is which is

1:44:47

super duper exciting. Could it create a

1:44:49

way that we could live forever, do

1:44:51

you think? There's definitely people working on

1:44:53

that. You know, there's so this is

1:44:56

getting kind of crazy and very sci-fi,

1:44:58

but Some people think that there's a

1:45:00

way for us to keep rewinding the

1:45:02

clocks on our cells so that we'll

1:45:05

always feel young and like all of

1:45:07

our cells will actually always stay young.

1:45:09

It's, I think, scientifically possible, and

1:45:11

I think if we can get there,

1:45:14

that's obviously incredible. So there's people working

1:45:16

on that. I think that's at the

1:45:18

very least I think we'll be able

1:45:20

to lengthen our lifespans pretty dramatically, and

1:45:23

maybe we could get to that. Yeah.

1:45:25

Yeah, because I always envision this, there's

1:45:27

like a time where it's like, okay,

1:45:29

this group lives forever and this group

1:45:32

doesn't. And there's just that parting of

1:45:34

two different ways, you know, people head

1:45:36

on in, just into the end zone

1:45:38

of the Lord and then there's other

1:45:41

people who are just gonna be

1:45:43

loitering around for a long time. And

1:45:45

what that would be like that cutoff,

1:45:47

you know. Yeah, it's kind of, I

1:45:50

mean, I mean, I'm not that worried

1:45:52

about it. Maybe. Maybe. Yeah,

1:45:57

maybe. Yeah, it's a weird. It's a

1:45:59

weird thought. Because it would also be

1:46:01

brave, I mean you'd be the astronaut,

1:46:03

I mean dying, you're just an astronaut

1:46:06

really into the ether, you don't know

1:46:08

what's going on, you know? You know,

1:46:10

Lewis and Clark to the Lord at

1:46:12

that point, you were out there. Yeah.

1:46:15

And then if you stay, you kind

1:46:17

of are always going to be, you'll

1:46:19

always know kind of what's going to

1:46:21

happen in a way because you'll be

1:46:24

here, which would be exciting I think,

1:46:26

but then after a while you might

1:46:28

be like... That's true. Probably. Yeah.

1:46:39

How did you see AI having any

1:46:42

effect on religion? Yeah, I think, um,

1:46:44

I think one of the things that,

1:46:46

uh, something I believe is like, I

1:46:48

think that as AI becomes, um, uh,

1:46:51

you know, this is one of the

1:46:53

things I think is really important is

1:46:55

that we are, are able to educate

1:46:57

people about AI and help people understand

1:47:00

it better and better and better, because

1:47:02

I think it's this scary thing that

1:47:04

nobody understands, or people feel like a

1:47:07

boogeyman, or, you know, feels like

1:47:09

it is just this, like, thing

1:47:11

that's going to take my job like

1:47:13

that makes it scary and I think

1:47:16

that that affects people's you know spirituality

1:47:18

that affects how people you know contextualize

1:47:20

themselves with the world yeah you could

1:47:22

lose your purpose if your purpose is

1:47:25

a job that you feel is going

1:47:27

to disappear yeah that could already be

1:47:29

causing you to feel that way. Yeah,

1:47:31

so, but I think, I think if

1:47:34

you, if we can explain AI more

1:47:36

and ultimately like, like it is, it

1:47:38

is a cool technology, but it's not

1:47:40

that magical, it's just, you know, it's

1:47:43

like data and you crunch that data

1:47:45

and then you get these algorithms.
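His "it's just data that you crunch into an algorithm" point can be shown with something tiny: below is a few lines of Python that crunch a handful of made-up (x, y) points into a simple learned rule, a least-squares line. The numbers are invented purely for illustration.

```python
# Toy illustration: data goes in, a simple "algorithm" (a fitted line) comes out.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]  # made-up (x, y) pairs

n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n

# Ordinary least squares slope and intercept.
slope = sum((x - mean_x) * (y - mean_y) for x, y in data) / sum(
    (x - mean_x) ** 2 for x, _ in data
)
intercept = mean_y - slope * mean_x

def predict(x: float) -> float:
    """The rule learned from the data."""
    return slope * x + intercept

print(round(predict(5.0), 2))  # apply the learned rule to a new input
```

The models he's talking about are this same recipe scaled up enormously: far more data and far more parameters, but still data crunched into an algorithm.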

1:47:47

And so it's not... yeah, some people talk

1:47:49

about in this crazy way. But I

1:47:52

think as long as we are able

1:47:54

to explain what AI is and also

1:47:56

explain what opportunities it creates over time,

1:47:58

and to me it's about getting this

1:48:01

like relationship between humanity and AI, right?

1:48:03

Like how do we make sure that

1:48:05

this is something that enables us as

1:48:07

humans to be more human and us

1:48:10

as humans to do more and experience

1:48:12

more and be better and all these

1:48:14

things versus something that kind of is

1:48:17

scary or will take over or anything

1:48:19

like that. I think that's really important.

1:48:21

What's a place like so if I

1:48:23

go to ChatGPT, is that Scale

1:48:26

AI is that the same thing or

1:48:28

is it different? So, we, so

1:48:30

we're actually kind of under the hood

1:48:32

powering all the different models and AIs.

1:48:35

So, are all the different AI, like,

1:48:37

yeah, systems under the hood? Yeah, yeah.

1:48:39

So we help power ChatGPT and

1:48:41

OpenAI, we help power, mostly from

1:48:44

the data perspective. And do we know

1:48:46

if the answer came from your company

1:48:48

or other companies? If we ask it

1:48:50

a question, like, how do we know? There's

1:48:53

probably no way to literally tell, but

1:48:55

yeah, we help power, you know, OpenAI

1:48:57

and we help power Google's AI

1:48:59

systems and Meta's AI systems and help

1:49:02

power all the major AI systems. Yeah.

1:49:04

How can a regular person just at

1:49:06

their home, right? Say there's a guy

1:49:08

who's been listening to this today, he

1:49:11

wants to go home today, and he

1:49:13

just wants to learn a little bit

1:49:15

of how AI works. He could just

1:49:17

go on to ChatGPT and ask

1:49:20

it a couple questions. Yeah, you could

1:49:22

ask ChatGPT how AI works. You could

1:49:24

ask it, what's the history of my

1:49:27

town, you know? Can you research my

1:49:29

last name, maybe in K, see where

1:49:31

it came from? What are, like, maybe

1:49:33

what are some innovations that could happen

1:49:36

in the next few years? There's different

1:49:38

little things you can just ask it,

1:49:40

that's how you can start to have

1:49:42

a relationship with asking and learning about

1:49:45

using it. And you're

1:49:47

seeing what it's good at, what it's

1:49:49

not good at, like, like right now

1:49:51

AI is still really bad at a

1:49:54

lot of things, like most things. This

1:49:56

is why this is why I think

1:49:58

when people understand it and really feel

1:50:00

for it, it stops being as scary.

1:50:03

Because I think we think about

1:50:05

it as like we think about like

1:50:07

the AI from the movies that are

1:50:09

sort of like you know all powerful

1:50:12

and whatnot but yeah you think of

1:50:14

it as a robot that's gonna show

1:50:16

up and just start driving your truck

1:50:18

around or something and you like what

1:50:21

the fuck do I do you know

1:50:23

what I'm saying? Yeah. in my truck,

1:50:25

you know, I can't even get in

1:50:27

my house now. Like, I think that's

1:50:30

the, there is this boogeyman fear.

1:50:32

Yeah, yeah. But that's not the truth.

1:50:34

It's not the truth. And like, yeah,

1:50:37

that's not the truth. And it's, to

1:50:39

me, it's like, we have, we kind

1:50:41

of have the choice to make sure that

1:50:43

also doesn't become the truth, right? Like,

1:50:46

we definitely, we and like people building

1:50:48

AI, but just in general, everybody in

1:50:50

the world has, like... Like we should

1:50:52

all make sure to use it as

1:50:55

something that like an assistant as something

1:50:57

that, like, helps us, versus think

1:50:59

about it in like a scary way.

1:51:01

Well getting to learn and learn how

1:51:04

to use it, small ways, whatever, is

1:51:06

certainly a way that you're going to

1:51:08

start to realize what it is. Yeah.

1:51:10

And it's easy to just sit there

1:51:13

and say it's horrible and without trying

1:51:15

to use it or learn about it.

1:51:17

Yeah, sometimes I won't learn about something

1:51:19

just so I can continue to say

1:51:22

it's a boogeyman, you know, because

1:51:24

it kind of gives me an excuse

1:51:26

not to learn about it in my

1:51:28

own life. ChatGPT's become a name, like Band-Aids or

1:51:31

ping-pong. Is that okay that it's like

1:51:33

that? Yeah, I think that's really

1:51:35

fine I mean I think basically like

1:51:38

there will probably be more AIs

1:51:40

over time that people get used to

1:51:42

and use and Like anything, you know,

1:51:44

there will always be, like, in

1:51:47

America there will always be a bunch of

1:51:49

options for consumers and many options for

1:51:51

people to use and they'll be good

1:51:53

at different things right so I think

1:51:56

like right now we're just in the

1:51:58

very early innings of AI but over

1:52:00

time we're gonna have you know just

1:52:02

like how for for anything like for

1:52:05

for clothes or for you know energy

1:52:07

drinks or for whatever like different people

1:52:09

have different tastes because there's going to

1:52:11

be different things that different AIs are

1:52:14

good at and other things that other

1:52:16

AIs are good at. Is A.I. right now smarter than humanity?

1:52:18

No. Yeah. So I think what A.I.

1:52:20

is good at: because it's ingested all

1:52:23

of the facts, right? Like, it's ingested

1:52:25

this, like, the body of water is

1:52:27

really, really big, and it's ingested so

1:52:29

many different facts and information from all

1:52:32

of humanity. It definitely knows more, or

1:52:34

like, you know, just like how Google

1:52:36

knows a lot more than any person

1:52:38

does. So it definitely knows a lot

1:52:41

more, but it's not like, you know,

1:52:43

there's a very, very, very, there's tons

1:52:45

of things that humans can do that

1:52:48

AI is just like fundamentally incapable of

1:52:50

doing. So, so it's not a, it's

1:52:52

not like a, I don't think you

1:52:54

can even like measure one versus the

1:52:57

other. There's sort of like very different

1:52:59

kinds of intelligence. Could AI just create

1:53:01

a better AI at a certain point?

1:53:03

Like could it be like, hey, I

1:53:06

create a better AI and it could

1:53:08

do that? Yeah, this is a good

1:53:10

question. Actually, this is, this is, this

1:53:12

is another really hot topic in

1:53:15

the AI industry, is, can you get

1:53:17

AIs to start doing some of the

1:53:19

engineering and some of the improvement on

1:53:21

its own? Mmm, scary. Because then it's

1:53:24

making kind of choices then. It's becoming

1:53:26

the lifeguard. It's becoming the water and

1:53:28

the Coast Guard. Yeah. So this is

1:53:30

something I mean, my personal thought is

1:53:33

I think this is something we should

1:53:35

kind of watch a little bit and

1:53:37

we should make sure that humans always

1:53:39

have the sort of like steering wheel

1:53:42

and the sort of control over

1:53:44

it, because, like you're saying, you know

1:53:46

it's a kind of a slippery slope

1:53:48

before that gets kind of a you

1:53:51

know, a little weird, but

1:53:53

um but I don't I think that

1:53:55

like we can we can maintain that

1:53:58

we can make sure that we don't

1:54:00

let the AI just sort of like

1:54:02

keep iterating and improving on its own

1:54:04

yeah and in the end you can

1:54:07

always shut off your computer and phone

1:54:09

and go for a walk huh yes

1:54:11

yeah Totally. It's not like it's going

1:54:13

to come out and just, you know,

1:54:16

slurp you off or something if you're

1:54:18

trying to be straight or whatever. You

1:54:20

know, and it's a man. I don't

1:54:22

even know. Is A.I. male or female?

1:54:25

I don't think it has a, I

1:54:27

don't think it has a gender. Wow.

1:54:29

I wonder if it'll decide one day.

1:54:31

You're like, hey, I'm Frank, you know.

1:54:34

Well, there's, there are companies that try

1:54:36

to program the AI to adopt various

1:54:38

personas and. Oh, yeah, I got the

1:54:40

Indian GPS guy, turn right? Like, on

1:54:43

meta, on like Instagram or whatever, you

1:54:45

can get a, you can get an

1:54:47

AI that has Awkwafina's voice, for example.

1:54:49

Oh, that's cool. Yeah. And it's funny.

1:54:52

You know, the Awkwafina AI is

1:54:54

funny. You're a self-made billionaire, which is

1:54:56

pretty fascinating. Congratulations, I think, you know,

1:54:58

to have a, I think that money

1:55:01

is energy kind of, and it kind

1:55:03

of flows to certain places and stuff,

1:55:05

and congratulations, that's got to be fascinating.

1:55:08

Was that scary when that kind of

1:55:10

happened? You just made some money? Was

1:55:12

that kind of a scary thing? Did

1:55:14

you guys grow up really well? Like

1:55:17

what was that like? No, no, no.

1:55:19

I grew up, I think like solidly

1:55:21

middle class. Like we weren't, we, you

1:55:23

know, we weren't, we weren't like...

1:55:28

You know, we're trapping or anything. Yeah,

1:55:30

yeah, yeah, but like, like, we're just

1:55:32

sort of like, building this thing up

1:55:35

over time, but one of the things

1:55:37

that that happened is AI all of

1:55:39

a sudden became like there was so

1:55:41

much progress in AI and it became

1:55:44

the biggest thing in the world, right?

1:55:46

Like, I mean, all of a sudden,

1:55:48

you know, anywhere I go, I'm, everybody

1:55:50

is talking about AI. It didn't used to

1:55:53

be like that when I started the

1:55:55

company. AI was just kind of a,

1:55:57

like, a niche topic, and now it's

1:55:59

like, you know, anywhere you go. Like

1:56:02

I'll just be walking around and I

1:56:04

hear like random conversations about for sure

1:56:06

at GPT and AI and robots and

1:56:08

all this stuff. And so it's

1:56:11

kind of been crazy to experience

1:56:13

that and be a part of that

1:56:15

wave and kind of like you know

1:56:18

I started working on this company almost

1:56:20

nine years ago, so it was like

1:56:22

when I started working on this obscure

1:56:24

thing and you know I always knew

1:56:27

that it was going to become bigger,

1:56:29

but I could have never predicted what

1:56:31

was going to happen to AI. Yeah,

1:56:33

it's fascinating. It's almost like you were

1:56:36

just standing on like the bingo number

1:56:38

that got called, kind of, like, by

1:56:40

time. Yeah, and it's surreal. Oh, I

1:56:42

can only imagine that. Are your parents

1:56:45

pretty proud of you? What's that like?

1:56:47

Yeah, so at first they were like,

1:56:49

you know, at first I dropped out

1:56:51

of college, right? Oh yeah, that's true,

1:56:54

yeah. And in Asian culture, that's like

1:56:56

not a thing you do, right? Good,

1:56:58

yeah. Yeah, that is like the opposite

1:57:00

of being Asian. Well, well my parents

1:57:03

both have PhDs, my two, I have

1:57:05

two older brothers, they both have PhDs,

1:57:07

like everybody in my family has, they've

1:57:09

gone through all of schooling, like, Alex

1:57:12

is not doing good. Yeah, so they

1:57:14

were pretty worried at first. And I

1:57:16

kind of, I told them a little

1:57:18

bit of a white lie that I

1:57:21

was like, oh no, I'm just going,

1:57:23

I'm gonna go back, you know, I'm

1:57:25

gonna finish, I'm gonna get this, like

1:57:28

tech thing out of my system and

1:57:30

then I'll go back and I'll finish school.

1:57:32

Obviously that hasn't happened yet, but yeah,

1:57:34

they were worried at first. Now they're

1:57:37

super proud. Yeah, they're super duper proud.

1:57:39

And I, yeah, and I owe everything

1:57:41

to my parents, to my parents, you

1:57:43

know. Yeah, they're awesome. And like, seriously,

1:57:46

they're, they're, my parents are super brainy.

1:57:48

They're, they're, they're both physicists. They would,

1:57:50

like, teach me about physics growing up

1:57:52

and teach me about math. And that's

1:57:55

what let me, uh, be so good

1:57:57

at the competitions. You know? That's cool.

1:57:59

Yeah, I don't even, I think, do,

1:58:01

are your parents, are your grandparents from

1:58:04

China or no? My grandparents are from

1:58:06

China, yeah. Did your, does your family

1:58:08

have a lot of Chinese culture? So

1:58:10

this is kind of interesting. This is

1:58:13

true for a lot of Chinese Americans

1:58:15

is that there's kind of, like... there's

1:58:17

a Chinese culture and then that's kind

1:58:19

of almost like it's very different from

1:58:22

the Chinese Communist Party and the current

1:58:24

government because basically one way to think

1:58:26

about China is, I mean, China

1:58:29

is a culture and a civilization

1:58:31

that's been around for like thousands and

1:58:33

thousands of years, right? So

1:58:35

there's a very long-standing culture oh yeah

1:58:38

but that's very different from the current

1:58:40

communist party in the current communist regime?

1:58:42

Oh for sure I think most people

1:58:44

probably think of I mean I don't

1:58:47

know it's funny I never thought about

1:58:49

that I definitely think about them as

1:58:51

different I definitely don't think if I

1:58:53

see a Chinese person I don't think

1:58:56

oh that's a communist person yeah exactly

1:58:58

I mean that would be yeah that's

1:59:00

crazy I think that's crazy if some

1:59:02

people thought that. I just, yeah, I

1:59:05

think that's a crazy long history

1:59:07

like damn and then maybe almost like

1:59:09

there's some people in certain parts of China,

1:59:11

almost like a captured people then? Is

1:59:14

it do you think that's a Yeah,

1:59:16

I mean, the Uyghurs that we're talking

1:59:18

about, I mean, that's just like horrible

1:59:20

that's happening to them. I didn't even

1:59:23

know about that, man. Thanks for putting

1:59:25

me up on that Uyghur game. Yeah,

1:59:27

yeah, no, it's, I mean, some of

1:59:29

the stuff, like, I mean, we don't

1:59:32

really, like, first of all, the world

1:59:34

is, like... we want to make sure we have governments that believe in democracy, believe

1:59:43

in liberty, these kinds of things. Yeah.

1:59:45

With being somebody that's able to be

1:59:48

smart and conceptualize stuff, do you start

1:59:50

to get, do you have any insight,

1:59:52

this might be the last question I

1:59:54

have for you, do you have any

1:59:57

insight on like the afterlife or what

1:59:59

happens? Like, I never really thought about

2:00:01

it, like, talking to, like, a real

2:00:03

math guy about that. Yeah, I think,

2:00:06

like what's the total, like what's the

2:00:08

zero-sum game or whatever, you know,

2:00:10

what's the, what's the, yeah. Yeah, yeah,

2:00:12

yeah, yeah. Yeah, I guess like the

2:00:15

way I've always thought about it because

2:00:17

partially because both of my parents were

2:00:19

physicists, was I kind of always feel

2:00:21

like people live on with their ideas

2:00:24

and they're kind of like, like, what

2:00:26

are the things that the things they

2:00:28

put out into the world, that's kind

2:00:30

of how you live on. Because like

2:00:33

in math, like everything you learn about

2:00:35

is like a theorem named after a

2:00:37

different person or a sort of idea

2:00:39

named after someone, the first mathematician

2:00:42

or the first scientist or whatever to

2:00:44

figure that out. Like Einstein lives on because

2:00:46

bagels... That was a shitty joke, but

2:00:49

thank you Oh, yeah, because yeah, because

2:00:51

the, yeah, all E equals MC squared,

2:00:53

all that kind of stuff. Yeah, so

2:00:55

I know what that kind of

2:00:58

stuff is Jesus. Sorry, but I started

2:01:00

pretending. No, no, no, exactly, E equals MC squared.

2:01:02

That's Einstein. And so I think

2:01:04

I always feel like people Yeah, you

2:01:07

live on by your ideas and and

2:01:09

have what you put out into the

2:01:11

world. And that's kind of always how

2:01:13

I've thought about it. So you don't

2:01:16

get some deeper thought about like, like

2:01:18

since your brain is able to be,

2:01:20

because you have probably a unique brain,

2:01:22

right? And more, I mean, you know,

2:01:25

and everybody has a unique brain, but

2:01:27

I've just never asked somebody with your,

2:01:29

I've never asked your brain this question.

2:01:31

Yeah, do you get some further insight

2:01:34

about like what you think happens when

2:01:36

you die, you know? What gets talked about

2:01:38

a lot in Silicon Valley, where I

2:01:40

live especially, is like the simulation, whether

2:01:43

it's all a simulation, and whether like,

2:01:45

do you watch Rick and Morty? I

2:01:47

don't watch it, but I'll start. There's

2:01:49

a, there's some, there's some episodes where

2:01:52

it gets into this, and I think

2:01:54

it covers it in a pretty good

2:01:56

way, but like, you know, what if

2:01:59

everything, all of humanity, is

2:02:01

just like a, like almost like

2:02:03

an experiment or a video game that

2:02:05

some other civilization is running? Yeah. That's

2:02:08

kind of the one that, uh, that,

2:02:10

fucks with, particularly, like, people's minds in tech

2:02:12

a lot because we're like every day

2:02:14

all day every day we're out there

2:02:17

trying to make computers and make simulations

2:02:19

and make things that are that are

2:02:21

like more more sophisticated advanced and capable

2:02:23

and so kind of the mind goes,

2:02:26

like oh what if everything we know

2:02:28

is is just kind of the you

2:02:30

know a simulation from some other civilization

2:02:32

or if we advanced it enough that

2:02:35

we're able to make this happen and

2:02:37

seem real. Yeah, exactly. Yeah, exactly. So,

2:02:39

you know, I think, I think one

2:02:41

of the things that, like with AI

2:02:44

and with a bunch of other things,

2:02:46

like in, well, even just in the

2:02:48

past like 30, 40 years, video games

2:02:50

have gotten so much more realistic, right?

2:02:53

Crazy realistic. Yeah. So we've seen that

2:02:55

happen like in a hundred years, like,

2:02:57

would we be able to simulate something

2:02:59

that feels pretty realistic? I mean, probably.

2:03:02

Right, could we be avatars? Yeah. For

2:03:04

some but for some other thing. Yeah,

2:03:06

exactly. Damn dude. I'll say this, my

2:03:09

avatar's a pervert, brother. Have you met

2:03:11

any guys like, uh, Jensen Huang? Is

2:03:13

that his name? Yeah, yeah. Yeah, yeah,

2:03:15

yeah, totally. Wow, what is it like

2:03:18

when you meet some of these guys?

2:03:20

Have you met Elon yet? I met Elon a

2:03:22

few years ago, yeah. Wow, got to

2:03:24

know him. He's the fucking, he's the,

2:03:27

he's... Jensen's the goat. Yeah, he

2:03:29

is. He is, yeah, he always wears

2:03:31

this leather, you see that leather jacket.

2:03:33

Yeah. Yeah, we hosted this dinner. Huang,

2:03:36

is that a Chinese name, Huang? Huang,

2:03:38

yeah, he's Taiwanese, I think, or Taiwanese

2:03:40

American, but he, he, in, in 2018,

2:03:42

so when we were like, like, a

2:03:45

baby company, we threw this dinner in,

2:03:47

in, in Silicon Valley. And we just

2:03:49

kind of, I kind of YOLO invited

2:03:51

him. This was years and years ago.

2:03:54

And I didn't expect it, but he

2:03:56

said yes. And he came to our

2:03:58

dinner and he. He came, it was

2:04:00

like at this restaurant in Silicon

2:04:03

Valley and he came and just

2:04:05

told the craziest stories. He, he like

2:04:07

went to, he went to boarding school, I

2:04:09

think he's probably told the

2:04:11

story, but like he went to boarding

2:04:14

school and his parents, when they came

2:04:16

to America, they wanted to send him

2:04:18

to boarding school, but they didn't really know the schools.

2:04:20

And so they just sent him to

2:04:22

like the first boarding school that they

2:04:24

like found on Google or something. Well, it wasn't even Google at the time. The one that they like

2:04:31

heard about and that boarding school

2:04:33

happened to be kind of like a

2:04:36

rehab boarding school. So it was

2:04:38

like He was like this so

2:04:40

fucking. It's a halfway house. He's

2:04:42

just there. He's there learning with

2:04:44

people who are detoxing. Yeah, so

2:04:46

he told me the story about

2:04:48

he was like he was just

2:04:50

like this kid at, you know, this

2:04:52

boarding school where, like,

2:04:54

everybody else had these like

2:04:57

all these crazy backgrounds and

2:04:59

he like he got by and made

2:05:01

made his way through that school by

2:05:03

like by like doing everyone's math homework

2:05:05

and like you know kind of like

2:05:07

wheeling and dealing that way. Oh yeah.

2:05:10

And you could see that he he

2:05:12

learned how to like wheel and deal

2:05:14

and sell and all this stuff from

2:05:16

all the way from way back then

2:05:19

because he was he was I mean

2:05:21

his story is pretty

2:05:23

crazy. Really. Yeah. Jensen's awesome. Yeah, all

2:05:25

these people in tech, they're like, uh,

2:05:27

I mean, they're all real people, but they all have the

2:05:29

craziest backgrounds. Yeah. Yeah, man, it's just so funny. Whenever

2:05:31

I met you, I just didn't know, I figured that since you

2:05:34

were with Sam Altman, that you were probably a tech guy, you

2:05:36

know, and, um, but yeah, I didn't know. I think maybe somebody

2:05:38

said he's in the AI verse, you know, but you just

2:05:40

seemed like such a like such a totally normal, like, totally

2:05:42

normal... You were just

2:05:55

I don't know, I guess you think, like,

2:05:57

somebody's gonna be they're gonna be like super quiet

2:05:59

or You know not have a lot of

2:06:01

different thoughts but yeah it was cool

2:06:03

man we had a good time I'm

2:06:05

glad we got to yeah link up

2:06:07

yeah totally that was cool bro you're

2:06:10

probably like my, you might even be my

2:06:12

first Chinese friend. I think second, probably

2:06:14

Bobby Lee who's denying it but he'll

2:06:16

come around um Alexander Wang man thank

2:06:18

you so much dude I bet your

2:06:20

whole family's super proud of you um

2:06:22

yeah it's exciting man thank you for

2:06:25

coming to spend time with us

2:06:27

and just helping us learn and thank

2:06:29

No, thank you. This was awesome. And

2:06:31

I think like, I mean, we were

2:06:33

talking about this before, but I want

2:06:35

to make sure that like people all

2:06:37

around the world, especially Americans, aren't scared

2:06:39

of AI, because it's going to be,

2:06:42

it's going to be really cool and

2:06:44

it's going to be amazing, but we

2:06:46

need to remove the boogeyman component.

2:06:48

And thanks for helping me do that.

2:06:50

Yeah, no man, I think I definitely

2:06:52

feel differently about it. I feel like

2:06:54

it's a tool that I can use,

2:06:57

right? And even I don't know how

2:06:59

to use it, so I'm trying to

2:07:01

figure out, you know. One more question,

2:07:03

how do you keep it from becoming

2:07:05

like an advertisement trap house, like the

2:07:07

internet's become, like the internet's just pop-ups

2:07:09

and ads and fucking Best Buy trying

2:07:12

to like beat you over the head

2:07:14

on some shit, like how do you

2:07:16

stop that, out of, like, you guys

2:07:18

as the water? Or do you guys as

2:07:20

the water, or do you have to go

2:07:22

there at some point

2:07:24

to make money? Yeah,

2:07:27

first I'm hoping that

2:07:29

like the AI industry as a whole

2:07:31

avoids advertising as much as possible because

2:07:33

Because it's kind of it's very different

2:07:35

like it is a tool that that

2:07:37

people can use to start businesses or

2:07:39

make movies or make all these like

2:07:41

different ideas happen and I would much

2:07:44

rather it be a tool that doesn't

2:07:46

get sort of... I want to make

2:07:48

sure it's a tool that helps people

2:07:50

first and foremost. So that's that I

2:07:52

think this is there's kind of like

2:07:54

a choice here and and we as

2:07:56

an AI industry just got to I

2:07:59

think make some of the right choices.

2:08:01

Yeah, I think there would be value

2:08:03

in staying as pure as you could,

2:08:05

if you could find a way to,

2:08:07

you know, if there's other money to

2:08:09

be made on the side, it almost

2:08:11

seems sometimes like it could be a

2:08:14

trap, you know. Yeah, yeah. And I

2:08:16

think that's like, you know, I want

2:08:18

to make sure that people don't feel

2:08:20

like they're being used by the AIs.

2:08:22

I think that'd be really bad if

2:08:24

we ended up there. So we, you

2:08:26

know, and I don't think we need

2:08:29

to make it like that at all.

2:08:31

Like I think we can we can

2:08:33

make sure the AI is helping you

2:08:35

do things, is super helpful, is

2:08:37

like a thought partner, is

2:08:39

like an assistant. Like those

2:08:41

are things that I think we want

2:08:43

to make sure AI stays. Gang gang, baby,

2:08:46

Wang Gang. Alexander Wang, man. Thanks so

2:08:48

much for having me. Yeah. It was

2:08:50

awesome man. Danny Kane shout out to

2:08:52

Danny who came up and Daniel is

2:08:54

in Franklin, right? We got to get

2:08:56

to spend some time with him. Yeah,

2:08:58

lives in Franklin. Whenever you're out there,

2:09:01

we're out there. And shout out to

2:09:03

Alex Bruceowitz, who we met through. Yeah.

2:09:05

Who else are your teams here, man

2:09:07

today? We have Joe Osborne. Yeah. And

2:09:09

we have Danny's whole crew, so Arian

2:09:11

and Clayton. Yeah. Nice. We'll have to

2:09:13

get a group picture. We'll put it

2:09:16

up at the end. Thank you guys.

2:09:18

Have a go, have a go. I

2:09:20

must be cornerstone. Oh, but when I

2:09:22

reach that ground, I'll share this piece

2:09:24

of mine, I found I can.
