Episode Transcript
0:00
I've been working with a Nourish
0:02
dietitian for the last six months
0:04
and it's been life-changing. I've lost
0:07
weight, healed my relationship with food
0:09
and have way more energy. Working
0:11
with a dietitian online to create
0:14
a personalized nutrition plan was so
0:16
easy thanks to nourish. The best
0:18
part, I pay $0 out of
0:21
pocket because nourish accepts hundreds of
0:23
insurance plans. 94% of patients pay
0:25
$0 out of pocket. Find your
0:28
dietitian at usenourish.com. Where'd
0:30
you get those shoes? Easy. They're
0:32
from DSW because DSW has the
0:34
exact right shoes for whatever you're
0:36
into right now. You know, like
0:38
the sneakers that make office hours
0:41
feel like happy hour. The boots,
0:43
that turn grocery aisles into runways,
0:45
and all the styles that show
0:47
off the many sides of you,
0:49
from daydreamer to multitasker and everything
0:51
in between, because you do it
0:53
all, in really great shoes. Find
0:55
a shoe for every you at
0:57
your DSW store or DSW.com. DeepSeek
0:59
is distilling off of OpenAI,
1:02
OpenAI is rich enough
1:04
to complain, to say, oh
1:06
well, you know, that's not
1:09
good even though they've trained
1:11
their models on everything that's
1:13
out there on the internet
1:16
without permission, but it's a
1:18
real moment. It's like a
1:20
shift or it's an indicator
1:23
of something. One source... actually
1:25
told me like a tech
1:27
person was like, look, this
1:30
is exciting in a way
1:32
because it lowers the cost
1:35
of the AI that's
1:37
on tap for like
1:39
founders and people who
1:41
want to develop
1:43
new products and
1:46
new services and
1:48
things like that. And
1:50
so it's an
1:52
opportunity. Senior Business
1:54
and Technology correspondent
1:56
for CBS News, Jo Ling Kent,
1:58
welcome back, Jo Ling. Hi Andy,
2:00
how are you? I'm doing great.
2:02
It's true. You did know
2:04
me when I was Andy
2:07
because we've known each other
2:09
for a long time. No,
2:11
no, it's all really good.
2:13
People will appreciate that. So
2:15
I wanted to have you
2:17
on because I've gotten a
2:19
whole slew of questions about
2:21
DeepSeek, AI, markets,
2:24
tech, big tech, etc.
2:26
etc. And I thought who
2:28
could I bring on that is a
2:30
deep head in this stuff, has spent
2:32
time in Asia for years and years
2:34
and looks at this stuff professionally. And
2:36
I thought of you, so thank you
2:38
for coming on. I'm sure you got
2:40
some of the same stuff. Did anyone
2:42
ping you being like, hey, what the
2:44
heck is going on? Yeah, I mean,
2:46
people are reaching out and I'm hearing
2:48
from sources and stuff like that, but
2:50
I'm also on parental leave right now.
2:53
I had a baby about three months
2:55
ago, as you know, and it's just,
2:57
it's been crazy. So I've been like
2:59
catching up. And it's interesting when you
3:01
are used to covering the news day
3:03
to day and then you kind of
3:06
consume it week to week. And people
3:08
have like a thousand different opinions in
3:10
the span of like seven days. And
3:12
so it's been fun to like track
3:14
people and be like, oh, you thought
3:16
this like five days ago and now
3:19
you think this? And now you think
3:21
this? Kind of surprising. Well it's one
3:23
reason why I wanted to wait a
3:25
beat because there was like a flurry
3:27
of stuff that came out and then
3:29
there were some more stuff and then
3:31
DeepSeek then turned itself off
3:33
to anyone who didn't have a
3:35
Chinese cell phone number and then a
3:38
bunch of countries then said hey no
3:40
to DeepSeek and I think there
3:42
might be a bill in Congress saying no
3:44
to DeepSeek for... government workers
3:46
and maybe even more than that. So
3:48
can you help describe, well actually let's
3:50
give some, I'll do, I'll set the
3:52
table for people who have no idea
3:54
what I'm talking about and then you
3:56
can take it from there. Thank you.
3:58
Oh no no problem. So Deep Seek
4:00
is a Chinese AI product. Well,
4:03
actually, it's a company. And then
4:05
they released a model
4:07
that is something of like a
4:09
Chinese version of ChatGPT called
4:11
R1. And it works very, very
4:13
well. Some people said it works
4:15
better than OpenAI's latest models.
4:17
But the wildest revelations were that
4:19
it cost a very small fraction
4:21
of what OpenAI spent, in
4:23
China. Allegedly. Allegedly, but then what's
4:25
wild, Jo Ling, and we can talk
4:27
about this, there aren't people, I
4:29
mean, everyone knows, like, take it
4:32
with a grain of salt, but
4:34
also people aren't disputing that that
4:36
heavily. They're not like, oh, you
4:38
know, this is nonsense or full
4:40
of shit. Like people are kind
4:42
of taking that on its face.
4:44
And then the other thing is
4:46
that it's open source. And so
4:48
those two things, and by the
4:50
way, the lower cost, it means
4:52
less energy, less computing power, presumably
4:54
less cost. So this came out
4:56
and then people started playing with
4:58
it and kicked the tires on
5:00
it. And then it had this
5:03
massive, gosh, maybe I think it
5:05
produced the biggest stock loss and
5:07
value loss for one company in
5:09
history, where Nvidia plummeted. A double
5:11
digit percentage, it was like 12, 15
5:13
percent, something like that, which was
5:15
very, very significant because going into
5:17
that, it was worth, you know,
5:19
several trillion, 3.6 trillion. It has
5:21
been the darling of the market
5:23
for a long time. Yeah, so,
5:25
so this new development caused
5:27
everyone to double back and say,
5:29
wait a minute, Nvidia is
5:32
selling these chips for a lot
5:34
of money because they have the
5:36
stranglehold on the... processing chips that
5:38
are high-powered enough to do all
5:40
of the computing so every major
5:42
company that wanted to develop AI
5:44
was going to Nvidia and paying
5:46
really like, you know, through the
5:48
nose, like whatever Nvidia wanted, which
5:50
led Nvidia's stock price to go
5:52
up and up and up, their
5:54
chips were selling for tens of
5:56
thousands of dollars each, even, and
5:58
the major tech companies were spending
6:01
a ton, and then DeepSeek
6:03
came out and said, hey, we
6:05
kind of produced something that's just
6:07
as good, for only $6 million,
6:09
a small fraction of the
6:11
price. And then the market had
6:13
this giant like, wait, what the
6:15
heck is happening? And there was
6:17
a bit of a freak out.
6:19
Right. And so basically it like
6:21
caused all of these stocks to
6:23
go down. And the question was
6:25
like, you know, for a lot
6:27
of people, you know, what is
6:29
the relationship with China? And like
6:32
what if you're using it from
6:34
like a personal standpoint, right? Like
6:36
how does it? work and like
6:38
what are you handing over at
6:40
that point you know to the
6:42
Chinese government and it's like there
6:44
was a lot of people out
6:46
there you know waiving this flag
6:48
saying like look this is like
6:50
surveillance and this is bad you're
6:52
basically like there was you know
6:54
discussion of whether or not you
6:56
know the Chinese government is collecting
6:58
data on people and if you
7:01
look at the terms I mean
7:03
it collects a lot of data
7:05
deep seek does collect a lot
7:07
of data like a lot of
7:09
data like a lot of the
7:11
other tech companies here in the
7:13
United States as well right like
7:15
they have access to other apps
7:17
and things like that but it's
7:19
like a the big question from
7:21
the consumer standpoint was like well
7:23
is there actually some fire there
7:25
and so It was interesting to
7:27
see people like see the market
7:30
freak out in this way, but
7:32
you know, one source actually told
7:34
me like a tech person was
7:36
like, look, this is exciting in
7:38
a way because it lowers the
7:40
cost of the AI that's on
7:42
tap for like founders and people
7:44
who want to develop new products
7:46
and new services and things like
7:48
that. And so it's an opportunity.
7:50
And thinking about like that market
7:52
tumble and what happened to the
7:54
stock market, I was. talking, I
7:56
was reading Dan Ives from Wedbush,
7:58
his, you know, analysis of these
8:01
tech companies and he's been on
8:03
all of these companies for so
8:05
long and he loves to use
8:07
those sports metaphors
8:09
and all that. And he's like,
8:11
he still says like there's still
8:13
one chip company in the game,
8:15
right? Like, it's one company that's
8:17
actually launching things beyond like a
8:19
chat bot and like having one
8:21
LLM, like large language model, you
8:23
know, chatbot for consumer
8:25
use is like, that's one thing.
8:27
It's very important and very lucrative.
8:30
But it's another thing to think
8:32
about like all the other stuff
8:34
that comes with Nvidia. So I
8:36
don't know. It,
8:38
it really was like a wake-up
8:40
call moment, you know, in a
8:42
lot of ways, maybe people think
8:44
like, but it depends on where
8:46
you were coming from on it,
8:48
right, like it, as always. Well,
8:50
certainly if you're the scrappy entrepreneur
8:52
type, you might be like, ooh,
8:54
this is going to be good
8:56
because, you know, the barriers are
8:59
a little bit lower and maybe
9:01
we can use some tools. You
9:03
know, I talked also to some
9:05
investment professionals who were looking at
9:07
this and have covered the tech
9:09
companies for a while. And that
9:11
they think that these tech companies
9:13
have spent billions of dollars on.
9:15
Billions, yeah. Yeah, billions, I mean,
9:17
you know, like tens of billions.
9:19
On chips and processing power and
9:21
infrastructure to build these large language
9:23
models in large part, because they
9:25
wanted it to be this defensible
9:27
moat, and their concern was that
9:30
deep seek showed that the moat
9:32
actually is very, very crossable. Right,
9:34
you can pop right over. Yeah,
9:36
you can hop right over. And
9:38
one person explained it to me
9:40
as that, so what OpenAI
9:42
did is OpenAI took all
9:44
of these language inputs and then
9:46
did all, like, a, you
9:48
know, gazillion connections and calculations
9:50
so that if you ask it
9:52
a question or you know reference
9:54
certain data then like a certain
9:56
output would come out, and DeepSeek
9:59
essentially short-circuited all of the, like,
10:01
the guts in the middle and
10:03
just figured out okay what's the
10:05
input, what's the output, and did
10:07
that. And distillation, right? Yeah, distillation.
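[Editor's note: to make the "inputs and outputs" idea concrete, here is a minimal, hypothetical sketch of distillation in PyTorch: a small "student" model is trained to match the outputs of a larger "teacher" model purely by querying it, with no access to the teacher's internals. The toy models, sizes, and random data below are illustrative assumptions only, not DeepSeek's or OpenAI's actual method.]

```python
# Hypothetical illustration of distillation -- not DeepSeek's or
# OpenAI's actual code. The student only ever sees the teacher's
# input -> output behavior, never its internals.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
VOCAB = 100  # toy vocabulary size (assumed for the sketch)

class TinyLM(nn.Module):
    """A toy next-token model standing in for a real LLM."""
    def __init__(self, hidden: int):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, hidden)
        self.head = nn.Linear(hidden, VOCAB)

    def forward(self, tokens):                # tokens: (batch, seq)
        return self.head(self.embed(tokens))  # logits: (batch, seq, vocab)

teacher = TinyLM(hidden=256)  # the big, expensive model ("the guts in the middle")
student = TinyLM(hidden=32)   # the small, cheap imitator
teacher.eval()

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
for step in range(200):
    prompts = torch.randint(0, VOCAB, (16, 8))  # stand-in "questions"
    with torch.no_grad():
        t_logits = teacher(prompts)  # query the teacher: input in, answer out
    s_logits = student(prompts)
    # Train the student's output distribution to match the teacher's.
    loss = F.kl_div(F.log_softmax(s_logits, dim=-1),
                    F.softmax(t_logits, dim=-1),
                    reduction="batchmean")
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final imitation loss: {loss.item():.4f}")
```

The point of the sketch is the shape of the loop: the expensive model is used only as a black box, which is why the speakers describe the "guts in the middle" as short-circuited.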
10:09
So the argument that I
10:11
find compelling and I'm very very
10:13
eager to hear what you think
10:15
was okay. So you spend $100
10:17
billion or whatever it is on
10:19
this moat. And then some clever
10:21
Chinese company says, hey, I don't
10:23
know what's happening inside the moat,
10:25
but it turns out maybe it's
10:28
irrelevant because I can figure out
10:30
the inputs and outputs and distill
10:32
what's happening. And there is evidence,
10:34
by the way, that DeepSeek
10:36
is very much based on
10:38
ChatGPT. There was a joke I
10:40
used to tell where people would
10:42
ask like how far behind is
10:44
China from the US in AI?
10:46
And the joke answer is 12
10:48
hours because they just wake up
10:50
and see what we did the
10:52
previous day. It's interesting because I
10:54
was at this fortune innovation conference
10:56
in Hong Kong last year, about
10:59
a year ago, and Kai-Fu Lee,
11:01
he was the head of Google China,
11:03
and now he's like a big
11:05
AI investor and leader in China,
11:07
had a really interesting perspective because
11:09
for a long time he was
11:11
like, well, it's probably going to
11:13
be like two coexisting ecosystems like
11:15
parallel to each other, developing through
11:17
different audiences. And now with DeepSeek,
11:19
that's the first thing I
11:21
thought of was like, well, I
11:23
guess that doesn't hold, because like
11:25
now everything is like, you know,
11:28
if DeepSeek is distilling off
11:30
of OpenAI, OpenAI is
11:32
rich enough to complain, to say,
11:34
oh, well, you know, that's not
11:36
good, even though they've trained their
11:38
models on everything that's out there
11:40
on the internet without permission. But
11:42
it's a real moment. It's like
11:44
a shift, or it's an indicator
11:46
of something. Have you made a
11:48
New Year's resolution that's related to
11:50
eating healthier, being a little bit
11:52
more fit, losing a little bit
11:54
of weight? Well, have I got
11:57
great news for you. Factor Meals
11:59
can help you actually achieve that
12:01
goal. Imagine fully prepared meals sent
12:03
to you, fresh. I'm good at
12:05
a lot of things, I will
12:07
confess, but I'm really crappy at
12:09
cooking. And Factor makes it so
12:11
that you can have a completely
12:13
delicious, awesome meal in the comfort
12:15
of your own home that's healthy,
12:17
it's good for you. You know
12:19
when you eat something and you
12:21
know it's healthy, that's the experience
12:23
with Factor. When I ordered it,
12:25
the Nonna's Bolognese, the mustard chicken,
12:28
they hit the spot, they're quick,
12:30
they're convenient, they get the job
12:32
done, and then some. Eat smart
12:34
with Factor. Get started at factormeals.com/factorpodcast
12:36
and use code FACTORPODCAST
12:38
to get 50% off your first
12:40
box plus free shipping. That's code
12:42
FACTORPODCAST at factormeals.com/factorpodcast
12:44
to get 50% off. Plus free
12:46
shipping on your first box. Now
12:48
I want to pivot to something
12:50
very personal and small, but it
12:52
might be important. So you just
12:54
said, hey, look, OpenAI trained
12:57
itself on publicly available data, the
12:59
internet. No one really got paid
13:01
for that. You know, the New
13:03
York Times is suing. Like there's
13:05
all this stuff happening. Yeah, yeah,
13:07
people are suing. So I got
13:09
a note the other day from
13:11
my publisher, which was fascinating, which
13:13
said an unnamed AI company wants
13:15
to train on your book. Oh,
13:17
and they are offering us not
13:19
a lot of money. Honestly, it
13:21
was like, you know, a few
13:23
thousand dollars for it to be
13:26
able to train on your book
13:28
and we'll split the money with
13:30
you 50-50. And so I looked
13:32
at this and was like, okay,
13:34
not a lot of money, but
13:36
it's, you know, not zero. And
13:38
on one level, I was kind
13:40
of impressed that the AI company
13:42
was paying the publisher, you know,
13:44
that's actually fairly important, versus
13:46
just ripping it off. Yeah,
13:48
yeah, because I mean, obviously, my
13:50
book's out there and like, they're
13:52
electronic copies, so you could just
13:54
feed it to the model, call it
13:57
a day. So the fact that
13:59
they actually went through the proper
14:01
channels. And by the way as
14:03
an author when I write a
14:05
book you know there are various
14:07
rights that you say it's like
14:09
hey, audiobook, you know, blah
14:11
blah blah. Yeah, and international.
14:13
But there was nowhere in
14:15
there that said hey and if
14:17
we feed it to AI like you
14:19
know maybe you get something. So
14:21
now they're there. I mean,
14:25
I said sure, you know,
14:26
because again, on one hand, I
14:28
was like kind of impressed that
14:30
they decided to pay my publisher
14:32
actual money. Yeah. I thought
14:34
that, you know, in my case,
14:36
I would take the money and
14:38
donate it to, you know, a
14:40
democracy org. So I
14:42
was like, so I won't feel
14:44
bad somehow, you know. I
14:48
mean, though, you
14:50
know, I'm sure, you know, like
14:50
there are any number of authors
14:52
who are getting similar notes right
14:55
now. So, like, you know, you
14:57
could be a little bit happy
14:59
for those authors getting a little
15:01
bit of extra money that they
15:03
weren't expecting, though I'm sure they
15:05
all also had the sinking feeling
15:07
in their guts the way I
15:09
did, which is like, oh, I
15:11
guess, you know, that's what's happening
15:13
to my work. It's just going
15:15
to become one, you know, scintilla
15:17
of this AI model. And, you know,
15:19
that also was a reason why
15:21
I was, I was like, sure,
15:23
because I kind of want my
15:26
book to be in there, my ideas
15:28
to be in there, so the
15:30
model might answer,
15:32
you know, with my stuff,
15:34
you know. That actually makes me happy.
15:36
So you have OpenAI training
15:38
on all the publicly available information
15:40
and maybe even some copyrighted information
15:42
that some others may or may
15:44
not have access to. And then
15:46
you have DeepSeek and the
15:48
Chinese, who, in my view, from
15:50
what I can tell from the
15:52
facts, have built this sort of
15:55
efficiency layer on top of it
15:57
where it's like, okay, I don't
15:59
know what's going on in all
16:01
the guts of it, but I
16:03
can tell if I ask it
16:05
this, it says that, and then
16:07
I can make a version of
16:09
it that is very, very highly
16:11
efficient and usable and cheaper, and
16:13
I can make this available at
16:15
very low cost, I mean, you
16:17
know, maybe next to nothing. And
16:19
one of the things that's been
16:21
unclear to me this whole time
16:24
has been, let's say that I'm
16:26
OpenAI and Microsoft and you
16:28
spend, you know, 80, 100 billion
16:30
dollars on large language models or
16:32
other things. Like right now, that's
16:34
a much, much higher sum than
16:36
any revenue that's being realized
16:38
from these models at present. Like,
16:40
are there revenues? Yes, some. Yeah,
16:42
some subscription revenues. Yeah, yeah. But
16:44
like, have the revenues
16:46
caught up to the investment? Not
16:48
yet. Right. And then the argument
16:50
is like, oh, but it will,
16:52
it will, it will, you know,
16:55
it will get there, like eventually,
16:57
you know, that this is going
16:59
to be a very, very big
17:01
line item, which on some level,
17:03
like on the enterprise side or
17:05
something, yes, yes, on some level,
17:07
I totally agree with. And by
17:09
the way, one of the things
17:11
I, and I'm even like a
17:13
small investor in a company that
17:15
does something that's related to this,
17:17
so when I was running for
17:19
president, I would ask people all the
17:21
time, when do you think AI is
17:30
going to do that job? And
17:32
then people would think about it.
17:34
And now if I ask that
17:36
question, people are like, immediately, yesterday,
17:38
I mean, I talked to the
17:40
head. You can't get through to
17:42
a human anymore. You know, or
17:44
eventually in the argument I was
17:46
making before is like you'll prefer
17:48
the AI or you might not
17:50
be able to tell the difference.
17:53
I talked to the founder of
17:55
a major consumer-facing company that
17:57
everyone would have heard of. And
17:59
he said that they used to
18:01
have 2,300 people in customer service,
18:03
and they're trying to get that
18:05
down to 100. That's wild. But
18:07
I think people are feeling that
18:09
nationwide. And I think that if
18:11
you look at some of the
18:13
survey data out there about like
18:15
how people feel about getting help
18:17
on like routine stuff, yes, AI
18:19
can solve a lot of things.
18:21
And it's, you know, nice to
18:24
have a chat bot. I guess
18:26
if your problem is simple. But
18:28
like so many people feel like
18:30
a frustration of like, this is
18:32
terrible. Like I, for example, I
18:34
called a major institution three days
18:36
ago, and it was like I
18:38
was getting the AI, like all
18:40
of the, you know, prompts and
18:42
everything. And I was, like, endlessly
18:44
frustrated. Like I turned into like
18:46
an angry, angry monster. I was
18:48
like, I can't get what I
18:50
need because only a human can
18:53
handle it. But I do think
18:55
that's going to change and quickly,
18:57
quickly become irrelevant
18:59
and like it won't be an
19:01
issue anymore. Yeah so on that
19:03
level there are definitely companies that
19:05
will pay you know real money
19:07
for AI that's going to be
19:09
able to replace 2,000 customer service
19:11
workers, but the revenue hasn't
19:13
caught up to the scale of
19:15
the investment yet. There's a lot
19:17
of hope that it will, a
19:19
lot of projection that it will
19:22
and one of the incentives in
19:24
this structure, I think that's,
19:26
you know, going somewhat haywire in
19:28
my opinion, honestly, is that if
19:30
Microsoft announces that they invest $80
19:32
billion in OpenAI, Microsoft stock
19:34
value pops by some larger sum
19:36
than that. If Meta says, hey,
19:38
we're going to increase our capital
19:40
expenditures by $15 billion in AI,
19:42
their stock pops by more than
19:44
that. It's like, the stock
19:46
market will now reward anyone for
19:48
any money they're spending in the
19:50
space. If you announce, hey, I'm
19:53
building, it's like, oh good, you're
19:55
future-proofing yourself. You end up... I
19:57
mean that's definitely the historical trend
19:59
right like that is exactly what
20:01
has happened but like do you
20:03
think now with DeepSeek and
20:05
the development costs being like allegedly
20:07
like a lot lower obviously we
20:09
know what has happened here with
20:11
OpenAI, but like do you
20:13
think that that like trend continues
20:15
for much longer? And that's really
20:17
the thing I'm trying to lay
20:19
out and I want to talk
20:22
to you about is I don't
20:24
think it does continue. Like I
20:26
think that DeepSeek... It doesn't
20:28
make any sense anymore. Yeah, like
20:30
DeepSeek has kind of pulled
20:32
the, you know, the veil away
20:34
or just pointed out, look, like
20:36
this emperor's clothes are not on
20:38
or aren't great or don't, you
20:40
know, don't need to be there
20:42
because... To me, the business case
20:44
gets so much weaker on so
20:46
many levels if you're like, hey,
20:48
look guys, I'm spending $50 billion
20:51
on AI and this moat and, like,
20:53
computing infrastructure and giant, you know,
20:55
giant data sets and everyone's like,
20:57
ooh, ooh, yeah, like you're saying
20:59
all the right words. Oh yeah,
21:01
you're going to control the world.
21:03
And look, I'm someone who very
21:05
much believes that, you know, incredibly
21:07
revolutionary tools, data is the new
21:09
oil, like I obviously believe all
21:11
that stuff. But there is like
21:13
a time curve, and then there
21:15
is how much of that money
21:17
you're going to be able to
21:19
charge in revenue, you know, for
21:22
like different actors. And then you
21:24
have, if the Chinese come and
21:26
say, hey guys, look, large language
21:28
model, really whizbang, great, awesome, like
21:30
you can use ours for next
21:32
to nothing. Computing costs are lower,
21:34
the energy consumption is lower.
21:36
And then one of the things
21:38
I got asked you know on
21:40
CNN when I was talking about
21:42
this is like, oh, won't
21:44
there be some censorship if it's
21:46
a Chinese product like if you
21:48
decide to ask the model about
21:51
Tiananmen Square or whatever and I
21:53
and I was like look for
21:55
the average consumer in Africa or
21:57
Latin America or Europe, that's not
21:59
their concern. Like their concern is,
22:01
can this thing do what I
22:03
needed to do at the lowest
22:05
cost possible? And one of the
22:07
things that is happening to, which
22:09
you and I can also talk
22:11
about, is, you know, it's like,
22:13
this was DeepSeek until they,
22:15
you know, took it, they made
22:17
it unavailable. It was the number
22:20
one app in the app store,
22:22
you know, like, Americans were downloading
22:24
it like wild. And the security,
22:26
the data concerns are very much
22:28
secondary or tertiary to most consumers.
22:30
And you can see that with
22:32
TikTok. I mean, there was like
22:34
this huge rage in Congress saying,
22:36
oh, we got to get rid
22:38
of TikTok, you know, it's exactly
22:40
what they're doing now with DeepSeek,
22:42
right? Like, that's, there's that
22:44
LaHood, Gottheimer bill, right? It's the
22:46
same, it's the same people who
22:48
did the...
22:51
What I want to understand
22:53
though is when you think about
22:55
all the things you just said
22:57
and you think about like the
22:59
amount of money that has been
23:01
made so far on AI, I
23:03
was just curious like, do you
23:05
believe what like the Dan Ives
23:07
like investment professionals like these analyst
23:09
guys are saying, which is like,
23:11
look, is, you know, we kind
23:13
of touched on that earlier, but
23:15
basically like, bigger AI infrastructure is
23:17
still like American led and like,
23:20
DeepSeek doesn't really, like, change
23:22
the game in that. Do you
23:24
buy that? Do you disagree with
23:26
it? Like, I'm confused by the
23:28
case that someone like that's making
23:30
because if you have computer infrastructure
23:32
that enables you to do things
23:34
that, you know, that can only
23:36
be done through having, you know,
23:38
gazillion servers and all that, but
23:40
then if someone can reverse engineer
23:42
the product of your work and
23:44
not have to redo all the
23:46
calculations and figure it out just
23:49
based on inputs and outputs, then
23:51
it doesn't matter what you spend
23:53
in the middle, if someone can
23:55
just take input, output and say,
23:57
like, you know, ta-da, figured
23:59
it out, you know, at least
24:01
what the, because one of the
24:03
elements of this is that if
24:05
you ask even the creator of
24:07
the AI who has all the
24:09
servers, hey, when I asked the
24:11
AI this and it gave me
24:13
this, like, why or how did
24:15
it do that? Yeah. No one
24:17
knows. Like whether it's like, you
24:20
know, even if it was my
24:22
AI, like I can't necessarily train
24:24
it to figure out the problem
24:26
for itself. Yeah, yeah,
24:28
I can't necessarily tell you. And
24:30
so if the Chinese can
24:32
do functionally the same thing because
24:34
of all the work you've done
24:36
and you spent, you know, 10
24:38
or 50 or 100 billion dollars
24:40
on it, then like the bigger
24:42
your spend is the more you're
24:44
benefiting both you and them, but
24:46
they don't have to spend the
24:49
50 or 100 billion dollars. Like
24:51
the business case gets a lot
24:53
weaker, especially because now you're having
24:55
this, you know, you're already having
24:57
what Kai-Fu was proposing, which we
24:59
all kind of projected and feared
25:01
and imagined, which is that the
25:03
West is going to particularly the
25:05
US is going to try and
25:07
set up its own AI universe.
25:09
And then there's going to be
25:11
a giant turf war for... other
25:13
countries, so you would figure the,
25:15
you know, EU would be like,
25:18
sure, we'll work with the states
25:20
on this because we don't want
25:22
to work with China on it.
25:24
And then my guess would be
25:26
that Africa is like, screw it,
25:28
we'll use Chinese AI. Yeah, I
25:30
mean, they're taking all that other
25:32
investment anyways, yeah. Will
25:43
it be this then? What happens?
25:45
Okay, so we have the Stargate
25:47
factor, right? Like you have President
25:49
Trump very much, you know, trying
25:51
to shore up AI investment and
25:53
you've got what, $500 billion? Sam
25:55
Altman standing there next to the
25:57
president at the White House. Where
25:59
does... does that help? Do
26:01
you think like does that actually
26:03
like further or is it just
26:05
like digging more into the current
26:07
positioning of the US AI you
26:09
know companies and just you know
26:11
here's my instinct on
26:13
it, is that if DeepSeek
26:16
is real, if you take it
26:18
as real, and by the way
26:20
there are a lot of you
26:22
know very serious techies who looked
26:24
at it and are not disputing
26:26
the elements that we've laid out
26:28
here. It isn't like, hey, this
26:30
is bullshit. It's like, okay, I'm
26:32
going to take this as if
26:34
it's real and true, and then
26:36
we're going to respond. The problem
26:38
is that everyone's incentives, and using
26:40
Stargate, it's a perfect example. So
26:42
Trump comes up and says, hey,
26:44
we're going to spend $500 billion
26:46
on AI, compute infrastructure, and win
26:48
the AI arms race with China.
26:50
Now. It turns out that of
26:52
that 500 billion, most all of
26:54
it had already been committed by
26:56
various tech companies. So Microsoft was
26:59
80 billion of the 500 billion.
27:01
I think they might have announced
27:03
the 80 billion even before. So
27:05
what they did is they
27:07
said Microsoft is good for 80
27:09
and Meta is good for some
27:11
number. And then they glued all
27:13
the tech companies together and said
27:15
together. It's 500 billion. Yeah, it's
27:17
500 billion. And what's happened is
27:19
the market has rewarded everyone who's
27:21
in that group, and Nvidia
27:23
for sure, and some of the
27:25
companies that produce the server farms
27:27
and everything else. Like, you know,
27:29
I mean, 500 billion is a
27:31
lot of money. I mean, like,
27:33
it's literally changing the landscape. And
27:35
what no one is in position
27:37
to say. But I'm beginning to
27:39
have this sinking feeling. It's like,
27:42
hey guys, it could be that
27:44
we're not going about this the
27:46
right way, that, like, this money
27:48
might not be well spent, that
27:50
we're not sure. But
27:52
the thing is everyone's incentives now
27:54
are around the spend because if
27:56
I, the tech company, announce that
27:58
I'm spending 20 billion on AI
28:00
my market cap goes up by
28:02
100 billion, and then I
28:04
go to, you know,
28:06
Nvidia, and give them, like,
28:08
you know, billions of dollars
28:10
worth of orders. Like, there's
28:12
like this giant arrow heading in
28:14
a particular direction and it reminds
28:16
me of some other times in
28:18
my life I'm a little bit
28:20
older than you. Thank you. Does
28:22
this remind you of like what
28:25
was it, like, the tech bust,
28:27
like is it the same sort
28:29
of like money for money's sake
28:31
without evaluating, like, why are
28:33
we committing this much money to
28:35
something? This feels like it's being
28:37
driven by money to your point
28:39
like that. Like, you
28:41
know, like, to me it's
28:43
like funny money. Yes, yes. What,
28:45
you kind of disregarded the
28:47
laws of business physics that sometimes
28:49
apply. I think Tim O'Reilly calls
28:51
this super money, where you can't
28:53
compete with super money when it
28:55
starts going, it comes out of
28:57
the White House, right? Like, when
28:59
it's, you know... So in
29:01
addition to, like, the incentive in
29:03
the markets to spend more to
29:05
make more, it's also like, you
29:08
know, you have a president who
29:10
has this very unique and very
29:12
powerful hold on these billionaire founders,
29:14
you know, inauguration case in point,
29:16
and I'm sure much more behind
29:18
closed doors. And so it's like
29:20
all of these factors are coming
29:22
in plus like, what is the
29:24
Trump influence, like, with the US-China
29:26
relationship, on DeepSeek
29:28
and this kind of stuff? It's
29:30
true. Because the numbers have gotten
29:32
to this scale, like it's definitely
29:34
now intersecting very directly with government.
29:36
You know, you had a bunch
29:38
of the tech CEOs at Trump's
29:40
inauguration and some people like, oh,
29:42
that's terrible. My opinion is that
29:44
these tech CEOs are just pragmatic
29:46
and being like, you know, like
29:48
if it had been Kamala, they'd
29:51
be standing there too. Like if
29:53
they were asked to, I mean,
29:55
I don't know if Kamala would
29:57
have asked them to. I think
29:59
Trump asked them to because Trump
30:01
just likes the sense of it's
30:03
like, see, I've got these guys
30:05
and it's like all the money
30:07
in the world or like, you
30:09
know, I mean, and then if
30:11
you're in their situation, you're like,
30:13
all right, all right, I guess
30:15
I'm going to. I mean, some
30:17
of them are definitely more into
30:19
it. Some of them are just
30:21
in my opinion, it's like, well,
30:23
you know, like, whatever. You can
30:25
read those facial expressions in the
30:27
room and then pretend. You could
30:29
see who was there for the,
30:31
I was called to be here
30:34
and stand here versus like, I
30:36
am here to play ball for
30:38
sure. Yeah, yeah, you're right. There's
30:40
a mixed bag. But at this
30:42
point, the scale is so large
30:44
that it's like, look, I mean,
30:46
we're talking about, I'm leading a
30:48
trillion-dollar company. We're going to,
30:50
you know, invest hundreds of millions
30:52
in this stuff. And then government's
30:54
going to be there to help
30:56
me and, you know, there'll be
30:58
other resources flowing. It's all heading
31:00
in a particular direction. And to
31:02
me, like the reckoning that DeepSeek
31:02
represents is like the fact
31:06
that Nvidia's stock went down and
31:08
then I think is mostly recovered
31:10
since then. Yeah, it's bounced back.
31:12
I'm not sure the exact numbers
31:14
here, but it's largely, I think,
31:17
recovered. And people aren't questioning the
31:19
fundamental story. But so when I
31:21
was referring before it's like, hey,
31:23
I'm a little bit older or
31:25
whatever, I do remember the tech
31:27
bust of the first bubble, you
31:29
know, 2000. I started my first
31:31
company in 2000 and NASDAQ was
31:33
5,000, went down to 2,000, but
31:35
there was that period when anything
31:37
with a .com in it that said,
31:39
hey, internet, internet, internet, you know,
31:41
pets.com, eToys, like you name
31:43
it, would just go and then
31:45
they'd get driven to the moon.
31:47
And there were elements of the
31:49
story that were very real and
31:51
true. It's just that the revenues
31:53
didn't match up for a particular
31:55
time. Like, you know, there's like
31:58
a time horizon. What this reminds me
32:00
of too, it's like if you
32:02
ask me, hey, is AI real?
32:04
Oh yeah, AI is very real.
32:06
Is it gonna change a lot
32:08
of things? Yeah, like are we
32:10
ahead of it? Yeah, we're ahead
32:12
of it by a lot. Like,
32:14
you know, like do the numbers
32:16
make sense? You know, maybe eventually,
32:18
but like, you know, you'd have
32:20
to project a lot out into
32:22
the future. So I talked to
32:24
a professional investor friend who was
32:26
sizing this stuff up. And Nvidia?
32:28
Yes, we are. Okay, now you
32:30
know, that's where he went, which
32:32
means, that's clear, which means that
32:34
he thinks it's overvalued. He thinks
32:36
other things are overvalued too. And I
32:38
was like, oh. Because I respect
32:41
this person's opinion I was like
32:43
you know, I have that
32:43
sense that, like, the fundamentals
32:47
have kind of become disconnected or
32:49
unrelated to a lot of what's
32:51
happening in this space. But when
32:53
this heads in a particular direction,
32:55
you know, no one wants to
32:57
stand in its way because like
32:59
who the heck, you know, for
33:01
people who weren't around for the
33:03
first dot-com boom and bust, like the
33:05
echo of it in a different
33:07
sector was the mortgage bust circa
33:09
2008 where there were people looking
33:11
up being like hey some of
33:13
this stuff doesn't make much sense
33:15
but it's like what are you
33:17
gonna do? Lose your home no
33:19
big deal I mean it's it's
33:21
yeah it's what's interesting about the
33:24
comparison, to me, is if you
33:26
look at the companies that did
33:28
survive the tech bust right it's
33:30
such a small percentage and so
33:32
if you do that just rough
33:34
math thing right to how many
33:36
AI companies how much they're making
33:38
you know, the incredible climb they've
33:40
had. There will be the several
33:42
AI companies that come out like
33:44
winners and infrastructure and they'll provide
33:46
that foundation right of like AI
33:48
going forward. But it is like
33:50
the big question is like every
33:52
single company is trying to turn
33:54
itself into an AI company to
33:56
take advantage of this premium and
33:58
this sort of, like, ride to
34:00
the moon. And it's like, it's
34:02
not, the laws of, yeah, financial
34:04
physics don't really work like that.
34:07
Yeah, I mean, I have friends
34:09
who raise money for AI companies
34:11
and raise a lot of it
34:13
at very high valuations. Yeah,
34:15
exactly. Especially like a year ago,
34:17
it was like, if you were
34:19
an AI company doing like, AI
34:21
pets, right, you could probably make,
34:23
you know, easily raise a round. This
34:32
podcast is sponsored by Helix Sleep.
34:34
I have always been in search
34:37
of the perfect mattress. Sleeping, something
34:39
you do eight hours a day,
34:41
on a good day, and it's
34:44
a third of your life. Plus
34:46
it can make you healthier, more
34:49
energetic, more productive. I'm a little
34:51
bit of a productivity fiend, honestly.
34:53
And so I was so excited
34:56
to finally find out about Helix
34:58
mattresses, the best mattress in the
35:00
biz. Don't just take my word
35:03
for it: GQ, Wired, my kids.
35:05
I took the sleep quiz at
35:07
helixsleep.com and got matched with a
35:10
Dawn firm mattress that has been
35:12
a game changer for me I
35:14
wake up refreshed ready to take
35:17
on the day I kind of
35:19
need a firm mattress it's not
35:22
Evelyn's jam so much but it
35:24
is mine and I got to
35:26
say it has been worth every
35:29
penny every moment I'm on this
35:31
mattress makes me actually feel like
35:33
I made a good life decision
35:36
Helix Sleep can give you that
35:38
feeling every single morning when you
35:40
wake up. Go to helixsleep.com/yang
35:43
for 27% off site-wide plus two
35:45
free dream pillows with mattress purchase
35:48
plus free bedding bundle: two dream
35:50
pillows, sheet set, and mattress protector
35:52
with any Luxe or Elite mattress
35:55
order. That's helixsleep.com/yang for 27%
35:57
off site-wide. helixsleep.com/yang.
36:10
So that's like the sense of the
36:12
era that we're in. And the fact
36:14
that DeepSeek now isn't available in
36:16
the US. I feel like it kind
36:19
of, you know, was this comet and
36:21
shows up and is like, wow, like
36:23
we're here, we're free. And then everyone's
36:25
like, and it's gone. But it's not
36:27
gone. I mean, it's being used by
36:30
millions and millions of people in China.
36:32
And even what Kai-Fu Lee had originally
36:34
said last year, like a year
36:36
ago, like last March, does that hold? And
36:38
like, are we going to, or can
36:41
we never go back to, sort of
36:43
like two systems in one universe? I
36:45
think that we're going to end up
36:47
in two parallel AI universes. There's going
36:49
to be the China sphere and the
36:52
US sphere. The problem for the US
36:54
companies though is that as far as
36:56
I can tell, China's going to be
36:58
able to underprice by a lot.
37:01
You know, because if I've got a
37:03
company that spent $100 billion and I
37:05
need you to pay for it, I'm
37:07
going to price it very differently than
37:09
if I spent, you know, $10 million.
37:12
Not to mention, if you add on to
37:14
the fact that, like, a lot of
37:16
these companies are heavily government subsidized, right?
37:18
Like in the same way that the
37:20
e-commerce companies were, you know, subsidized, supported
37:23
by the government too. But by the
37:25
way, I mean, the Chinese were forced
37:27
into this position because the US tried
37:29
to cut off their supply of high-powered
37:31
chips, though they did get their hands
37:34
on some. But one of the things
37:36
that popped into my mind during this
37:38
process, Jo Ling, was just the phrase, necessity
37:40
is the mother of invention. That the
37:42
US tried to box out Chinese firms
37:45
from being able to make advances in
37:47
AI being like, ha ha, like, you
37:49
know, we're going to try and cut
37:51
the chip, the, you know, the chip
37:54
access off to you. China does have
37:56
access to its own data. So, you
37:58
know, that like that, you know, I
38:00
mean, they have more of it, you
38:02
know, and also it's centralized. But
38:05
they pushed the Chinese into a situation
38:07
where the Chinese had to figure out
38:09
some kind of hack and, you know,
38:11
in reverse engineering. Yeah, build their own
38:13
on top of, you know, GPT's model
38:16
it looks like. And so one of
38:18
the jokes I had, aside from, you
38:20
know, that they're 12 hours behind us
38:22
and AI, was, does it really shock
38:24
anyone that the Chinese figured out how
38:27
to build something cheaper? It's been kind
38:29
of their jam for, you know, like,
38:31
quite some time. And they're also very
38:33
unmindful of... you know intellectual property rights
38:36
and like and that stuff I mean
38:38
they'll just lift it if they can
38:40
and in this case this is the
38:42
ultimate choice of words, but this
38:44
is the ultimate irony is that the
38:47
AI companies also just frankly appropriated most
38:49
of this data you know from us
38:51
you know hundreds of billions of dollars
38:53
worth and then they managed to build
38:55
these like mega enterprise mega moats and
38:58
then the Chinese have essentially done the
39:00
same thing to their handiwork and then
39:02
they're in this really weird position where
39:04
it's like oh man well that stinks
39:06
and you're open source oh man like
39:09
I mean because some of the techies
39:11
originally were like, oh, we should do
39:13
this as open source, but then you
39:15
can't charge billions of dollars if you're
39:18
open sourcing it. And so it's like
39:20
maybe we won't open source it. Maybe
39:22
we're going to use it. This is
39:24
why Elon's so mad at Sam Altman
39:26
is because OpenAI started out as
39:29
something that was supposed to be kind
39:31
of, you know, for the commons or
39:33
whatever. It was a non-profit. Non-profit. And
39:35
then they converted for profit. But then
39:37
it turns out that like the money
39:40
for money's sake, it's a very American
39:42
story, I feel like. It's just someone
39:44
saw this and was like, oh, like,
39:46
you know, money to the moon. And
39:51
then so now
39:51
they're in this weird position where it's
39:53
like, okay, we spent gajillions of dollars.
39:55
You know, you have an open source
39:57
competitor over there. So let's try and
39:59
shut the door, ban it. And it
40:02
was like, okay, like, let's pretend that
40:04
didn't happen, just keep on investing and
40:06
digging and building and spending. And keep
40:08
doing what we were doing and maybe
40:10
not acknowledging the lessons that
40:12
have been learned here. That's my strong
40:15
feeling. That's my take. That's my
40:17
take. And so when you've been
40:19
talking to people and they've run
40:21
the gamut, like, what have
40:23
the conversations been like? Because you
40:26
have contacts here and in China. Yeah. You know,
40:28
I think most of my conversations of late
40:30
have been like US sources people who
40:32
are in tech companies or around tech
40:34
companies and you know investors and stuff
40:36
like that and I think that there
40:38
is like a couple schools of thought.
40:41
I mean, we've talked about some
40:43
of it already, but there's a
40:45
sense of like, okay, like, let's
40:47
make it cheaper. So it makes
40:50
people, it makes it easier for
40:52
people to develop stuff. So there's
40:54
another option. It's good competition. You
40:56
know, all ships rise in the
40:59
end. But there's the surveillance
41:01
aspect of course, like the
41:03
natsec folks are like, okay, you
41:05
know, in the same way that people
41:07
who have told me. I would never use a
41:09
Little Red Book on my actual iPhone. I
41:11
have like a burner phone and these are
41:13
tech people, right? So I have burner phones
41:15
like do that because they really do believe
41:17
that there is a national security threat of
41:19
some kind regarding our personal data. So it's,
41:21
you know, when you think about DeepSeek,
41:23
it's like, if you think about it from
41:26
an investor perspective and like your skepticism about
41:28
where the, how much money AI can make, you
41:30
know, that's, that's one big pot of questions. But
41:32
the other one is really like,
41:39
how much do you
41:39
give up when you're dealing with a
41:41
TikTok or a Little Red Book or
41:43
a DeepSeek? And like, what are
41:46
some of the things that the US
41:48
government ought to be doing in this
41:50
new administration to actually protect people who
41:52
care about those kinds of things? Because
41:55
I think increasingly so many people just
41:57
like, you just download the app, you
41:59
agree to the... terms, and, like, most,
42:03
you know, 100%, like, you hear, like,
42:03
it's just like I just want to
42:05
use the thing that I'm addicted to.
42:07
And so, there doesn't seem to
42:09
be... there's obviously an outcry
42:11
in Washington on this front
42:14
and there's, you know, the
42:16
legislative proposals and things like
42:18
that. But what I'm waiting to
42:20
happen and I don't know if it
42:22
will in 2025 is will there be like
42:24
a consumer-driven shift in, like,
42:27
how we use things and like especially
42:29
like social media and AI, because once
42:31
you know what is being collected, do
42:33
people care? I would argue most
42:35
people don't care. Yeah, I agree. Because
42:37
they still use it. But if there is an
42:40
increasing national security threat
42:42
and the geopolitical environment,
42:45
you know, hardens and there's more like
42:47
borders and lines drawn, so to
42:49
speak, then do we care more? Do lawmakers
42:51
care more? Does that enforcement
42:53
mechanism then affect how people
42:55
consume stuff on their devices?
42:57
I just wonder if 2025 is going to be the
42:59
year of like a little bit of a tipping point
43:01
on that front. I don't know. Start
43:11
fresh in the new year. As
43:13
you set resolutions for 2024, consider
43:15
how learning a new language can
43:18
enrich your life, whether through travel,
43:20
career advancement, or cultural appreciation. Keeping
43:22
in mind everything you've learned over
43:25
the last year, it's time to
43:27
build on that. And learning a
43:29
new language can help you connect
43:32
with others and explore new cultures.
43:34
With that in mind, there's no
43:36
better tool than Rosetta Stone. With
43:39
Rosetta Stone, truly learn to think,
43:41
speak, and understand it naturally. With
43:43
Rosetta Stone's intuitive approach, there are
43:45
no English translations. You're fully immersed.
43:47
And the built-in TruAccent feature
43:49
acts like a personal accent coach,
43:51
giving you real-time feedback to make
43:53
sure you sound just right. Don't
43:56
put off learning that language. There's
43:58
no better time than right now
44:00
to get started. Start the
44:02
new year off with the
44:04
resolution you can reach. Today,
44:06
listeners can take advantage of
44:08
Rosetta Stone's lifetime membership
44:10
for 50% off. Visit
44:12
RosettaStone.com/RS10. That's 50% off unlimited
44:14
access to 25 language courses
44:16
for the rest of your
44:18
life. Redeem your 50% off
44:20
at RosettaStone.com/RS10 today. Alaska
44:22
Airlines and Hawaiian Airlines have come
44:25
together, bringing you more destinations and
44:27
even more rewards. Now your miles
44:29
add up no matter which airline
44:31
you fly. Head to Hawaii with
44:34
Hawaiian or explore Mexico, the Bahamas,
44:36
the East Coast, and beyond with
44:38
Alaska. Your loyalty just got a
44:40
major upgrade. More flights from the
44:42
West Coast. More perks and more
44:45
ways to earn. Book now at
44:47
alaskaair.com. So
44:49
here's the stuff I'm hearing is that
44:52
there are some consumers that I know
44:54
like regular people, who are now trying
44:56
to turn off Amazon Prime because
44:58
they're pissed at Bezos or that
45:00
they're trying to get rid of
45:03
Facebook or Instagram because they're pissed
45:05
off at Zuck. Like, now
45:07
there's like a
45:09
personal, an anthropomorphization of these tech
45:11
companies where now like, you know,
45:13
Zuck equals Meta and it's like,
45:16
oh, I'm mad at Zuck about
45:18
something. So I don't know. Yeah, they
45:20
saw him at the inauguration, like all these
45:22
simple little pieces. Yes, yes. It's
45:24
something as simple as that.
45:26
So like I think that there
45:29
are some people making consumer-based decisions
45:31
about some of those tech companies.
45:33
But the single biggest emblem
45:35
of this has been TikTok. And,
45:37
I mean, this was bipartisan and
45:39
a bunch of legislators came together
45:41
and said, hey, this is a problem. It's
45:44
our kids' data, their brains. We need
45:46
to find a U.S. buyer or this thing is going
45:48
to be banned. And then I'll tell story
45:50
too, Jolin, because I'm on TikTok. You
45:52
know, I mean, I have a TikTok
45:54
account with 400, I'm not like, I
45:56
don't consume, I'm happy to report. I'm
45:58
not like sitting there watching videos all
46:00
day. That's what you say. That's what
46:03
you say. But what happened was I
46:05
have a staffer who runs my account
46:07
and said, hey, do you want to
46:09
put out a farewell to TikTok message
46:11
because a lot of people are putting
46:13
that out when it's being banned. And
46:15
I was like, no, because I do
46:18
not think it's going to be banned.
46:20
And then the person asked why. And
46:22
I said, number one, it's because all
46:24
these people who are on TikTok
46:26
are super pissed, and our political class,
46:28
really, they have no backbone. It's like
46:31
if people are pissed at them, they're
46:33
like, oh, okay, okay. Like, you know,
46:35
like even if I did vote for
46:37
that legislation, like, let's make you happy
46:39
because I don't like people mad at
46:42
me. Like if he bans it and
46:44
it's gone, it's like,
46:46
TikTok users probably go to
46:48
Instagram. Yeah,
46:50
then Trump doesn't get any, you
46:52
know, bargaining power or money from
46:55
that. But if he doesn't ban
46:57
it and it's out there and he's
46:59
like, hey, I'm going to find a
47:01
buyer, maybe like, you know, create
47:03
the sovereign wealth fund, then there's
47:05
money in it for him, or this
47:08
power. And so he'd
47:10
prefer that path because,
47:12
you know, it's just
47:14
purely transactional and like it's
47:16
not about like privacy or
47:18
kids data or, you know,
47:20
their mental health. So I
47:22
was like, they're not banning
47:24
it. Like, you know, and
47:26
it turns out that, you know,
47:28
my, I think I was largely
47:30
correct. President Trump got into the
47:33
zeitgeist of a bajillion TikTok users, right?
47:35
Because that error message that came up
47:37
was like, oh, we are, I screen
47:39
grabbed it. I can't remember exactly what
47:42
it said, but it was like, President
47:44
Trump is like, working on this, right? And
47:46
then like, a few hours later, TikTok is
47:48
back. So it's like, it all seems like it's just
47:51
like a show, in a sense of like power, or
47:53
of, you know, what's interesting though, Jo Ling,
47:55
is that
47:57
the legislators were sincere, I think.
47:59
We're going to ban it. And then...
48:02
And remember who started all of this?
48:04
This was... remember the first
48:06
one? The funny
48:08
thing to me, and even as a
48:11
journalist, like our brains are so full
48:13
of information, things are coming so fast
48:15
and furious and it's crazy right
48:17
now. But like, the Trump, the first
48:19
Trump administration, it was
48:21
all about TikTok towards
48:23
the end of the first four years.
48:26
And so it is just wild
48:28
to me that you fast forward,
48:30
you've got President Biden, four years,
48:32
now President Trump is back, that
48:35
this is being done again, it
48:37
comes as no surprise, of course,
48:39
but like, it's just fascinating how
48:41
short our memories are when it comes
48:43
to, you know, something that is so
48:45
essential to so many people's lives,
48:47
like social media, right? Like, this
48:49
all began during the first
48:52
Trump administration. And then, I'll
48:54
never forget this. Trump was like,
48:56
yeah, we got to like get rid
48:58
of TikTok. And so I'm reporting on
49:00
all this. And at some point, like,
49:03
some other major world event happens
49:05
and it's like the Trump DOJ
49:07
or it was the FTC or something,
49:09
they kind of go quiet on
49:11
me. And I'm like, I was
49:13
talking to sources inside, you know,
49:15
TikTok and ByteDance. And I was
49:17
like, forgive me if this is a
49:20
silly question, but did they
49:22
forget that they just initiated
49:24
this entire situation and all of
49:26
this legal battle, and the other
49:28
side, TikTok and ByteDance, they
49:30
were just like, it appeared, we are
49:33
not getting responses from the
49:35
US government anymore. So it's
49:37
like at the convenience of President
49:39
Trump, it appears that this
49:42
entire controversy ebbs and flows. Well
49:44
you referred to Little Red
49:46
Book, and there was this moment
49:48
when people thought TikTok was going
49:50
away, when people were switching to Little
49:53
Red Book, even though that's very natively
49:55
Chinese I mean even like the name
49:57
is based on, you know, Mao's Little
49:59
Red Book. And there were a bunch
50:01
of Americans who were jokingly, like, I
50:03
pledge allegiance to, you know, the Little
50:06
Red Book. It just goes to show
50:08
how little people care about who's getting
50:10
their data, like the national entities. You
50:13
know, it's like, as long as
50:15
I get my, my... whether or not
50:17
they can even be understood by millions
50:19
of people who use that current app.
50:22
Like I had so many friends that
50:24
were getting, they were posting, like, comments
50:26
in Mandarin, they'd screenshot them, because I'm
50:28
not... I didn't put Little Red
50:31
Book on my iPhone. They'd screenshot it,
50:33
text me, and be like, Jo, what does
50:35
this say, translate it for me? And
50:38
I was like, oh, this really is,
50:40
like, people don't care. They just want
50:42
to be seen? Yeah. Is that it?
50:44
Like, yeah, it's just plugging into their
50:47
brainstem where, you know, you get dopamine
50:49
hits based upon responses. I'm really glad
50:51
I was a functional adult before all
50:54
this stuff came along. I'm sure you'll,
50:56
you know, keep your girls away from
50:58
it for a while, Anxious Generation and
51:00
all that. It's all very, very intimidating
51:03
to me though, because I feel like
51:05
the more you know about it, the
51:07
worse it weighs, like the heavier it
51:09
weighs. But I agree. I think most
51:12
Americans do not fundamentally care who has
51:14
their data or where it goes because
51:16
we have been conditioned as a society
51:19
to just... release it. Like think about
51:21
all of the credit card breaches and
51:23
all of the payment breaches over the
51:25
years. Like, there's some crazy stat out
51:28
there that is like, you know, the
51:30
vast majority of Americans have had their
51:32
private information hacked, right, and stolen in
51:35
some way, not just Equifax or the
51:37
Target hack or whatever. It's like, it's
51:39
out there. Yeah, it's a factor in
51:41
life and you hope it doesn't rise
51:44
to the point where you have to
51:46
change your credit card. Or you show
51:48
up to your website and they're like,
51:50
hey, you might want to change your
51:53
password and you're like, oh God, like,
51:55
something has happened here if
51:57
you're asking me to do this. Thank you
52:00
so much for helping us dig into
52:02
issues that I get asked about all
52:04
the time. And it sounds like you
52:06
do too. I know that you're hanging
52:09
out with a family right now, but
52:11
if someone wants to keep up with
52:13
you and your work, how can they do
52:16
that? Plug yourself. Yes, I will be back on
52:18
the air at CBS News in the
52:20
second half of March and I'm on
52:22
social at Jo Ling Kent across all platforms,
52:25
but always good to be with you
52:27
Andrew Yang, Andy Yang. I don't know
52:29
what I should... whatever you want, Jo
52:31
Ling, whatever you want. But congratulations to
52:34
you on your work, on the family,
52:36
on everything. It's kind of awesome to
52:38
see how far you've come, how far
52:41
we've all come. I mean, wow, things
52:43
have really... We've all been through a
52:45
lot and it's crazy to think that
52:47
I probably first met you in 2013
52:50
or 14. Yeah, is that right? Yeah,
52:52
I think that's right. I was a
52:54
normal human. Happy 10-year friendiversary
52:57
or something. Drinks on me next time
52:59
I see you. Well, thanks for having
53:01
me on. It's great to chat with
53:03
you. I always love hearing your perspective
53:06
and what all your listeners are saying
53:08
about what the heck is going on
53:10
right now. Thank you, Jo Ling. Come back
53:12
any time. Oh, thank you. I will.
53:15
Thanks.