Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements may have changed.
0:00
The demand for content is
0:02
skyrocketing and businesses are turning
0:04
to generative AI for a creative
0:06
edge. But with generative AI, it's
0:08
important to ensure your work is
0:10
copyright safe. Meet Adobe Firefly. Adobe's
0:12
family of generative AI models, built
0:14
to be commercially safe and IP-friendly.
0:16
Firefly is trained
0:21
only on content Adobe has permission
0:23
to use, so you don't have to worry about creating content that
0:25
infringes on copyright. Firefly unlocks limitless
0:27
creativity without legal worries. Create confidently
0:29
at adobe.com/Firefly. Okay, see, hello, Kevin.
0:31
How's it going? Well, you know,
0:33
it's going all right. There's, um...
0:35
a lot of sad news in
0:38
the world right now. Of course,
0:40
you know, my thoughts are with
0:42
everyone affected by these LA wildfires.
0:44
And whenever something bad has happened
0:46
this month, I just think, I
0:48
can't believe they're doing this to
0:50
us during dry January. You know
0:52
what I mean? I'm doing dry
0:54
January this year. And, you know,
0:57
how's it going? Well, the unfortunate
0:59
news is that it's going fantastic.
1:01
I had sort of assumed that,
1:03
you know, each day I would
1:05
wake up thinking like, like, crack
1:07
open a beer with my friends
1:09
tonight or something. But instead, I'm
1:11
just like, I feel so incredibly
1:14
well rested. So that has been
1:16
an interesting learning. Do you plan
1:18
to continue it beyond January? Well,
1:20
yeah, I mean, I don't think
1:22
I'm gonna go 100% dry, but
1:24
I have actually been thinking about
1:26
like, what if I did like
1:28
a dry February too? So I
1:30
don't know, I don't know, changes
1:33
could be on the horizon. What
1:35
would happen to your wine tap?
1:41
I'm Kevin Roose, a tech columnist
1:44
at the New York Times. I'm
1:46
Casey Newton from Platformer. And this
1:48
is Hard Fork. This week, TikTok
1:51
enters its final hours in the
1:53
United States, and Americans are flocking
1:56
to a new Chinese app. Then,
1:58
Hugging Face's Sasha Luccioni joins us
2:00
to help us understand the environmental
2:03
impact of AI. And finally, my
2:05
ideas to bring more
2:08
masculine energy
2:10
to Meta.
2:13
Time to
2:15
man up, Kevin.
2:18
Let's go, bro. Well, Casey, the news this week is that we
0:20
might actually get a TikTok ban.
2:22
It's true. And I feel like
2:25
we've probably started at least seven
2:27
segments of the show that way
2:29
over the years, Kevin. But this
2:31
really is looking like the final,
2:33
final, final version of, yes, TikTok
2:36
could be banned. Yes, this is,
2:38
this is, TikTok ban, V12, final,
2:40
use this one.docx. That's right.
2:42
We have talked so much about
2:44
how we came to this point,
2:47
the law that Congress passed to
2:49
ban TikTok. PAFACA. PAFACA, technically, to
2:51
force it to sell. That law
2:54
is supposed to go into effect
2:56
on January 19th, which is this
2:58
Sunday, the day before Donald Trump's
3:00
inauguration. And as of this taping,
3:03
it appears that barring some last
3:05
minute intervention, TikTok's time as one
3:07
of the most popular social media
3:10
apps in the United States may
3:12
be coming to an end. And
3:14
you want to talk about a
3:17
roller coaster? I mean, the number
3:19
of twists and turns this story
3:21
has taken from when Donald Trump
3:23
tried to ban TikTok in
3:25
his administration, to Joe Biden putting
3:28
the brakes on that and exploring
3:30
other alternatives, to that process going
3:32
off the rails, and Congress passing
3:34
the first piece of tech-related legislation
3:36
in the past decade to make
3:39
this thing happen, then Donald Trump
3:41
reverses himself, says I'm going to save
3:43
the app, then that doesn't... And finally last
3:45
week it all ends up at the Supreme
3:48
Court. So yes, if you wonder why we
3:50
keep doing this segment over and over again
3:52
It's because very few things have changed as
3:54
much in the past half a decade or
3:56
so as the fate of TikTok. It's truly
3:59
nuts. So when we started recording this segment
4:01
on Wednesday, the Supreme Court had
4:03
not made a decision yet on
4:05
this case, but as of Friday,
4:07
the justices did issue an opinion
4:09
upholding the law and denying ByteDance's
4:11
challenge. So Casey, we're going
4:13
to talk about all this, but
4:15
let's start by analyzing a little
4:17
bit of what happened at the
4:19
Supreme Court and then talk about
4:21
where we go from here. Okay,
4:24
so last Friday, the Supreme Court
4:26
of the United States heard oral
4:28
arguments in TikTok v. Garland. Merrick Garland,
4:30
of course, is the Attorney General.
4:32
And this was a lawsuit brought
4:34
by ByteDance to try to get
4:36
the Supreme Court to step in
4:38
the last minute and overturn this
4:40
law, or at least to
4:42
delay the ban from going into
4:44
effect. And did you actually listen
4:46
to the oral arguments? I know
4:48
you're a noted Supreme Court watcher.
4:50
I did not listen to the
4:52
Supreme Court arguments live, Kevin, but
4:54
I did catch up on them
4:56
later. And I have to say
4:58
I was surprised by the tenor
5:00
of the discussion. Say more. Well.
5:03
I sort of just thought that
5:05
we would see a bit more
5:07
deference to the First Amendment than
5:09
we got. The justices seemed to think
5:11
that the speech issues involved in
5:13
the case were not relevant because
5:15
the way that the law is
5:17
written says that as long as
5:19
ByteDance divests this app, all
5:21
of the speech on the app
5:23
remains, right? So they sort of
5:25
swept that away. And again, if
5:27
this is somewhat surprising and we're
5:29
talking about it, it is only because
5:31
the court did not have to
5:33
hear this case. Right? The last
5:35
time we talked about this potential
5:37
ban, we said, hey, look, the
5:40
court could just defer to the
5:42
lower court, not hear this argument,
5:44
and just let the law go
5:46
into effect. But instead, they did
5:48
take it up, which made some
5:50
people think, aha, maybe they have
5:52
something to say about it. But
5:54
I did sort of predict at
5:56
the time that the court really just had
5:58
a lot of justices that kind
6:00
of wanted to like give TikTok
6:02
the finger, and that does
6:04
seem like what happened last week. So, assuming TikTok does have to
6:06
comply with this law, what happens
6:08
next? So, ByteDance has said
6:10
that it will block access to
6:12
the app for Americans beginning on
6:14
Sunday. So you will open your
6:16
TikTok app and it will not
6:19
refresh. It will not be populated
6:21
with new content. Now, under the
6:23
law... ByteDance doesn't actually have
6:25
to do that. The way the
6:27
law is written, it is actually
6:29
intended to force Apple and Google,
6:31
the two big app store providers,
6:33
to remove TikTok from the app
6:35
store. But ByteDance got ahead
6:37
of that and said, we're just
6:39
going to shut the thing off,
6:41
which some people have speculated is
6:43
essentially a move to get some
6:45
leverage, because you're going to make
6:47
so many Americans mad. It reminds me of how you'd
6:49
open your Uber app and you'd see
6:51
a little pop-up that said like
6:53
we don't have Uber in your
6:55
city, if this makes you mad,
6:58
like, here's some numbers to call
7:00
to contact your local legislators and
7:02
yell at them. Yeah, and you
7:04
know, ByteDance has tried stuff
7:06
like that in the past in
7:08
the United States, and it has
7:10
backfired. But, you know, at this
7:12
point, what does it have to
7:14
lose, right? So the other possibility
7:16
for an outcome here, I suppose,
7:18
is that ByteDance could agree
7:20
to sell TikTok, that it would
7:22
divest as this law was intended
7:24
to force them to do, and
7:26
that that would be how TikTok
7:28
survives. Now, we should say, I
7:30
think time is running out for
7:32
any kind of deal like that,
7:34
but maybe we should run down
7:37
a few of the possible... ways
7:39
that this could end with a
7:41
new owner of TikTok in the
7:43
United States rather than the app
7:45
just going dark. Let's do it.
7:47
So one group that has lined
7:49
up to say that they are
7:51
interested in possibly acquiring TikTok is
7:53
a group of investors led by
7:55
the billionaire Frank McCourt who is
7:57
the former owner of the LA
7:59
Dodgers. He sent ByteDance a
8:01
letter last week expressing his interest
8:03
in acquiring TikTok. He said that
8:05
he would acquire it even without
8:07
the algorithm that determines what people
8:09
see in their For You pages.
8:11
And he said that he would
8:14
cobble the money together for the
8:16
sale from private equity funds and
8:18
other ultra-wealthy investors, including Kevin O'Leary,
8:20
the investor who goes by Mr.
8:22
Wonderful on Shark Tank. Sharks, I'm
8:24
coming before you today because I'd
8:26
like to buy TikTok from
8:28
the Chinese Communist Party. That would
8:30
be a great episode. I'd watch
8:32
it. Yeah, it sounds like this
8:34
is a hastily arranged sale meant
8:36
to avert a catastrophic legal outcome
8:38
and for that reason. I'm out.
8:40
Another potential buyer is the YouTuber
8:42
celebrity Mr. Beast who said that
8:44
he had had billionaires contacting him
8:46
about buying TikTok after he posted
8:48
about it on X. I don't
8:50
know how serious this is, but
8:53
I think we should just say
8:55
Mr. Beast would be, I think, a
8:57
pretty good owner of TikTok. I
8:59
mean, I don't know. After I
9:01
heard what happened on his Beast
9:03
Games show on Amazon Prime, where
9:05
many of the contestants were lucky
9:07
to survive, I'm not sure we
9:09
want him running a large
9:11
tech platform Kevin. Well, I don't
9:13
think we have to think about
9:15
it that hard, because I don't
9:17
think it's going to happen. Me
9:19
either. But the real wild card
9:21
here, the one that I actually
9:23
do take somewhat seriously, is that
9:25
might be Elon Musk. Bloomberg reported that senior Chinese officials were weighing a plan in which Musk would acquire TikTok's US operations, and The Wall
9:27
Street Journal shortly afterward put out
9:29
their own reporting with largely the
9:32
same information. According to the Bloomberg
9:34
article, senior Chinese officials had already
9:36
begun to debate contingency plans for
9:38
TikTok as part of an expansive
9:40
discussion on how to work with
9:42
Donald Trump's administration, one of which
9:44
involves Musk. This is according to
9:46
anonymous sources who asked not to
9:48
be identified revealing confidential discussions. Yeah,
9:50
I bet they did. So what did you make of this theory that
9:52
Elon Musk could acquire TikTok? You
9:54
know, every once in a while
9:56
something happens in the universe and
9:58
I think, was this done to
10:00
upset me specifically? That's how I
10:02
felt when I read the story
10:04
that said that Elon Musk was
10:06
going to take over a second
10:08
beloved social platform in the United
10:11
States, and, you know, presumably apply his
10:13
signature brand of content moderation and
10:15
other fun tricks to the app.
10:17
So, you know, I have to
10:19
say, Kevin, of everything that has
10:21
happened in the TikTok story so
10:23
far, this truly might be the
10:25
craziest because of the different players
10:27
here, who we think knows what,
10:29
what has been said about it,
10:31
I can absolutely see a world
10:33
where this is plausible, I can
10:35
just as easily see a world
10:37
where this is a nothing burger,
10:39
and we're just going to have
10:41
to get a little bit more
10:43
information. But what was your reaction?
10:45
So I was somewhat skeptical when
10:48
I first saw this, in part
10:50
because TikTok came out right away
10:52
and said that this was pure
10:54
fiction, but it was written in
10:56
such a way that I
10:58
felt like the message of the
11:00
stories that I read was that
11:02
actually TikTok may not be involved
11:04
in these discussions, right? This may
11:06
be happening at the level of
11:08
the Chinese government who is sort
11:10
of deliberating about what to do.
11:12
And that's very telling. Obviously, there's
11:14
this theory that I subscribe to
11:16
and that I think a lot
11:18
of people subscribe to, that ByteDance
11:20
is not really in control
11:22
of its own destiny here because
11:24
there's this sort of, you know,
11:27
there's this government control of all
11:29
Chinese social media platforms, but especially
11:31
this one, which seems strategically important. ByteDance would need the Chinese government's
11:33
permission to divest TikTok. So that's
11:35
a very real thing. Right. I
11:37
can also see this making sense
11:39
for Elon Musk. He's said before
11:41
that he wants X to operate
11:43
more like TikTok. It's obviously a
11:45
social network that's very popular. They've
11:47
obviously... cracked the code on sort
11:49
of algorithmic presentation of content. I
11:51
also think that he would
11:53
be a more palatable American acquirer
11:55
for the Chinese government than any
11:57
other potential acquirer. So yeah, so
11:59
sketch this case out like what
12:01
would the Chinese government have if
12:03
Elon owned TikTok? So I think
12:06
one thing that we know about
12:08
Elon Musk is that he does
12:10
a lot of business in China, so the Chinese government has a kind of
12:12
leverage on Elon Musk that they
12:14
do not have over say Frank
12:16
McCourt. Right. On one hand Kevin
12:18
it seems a little crazy to
12:20
me that you know the Chinese
12:22
government thinks essentially we will turn
12:24
Elon Musk into like kind of
12:26
a soft Chinese agent to like
12:28
do our bidding in the United
12:30
States like that seems a little
12:32
bit far-fetched. On the other hand
12:34
if you look at Musk's behavior
12:36
over the past few years which
12:38
I think has been really erratic
12:40
in a lot of ways he's
12:42
always extremely careful about what he says
12:45
about China. He truly almost never
12:47
says anything remotely critical of that
12:49
government and you know if you're
12:51
the Chinese government maybe you look
12:53
at that and you appreciate it
12:55
and you think yeah sure let
12:57
that guy take it. Totally. So
12:59
those are the sort of acquisition
13:01
scenarios but I think we should
13:03
say like I don't believe any
13:05
of these are likely to happen
13:07
by the time this deadline hits.
13:09
I think that no matter what
13:11
acquirers might be interested in buying
13:13
TikTok: A, ByteDance does not
13:15
seem interested in selling it; B,
13:17
the Chinese government does not seem
13:19
interested in letting ByteDance sell it;
13:22
and C, I don't think anyone
13:24
could put together a deal quickly
13:26
enough to actually get this done
13:28
by the 19th. Yeah that's right
13:30
also I just want to like
13:32
acknowledge the American-centricness of this
13:34
conversation. TikTok is available in many
13:36
other markets where it is not
13:38
banned and while the United States
13:40
is a very lucrative market and
13:42
a great place to run an
13:44
e-commerce operation the way that TikTok
13:46
is doing, it is operating in
13:48
many other countries in the world,
13:50
I read this week that its
13:52
largest market right now is actually
13:54
in Indonesia. There are more users
13:56
in Indonesia than in the United
13:58
States. And so if you're ByteDance,
14:01
just from a
14:03
pure dollars-and-cents perspective, you
14:05
might look at this and
14:07
think, we can actually make more
14:09
money operating in the countries where
14:11
we already exist and just sort
14:13
of give up on America and
14:15
we'll be fine. So that's another
14:17
scenario here. Yeah. So now let's
14:19
talk about the user response because
14:21
this has been truly wild. So,
14:23
I mean, I know you are
14:25
not like a TikTok addict, but presumably...
14:27
Best efforts, yes. I tried
14:29
to get addicted to it on
14:31
this very show. You did, yeah.
14:33
But the people on TikTok are
14:35
starting to, I would say, panic
14:37
over the impending possible loss of
14:40
their favorite app, their favorite platform.
14:42
And a lot of content creators
14:44
on TikTok are starting to try
14:46
to bring their audiences over to
14:48
other platforms like YouTube or Instagram.
14:50
Some are saying they're going to
14:52
use VPNs to get around this
14:54
ban. And the most fascinating development
14:56
to come out of all this,
14:58
in my opinion, has been that
15:00
there is now this new trend
15:02
of TikTok refugees downloading a Chinese
15:04
social media app called Xiaohongshu,
15:06
or Red Note. Yes. This is
15:08
truly one of the funniest and
15:10
most unexpected stories of the young
15:12
year so far, Kevin. Yes. So
15:14
as of yesterday, at least when
15:16
I checked, the Xiaohongshu app
15:19
was the number one free app
15:21
on the iOS app store. It
15:23
has gotten a ton of downloads
15:25
from people who are saying basically
15:27
screw the US government, screw this
15:29
TikTok ban, I'm going to protest
15:31
by going to this explicitly
15:33
Chinese app that does not even
15:35
have an English name in the
15:37
app store. That's right. Now, obviously,
15:39
I have installed this app, which
15:41
I'm going to call Red Note,
15:43
even though I believe that that's
15:45
just kind of an American nickname
15:47
for it. Yes, the literal translation
15:49
is Little Red Book, which is
15:51
also not subtle. That is also
15:53
the name given to the book
15:56
of quotations from Chairman Mao that
15:58
was distributed during the Cultural Revolution. Xiaohongshu is
16:00
essentially a TikTok-like app. It is
16:02
not owned by ByteDance, but
16:04
if you open it, you see
16:06
a feed that looks very much
16:08
like the For You feed of trending
16:10
videos. It basically has the same
16:12
platform mechanics as TikTok. And until
16:14
this week, most of the content
16:16
there was people in China speaking
16:18
Chinese and talking about China. Absolutely.
16:20
And, you know, I downloaded it,
16:22
I installed it, signed up for
16:24
an account, and immediately started watching
16:26
a bunch of videos from, you
16:28
know, refugees, as they're calling themselves,
16:30
from TikTok over to Red Note,
16:32
and they seem like they're having
16:35
a great time over there. But
16:37
you know, in addition to this
16:37
migration, Kevin, what was truly so
16:41
funny was there were so many
16:43
posts on X of people bragging
16:45
about how they were racing to
16:47
share all of their data with
16:49
a new Chinese app. They were
16:51
posting screenshots of themselves with that
16:53
Apple app tracking transparency screen. You
16:55
know, Apple sends you this big
16:57
scary warning. Hey, are you sure
16:59
that you want to share your
17:01
data? And people are like, hell,
17:03
I want to share it, brother.
17:05
Like, I'll give you everything. I
17:07
saw a TikTok from this one girl
17:09
who was like, I would fly
17:11
to China and hand my social
17:14
security number to Xi Jinping
17:16
before I would ever use Instagram
17:18
Reels. That's where the user base
17:20
is at, Kevin. So I thought I should
17:22
probably download and install this Xiaohongshu
17:24
Red Note app just to see
17:26
what all the fuss is about
17:28
and can I tell you the
17:30
first three videos that I saw
17:32
on my feed? Number one was
17:34
a clip from Modern Family. That
17:36
is the kind of, like, that
17:38
is the sourdough starter of the
17:40
modern video-based social network:
17:42
just clips of Modern Family. Number
17:44
two was a Chinese language clip
17:46
of a dog at the vet
17:48
having its anal gland expressed. Perfect.
17:50
And express yourself. And number three
17:53
was someone making a latte art
17:55
of Luigi Mangione. So I think
17:57
it just goes to show you
17:59
how quickly you can create an
18:01
American social network. It really, in
18:03
just three videos, you've captured a
18:05
shocking amount of the zeitgeist, Kevin.
18:07
But I would say, after I
18:09
scrolled more and more, I did
18:11
start to see these so-called TikTok
18:13
refugee videos, these Americans who are
18:15
coming over to Red Note from
18:17
TikTok and basically trying to make
18:19
sense of this new thing and
18:21
sort of participating in almost a
18:23
cultural exchange. So why don't we
18:25
play a couple videos that have
18:27
been making the rounds on Red
18:30
Note? Let's do it. Hi guys,
18:32
I got sent here from TikTok
18:34
and I was hoping that you
18:36
guys can welcome me. I really
18:38
like this app and I love
18:40
the makeup. I tried to do
18:42
it today. So thank you. Xiaohongshu.
18:44
I'm not gonna lie to y'all
18:46
bro. I can't read shit on this
18:48
app. Say somebody helped me out. And
18:51
I need some followers too. I say
18:53
go ahead and hit that follow-up. I
18:55
need that. Say, who else the TikTok
18:58
refugee y'all let me know in a
19:00
comment or something. Xiaohongshu. So it inserts
19:02
the little Xiaohongshu at the end
19:05
of every video that's like the watermark.
19:07
Now I actually feel like I could
19:09
pronounce it. So Casey what do you
19:11
make of the Xiaohongshu Red Note
19:14
trend? I mean... I have to say,
19:16
like, this is why I love Americans.
19:18
Like, the absolute irreverence that they are
19:21
bringing to this conversation, I find, like,
19:23
such a refreshing change of pace, you
19:25
know, so much of the discussion, and
19:27
certainly I participate in this, is in
19:30
terms of, like, the rights involved, the
19:32
equities, like, speech versus security, what are
19:34
the national security implications? And Americans are
19:37
truly just rolling their eyes out of
19:39
the back of their heads with this
19:41
discussion, and they're saying, we'll just flock to another
19:43
one and I just think it's amazing.
19:46
Totally. I mean, it's also very interesting
19:48
to me that the response from American
19:50
TikTok users is not the response that
19:53
TikTok had hoped for, which was that Americans
19:55
get outraged that their favorite social media
19:57
app is disappearing and band together to
20:00
like storm the streets and try to
20:02
save TikTok and overturn this law. It's
20:04
like no we're just gonna like pack
20:06
up and move to another platform. I
20:09
think it really speaks to the fragility
20:11
of social media right now and the
20:13
fact that like these platforms are seen
20:16
as somewhat interchangeable and commoditized and so
20:18
like if one of them gets banned
20:20
by the government you just pack up
20:22
and like move over to another one.
20:25
Yeah, although of course there are several
20:27
American apps that folks could have chosen
20:29
to move over to and I think
20:32
it is extra funny that instead of
20:34
doing that, Americans were like, no, find
20:36
us something that looks exactly like TikTok,
20:39
but is even more Chinese. Even more
20:41
Chinese, and makes even fewer promises
20:43
about data privacy. Yeah, exactly. Now, you
20:45
know, some people might ask, doesn't PAFACA
20:48
ban Red Note as well? And my
20:50
understanding is that no, PAFACA primarily applies
20:52
to ByteDance and TikTok, but when
20:55
it comes to other apps that are
20:57
owned by a company in a country
20:59
that the United States considers a foreign
21:01
adversary, it is up to the president
21:04
to decide that it represents a national
21:06
security threat. And I imagine it's going
21:08
to be some time before Red Note
21:11
comes to be seen that way. Right,
21:13
so in addition to packing up and
21:15
moving over to Xiaohongshu, other TikTok
21:17
users are participating in the Chinese spy
21:20
meme? Have you seen this one? I
21:22
love this meme so much. Okay, explain
21:24
what's going on. So people are saying
21:27
goodbye to their Chinese spy. This is
21:29
another sort of irreverent American joke. Basically,
21:31
they're making fun of the fact that
21:34
members of Congress are constantly saying that
21:36
the Chinese government is using TikTok to
21:38
spy on Americans. And the Americans are
21:40
now just making videos saying, hey, I'm
21:43
so glad we got to spend this
21:45
time together, my Chinese spy. I will
21:47
always remember you. And then you also
21:50
have Chinese people, you know, and these
21:52
could be, you know, Chinese people from
21:54
all over the world, but they're making
21:56
memes pretending to be the Chinese spy
21:59
saying, hey, I, you know, really loved
22:01
spying on you for all these years
22:03
and you know maybe call your mother
22:06
a little bit and you know this
22:08
message goes out to Laura from New
22:10
York, and that sort of thing. So,
22:13
yeah, saying goodbye to my Chinese spy
22:15
I do think is one of the
22:17
best TikTok memes of all time, coming
22:19
in hot right at the end. It's
22:22
so good. Yeah. So we've talked a
22:24
lot in the past about the free
22:26
speech implications of all this, about whether
22:29
this is sort of the first of
22:31
many conflicts between the US and China
22:33
over emerging platforms. But Casey, like, where
22:35
are you right now days away from
22:38
the likely end of TikTok as a
22:40
major presence in US social media? What
22:42
are you thinking about? I mean, my
22:45
feeling has been, and I've kind of
22:47
gone back and forth on this, but
22:49
where I netted out was, I do
22:51
think there are good reasons for the
22:54
United States to restrict foreign ownership of
22:56
these kinds of apps, particularly from its
22:58
adversaries like China. But I really hate
23:01
the way that they went about it,
23:03
and I worry about the implications for
23:05
other speech platforms in the future. People
23:08
are saying, well, this one is really
23:10
easy because of the Chinese control angle.
23:12
But I don't know, you know, if
23:14
this incoming administration decides it doesn't like
23:17
a lot of the content on an
23:19
American-owned app and says, you know,
23:21
what, we're actually just going to make
23:24
you change ownership the same way that
23:26
we did with ByteDance and TikTok.
23:28
And now the Supreme Court has essentially
23:30
rubber stamped that argument and said, yeah,
23:33
there are no speech concerns because as
23:35
long as you sell the app, all
23:37
that speech can remain. You can imagine
23:40
a lot of really dark outcomes for
23:42
that kind of thinking. So, you know,
23:44
I personally as an older American who
23:47
tried and failed to get addicted to
23:49
TikTok, I think it's an engine of culture that we
23:51
are going to miss out on in
23:53
this country. Those folks are going to
23:56
have to find a new home. It
23:58
really sucks for all of those creators.
24:00
And so, yeah, I think there's going
24:03
to be a lot of really sad
24:05
goodbyes from this. What do you think
24:07
happens? So I still think there's a
24:09
chance that Donald Trump decides to intervene
24:12
and try to save TikTok in the
24:14
United States. This week we got some
24:16
news that Shou Zi Chew, the chief executive
24:19
of TikTok, is going to be at
24:21
Trump's inauguration sitting with a bunch of
24:23
other VIPs and some people have interpreted
24:25
that as Trump saying he supports TikTok
24:28
and might try to save it. He
24:30
obviously made promises about saving TikTok during
24:32
his campaign. Obviously a lot's changed since
24:35
then, but I do think that he
24:37
understands that a lot of young Americans
24:39
care deeply about the fate of TikTok,
24:42
and then maybe he can build some
24:44
goodwill with those young Americans by stepping
24:46
in at the last minute to sort of
24:48
heroically save TikTok. Now there are some
24:51
different ways he could do that, he
24:53
could instruct the Justice Department in his
24:55
administration not to enforce the ban on
24:58
Tik. He could also try to arrange
25:00
some kind of deal potentially... selling Tiktok
25:02
to Elon Musk or someone else that
25:04
he trusts and sort of say that
25:07
is enough of a divestment for me
25:09
that satisfies the requirements of PAFACA. And
25:11
he is, after all, Kevin, the author
25:14
of The Art of the Deal. Exactly.
25:16
So I do think there's a chance
25:18
that Donald Trump sort of keeps TikTok
25:21
around in some form after all. But
25:23
I'm not sure about that. And I
25:25
think it's equally plausible that TikTok actually
25:27
does sort of go away and that
25:30
it becomes this... kind of free-for-all in
25:32
the social media world as different companies
25:34
race to hoover up the users who
25:37
had previously spent, you know, hours a
25:39
day on TikTok. Yeah, and when TikTok
25:41
was banned in India, we saw what
25:43
happened there, which was that YouTube shorts
25:46
and Instagram Reels, which were Google's and
25:48
Meta's answers to TikTok, exploded in popularity.
25:50
So one way that you should be
25:53
thinking about this is if this goes
25:55
into effect, this is truly one of
25:57
the greatest gifts for Google and Meta
25:59
that you can imagine. And that is just
26:02
really interesting given the strong bipartisan feeling
26:04
in Congress that Google and Meta specifically
26:06
ought to be reined in and actually
26:09
even broken up. This, I mean, when
26:11
you think about who is on TikTok,
26:13
it is the younger generation of Americans.
26:16
So if Meta and Google can now
26:18
go out and further entrench themselves into
26:20
the lives of Generation Z, they're going
26:22
to have essentially monopolies over those folks,
26:25
at least in terms of short-form video
26:27
consumption, for the foreseeable future. Yeah, I
26:29
think that's totally possible. I think there
26:32
are probably a lot of executives at
26:34
Meta who are licking their chops about
26:36
this, who are very excited about the
26:38
potential because, you know, their platforms, Facebook
26:41
and Instagram, are for millennials. And until
26:43
now, Gen Z has been the TikTok
26:45
generation. And if Meta can sort of...
26:48
you know, suck up those users, it
26:50
can sort of extend its dominance for
26:52
another generation. But I think that the
26:55
past week has made me less sure
26:57
that that's going to be the outcome
26:59
here, because what we have seen on
27:01
TikTok as this ban has approached is
27:04
not people saying, oh, everyone move over
27:06
to Instagram Reels. It's saying, let's move
27:08
over to this obscure Chinese language app
27:11
that no one's ever heard of. That's
27:13
how badly we don't want to be
27:15
on Instagram. I think part of the
27:17
Gen Z identity is about not just embracing
27:20
TikTok as a platform, but rejecting the
27:22
platforms that people older than you use.
27:24
And so I think it's equally plausible
27:27
that those younger users do not go
27:29
to Instagram Reels or YouTube shorts, that
27:31
they instead go to some new app
27:33
that may have most of the features
27:36
of TikTok, but is different in some
27:38
way. Maybe we're finally going to get
27:40
a new American social media app. You
27:43
know, I would love to believe everything
27:45
that you're saying, and I think that
27:47
it absolutely could come to pass, but
27:50
I also think it's true that most
27:52
members of Gen Z who have TikTok
27:54
on their phone probably have Instagram as
27:56
well, and it's just going to be...
27:59
really hard for them to avoid taking a
28:01
look at that as they look elsewhere
28:03
to get their fix. But at the
28:06
same time, we're also seeing sort of
28:08
separately from all of this a boom
28:10
in the Fediverse and people building on
28:12
protocols. And that is rooted in the
28:15
exact same frustration with these apps that
28:17
are controlled by billionaires and giant faceless
28:19
corporations. So I agree with you, there
28:22
is a lot of frustration among all
28:24
sorts of Americans on that point. And
28:26
so who knows, maybe we do get
28:29
an American-owned alternative to TikTok that is
28:31
not YouTube or Meta.
29:23
This podcast is
29:25
supported by Google Gemini. Imagine
29:28
an AI assistant that doesn't just spit out
29:30
answers, but that you can have a real
29:32
conversation with. Well
29:54
Casey for basically the entire time we
29:56
have been making this podcast we have
29:58
gotten emails from listeners who want us
30:00
to talk about the environmental impact of
30:02
AI. Yeah, this might be the question
30:04
that we have gotten the most that
30:07
we have not yet devoted a segment
30:09
to. Yeah, and I would say my
30:11
own reluctance to talk about this topic
30:13
on the show so far has been
30:15
some insecurity on my part about like
30:17
not being an expert in climate science
30:19
or the relevant information here, but also
30:21
just like it is very hard to
30:23
get good and authoritative data about this
30:26
subject in particular. It is just not
30:28
something that there is a large body
30:30
of reliable literature about. And the companies
30:32
that have the best data by and
30:34
large are not disclosing any of that
30:36
data. And so that means that a
30:38
lot of what we talk about when
30:41
we talk about the environmental impact of
30:43
AI is based on estimates that may
30:45
or may not be close to the
30:47
mark. Yeah, but I'm sure you have
30:49
observed as I have that the issues
30:51
around the environment and AI have only
30:53
gotten more important to people. This really
30:56
came to a head last week when
30:58
the wildfires started burning in Los Angeles.
31:00
I saw so many people posting
31:02
on social media about what they
31:04
viewed as a link between AI
31:06
use and the wildfires. And I'll
31:08
just read you one meme I
31:10
saw in my feed that was
31:12
liked and shared millions of times.
31:14
This was posted by a guy
31:16
named Matt Bernstein and I'll just
31:19
read it to you. It said,
31:21
one search on ChatGPT uses 10
31:23
times the amount of energy as a
31:25
Google search. Training one AI model
31:27
produces the same amount of carbon
31:29
dioxide as 300 round-trip flights between New
31:32
York and San Francisco and five times
31:34
the lifetime emissions of a
31:36
car. We don't need AI art.
31:38
We don't need AI grocery lists. We
31:40
don't need AI self-driving cars. We don't
31:43
need ChatGPT or Gemini or
31:45
Grok or DALL-E or whatever revolutionary
31:47
technology already exists inside our own human
31:49
brains. We need the earth. And
31:51
then below this meme was a
31:53
picture of a blazing fire. So clearly
31:55
this idea has taken root in culture
31:58
that there is some kind of
32:00
link between the disasters that we are
32:02
seeing in places like Los Angeles and
32:04
the use of AI for basic everyday
32:07
tasks. Yeah, and I think today we
32:09
want to see what we can find
32:11
out about how true some of the
32:13
ideas in that post are. Yeah, so
32:16
to shed some light on this very
32:18
hot topic of AI and energy use,
32:20
I realize that I just used light
32:22
and heat, that was not intentional. But
32:25
we're going to hope we shed more
32:27
light than heat in this discussion, Kevin.
32:29
Yes, today we are talking with Dr.
32:31
Sasha Luccioni. She is an AI researcher
32:34
and the climate lead at Hugging Face,
32:36
which is an AI company that offers
32:38
tools to developers for building AI models.
32:40
She has been researching and talking about
32:43
AI's environmental impact for many years. and
32:45
also developing tools to help developers understand
32:47
the impacts of their own systems on
32:50
the environment. Yes, and Kevin, this might
32:52
be a good time to dust off
32:54
my shiny new disclosure because when we
32:56
talk about AI issues, I will sometimes
32:59
remind people that my boyfriend is a
33:01
software engineer at an AI company called
33:03
Anthropic. My full ethics disclosure is at
33:05
platformer.news/ethics. And
33:08
my fast disclosure is that I work
33:10
at the New York Times Company, which
33:12
is suing OpenAI and Microsoft over
33:14
copyright violations. Perfect. All right, let's
33:17
bring in Sasha Luccioni. Sasha Luccioni, welcome
33:19
to Hard Fork. Thanks for having me.
33:21
So I'm very excited to have this
33:23
conversation. This is one we've been looking
33:26
forward to for a while and are
33:28
frankly overdue in having. And I want
33:30
to start by reading you an email
33:33
that we recently got from a listener.
33:35
This comes from a listener named T.
33:37
Morris. And it says the following. As
33:39
a tech content marketer, I feel increasingly
33:42
conflicted about using AI. On the one
33:44
hand, it's been an amazing writing partner
33:46
for big tasks like brainstorming and editing
33:48
tech articles and smaller copywriting tasks like
33:51
drafting social media posts. On the other
33:53
hand, I see climate disasters like the
33:55
North Carolina floods and LA fires linked
33:57
with the amount of water and natural
34:00
resources it takes to sustain AI infrastructure
34:02
and feel myself rationing my AI use,
34:04
questioning whether the time saved is worth
34:06
the environmental tradeoffs. How do I navigate
34:09
this new world where AI is everywhere
34:11
while staying true to my environmental values?
34:13
So, Sasha, we'll dive into some of
34:16
the specifics in just a minute, but
34:18
I want to just start with. this
34:20
question from our listener, what advice would
34:22
you give T Morris? I'm generally very
34:25
skeptical of like individual culpability when it
34:27
comes to the climate crisis. Like yes,
34:29
of course we all contribute, but I
34:31
think that we're all also part of
34:34
systems and we have professions that require
34:36
usage of technologies, you know, some people
34:38
drive for a living and you know,
34:40
we can't spend our time feeling bad.
34:43
I'm much more of a fan of
34:45
requiring accountability from
34:47
companies and requiring transparency, because I think
34:49
that especially around climate change, but also
34:52
a lot of aspects of society, we
34:54
just don't have the numbers to make
34:56
informed decisions, and that doesn't mean you
34:59
need to care, but you should have
35:01
the information necessary for caring. So I'm
35:03
more about like, ask for accountability, ask
35:05
for transparency when using these technologies instead
35:08
of like, kind of psyching yourself out
35:10
about them. Got it. So I thought
35:12
one way to sort of frame
35:14
this discussion would be to split it
35:17
into essentially two parts: the micro
35:19
and the macro. Micro being this question
35:21
of like what do we know about
35:23
the environmental impact of AI at the
35:26
level of the individual user the individual
35:28
question that you might ask to
35:30
ChatGPT or Gemini or Claude and getting
35:32
a response to that. And then macro
35:35
being this larger question of like what
35:37
is the AI sector's energy footprint more
35:39
broadly? What do we know about where
35:42
all the energy is coming from to
35:44
run these very powerful models? And what
35:46
can we do as sort of a
35:48
society and as big corporations to position
35:51
ourselves better for the future? And with
35:53
your permission, Sasha, I wanted to start
35:55
with the micro. So one of the
35:57
statistics that people will often throw out
36:00
when talking about the energy demands of
36:02
AI is this figure that a ChatGPT
36:04
query or something like it costs
36:06
somewhere in the neighborhood of 10 times
36:09
more energy than a traditional web query
36:11
on something like Google. Now, I asked
36:13
Google about this figure and they wouldn't
36:15
say exactly how much energy it takes
36:18
to query Gemini versus... to run a
36:20
traditional web search, but they did say
36:22
that those numbers are much larger than
36:25
what they've seen internally. But, Sasha, where
36:27
did that figure come from? And what
36:29
do we know about how accurate it
36:31
is? I think the initial Google search
36:34
query is actually pretty old and it
36:36
was part of a study to like
36:38
greening the web kind of type situation
36:40
and they made an estimate. And once
36:43
again, they didn't really have the numbers
36:45
but they tried to extrapolate and then
36:47
for ChatGPT it was a similar kind
36:49
of assuming that somebody is querying a
36:52
model that is running on this type
36:54
of hardware and assuming that the latency
36:56
is X and blah blah and they
36:58
kind of extrapolated that. There are other
37:01
models that do similar things so maybe
37:03
even if you don't know exactly ChatGPT's,
37:05
you know inherently you have other models
37:08
that will do similar tasks and so
37:10
you can get a range and I
37:12
think that that range is more interesting
37:14
than trying to like to to chase
37:17
down the exact number and compare the
37:19
two and also it's probably not a
37:21
single number anyway and so that's why
37:23
it's so hard to like pin down
37:26
this number and that's why it's going
37:28
to be always possible for them to
37:30
say oh no that's not the number
37:32
that's not the exact number. Right, and
37:35
I think that your point earlier that
37:37
one of the things that we need
37:39
on this subject is just a lot
37:41
more transparency is really well taken. I
37:44
know that Google has folks who work
37:46
on climate issues, but I'm curious as
37:48
like you look across the industry, maybe
37:51
it's some of the newer, smaller AI
37:53
labs or just, you know, I don't
37:55
know, companies other than Google. Do you
37:57
get the sense that people are paying
38:00
attention to this, that they are taking
38:02
these sort of measurements, that they even
38:04
have a sense of like the query
38:06
energy usage of one of their products.
38:09
Definitely, because unlike Google or Microsoft or
38:11
any of the big tech companies, usually
38:13
smaller companies are a lot more compute-constrained.
38:15
So they're doing more with less because
38:18
they have to. They don't always come
38:20
at it from a sustainability perspective. They're
38:22
not like, oh yeah, we want to
38:24
protect the planet, but there is a
38:27
part of that. It's like frugality. It's
38:29
like we want to be more efficient
38:31
because we only have 100 GPUs to
38:34
work with. Right. The incentives for all of these
38:36
companies are to get the amount of
38:38
compute and energy that they are using
38:40
over time down as quickly as they
38:43
can. Another claim that you often hear
38:45
from people who are worried about the
38:47
environmental impacts of using AI on a
38:49
micro or personal level is about water
38:52
use. There's this statistic I'm sure you've
38:54
seen it around, that using an LLM
38:56
is like pouring out a bottle of
38:58
water or half a liter of water
39:01
I've seen going around. Where did that
39:03
figure come from? And why do these
39:05
AI models need water? And is that
39:07
statistic true? So that paper is kind
39:10
of, once again, an extrapolation. It takes
39:12
some of the work that I did
39:14
about an open source model where we
39:17
measured how many kilowatt hours of energy
39:19
were being used to query it. And
39:21
essentially what happens in data centers is
39:23
that they have an amount, like a
39:26
liter of water per kilowatt hour of
39:28
energy. I mean, it's like a water
39:30
efficiency; they call it water use efficiency.
39:32
And essentially, depending on where you're doing this...
39:35
This hardware heats up, I don't know
39:37
if you've ever visited a data center,
39:39
if you can, I highly recommend it.
39:41
It is like an overwhelming experience, the
39:44
noise, the heat, and just like the
39:46
general like buzz of electricity is pretty
39:48
overwhelming. Anyway, so you need a lot
39:50
of cooling and essentially how that's usually
39:53
done is with water cooling, like you
39:55
pump in cool water and there's a
39:57
bunch of pipes and it goes through
40:00
all of the hardware and then it
40:02
either a part of it evaporates completely,
40:04
or it goes back into nature or whatnot. And so
40:06
that whole process is hugely water consumptive.
40:09
And of course, it's not like, not
40:11
all the water evaporates, but a fair
40:13
amount of it does just because the
40:15
hardware heats up so much. But once
40:18
again, it's, so I go back and
40:20
forth on this a lot, like whether
40:22
putting out statistics like this on, that
40:24
are based on estimates or not, is,
40:27
I guess, useful for the conversation, because
40:29
on one hand, it's really easy for
40:31
the company. It's really easy for the
40:33
company. the true number, which then it
40:36
kind of cuts the conversation. And on
40:38
the other hand, they do become like
40:40
urban legend. And so now I hear
40:43
this 500 milliliter per conversation number a
40:45
lot. And it's like, well, actually, it'll
40:47
depend on so many different things. So
40:49
it's definitely not systematically 500 milliliters, but
40:52
it is a non-negligible amount of water.
40:54
And depending on where the data center
40:56
is located, that can become an issue.
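The back-of-envelope water math described here, a data center's water use efficiency (WUE) in liters per kilowatt-hour multiplied by the energy a query draws, can be sketched as follows. Both input numbers are illustrative assumptions, not measured figures for any real model or facility:

```python
# Sketch of the water-per-query estimate described above.
# Both inputs are hypothetical: real per-query energy and
# water use efficiency (WUE) vary by model, hardware, and
# data center location, and vendors rarely disclose them.

def water_per_query_ml(energy_kwh: float, wue_l_per_kwh: float) -> float:
    """Estimated milliliters of water consumed by one query."""
    liters = energy_kwh * wue_l_per_kwh
    return liters * 1000.0  # liters -> milliliters

# Assumed values: 3 Wh (0.003 kWh) per query, WUE of 1.8 L/kWh.
print(f"~{water_per_query_ml(0.003, 1.8):.1f} mL per query")  # -> ~5.4 mL per query
```

Under these assumptions a single query consumes a few milliliters; the oft-quoted half-liter figure would correspond to a long conversation or a far less efficient deployment, which is why Luccioni stresses ranges over single numbers.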
40:58
So we've seen places where the data
41:01
centers have put a strain on like
41:03
the towns around them that have water
41:05
shortages because the water is being pumped
41:07
into a new data center that has
41:10
been, you know, powered up. So when
41:12
you, and you know, you said earlier,
41:14
understandably, that you're not a huge fan
41:16
of thinking about these issues at the
41:19
individual level, I'm still curious when you
41:21
are considering your own personal use of
41:23
AI where water usage fits into things.
41:26
Like, is that for you a reason
41:28
to send fewer queries to ChatGPT
41:30
or an equivalent? I'm in general such
41:32
a, like, I don't use AI that...
41:35
I mean, generative AI that often. The
41:37
one use case that I found was
41:39
really kind of something that actually is
41:41
useful in my life is when I
41:44
read an article or a research paper,
41:46
like putting in the abstract and getting
41:48
a fun title. Like I'm so bad
41:50
at generating fun titles, but ChatGPT is
41:53
really good and you know, it can
41:55
come up with like puns and stuff
41:57
like that. But what really kills me
41:59
is like people who switched to generative
42:02
AI for things that don't really need
42:04
it. Like, my pet peeve
42:06
example is calculators. People use ChatGPT as
42:09
a calculator now, and that's really like,
42:11
it's really terrible. Like you really don't
42:13
need it. Not only is it bad
42:15
at arithmetic, like it's literally not made
42:18
to do math, but it's also like
42:20
orders of magnitude more energy and a
42:22
crazy amount of water for something that
42:24
you know doesn't need water. Well I
42:27
have to say I'm gonna you know
42:29
I'm going to admit something which is
42:31
I've talked before on the show about
42:33
how I use this app called Raycast
42:36
which is plugged in
42:38
to OpenAI's
42:40
model and I can just summon it
42:42
on my keyboard with command space and
42:45
I do probably ask it four or
42:47
five questions a day and I am
42:49
definitely using it for "how old is Billy Crystal"
42:52
or whatever. And does it, do you
42:54
check the answers? And they're all accurate?
42:56
It's, it's not that I check the
42:58
answers, it's that I don't really care
43:01
that much. So when, when the LLM
43:03
says, you know, that Billy Crystal is,
43:05
you know, I don't know, 70 or
43:07
whatever he is, I'm like, yeah, that's
43:10
the right ballpark. So let me, you
43:12
know, embody
43:16
the other side of this because what
43:19
I'm hearing from you is like, generative
43:21
AI is not useful enough in many
43:23
cases to justify the energy costs of
43:25
engaging with an LLM. And I'm a
43:28
person who uses AI every day. I
43:30
generally find it quite useful in my
43:32
life. I use it to accomplish a
43:35
lot of tasks that I could not
43:37
use equivalent tools for. I don't just
43:39
run like how old is Billy Crystal
43:41
searches over and over again. To be
43:44
clear, I only ran it once. And
43:46
I would say that my own usage
43:48
of this is to do new things
43:50
that I couldn't do before, mostly. And
43:53
I think if people don't find generative
43:55
AI useful, they shouldn't use it. But
43:57
if people do find it useful but
43:59
are worried about the environmental costs, I'm
44:02
just not entirely convinced that we're thinking
44:04
about the costs of AI in the
44:06
sense of energy at the right scale.
44:08
I recently read a Substack post by a guy
44:11
named Andy Masley where he basically broke
44:13
down the best data and estimates we
44:15
have about the environmental cost of using
44:18
AI and he compared it to some
44:20
other activities like sending emails or streaming
44:22
a video on Netflix or driving a
44:24
car a very short distance and basically
44:27
what he found is that compared to
44:29
all these other activities the energy required
44:31
to generate an answer on
44:33
chat GPT or a similar system is
44:36
just infinitesimally small that if we are
44:38
worried about our own personal environmental footprint
44:40
we could do much more to help
44:42
the environment by cutting out meat from
44:45
our diets or by taking fewer trips
44:47
in cars or on airplanes and basically
44:49
the argument that he made that I
44:51
am tempted by is that all of
44:54
this sort of talk about personal responsibility
44:56
is just neglecting to look at AI
44:58
use in the context of all the
45:01
other things that we do in our
45:03
lives that require energy. And I'm wondering,
45:05
Sasha, what you make of that argument.
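The kind of scale comparison described in Masley's post can be sketched like this. Every watt-hour figure below is a rough, hypothetical estimate chosen for illustration, not a number taken from his post:

```python
# Toy comparison of rough per-activity energy estimates, in
# watt-hours. All values are illustrative assumptions; the
# point is relative scale, not precision.

activities_wh = {
    "chatbot query": 3,                # assumed ~0.003 kWh
    "hour of video streaming": 80,     # assumed
    "mile driven in a gas car": 1300,  # assumed fuel-energy equivalent
}

baseline = activities_wh["chatbot query"]
for name, wh in activities_wh.items():
    print(f"{name}: {wh} Wh (~{wh / baseline:.0f}x a query)")
```

On assumptions like these, a single query is two to three orders of magnitude below everyday transport choices, which is the crux of the argument Kevin describes.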
45:07
I mean, it kind of builds upon
45:10
what I said at the beginning, but
45:12
in general, when you talk to people
45:14
around the issue of climate change and
45:16
mitigation, it's like we're bound by the
45:19
structures in which we... operate and live
45:21
and you know the constraints that we
45:23
have so of course I'm not going
45:25
to be like oh yeah don't take
45:28
that plane to take a well-deserved vacation
45:30
and you know spend your time worrying
45:32
about climate change because that's not a
45:34
productive you know state of mind but
45:37
on the other hand we can make
45:39
decisions with the environment in our minds.
45:41
So for example, like nowadays, a lot
45:44
of people have ChatGPT open as
45:46
the de facto source of information on
45:48
the internet. And I do think that,
45:50
yes, of course, little by little that
45:53
individual like energy consumption of each query
45:55
is not that much, but if we
45:57
start using it as literally like a
45:59
rubber duck and our sounding board and
46:02
our companion, and then people also will
46:04
use ChatGPT to build tools, right?
46:06
Nowadays, people are building like AI therapists
46:08
and whatnot, companions, using it, and then like
46:11
that incrementally, incrementally becomes a big deal. Personally,
46:13
I try to focus on like a specific
46:15
task you want to do, for example,
46:17
searching the internet or answering a question,
46:20
and then comparing like what you would
46:22
use like option A and option B,
46:24
and then what's the difference, and then
46:27
it's up to you to decide whether
46:29
that difference is worth it based on
46:31
the advantage that the technology gives you.
46:33
But I don't think it makes sense
46:36
to compare like meat and email or
46:38
Netflix and taking your car because I
46:40
feel like they're like incomparable actions. Right,
46:42
like people aren't choosing between like, well,
46:45
should I drive to work today or
46:47
should I ask ChatGPT? Yeah, exactly.
46:49
So I feel that like I understand
46:51
where he's coming from in his argument,
46:54
but I don't feel that that helps
46:56
us make choices any better. It kind
46:58
of makes us feel bad all around.
47:00
So, Sasha, can I try to sort
47:03
of summarize what I'm hearing from you
47:05
on the point of individual use, the
47:07
sort of micro question about the environmental
47:10
impact of AI. What I'm hearing you
47:12
say, I believe, is that the individual
47:14
costs of using LLMs may not move
47:16
the needle on climate one way or
47:19
the other, but that people should be
47:21
conscious of what they are using AI
47:23
for, and maybe use the smallest model
47:25
that will allow them to get the
47:28
task done that they are looking to
47:30
do, and that maybe we shouldn't be
47:32
tearing our hair out over people using
47:34
ChatGPT, if they're using it to do
47:37
stuff that is genuinely useful to them.
47:39
Is that an accurate reflection of your
47:41
sentiment? It's a great reflection, and I
47:43
think that often we forget our
47:46
power as consumers and users of technology.
47:48
And I think that putting pressure on
47:50
companies and being like, hey, we care,
47:53
we want this number, stop like bullshitting
47:55
us like you have the number somewhere
47:57
of the average energy, you know, even
47:59
if it's not a single number, if
48:02
it's a range, give us the range,
48:04
and then we'll make our informed decisions
48:06
because like people are more and more
48:08
aware of like relative comparisons, like, you
48:11
know, a mile driven in a car
48:13
or like a steak or like. make
48:15
informed decisions. We should stop just like
48:17
feeding them shit and keeping them in
48:20
the dark. Right. Okay. So that is
48:22
the sort of micro picture of the
48:24
AI energy story. Now let's talk about
48:26
the macro. There was recently a report
48:29
just last month from the Lawrence Berkeley
48:31
National Laboratory about the power that is
48:33
currently needed to run the data centers
48:36
in this country and the power that
48:38
will soon be needed as the... AI
48:40
boom sparks demand for more and more
48:42
of these data centers. This report said
48:45
that between 2018 and 2023, the power
48:47
to run data centers around the US
48:49
went from 1.9% of total annual electricity
48:51
consumption to 4.4%, more than double. And
48:54
this report estimated that the energy demands
48:56
of data centers of which AI is
48:58
a major part will continue to increase
49:00
over the next few years and could
49:03
by 2028 make up between 6.7% and 12%
49:05
of total US electricity consumption. So, Sasha,
49:07
just, let's zoom out a little bit
49:09
and talk about the energy needs of
49:12
the AI industry as a whole. Where
49:14
are we? Do these companies know where
49:16
they are going to get all this
49:19
energy to build these incredibly powerful AI
49:21
models? Yes and no. I mean, we
49:23
currently have a certain infrastructure, but the
49:25
problem is the growth of the
49:28
infrastructure is kicking into high gear. And
49:30
so what's interesting is that the big
49:32
tech companies are the largest purchasers of
49:34
renewable energy credits, which are kind of
49:37
like offsets for energy, and also they
49:39
make a lot of power purchase agreements,
49:41
which are essentially ways of kind of
49:43
promising to buy energy, especially renewable energy
49:46
into the future. So they've been kind
49:48
of... I'll give them that, that they've
49:50
been actually on top of things. But
49:52
then this year, I mean, this past
49:55
year, both Google and Microsoft actually put
49:57
out reports saying that they're not meeting
49:59
their own sustainability targets, like they drop
50:02
the ball on their own like energy
50:04
and carbon goals because of AI, because
50:06
they were not ready themselves for the
50:08
amount of energy that they would need
50:11
and where that energy is coming from
50:13
like the renewable energy offsetting things weren't
50:15
covering it. And so I think that the
50:17
latest and greatest in the trends in
50:20
terms of energy generation has been nuclear.
50:22
All the big tech companies have signed
50:24
nuclear agreements like power purchase agreements in
50:26
the last couple of months and the
50:29
general messaging is that that's going to
50:31
solve the issue in terms of energy
50:33
like demand growth. And when Microsoft and
50:35
Google said, like, hey, you know, we're
50:38
not going to make our targets, was
50:40
it nuclear that they were pointing to?
50:42
Like, did they say, like, don't worry,
50:45
we're going to fix this, like, we
50:47
have a new strategy? Or did they
50:49
say, like, we might
50:54
just never hit these targets because our
50:56
values have changed. But no, nuclear actually
50:58
entered the chat relatively recently. I think
51:00
that the reports came out in around
51:03
May of last year of 2024 and
51:05
then like a couple of months later
51:07
it was Microsoft announced that they are
51:09
recommissioning Three Mile Island, Google signed a
51:12
partnership with, I don't remember which nuclear
51:14
generator, and they're also... Kairos, yeah,
51:16
exactly. And so they're saying that well
51:18
this is the new direction we're going
51:21
because the thing is I mean sadly
51:23
like building out renewable energy infrastructure does
51:25
take time. And also the problem with
51:28
data centers in renewable energy is that
51:30
data centers need energy 24-7 and the
51:32
cycles aren't necessarily as predictable as like
51:34
heating and cooling, for example. You know,
51:37
when the, you know, when the temperature
51:39
drops, people will turn on their, their
51:41
heating systems, like you have these models
51:43
that have worked pretty well historically, but
51:46
with data centers, they don't work, and
51:48
renewable energy tends to vary, you know,
51:50
if there's wind, if there's sun, and
51:52
so there are a lot of challenges,
51:55
you can't just lay out a
51:57
bunch of solar panels and expect them
51:59
to respond to the
52:01
demand of your data centers. So
52:04
my understanding Sasha is that a lot
52:06
of the big AI companies are now
52:08
just sort of racing to get as
52:11
much energy capacity as they can and
52:13
that one of the worries is that
52:15
they are sort of tapping out the
52:17
infrastructure for clean or renewable energy and
52:20
so they are starting to go into
52:22
these dirtier forms of energy that we
52:24
know have these harmful environmental costs because
52:26
there just isn't enough renewable energy and
52:29
adding more takes time as you said.
52:31
Yeah and also the thing is with
52:33
data centers is that like they're a
52:35
very concentrated, very intense energy sink. So
52:38
making that connection, like I was talking
52:40
to some energy grid operators in Paris,
52:42
and they're saying, like, even if we
52:44
did have the capacity, like the actual
52:47
megawatt hours, distributing it in a way
52:49
that all of that extra capacity goes
52:51
towards the data center in whatever, like,
52:54
rural area they build it in, is
52:56
a challenge in itself. Let me try another argument on you. I was
52:58
talking with someone the other day who
53:00
works at an AI company and one
53:03
of the arguments that they made for
53:05
why we shouldn't worry so much about
53:07
the energy costs associated with AI is
53:09
that basically our electrical grid in America
53:12
has been in desperate need of modernization,
53:14
that we have this sort of creaky
53:16
old electrical grid that we are, that
53:18
has not been growing nearly as quickly
53:21
as it needs to, and that basically
53:23
because AI now exists and demands all
53:25
of this energy, we are starting to
53:28
do things that we probably should have
53:30
done a long time ago as far
53:32
as investing in new sources of energy,
53:34
in these mini-nuclear reactors, in trying to
53:37
scale up things like solar and wind
53:39
power. And so, yes, these models are
53:41
demanding a lot of energy, but they
53:43
are sort of forcing us to modernize
53:46
our infrastructure and our energy grid in
53:48
ways that will benefit us as a
53:50
country down the line. What do you
53:52
make of that argument? Well, so what's
53:55
interesting about the United States particularly is
53:57
that it's not a single energy grid.
53:59
There's a lot of energy providers in
54:01
the states. There's a really nifty website
54:04
called Electricity Map, and they map out
54:06
electricity. And what's interesting, when you zoom
54:08
in on the US, it's like a
54:11
patchwork. There's some states that have like...
54:13
12 different grids and then there's some
54:15
actually like multiple states have a single
54:17
grid. What's interesting is that I mean
54:20
for example Canada is one per province
54:22
in Europe it might be one per
54:24
country like France has a single one
54:26
and so, yes, they're probably right
54:29
to an extent but modernizing the US
54:31
energy like system network of grids is
54:33
actually really difficult because it's so heterogeneous
54:35
and because you know even if you
54:38
update one part of the grid that
54:40
doesn't mean like smaller energy grids don't
54:42
have that much capacity. and the bigger
54:44
ones will take time to update. So
54:47
I think it's like, yes, in theory,
54:49
it would be good to overhaul the
54:51
U.S. energy grid, but in practice, it's
54:54
a lot of small problems that are
54:56
harder to solve. One other thing I've
54:58
heard from people who work in the
55:00
AI industry or are not as worried
55:03
about the environmental impact of AI is
55:05
that, yes, this stuff costs energy, yes,
55:07
we need to find new sources of
55:09
energy, but ultimately AI is going to
55:12
be more of a help in addressing
55:14
the climate crisis than it will hurt.
55:16
What do you make of that argument?
55:18
Is that just self-serving? I don't think
55:21
it's self-serving, but I think it's kind
55:23
of a false dichotomy because the AI
55:25
systems that are the most energy intensive,
55:27
like large language models, are the ones
55:30
that have yet to prove their utility
55:32
in fighting climate change. Like, I think
55:34
that the issue here is that we're
55:37
using these big models for tasks that
55:39
are not helping the fight against climate
55:41
change, and compared to that, the models
55:43
that are helping climate change aren't the
55:46
ones that are the issue. And so
55:48
it's like the problem with AI being
55:50
an umbrella term kind of
55:52
makes it very, very hard to have
55:55
this discussion, but it's like essentially large
55:57
language models are not solving the climate
55:59
crisis anytime soon, and the models that
56:01
are helping are not the ones that
56:04
are contributing like most of the energy
56:06
and carbon issues that we're seeing. One
56:08
more argument that I want to have
56:10
you address, which is about the efficiency
56:13
of AI over time. We've heard from
56:15
companies that they are making their models
56:17
much more efficient because they're creating these
56:20
algorithmic breakthroughs, doing things like model distillation,
56:22
the chips themselves are also becoming much
56:24
more energy efficient. And so there's this
56:26
argument that you'll hear from folks in
56:29
the industry that actually we're running
56:31
on outdated information when we say that
56:33
AI is a risk to the climate
56:35
because the energy needs are scaling down
56:38
over time per use, and that actually
56:40
we're just worried because our information isn't
56:42
up to date. So do you think
56:44
about efficiency in those terms or how
56:47
should we think about that? So efficiency
56:49
is interesting because I think that like
56:51
a lot of what people talk about
56:53
when they talk about like technological progress
56:56
is some form of efficiency. It's like,
56:58
oh, we're using less time, we're using
57:00
less, I don't know, fuel, we're using
57:03
less energy, for example. And I think
57:05
in AI, we are seeing this,
57:07
but what's interesting, I've been
57:09
really going down the rabbit
57:12
hole in terms of like macroeconomic literature
57:14
on this, like, there's this really interesting
57:16
paradox, it's called Jevons paradox. This
57:18
kind of phenomenon
57:21
has been observed a lot with different
57:23
kinds of efficiency gains, whether it be
57:25
time, whether it be, you know, for
57:27
example, cars, like now that we can
57:30
drive farther on the same amount of
57:32
fuel, we'll actually go to more places.
57:34
And so I think what we're seeing
57:36
a lot in AI is this kind
57:39
of rebound effect that, yeah, we can
57:41
do more AI for the same, you
57:43
know, amount of compute or money, but that
57:46
means we're gonna do even more. We're
57:48
gonna put AI into even more things
57:50
and so those efficiency gains are kind
57:52
of lost, because now we're using LLMs
57:55
for things that we didn't use LLMs
57:57
for before. Casey, do you want to
57:59
try repeating back what we've heard about
58:01
the macro picture when it comes to
58:04
AI and energy? Well, the macro picture
58:06
of AI and energy is that the
58:08
construction of data centers does actually put
58:10
a strain on the grid. We're seeing
58:13
many more of them and that even
58:15
as individual usage of AI gets more
58:17
efficient, it seems likely that we'll just
58:19
use a lot more of it. And
58:22
so this is one that it seems
58:24
like we do have to watch and
58:26
take the environmental claim seriously. That's what
58:29
I feel like I heard. Does that
58:31
sound right? Yes, it does. I think
58:33
you summed it up really well. Got
58:35
it. And I think what we can
58:38
agree on whether or not we think
58:40
that the individual or the macro use
58:42
of AI across the economy is dangerous
58:44
for the environment is that... I think
58:47
AI companies should be required to disclose
58:49
a lot more data about the energy
58:51
use of their models. It just seems
58:53
like the data we have: a lot
58:56
of it is based on estimates from
58:58
the outside, a lot of it is
59:00
outdated, a lot of it has sort
59:02
of gone through this game of telephone,
59:05
where all of a sudden, every time
59:07
people use ChatGPT, they think they're
59:09
like burning down a forest. And it
59:12
seems like this could all be solved
59:14
by just having much better and more
59:16
transparent data from the AI companies themselves
59:18
about how much energy they're using. Agreed
59:21
and giving users more agency when it
59:23
comes to generative AI, and even having
59:25
a toggle when it comes to, whatever,
59:27
AI-generated summaries in Google, just like
59:30
giving people a little bit more control
59:32
over how they use it. Like we don't
59:34
want to stop using Google, or most
59:36
people don't, so like let us use
59:39
Google in a way that is coherent
59:41
with our values or the things that
59:43
we want to optimize for. Well, Sasha,
59:45
thank you so much for enlightening us
59:48
on this subject. It's one I imagine
59:50
we will return to because I don't
59:52
think this debate is going away any
59:55
time soon, but I really appreciate your
59:57
expertise and your time. Thank you for
59:59
the great questions. When we come back,
1:00:01
put on your... gold chains, insert your
1:00:04
Zyns, and let's do some jiu-jitsu. We're
1:00:06
talking about masculinity in the tech industry.
1:00:29
This podcast is supported by Google Gemini.
1:00:31
For anyone new to Gemini, it's an
1:00:34
AI assistant you can have real conversations
1:00:36
with. Whether you want to brainstorm something,
1:00:38
prep for a presentation or interview, or
1:00:41
just learn something new, Gemini can help
1:00:43
you do it smarter and faster. And
1:00:45
by the way, this script was actually
1:00:47
read by Gemini. You can download the
1:00:50
Gemini app for free on iOS or
1:00:52
Android. Must be 18 plus to use
1:00:54
Gemini Live. Hey finance folks! You're under
1:00:57
a lot of pressure to save money.
1:00:59
Brex knows you want to drive growth,
1:01:01
change the game, and win. So that's
1:01:03
exactly what Brex will help you do.
1:01:06
Brex offers the world's smartest corporate card,
1:01:08
banking, expense management, and travel, all on
1:01:10
one AI-powered platform. See why 30,000 companies
1:01:13
across every stage of growth use Brex
1:01:15
at brex.com/Grow. I did, and I assume
1:01:17
that that's why we're sitting here in
1:01:19
our oversized baggy t-shirts and our gold
1:01:22
chains. That's right. Yes, thank you for
1:01:24
agreeing to this costume change. Listeners should
1:01:26
know that we are wearing very
1:01:28
boxy black t-shirts right now and gold
1:01:31
chains to try to get us into
1:01:33
the mindset of what I'm hoping we
1:01:35
can do today. Yeah, should we pop
1:01:38
our Zyns in? If you have a Zyn
1:01:40
waiting on you, go for it. I've
1:01:42
got, I've got upper deckies here.
1:01:44
Let's not pretend. Let me teach you
1:01:47
something about straight culture. Please do. You're
1:01:49
always enlightening me about gay culture. Upper
1:01:51
deckies are when you put a Zyn
1:01:54
nicotine pouch in your upper lip. That
1:01:56
is perfect, Kevin. That is exactly the
1:01:58
right spirit that I want to take
1:02:00
into this segment. So you watch the
1:02:03
Mark Zuckerberg interview on Joe Rogan. I
1:02:05
did. And what did you think? I
1:02:07
thought it was very long. That was
1:02:09
my main thing, was this meeting could
1:02:12
have been an email. Well, I think
1:02:14
that's a fair point, Kevin, but to
1:02:16
me, I was so pleased to hear
1:02:19
it because finally someone in Silicon Valley
1:02:21
was willing to say what we've all
1:02:23
been thinking for years now, which is
1:02:25
that this town does not have enough
1:02:28
masculine energy. You know what I mean?
1:02:30
Kevin, sometimes I will visit a company
1:02:32
in Silicon Valley and see as many
1:02:35
as one female executive. And finally, people
1:02:37
like Mark Zuckerberg are starting to ask,
1:02:39
when did things get this out of
1:02:41
control? And I know you thought the
1:02:44
same thing. You've said that to me
1:02:46
off-mic. I don't think I have, but
1:02:48
go on. Now, some people get confused
1:02:51
because the most recent time that Meta
1:02:53
shared numbers, it had about two men
1:02:55
at the company for every one woman.
1:02:57
But this just highlights how powerful feminine
1:03:00
energy is, Kevin. What Zuckerberg is saying
1:03:02
is that to counteract the presence of
1:03:04
even one woman at Meta, at least
1:03:06
three men are needed to restore balance.
1:03:09
Now, just to give listeners a bit
1:03:11
more of a sense of what we're
1:03:13
talking about, I think we should play
1:03:16
Mark Zuckerberg talking about masculine energy on
1:03:18
the Joe Rogan experience. Let's do it.
1:03:20
I just think we kind of swung
1:03:22
culturally to that part of the kind
1:03:25
of... the spectrum where it's all like,
1:03:27
okay, masculinity is toxic. We have to
1:03:29
get rid of it completely. It's like,
1:03:32
no, like, it's, both of these things
1:03:34
are good, right? It's like, you want
1:03:36
feminine energy, you want masculine energy. Like,
1:03:38
I think that that's, like, you're gonna
1:03:41
have parts of society that have more
1:03:43
of one or the other. I think
1:03:45
that that's all good, but I do
1:03:48
think the corporate culture sort of had
1:03:50
swung towards being this somewhat more neutered
1:03:52
thing. And I didn't really feel that
1:03:54
until I got involved in martial arts,
1:03:57
which I think is still a much
1:03:59
more masculine culture. There is something about
1:04:01
being punched in the face that makes
1:04:03
you think, my culture has been neutered.
1:04:06
You know what I mean? So, yes,
1:04:08
I did hear this part of the
1:04:10
interview. This went viral. Everyone on
1:04:13
my feeds has been talking about
1:04:15
these comments that Mark Zuckerberg made
1:04:17
about masculine energy being missing from many
1:04:19
of our greatest corporations and this is
1:04:22
sort of in the context of all
1:04:24
the moves that he's been making to
1:04:26
try to make meta more palatable to
1:04:29
people on the right including the incoming
1:04:31
Trump administration and this was sort of
1:04:33
him saying to Joe Rogan, in a
1:04:35
way that people mercilessly mocked, that the real
1:04:38
problem in corporate America is that we've
1:04:40
been letting this feminine energy take over
1:04:42
and we need to kind of assert
1:04:45
masculine energy and that's our path back
1:04:47
to greatness. Exactly, Kevin. And so, as
1:04:49
we so often try to do on
1:04:51
this show I've spent all week thinking
1:04:54
how can we be part of the
1:04:56
solution here? And so I have come
1:04:58
up with a list of ideas that
1:05:00
we can bring to the Meta corporation
1:05:03
to help them restore masculine energy to
1:05:05
Meta. Oh boy. We're going to give
1:05:07
Meta a masculine makeover, and I would
1:05:10
love to share some of the ideas
1:05:12
that I have with you right now.
1:05:14
Number one, modify the Facebook like button
1:05:16
to display a bulging vein reflecting long
1:05:19
hours spent in the gym. What do
1:05:21
you think? I like it. Here's something
1:05:23
else: whenever you tap it, your phone
1:05:26
grunts. Number two, let's just say the
1:05:28
poke is going to work a little
1:05:30
differently now, but I can't say how
1:05:32
on this podcast. Number three, transform every
1:05:35
conference room at Meta into an octagon.
1:05:37
Kevin, remind workers at every meeting that
1:05:39
work is a combat zone and Mark
1:05:42
Zuckerberg can strike at any time. I
1:05:44
like this. We're also changing the name
1:05:46
of the finance department to MMA, mixed
1:05:48
martial accounting. Number four, Meta acquires 4chan.
1:05:51
It's the largest repository of disturbed 17-year-olds
1:05:53
in the world, Kevin, and they could
1:05:55
be part of the solution, too. Now,
1:05:57
the obvious thing to do would be
1:06:00
to let them run the human resources
1:06:02
department. But I'm proposing that Meta goes
1:06:04
further and puts them in charge of
1:06:07
content moderation. That would be some masculine energy. It
1:06:09
sure would. Number five: no more of
1:06:11
these beta team building activities like making
1:06:13
pottery and volunteering, Kevin. Instead, we're going
1:06:16
on a wild boar hunt. Yes. As
1:06:18
Mark shared on the Joe Rogan experience,
1:06:20
one of the greatest challenges in his
1:06:23
life is that his ranch in Kauai
1:06:25
is absolutely beset by an invasive species
1:06:27
of wild boars. And for years now,
1:06:29
Zuckerberg has been spending his downtime hunting
1:06:32
them with bow and arrows. In fact,
1:06:34
do we have a clip of that?
1:06:36
Well, though my favorite is bow, bow
1:06:39
and arrow. I think like the most.
1:06:41
That feels like the most kind of
1:06:43
sporting version of it. Yeah, if you
1:06:45
want to put it that way. Yeah,
1:06:48
I mean, you're just trying to get
1:06:50
meat. It's not the most effective. The
1:06:52
most effective is certainly a rifle. If
1:06:54
you work at Meta, I think this
1:06:57
should be your problem too. Whether you
1:06:59
want to use a bow and arrow
1:07:01
or a rifle, report to the Zuckerberg
1:07:04
ranch for further instructions. Now, do we
1:07:06
know what happens if you are a
1:07:08
Meta employee and you actually bring a
1:07:10
bow, a hunting bow, into the office?
1:07:13
Number six, replace the water in Meta's
1:07:15
data centers with Mountain Dew Code Red.
1:07:17
Oh, I like this one. Me too.
1:07:20
Number seven, in the 2019 film Joker,
1:07:22
Kevin, Joaquin Phoenix's character does a famous
1:07:24
dance down a set of stairs to
1:07:26
signify that he is fully transformed into
1:07:29
the Joker. My proposal, we bring those
1:07:31
steps to the Meta campus in Menlo
1:07:33
Park. You have a meeting with Mark
1:07:35
Zuckerberg? Guess what, Kevin, you have to
1:07:38
walk up the Joker steps. I like
1:07:40
that Mark is the Joker now. Number
1:07:42
eight. In what many people perceived as
1:07:45
a cruel and pointless attack on trans
1:07:47
people, Meta instructed managers to remove tampons
1:07:49
from the men's restrooms at its campuses.
1:07:51
But this is a half-measure, Kevin, because
1:07:54
let's face it, real men don't use
1:07:56
toilet paper. True. Get rid of it!
1:07:58
Yeah. Are we doing bidets or are
1:08:01
we just going raw dog? Bidets, are
1:08:03
you kidding me? There will not be
1:08:05
one French thing in those restrooms. As
1:08:07
long as I'm suggesting ideas. Okay. Number
1:08:10
nine, employees will now get one extra
1:08:12
day off a year to do one
1:08:14
of the following three activities: mow the lawn, watch
1:08:17
the game, or hang with the boys.
1:08:19
Which one of those would you pick,
1:08:21
Kevin? Hang with the boys, for sure.
1:08:23
Do you even have any boys, for
1:08:26
sure? Now, I have one last suggestion
1:08:28
to bring up the masculine energy at
1:08:30
Meta, Kevin, and it goes like this. We're
1:08:32
going to have a hackathon for women.
1:08:35
Doesn't that sound nice? Yeah. Yeah. And
1:08:37
at the end, we're going to take
1:08:39
all the best ideas from their hackathon
1:08:42
and give them to Meta's male executives,
1:08:44
because what kind of energy is more
1:08:46
masculine than taking credit for a woman's
1:08:48
idea? Anyway, just my thoughts, Kevin. Do
1:08:51
you have any ideas as well? No,
1:08:53
I think that basically covers it. I
1:08:55
think with these changes the Meta corporation
1:08:58
will be fully... what's the opposite of
1:09:00
emasculated? It will be enmanulated.
1:09:02
Enmanulated. And we will have
1:09:04
a glorious future run by men. You
1:09:07
know, there used to be a time
1:09:09
at Meta when people like Sheryl Sandberg
1:09:11
had a seat at the table and
1:09:14
famously told women there to lean
1:09:16
in. Yeah. What's happening with that now?
1:09:18
I'm being given word that they're
1:09:20
being asked to lean out. Actually, Mark
1:09:23
Zuckerberg announced this week that he was
1:09:25
gonna cut 5% of what he called
1:09:27
the low performers at the company. And
1:09:29
that is sort of the ultimate lean-out,
1:09:32
a layoff. Yeah. I did see
1:09:34
some Meta employees posting that
1:09:36
the way they were going to avoid
1:09:39
getting laid off is by getting extremely
1:09:41
jacked. So that's an idea there. I
1:09:43
mean that is now something that we
1:09:45
can respect in culture as we can
1:09:48
say. If you have visible muscles, maybe
1:09:50
you belong around here. Casey, how do
1:09:52
you, I have to ask, since we
1:09:55
are in the Zuckerberg uniform now, minus
1:09:57
the $900,000 watch, this is just my
1:09:59
Apple watch. How do you feel? Do
1:10:01
you feel more masculine sitting in the
1:10:04
studio today? I am having an almost
1:10:06
uncontrollable desire to just wrestle you to
1:10:08
the ground and force you to submit.
1:10:11
How are you feeling? I'm feeling,
1:10:13
I'm feeling like I'm a little insecure,
1:10:15
honestly. Yeah, why? Because I don't think
1:10:17
I can pull this off. You can
1:10:20
absolutely pull it off. Everyone looks good
1:10:22
in a black t-shirt and a gold
1:10:24
chain. Yeah, including me. I'm not a
1:10:26
big man jewelry guy. You know what?
1:10:29
I haven't been either, but then for
1:10:31
our anniversary, my boyfriend and I got little
1:10:33
chains. Isn't that so cute? That is
1:10:36
cute. Yeah. And manly in kind of
1:10:38
a different way. This is what I
1:10:40
love. You start out talking about
1:10:42
something super manly, but then you get into
1:10:45
it in any degree of detail, and
1:10:47
you realize, no, it's masculine and feminine
1:10:49
energy together in the same place. Isn't
1:10:52
that beautiful? Now, Casey, the one serious
1:10:54
thing that I do want to say
1:10:56
about this is that I... It clicked
1:10:58
for me when I heard Mark Zuckerberg
1:11:01
on Joe Rogan talking about masculinity and
1:11:03
masculine energy that this is what founder
1:11:05
mode was. Yes. You can look back
1:11:08
at our shows that we did about
1:11:10
founder mode last year and to my
1:11:12
recollection not one of the people in
1:11:14
Silicon Valley calling for the return of
1:11:17
founder mode was a woman. And I
1:11:19
believe that that is because founder mode
1:11:21
was an elaborate way of saying, we're
1:11:23
big boys and we would like to
1:11:26
run our companies like big boys. Yeah,
1:11:28
and I mean, look, I don't want
1:11:30
to completely dismiss the idea that people
1:11:33
should get in touch with masculine energy.
1:11:35
That is a fine thing to do,
1:11:37
I think, no matter who you are.
1:11:39
I get really concerned when somebody who
1:11:42
employs tens of thousands of people starts
1:11:44
talking about this in the context of...
1:11:46
corporate culture and amid a series of
1:11:49
initiatives that includes killing off the DEI
1:11:51
program and firing your quote-unquote low performers
1:11:53
a clear message is being sent,
1:11:55
and the message is not "women are
1:11:58
welcome at Meta." One thing that also
1:12:00
struck me as I was listening to
1:12:02
Mark Zuckerberg is that it also reminded
1:12:05
me of a conversation that Jeff Bezos
1:12:07
had at the DealBook conference just
1:12:09
a few weeks ago that I heard
1:12:11
where I was actually surprised. You know,
1:12:14
Jeff Bezos was sort of the original
1:12:16
sort of tech founder who kind of
1:12:18
got super masculine, right? He turned from
1:12:20
this like scrawny nerd into this like
1:12:23
jacked dude who lifts weights and has
1:12:25
these sort of bulging muscles and you
1:12:27
know just sort of embraced a masculine
1:12:30
aesthetic I think earlier than a lot
1:12:32
of other tech executives. But I was
1:12:34
also struck by his comments at DealBook
1:12:36
where he basically talked about his feelings
1:12:39
a lot and how he had started
1:12:41
becoming more emotionally open at work about
1:12:43
feeling scared or feeling vulnerable. And it
1:12:45
just really struck me that like that
1:12:47
is a person who is actually comfortable
1:12:50
with masculinity when you can talk about
1:12:52
emotions in the context of a business
1:12:54
meeting and you could talk about them
1:12:56
on stage at a business conference. This
1:12:59
sort of like larping that Mark Zuckerberg
1:13:01
is doing where he is pretending to
1:13:03
be super masculine all of a sudden
1:13:05
and like enjoy bow hunting and hanging
1:13:07
out with the bros like it just
1:13:09
feels very insecure to me, very
1:13:11
much like this is a
1:13:13
person who has not yet actually become
1:13:15
at peace with himself. Yeah,
1:13:18
I think that there is something to
1:13:20
that. I can't even make a joke
1:13:22
about that because it's actually kind of
1:13:24
terrifying. To be 40 and sort of
1:13:26
still be trying to work out, hmm,
1:13:28
what are my values? And could I
1:13:30
just replace them wholesale almost
1:13:33
overnight with a different set?
1:13:35
That's some kind of a
1:13:37
scary proposition for somebody who
1:13:39
runs a set of platforms
1:13:41
used by billions of people.
1:13:43
Yes, and I hope that
1:13:45
whatever Mark Zuckerberg is looking
1:13:48
for, he finds it, and I hope
1:13:50
that it does not come at the
1:13:52
expense of a lot of boars who
1:13:54
might needlessly die. This
1:14:00
podcast is supported by
1:14:02
Google Gemini. For anyone
1:14:04
new to Gemini, it's an
1:14:07
AI assistant you can
1:14:09
have real conversations with.
1:14:11
Whether you want to brainstorm
1:14:13
something, prep for a presentation
1:14:15
or interview,
1:14:17
or just learn something new.
1:14:19
Gemini can help you do
1:14:21
it smarter and faster. This
1:14:23
script was actually read by
1:14:25
Gemini. You can download the
1:14:27
Gemini app for free on
1:14:29
iOS or Android. Must be
1:14:31
18 plus to use Gemini
1:14:34
Live. Hey finance folks, you're
1:14:36
under a lot of pressure
1:14:38
to save money, but the
1:14:40
best finance leaders focus on
1:14:42
more than that. Brex knows
1:14:44
you want to drive growth,
1:14:46
change the game, and win.
1:14:48
So that's exactly what Brex
1:14:50
will help you do. Brex
1:14:52
offers the world's smartest corporate
1:14:54
card, banking, expense management, and
1:14:57
travel, all on one AI-powered
1:14:59
platform. See why 30,000 companies
1:15:01
across every stage of growth
1:15:03
use Brex at brex.com/Grow.
1:15:05
One last thing before we go, our
1:15:08
colleagues over at the Matter of Opinion
1:15:10
podcast just published an extensive interview with
1:15:12
the tech investor Mark Andreessen about his
1:15:14
support for Donald Trump and what he
1:15:16
sees as the emergence of a new
1:15:18
conservative tech right. If you're interested in
1:15:20
checking out that show, you can search
1:15:22
for the Matter of Opinion podcast or
1:15:24
click the link in our show notes.
1:15:27
Hard Fork is produced by Whitney Jones
1:15:29
and Rachel Cohn. We're edited this week
1:15:31
by Rachel Dry. We're fact-checked by Caitlin
1:15:33
Love. Today's show was engineered by Brad
1:15:35
Fisher. Original music by Rowan Niemisto and
1:15:37
Dan Powell. Our executive producer is Jen
1:15:39
Poyant. Our audience editor is Nell Gallogly.
1:15:41
Video production by Ryan Manning and Chris
1:15:43
Schott. You can watch this full episode
1:15:46
on YouTube at youtube.com/hardfork. Special thanks
1:15:48
to Paula Szuchman, Pui-Wing Tam, Dalia Haddad,
1:15:50
and Jeffrey Miranda. As always, you
1:15:52
can email us
1:15:54
at hardfork@nytimes.com.
1:15:56
Send us your ideas
1:15:58
for how to
1:16:00
make Hard Fork's masculine energy
1:16:03
more palpable. What
1:16:05
if we had a
1:16:07
third male co-host?
1:16:09
Oh no.