Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
Hey upstream listeners, today we're releasing
0:02
a conversation I had on Turpentine's
0:04
show The Riff, with my friend Byrne Hobart.
0:06
We discussed the benefits Elon Musk
0:08
gains from having so many companies,
0:10
the chaotic nature of Doge's work, Chinese
0:13
AI development updates, and more. Please
0:15
enjoy the conversation. Hey,
0:31
how are you? Hey, Byrne, I'm doing great.
0:33
How about you? I am good. Shall we
0:35
get into it? Yeah, let's do it. Awesome.
0:38
Byrne, there's a lot I want to
0:40
cover this week. One is, you had a
0:42
piece on Don Junior and how
0:44
he's leveraging the Trump brand. I
0:46
sort of want to have you unpack
0:48
that a little bit, but also just
0:50
ask the question if Don Junior
0:53
was focused on making as much
0:55
money as possible by leveraging the
0:57
Trump brand, and he was asking,
0:59
hey, Byrne, what advice would you
1:01
have for me? What might you say?
1:03
Yeah, I'm not sure I would tell
1:05
him to do that much differently at
1:07
this point. I guess, you know, there
1:10
are ways to get a little bit
1:12
more of the upside, and certainly there
1:14
is stuff that could be done, but
1:16
may already be being done in terms
1:18
of trading crypto in particular, ahead
1:20
of different announcements. But, you know,
1:22
he's, there is this long and
1:24
kind of sordid political
1:27
history of major politicians' less
1:29
known relatives getting into various
1:31
business dealings. It's like, it
1:33
is one of the sort
1:35
of kind of comic relief
1:37
side stories in any presidency
1:40
that there is some nephew,
1:42
cousin, half brother, whatever, who
1:44
is constantly saying that they're
1:46
very close to the president
1:48
and that of course, you
1:51
know, of course he can't
1:53
ever be directly involved in
1:55
this, but you know he's
1:57
he's really on our side and
1:59
that I I'd heard this theory about Hunter
2:01
Biden too, which I thought actually had
2:04
some explanatory power, that yeah, all the
2:06
emails are real and Hunter really did
2:08
tell people it's 10% for the big
2:10
guy and so on, but he just
2:12
made sure to not actually tell the
2:15
actual president any of the things he
2:17
was doing trading on the name. And
2:19
meanwhile, you can't really collect on that,
2:21
you know, if you are a corrupt oligarch
2:24
and you will hire someone for this
2:26
completely fake very well-paid job and he
2:28
says yeah you know it's the org
2:30
chart is there's me and then under
2:33
me is the secretary of state or
2:35
whatever it's not like you can sue
2:37
him for misrepresenting himself if that turns
2:39
out not to be true so it
2:42
is kind of a good
2:44
grift to run and I think in
2:46
the Don Junior case, he is
2:48
at least among Republicans a celebrity in
2:50
his own right and does operate in
2:53
that capacity like he gives speeches and
2:55
things he has his own media presence
2:57
so he is also part of that
2:59
orbit but yeah he seems to he
3:02
seems to be doing a lot of
3:04
the kind of
3:06
smaller scale business dealings but it's also
3:08
it is a feature of the market
3:11
right now that retail investors are pretty
3:13
big they tend to get pretty excited
3:15
about things like that about you know
3:17
brand name board members and especially for
3:19
companies where either A, you can't really
3:22
evaluate the business at all, it's too
3:24
early for that, or B, you can
3:26
add, you can compare to other companies
3:28
and say, this is just not as
3:31
good a business as the other ones,
3:33
but it is my
3:35
not-as-good business. So what we've
3:37
sort of done is democratize the idea
3:40
of nepotistically subsidizing a family
3:42
member who really wants to run a
3:44
coffee shop or used bookstore or something
3:46
high status but not very lucrative. It
3:49
used to be that you had to
3:51
be wealthy enough to actually do that,
3:53
to actually fully subsidize that business, but
3:55
now you can actually do the nepotistic
3:57
business subsidy thing through your Robinhood account.
4:00
It's a smaller amount, but you still
4:02
get the same fuzzy feeling. So yeah,
4:04
maybe this is all about
4:06
taking parts of capitalism that used
4:09
to be available only to the 1%
4:11
and making them something anybody could do.
4:13
Yeah, good explanation on the Don Junior
4:15
phenomenon. I want to also segue to
4:18
another version of kind of like, what
4:20
would you do if you were this
4:22
person? You also wrote about how
4:24
Dustin Moskovitz left Asana, and had an
4:27
interesting take as it relates to sort
4:29
of how he's thinking about what to
4:31
do with his time, there is this
4:33
sort of question of, you know, if
4:35
you have that level of resources, what's
4:38
the highest leverage thing that you could
4:40
be doing, especially if you're worried
4:42
about sort of
4:44
runaway AI.
4:47
Yeah, and
4:49
I think that is basically the story.
4:51
So it's part of what I was
4:53
writing. If that founder steps down,
4:56
you know, it often for good reasons
4:58
drives the stock down a little bit,
5:00
you know, you wonder if all the
5:02
stock is going to be liquidated, but
5:04
you also in many cases just want
5:07
this person who is, who understands the
5:09
business very deeply and who's respected by
5:11
the employees and so on. It's better
5:13
for them to be in charge than
5:16
someone else, even if that other person
5:18
is also pretty good. But I do
5:20
think in this case, if there is
5:22
a strong explanation for why that person
5:25
will want to leave, it's a weaker
5:27
signal because sometimes people do just leave
5:29
because they can see the business is actually
5:31
not going to go all that well
5:34
in the future. There are other higher
5:36
value uses of time, and usually those
5:38
uses of time are just either giving
5:40
away lots of money generally or just
5:42
spending more time with your possessions. And
5:45
you have to be pretty motivated to
5:47
run a business if you could also
5:49
just enjoy being a billionaire indefinitely and
5:51
never think about money again, except thinking
5:54
about how nice it is to have
5:56
so much. But in his case, he's
5:58
been alarmed about AI for a long
6:00
time, and has been donating to a
6:03
lot of EA, effective altruism, causes, including
6:05
causes that are about AI and existential
6:07
risk and things like that. So I
6:09
think it's actually, it's pretty aligned in
6:12
that case to say that as AI
6:14
accelerates and as it gets better at
6:16
doing human level performance, that if you're
6:18
worried about it achieving superhuman performance in
6:20
a very negative way for society, that
6:23
that should probably be your full-time thing.
6:25
So yeah, so I guess it is
6:27
kind of analogous to if someone steps
6:29
down because they really want to provide
6:32
full-time medical care to a relative with
6:34
a terminal illness. It's just in his
6:36
case, the person with a terminal illness
6:38
is all of us, and the illness
6:41
is that there's this AI thing that's
6:43
ramping really fast and could kill us
6:45
all. Yeah, it is. And so what
6:47
do you expect him
6:49
to do?
6:52
I mean, I don't know, it's always
6:54
hard to come up with expectations for
6:56
someone who is clearly smart and effective
6:58
and has a lot of financial resources
7:01
and has been paying attention to this
7:03
rapidly changing field for long enough time
7:05
to have some sense of what's coming
7:07
next. So I think it will be
7:10
very high signal, whatever he does next,
7:12
I think that the, you know, even
7:14
the fairly straightforward things like, okay, we
7:16
will, someone will start another lab that
7:19
is even more safety focused than the
7:21
other labs, I think that that probably
7:23
just doesn't work because you probably if
7:25
you view safety as actually complementary to
7:27
building a more powerful AI system, then
7:30
maybe you'd say you don't need as
7:32
much money as open AI or as
7:34
Google because you can actually get more
7:36
out of the dollars you spend. But
7:39
I think that's that's not necessarily true.
7:41
It is probably true in some sense
7:43
over long periods, that if you have
7:45
this smarter reasoning model, it may reason
7:48
its way into fairly pro-social behavior, but
7:50
it's hard to say for sure. Anyway,
7:52
so I really don't know, but I
7:54
think it would be very interesting to
7:57
see, you know, if someone... does have
7:59
this fortune and they now have time
8:01
to devote to these kinds of causes,
8:03
what is it that they see as
8:05
the highest leverage thing to do? But
8:08
yeah, no idea what that would be.
8:10
And you know, I could be wrong here,
8:12
like the fact that that is a
8:14
good explanation does not mean it's the
8:17
only explanation. And so it is possible
8:19
that either A, it's like quitting to
8:21
do non-profit stuff, but it's a totally
8:23
different non-profit thing. It's shrimp welfare or
8:26
something. Or B, that it is just
8:28
quitting because it is really relaxing to
8:30
not be running a large company. The
8:32
money will still be there.
8:34
Yeah, but what is your sense on
8:37
sort of you know Dustin's broader project
8:39
or his sort of contributions to sort
8:41
of you know what he's doing with
8:43
his work and his foundations, and
8:46
just the broader sort of field
8:48
of existential risk? Do you
8:50
think they're doing a pretty good job
8:52
do you think it's sort of like
8:55
an impossible effort on its face, or what
8:57
is your analysis of our sort
8:59
of existential risk, you know, apparatus? I
9:01
think it's good to think about existential
9:04
risks, but I think that you really
9:06
want to avoid prematurely defining risks of
9:08
a particular nature and saying this is
9:10
what counts as an existential risk, because
9:12
there is this just ongoing existential risk
9:15
of: we end
9:17
up creating a really complicated society, it
9:19
has high fixed costs, not in terms
9:21
of it takes a lot of dollars
9:24
to keep things running, but in terms
9:26
of it takes some very functional institutions
9:28
and the ability to put competent people
9:30
in charge of solving the right problems.
9:33
If you lose that, it's very hard
9:35
to replace it. And so you can
9:37
have these radical decreases in the complexity
9:39
of your society. And in the case
9:42
of modern human civilization, that just has
9:44
to mean not just a large decrease
9:46
in standard of living, but actual starvation.
9:48
So that's the kind of thing that
9:50
I worry about more is sort of
9:53
cultural decay and risk aversion and just
9:55
people not having kids and thus not
9:57
thinking about the distant future. I think
9:59
that is, over long periods, an existential
10:02
risk that will always be with us
10:04
and is worth addressing. And then there
10:06
are other more near-term existential risks, whether
10:08
it is the Genghis Khan or nuclear
10:11
war or synthetic bio weapons or rogue
10:13
super intelligence. Like these are all things
10:15
that could actually wipe out humanity or
10:17
functionally wipe us out, but they're always
10:19
in that mix and it does feel
10:22
like, you could
10:24
sort of think of like a hot
10:26
apocalypse and cold apocalypse where hot apocalypse
10:28
is nukes start flying and by the
10:31
time they stop, civilization is over, and
10:33
then cold apocalypse is things are just
10:35
a little bit less functional over time
10:37
and people are a little bit less
10:40
willing, a little bit lower trust and
10:42
a little bit less willing to invest
10:44
in long-term projects and take big risks
10:46
and smart people opt
10:49
out of making challenging high stakes decisions
10:51
in favor of other stuff that
10:53
they can use their intelligence to do.
10:55
And the whole thing just gradually falls
10:57
apart. But I suspect it would fall
11:00
apart in a pretty invisible way. Like
11:02
if you read about people who are
11:04
in late stage empires, they don't usually
11:06
feel like they're at the end of
11:09
the empire. In fact, it seems more
11:11
common for people to have this really
11:13
paranoid feeling that this could collapse at
11:15
any moment for them to feel like
11:18
that on the way up. This is
11:20
one, like Bill Gates: apparently
11:22
throughout the 70s, where it
11:24
definitely made sense, and then continuing
11:27
into the 80s, but then into the 90s,
11:29
where it was really hard
11:31
to justify,
11:33
Gates would always talk about how Microsoft
11:35
is just a couple bad releases from
11:38
being a completely irrelevant company that no
11:40
longer exists and I'm sure a lot
11:42
of people who had those conversations with
11:44
him in the 90s were thinking to
11:47
themselves. Bill, no one has ever made
11:49
as much money as you have at
11:51
your age and you're a monopoly in
11:53
this incredibly valuable product category and everyone,
11:56
all of the smart people, you are
11:58
very close to the top of the
12:00
list of every smart person's ideal place
12:02
to work. Like, how could you possibly
12:04
think that there's some risk of all
12:07
this going wrong? And I think his
12:09
answer would be, yeah, you could have
12:11
said that about a lot of other
12:13
people who, if I name them,
12:16
you won't actually recognize the names because
12:18
it is true that a couple mistakes
12:20
are enough to kill a company. And
12:22
then it feels
12:25
like in the mid-2000s, when
12:27
Microsoft is kind of in this, something
12:29
Paul Graham wrote about was that he
12:31
no longer really had to warn startups
12:34
to think about what Microsoft could do
12:36
to kill them. He had to warn
12:38
them about Google, but Microsoft was just
12:40
not a threat to startups. Like in
12:42
that period, it does feel like people
12:45
weren't especially worried that Microsoft would die.
12:47
They sort of felt like it was
12:49
impregnable, but was just not really going
12:51
anywhere. And that is, again, a more
12:54
meaningful risk. Part of what got Microsoft
12:56
out of that was a series of
12:58
pretty bold bets, some of which were
13:00
done under the Ballmer era, and then
13:03
Satya was able to spearhead one of
13:05
those which did well enough that he's
13:07
now in charge of the company, and
13:09
he does seem to have the right
13:12
kind of paranoia. And maybe it is
13:14
helpful. It's interesting, actually, that Microsoft
13:16
has a lot of categories where they
13:18
are just number one, they're synonymous with
13:20
the category, but they're run by someone who
13:23
previously worked in their cloud business, which
13:25
was not number one. It was the
13:27
company that was trying to catch up
13:29
to AWS. It actually did a really
13:32
impressive job. But that is, you know,
13:34
at a company where they are number
13:36
one at a bunch of different fields,
13:38
you actually do have to kind of
13:41
go out of your way to pick
13:43
someone who is not from a business
13:45
that was the best in its category,
13:47
but maybe that's what you actually need
13:49
is for them to always think of
13:52
themselves as underdogs who are not necessarily
13:54
everyone's first choice and who need to
13:56
be pretty scrappy to survive. Yeah, that's
13:58
where we should leave it. We'll
14:01
continue our interview in a moment after
14:03
a word from our sponsors. Even if
14:05
you think it's a bit overhyped, AI
14:07
is suddenly everywhere. From self-driving cars to
14:10
molecular medicine to business efficiency. If it's
14:12
not in your industry yet, it's coming
14:14
fast. But AI needs a lot of
14:16
speed and computing power. So how do
14:19
you compete without costs spiraling out of
14:21
control? Time to upgrade to the next
14:23
generation of the cloud. Oracle cloud infrastructure
14:25
or OCI. OCI is a blazing fast
14:27
and secure platform for your infrastructure, database,
14:30
application development, plus all your AI and
14:32
machine learning workloads. OCI costs 50% less
14:34
for compute and 80% less for networking.
14:36
So you're saving a pile of money.
14:39
Thousands of businesses have already upgraded to
14:41
OCI, including Vodafone, Thomson Reuters, and Suno
14:43
AI. Right now, Oracle is offering to
14:45
cut your current cloud bill in half
14:48
if you move to OCI for new
14:50
US customers with minimum financial commitment. Offer
14:52
ends March 31. March 31. See if
14:54
your company qualifies for this special offer
14:57
at oracle.com/turpentine. That's oracle.com/turpentine. Every time I
14:59
hop on the 101 freeway in SF,
15:01
I look up at the various tech
15:03
billboards and think, damn, turpentine should definitely
15:05
be up there. And I bet I'm
15:08
not alone. In a world of endless
15:10
scrolling and ad blockers, sometimes the most
15:12
powerful way to reach people is in
15:14
the real world. That's where AdQuick
15:17
comes in. The easiest way to book
15:19
out-of-home ads, like billboards, vehicle wraps, and
15:21
airport displays, the same way you would
15:23
order an Uber. Started by Instacart alums,
15:26
Adquick was born from their own personal
15:28
struggles to book and measure ad success.
15:30
AdQuick's audience targeting, campaign management, and analytics
15:32
bring the precision and efficiency of digital
15:34
to the real world. Ready to get
15:37
your brand the attention it deserves. Visit
15:39
AdQuick.com today to start reaching your customers
15:41
in the real world. That's AdQuick.com. Speaking of people who
15:43
are thinking about existential risk, I want
15:46
to segue to Elon because you both
15:48
covered both sort of an update
15:50
with Twitter and how you were impressed with
15:52
how Twitter has operated, or sort
15:55
of the ability of Elon's financial engineering
15:57
prowess, but then also on the other
15:59
side, Doge has, you
16:01
know, been less in favor over the
16:04
past few weeks and I'm curious for
16:06
your assessment of what's going on
16:08
there. So the Twitter thing is partly
16:10
financial engineering that Twitter has this relationship
16:12
with xAI, and it is just one
16:15
of the perks of being Elon or
16:17
just being someone who is in control
16:19
of a bunch of different entities in
16:21
which his ownership stake, his economic ownership
16:24
varies, is that he does often have
16:26
the option to move cash flow, talent,
16:28
infrastructure, etc. to whichever company gets the
16:30
highest return from it. That kind of works
16:33
really really well unless you just run
16:35
out of cash flow to reallocate or
16:37
all of your Tesla engineers are so
16:39
busy working on something for SpaceX that
16:42
they don't also have time to fix
16:44
some problem at Twitter or whatever. But
16:46
he was able to technically get Twitter's
16:48
equity value or Twitter's enterprise value back
16:50
up to where it was when he
16:53
bought it. But it's just, it's a
16:55
very different entity right now. And it
16:57
does feel like some of the capitalized
16:59
value of Twitter, Circa 2022, when he
17:02
did raise external capital, including on the
17:04
equity side. Some of that was people
17:06
who wanted to do him a favor,
17:08
who figured it was much better to
17:11
buy something that Elon is buying that
17:13
makes him feel really good about himself
17:15
at $1.50 on the dollar than to
17:17
keep that money and not invest it
17:19
somewhere else. So basically a fee to
17:22
get on Elon's good side. And then
17:24
now it feels like there's a kind
17:26
of different math behind being on his
17:28
good side, which is that Twitter remains
17:31
a really important place for ideas to
17:33
rapidly disseminate. It is where the discussion
17:35
happens when there's a breaking news story.
18:37
That doesn't seem likely to change, even
17:40
though they have had some attrition and
17:42
lost some of their contributors and they
17:44
have the same kind of boiling off
17:46
effects where, if the people who
17:49
leave are more likely to be progressive,
17:51
then the site gets more right wing
17:53
as they leave and so it ends
17:55
up being too right wing for the
17:57
next set of more progressive people. But
18:00
then some of the people who stick
18:02
around either just really are open to
18:04
a broad spectrum of views, and they
18:06
know that they have plenty of progressive
18:09
friends and can get the progressive side
18:11
of something or can just generate it
18:13
on the fly. And so they want
18:15
to hear what the right wingers are
18:18
saying. And then the other piece is
18:20
just people who like scrapping and who
18:22
absolutely want to go to a site
18:24
that is now more conservative because they
18:27
get to have arguments with people and
18:29
dunk on them all day. That's quite
18:31
fun some of the time. So
18:33
I think Twitter is
18:35
not quite the town square but you
18:38
also can't really run the town square
18:40
if your town spans multiple countries with
18:42
different legal jurisdictions, and the
18:44
owner of this town square has business
18:47
interests including interests in different countries that
18:49
have very different views from the US.
18:51
So yeah, you just can't have the
18:53
town square if the town square is
18:56
owned by someone who does a lot
18:58
of business in China. There are just
19:00
going to be limits to the things
19:02
that he's able to say. And then
19:04
if he's also part of the Trump
19:07
administration, there are further limits to what
19:09
kinds of discourse he wants to happen.
19:11
But yeah, I think
19:13
it is partly just a story about
19:16
how, if you have a
19:18
bunch of different entities and you just
19:20
have a bunch of different tradeoffs you
19:22
can make, you have a lot of
19:25
opportunities other people don't have. And that
19:27
is a lot of what negotiation is
19:29
in general is figuring out, okay, we're
19:31
both negotiating on this one dimension and
19:34
it's clear that we both care about
19:36
the outcome and we can't quite get
19:38
to the middle. Is there some other
19:40
trade-off that I care about a lot
19:42
and my counterparty doesn't care about that
19:45
much or vice versa where we can
19:47
actually get closer to an agreement. And
19:49
so Elon just has more degrees of
19:51
freedom than just about anyone and then
19:54
more hard constraints on how much he
19:56
can use each of those things
19:58
than just about anyone. What
20:00
is your sort of outlook
20:03
on Doge going forward,
20:05
and how is it perceived,
20:07
you know, its ability?
20:09
Yeah, so I think if, if
20:11
I were part of it, what I
20:13
would hopefully be thinking is the shock
20:16
and awe phase is over. We have
20:18
convinced everyone who gets a government contract
20:20
that these contracts will be scrutinized. In
20:22
fact, they'll be scrutinized in a pretty
20:25
hostile way, and sometimes you'll be doing
20:27
something good that sounds not so good,
20:29
and so you still lose your funding.
20:32
And so what that forces is some
20:34
level of transparency and
20:36
really a focus on KPIs. So
20:38
if you have some grant that
20:40
is actually serving U.S. interests and
20:42
it's doing so in a kind
20:44
of opaque way, just making that
20:46
more legible to someone who's new
20:48
to the system is probably a
20:50
good thing. And I think you
20:52
can look at a lot of
20:54
the mistakes Doge has made and say,
20:56
this is why we need people
20:58
who have much more expertise in
21:00
how U.S. foreign aid works and
21:02
how the U.S. government allocates its
21:04
operating expense budget and things like
21:06
that. But I think the other side
21:09
of that is if you're deferring to
21:11
people who do have that expertise, you're
21:13
often deferring to people who also have
21:15
a vested interest in how things currently
21:17
work, and you don't really know what
21:20
the balance of that is unless you
21:22
actually start cutting and see what goes
21:24
wrong and what doesn't. But once that
21:26
shock and awe phase is over, I
21:29
would love to see them just go after
21:31
areas like Medicare, where
21:33
there are a
21:35
lot of opportunities for people, in general
21:37
service providers, to take advantage of
21:39
the system in a way that
21:42
was not intended, and that probably shows
21:44
up pretty straightforwardly if
21:46
you are doing statistical analysis on things
21:48
like, I don't know, you could
21:50
look at some category of surgery and
21:52
look at what demographic factors,
21:54
you know, things like age, predict
21:56
how common that surgery is and
21:58
then just look: okay, per capita,
22:00
is this happening a whole lot in
22:03
very specific places? Maybe there is someone
22:05
who is really, really interested in getting
22:07
subsidized to do this surgery and is
22:10
willing to do it for people who
22:12
could probably do without it. I think
22:14
something like that, it's maybe less exciting
22:17
and certainly less politically polarizing, but is
22:19
actually a useful thing to do.
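The kind of screen being described might look something like the minimal sketch below: compute a per-capita rate for one procedure category, predict it from a simple demographic proxy, and flag regions that sit far above that prediction. The input file, column names, and the three-sigma cutoff are all hypothetical, purely to illustrate the idea.

import numpy as np
import pandas as pd

# Hypothetical input: one row per region for a single procedure category,
# with columns region, claim_count, population, share_over_65.
claims = pd.read_csv("claims_by_region.csv")

# Per-capita rate of the procedure in each region.
claims["rate_per_1k"] = 1000 * claims["claim_count"] / claims["population"]

# Crude demographic adjustment: fit the rate against an age proxy, then measure
# how far each region sits above what its age mix alone would predict.
slope, intercept = np.polyfit(claims["share_over_65"], claims["rate_per_1k"], 1)
claims["expected"] = intercept + slope * claims["share_over_65"]
resid = claims["rate_per_1k"] - claims["expected"]
claims["z"] = (resid - resid.mean()) / resid.std()

# Regions doing far more of this procedure than demographics predict.
print(claims.loc[claims["z"] > 3, ["region", "rate_per_1k", "expected", "z"]]
      .sort_values("z", ascending=False))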
22:21
The problem with that, though, is if they
22:24
had tried to do that early on
22:26
there would just be a long period
22:28
where Doge is announced and then nothing
22:31
happens and then when they do have
22:33
an impact with these tiny incremental changes
22:35
that no one really cares about and
22:38
so Doge just becomes this minor side
22:40
show and maybe in the next state
22:42
of the union Trump gets to say
22:44
here's how many billions of dollars this organization
22:47
has saved, and everyone will be asking
22:49
what's Doge? If in the next state
22:51
of the union, he quotes a number,
22:54
it's an accurate number, it's a few
22:56
billion less than if Doge had been
22:58
really rigorous from the very beginning, at
23:01
least in this case, everyone knows what
23:03
Doge is, everyone involved with Doge gets
23:05
to be proud, like people know that
23:08
they have actually achieved something. And given
23:10
the size of the team and given
23:12
the scope of US spending, it's probably
23:15
a pretty high ROI decision, even if
23:17
it's not quite the magnitude that Musk
23:19
was hoping for. That makes sense. The,
23:22
yeah, it is interesting. What is your
23:24
sort of assessment of sort of the
23:26
first, you know, a couple months of
23:28
the, of the Trump White House more
23:31
broadly? Do you think this is what
23:33
we expected, you know, sort of the,
23:35
like, the chaos, sort of the ups
23:38
and downs? Do we think things are
23:40
going to normalize? Is it just this,
23:42
this status quo going forward?
23:45
How should we think about it? It
23:47
is very nice of you to ask
23:49
if it's as much chaos as expected
23:52
because yes, it is
23:54
as chaotic as I expected because I
23:56
remember the Trump One days very clearly.
23:59
There was a lot of chaos, but
24:01
it was chaos that didn't actually lead
24:03
to that many outcomes, or there was
24:06
a very high ratio of chaos, discourse, news
24:08
flow, et cetera, to actual changes. And
24:10
it feels like they've had that same
24:12
level of discourse and uncertainty and things,
24:15
but have actually implemented a bunch of
24:17
changes. And I think some of those,
24:19
like some of the cost cuts that
24:22
Doge did, were just bad, or at
24:24
least were very ill-considered, if only from
24:26
a political standpoint. others seem actually pretty
24:29
reasonable. And it also, like the US
24:31
government, from what I understand, in many
24:33
cases has pretty generous work-from-home
24:36
policies, but their real estate footprint has
24:38
not dropped that much. So that is
24:40
the kind of realignment where I
24:43
think a lot of
24:45
companies in the private sector would have
24:47
very quickly adjusted their spending to
24:50
make sure they're spending for offices that
24:52
actually have people in them. And there's
24:54
less of an incentive to do that
24:56
within the federal government, but it's probably
24:59
helpful to do that kind of downsizing.
25:01
On the tariff stuff, it is, you
25:03
know, it's almost, you almost don't want
25:06
to say this is a really effective
25:08
bluffing strategy. In fact, it should, if
25:10
you don't like tariffs, you should make
25:13
it somewhat taboo to say this is
25:15
an effective bluffing strategy. What you should
25:17
do instead is pretend that you take
25:20
these completely seriously. This is a complete
25:22
reshaping of the economic order and no
25:24
country will be able to sell anything
25:27
to America, Americans, without paying a toll
25:29
to the US government. Like if you
25:31
say that, then tariffs are actually a
25:34
more credible threat. If everyone understands that
25:36
a 25% tariff is the start of
25:38
a negotiation, the actual tariff will be
25:40
much lower in the end, then the
25:43
ratio of starting bluff to final outcome
25:45
gets a lot more extreme in favor
25:47
of less important final outcomes. One thing
25:50
that has been interesting is to see
25:52
just how other countries are reacting in
25:54
terms of doing their own defense spending
25:57
and then there's the story that came
25:59
out I think over the weekend about
26:01
how the UK has a digital service
26:04
tax. They are considering preemptively removing that
26:06
tax because the tax is basically a
26:08
tariff. It is not on
26:11
goods but definitely on services and the
26:13
way it's structured is that it's a
26:15
tax on social networks and search and
26:18
a couple services like that but these
26:20
are all categories that are just overwhelmingly
26:22
dominated by American companies so it does
26:24
in effect mean that there was a
26:27
tariff on the US and now it's
26:29
going away and As I've pointed out
26:31
a couple times, the tech companies are
26:34
in a really interesting part of the
26:36
crossfire, which I think they probably did
26:38
not anticipate. I suspect that some people
26:41
would have been a little bit less
26:43
Trumpy had they thought that the tariff
26:45
talk was as serious as it actually is.
26:48
Sorry, my dog just jumped off the bed.
26:50
But these companies, they're kind of
26:52
swing voters or swing influencers, where
26:55
they used to be very pro-democratic and
26:57
I think if you ask them, they
26:59
just wouldn't have been able to credibly
27:02
say that they were centrist. They would
27:04
have said that they, yeah, that many
27:06
of their employees are, you know, they
27:08
want to give each side a fair
27:11
hearing, but yeah, it's much, much more
27:13
likely that a Democratic candidate than a
27:15
Republican candidate would be able to do
27:18
a fundraiser and get a lot of
27:20
these people to attend and donate.
27:22
And now Big Tech probably still
27:25
tilts left, but there are plenty of
27:27
people who are vocally right. And that
27:29
means that if you are running trade
27:32
policy in some country and the US
27:34
is threatening that country with tariffs, one
27:36
of the things you think about is
27:39
who are some swing voters who we
27:41
can really punish this politician through. And
27:43
if everyone in tech who supported Trump
27:46
associates Trump with lots of bad things
27:48
that happen to American tech companies, then
27:50
it's probably much, much harder for Vance
27:52
to raise money from these people in
27:55
four years. So yeah, it's it is
27:57
a form of trade policy that is
27:59
very heavily informed by US domestic political
28:02
considerations, but that's just how the world
28:04
works. But I think
28:06
from a big tech perspective,
28:09
if the UK decides to preemptively be
28:11
nicer to big American tech companies then
28:13
that's actually enough of a tariff victory
28:16
that they feel good about it and
28:18
I presume Trump feels really good about
28:20
the idea that he didn't even have
28:23
to directly threaten a tariff to get
28:25
this so so maybe that it does
28:27
end up leading to an outcome where
28:30
there are probably going to be slightly
28:32
more barriers to trade, but it's still
28:34
a recognizable economic order. And then another
28:36
thing is that if the US is
28:39
protecting global seaborne trade routes by shooting
28:41
rockets at people who try to attack
28:43
ships, that is economically equivalent to a
28:46
lower tariff. It's eliminating deadweight
28:48
loss instead of eliminating a tax, but it
28:50
still does reduce the cost of trade.
28:53
And part of the story of globalization,
28:55
part of it was changing legal and
28:57
economic infrastructure such that trade was more
29:00
viable, but a lot of it was
29:02
just that container ships are a really,
29:04
really cheap way to move things arbitrarily
29:07
long distances and they tend to create
29:09
much more complicated supply chains that are
29:11
much more sprawling and very hard to
29:14
unwind. So doing things like
29:16
that is in some economic sense equivalent
29:18
to making slightly better container ships or
29:20
just slightly reducing tariffs. We'll continue our
29:23
interview in a moment after a word
29:25
from our sponsors. In an age of
29:27
AI and big data, your personal information
29:30
has become one of the most valuable
29:32
commodities. Data brokers are the silent players
29:34
in this new economy, building detailed profiles
29:37
of your digital life and selling them
29:39
to the highest bidder. But with data breaches
29:41
happening frequently, these profiles aren't just fueling
29:44
spam. They're enabling identity theft and putting
29:46
your financial security at risk. But what
29:48
if there's a way you could opt
29:51
out of that system entirely? Incogni is
29:53
a service that ensures your personal data
29:55
remains private and secure. They handle everything
29:58
from sending removal requests to managing any
30:00
pushback from data brokers. What impresses me most about
30:02
Incogni is how it provides ongoing protection.
30:04
Once you're set up, it continuously monitors
30:07
and ensures your information stays off the
30:09
market. Want to extend that peace of
30:11
mind to your loved ones? Incogni's family
30:14
and friends plan lets you protect up
30:16
to four additional people under one subscription.
30:18
Take control of your digital privacy today.
30:21
Use code upstream at the link below
30:23
and get 60% off an annual plan.
30:25
Again, that's Incogni.com/upstream. Hey
30:30
everyone, Eric here. In this environment, founders
30:32
need to become profitable faster and do
30:34
more with smaller teams, especially when it
30:37
comes to engineering. That's why Sean Lanahan
30:39
started Squad, a specialized global talent firm
30:41
for top engineers that will seamlessly integrate
30:44
with your org. Squad offers rigorously vetted
30:46
top 1% talent that will actually work
30:48
hard for you every day. Their engineers
30:51
work in your time zone, follow your
30:53
processes, and use your tools. Squad has
30:55
front-end engineers excelling in TypeScript, React,
30:57
and Next.js, ready to onboard to
31:00
your team today. For back-end, Squad engineers
31:02
are experts at Node.js, Python, Java, and
31:04
a range of other languages and frameworks.
31:07
While it may cost more than the
31:09
freelancer on Upwork billing you for 40
31:11
hours, but working only two, Squad offers
31:14
premium quality at a fraction of the
31:16
typical cost, without the headache of assessing
31:18
for skills and culture fit. Increase your
31:20
velocity without amping up burn. Visit
31:23
choosesquad.com and mention Turpentine to skip the
31:25
wait list. One of the things that
31:27
people have been thinking about, of course,
31:30
is what's the future for the Democratic
31:32
Party and Ezra Klein and Derek Thompson
31:34
just came out with a new book
31:37
called Abundance and you gave it a
31:39
brief blurb in your longreads. You
31:41
know, Antonio Garcia Martinez joked on Twitter
31:43
or on X that it's just helping
31:46
Democrats appreciate supply and demand a little
31:48
bit. Is this just a basic, hey,
31:50
markets aren't that bad, just phrased
31:53
in a non-right-wing way?
31:55
What's your assessment of
31:57
the ideas there
32:00
and how they relate
32:02
to the Democratic Party? No,
32:06
I like it a lot. I think
32:09
that abundance, it maybe has ended up,
32:11
you could definitely tell a
32:13
conservative story about abundance, but it's already
32:16
kind of the conservative bet, which is that if
32:18
we aim for more economic efficiency in
32:20
the econ sense, then we will all be
32:23
much better off, and that actually
32:25
solves a lot of the problems that
32:27
redistribution is supposed to solve that basically
32:30
if you have a sufficiently rich society
32:32
it is very cheap to keep people
32:34
out of abject poverty and there are
32:36
just a lot of job opportunities for
32:39
people so you don't have to have
32:41
as many transfer payments to deal with
32:43
that. That is always one of those
32:46
situational things and there can definitely be
32:48
times where for a particular voter it is
32:50
just way better for them to have
32:53
more transfer payments coming in rather than
32:55
for GDP growth to be half a
32:57
percent higher next year. But over really
32:59
long periods, the GDP growth thing is
33:02
really all that matters and you'd much
33:04
rather live in a country like the
33:06
US at pretty much any part of
33:09
the economic stratum than a country that
33:11
has perfect redistributive strategies and has had them
33:13
for so long that GDP growth has
33:16
undershot the world average for multiple generations.
33:18
In that country, you'll be
33:20
a lot closer to your neighbors in
33:22
terms of standard of living but none
33:25
of you will have a great standard
33:27
of living. And if you live in
33:29
a more globalized world where people
33:32
can stream videos of shows that are
33:34
set in the United States and they
33:36
can talk to Americans about what their
33:39
day-to-day life is like and they can
33:41
buy a plane ticket and they can
33:43
potentially get a visa of some kind.
33:45
And then those more
33:48
egalitarian countries will tend to bleed
33:50
off talent and that talent will go
33:52
to higher-variance places. So yeah,
33:55
in the long run you do need
33:57
some
33:59
growth-focused policies just for your country to
34:02
stay competitive. And then there's another debate
34:04
in that over how you handle people
34:06
who are mostly experiencing the downsides of
34:08
that kind of growth. But there's also
34:11
an interesting kind of political entrepreneurship here
34:13
where the Republican Party is just a
34:15
lot less growth-focused than they were a
34:18
decade ago. And they are a lot
34:20
more interested in other issues. Some of
34:22
it is that immigration is more salient,
34:25
but... Some of it is just Republicans
34:27
are a lot less jazzed to talk
34:29
about things like how tax cuts will
34:31
affect incentives and cause people to start
34:34
more companies and things and that is
34:36
true. Like it is true that on
34:38
the margin the tax rate you pay
34:41
does affect the decisions that you make
34:43
and in particular if you have a
34:45
very high marginal tax rate, and
34:48
if someone is considering a very high
34:50
variance opportunity the effective marginal tax rate
34:52
is actually even higher because in many
34:54
outcomes they fail and then in the
34:57
outcome where they succeed they pay very
34:59
very steep taxes on on that success
35:01
so it does truncate the the tail
35:04
of the bell curve where you actually
35:06
or the tail of the the distribution
35:08
where a lot of the a lot
35:11
of the companies and other institutions that
35:13
actually create long-term growth that's that's where
35:15
they are is in exactly that part
35:18
of the distribution so it is good
35:20
to be very careful about that kind
35:22
of thing and there's a lot of
35:24
just non-partisan low-hanging fruit it feels like
35:27
if If there were, if politics, if
35:29
foreign policy got more boring, if the
35:31
social issues were less lively, and if
35:34
there were actually just space on front
35:36
pages to talk about things like permitting
35:38
reform, that we'd probably be in a
35:41
much better spot. And also, just as
35:43
the parties reshuffle themselves, there is the
35:45
question of where some people go. So
35:47
I'm sure there are plenty of people
35:50
who were libertarians and held their nose
35:52
and voted Republican for a while and
35:54
have switched to Democrat, and then vice
35:57
versa, there are plenty of libertarians who
35:59
made the opposite switch, more vocally. But
36:01
things haven't really settled and which party
36:04
gets, which constituency is always an open
36:06
question. One of the other pieces I
36:08
linked in the same newsletter I was
36:10
talking about abundance was this breakdown of
36:13
all the detailed demographics we
36:15
now have on which voters were swinging
36:17
in which direction in the 2024 election.
36:20
And one of the things that struck
36:22
me was that when I first heard
36:24
about just how different voting demographics work.
36:27
There were a lot of cases where
36:29
if you're in the middle of some
36:31
distribution, especially something like income or education,
36:33
you're Republican, and then if you're at
36:36
either tail, either very poor or very
36:38
rich, you're more likely to be a
36:40
Democrat. And then for education, it was
36:43
if you are a college graduate, you're
36:45
very Republican, if you have a GED,
36:47
you probably vote Democrat. And now,
36:50
neither of those is really true.
36:52
The Republican Party does skew
36:54
more towards lower-educated voters than the
36:56
Democrats, and they also skew more towards
36:59
lower income voters. So it is just
37:01
a different coalition than it was when
37:03
Mitt Romney was running. And so there
37:06
are a lot of people who just,
37:08
they registered when there was one set
37:10
of assumptions about not just what
37:13
Republicans believe, but who is a typical
37:15
Republican, and now they feel a little
37:17
bit more isolated. But when you think
37:19
about something like that, it is also
37:22
important to realize that we're not talking
37:24
about something where... 80% of the people
37:26
who earn more than 200k a year
37:29
vote Democrat and you know 80% of
37:31
the people earning less than 30k a
37:33
year vote Republican, or whatever the
37:36
ranges would be. There is still
37:38
a lot of political diversity
37:40
within those cohorts. And so plenty of
37:42
people, high-income
37:45
people or well-educated people, may not
37:47
know it, but they do have Republican
37:49
friends. So it's not as if
37:52
there's this totally income- and class-based
37:54
distinction, but it is different from what
37:56
it used to be. Yeah, it's really
37:59
interesting. It seems like whatever's going
38:01
to be popular is just going to
38:03
be whatever Donald Trump hates or doesn't
38:05
want because it's sort of the negative
38:07
polarization and so the more that he
38:09
is sort of anti-free market or anti-trade
38:11
or anti-immigration the more that Democrats you
38:14
know, will be in favor
38:16
of that. Well I think for you
38:18
know in particular for abundance I think
38:20
there's some level of opportunism in
38:22
that now is a really really good
38:24
time to tell Democrats that free markets
38:26
are a good idea because you can
38:28
actually run as the free market person
38:30
in America. You know, the term
38:33
free market probably polls reasonably well
38:35
among Americans. I'm not 100% sure
38:37
on that, but I'm sure there
38:39
are plenty of terms that indicate
38:41
that and that are things that
38:43
a politician wants to say a
38:45
whole lot. So I don't
38:47
think
38:49
much of this is driven by people
38:51
just saying I would be inverse Trump
38:53
in every respect but I think definitely
38:55
you want to highlight that message at
38:57
a time when it's also very politically
38:59
viable. That makes sense. Speaking of
39:01
sort of Republicans and the economy, you
39:04
know, there's been concern the last few
39:06
weeks about you know are we going to enter
39:08
a recession? You wrote a great Capital Gains
39:10
piece about sort of the different ways that
39:13
the economy could fall into recession.
39:15
Why don't you sort of unpack that
39:17
a bit? Yeah, so a lot of
39:19
what I was thinking about in that
39:22
piece was that you, when you look
39:24
at older recessions, there, well, the further
39:26
back in history you go, just
39:28
the less comparable things are. And
39:31
financial crises can happen in
39:33
many different environments, but they
39:35
are just a much worse
39:37
thing to happen in a
39:39
more financialized economy. So they do become
39:41
in many ways a bigger factor when
39:44
there are a lot more
39:46
borrowers, a lot more lenders,
39:48
and a lot more interconnections within the
39:50
system. But also yeah recessions sometimes they
39:53
are just kind of a more random
39:55
outcome; economies don't grow forever, and
39:57
there are feedback loops where if there
39:59
is a growth shortfall in some sector
40:01
for whatever reason that can ripple out
40:03
into other ones. And in some
40:05
ways I think you can worry a
40:08
little bit less about recessions than you
40:10
used to have to worry about them
40:12
because the rates are not zero and
40:14
while inflation is not running you know
40:16
it hasn't been completely conquered it's not
40:18
nearly as salient as it was a
40:20
while ago. And I think one of
40:23
the indicators of that is actually that
40:25
people are specifically talking about the price
40:27
of eggs because eggs have been idiosyncratically
40:29
very expensive recently and If inflation were
40:31
a broader problem, there would just be
40:33
more product categories that we would talk
40:36
about than just eggs. So in some
40:38
ways, there is more room for the
40:40
government to react, but you can have
40:42
a recession that stems in part from
40:44
businesses just being really uncertain about whether
40:46
or not the capital investments they make
40:48
right now are worth it. And I
40:51
think that that's one of the negative
40:53
outcomes from tariffs is that Over a
40:55
long period, there would be adjustments. There
40:57
would be more factories in the US
40:59
building goods that were previously built outside
41:01
of the US and shipped here. And
41:04
so labor costs would be probably mostly
41:06
higher in the US, but your transportation
41:08
costs are lower. And depending on the
41:10
overall mix of what moves here, what
41:12
doesn't, you maybe end up with a
41:14
similar or only slightly higher cost structure.
41:16
And you're also stimulating demand if you
41:19
have more of those goods built by
41:21
American factory workers who are going to
41:23
spend most of their income. So you
41:25
can have that kind of adjustment. Things
41:27
can end up being kind of close
41:29
to normal and yes we pay higher
41:32
prices but we've secured important parts of
41:34
our supply chain and that's just a
41:36
worthy price to pay. You can also
41:38
have a case where the cost of
41:40
importing goods is higher because of tariffs.
41:42
Companies are considering relocating, but they also
41:44
ask themselves, well, if these tariffs cause
41:47
a recession and the recession means that
41:49
Republicans are unelectable in 2028, and the
41:51
first thing a Democrat does when they're
41:53
in office is eliminate all of these
41:55
tariffs, then I just have this big
41:57
factory in a place with more expensive labor,
42:00
and I will really wish that I'd
42:02
kept it in Malaysia or Vietnam or
42:04
Mexico or wherever. So that kind of
42:06
uncertainty can mean that the inflationary impact
42:08
of tariffs isn't very self-correcting. And it's
42:10
hard to say exactly what you do
42:12
about that other than you can make
42:15
your tariffs less expensive and less inflationary
42:17
if you also couple them with subsidies,
42:19
credit or otherwise for people who build
42:21
in the U.S. If it is just
42:23
much harder for environmental review to indefinitely
42:25
delay building a factory, then more of
42:28
those factories get built. Because it is
42:30
probably the case that if you decide
42:32
today that you're going to build the
42:34
next Toyota factory in the United States
42:36
rather than in some other country,
42:38
it's not necessarily the case that you
42:40
actually get that thing done and up
42:43
and running by the time Trump is
42:45
out of office. So there are a
42:47
lot of incentives to wait things out
42:49
and streamlining the process by which projects
42:51
get approval might actually be a way
42:53
to make the economy a little more
42:56
adaptable to that. Yeah, interesting. You know,
42:58
there was this sort of line of
43:00
thinking in the past few weeks around
43:02
like some people were trying to spin
43:04
this into a good thing or this
43:06
meaning the sort of stock market going
43:08
down. I think they're trying to say
43:11
things like, hey, this will help us
43:13
with our debt somehow or this will
43:15
sort of like clean the sewage out of
43:17
the system. I heard people say things
43:19
like that. Or, you know, there's a
43:21
difference between Wall Street and Main Street,
43:24
and this is just affecting Wall Street,
43:26
but sort of Main Street's not affected.
43:28
Or, you know, the stock market's not
43:30
the economy. So, like, what is, what
43:32
is the right way of thinking about
43:34
it? Is there a way, I assume
43:36
that that's all cope, basically, and that
43:39
everything that affects sort of Wall Street
43:41
also affects Main Street, but let's disentangle
43:43
this. Yeah, yeah, like if
43:45
you get fired from your job and
43:47
your first reaction is, well, the office
43:49
chairs were really uncomfortable. So this is
43:51
actually a good thing. Like it's obviously
43:54
cope. It is. Like all else being
43:56
equal it is probably better for the
43:58
market to be up than down. It
44:00
probably means that the US is producing
44:02
more things and that these are things
44:04
that people want such that they are
44:07
willing to pay a markup for them
44:09
so these companies are profitable and people
44:11
have confidence that the growth will continue
44:13
so it's not just that earnings are
44:15
high, but multiples are rising, etc.,
44:17
but it's also the case
44:19
that yeah, sometimes there are political
44:22
decisions you can make that do redistribute
44:24
some upside from consumption to
44:26
investment or from labor to asset holders
44:28
So yeah, a rising stock market is
44:30
not always a good sign. And it's
44:32
always going to be out of sync
44:35
with the rest of the economy too.
44:37
So if you look at a chart
44:39
of the S&P 500 versus unemployment rates,
44:41
you'll definitely find a lot of cases
44:43
where they're moving in opposite directions, but
44:45
it's because S&P is going down in
44:47
anticipation of recession, or unemployment is still low
44:50
because the recession hasn't hit yet. And
44:52
then unemployment starts going up, the market's still
44:54
going down, and then at some point,
44:56
market starts rallying, and unemployment is still
44:58
rising. And so you look at this,
45:00
and you're like, what are people doing?
45:03
But what they were doing in March
45:05
of 2009 was saying, unemployment used to
45:07
go up by 0.4 to 0.5 percentage
45:09
points per month and now it's only
45:11
0.3 to 0.4 so we know that
45:13
things are actually getting worse more slowly
45:15
than before and you know housing prices
45:18
they're dropping more slowly than they were
45:20
a couple months ago and so we
45:22
can kind of see that at some
45:24
point the second derivative means that
45:26
there will be growth again. And you
45:28
just don't want to wait until
45:31
unemployment is back below 5% and say
45:33
okay now is the time to get
45:35
into equities. So you miss a
45:37
lot of the upside there if you
45:39
see that there's a path towards that
45:41
then that's when you buy So yeah,
45:43
there is a lot of cope
45:46
here, because it's not as if
45:48
what's going on is that we have
45:50
these policies that are just redistributing a
45:52
lot of what would have been money
45:54
going towards buybacks into consumers' pockets. That
45:56
is just not the case what we
45:59
see instead is that there's a set
46:01
of policies that do cause a lot
46:03
of uncertainty, they cause some deadweight loss,
46:05
they do have some upsides over time,
46:07
especially if implemented well, and it's very
46:09
unclear what those policies will ultimately be,
46:11
but in the meantime, I think pretty
46:14
much anyone who is analyzing stocks in
46:16
the S&P 500 prefers the policy mix
46:18
of a couple months ago to the
46:20
policy mix of today, at least in
46:22
terms of this year's expected earnings per
46:24
share. Yeah, so there's a lot of
46:27
cope there. Yeah, you really, you really,
46:29
really don't want to hear people in
46:31
a policy-making capacity working for the U.S.
46:33
government saying, well, the upside is we'll
46:35
be able to refinance our debt really
46:37
cheaply because, you know, rates
46:39
go to zero when there's a depression.
46:42
So, yes, it's true, but, you know,
46:44
that's not going to be the main thing
46:46
people are talking about. Like, the headlines
46:48
in September of 2008 were not
46:50
treasury interest as a percentage of total
46:52
federal spending likely to drop over the
46:55
next several years. That's not what people
46:57
were really fixated on. In fact, when
46:59
that happens, what the government does, which
47:01
is a reasonable thing to do in
47:03
a variety, but not every, economic model,
47:05
and it's the one that I relatively
47:07
subscribe to, the reasonable thing to do
47:10
is, okay, money is free, we can
47:12
borrow a lot of it, and let's
47:14
spend it very quickly. We're basically taking,
47:16
if you think of the marginal treasury
47:18
bond buyer as someone who is saving
47:20
rather than spending, you want to take
47:23
the money that they put into treasuries
47:25
and turn that into spending rather than
47:27
saving. So you're recycling that money towards
47:29
people who are lower income, have a
47:31
higher marginal propensity to spend, and all
47:33
of this stuff, it does just, the
47:35
redistribution just happens as a matter
47:38
of policy, and it is kind of
47:40
a social norm that we don't want
47:42
a strictly fair
47:44
set of outcomes for every situation. But
47:46
it is also true that if we're
47:48
thinking about those issues of redistribution
47:50
versus growth and the tradeoff between the
47:53
two, it's just a whole lot better
47:55
to be thinking about those at a
47:57
time when the economy is going reasonably
47:59
well and it could be going a
48:01
little bit better, versus at a time
48:03
where you just don't know which critical
48:06
institution will fail next and you don't
48:08
know which major industry in the US
48:10
is going to be going through the
48:12
bankruptcy courts. That's just a time when
48:14
you have a lot of really narrow
48:16
questions around this many tens of thousands
48:18
of people are not getting the next
48:21
paycheck that they thought they were getting
48:23
and what do we do about that
48:25
and, you know, those questions have
48:27
answers that sometimes involve lots
48:29
of financial policy changes and sometimes it's
48:31
like let's make sure that we know
48:34
how many police officers are available to
48:36
do some overtime if things get
48:38
really bad. So, yeah, you just
48:40
don't want to be making economic policy
48:42
when one of the considerations is,
48:44
like, how much tear gas we actually have
48:46
right now. And I don't know, we're
48:49
not anywhere near there, and there are
48:51
a lot of economic missteps that can
48:53
be made that switch the economy from high
48:55
growth to less high growth. And even
48:57
if there is a recession, the recession
48:59
is partly the economy calling Trump's bluff
49:02
on how high a tariff should be.
49:04
And I don't know exactly what Trump
49:06
would do in a situation like that,
49:08
but it's entirely possible that he would
49:10
find some way to gracefully save face.
49:12
And this is actually a case where
49:14
the rest of the world would really,
49:17
really, really want to tell Trump that
49:19
he won and that he got everything
49:21
he wanted and, you know, that they
49:23
surrender, etc., because the rest
49:25
of the world really needs U.S. demand during
49:27
a global recession. The U.S. is a
49:30
large enough share of global GDP that
49:32
a recession in the U.S. is probably
49:34
a global recession unless China deliberately intervenes
49:36
in order to cause growth, which has
49:38
its own set of problems. So
49:41
what you'd probably see in that
49:43
case is that Trump would be getting
49:45
a lot of credit from world leaders,
49:47
which very few really informed observers would
49:50
think was all that credible, but in
49:52
the end, it would be, you know,
49:54
they would be telling Trump, you got
49:57
it right, we were exploiting you in
49:59
terms of global trade; now, please, lower
50:01
these tariffs somewhat and start spending more
50:04
money so that the world economy can
50:06
grow again. Yeah. Is it your view
50:08
in a world where we spend
50:10
less on Ukraine and Europe spends
50:12
more and in general kind of like
50:15
re-industrializes a little bit, is that
50:17
not in the US foreign policy
50:19
interest to have Europe sort of reinvesting
50:21
in their own sort of capabilities
50:23
there? What are your thoughts? An
50:25
empire is an asset. It's also a
50:27
liability. And sometimes the balance between those
50:30
two can slightly shift, such that
50:32
you suddenly want to be way
50:34
less of an empire. And I think
50:36
the US view for a while
50:38
was there is no other country
50:40
that can credibly defend the free world.
50:42
And so the US has to do
50:45
it. And there are advantages to
50:47
ultimately calling the shots on all
50:49
matters military in the US sphere of
50:51
influence. And then at other times,
50:53
yeah, there's just less
50:55
upside to that, and the countries in
50:57
Europe are able to take care of
51:00
themselves. Some of them do actually
51:02
spend more heavily on the military,
51:04
but most of them don't, and it
51:06
kind of shows. And so I
51:08
think it probably
51:10
does weaken the US, in the
51:12
sense that the US
51:14
can really throw its weight
51:16
around more if it is the only
51:19
way that military force can be deployed
51:21
at a meaningful scale outside of
51:23
just defending those countries, and even
51:25
then it's kind of iffy, but maybe
51:28
that's just not really worth it.
51:30
And I also think it's just
51:32
kind of weird for there to be
51:34
a handful of real countries and then
51:36
a lot of countries that are
51:38
on this weird continuum.
51:40
I'm just not entirely clear if,
51:43
a hundred years from now when
51:45
historians are drawing world
51:47
maps of what the world looks like
51:49
and what the borders were in say
51:51
the 1990s, do they actually... I
51:53
would assume that the right thing
51:55
to do is you color the US
51:58
one color and then you use
52:00
a slightly different shade of that
52:02
color to color in most of Western
52:04
Europe and Japan and Australia and
52:06
a few other places because they
52:08
don't really have the scale to make
52:10
a lot of independent foreign policy decisions
52:13
like a lot of their foreign
52:15
policy happens implicitly through the US and
52:17
within the bounds that the US sets.
52:19
I'm not sure that America actually
52:21
is all that good at that.
52:23
And certainly, like, I
52:25
don't think America thinks of itself as
52:28
being this imperial core with a
52:30
bunch of vassal states of varying
52:32
degrees of independence under it. So I
52:34
think if you have an empire
52:36
that's run by a country that
52:38
doesn't think of itself as running an
52:40
empire, then that's probably going to lead
52:43
to some dumb mistakes. So if
52:45
nothing else, it just aligns perception
52:47
with reality by shifting reality more towards:
52:49
yes, Germany is a real country,
52:51
it has a military, you know,
52:53
not just a flag but a military
52:55
too, and in the event that Germany's
52:58
interests are best served through the
53:00
use of force,
53:02
you know, they will call the US
53:04
president to tell him what they're
53:06
going to do, and not call
53:08
the US president and ask him what
53:10
he's going to do. Yeah, and,
53:12
okay, so that's Europe. And by
53:14
the way, you probably saw the JD
53:17
Vance group chats that just leaked.
53:19
It was very funny where they
53:21
basically let the editor of the
53:23
Atlantic in by accident. It was pretty
53:25
wild. Yeah, and he's saying how
53:27
he hates... Yeah, I was thinking
53:30
of just dropping random messages into somebody's
53:32
group chats saying things like, yeah,
53:34
we'll probably be launching the missiles
53:36
in exactly two hours. Oops, wrong
53:38
chat, sorry. That's so funny. But Signal
53:41
does make it slightly too easy to
53:43
send a message to the wrong
53:45
person. So I do try to start
53:47
one-on-one Signal chats with just idle chit-chat
53:49
that makes me comfortable with the
53:51
fact that I'm actually talking to
53:53
the person I think I am, because...
53:56
There are just a lot of
53:58
people with the same first and
54:00
last initial. And, especially if
54:02
you have, like, an urgent thing
54:04
that you need to tell this
54:06
person because they specifically would find it
54:08
extremely funny, that is often the case
54:11
where you really want to make
54:13
sure you have the right person
54:15
first. Yeah, it is. That's like every
54:17
journalist's dream. You just get accidentally
54:19
added to the most powerful group chat
54:21
in the world. It's hilarious. We just
54:23
spoke about Europe. How about China and
54:26
particularly as it relates to AI
54:28
policy, you know, the Sam Altman
54:30
interview with Ben Thompson of Stratechery, he kind
54:32
of hinted that OpenAI might
54:34
be doing a bit more open
54:36
sourcing, going back to its roots
54:38
in some ways. And so maybe, you
54:41
know, seeing that, because China was
54:43
able to sort of, you know,
54:45
create DeepSeek, it seems like there's
54:47
less of a desire to not
54:49
be open source at the moment.
54:51
I'm curious what you think about that
54:53
debate,
54:55
sort of the merits and
54:57
tradeoffs of, you know, the China AI policy
55:00
options. Yeah, so I think if you
55:02
hold the models constant, the advantages
55:04
you could have
55:06
are: one, distribution; two, you have access
55:08
to the best hardware; and three,
55:10
you're applying these models to data
55:12
that you can actually use to make
55:15
decisions you really care about. And China
55:17
is obviously pretty far behind on
55:19
the hardware question. They do seem
55:21
to be not necessarily catching up, but
55:23
getting closer, and I think that
55:25
is an important thing to watch.
55:27
It's been historically quite a struggle for
55:30
China to turn dollars of semiconductor industry
55:32
subsidies into working chips that are
55:34
close to what TSMC can do,
55:36
but they've been getting closer on that
55:39
front, and then there are still
55:41
apparently many ways for them to
55:43
get access to some Western hardware. So
55:45
they're still able to do some of
55:47
that, but then on the distribution
55:49
side... There was a story I
55:51
think in Rest of World recently
55:54
about how many different Chinese companies
55:56
and government entities are trying to
55:58
use DeepSeek's models everywhere they can.
56:00
And I found that notable, just because
56:00
one
56:02
of the obstacles to AI having a
56:04
big economic impact is that people need
56:09
to actually decide to use it
56:11
and they need to rethink a
56:13
lot of different workflows based on the
56:15
fact that they have these tools.
56:17
Like there are just a lot
56:19
of things where, if you thought
56:21
about doing something and you thought to
56:24
yourself, well, this might be worth
56:26
doing, but I
56:28
would have to read and comprehensively analyze 10,000
56:30
pages of PDFs in order to
56:32
decide if it was actually worth
56:34
taking the next step, and that upfront
56:36
cost is so high that I'm just
56:39
not going to bother, well, analyzing
56:41
large amounts of unstructured text and
56:43
pulling out specific insights is something LLMs
56:45
are quite good at. So it
56:47
does expand the scope of things
56:49
you can do,
56:51
but only if you're
56:53
thinking about it in terms of
56:55
the tools that you now
56:58
have available. So the distribution side is
57:00
interesting and then the data side,
57:02
the Chinese government just collects
57:04
a lot of data and they have
57:06
a lot of questions that other
57:08
governments don't have about what people
57:10
are saying and who they're associating with
57:13
and some of the things that they
57:15
might do, etc. So there is
57:17
a lot of room for China
57:19
to be just a more effective authoritarian
57:21
country with access to even models
57:23
at their current level. I think
57:25
one possibility is just they get the
57:28
models good enough that dissent is not
57:30
really an option. The CCP is pretty
57:32
sure that it will be able
57:34
to celebrate the
57:36
100-year anniversary of winning the Chinese
57:38
Civil War in style and at
57:41
that point it becomes less urgent for
57:43
them to compete. So that is
57:45
one possibility. And beyond that, it
57:47
does seem like there's this brief window
57:49
where the last generation or two of
57:52
hardware is still good enough and
57:54
if you are constrained on that
57:56
and you also have just a really
57:58
cracked team of people who love
58:00
doing low-level optimization, that you can
58:02
make up for some of that lack
58:04
of hardware access. But then the problem
58:07
that you run into after that
58:09
is, okay, you've identified this set of
58:11
low-level optimizations, you can get X percent
58:13
more out of a given chip, and
58:15
you've probably given a lot of your
58:17
American peers some ideas for what to
58:19
do with their much larger supply of
58:22
now much faster chips. So, I don't know. I
58:24
could see, like, in general, if
58:26
you open source models, hardware
58:28
access
58:30
becomes a more important
58:32
competitive factor, because that's the difference
58:35
if everyone's running the same model.
58:37
It's just who's running it faster
58:39
and who needs fewer watts. So
58:41
at that point, it actually becomes
58:43
in the US interest to have
58:45
really good open source models. And
58:47
if you do build models that
58:50
are open source, or really open weight,
58:52
open-weight models that are trained on
58:54
a corpus that does tend to serve
58:56
U.S. interests and that will censor the
58:58
things that you want censored basically and
59:00
not censor the things that you'd actually
59:02
rather the average Chinese netizen have
59:04
a little bit more familiarity with, then
59:06
maybe that actual open source arc is
59:08
a good thing. And it's certainly something
59:10
where it gives Sam Altman some political
59:12
cover, if he can say that there's
59:14
a model that is better than anything
59:16
DeepSeek has, and yes, it knows
59:18
who Tank Guy is, and yes, it
59:20
knows which beloved children's character Xi Jinping
59:22
has an uncanny resemblance to. So this
59:25
model will be running on every smartphone.
59:27
Smartphones tend to move across borders pretty
59:29
frequently, and therefore this is actually U.S.
59:32
soft power being projected into China. And
59:34
then that thesis actually runs into the
59:36
problem of, okay, if China just wants
59:39
to get models, widely deployed models that
59:41
are good enough for keeping an authoritarian
59:43
state in power, well, if there are
59:46
better models out there, does that mean
59:48
that the state is actually getting weaker
59:50
unless it continues to try to advance
59:53
the state of the art? That, again, is a
59:55
very interesting open question, and it will
59:57
be fascinating to see what beliefs
1:00:00
are implied by the things China does in
1:00:02
the AI policy realm in the
1:00:04
coming years. This seems like a
1:00:06
great place to end. As always,
1:00:08
this has been a wide-ranging
1:00:10
conversation, and until next time. Yes,
1:00:12
indeed. Great chatting. Thank
1:00:14
you. Upstream with Erik Torenberg is a show from
1:00:16
Turpentine, the podcast network behind
1:00:18
Moment of Zen and
1:00:21
Cognitive Revolution. If you
1:00:23
like the episode, please leave
1:00:25
a review in
1:00:27
the Apple
1:00:29
store.