Episode Transcript
0:00
Amica Insurance has partnered with Courageous
0:02
Studios, the branded content studio of Warner
0:04
Brothers Discovery, to create a short film series
0:06
called What You Leave Behind, real stories about
0:08
people who've lost loved ones and how a meaningful
0:11
object left behind can continue to represent
0:13
their legacy. Visit Amicawhatyouleavebehind.com
0:16
to watch the videos and submit your own story.
0:24
And now from the Institute of Politics at
0:26
the University of Chicago and CNN
0:29
Audio, the Axe Files with
0:31
your host, David Axelrod.
0:33
I had a public
0:35
conversation last week with the eminent journalist
0:37
and biographer, Walter Isaacson.
0:40
It was at the Chicago Humanities Festival
0:42
and the focus was his depthful new
0:45
book, Elon Musk, about the
0:47
brilliant, controversial and indisputably
0:50
impactful inventor and entrepreneur.
0:53
But true to the spirit of the Axe Files, we also
0:55
spoke about Isaacson himself, about
0:57
his life and his gifts and
1:00
his fascination with the nature of genius.
1:03
Here's that conversation.
1:08
Walter, there are a thousand things that I could ask
1:10
you about Elon Musk. I want to start
1:13
off by asking you something about
1:15
you. You
1:18
are a genius at writing about geniuses. One
1:21
of the tricks is if you write about geniuses,
1:23
people think you're a genius and you can fake them
1:26
out. That's very good. I
1:28
want to know a little about you. I know your dad was an engineer.
1:31
I think that
1:33
he had something to do with the
1:35
Superdome construction
1:38
and a lot of other major projects.
1:40
Did he inculcate you? Because you read this
1:42
book. It's 605 pages
1:45
or something. I say it's
1:47
hard to lift and hard to put down
1:50
if you start reading it. You
1:54
kind of
1:55
geek out in a wonderful way about
1:57
the technology and the science. Well,
2:00
I love technology and science, and
2:02
Steve Jobs at one point said to me, those
2:04
who stand at the intersection of the sciences
2:07
and humanities are those
2:09
where creativity
2:11
happens. And that's why he
2:14
suggested I do Leonardo da
2:16
Vinci, because his Vitruvian man is
2:18
the symbol of the intersection of the humanities
2:21
and the sciences. And
2:23
it comes from my father and my
2:25
uncle and my brother and my grandfather,
2:27
who are all engineers. So
2:30
what happened to you? I know. Well,
2:33
I'll take that question seriously, but it
2:35
does happen that
2:38
somehow or another, if you're in engineering,
2:41
you think, okay, maybe
2:45
the next generation will be in the humanities
2:47
that we will have earned that right. And this
2:49
is a humanities festival. And
2:52
we think of the
2:54
importance of the humanities, which
2:57
my father truly believed
2:59
in. He was a very geeky
3:01
engineer and scientist, but he subscribed
3:04
to Book of the Month Club and Saturday
3:07
Review. You're old enough to remember what that
3:09
was. Sadly, yes. Yes, and
3:12
the point was he wanted us, his
3:15
kids and grandkids to be humanists. I
3:18
followed that path, because I think
3:20
the humanities are important, but
3:23
then I began to rankle when people
3:25
in the humanities would give lectures
3:27
about how you had to know the difference, but
3:29
they'd be appalled if you don't know
3:31
the difference between King Lear and Macbeth,
3:34
but then they would happily admit
3:37
not to know science or math.
3:40
They wouldn't know the difference between an integral and
3:42
a differential equation or a capacitor
3:44
and a transistor. And I thought- There'd
3:47
be a quiz at the end. And I
3:49
thought, because I grew up in the
3:51
home in New Orleans, well, we made
3:54
radios. We actually fixed
3:56
television sets that used vacuum
3:59
tubes and then switched transistors
4:01
in for them. I had a feel
4:03
for circuits. Ezra Webber,
4:05
one of my students, is sitting over there. I teach
4:07
a course in the digital revolution in
4:10
which I try to get
4:13
students to understand what a circuit
4:15
is, what on-off switches are, why
4:18
you can do logic with yes, no, on-off
4:20
switches in a circuit. These are things
4:23
we've forgotten in this
4:25
day and age, and that's why
4:27
I like writing about people,
4:30
Steve Jobs, Jennifer Doudna,
4:33
and then Elon Musk, who
4:36
do have a feel for
4:38
the inner workings of technology. And
4:41
to get to Musk in particular, when
4:43
I started this book,
4:47
thanks to Antonio Gracias, who's a great
4:49
Chicago person, he put us together,
4:53
I thought, all right, here's a guy doing
4:55
sustainable energy, solar roofs,
4:57
battery packs, electric cars, rocket
5:00
ships, and satellites. How cool
5:02
is that? And then, of course, he buys
5:05
Twitter, which kind of messes up the narrative.
5:07
Yeah, yeah. Let
5:09
me ask you, we'll
5:11
get to that. Let me
5:13
ask you one question more about your biography
5:16
that relates to Musk. You
5:19
grew up in the South at a very
5:22
significant time. You grew up in an era
5:24
of segregation. I know you've
5:26
written and spoken about it, and
5:30
the height of the Civil Rights Movement.
5:33
He grew up in an apartheid South Africa.
5:36
I don't think there's anything in the book about that.
5:38
And I was wondering whether you, having had the
5:40
experience you had, had any kind of conversation
5:43
with him about that? Well, there are some things
5:45
in the book. He does go to the anti-apartheid
5:48
concerts. At
5:50
one point, a train door opens, he's
5:52
with his brother, and there's a guy with a knife
5:55
sticking out of his head. Yes, yes, there
5:57
was. And the blood on the shoes. His father,
6:00
who's not, in my mind,
6:02
an admirable character, but he did
6:04
run in an anti-apartheid
6:07
city council election and
6:09
won in Pretoria. But did you talk to him about the
6:11
experience of living under,
6:14
did it impact on him? The
6:17
violence impacted and the danger
6:19
impacted, and of course he leaves
6:22
South Africa in order not to have to join the
6:24
army and also to get to Canada
6:26
as it turned out. How about the injustice of
6:29
it? No, I think
6:32
he, I mean this is a problem that you'll
6:34
see throughout the book that's reflected to today,
6:37
is that he has these epic hero
6:40
visions of himself that come from
6:42
being a lonely, socially
6:45
awkward kid. Yes. He retreated
6:47
into this world of science fiction.
6:50
Exactly. And his father was
6:52
brutally, psychologically brutal to
6:55
him. He was beaten up as a kid.
6:57
You're running through my notes here, man. Okay. Let
6:59
me say something. Okay, go ahead. No, no, no. But
7:02
anyway, that makes him retreat
7:06
into this almost as if he's making
7:08
himself a character in a
7:10
video game in which he gets to play
7:13
himself. Yes. And so
7:15
his ideas of truth and justice
7:19
are these almost Captain Underpants,
7:23
epic X-Men comic
7:25
book things, which to
7:28
some extent are admirable, which are
7:31
three big epic quests. One is
7:33
sustainable energy on the planet. Number
7:35
two is making us multi-planetary, back
7:37
to space again. And third is protecting
7:40
us against artificial intelligence and
7:42
robots gone rogue. But
7:45
those were his
7:47
epic quest. He
7:50
has a great feel
7:52
for engineering. He has a fingertip feel
7:55
that would have exceeded my father by two
7:57
orders of magnitude at looking
7:59
at the... material properties. He talked
8:02
about having Asperger's. It's an interesting
8:04
thing how many people relate
8:06
to this stuff. Which means,
8:10
and there are many,
8:12
many forms of autism spectrum
8:14
disorder, which is the real name for it, but
8:18
in his case it means he doesn't have
8:20
deep emotional receptors.
8:23
He doesn't look people in the eye. He doesn't
8:26
have a feel for the,
8:29
or empathy for incoming
8:31
or outgoing emotion, which is why
8:34
he shouldn't have bought Twitter. I mean, you know, you
8:36
should have stuck to batteries and rocket ships.
8:39
Okay. Now that you've run through my entire list
8:42
of thank you very much for coming.
8:45
Well, I'll just start asking
8:47
Musk questions. Let me, let me,
8:52
let's start from his dad
8:55
because if there is a character
8:57
in the book who looms almost
8:59
as large as Musk himself, it's his father.
9:02
He's a, he's a specter that hangs over
9:05
every, almost every page of this.
9:07
Talk about him and
9:09
the influence that
9:12
he had on Musk's
9:14
life. At the very beginning
9:16
of this process, Maye
9:19
Musk, the mother, who had obviously
9:21
divorced Errol many,
9:24
many years ago said, here's
9:26
the story. The danger for
9:28
Elon is that he becomes his
9:30
father. Now this is a pretty old
9:33
trope in mythology. Yeah. It's
9:35
Luke Skywalker. And in life,
9:38
except for me, my
9:41
aspiration would be to be
9:43
my father. But I'm lucky.
9:46
I mean, I just had a kindly father. Do you
9:48
think you did? No, he
9:50
would rather I be an engineer. I'm sure.
9:55
But that notion
9:58
of fighting the dark side, of the fall, of
10:01
imagining yourself as Luke Skywalker
10:03
and discovering Darth Vader as your father.
10:07
That is an old mythological
10:10
theme. And what
10:13
happens to Musk is when he's
10:15
beaten up at the playground, one
10:17
point has to go to the hospital for almost
10:19
a week. He was bullied as a kid. Bullied and
10:21
beaten up because he was so socially awkward
10:24
and that's a euphemism for just,
10:26
you know, not being able to deal with people.
10:28
Sadly, a common story as well.
10:31
Yeah, we could get there. I'm
10:33
gonna take a quick detour here, which
10:35
is I have
10:36
been stunned and I can
10:38
use a couple of names like Andrew Yang who
10:40
interviewed me. They all say, I have a kid. Yes.
10:43
Like this. I'm talking to Andrew about it. So do
10:46
I. You know, in my family. I think almost anybody
10:48
in this room has somebody in the family. And
10:50
this notion of
10:53
being somewhere on the autism spectrum, you
10:55
can watch how different people
10:58
channel it. But
11:01
I won't use a name, but cable
11:03
TV
11:05
host, you know, well says,
11:07
my kid is that way. And I always put my
11:10
arm around him at the baseball game. And when they give
11:12
the popcorn, I say, look the guy in the eye and say,
11:14
thank you. He coaches him, you know. His
11:18
father was the opposite. Elon gets
11:20
beaten up like this all the
11:22
time. And at one point after he comes back from
11:24
the hospital, his father makes him stand
11:26
in front of him erect for
11:29
like two hours almost. While
11:32
he tells Elon that he's stupid, he's worthless, it
11:35
was his fault, and takes the side of the people who beat
11:37
him up. So this makes him withdraw,
11:39
obviously, and
11:42
have these demons in his head. Now,
11:45
all of us have some demons that
11:47
we don't know about.
11:50
All of us have some demons
11:52
that probably come from childhood. That's the
11:54
oldest theme in biography is,
11:56
you know, Einstein growing up
11:59
Jewish in Germany. Ben Franklin running away
12:01
to Philadelphia, even Jennifer
12:03
Doudna. The question
12:05
is how do you harness those demons and to what
12:08
extent do those demons harness
12:10
you? And that's the theme of this
12:12
book because it works both ways in Musk's
12:15
case. Yeah, you
12:18
literally did sort of, you weren't
12:20
peeking at my notes in the green room, were you? No,
12:23
no, I'm sorry there. But the
12:25
things that really came through were
12:28
what you said. There's almost a messianic
12:31
quality to him if
12:33
SpaceX fails, we'll never be
12:36
multi-planetary. I know.
12:38
It's odd. If Tesla
12:41
fails, the planet is doomed. If
12:44
I don't take over this AI issue, we're
12:46
gonna be overwhelmed by the robots
12:49
and computers. I mean, he
12:51
sees himself as the thin blue line between
12:53
humanity and disaster.
12:56
He has this epic sense
12:58
of his missions. And at first
13:01
I thought it was just the pontificating
13:04
you would do on a podcast or at a pep
13:06
talk for your team. But then
13:08
over and over again, I'd see him lapse
13:11
into where he'd get really angry in
13:13
South Texas at the launch site for
13:15
Starship. If you have the book, you
13:17
can show the back cover. Largest
13:20
movable object ever made, and he's trying
13:22
to get it into space. Here, just pass it around. Yes. And
13:26
he would just start murmuring to himself,
13:29
if I don't force this, we will never get
13:32
to Mars. We'll never be multi-planetary.
13:34
And likewise in 2008, when SpaceX has
13:39
destroyed three rockets. Both of them almost
13:41
went down, Tesla and SpaceX. December 2008,
13:44
both run out of money. He runs out
13:46
of all of his money. He runs out of all of his brother's
13:48
money. His wife Talulah Riley's
13:51
parents are saying we'll sell our house. He's
13:54
writing personal checks to keep Tesla
13:56
and SpaceX alive. One
13:59
of the people running SpaceX says, hey,
14:01
give up on one or the other. And he says,
14:03
what you quoted, which is, if
14:06
Tesla doesn't work, the era
14:08
of getting into electric vehicles is going to
14:10
be set back. Because GM
14:12
and Ford had just gotten out of the business. And
14:15
he says, and if SpaceX fails, we'll
14:17
never go back into space
14:19
again. We've given up on the shuttle. We've
14:22
given up on going to the moon. So
14:24
he has this epic sense.
14:27
And I know it sounds odd,
14:30
and if you read the book, you can disagree with
14:32
me. But after a while,
14:35
I believed he believed it. This
14:38
isn't just, well, whatever.
14:41
It was he has
14:43
this mantra in his head that
14:46
if we don't start exploring
14:49
other planets, if we remain confined
14:51
to this Earth, and we allow this Earth to be destroyed
14:54
for many reasons, including not having sustainable
14:57
energy, this is
14:59
the epic quest he has to be on.
15:02
Yeah. So he did become
15:04
his father in one way, which
15:06
is, he does have this dark
15:09
side that is legendary. Obviously,
15:12
you saw it. You wrote about it. You talked to
15:15
people. But he can
15:17
turn his employees and even
15:20
his friends and relatives into
15:22
that little boy standing in front of his father.
15:25
Absolutely. And his father
15:27
was Dr. Jekyll and Mr. Hyde. He'd switched
15:29
from being a charming engineer to
15:32
being cold and
15:34
psychologically brutal, never violent,
15:38
never even raising his voice. It's
15:40
that cold monotone brutality.
15:43
And Musk has
15:45
multiple personalities. Elon does,
15:48
which is he can be charming. He can be inspirational.
15:51
He can be funny. And then he goes into
15:53
what Grimes, one of his girlfriends, calls
15:56
demon mode. And you watch
15:58
it happen. And he goes really
16:00
dark and he'll just be coldly
16:03
brutal to the people in front
16:05
of him. And, uh, it,
16:10
it happens mostly on
16:12
engineering issues, but now, unfortunately
16:15
on politics. Did he
16:17
ever go dark on you? No,
16:19
I kept, people kept saying, man, it's going to happen. It's going
16:21
to happen. I kept a
16:24
good line between me and him. I wasn't
16:26
his pal. We didn't go out, you know,
16:28
late at night. I didn't try. I
16:30
sat in the corner for two
16:32
years. Every meeting, with
16:35
a few exceptions. Only at, uh,
16:37
two classified national security meetings
16:40
did he ask me to leave the room. Let's
16:42
just interrupt you to ask, what does it say about him?
16:45
Like when I was in politics and
16:47
I had candidates or presidents
16:49
or whatever, and people said, I just
16:51
want to sit in the corner in the room and watch. I
16:54
said, like hell you will. Uh,
16:58
and, uh, and so what does it
17:00
say about him? I mean, there is
17:02
a radical transparency to
17:05
him and also this epic ego
17:07
and, um, superhero
17:10
comic quality that he
17:12
just wanted. Nothing
17:15
to be off limits. His text
17:17
messages, his emails, every
17:20
meeting, dinners at night.
17:23
Uh, and I was stunned
17:26
at two things, the openness
17:28
and transparency. And you've read the book
17:30
and you just go, whoa, why did he allow
17:32
Walter there? Yes. And, uh,
17:35
that open transparency and the fact that
17:37
he never turned the flamethrower
17:40
on me. And, uh, have you heard from him
17:42
since the book came out? Yeah, I
17:44
did, uh, Lex Fridman's podcast
17:47
down in Austin. I know, you know of
17:49
him. And that was about four
17:51
weeks ago. One of the, there were two rules
17:54
when I did this book and, uh, we
17:56
had a couple of hour discussion. I said,
17:58
I don't want to do it based on five or
18:01
ten interviews or 15. I want to
18:03
be by your side at all times. And
18:05
he said, okay. Just like that.
18:07
I go, wow. And then I said, you
18:10
have no control over this book. You're not
18:12
even going to be able to read it in advance.
18:15
I'm not going to send you a copy. And he
18:17
said, okay. So about
18:20
two or three weeks ago, I guess maybe four, the
18:22
book came out three or four weeks ago, the
18:24
week the book was about to come out, I'm down
18:27
in Austin doing Lex Fridman's podcast.
18:30
And I've not sent Elon the book,
18:32
but other people have it, meaning the reviewers
18:35
have it and stuff. And so I thought he may have it. And
18:38
I went out to dinner with Fridman and
18:40
Musk pops up. And so
18:42
we all have dinner. On the way
18:45
to the parking lot, he says, I haven't
18:47
read the book, should I? And I said, no.
18:50
And he laughs and says, okay. And
18:53
then about two weeks ago, I happened
18:56
to cross, well, I crossed paths with
18:58
him again at a conference in Aspen
19:00
where I used to work. And somebody,
19:03
Gayle King said, have you read the book?
19:05
And he said, no, I was in a parking lot with
19:08
Walter. He told me not to read it. So
19:12
you did have influence. Yeah, right.
19:14
I wish I could tell him not to do some other things. What
19:18
the, some of the reviews
19:21
were hard on you, accusing
19:24
you of sort of fanboying him
19:27
and not being hard enough on
19:29
him. You read the book,
19:31
I think it's not fanboying.
19:34
Well, every question I've asked has come from
19:36
what I read in your book. Yeah. So
19:39
no, it is a valid, I didn't
19:41
mean to push back too much. I certainly
19:43
don't think I fanboy because 40% of
19:46
the people in this country,
19:49
in this room probably, hate him and 40%
19:51
think he's a
19:53
super genius who's getting us to different
19:55
planets. And I had to say,
19:57
wait a minute, you can hold both things at
19:59
the same time. He's actually able,
20:02
where NASA isn't and Boeing isn't, to
20:04
get astronauts to the space station. And
20:06
just this week, the Falcon
20:09
Heavy did something NASA couldn't do, which
20:11
is launch a mission to that asteroid.
20:13
I mean it's just a science thing. You got
20:15
to hold that in your head and hold in the
20:17
head the qualities that
20:20
begin with a word, begins with
20:22
A. Yeah. But
20:25
in the book, I mean, but there's a valid criticism.
20:29
That's me and Michael Lewis, who I grew
20:31
up with in New Orleans and who did Sam Bankman-Fried,
20:34
which is, you're riding right next to somebody
20:36
and you keep explaining and you're understanding
20:38
how he is. Does that mean you're excusing
20:41
or you're sugarcoating? I tried
20:43
hard not to sugarcoat and I
20:45
know Kara Swisher, my friend and others,
20:48
we've had this conversation.
20:50
Yeah.
20:53
Oh, you didn't come down hard
20:55
enough on him. And I
20:58
said, well, I tell the story.
21:00
So if you're one of those people who doesn't like
21:02
certain traits of his, you're going to have about 10
21:05
times more ammunition after you read this book
21:07
because I don't sugarcoat those stories. And
21:10
likewise on the engineering stories, I don't.
21:13
We're going to take a short break and we'll be right
21:15
back with more of the Axe Files. For
21:22
two decades, FBI agent Robert Hanssen
21:24
sold secrets to the Kremlin. He violated
21:27
everything that my FBI stood for.
21:29
People died because of him. Hanssen was the most
21:32
damaging spy in FBI history and his
21:34
betrayals didn't end there. Do I hate
21:36
him?
21:36
No, I don't hate anyone, but his
21:39
motive. I would love to know what his
21:41
true motives were so I can get that out
21:43
of me.
21:43
How did he do it? Why?
21:46
Listen to Agent of Betrayal, the double life
21:48
of Robert Hanssen, wherever you get your podcasts.
21:52
Amica Insurance
21:53
believes life insurance is about more than just
21:56
financial protection. It's a promise that
21:58
our loved ones will be supported even... when we're
22:00
no longer there to provide for them. It's
22:02
more than a policy, it's your legacy.
22:04
That's why Amica Insurance partnered
22:07
with Courageous Studios, the branded content
22:09
studio of Warner Brothers Discovery, to
22:11
create a short film series called What You Leave
22:13
Behind. Real stories about people who've
22:15
lost loved ones and how a meaningful object
22:18
left behind can continue to represent
22:20
their legacy. In one story, you'll
22:22
hear how a drum set reconnected a man with
22:24
his late father through the power of music. In
22:27
another, you'll learn about how coming into
22:29
possession of his late father's camera changed
22:31
the trajectory of one man's life. Each
22:33
story investigates a special item and
22:36
the legacy it represents. Visit
22:38
amicawhatyouleavebehind.com
22:40
to watch the films and submit your own story
22:42
to honor a loved one and what they've left behind
22:45
for you. That's amicawhatyouleavebehind.com.
22:48
As Amica says, empathy is
22:51
our best policy. And now, back to the show. It
22:57
is an extraordinary story. His
23:00
prodigious talents
23:00
are real. And the
23:04
thing that was striking was,
23:06
I think Reid Hoffman is quoted in there
23:08
as saying, you know, he finally
23:11
figured out that Elon starts with a mission
23:13
and then sort of backfills. Backfills.
23:17
And he's a great guy. He's a great
23:19
guy. He
23:21
backfills and
23:23
tries to find a business model for
23:26
his mission. He's
23:28
the wealthiest guy on the planet. But
23:32
it's unlike some others. I
23:35
mean, he's not taking
23:37
long cruises in the Mediterranean. And
23:39
he makes fun of Bezos for that. Yeah.
23:42
I tried not to mention anyone. I didn't want to draw other
23:44
billionaires into the conversation. But
23:49
what comes across in your account
23:52
of these engineering exploits
23:55
is how
23:57
deeply he's involved in the
24:00
engineering process,
24:02
and he's kind of like a MacGyver, oh,
24:04
the rocket has a leak, go out and find some bubblegum.
24:08
You know, that kind of thing. That wasn't really a
24:10
story, but it was, it's almost
24:12
that. Let's take some epoxy
24:14
and fix the valve on this Raptor engine.
24:17
Yeah. Which Boeing wouldn't do and
24:19
NASA wouldn't do and you have to figure
24:21
out how much risk do you wanna take?
24:24
So describe that quality of
24:26
his and how striking
24:29
that was. That was very striking to me.
24:31
And as I say, you'll see the other qualities
24:34
as well. But
24:36
he
24:37
has a maniacal,
24:40
mono-focus ability. And this is
24:42
part of the psychology of how he's hardwired,
24:45
which is take, for example, the night
24:48
that the Twitter board decided
24:51
to accept his hostile offer to
24:54
buy Twitter. And Musk
24:58
goes that night, as he's hearing
25:00
it, to Boca Chica, the tiny
25:02
town on a spit in South Texas
25:05
where he's trying to launch Starship. And
25:07
he goes into a meeting
25:10
that night and the whole world
25:12
is abuzz that he's just gotten
25:14
Twitter. In the room, all the engineers,
25:17
they know, they're abuzz. He doesn't
25:19
mention it and they don't mention it. And
25:21
they spend two hours looking
25:24
at a methane leak in
25:26
one of the engines in the booster of Starship.
25:29
And what could be done to do a
25:31
workaround or how they could fix it. And
25:34
he's the one who figures out both the material
25:36
qualities of Inconel, which is one
25:39
of the materials
25:42
you can use in a rocket, exactly
25:44
how to do it, and then never
25:48
mentions Twitter. So he will
25:50
focus serially, meaning
25:52
step by step, for a couple of
25:55
hours on how
25:57
do you make a left turn with full self-driving
25:59
when there's a bike lane, then
26:01
how are we going to fix this valve? Then
26:04
how, and he will
26:06
look at these little things and then leave 99% of what's
26:08
happening at SpaceX
26:10
or Twitter or Tesla
26:14
to the other managers. But he
26:16
says it's like Napoleon, you
26:18
got to be on the battlefield riding the horse
26:21
with the sword in the details. And
26:23
his management style is to
26:25
focus with an urgent intensity
26:28
on the details. And he says that
26:30
will ripple through the
26:32
enterprise. And then while others, when
26:35
others make decisions, then he tells
26:37
them how stupid the decisions are. If
26:40
he doesn't like the decisions. We
26:44
were having this discussion backstage.
26:47
I'm so interested in the psychology
26:50
of sort of genius
26:53
at that level because
26:57
it's disruptive genius, disruptive
26:59
genius. It requires
27:02
you to say, I don't care what the
27:04
rules are. I don't care what the norms
27:06
are. I don't care how we
27:08
did this for a thousand years.
27:11
I'm going to do it a different way. And
27:14
if that means breaking a rule or a regulation
27:17
or so on, so be it. I'm just focused on
27:20
that goal. Is that a common trait
27:24
of the people who you have studied?
27:26
Yeah, that's an interesting question. Wired
27:28
Magazine after I did a book on Steve
27:31
Jobs did a cover called Do
27:33
You Have to Be an A-Hole
27:36
to Be an Innovator?
27:40
And it is true. I hope
27:42
the closed caption people deal with
27:44
this. Actually,
27:46
right. It
27:49
is true in order to be a disruptor,
27:52
you have to be disruptive. When
27:54
I did Jobs, early
27:57
on, Wozniak, his co-founder,
27:59
says
28:00
the question you have to answer is, did he have
28:02
to be such an a-hole? Did he have to
28:05
be an asshole, I think is the word. Yes, you're right. Did
28:07
he have to be so mean? And
28:10
a few years later, we're at the launch
28:14
of the second iPod and
28:17
Steve Jobs is dying. He has two
28:19
turtlenecks on because he's so skinny and
28:21
cold, but he's onstage. And I see Woz
28:23
and I say, okay, what's the answer to that question? Woz
28:27
says, if I had run Apple,
28:29
I would have been nicer. I would have made
28:31
everybody get stock options. It would have run
28:33
it like a family. And
28:36
then Woz, he's a teddy bear of a guy,
28:38
said, but if I had run Apple, we probably would
28:41
never have done the Macintosh. We wouldn't have done
28:43
the iPhone. So do you have to
28:45
be disruptive to do it? In some
28:47
cases, I write about people who aren't
28:50
disruptive. Ben Franklin is the guy
28:52
who takes a lot of disruptive people from
28:54
Hamilton to Sam
28:57
Adams and brings them together. Jennifer
28:59
Doudna… Did you, were you sitting in the corner of the room then? No.
29:02
Well, I was sitting in the corner of the room with Jennifer
29:05
Doudna, who is the most collegial. She's
29:07
the one who co-invents CRISPR
29:09
in my book, The Code Breaker, which edits
29:11
human genes. And when
29:13
they're bringing anybody into the lab
29:15
to be hired, even as a graduate student,
29:18
she makes them meet everybody in the lab.
29:21
And then they sit around later and talk, would this
29:23
person fit in? Do we like this
29:25
person? Musk is the opposite.
29:28
He says a few things. One,
29:30
he has an algorithm, which is step by step.
29:33
Question every rule, question every regulation.
29:36
Somebody says, the reason we have to
29:38
put this piece of felt
29:40
in is because the regulators
29:43
say so. He says, show me why.
29:45
And that's risk taking. And
29:47
he says, yeah, but everybody who came, we
29:50
used to be a nation of risk takers. Whether
29:53
you came on the Mayflower, across the Rio
29:55
Grande, or from Eastern Europe
29:57
in the 20s and 30s. You
30:00
took some risks and we've lost that ability
30:02
to be risk takers. And
30:05
then he says collegiality
30:09
is not your friend. Empathy
30:11
is not your friend. In other words, if you're
30:13
trying to please all people
30:16
around you, you'll lose sight
30:18
of the enterprise and the mission. And
30:21
he says... That serves his personality
30:23
as well. Absolutely. And it's why some
30:25
people... And Steve Jobs, for that matter. And
30:28
Bill Gates and Jeff Bezos. So...
30:31
I sense a trend here. Yeah. Well,
30:33
that's an answer to your question in a way.
30:36
And to some extent, these
30:38
people don't have the empathy receptors
30:42
or transmitters, the input-output
30:45
empathy that you and I would have.
30:50
And Jobs told me once,
30:54
you have the luxury of being
30:56
empathetic. And you think that you're
30:59
kind, but you're actually being selfish because
31:01
you want people to like you too much. And
31:03
I did realize when I ran Time, it was a
31:06
very collegial family. We were doing
31:08
well. When I went to CNN, I
31:11
did not do well running CNN. It
31:13
was partly because I cared too much
31:16
about every person from Lou Dobbs,
31:18
the great... liking me when
31:20
I needed to be a disruptor and
31:22
I wasn't a disruptor. So
31:25
this is a question in the
31:27
book, which is... He
31:30
said he learned it from playing Polytopia,
31:32
the game. Yes. Don't
31:35
be collegial. Yeah. Obsessively.
31:38
Even the night he bought Twitter, he was
31:40
playing Elden Ring. He got to the final level.
31:43
And he said, empathy is your
31:45
enemy if you're trying to get to the next
31:47
level. So just parenthetically,
31:50
you mentioned CNN, you ran CNN. Do
31:53
you think they just hired a new
31:55
CEO, Mark Thompson, who had great success at
32:00
the New York Times and the BBC, do
32:03
you think that there is a
32:05
way to reinvent the model?
32:09
Is there positive disruption that can...
32:12
I'm asking for a friend. Mark?
32:15
For you. Well,
32:17
let me caveat this by saying I'm
32:20
one of the five or six people who
32:22
has proven on the national stage that
32:24
I don't know how to run CNN. And
32:27
so... But you do know this world.
32:30
I do. And I think in
32:32
an era in
32:35
which we have artificial
32:37
intelligence, scraping information,
32:40
misinformation, we're
32:42
going to have to place a higher value
32:45
on reliable, good
32:48
information that will be used to
32:50
train everything from our AI systems
32:54
to inform us in the middle of what's
32:56
happening right now, this horrible
32:58
terrorist attack and its aftermath.
33:02
And it means that people
33:05
will pay a premium at
33:07
some point for information
33:10
that's reliable. We went
33:12
astray when the
33:14
business model of journalism depended
33:17
mainly on advertisers, which meant
33:19
aggregating eyeballs and
33:22
clickbait. And now
33:24
we're getting into an era in which I
33:26
think if you're the one who has truthful
33:29
and reliable information, people
33:31
will pay for it. Henry Luce
33:34
who invented Time Magazine said, if
33:36
you're dependent totally on advertisers,
33:38
it's not only morally abhorrent, it's
33:41
economically self-defeating. You
33:43
have to be dependent on revenue from
33:46
users. And one of the things
33:49
Musk will do, which is on one
33:51
of the few upsides of Twitter, is
33:53
he's going to make small payment systems.
33:56
So if I hit the Chicago Tribune,
33:58
and I'm trying to read a review of A Wonderful World,
34:01
the Louis Armstrong play opening
34:03
here tonight. And it says, you have
34:05
to subscribe for a year to the Tribune. For
34:08
the Tribune, I'll do that. But with
34:10
the Minneapolis Star, the San Jose,
34:12
I want to be able to pay a dollar for the article.
34:15
I don't want to have to have a... And those are the type
34:17
of things, I'm sorry, there's a long answer, that
34:20
incentivize higher-value
34:22
information that people could be willing
34:24
to pay for. So let's turn to... By
34:27
the way, just in this... The reason
34:29
I ask about the disruption is that
34:32
disruption can be... It can yield
34:35
extraordinary discoveries
34:38
and advances as
34:40
the rockets, the cars, and so on.
34:43
But we also see disruption in every element
34:46
of our lives now. Social
34:48
media has a lot to do with this. Politics
34:50
is now mirroring social media and
34:53
so on. It's concerning. It's
34:55
concerning because democracies
34:57
rely on laws and rules
35:00
and norms and institutions. And
35:02
institutions that aren't disrupted.
35:05
So I am not innately
35:07
a disruptor. I write about them. But
35:11
I do believe that
35:15
the wanton disruption of institutions,
35:17
whether it be general
35:20
interest news magazines where I come from,
35:23
or local papers where I come from, or churches, or civic
35:27
organizations... You could run the whole gamut of institutions.
35:31
Yeah. Democratic and Republican party structures.
35:35
I don't think disruption in and of itself
35:37
is of value. I like technological
35:40
disruption, meaning I
35:42
do think the major auto companies, when
35:45
they decided not to go into electric
35:47
vehicles and GM started crushing
35:49
the Chevy Bolt. It
35:52
was good. Somebody disrupted it. And
35:54
when NASA decided we're going
35:56
to ground the space shuttle and not
35:58
try to get astronauts into orbit anymore. I
36:01
like that disruption. The disruption
36:04
of other institutions has been enormously
36:06
problematic. So let's, this
36:09
is a natural transition into this
36:11
phase of Musk. You
36:14
were with him during an interesting period
36:16
in his development and evolution because
36:19
he was not a particularly
36:22
political person as you noted in
36:24
the book, earlier in his life. He gave money to Obama.
36:27
Yes, which
36:29
we appreciated. But
36:35
he has become much more
36:38
outspoken, much more involved.
36:40
And let me just add an anecdote
36:43
in the middle of this because one
36:45
of his technological innovations is
36:47
Starlink, the internet
36:50
provider with low-Earth-orbit
36:56
satellites that he's launched, many, many
36:58
of them. And
37:01
he has provided the internet service
37:04
that has kept Ukraine connected. There
37:07
was an episode in your book that you
37:09
wrote about that created some controversy.
37:12
I think the controversy was overblown,
37:15
but the point was important. The
37:17
essence was this guy has a
37:19
power on that night to decide whether
37:22
Ukraine was going
37:24
to launch an attack on Russian
37:26
assets around
37:28
Crimea. And they
37:31
needed Starlink and
37:33
they did not know it had been disabled.
37:36
And that night, he gets the text
37:39
messages all in the book. They're
37:42
saying, you got to turn it on so we can do this sneak
37:44
attack on this fleet. And
37:47
even he feels, he says to me, how
37:50
did I get into this war? I created Starlink
37:53
so people could watch Netflix and chill. He called you by the way
37:55
in the midst of this. So you
37:57
weren't just the guy in the corner, you were also the guy in this.
38:00
It's weird. Do you want me to tell
38:02
a little story behind it or something? Well,
38:04
let's see. Okay, well, no. It depends how long
38:07
the story is. No, go ahead. Tell the story.
38:09
60 seconds, which is, as you
38:11
know, when the Russians invaded
38:13
Ukraine, you have to figure out how they kept so much of their connectivity.
38:16
One reason is, Viasat, the satellite system
38:18
that they were using, totally conked
38:21
out when the Russians... The US military,
38:23
their own military, the only satellite
38:25
that can withstand Russian attacks is
38:28
Starlink. So you've got to say, how
38:30
come he could and NASA and
38:32
DIA couldn't? And
38:34
then he plays Captain Underpants superhero
38:37
because they start texting him, saying, we've
38:39
got to defeat the Russians. And
38:42
he sends that night, a hundred,
38:44
and then next day, a thousand Starlink
38:46
services there. And
38:48
had he not, Ukraine would have
38:50
been overrun by Russia because the troops wouldn't
38:52
be able to... I'm sorry, this is taking more
38:54
than a minute. Well, flash forward real quick
38:57
to September. And
39:00
after spending a week with him and doing it, I'm
39:02
back home in New Orleans. And here was the backstory
39:04
I was going to do. I'm at my old high school,
39:07
watching a high school Friday night football
39:09
game because Arch Manning, the senior
39:11
at my high school, is this prep star.
39:14
And I want to see him. And
39:16
it's Musk. He says, okay, they're
39:18
asking me to enable Starlink
39:21
so they can do this Pearl Harbor-like sneak attack. He
39:23
plays apocalyptic, which he is.
39:26
He says Russian doctrine means they could
39:28
go tactical nuke on us if
39:30
that happens. He talked to some Russian ambassador or something.
39:32
He talked to the Russian ambassador who said, here's
39:35
our doctrine. We consider
39:37
Crimea the homeland. So anyway,
39:40
and so I'm not enabling it. And
39:45
I'm very Socratic. I don't give him advice
39:47
even though I was like, why am I... And
39:49
I said, have you talked to General Mark Milley,
39:51
which is my Socratic way of saying, take it above my pay grade. He said, Yeah,
39:55
I'll move it up your pay grade. And he does
39:57
talk to Milley. And there's a story to be written
39:59
there. That's interesting, but
40:02
we don't have it in the 60 seconds here. And
40:04
eventually he gives up control
40:08
over a significant number of Starlink
40:11
services to the US military
40:13
and the CIA so that
40:15
they can make the decision, not he. I read
40:18
the account of this because
40:20
it became a thing when your book came out
40:23
and I, because I
40:26
want to help
40:28
Musk's business, I tweeted and
40:32
I said, do we really want Elon
40:34
Musk making national security decisions?
40:38
And I thought it was a, I don't think it was like
40:40
a hugely path-breaking
40:43
observation. He tweeted back.
40:46
What did he say? I think he said, since when have
40:48
you become a warmonger? So
40:51
I guess that his point was had he
40:53
done that, then we would have
40:56
had the nuclear attack and so
40:58
on. But first of all, what's he doing
41:00
responding to me? I don't know. I mean,
41:02
this is, yeah. And I've mentioned
41:04
and some of you may know Antonio, but I
41:07
mentioned before they were traveling together
41:09
and Musk keeps like late at night
41:11
responding to people like Axe on Twitter
41:13
saying things. And Antonio
41:16
said, let me take your phone and puts it in the
41:18
safe in the hotel room, punches in the code. So
41:20
Musk can't tweet that night because he's
41:22
gone on these bad tweets. At 3
41:25
a.m. Musk called hotel security
41:27
and made them open the safe. This
41:30
is an addiction. He has an addiction to video
41:32
games and to tweeting
41:34
and the latter is sometimes
41:37
not a pretty sight. We're
41:40
going to take a short break and we'll be right
41:42
back with more of the Axe Files.
41:52
And now back to the show.
41:59
about that. Yeah, I decided not to prolong
42:02
the discussion, mostly
42:05
because my wife, my
42:07
wife, threw my phone in the safe as
42:10
well. So
42:12
let's talk, though, about his
42:16
politics. As you describe
42:18
in the book, you know, he was for
42:20
the longest time kind of a liberal on social
42:22
issues, libertarian on economic issues.
42:25
Obviously he was in business, doesn't like regulation,
42:27
doesn't like rules. But
42:30
now it seems like it's morphed into something
42:33
else. What has happened? Part
42:35
of the book, which I won't recap, but
42:37
it's the past three years, the evolution.
42:40
You were there for most of it. Yeah, I mean, and
42:43
first of all, let's say there's not one Musk.
42:43
I mean, in the middle of the day, when he's in a cheery
42:45
mood, he says, we need more moderates,
42:47
I'm gonna start a PAC for, you know,
42:49
centrists. And then he'll get into
42:52
a dark mood, his demon
42:54
mode, and the darker
42:56
side of his politics will come out. As
42:59
his mother said, he could become his father. His father
43:01
is a conspiracist who believes everything
43:03
from Fauci theories to stolen elections.
43:06
And even though Musk does not speak to his
43:09
father and has blocked
43:11
his father's emails, somehow
43:14
Maye is right that channeling happens.
43:19
But over the past three years he
43:21
shifted from being what I would call an
43:23
Obama Democrat to,
43:25
I won't say conservative, but
43:28
to this populist right
43:30
that includes even a Bobby Kennedy,
43:33
you know. So... Yes. Well, I mean, there
43:35
are rumors, I think, I know,
43:35
I trafficked in some of it, that he
43:38
is, that he's thinking about supporting,
43:40
and they now say that he's thinking
43:42
about supporting a super PAC to
43:44
help Bobby Kennedy. I'm sure it's
43:47
a variable, mercurial feeling he
43:49
has, and we'll see where it turns out. They
43:51
have a kind of kinship. I mean, he and Bobby
43:53
Kennedy, or at least he shares his view
43:55
on climate, and shares his... They're
44:00
both very much in favor
44:03
of fixing the climate, I mean, saving
44:05
us from climate disaster, but also
44:08
conspiratorial. Now,
44:11
I mean, the problem with conspiratorial,
44:13
as Musk would argue, I wouldn't, because
44:15
I'm the least conspiratorial person really,
44:18
is that some things that were called
44:21
conspiracy theories, like the Great Barrington
44:23
Declaration saying lockdowns will
44:25
cause more harm than good, get
44:27
censored on Twitter and they turn out to
44:29
at least be debatably
44:32
correct, if not totally correct. But
44:36
to get to your question, he does evolve
44:39
on the past three years. There are multiple
44:41
reasons, and I'll try to tick them off quickly.
44:45
One is he really hates rules
44:47
and regulations and the COVID lockdowns,
44:50
they just made him bristle. He
44:52
took on the state of California. And
44:55
then an assemblywoman, and there's about 10 others,
44:57
who just say, get out of our state, you're
45:00
horrible. He pays that
45:02
year more tax than any
45:05
person has ever paid to any entity
45:07
in the history. And Elizabeth Warren
45:09
keeps attacking him for not paying taxes.
45:12
So, Biden decides
45:14
to have an EV summit of people making electric
45:16
vehicles, tells Mary Barra of GM,
45:19
you're leading the way. She had made 26 electric
45:22
vehicles that year, Musk had made 1 million,
45:25
and Musk was not invited to it. So
45:27
he's- Because his plants are not unionized.
45:30
Correct. And so he's reacting
45:32
to that. There's also, and
45:34
I'm gonna try to be careful here. There
45:37
was a personal thing, which
45:40
is his oldest
45:43
surviving child, the other child who
45:45
died in infancy, was named
45:47
after his favorite character in the X-Men
45:49
comics. And that child,
45:53
while I'm covering this, sends
45:55
a message saying, I'm transitioning,
45:58
and my name is now Jenna, but
46:01
don't tell my dad. This is to her aunt.
46:03
And then Elon gets his head around the
46:06
fact that his oldest child has transitioned. But
46:09
Jenna goes to court, and
46:12
she's so anti-capitalist,
46:15
you know, progressive,
46:18
that she hates him for having so much money,
46:20
and says so, and changes
46:22
her last name as well. And he
46:25
said this has hurt me more than anything
46:28
since the death of his first child,
46:30
Nevada. And he
46:32
blamed it, and everybody's
46:35
gonna groan because nobody knows what
46:37
this phrase means, but he blamed
46:39
it on what he called the woke mind
46:42
virus that she picked up in her very
46:44
progressive school. So he sells all
46:46
five of his houses. As you say, he
46:48
doesn't have yachts or anything like that, decides
46:51
to live just in a two-bedroom house.
46:54
He is pained by this thing, is
46:56
he not. But he decides, and
46:58
this is one of 20 things that goes into it, that
47:00
selling his houses has to do with this, because
47:02
she was criticizing him for being a rich
47:05
person who has, you
47:07
know... And he said, I'm gonna sell all my
47:09
houses, and all my money is gonna be reinvested
47:11
back in Tesla and SpaceX for the mission.
47:14
I'm gonna live in this two-bedroom house. And
47:18
he's just rankling at the fact that
47:21
she's become, so he
47:23
calls it, you know, Marxist,
47:26
but whatever word you want to use.
47:28
This must be one of the reasons
47:31
why he hosted, and it turned out
47:33
to be a disaster, Ron DeSantis
47:36
on a Twitter live feed
47:38
at the beginning of... That's how DeSantis tried to
47:40
announce his campaign. It
47:44
was an actual technological
47:46
disaster. On Christmas Eve,
47:49
he had decided... He had been told by
47:51
the Twitter engineers you can't get rid
47:53
of these servers in Sacramento and he
47:55
figured out you could and
47:58
he decides to go with his nephews
48:00
and use pliers from Home Depot
48:02
to pry up the floors and cut the cables.
48:05
It's a beautiful scene in the book. Sounds like the
48:07
DeSantis campaign. Yeah. And pulled
48:09
out those servers. And it turns
48:12
out, like everything with Musk,
48:15
it turns out to be correct, sort of, but then
48:17
there's debris in the wake, and a month later
48:19
there's not the backup system
48:21
when he needs it. So the book has
48:23
a lot of rockets that get into orbit
48:26
and some that leave debris in their wake. Well, it does
48:28
raise this question and you and I were talking about this
48:30
before we came out here. Can
48:33
you run, we're going to get back to the
48:35
politics in a second, but can you run
48:37
Twitter the way
48:40
you run these? It's
48:42
a much different enterprise. Absolutely
48:44
not. And when he was buying Twitter
48:47
and we were at the gigafactory in Austin,
48:50
which he was just opening up. And
48:52
no, it's the biggest car factory, you know, anyone has
48:54
ever made, bringing manufacturing back
48:57
to America and he had a feel for each station
48:59
on the assembly line. And I said, but what
49:01
about Twitter? He said, well, it's basically just an
49:03
engineering issue. They haven't made the product better.
49:05
They haven't added video. And
49:08
I'm thinking it's not an engineering question.
49:11
Twitter's not a technology company. It's
49:13
an advertising medium that tries
49:16
to gather eyeballs in a friendly
49:18
sweet place so Pepsi-Cola
49:21
and others can do that. And he
49:23
does not have the emotional feel,
49:25
as we discussed earlier, to
49:27
apply to Twitter the type of feel
49:29
he has for an engineering issue. Well,
49:33
I mean, central to our dilemma
49:36
as a country, I believe this
49:38
is my opinion, is that we
49:40
have social media where the
49:42
business model is to keep
49:44
people online. And the great insight
49:46
of these algorithms is that outrage,
49:49
alienation, anger keeps
49:52
people online. So
49:54
the very premise of the
49:57
business flies
50:00
in the face of any kind of... Yeah, we
50:02
used to think that social
50:05
media was gonna connect us. That
50:08
was the dream of Facebook and other things.
50:11
And instead, as you say, the algorithm
50:14
has figured out that the more you
50:16
play on people's resentment and the
50:18
more you enrage them, the more they'll
50:20
retweet and the more engaged they'll
50:23
be online. And that's true on
50:25
Twitter and Facebook, but also on Talk
50:27
Radio now and on cable. And
50:29
at the core of that is the aggregation of
50:32
data. These
50:34
algorithms have access to more information
50:37
about us than we have about ourselves.
50:39
Which is why I like the notion that you should
50:41
have to pay for some content, because
50:44
otherwise you're paying with your data and
50:46
your enragement. Yeah, data
50:48
was one of the things that he was chasing according
50:51
to your book. He's big, and I know we're running out of time,
50:53
but his big next thing is
50:55
to go for artificial intelligence
50:58
because he's worried, as I said,
51:00
about it. But he believes
51:03
it should be artificial general intelligence.
51:05
So at the end of the book, way
51:07
after Twitter, way after even Starship
51:10
tries its first launch, he asked me to come
51:12
back to Austin, and we
51:14
sit in the back of his house with Shivon
51:17
Zilis, who runs Neuralink, by the pool. And
51:20
he says, I have to start an AI company that's
51:22
gonna do real world AI. It's
51:24
not only gonna do chatbot-like
51:27
stuff where you read documents
51:29
on the internet, and then you can ask the chatbot
51:32
who are the five best boats or something, and
51:34
it'll chat away with you. But it's gotta
51:37
do- If it will say, I'm not allowed to. Yeah,
51:39
right. It will do visual data. Make subjective
51:41
judgment. It will take the feed of
51:44
visual data from Optimus Robot,
51:46
Tesla, the cars, as well as the
51:48
Twitter feed, the language data,
51:50
and he's got some of the best data feeds.
51:53
He has eight billion frames a day
51:55
from Tesla cameras. So he's
51:57
now trying to do self-driving, not
52:00
based on rules and algorithms, but
52:02
on how humans navigate. And
52:05
I hope, among other things, that's
52:08
his new fixation, which is actually going
52:10
to be a good one, as opposed
52:12
to worrying about Twitter, which he should leave
52:14
to Linda Yaccarino, who knows what she's doing.
52:17
Who he brought in to run Twitter.
52:22
The other thing is he said,
52:26
I don't believe in any, I forget
52:28
the exact quote, but I don't believe in laws.
52:31
The only laws I believe
52:33
in are the laws of physics. Question every
52:35
rule. Question every regulation. The
52:38
only rules that are unbreakable are the
52:40
laws of physics. So, I mean, he's
52:42
committed to the idea that there
52:45
are actual immutable rules
52:48
of science and laws of science.
52:50
One of the things about these conspiracy theories
52:53
is they fundamentally assault
52:55
that very notion.
52:59
And I'm wondering if that paradox
53:02
hasn't occurred to him. He
53:04
believes that the
53:07
other extreme is also anti-science
53:10
and anti-discourse. I
53:13
think free speech is a great thing,
53:15
but I also think it's a complex thing. And
53:18
I don't think he understands the complexities.
53:22
Yeah, and you're not here
53:24
to explain or interpret or
53:29
certainly to defend every practice
53:31
of Twitter, but it is a fact
53:34
that there has been, that
53:37
he's lowered barriers. He fired
53:39
a whole bunch of people, which is a practice.
53:42
They're now fortunately hiring like
53:44
crazy and setting up because
53:46
of the events, the horrors
53:49
of the past weekend too. We've
53:51
seen sort of malign state actors
53:55
who are now back on Twitter, pushing
53:59
whatever it is that they think is in
54:01
their interest to propagate. We've seen an
54:03
increase in hate
54:05
speech that's been pretty significant.
54:09
There are consequences. This is not...that's
54:13
another place in which
54:16
this is not like the
54:18
other things that he's done. I agree. Well
54:21
that's no fun. Let
54:24
me ask you about... I don't want to end
54:26
on that note. No, we're not gonna end on that. No, we're not gonna
54:29
end until someone makes us. So...
54:32
We do have to be out of the web. You've got a plane and
54:34
you probably have stuff to do. They've got other
54:37
speakers. I've got all day. But I want
54:39
to ask you about the process
54:45
as a writer. One very sort of mundane
54:48
question, but as a fellow writer I'm interested
54:50
in it. As I said, this is a lengthy
54:52
book, but it's a page-turner. And
54:55
one of the things that's noteworthy about it that
54:58
I actually appreciate as a reader is the paragraphs...
55:01
I mean the chapters are very short. Is
55:03
that by intention? It's intention
55:06
because I was trying
55:09
to mimic... not
55:11
mimic, but capture
55:14
Musk's own day
55:16
and world, which is incredibly
55:19
fast-paced but leaping
55:22
from
55:23
intense focus on this to intense focus
55:25
on that. And I figured that the way
55:28
to make the book feel
55:30
like you were alongside Musk
55:32
was to make it storytelling, as
55:35
opposed to me bloviating and preaching,
55:37
and to make it fast-paced
55:40
narrative stories that
55:43
you almost feel you're rushing
55:46
along in a day with Musk. Interesting.
55:48
Yeah, well it was conscious. It
55:51
works. And do
55:53
you... are there other things that
55:56
you are focused on now?
56:00
Other projects? Do you have any other geniuses
56:02
in your sights? Yeah,
56:05
how on the record are we? Are we live-streamed?
56:08
We're not live-streaming it's just you and me and
56:10
my podcast audience. Two things I'm thinking
56:12
about. First of all, whenever
56:16
I do somebody who's a bit
56:18
rough
56:19
I say okay I got to go to the way
56:21
back machine and go back in here so like after
56:23
I did Kissinger, I'm gonna do somebody who's been
56:26
dead 200 years. Well explain that.
56:28
Explain that. Well Kissinger was a tough ride
56:30
too for me. He was unhappy with your book. Somebody
56:34
said did you like the book? He said well I like
56:36
the title. But
56:41
after dealing with it was like all right I'm gonna go
56:43
back and do Ben Franklin. You know he's been dead 200
56:46
years. After doing Steve Jobs
56:48
it was like all right I'm gonna go back 500 years.
56:51
I did Leonardo.
56:55
Part of me wants to do an Alan Turing
56:58
to the present AI book
57:00
but I think it's too early. So the
57:03
books, in lieu of this, that I'm next
57:06
thinking of doing, and I haven't really
57:08
told my editors, so this
57:10
may change. There
57:13
were two of them. One is
57:15
Louis Armstrong because I grew up in that neighborhood
57:17
and that's why, because
57:20
I saw A Wonderful World down in New
57:22
Orleans last week. It's opening here tonight.
57:27
But another I'm thinking of doing because I still
57:29
love science is other
57:32
than Einstein the person who makes one of
57:34
the most fundamental discoveries at the beginning
57:36
of the last century is the discovery
57:38
that chemistry is basically
57:41
physics. It's just a question of the
57:43
electrons moving around the nucleus
57:45
and that they can radiate and you get
57:48
new things when radiation
57:50
occurs and the discoverer
57:52
of radiation, of course, wins the Nobel
57:55
Prize in physics, Marie Curie,
57:58
and then she wins the Nobel Prize in chemistry,
58:00
the only person to have won, the
58:03
first woman to have won, but the only person to have
58:05
won two science Nobels. And
58:08
she has an amazing struggle
58:11
of a life, including
58:14
after her husband Pierre dies, she's
58:16
having an affair with the married student
58:18
of Pierre, and it becomes a scandal
58:21
in Paris just as she's winning
58:24
her second Nobel. So Arrhenius,
58:26
the head of the Swedish Academy, says, you
58:28
shouldn't come accept it. It's too much of a controversy.
58:31
And she writes back basically saying,
58:34
if I were a man, you wouldn't have said that.
58:36
I'm coming. And she does, and
58:38
she gives this beautiful speech about
58:41
how her science is on a different
58:43
level than worrying about her personal life.
58:45
So I think I might do her next. Madam
58:48
Curie fans here. We had more
58:50
Louis Armstrong fans. Maybe I'll go back
58:52
to Louis. Yeah, yeah. You just
58:54
sort of, you've come full circle. Which
58:58
is, we started with your
59:00
lifelong passion
59:02
for science. And
59:05
it's a gift to make science
59:07
and technology colloquial
59:11
enough for people to absorb
59:13
it. And it's also a joy.
59:16
You know, when they ask Ada Lovelace, who
59:18
was in one of my books, Lord Byron's daughter, isn't
59:21
science and math hard, she cites
59:24
a line of poetry from her father: She
59:26
walks in beauty, like the night. She
59:29
says, that's hard. But
59:31
a mathematical equation or science
59:34
is also hard, but just
59:37
as beautiful. And so it's always a joy
59:39
to look at the beauty of the connection of the
59:41
humanities and sciences. I meant to
59:43
ask you this earlier. We will end on this. I
59:45
know we have to. People
59:49
holding up, well, so it's a flashing wrap.
59:52
Yeah, so five minutes. Thank you. A
59:56
few weeks ago, I did a podcast
59:58
with Larry Wright from the
1:00:00
New Yorker. And it
1:00:03
turns out his mentor was Walker
1:00:05
Percy. Dr. Percy, Uncle
1:00:08
Walker, a novelist
1:00:10
from Louisiana. Yes,
1:00:13
who was a very gifted novelist
1:00:16
in the 60s. And was your? I
1:00:20
have a story about him, which
1:00:24
is when I was very young, we
1:00:28
used to go water skiing
1:00:31
and hunting for turtles and fishing on the
1:00:33
Bogue Falaya. And there was a girl,
1:00:36
Ann, who was a friend of ours. And he
1:00:38
was called Uncle Walker. And
1:00:42
my father said, Ann, what does your dad do? He's always
1:00:44
sitting on the dock drinking bourbon and eating
1:00:46
hogshead cheese. She said, well, he's a
1:00:49
writer. And I knew you could be an engineer
1:00:51
like my dad. Or you could be a fisherman. I didn't know you
1:00:53
could be a writer. When I was 12, his
1:00:56
Moviegoer, The Moviegoer came out. Yeah, The
1:00:59
Moviegoer was his first book. Huge. And so I read
1:01:01
it. And I said, wow, I
1:01:04
get it. You can be a writer. So this affected
1:01:06
me. And there were all sorts
1:01:08
of messages in there. And
1:01:11
so I sat with him once. I said, Uncle
1:01:13
Walker, what are you trying to
1:01:16
teach? What's your message in
1:01:18
this book? And he said, there
1:01:21
are two types of people who come out of Louisiana.
1:01:24
Preachers
1:01:25
and storytellers.
1:01:27
He said, for heaven's sake, be a storyteller.
1:01:30
This world has too many preachers. And
1:01:32
that's what I try to do.
1:01:41
Thank you, Walter. Thank you. Thank
1:01:44
you for becoming a storyteller. And
1:01:46
thank Uncle Walker
1:01:48
for encouraging you. Thank you. Thank
1:01:54
you for listening to the Axe Files, brought
1:01:56
to you by the Institute
1:01:57
of Politics at the University of
1:01:59
Chicago.
1:01:59
and CNN Audio. The
1:02:02
executive producer
1:02:03
of the show is Miriam Fender
1:02:05
Annenberg. The show is also produced
1:02:07
by Sarah Lena Berry, Jeff Fox,
1:02:10
and Hannah Grace McDonald. And special
1:02:12
thanks to
1:02:12
our partners at CNN, including
1:02:14
Steve Lickteig and Haley Thomas. For
1:02:17
more programming from the ILP, visit
1:02:19
politics.uchicago.edu.
1:02:40
Amica Insurance believes life insurance
1:02:42
is about more than just financial protection. It's
1:02:44
a promise that our loved ones will be supported
1:02:47
even when we're no longer there to provide for them.
1:02:49
It's more than a policy. It's your legacy.
1:02:51
That's why Amica Insurance partnered with Courageous
1:02:54
Studios, the branded content studio of
1:02:56
Warner Bros. Discovery, to create a short
1:02:58
film series called What You Leave Behind. Real
1:03:00
stories about people who've lost loved ones and
1:03:03
how a meaningful object left behind can
1:03:05
continue to represent their legacy. Visit
1:03:07
amicawhatyouleavebehind.com to
1:03:09
watch the films and submit your own story to
1:03:11
honor a loved one and what they've left behind
1:03:14
for you. As Amica says, empathy
1:03:16
is our
1:03:16
best policy.