Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:01
Hey everyone, it's Tristan and
0:03
welcome to your undivided attention. The
0:06
late, great media theorist
0:08
Neil Postman liked to quote
0:10
Aldous Huxley, who once said that
0:12
people will come to adore the
0:14
technologies that undo their capacity to
0:17
think. He was mostly talking
0:19
about television. This was before
0:21
the internet or personal computers ended
0:24
up in our homes or rewired
0:26
our societies. But Postman could have
0:28
just as easily been talking about
0:31
smartphones, social media, and AI. And
0:33
for all the ways television has
0:35
transformed us in our politics, our
0:37
eating habits, our critical thinking skills,
0:40
it's nothing compared to the way
0:42
that today's technologies are restructuring what
0:44
human relationships are, what communication is,
0:47
or how people know what they know.
0:49
As Postman pointed out many times, it's
0:51
hard to understand how the technology
0:53
and media we use is changing
0:55
us when we're in the thick
0:57
of it. And so now as
0:59
the coming wave of AI is
1:01
about to flood us with new
1:03
technologies and new media forms, it's
1:05
never been more important to have
1:07
critical tools to ask about technology's
1:09
influence on our society. And Postman
1:11
had seven core questions that we
1:13
can and should ask of any
1:15
new technology. And I'll let him
1:17
tell you in his own words.
1:19
What is the problem to
1:22
which a technology claims to
1:24
be the solution? Whose problem
1:26
is it? What new problems
1:29
will be created because of
1:31
solving an old one? Which
1:34
people and institutions will be
1:36
most harmed? What changes in
1:38
language are being promoted? What
1:41
shifts in economic and political
1:43
power are likely to result?
1:45
And finally, what alternative media
1:48
might be made from a technology?
1:50
Now, I think about these questions often, and
1:52
it may not surprise you to hear that
1:54
today's episode is one I've been wanting to
1:57
do for quite a long time, since Neil
1:59
Postman has by far been one of the
2:01
most influential thinkers for my own views
2:03
about technology. His ideas have been so
2:05
clear-eyed, prescient, starting in the 1980s, about
2:07
the role of technology in shaping society
2:09
that I wanted to dedicate a full
2:11
hour to exploring them. So today we
2:13
invited two guests who thought deeply about
2:16
Neil's work. Sean Illing is a former
2:18
professor who now hosts the Gray Area
2:20
podcast at Vox, and has often written
2:22
and discussed Postman's relevance to our current
2:24
cultural crisis. We also have Lance Strate,
2:26
a professor of communication at Fordham University.
2:28
He was actually a student of Postman's
2:30
at NYU and spent his career developing
2:32
the field of media ecology that Postman
2:34
helped create. Sean, Lance, thanks for coming
2:37
on Your Undivided Attention. Glad to be
2:39
here. Thank you. So I'm just curious,
2:41
you know, for me, Neil Postman has
2:43
been such a profound influence on our
2:45
work. So in 2013, when I was
2:47
kind of having my own awakening at
2:49
Google, that there was just something wrong
2:51
in the tech industry, there was something
2:53
wrong about the way we were going
2:55
to rewire the global flows of attention
2:57
and something wrong with the scrolling,
3:00
doom scrolling culture that I saw in
3:02
the Google bus. And I, you know,
3:04
used to be someone who really deeply
3:06
believed just in this kind of, Tech
3:08
is only good. We can only do
3:10
good with it. It's the most powerful
3:12
way to make positive change in the
3:14
world. And it was this friend of
3:16
mine, Jonathan Harris, who is an artist
3:18
in Brooklyn, who first introduced me to
3:21
Neil Postman's work and his books Technopoly
3:23
and Amusing Ourselves to Death. And I
3:25
just could not believe just how prescient
3:27
and just precise he was in his
3:29
analysis. And I have been wanting to
3:31
bring Neil Postman's, you know, just really
3:33
critical insights to our audience who include
3:35
a lot of technologists for such a
3:37
long time. So I'm just very grateful
3:39
to have both of you on and
3:41
hope we can have like a really
3:44
rich conversation. So just to sort of
3:46
open with that. That's great. I think
3:48
I got Postman-pilled back in 2016
3:50
or 2017, and, I mean,
3:52
I came up as a political scientist,
3:54
a political theorist, that was my education, and we
3:56
didn't really encounter any of this stuff,
3:58
right? But once I
4:00
sort of internalized
4:02
the media ecological way of seeing
4:04
things, it really kind of
4:06
changed how I understood all of
4:09
politics. It's pretty profound. What
4:11
was your entree into Postman's work and
4:13
what you see as kind of his
4:15
critical insights? In 2016,
4:17
I was invited by a former
4:19
classmate of mine to give a
4:21
talk at Idaho State. This
4:24
is sort of right in the beginning of
4:26
the Trump era and all the chaos involved with
4:28
that. And I gave my little talk and
4:30
then I went for a hike with my buddy
4:32
who's a media theorist, and we got to
4:34
talking. And at
4:36
the end of that, he sort of
4:38
introduced me to Postman and media ecology.
4:40
And that was sort of the germ
4:43
of the book that we ended up
4:45
writing together, The Paradox of Democracy, which
4:47
came out in 2022. But
4:49
before that, I'd never
4:52
really encountered media ecology or
4:54
Neil Postman. And for me, the
4:58
value of these great media
5:00
ecologists is that they really
5:02
force us to stop looking at
5:04
media as just a tool
5:06
of human culture. And instead,
5:09
to see it as much more
5:11
as a driver of human culture. And
5:13
this changed the way I looked at
5:15
the political world. I mean, what
5:17
you discover when you look at the
5:19
history of democracy and media is
5:21
that all of these revolutions in media
5:23
technology, the printing press, the telegraph,
5:25
radio, film, TV, the internet, it's not
5:27
so much that these technologies are
5:30
bad. It's that they unleash
5:32
new rhetorical forms and new
5:34
habits and new ways of thinking
5:36
and relating to the world
5:38
and each other. And that's very
5:40
disruptive to society and the
5:42
established order. And we're sort of living
5:44
through that. I could go on, but I'll pause
5:47
and let Lance speak. Lance,
5:49
how about you? How did you first
5:51
get into this work and starting
5:54
with your being a student of Postman's?
5:57
Well, I mean, I could go
5:59
back to the... 70s as an undergraduate
6:01
in a class on educational
6:03
psychology. Postman's first big book,
6:05
Teaching as a Subversive Activity,
6:07
was on the reading list.
6:09
And that was when he
6:11
was still following McLuhan with
6:13
the argument that we need
6:16
to adjust ourselves to the
6:18
new media environment. Just a
6:20
note for the audience, Marshall
6:22
McLuhan is another very influential
6:24
media ecology thinker from Canada,
6:26
who famously coined the idea
6:28
that the medium is the
6:30
message. And you'll hear his
6:32
name throughout this conversation. But
6:35
I first read him in,
6:37
I guess, '79, with
6:39
Teaching as a Conserving Activity,
6:41
which was also when I
6:44
first met him. That's
6:46
where he did his
6:48
about-face, although maintaining the media
6:50
ecology outlook, but arguing that
6:52
we needed to counter the
6:55
biases of television, because we're
6:57
inundated with it. And when
6:59
Postman introduced the idea of
7:01
media ecology, he gave it a
7:04
very simple definition that it's the
7:06
study of media as environments. And
7:08
once we understand that, then it's
7:11
no longer just a tool that
7:13
we choose to use or not
7:16
use and we have complete control
7:18
over, but rather it's like the
7:20
environment that surrounds us and influences
7:23
us and changes us. And when
7:25
we look at democratic society and
7:28
democratic politics, that was shaped, modern
7:30
democracy was shaped by
7:32
a typographic media environment
7:34
and that television is
7:36
reversing so many of
7:39
those characteristics, and it's
7:41
really a question about
7:43
what will survive
7:45
of the various institutions
7:47
that grew up within
7:49
the media environment formed
7:52
by print culture. So
7:54
there's just already so much to dig into
7:56
here. So let's set the table a little
7:58
bit for listeners. Let's start by talking
8:00
about Neil Postman's book, Amusing Ourselves
8:03
to Death, which is really a
8:05
critique of television and how the
8:07
medium of television, in taking over
8:09
society and transitioning us from, Lance,
8:12
what you were just talking about, a typographic
8:14
culture to a television
8:16
culture, would completely shift and transform
8:18
public discourse, you know, participation, democracy,
8:21
education. Does one of you want
8:23
to take a stab at kind
8:25
of the Cliff Notes version of
8:27
Postman's argument before we dive into
8:30
specifics? Well, I mean, it really
8:32
is the shift from typographic era
8:34
to the television era and that
8:36
that has undone a lot of
8:39
key elements of American culture. You
8:41
know, as you may know, I
8:43
did a book that followed up
8:46
on Amusing Ourselves to Death, and
8:48
I don't think Postman quite... made
8:50
it overt in Amusing Ourselves to
8:52
Death, but he has four case
8:55
studies, you know, and they're
8:57
news, politics, religion, and education, and
8:59
how each one has been transformed
9:01
in a negative way by
9:04
television and what I tried to
9:06
explain is that what Postman hit
9:08
upon there are the four legs
9:10
that the table of American culture
9:13
stands on: politics, democratic elections obviously;
9:15
journalism, as the First Amendment and
9:17
the way that makes possible democratic
9:19
participation, absolutely; often overlooked, but religion
9:22
forms the kind of moral and
9:24
ethical basis that our republic was
9:26
founded upon; and then education, as
9:29
the promise that people will be
9:31
literate, which, like, the bottom line
9:33
of education is reading, writing, and
9:35
arithmetic, that people will be literate
9:38
enough to be able to govern themselves,
9:40
to get access to information
9:42
and think rationally and make good
9:44
decisions. So do you want to
9:47
add to that? There's so much
9:49
here. I mean, when people talk
9:51
about, you know, typographic culture versus
9:53
televised culture, let's just zoom into
9:56
what do we really mean? Because
9:58
so much of Postman and Marshall
10:00
McLuhan is essentially a kind of
10:03
holding up a magnifying glass to
10:05
the invisible. When we say it
10:07
structures, you know, the way that
10:09
we think, like what do we
10:12
actually mean by that? What's the
10:14
phenomenology of reading text on a
10:16
page that's so different from watching
10:18
this podcast in a video right
10:21
now? Well for me, I mean,
10:23
the point in all of this
10:25
is to get us to really
10:27
see how every medium of communication
10:30
is acting on us by imposing
10:32
its own biases and logics, and
10:34
they are different. You know, Postman
10:36
talks about, you know, you have
10:39
the printed word, what is it
10:41
to read a book? What is
10:43
the exercise of reading? It's deliberative,
10:45
it's linear, it's rational, it's demanding.
10:48
What is TV? It's visual,
10:50
it's... It's entertaining. It's all
10:53
about imagery and action and
10:55
movement. What is social media?
10:57
It's reactionary. It's immediate.
10:59
It's algorithmic. It kind
11:02
of supercharges the
11:04
discontinuity of TV and
11:07
monetizes attention in new and
11:09
powerful ways, right? Once you have this
11:11
media ecology framework, you look at the
11:13
eras of politics that coincide with these
11:15
communication technologies. You can see it in
11:18
the language of politics. You can see
11:20
it in the kinds of people that
11:22
win elections, how they win those elections,
11:24
how they appeal to people. You can
11:26
see it in the movements and the
11:28
dominant forces at the time. Like I was
11:31
saying, it still blows my mind that
11:33
I made it through a graduate education
11:35
in political theory, and we never managed
11:37
to read any media ecology because it really
11:39
is, especially in a free and open society
11:41
where people can speak and think and persuade
11:43
one another. It's a kind of master variable
11:45
that's not often seen as that, but it
11:48
should be. So let's dive into that just for
11:50
a second. So you know, people think, okay,
11:52
we live in a democracy, you have
11:54
candidates, those candidates debate ideas, they have
11:56
their platforms, they talk about themselves, and
11:58
then voters are sitting there, and they
12:00
kind of take in all those arguments, and
12:02
they make a rational choice. And that's
12:04
just democracy. And democracy is democracy. It
12:07
doesn't change over the last 200 years.
12:09
So let's just explain, Sean, what you
12:11
were just saying, maybe Lance, you want
12:13
to do this. In what way does
12:15
media define the winners and losers of
12:18
our political world? I mean, Postman gives
12:20
so many examples, but. Well, I think
12:22
we have to start with the fact that,
12:24
you know, democracy was founded on the
12:26
idea that people have enough access
12:28
to information to make decisions,
12:31
you know, but it also
12:33
presupposes that people will talk
12:35
to one another and be
12:37
able to speak in a
12:39
rational way. I mean, Postman's
12:41
kind of... wonderful illustration is
12:43
of how people went to
12:45
listen to the Lincoln Douglas
12:47
debates for hours upon end.
12:49
And you can imagine there
12:52
was a carnival-like atmosphere, but
12:54
still that people were willing
12:56
to sit and listen for
12:58
whatever, six hours of debating
13:00
going on, whereas today everything
13:02
is reduced to these sound
13:05
bites, you know, these 10-second
13:07
sound bites. And, you know,
13:09
Postman points to two key
13:11
technologies. in the 19th century
13:14
that start the ball rolling
13:16
away from typography and ultimately
13:18
come together with television. One is
13:20
the telegraph, because just by speeding
13:23
things up, we have no time
13:25
to think and reflect, and that
13:27
really is harmful. So just the
13:29
speed at which we're moving, right,
13:31
which we see today, where we've,
13:33
you know, in this moment, we
13:36
feel overwhelmed, and there's like a
13:38
new story every few hours, some
13:40
new thing happening, and we don't
13:42
know what to do. And the
13:44
other thing is the image, the
13:46
photography. The photography of the 19th
13:49
century becomes the dominant
13:51
mode of communication. So
13:53
between the two, it's
13:55
all about appearance and
13:58
personality that's communicated. over
14:00
the televised image and this
14:02
rapid turnover that favors celebrity
14:05
and fame over substance. Yeah,
14:07
can I just say something real
14:09
quick? The telegraph is such
14:11
a good, practical
14:13
example of McLuhan's, you
14:16
know, the medium is the
14:18
message, you know, that how
14:20
the medium itself, the technology
14:23
itself, doesn't just influence content.
14:25
It really dictates what it actually
14:27
means. And I was going back
14:29
and I was reading Thoreau actually
14:31
when I was researching my book.
14:33
And you know, Thoreau was talking
14:35
about the telegraph as a kind of
14:37
proto-social media, that it was,
14:40
he's arguing,
14:42
actually changing what constituted
14:44
information, that with the telegraph it
14:47
became a commodity to be sold
14:49
and bought, right? We get the
14:51
birth of the penny presses and tabloid
14:53
journalism. For him, that was sort
14:55
of the end of the idea
14:58
that information was something that was
15:00
definitionally important or actionable. It just
15:02
became another source of entertainment.
15:04
It became a consumer product. So
15:06
much of our work in this podcast
15:09
and at CHT, obviously it's like this
15:11
question of why does any of this
15:13
matter? Like why are we here talking
15:16
about this? And it's because technology and
15:18
media are having a bigger and bigger
15:20
influence on constituting our culture. People always
15:23
say, you know, if culture is upstream
15:25
from politics, then now technology is constituting
15:27
the culture that is upstream from politics.
15:29
I was just at Davos in Switzerland,
15:32
and I would say the most popular
15:34
question being asked, and it was like
15:36
right on inauguration day January 20th, and
15:39
basically the dinners I was at, people
15:41
said, what do you think will matter
15:43
more in the next few years? The
15:45
choices of political leaders or the choices
15:47
of technology leaders and companies? And especially
15:49
when you tune into AI. And so
15:52
I just want to ground this for
15:54
listeners of like, why are we even
15:56
talking about this? It's because technology is
15:58
going to structure what human relationship is,
16:00
what communication is, how people know what
16:03
they know, the habits of mind. So
16:05
I just want to just make sure
16:07
we're returning to kind of set the
16:10
stakes of why this is so important
16:12
because so often I think the thing
16:14
that's problematic for me about Postman
16:17
is it just feels so abstract. McLuhan,
16:19
"the medium is the message," it doesn't
16:21
hit you about how significant that idea
16:24
is. So I just want to return,
16:26
Lance, to the thing you were saying
16:28
about the Lincoln-Douglas debates in the
16:31
1800s. People don't know, we kind of
16:33
raced past it. They debated for three
16:35
hours each, I believe it was one
16:38
guy took three hours, then the next
16:40
guy took three hours, then there was
16:42
like an hour rebuttal. Can you imagine
16:44
seven hours of political debates that are
16:47
long-form speeches in front of live audiences?
16:49
And just what a different notion of
16:51
the word democratic debate. So here we
16:54
are, we're using this phrase democratic debate,
16:56
but the meaning of it has completely
16:58
shifted, of what constitutes those two words
17:01
in the year 2025 versus the year
17:03
1862. And so let's just dive into
17:05
I think another aspect of why this
17:08
matters, which is the power that media
17:10
confers in the way it sets up
17:12
what kinds of people win or lose.
17:15
Sean, you look like you're trying to
17:17
jump in. What's interesting is that for
17:19
Postman, the TV era was all about
17:22
entertainment, right? Like everything that unfolded on
17:24
or through that medium. had to be
17:26
entertaining because that's what the laws of
17:28
TV demand. But this era, where TV
17:31
is still around, it still matters, but
17:33
not nearly as much, there's much more
17:35
of a convergence with other mediums, like
17:38
the internet and social media, which are
17:40
now more dominant, really, culturally, and politically.
17:42
And on these mediums, it's not about
17:45
entertainment so much as attention. The attention
17:47
economy is master now, right? In the
17:49
TV era, politicians really had to be
17:52
attractive and likable. They had to play
17:54
well on TV. Now they just have
17:56
to know how to capture and hold
17:59
attention, which means leaning into spectacle and
18:01
provocation and performative outrage, or virtue as
18:03
the case may be. They dictate a
18:06
different kind of political skill set
18:08
to win. One of the reasons
18:10
why both postman and McLuhan are
18:12
so prescient, at least that people
18:14
think of them that way, you
18:17
know, that they're, what they were
18:19
talking about largely television and yet
18:21
it seems to apply so well
18:24
to today. And for some, for
18:26
many people, it seems to better
18:28
fit today, is that their analysis
18:31
was based not on, not just
18:33
on the specific medium of television,
18:35
but on the idea of electronic
18:38
media generally. But I think
18:40
entertainment was Postman's way of
18:42
getting at the larger point,
18:44
which is that it's trivial,
18:47
it's not serious, and what
18:49
catches our attention, it's a
18:51
larger set of dynamics, and
18:54
entertainment was just kind of
18:56
way of pinpointing, but it
18:58
really is that non-serious trivialization.
19:01
It's a different kind of
19:03
entertainment. So I just want to name a
19:05
pushback that I got when I remember
19:07
speaking to these arguments in the tech
19:09
industry when I was at Google in
19:12
2013. Which is, some people might
19:14
say, well, why is that a problem
19:16
if people like to amuse themselves? People
19:18
like amusement. Don't we all need some
19:20
amusement in the world? What would you
19:22
say to that, or what would Postman's
19:24
argument be against that? Well, Postman wasn't
19:27
against amusement. You know, he said
19:29
television is great. The best thing
19:31
about TV is junk. He loved
19:34
TV, especially sports. You know, we
19:36
actually bonded together as Mets fans,
19:39
although his real love was the
19:41
Brooklyn Dodgers, but you know, in
19:43
their absence, it was the Mets.
19:46
He also, you know, loved basketball.
19:48
and all of that. I mean
19:50
sports is one of the great
19:53
things that television can provide. It's
19:55
awful for politics. It's awful for
19:57
religion. I mean it really... has
20:00
degraded religious participation and presentation by
20:02
putting it on television and also
20:04
through social media and all of
20:07
the other advances that we've seen.
20:09
And it's bad for education. So
20:11
I would go back to what
20:14
you were saying earlier about distraction,
20:16
which is a really important word.
20:18
I think that's more closely pegged
20:20
to the role of technology here,
20:23
fragmenting our attention, pulling us around
20:25
like greyhounds chasing around a slab
20:27
of meat. I mean, I was
20:30
talking to Chris Hayes the other
20:32
day, who was on my show
20:34
and he has a new book
20:36
out about attention and the fragmentation
20:39
of attention and really sort of
20:41
the death of mass culture in
20:43
any meaningful sense, right? And I
20:46
was asking him, well, I mean,
20:48
isn't democracy on some level... a
20:50
kind of mass culture and if
20:52
we can't pay attention together, if
20:55
we can't focus on anything together,
20:57
then what the hell does that
20:59
make of our democratic politics, right?
21:02
I mean, that's what concerns me,
21:04
right? I mean, I remember, you
21:06
know, reading McLuhan, who, you know,
21:09
would talk about... media and time
21:11
and he was so obsessed with
21:13
electric media because it flattened time
21:15
and it made everything instantaneous.
21:18
And he would argue that this
21:20
sort of scrambled society's relationship to
21:22
time and you know like radio
21:25
and TV and now the internet
21:27
create this landscape where everything unfolds
21:29
in real time, but you know
21:31
in a print dominated culture where
21:34
you're consuming weekly or monthly magazines
21:36
or quarterly journals or books, that
21:38
facilitates a kind of deliberation and
21:41
reflection that you don't get when
21:43
everything is so immediate and frenzied
21:45
and in a democracy where the
21:47
horizon of time is always the
21:50
next, the hell of the next
21:52
election, it's the next news cycle.
21:54
That kind of discourse makes it
21:57
very hard to step back and
21:59
think beyond the moment. It makes
22:01
it very difficult to solve collective
22:03
action problems. And all the
22:06
most important problems are collective
22:08
action problems. Totally. Yeah, I
22:10
think, to sort of give my interpretation of
22:12
what you're both saying is that
22:15
there isn't a problem with people
22:17
having amusement in their lives or
22:19
having entertainment. It's about whether the
22:21
media systemically structures the form of
22:23
all information in terms of its
22:26
amusing capability or its entertainment capability,
22:28
and that that systemic effect makes
22:30
us confused about whether we're actually
22:32
consuming information or getting educated versus
22:35
we're really just being entertained. And
22:37
he says, you know, the basic
22:39
quote, the television is transforming our
22:41
culture into one vast arena for
22:44
show business. And that was for
22:46
the television era. When I think
22:48
about social media era and I
22:50
think about Twitter or X, I
22:52
think, you know, social media is
22:54
transforming our culture into one vast
22:56
gladiator stadium arena for basically drama
22:58
and throwing insults and, you know,
23:00
salacious tweets back and forth. Another
23:03
sort of key concept that
23:05
Postman is critical of is the
23:07
information action ratio. And I remember
23:09
this actually in the tech industry
23:11
that so many people, and I
23:13
used to really believe, How many
23:15
problems really had to do with
23:18
people just not having access to
23:20
the appropriate information, which is all
23:22
about information access? I mean, I
23:24
had a tiny startup called Apture
23:26
that was a talent acquired by Google that
23:28
was all about giving people contextual access to
23:30
more information. I remember it. Do you remember
23:33
that? Okay. Yeah, yeah, it was good. Yeah,
23:35
well, thank you. I mean, it was motivated
23:37
by, I think the good faith version of
23:39
this, which is that if people don't have...
23:42
imagine, you know, right when you're encountering something
23:44
that you basically have
23:46
no reason to be interested in, the perfect,
23:48
most engaging professor, guide, lecturer, you know,
23:51
museum curator showed up and held your
23:53
hand and suddenly just told you why
23:55
this thing that you're looking at is
23:57
the most fascinating thing in the world
23:59
and that's what this little Apture thing
24:01
was. It was basically providing instant
24:03
contextual, rich information that was supposed
24:05
to entrance you and deepen your
24:07
curiosity and understanding about everything. And
24:09
it was driven by my belief,
24:11
which is very common in the
24:13
tech industry, that it's all about
24:15
driving so much more information. And
24:17
if we only just gave people
24:19
more information, then that would suddenly
24:21
make us respond to climate change
24:23
or respond to poverty or do
24:25
something. And so I'd love for
24:27
you to articulate what was Postman's
24:29
kind of critique of information glut
24:31
and the information action ratio he
24:33
speaks of. Well, you know, I
24:35
mean, what he would say is
24:37
that in the 19th century not
24:40
having enough information was a problem,
24:42
but we solved it. We solved
24:44
it long ago, and that
24:46
creates new problems because
24:48
we just keep going and going
24:50
and going and going. I mean,
24:52
I would say, you know, think
24:54
about how most of human history
24:56
not having enough food was a
24:58
problem, and today we are wrestling
25:00
with issues of obesity because we
25:02
solved that problem a long time
25:04
ago. We've got plenty of food,
25:06
but we just keep going and
25:08
going and going and going. So
25:10
I mean, this was actually one
25:12
of McLuhan's points is that you
25:14
pushed things far enough and you
25:16
get the reverse. You get it
25:18
flipping into its opposite.
25:20
So with information scarcity, by solving it
25:22
we create a new problem of
25:24
information glut, and that leads us,
25:26
you know, as you said, since
25:28
most of that we're powerless to
25:30
do anything about, it leaves us
25:32
with irrelevant information, leaving us feeling
25:34
impotent, powerless, which again I think
25:36
a lot of people are feeling
25:39
particularly right now. Yeah, I always
25:41
found with those types there's a
25:41
tendency to conflate information and truth,
25:43
as though they're the same, and
25:45
they are not the same. I
25:47
don't know how anybody can look
25:49
at the world right now and
25:51
say that this superabundance of information
25:53
has been a boon for truth.
25:55
And to the point that Lance
25:57
is just making, it's this combination
26:01
of being constantly bombarded
26:03
with information. Most of
26:05
it true, a lot of it bullshit, a
26:07
lot of it terrible, being bombarded
26:10
with that and also the
26:12
simultaneous experience of complete impotence
26:14
in the face of that.
26:16
We've also engineered an environment
26:19
that elevates the lies,
26:21
it elevates the falsehoods, it
26:23
elevates the distractions, it elevates
26:25
the things that stimulate our
26:28
more base primal impulses.
26:30
And that in the
26:32
contest between diversions, amusements,
26:34
provocations and dispassionate truth, I
26:37
think we all know who's going
26:39
to win that fight 99 times
26:41
out of 100. And I would
26:43
think it's really important
26:45
to distinguish between information and
26:48
knowledge and knowledge is something
26:50
that we largely got from
26:52
books. And information is something
26:55
that we are inundated with through
26:57
the electronic media. And it
26:59
doesn't really have to be
27:02
true or false. And that's
27:04
why in a way the
27:06
distinction, while valuable in some
27:09
contexts, the distinction between
27:11
misinformation, disinformation, and just information,
27:13
is not that important, because
27:15
you know, when we have information
27:17
glut, anything goes. You can't tell
27:19
what's what because it's not relating
27:21
to anything out there. That's a
27:23
critical point that you're making, because
27:26
even, let's say, we solved the
27:28
misinformation, disinformation problem, boom, it's gone,
27:30
it's all gone from all the
27:32
airways. You're still just bombarded by
27:34
information glut and information that doesn't
27:36
give you agency over the world
27:38
that... that you're seeing. The companies profit from
27:40
mistaking and reorienting or restructuring what agency means
27:42
in terms of posting more content on social
27:45
media. So I see the social cause that's
27:47
driving me to emotion and then I hit
27:49
reshare and think that I've like done my
27:51
social action for the day. I think Malcolm
27:54
Gladwell wrote about this like 10 years ago,
27:56
so that the kind of failures of tech
27:58
solutionism. I'm going to reshare this
28:00
content, what I'm really doing is
28:02
actually driving up more things for
28:04
people to look at and keep
28:07
getting addicted on social media. So
28:09
I'm perpetuating the money printing machine
28:11
that is the social media company.
28:13
I want to actually get us
28:15
to AI because so much of
28:17
this conversation was really motivated for
28:19
me about how do we become
28:22
a more technology-critical culture, which
28:24
I think is what Postman was
28:26
all about. It's like, what does
28:28
it look like to have a
28:30
culture that can adopt technology in
28:32
conscious ways aware of the ways
28:34
it might restructure community, habits of
28:37
mind, habits of thought, education, childhood
28:39
development, and then consciously choose and
28:41
steer or reshape that technology impact
28:43
dynamically such that you get the
28:45
results you would want by adopting
28:47
that technology. And in doing that,
28:49
I think I want to turn
28:52
at this point in the conversation
28:54
to his other book, Technopoly, which
28:56
he wrote several years later, which
28:58
the subtitle is The Surrender of
29:00
Culture to Technology. And I think
29:02
this is actually the heart of
29:04
what I'm... I mean, I think
29:06
that Amusing Ourselves to Death is
29:09
a very accessible thing for most
29:11
people, and the race to the
29:13
bottom of the brainstem and social
29:15
media as an extension of TV.
29:17
I think Technopoly really gets to
29:19
the heart of what does it
29:21
mean to have a society consciously
29:24
adopt technology in ways that it
29:26
leads to the results that it
29:28
wants? And what does that relationship
29:30
look like? So how would we
29:32
set the table of the argument
29:34
that Postman is making in Technopoly,
29:36
either of you? His idea of
29:39
technology is really like a more
29:41
accessible expression of Heidegger's critique of
29:43
technology. Technologies are things we use
29:45
in the world to get things
29:47
done or improve our experience in
29:49
the world. And then gradually as
29:51
we move into the modern world,
29:54
technology becomes almost a way of
29:56
being. As Postman says, we became
29:58
compelled by the impulse to invent.
30:00
It's innovation for the sake of
30:02
innovation. It is a blind mania
30:04
for progress, disconnected from any fixed
30:06
purpose or goal and that's sort
30:09
of what Postman is calling Technopoly,
30:11
where our whole relationship to the
30:13
world is defined by and through
30:15
technology. Technology is this autonomous self-determinative
30:17
force that's both undirected and independent
30:19
of human action and we're almost
30:21
a tool of it rather than
30:24
the other way around. Here's Postman
30:26
in his own words. Well in the
30:28
culture we live in... Technological
30:30
innovation does not need to be
30:33
justified, does not need to be
30:35
explained. It is an end in itself
30:37
because most of us believe
30:39
that technological innovation and human
30:42
progress are exactly the same
30:44
thing, which of course is
30:46
not so. Postman was talking
30:48
about the personal computer as
30:51
a quintessential technology of Technopoly.
30:53
I mean, my God, what
30:55
would he make of... AI, which by
30:57
any measure, is and will be
31:00
far more immersive and totalizing than
31:02
personal computers. I just want
31:04
to briefly add the quote that
31:06
Postman cites from Thoreau, since we
31:08
mentioned it multiple times, that our
31:10
inventions are but an improved means
31:12
to an unimproved end. I think
31:15
this really speaks to what you're
31:17
speaking about, Sean, which is Postman's
31:19
critique that we deify technology. We
31:21
say that efficiency and productivity and
31:23
all the new capabilities, whatever they
31:25
are, the technology brings, are the
31:27
same thing as progress, that technology
31:29
progress is human progress. And it's
31:31
never been more important to interrogate
31:33
the degree to which that's true
31:36
and not true. And this is not
31:38
an anti-technology conversation, but it's about... How
31:40
do we get critical about it? Lance,
31:43
you were going to jump into that?
31:45
Well, first I'd say that Postman
31:47
would say that Heidegger was a
31:49
Nazi and should not be mentioned
31:51
anymore, but that the big influences
31:53
on Technopoly were Lewis Mumford, who
31:55
was one of the great intellectuals
31:57
of the 20th century and a
31:59
key media ecology scholar, and then Jacques
32:01
Ellul. And it definitely is this
32:03
argument that particularly in America, it's
32:05
not about the stuff, it's not
32:07
about the gadgets, it's about a
32:10
whole way of looking at the
32:12
world and that efficiency becomes the
32:14
only value that we make any
32:16
decisions on. You know, which means
32:18
that it's almost impossible to say
32:20
no when somebody goes, here's a
32:22
more efficient way to do this.
32:24
You can do it faster, do
32:26
more with it, and we almost
32:28
never say no. And you must
32:30
have seen this new thing about
32:32
mirror genes or whatever, the, you
32:34
know, mirror bacteria. Yeah, whether they
32:36
can create organisms with mirror image,
32:38
DNA, which our bodies, our
32:41
immune systems, would have absolutely
32:43
no defense against. And so we
32:45
shouldn't do it. Well, somebody's going
32:47
to do it. I mean, you
32:49
know that somebody is going to
32:51
do it because once we have
32:53
that capability, nobody puts a stop
32:55
to it. You know, Postman did
32:57
know about AI because that's been
32:59
around, you know, for much longer
33:01
than people, you know, than this
33:03
sudden emphasis on it. And Joseph
33:05
Weizenbaum, who was somebody that Postman
33:07
knew, was one of these
33:10
sort of pioneers in artificial intelligence.
33:12
He did the ELIZA program,
33:14
and in his book, Computer Power
33:16
and Human Reason, you know, he
33:18
introduces the word ought that we've
33:20
forgotten to use, O-U-G-H-T, you
33:22
know, ought we do this? Not
33:24
can we do this, but ought
33:26
we do it, and that
33:28
has just vanished from our vocabulary.
33:30
And, you know, he argues that
33:32
we need to reintroduce it. You
33:34
know, I always think of that
33:36
hilarious John Stewart joke, you know,
33:39
that the last words a
33:41
human being will ever utter will
33:43
be, you know, some dude in
33:45
a lab coat who says, "It
33:47
worked."
33:49
I would ask you a question.
33:51
I mean, you were part of
33:53
this world in a way. I
33:55
am not. You talk to these
33:57
people. The people who are building
33:59
AI, who want to build AGI
34:01
and whatever else, I mean, they
34:03
are acutely aware of how potentially
34:05
destabilizing it can be. Why did
34:07
they persist in that? Is it
34:10
just a simple, well, if we
34:12
don't do it, China is going
34:14
to do it? It's actually related
34:16
to what Lance is speaking about,
34:18
that if we don't have a
34:20
collective ability to choose which technology
34:22
roads we want to go down
34:24
and which ones we don't, and
34:26
if we just say it's inevitable,
34:28
someone's going to do it and
34:30
better we, the good guys, who
34:32
we think we have better values
34:34
than the other guys, better off
34:36
that we do it first, so we
34:39
actually even know what the dangers
34:41
are and can try to defend
34:43
against the bad guys. And I
34:45
think that the thing that, you
34:47
know, you were just speaking about
34:49
with the mirror bacteria, is a
34:51
perfect example because the reason that
34:53
Postman's questions here about how do
34:55
we consciously make decisions about what
34:57
technologies we should do and not
34:59
want to do rather than just
35:01
because we can, we do it,
35:03
is because AI is about to
35:05
exponentially increase the introduction of new capabilities
35:08
into society. So it's just it's
35:10
going to be a Cambrian explosion
35:12
of brand new text and media
35:14
and generative everything that you can
35:16
make. You can make law, you
35:18
can make new religions, you can
35:20
make. As we say, language is
35:22
the operating system of humanity, from
35:24
code to law to language to
35:26
democracy to conversation, and now a
35:28
generative AI can synthesize and decode
35:30
and hack the language either of
35:32
conversation in the form of misinformation,
35:34
hack code in the form of
35:37
hacking cyber infrastructure, hack law in
35:39
the form of overwhelming our legal
35:41
systems or finding loopholes in law.
35:43
And so as we're unleashing all
35:45
these new capabilities, it is more
35:47
important than ever that we get
35:49
an ability to consciously choose, do
35:51
we want to do mirror bacteria?
35:53
But then the challenge is, as
35:55
technology democratizes the ability for more
35:57
people to do more things everywhere
35:59
across global boundaries. Our problems
36:01
are international, but our governance is not
36:04
international. We have national governance responding to
36:06
global interconnected issues. And then we can
36:08
see the political headwinds are not really
36:10
trending in the direction of global governance,
36:12
which is looked upon as a kind
36:14
of a conspiracy of people who are
36:16
out of touch with the national interests
36:18
of the people, which is a very
36:20
valid critique. So yes, Sean, I'm sort
36:23
of wanting to play with you here
36:25
on what's your relationship to this question
36:27
that you're laying out? I don't know. I
36:29
mean, I'm just constantly thinking
36:31
of what are the tradeoffs going to
36:34
be. I mean, you just think about
36:36
the explosion of the internet and
36:38
the tradeoffs involved there. One
36:41
consequence of that, there are
36:43
a lot of incredible benefits. I
36:45
love the interwebs. I use them
36:47
every day. But one of the
36:49
consequences of that is the complete
36:52
destruction of gatekeepers, of any
36:54
kind of boundaries at all
36:56
on the information environment. We lost
36:58
the capacity, society lost the
37:01
capacity, to dictate the stories
37:03
society was telling about itself.
37:05
And, you know, digital just
37:07
exploded all that. You know,
37:09
the internet is like this
37:11
choose-your-own-adventure playground,
37:13
and it unsettles and undermines
37:15
trust and a lot of people
37:18
might say, well, good, these institutions,
37:20
the elites, were corrupt and untrustworthy
37:22
to begin with. Okay, fine. But
37:24
we tend to under-appreciate how much what
37:26
we take to be true is really
37:29
just a function of authority. Most of
37:31
us haven't observed an electron or a
37:33
melting glacier. We take it to be
37:35
true because we believe in the experts
37:38
who tell us these things are real.
37:40
And we believe the video clips on
37:42
the evening news of glaciers melting. But
37:44
if that trust is gone and the
37:46
info space is this hopelessly
37:49
fragmented thing riddled with
37:51
deep fakes and misinformation,
37:53
reality isn't possible anymore,
37:55
then where does that leave us? I
37:57
will say I think there's actually a
37:59
way to get to a good
38:01
world is just we have to
38:03
distinguish between the internet being a
38:05
problem versus the engagement-based business models
38:07
that profited from drama derivatives, you
38:09
know, the amusement culture, the tweetification
38:11
culture, and personalized information bubbles which
38:13
are incentivized. So it's important to
38:15
recognize the reason we have personalized,
38:17
it's not just that you can
38:19
choose your adventure, it's also true,
38:21
but the mass like reinforcement of
38:23
personal information bubbles is actually incentivized
38:26
by the business models because... it's
38:28
better to keep you coming back
38:30
if I give you more of
38:32
the thing that got you interested
38:34
last time. And so we can
38:36
we can split apart the the
38:38
toxic thing of the engagement based
38:40
business models from the internet. And
38:42
then I think you could say
38:44
is there a different design of
38:46
internet protocols and design of these
38:48
Metcalfe monopolies, meaning network effect-based social
38:50
media places where there's only a
38:52
handful of them, could they be
38:54
designed in a different way that
38:56
actually do reward the kinds of
38:58
mediums that actually enrich and bring
39:00
out the better angels of human
39:02
nature? And that's still the optimist
39:04
in me that believes that it's
39:06
possible to do that. Lance, I
39:08
see you sort of nodding and
39:11
also maybe skeptically nodding your head
39:13
here, so feel free to jump
39:15
in. Well, I mean, I think
39:17
Postman would question whether more technology is
39:19
the answer, and every new innovation
39:21
solves some problems, but creates many
39:23
more, which we then solve by
39:25
more technologies, and it just keeps
39:27
expanding and expanding and expanding that
39:29
way. You know, when I teach
39:31
my students media ecology, I try
39:33
to emphasize, let's think about what
39:35
are the appropriate uses for this
39:37
particular medium, and then what's inappropriate?
39:39
And, you know, if we can
39:41
start with that, the internet... or
39:43
various aspects of it were great
39:45
for certain things. It empowered
39:47
people who were, you know, kind
39:49
of in minorities and brought together
39:51
people who were having difficulties in
39:53
a lot of ways. I can
39:55
speak just in terms of my
39:58
own family with having raised an
40:00
autistic child, that, you know, parents
40:02
of autistic children are largely unable
40:04
to like go to a self-help
40:06
group in person because your hands
40:08
are full and being able to
40:10
communicate over a discussion list or
40:12
group online was, you know, very
40:14
valuable. So, you know, this is
40:16
where we face this problem of
40:18
trying to evaluate the costs and
40:20
benefits. There is a vision of
40:22
a world that would work. And
40:24
I agree with you, Lance, that
40:26
it actually, it takes asking what
40:28
are the appropriate uses of a
40:30
technology and the actively inappropriate uses
40:32
and then consciously designing our social
40:34
structures, our social norms, our culture,
40:36
like not designing, but like, you
40:38
know, practicing cultural values that allow
40:40
us to say, how do
40:42
we reward those appropriate uses and
40:45
anti-reward the inappropriate uses. Now I
40:47
want to just move a little
40:49
bit from admiring the problem because
40:51
there's a tendency to kind of
40:53
re-hash all these things and I
40:55
think Postman is unique in offering
40:57
I don't know if I'd call
40:59
it solutions, but a form of
41:01
taking an active and agentic stand
41:03
on technology. And he has this
41:05
famous lecture series where he outlined
41:07
seven questions that we can ask
41:09
of any new technology. And he
41:11
said that these questions are a
41:13
kind of permanent armament with which
41:15
citizens can protect themselves from being
41:17
overwhelmed by technology. You know, the
41:19
first is what is the problem
41:21
to which this technology is the
41:23
solution? What is the actual human
41:25
or social problem for which that
41:27
technology is the solution. It's a
41:30
very basic question, but it's a
41:32
very powerful one. So anyway, we
41:34
can go into some of the
41:36
others, but I'm just curious if
41:38
either of you have a reaction
41:40
to this or as we move
41:42
into a more of a solutions
41:44
oriented posture. You know, Sean, what's
41:46
your sense of this? I think
41:48
it's a great question. I just
41:50
go back to what we were
41:52
saying a minute ago. How do
41:54
we answer it? What is a
41:56
mechanism for having that conversation?
41:58
Technology is very good at giving us
42:00
more of what we want. It
42:02
cannot tell us what's worth wanting
42:04
in the first place. And the
42:06
problem is I don't know how as
42:08
a society we have that conversation
42:11
together about what's worth wanting and
42:13
then have a conversation about how
42:15
to go about getting it. I
42:17
just don't know. And the problem with
42:19
some of these new technologies like AI
42:22
is it's not even clear what they're
42:24
going to do. So it's very hard
42:26
to talk about the trade-offs that might
42:28
be involved. But I don't know, it's
42:31
not a very good answer because I
42:33
don't have one, I guess. Well, and it's
42:35
interesting because I think that, so
42:37
one of the things that actually
42:39
excites me about AI is the
42:41
ability to use it to more
42:44
quickly augment society's ability to see
42:46
the downsides and externalities and play
42:48
out simulations of various new technologies.
42:50
Because one of the things that
42:52
we have to get incredibly good
42:54
at is actually foreseeing the negative
42:56
unintended consequences before they happen. So,
42:58
you know, imagine inventing plastics, but
43:00
actually knowing about forever chemicals and then taking
43:03
a left turn, so we don't go down the
43:05
road of creating more, you know, pollution than
43:07
we have the capacity to clean up. And
43:09
the same thing with social media, and
43:11
that's one of Postman's other questions, is
43:13
whose problem is it? So if it's
43:16
the problem of not being able to
43:18
generate content at scale, whose problem
43:20
was that? This is the basic second
43:22
question. The third question is what new
43:25
problems will be created by solving this
43:27
problem with this technology? So in the
43:29
case of generative media, we will create
43:32
a new problem of people who have
43:34
no idea what's true because now anybody
43:36
can create anything and flood the information
43:38
airwaves and then he asks which people
43:41
and institutions will be most harmed
43:43
by the adoption of this technology.
43:45
So for example gatekeepers or the
43:47
idea of trustworthiness, or having, you know,
43:49
any kind of authority or expertise is
43:51
suddenly going to be eliminated by the
43:53
fact that there's a flood of information
43:55
kind of a denial of service attack
43:57
on democracy through all this stuff.
43:59
And then he has this really
44:02
important subtle question that he asks,
44:04
what changes in language are being
44:06
promoted by this technology? And I'm
44:08
curious, Lance, if you have some
44:10
examples that Neil has given on
44:13
that one, because I think it's
44:15
such a crucial one that's very
44:17
subtle. Well, sure. And I think
44:19
it's actually a very important one.
44:22
And you're right that it does
44:24
sort of take a left turn
44:26
from the other questions. But what's
44:28
often missed when folks just look
44:30
at like Amusing Ourselves to Death
44:33
and Technopoly is that Postman's
44:35
grounding was in the study of
44:37
language. He started
44:39
out in English education, and
44:41
he was also very much associated
44:44
with general semantics which in in
44:46
a large part is about our
44:48
use of language and trying to
44:50
understand our misuse and how that
44:53
changes our thinking. I mean I
44:55
think for me a great example
44:57
is community. And when you think
44:59
about the use of the word
45:01
community: in a real community, people
45:04
are together and they don't all
45:06
share the same interests and viewpoints,
45:08
unlike what we mean when
45:10
we talk about online community, virtual
45:12
community, and that's where you get
45:15
that siloing effect. You know, in
45:17
a real community, people have to
45:19
negotiate with people who are very
45:21
different from themselves and find a
45:23
way to live together. And you
45:26
can't just like pick up and
45:28
leave, you know, where you live,
45:30
whereas on the internet, you can
45:32
just, you know, click a button
45:35
and you've left that community, and
45:37
you find one that's more to
45:39
your liking. So that meaning of
45:41
the word community has changed drastically
45:43
by that usage. And that is
45:46
also, you know, you could also
45:48
connect that back to a kind
45:50
of Orwellian quality because that was,
45:52
you know, the idea in 1984,
45:54
and it's expressed in the appendix,
45:57
that we can change the meaning
45:59
of words and change the way
46:01
people think. That may not be
46:03
happening all that intentionally as it
46:05
was under a totalitarian system. And
46:08
it actually did happen under Nazi
46:10
Germany and in the Soviet Union. But
46:12
it's still happening and it's still
46:14
changing the way we think. I
46:17
think it's an excellent point. And
46:19
then it feeds back into real
46:21
community. So when people are in
46:23
real community, their expectations have been
46:25
formed by these online experiences and
46:28
these new definitions for words. Sean?
46:30
I guess I've done a lot of
46:32
technology bashing here and I just want
46:34
to say, not all of
46:39
our problems can be laid at the
46:39
feet of technology. I mean, it is
46:41
also true that over the last
46:43
three, four decades, we have stopped
46:45
as a society investing
46:48
in social infrastructure, community
46:50
centers, libraries, third spaces,
46:52
where people can actually get
46:55
together and talk and be with one
46:57
another and engage their community and not
46:59
just be home alone ordering pizzas with
47:01
the app so that they don't have
47:03
to engage with another human being in
47:06
the entire process, right? So my worry
47:08
is that these technologies have pushed society
47:10
in a more solipsistic direction. It's pulling
47:12
us more inward, à la the movie Her.
47:14
I feel like that's where we're going,
47:17
where people are just they're going to
47:19
be in relationship with chat bots. They're
47:21
going to be you know at home
47:23
using VR technology or whatever and they're
47:25
going to stop going outside and doing
47:27
things with other people and so we
47:30
have failed on both fronts and there
47:32
are policy solutions that
47:34
could counterbalance some of this if we
47:36
invested in those things and we we
47:38
haven't or we stopped and we should
47:40
again. I agree. I just wanted to
47:42
name one other example of language
47:44
change that is happening without us
47:47
really reckoning with it, which is Elon's
47:49
redefinition of free speech when
47:51
he takes over Twitter to protect
47:53
people's ability to reach millions of
47:55
people anonymously inside of a news
47:58
feed that rewards the most salacious
48:00
inflammation of cultural fault lines in the
48:02
culture war, a system that
48:04
just rewards the toxicity of inflammation
48:07
on cultural fault lines everywhere, and then
48:09
saying that that's about free speech. It's
48:11
a kind of
48:14
newspeak, a kind of turn
48:16
on what freedom of speech was
48:18
really meant to protect in its original
48:20
essence as defined by the founding fathers
48:23
and it had nothing to do with
48:25
or it certainly did not foresee a
48:27
world where a single person could reach
48:30
200 million people every day with their
48:32
thumb as many times as they wanted
48:34
to, and that's a different thing than
48:36
the deeper ideas. And so I just
48:39
think about that question of language, and
48:41
just imagine a society that is actually
48:43
asking that question. So
48:46
imagine that sort of a Postman-informed society.
48:50
And every time there's a new technology
48:52
rolling out, their immediate first thoughts are
48:55
instead of being entranced by it and
48:57
deifying the technology and deifying the solution:
48:59
whose problem is that? What are the
49:02
new problems that are going to be
49:04
created by this technology? What are the
49:06
changes in language that it's actually hiding
49:09
from us about the way it's reconstituting
49:11
things? So I just, I feel like
49:13
that's a vision of society that I'm
49:15
reminded of the, I think it's the
49:18
opening chapter of Technopoly where he talks
49:20
about the story of Thamus. It's really
49:22
about what is a conscious adoption strategy
49:25
of technology where in that story they're
49:27
actually talking about should we adopt the
49:29
written word and they're sort of talking
49:31
about that as a choice and noticing
49:34
all the things that that's going to
49:36
give and also it's what it's going
49:38
to do and also which things it's
49:41
going to undo in the society? And
49:43
I just feel like that's so
49:45
within reach is to have cultures that
49:47
actually are critical of technology in which,
49:50
you know, Postman is part of the
49:52
curriculum of, you know, political science courses
49:54
at every university and part of undergraduate
49:57
education. And it's all the more important
49:59
because technology is so central in the
50:01
fundamental shaping forces of the entire world.
50:03
So maybe I'm just a dreamer but
50:06
this is the... Can I ask you
50:08
a question? Do you think it's the
50:10
responsibility of the people building these technologies
50:13
to ask themselves these questions or do
50:15
you think it's the responsibility of the
50:17
public to ask and answer these questions
50:19
and then impose their solutions? It's all
50:22
the more important that the people building
50:24
it have a critical understanding of what
50:26
it will do, because their being
50:29
in the driver's seat, at the control
50:31
panels of how it's going to roll
50:33
out, means that it's even more important
50:35
that they're contending with these
50:38
questions than it is with the regular
50:40
public and I think the regular public
50:42
needs to contend with it as maximally
50:45
as possible. Lance? Well, I mean, the
50:47
history of invention shows that inventors... pretty
50:49
much are wrong about what their technology
50:52
is going to do. And so they're
50:54
the last people, I think Arthur Koestler
50:56
called them sleepwalkers, you know, that, I
50:58
mean, television's a great example because when
51:01
television's introduced or, you know, especially in
51:03
the post-war period, all of the write-up
51:05
of it is it's going to bring
51:08
culture into everyone's home. They'll have opera
51:10
and ballet and classical music and it's
51:12
going to be wonderful for democratic politics
51:14
because we'll be able to televise political
51:17
conventions and people will see, you know,
51:19
debate and discussion on political issues. You
51:21
know, and they couldn't be more wrong
51:24
and I, you know. I think there's
51:26
a great spirit of play that comes
51:28
with invention. It's just, you know, to
51:30
see, you know, what can be done,
51:33
what we can do. But I don't
51:35
even know if an AI program, I
51:37
mean, you mentioned this before, Tristan, but
51:40
I don't know if it can...
51:42
I don't know if it can adequately foresee
51:44
all of the consequences because you introduce
51:46
a change into a highly complex interdependent
51:49
system. It's going to change something, it's
51:51
going to change other things, they're going
51:53
to... interact with one another? It's a
51:56
complex system for sure. Yeah. And to
51:58
be clear, I want to say a
52:00
couple of things. I agree that we
52:02
don't, and others don't, have a good
52:05
track record of foreseeing the consequences of
52:07
their invention. I do think that there
52:09
are tools one can use to much
52:12
better foresee what those consequences will be.
52:14
In 2013, how could I foresee that
52:16
the attention economy would lead to a
52:18
more addicted, distracted, sexualized society? It's because
52:21
the incentives at play help you predict
52:23
the outcome. And I think an incentive
52:25
literate culture that follows the Charlie Munger
52:28
quote, you know, if you show me
52:30
the incentive, I'll show you the outcome.
52:32
If we can understand what the incentives
52:34
are, you can get a very good
52:36
sneak preview of the future. I don't
52:38
think it's an easy thing to reach
52:40
for, but I think it's something that
52:42
we need more of if we're going
52:44
to be a technology enhanced society
52:47
and actually make it through, because
52:49
we're quite in danger now. Sean? Yeah, look,
52:51
even if the answer to these questions is,
52:53
you know, in the words of Nate Bargatze,
52:55
nobody knows, we should still be
52:58
asking them. That would at least
53:00
be a start. That's just not
53:02
something that we've done or are
53:05
doing. I think one of the
53:07
real needs is to
53:09
really reinforce literacy and that
53:11
this is ultimately what's
53:14
being threatened because that is
53:16
the foundation of democracy and
53:18
it's the foundation of the
53:21
Enlightenment. Postman's last book
53:23
was Building a Bridge to
53:26
the 18th Century, which wasn't
53:28
saying that we should go
53:30
back to the 1700s, but
53:33
that we should retrieve from
53:35
that era that literacy, typography,
53:38
the Enlightenment, and the respect
53:40
for science and democracy that
53:42
existed back then, that we
53:45
need to reinforce those elements of
53:47
the media environment that the electronic
53:49
media are really doing away with.
53:51
And when you say, what is
53:54
the problem that AI is going
53:56
to solve? And I actually mentioned
53:58
it before. I mean, information glut
54:01
is one of the problems that
54:03
it's there to solve. But I
54:05
think one of the problems is
54:07
that reading and writing are hard.
54:09
They're hard to do. Anyone who
54:11
has written a book, you know,
54:13
will tell you that, what could
54:15
be more unnatural than sending a
54:17
five-year-old to sit still for hours
54:19
on end, you know, but that's
54:21
what you need to learn how
54:23
to read and write. So
54:25
what are we doing? I mean
54:27
we've been doing this
54:29
for a long time now We're
54:31
developing technology to read for us
54:33
and to write for us. I
54:35
mean, that's what AI voice synthesis
54:37
and voice recognition are for. That's what it's
54:39
all doing. So we don't have
54:41
to do it ourselves So the
54:43
way to at least try to
54:45
mitigate this is by reinforcing those
54:48
aspects of the media environment that
54:50
we still have that are under
54:52
assault today. Yeah, I would just
54:54
say that in a lot of
54:56
ways the problem of our time
54:58
is this misalignment between our interests
55:00
and our incentives and the tragedy
55:02
really is that we have built
55:04
tools that have undermined our capacity
55:06
to alter our incentive structures in
55:08
healthy ways. Exactly. That is it.
55:10
If our whole damn problem could
55:12
be distilled, that's it. I don't
55:14
know what to do about that,
55:16
but that's the challenge ahead of
55:18
us, and we've got to figure
55:20
it out. Completely, completely agree. If
55:22
incentives can control the outcome, then...
55:24
And governance is normally the ability
55:26
to change what those incentives are.
55:28
You pass a law or a
55:30
policy and you build social norms
55:32
and consensus in order to get
55:34
that law or policy passed to
55:36
change and say, hey, you're not
55:38
allowed or you can't profit from
55:40
this thing that would be highly
55:43
profitable, like whether it's underage, drugs,
55:45
sex trafficking, whatever the thing is.
55:47
So I completely completely agree. I
55:49
know we're basically here out of
55:51
time and just want to close
55:53
with this quote, that no
55:55
medium is excessively dangerous if its
55:57
users understand what its dangers are.
55:59
It's not important that those who
56:01
ask the questions arrive
56:03
at my answers, or at Marshall McLuhan's.
56:05
This is an instance in which
56:07
asking the questions is sufficient, to
56:09
ask is to break the spell.
56:11
And that just feels like what
56:13
we're arming here is let's arm
56:15
ourselves with the questions to protect
56:17
ourselves from getting further overwhelmed. And
56:19
also let's be honest about the
56:21
nature of what's coming. So questions
56:23
are our most important medium. That's
56:25
from language, and that's the way
56:27
that we start to think about
56:29
things critically and deeply. Well, no
56:31
one's going to listen to a
56:33
three-hour Lincoln-style speech to save us,
56:35
so we just need a kick-ass
56:38
meme that's going to bring us
56:40
all together. It's your job to
56:42
find a tweet for this one
56:44
and create some memes that are
56:46
going to go viral. We'll tweet
56:48
our way through it. No worries.
56:50
Sean and Lance just wanted to
56:52
thank you for coming on Your
56:54
Undivided Attention. That's great. Thank you.
56:56
Thank you. So
56:58
a thought I'd like to leave you
57:00
with. There's a quote from the introduction
57:00
of Amusing Ourselves to Death that has
57:02
always stuck with me, where Postman compares
57:04
two dystopian visions for the future. The
57:06
first presented by George Orwell in 1984,
57:08
of surveillance and Big Brother, and the
57:11
other presented by Aldous Huxley in Brave
57:15
New World. Postman wrote, what Orwell feared
57:17
were those who would ban books, while
57:19
what Huxley feared was that there would be
57:21
no reason to ban a book, for
57:23
there would be no one who wanted
57:25
to read one. Orwell feared those who
57:27
would deprive us of information. Huxley feared
57:29
those who would give us so much
57:31
that we would be reduced to passivity
57:33
and egoism. Orwell feared that the truth
57:35
would be concealed from us, while Huxley
57:37
feared the truth would be drowned in
57:39
a sea of irrelevance. Orwell feared that
57:41
we would become a captive culture, while
57:43
Huxley feared we would become a trivial
57:45
culture. As Huxley remarked, the civil libertarians
57:48
and rationalists who are ever on the
57:50
alert to oppose tyranny failed to take
57:52
into account man's almost infinite appetite for
57:54
distractions. And it was
58:00
Postman's fear that
58:02
it would be Huxley,
58:04
not Orwell, whose
58:06
prediction would come true.
58:13
Your undivided attention is produced by the
58:15
Center for Humane Technology, a non -profit working
58:17
to catalyze a humane future. Our senior
58:19
producer is Julia Scott, Josh Lash is
58:21
our researcher and producer, and our executive
58:23
producer is Sasha Fegan. Mixing on this
58:25
episode by Jeff Sudeiken, original music by
58:28
Ryan and Hayes Holiday. And a special
58:30
thanks to the whole Center for Humane
58:32
Technology team for making this podcast possible.
58:34
You can find show notes, transcripts, and
58:36
much more at humanetech.com. And if you
58:38
like the podcast, we'd be grateful if
58:40
you could rate it on Apple Podcasts
58:42
because it helps other people find the
58:45
show. And if you made it all
58:47
the way here, let me give one
58:49
more thank you to you for giving
58:51
us your undivided attention.