Episode Transcript
0:02
When
0:02
Vivian Schiller signed on as a senior
0:04
executive at Twitter in twenty thirteen,
0:07
she was excited to be joining a company
0:09
that seemed poised to remake the world.
0:11
It was a heady time for the social media
0:13
startup. Just a few years earlier, it had been
0:15
messages on Twitter that connected democracy
0:18
activists throughout the Middle East leading to
0:20
a revolutionary moment known as the Arab
0:22
Spring. But Schiller soon became
0:25
disillusioned and has long since left the
0:27
company. In the years since, Twitter
0:29
was increasingly hijacked by purveyors
0:31
of hate and disinformation, undermining
0:34
democracy instead of spreading it.
0:36
Now billionaire Elon Musk has taken
0:38
over Twitter, fired half its workforce,
0:41
and signaled plans to revise, if
0:43
not roll back the content moderation
0:45
policies that led the company to kick Donald
0:47
Trump off the platform for spreading
0:50
election lies. We'll talk to
0:52
Schiller about what we should make of the Musk takeover
0:54
and what it portends for the future of Twitter,
0:56
social media, and American democracy
0:59
on this episode of Skullduggery.
1:03
I do solemnly swear
1:06
that I will faithfully execute the office
1:08
of president of the United States and
1:10
will to the best of my ability, preserve,
1:13
protect, and defend the
1:15
constitution of the United States. So
1:17
help me god. So help me god.
1:19
So help me god. So help me god. Don't
1:21
help me die. Don't help me die.
1:24
I'm
1:24
Michael Isikoff, chief investigative correspondent for
1:26
Yahoo News. I'm Dan Klaidman, editor
1:28
in chief of Yahoo News.
1:30
And I'm Victoria Bassetti, a senior counselor
1:32
at States United.
1:33
So the other day, right
1:35
after Musk had taken over
1:37
Twitter, I'm walking the
1:40
dog and have my head buried
1:42
in my iPhone reading all about
1:44
what Musk is doing
1:47
to the company and the threats
1:49
to its content moderation policies.
1:53
And some elderly gentleman comes
1:55
up to me and says, get
1:57
your head out of your phone, take
1:59
in the environment, look at the trees,
2:01
and look he was
2:03
right. You know, we all spend too
2:05
much time scrolling
2:08
through Twitter, and this
2:10
is, we should be getting
2:12
our heads off our phone. That said,
2:14
Twitter clearly plays an
2:16
outsized role in our
2:18
political dialogue. So the
2:20
idea that this eccentric billionaire,
2:23
Elon Musk, could run the company
2:26
as he wishes and influence what
2:28
Americans see and read and
2:31
how they get their news is I think
2:33
troubling on its face and,
2:35
you know, put on top of that.
2:37
Musk's start in which
2:39
he, a, retweets some
2:42
crazy conspiracy theory about the
2:44
Paul Pelosi attack, signals
2:47
he plans to have Donald Trump back
2:49
on the platform, although it's, you
2:51
know, he's saying we're
2:53
gonna establish our new
2:55
content
2:55
moderation guidelines first.
2:58
All of that has huge implications
3:01
for all of us. Well
3:03
well, now we have a title for the podcast.
3:06
What's that? Smell the roses
3:08
Isikoff. Yeah. But
3:11
the trees smell the roses. Yeah.
3:13
Yeah. So, you know, we're we're, like,
3:15
four days away from, you know, these very
3:18
consequential midterm elections and
3:20
everyone's obsessed with politics
3:22
right now. The one story that's,
3:24
you know, really breaking through is
3:26
Elon Musk buying Twitter. That's partly
3:29
because people are fascinated by
3:31
Elon Musk and his Mercurial
3:33
personality. It's partly because people are obsessed
3:35
with Twitter and many people, including
3:38
many of our colleagues, and probably some
3:40
of our listeners are addicted to Twitter, But
3:43
it's also relevant to
3:45
the election because of the outsized
3:47
influence that Twitter has,
3:49
because of the dangers of disinformation
3:52
and voter intimidation. And because
3:55
of the fundamental questions that
3:57
Twitter and Twitter's problems and social
3:59
media's problems more generally pose
4:02
to our democracy. So
4:05
it deserves to break through and I'm
4:08
very glad that we're gonna be having
4:10
this conversation today with Vivian Schiller
4:13
who was a a high ranking executive
4:15
there and and is very thoughtful about
4:17
these issues and how important
4:19
they are. I
4:20
think one of to me at least one of the really
4:22
interesting things about this is
4:24
that in the past, Elon Musk
4:26
has operated, to
4:28
a certain degree, in a much
4:30
more constrained space. He's been
4:32
running a space
4:34
exploration company and an
4:36
automobile company. He has
4:38
stepped into an arena that I think he
4:40
doesn't fully understand and
4:42
he is
4:45
beginning to get buffeted by forces
4:47
that I think he never fully
4:49
expected and possibly isn't prepared to
4:51
deal with. Twitter is
4:53
fundamentally, and at the end of the day,
4:56
an organization that is ruled
4:58
by law, and that is ruled
5:00
by perception, and that is ruled
5:02
by its millions of users. And
5:04
Musk to the extent that he thinks he can
5:06
actually control or dictate the
5:08
fate of that product or the fate
5:10
of its economics is
5:12
probably gonna find out he's sadly
5:14
mistaken. He's already
5:16
the subject of a major class action lawsuit
5:18
because of the way he fired all of these people.
5:21
He's potentially the subject of a federal
5:23
investigation regarding who his second
5:25
largest investor is, a
5:27
Saudi Arabian billionaire, just to point
5:30
that out. Yes. He's got advertiser revolts.
5:32
He's
5:32
got advertiser revolts, and
5:35
he's got a user-base revolt. And
5:37
the thing he needs to remember is that
5:39
his users are his actual product.
5:41
In stark contrast to
5:44
rockets and cars, where
5:47
he can control the product
5:49
and make the product. He can't
5:51
do that at all with Twitter. And I
5:53
think he's in for a sharp awakening
5:55
and maybe unpleasant look at his
5:57
bank account in the not too distant
5:59
future.
5:59
Well, that would be interesting. And who's
6:02
gonna do that, take a look at
6:04
his bank account? Are we talking through
6:06
litigation, lawsuits, or, you
6:08
know, some other governmental, I
6:10
mean, intervention, investigation?
6:12
Some of it is just out there. I mean, he's
6:14
taken on thirteen billion
6:16
dollars in debt in this
6:18
acquisition and he he
6:20
has to service that
6:22
debt, you know, he's gonna be paying a billion
6:24
dollars a year, which is
6:27
more than Twitter earns. So,
6:29
you know, just to Victoria's point,
6:31
I mean, just just financially, just
6:33
on that basis alone, he's
6:36
got, he may have bitten off more
6:38
than he can chew. And the point is, yes,
6:40
he is a very successful businessman. He's the wealthiest
6:42
man in the world. but just to emphasize
6:45
Victoria's point, media
6:47
is a very very different business.
6:49
And
6:50
to answer your point, Mike,
6:53
government may be, but this guy's in
6:55
hock up to his eyeballs to
6:57
banks. And those banks those
6:59
banks have covenants and
7:01
they very are very, very
7:03
carefully looking at their investment
7:06
and at their at what
7:08
Musk is doing with it. So he's
7:10
gotta he's gotta hit his covenants.
7:11
And when he does, it'll be
7:13
fascinating to see. I I'm just,
7:15
you know, from a larger perspective,
7:19
historians at the turn of
7:21
the century would talk about the outsized
7:23
influence of the
7:25
press barons, the yellow press,
7:27
Hearst and Joseph Pulitzer.
7:30
Look at today and the extent to
7:32
which we have not evolved
7:34
at all. Rupert Murdoch, Jeff
7:37
Bezos, Mark Zuckerberg, Elon
7:40
Musk, billionaires, all
7:42
of them, having a
7:45
really outsized significant role
7:47
in determining how we get our
7:49
news and what news we read.
7:51
And I think that on its face
7:53
ought to be a troubling issue
7:56
for all of us. But anyway, we've got
7:58
a great guest to
7:59
talk about all this. Vivian
8:02
Schiller, a former Twitter executive herself.
8:05
So let's get to it.
8:13
Alright. We now have with us Vivian
8:16
Schiller. She was once the chief
8:18
of global news at Twitter.
8:20
She's also been the
8:22
president and CEO of NPR, General
8:25
Manager of The New York Times, a
8:27
Chief Digital Officer of NBC News.
8:30
So fair to say she is
8:32
steeped in the news business.
8:34
Vivian, welcome to Skullduggery.
8:36
Thank you.
8:37
Thanks so much. Glad to be back.
8:38
So you were at
8:41
Twitter in its relatively
8:43
early stages two thousand
8:46
thirteen, fourteen, if I remember
8:48
correctly. Yeah. You've long since left,
8:50
but, like all of us,
8:52
you've been watching Elon
8:54
Musk's takeover and the
8:56
turmoil within Twitter now.
8:58
How concerned are you
9:01
with what Musk seems to
9:03
be doing at Twitter.
9:05
Very
9:05
concerned for a
9:07
whole bunch of reasons, starting with
9:09
the fact that there doesn't seem to
9:11
be a plan, a strategy
9:14
that I have heard. It seems to
9:16
keep shifting hour to hour
9:18
so I don't know what's going to happen to Twitter as a
9:20
business. I'm concerned because
9:22
they have fired not all,
9:24
but a huge number of people that
9:26
looked out for the integrity
9:28
of the content on Twitter.
9:30
And I'm concerned because even though
9:32
I left Twitter long ago,
9:35
I'm still a heavy Twitter user, and it's incredibly
9:37
valuable to me. And I feel like it
9:39
has the potential to either
9:42
turn into a trash heap, or go
9:44
away entirely. So,
9:46
yeah, lots of
9:47
problems here. Let
9:48
me ask you a follow-up on just on
9:50
the firings, Vivian, because
9:53
at least as of the time we are
9:55
recording this podcast on Saturday, that's the
9:57
latest big news. And, you
9:59
know, basically, in a
10:01
number of hours, Musk fired
10:03
half the workforce. It
10:05
appears to have been a totally chaotic
10:07
process with people getting fired in the
10:09
middle of the night, people getting
10:11
locked out of their email, but
10:13
not their Which is how they found out. Yeah. Which is
10:15
how they found out. There was a crazy
10:17
story about, you know, there was some product
10:19
meeting where people were it was a video
10:21
conference where people were calling in. one
10:23
person on the call just disappeared because
10:26
at that moment, they'd been locked out of
10:28
the Twitter system. So I'm
10:30
just curious, what does the
10:32
way this happened
10:34
tell you at all about
10:36
Elon Musk and how
10:38
he might run this company? There's
10:40
a professor of management at Harvard
10:42
who said that she called it a master
10:44
class in how not to do this.
10:47
Some might argue that, you know, you you gotta pull
10:49
the band aid off if you had to do
10:51
this, to get Twitter on
10:53
a viable, you know,
10:55
financial path that you might as well do it
10:57
quickly, but I'm curious what your thoughts are
10:59
about how this went down. Well,
11:00
there's that, and then there's just complete and utter
11:03
badness, which is,
11:03
you know, what appears to have happened.
11:05
Look, he owns the joint now. He
11:08
can do whatever he wants. And
11:11
so, I mean, he had sort of telegraphed that he
11:13
was gonna fire a bunch of people. So
11:15
that part is not a surprise,
11:17
but And I suppose, you know, it's
11:19
not a surprise that it was handled so chaotically.
11:21
I'm just concerned that
11:23
based on the kinds of categories of people,
11:25
well, just the sheer numbers. I mean,
11:28
inevitably he's going to lose a
11:30
lot of people who make sure that the
11:32
site can basically stay online,
11:34
is protected from
11:35
outside cyberattacks
11:37
is
11:37
protected from content
11:40
influence campaigns, and
11:42
is protected from, you know,
11:44
the kind of garbage that sadly
11:46
is part of human nature and, you
11:48
know, people would like to, you know,
11:50
inflict on Twitter
11:52
users. So I don't know. Look, the guy,
11:54
he's the you know, he's the richest man
11:56
in the world. Right? And he's built a couple of
11:58
successful businesses, but
11:59
there sure doesn't seem to be much of a
12:02
strategy
12:02
here that I can discern. And just one really quick
12:04
follow-up. It's just — what he's been
12:06
saying is he had to do this, Twitter's losing
12:08
four million dollars a day, and it's
12:10
because of these activists
12:12
-- Oh, come on. You
12:14
know, who who are scaring off the
12:16
advertisers. I mean, that's bullshit.
12:18
Right? He's scaring off the
12:20
advertisers. The activists were trying to
12:22
tell him what he needs to do to protect the
12:24
site. then he threatened to
12:26
name and shame the advertisers
12:29
who are no longer advertising, boy, that's
12:31
a real way to win hearts and minds of the
12:33
people that, you know, keep your lights
12:35
on. Let me step back for a moment and
12:37
contextualize Twitter's importance.
12:40
Only — and only is actually probably
12:42
not the right word — but only about twenty
12:44
five percent of Americans use Twitter,
12:47
and only a sub portion of
12:49
them use Twitter for purposes of
12:51
the news. So all of us on this
12:53
podcast are probably slightly Twitter
12:55
addicted and use it for a lot
12:57
of our jobs. But why
12:59
should the average American,
13:01
most of whom are not on Twitter
13:03
care about what happens to this
13:05
platform? Look, the
13:06
people that are listening to this
13:07
podcast right now, in which we're
13:09
talking about Twitter, is exactly
13:11
why Twitter punches above its
13:13
weight in terms of the reach and
13:15
influence that it has. Journalists
13:17
are all on Twitter. So
13:19
journalists then use Twitter —
13:21
whether they
13:21
should or shouldn't is a whole other topic
13:24
for
13:24
another day — in their reporting.
13:26
And so it is and, you
13:28
know, heads of state use Twitter,
13:31
Titans of corporate industry use Twitter,
13:33
whether they'll continue to is a whole other
13:35
story. So it has tremendous
13:38
influence beyond just the people who happen
13:40
to be
13:40
on Twitter all day like all of us.
13:42
And there's
13:43
nothing like Twitter. There is
13:45
no other platform that
13:48
is a public and real
13:50
time communication platform.
13:52
Now, in the midst of all of the turmoil
13:55
that's been going on on Twitter, there have been
13:57
a lot of people who've been leaving or saying that
13:59
they're leaving, and there
14:01
are a fair number of people who are
14:03
kind of pumping up alternative platforms
14:05
to Twitter. You hear about, I think,
14:07
Mastodon is one of them. I hear that
14:09
Jack Dorsey, who's one of the former CEOs
14:11
of Twitter, is starting up his own kind
14:13
of new new Twitter.
14:15
Dorsey is the one who pushed
14:18
Musk to buy Twitter. Yep.
14:20
So there is no
14:21
alternative. Is that your that
14:23
There is no alternative today. There
14:25
is no alternative today.
14:27
So I should have in the
14:29
introduction mentioned your current
14:31
position. Thank you. I I would never get Yeah.
14:34
You are chief of Aspen
14:37
Digital and executive director of
14:39
Aspen Digital. And
14:41
I gather one of the
14:43
issues you deal with — Aspen Digital being part of the
14:45
Aspen Institute — is
14:47
the whole question of
14:49
the role of social media companies and
14:53
how they have both
14:55
contributed to
14:57
democratic dialogue, at least
14:59
that was the original conception
15:01
of Twitter and others and,
15:03
you know, how they have been degraded
15:06
by hate speech and and
15:08
conspiracy theories. And and, you know,
15:10
look, Musk started off
15:12
on a bad foot on
15:14
the content moderation side
15:16
of the equation. First, he retweets
15:19
some wild conspiracy theory
15:22
about the attack on Paul Pelosi.
15:24
And then, you know, in a
15:26
twelve hour period right
15:28
after his takeover it
15:30
was pointed out that there was like a
15:32
five hundred percent increase in
15:34
the use of the n word,
15:37
something that should disturb everybody.
15:39
On the other hand, there is a
15:42
question, you know, the outstanding
15:44
about content moderation and
15:46
whether there can be any real
15:48
rules that could be
15:50
guide posts for social media
15:52
companies like Twitter and whether there's
15:54
any role for the government at all
15:56
in dictating what information — not
15:58
dictating, but influencing — what information
16:00
we see.
16:02
How do you navigate that
16:04
larger issue of
16:06
content moderation in an age
16:08
of conspiracy theories and hate
16:10
speech. And preserve the
16:12
First Amendment? Yeah. Carefully
16:14
and imperfectly. You know,
16:17
look, I can't think of any platform
16:18
that has gotten content
16:21
moderation entirely right. I don't even know
16:23
that it's possible. But it
16:25
requires diligence
16:26
and vigilance to
16:28
what
16:28
you're seeing happening on your platform. I'm
16:31
saying if you work in content moderation,
16:33
it
16:33
requires judgment and
16:36
flexibility. It cannot be automated,
16:38
and you will
16:40
never get
16:41
it perfectly right, but you can just keep
16:43
trying to move as close as you
16:45
can to good enough. So,
16:47
you know, the thing is on
16:49
on Twitter,
16:50
pre Musk, there were plenty of issues. I mean,
16:53
Musk was right. There's a bot problem.
16:55
There's hate speech on Twitter.
16:57
There were attempts, most of which have been
16:59
thwarted, at sort of coordinated, you
17:01
know — to thwart coordinated,
17:03
you know, misinformation campaigns.
17:06
Twitter, I think, was
17:08
one of the best and the most
17:10
thoughtful on
17:11
these issues, even though there was a
17:13
lot of
17:14
problems. And I will say that the guy who is
17:16
now running trust and safety at Twitter
17:18
because his bosses were
17:21
all fired, his name is
17:22
Yoel Roth, is
17:23
still there and at least
17:26
as of this recording at
17:28
this moment on Saturday morning
17:30
at eleven sixteen AM eastern,
17:32
Musk seems to like him and is basically
17:34
on Twitter telling everybody to listen to
17:36
this guy. And I
17:36
know Yoel well, and he
17:39
is a person of utmost integrity. So
17:41
that is sort of the small comfort
17:43
that I'm hanging on to right now.
17:45
But
17:45
content moderation is is
17:48
really, really, really difficult. The
17:50
role
17:50
of the government? Oh, man.
17:53
No.
17:53
Yeah. Just a
17:55
quick follow-up just on that, which is that, I mean, if you look
17:57
at kind of the legal landscape, you've got
18:00
challenges to Section three
18:02
twenty — is it Section two thirty? Section
18:04
five oh? You've got
18:06
challenges to section two thirty, which immunizes
18:08
the platform. more basic class. Yeah.
18:10
Yeah. Yeah. Which immunizes the
18:13
platform from liability for, you know,
18:15
content that appears on their sites. Right? And then
18:17
you've got you've got challenges to the
18:19
moderation of content in
18:21
states like Texas and
18:23
Florida. So Mike was talking about,
18:25
you know, how you, you know,
18:27
navigate all that. How do you situate Musk on
18:29
the continuum between those
18:31
two things? I don't
18:32
know. I mean, I don't know because he hasn't
18:34
told us. Yeah. I mean, he says that he's
18:36
for free speech, which has
18:39
turned into a completely empty
18:42
term that appears to mean
18:44
whatever the person
18:46
uttering those words wants it
18:48
to mean. So we really
18:50
don't know. So, you
18:52
know, and it's gonna be I
18:54
mean, if he doesn't know already, he's gonna quickly
18:56
learn, there is no such thing as a
18:58
site with no content moderation. It
19:00
certainly won't last long because
19:02
that's where you're gonna get his, you know,
19:04
his hate and bots. You're gonna
19:06
get hate speech. You're
19:06
gonna get spam. You're gonna
19:09
get solicitations for
19:11
all kinds of garbage. You're gonna
19:13
get porn. It's just going to be
19:15
You can't run a business that
19:17
way. So so far, Musk
19:21
has managed to turn
19:23
off most of the advertisers
19:26
on Twitter and severely
19:28
restrict or severely kind
19:30
of — You mean alienated. Alienated, okay.
19:32
I just want to be clear about it.
19:34
And so severely
19:36
impacted the incoming revenue of
19:38
Twitter over the course of at least the
19:40
next year, probably. He has fired so
19:43
many people that he's imperiled the
19:45
product quality. Is
19:48
Twitter on life support right now? How
19:50
long does it actually have? You know, I think
19:52
we'll know a lot next week with
19:54
the midterms. I don't know. I mean, I don't I
19:56
don't wanna just, you know, try to make up
19:58
an answer to that. I really
19:59
don't know. I mean, it hasn't — don't
20:02
forget, most people only got fired
20:04
yesterday. So I've seen those same reports
20:06
about increases in the use of the n
20:08
word, etcetera. But I haven't
20:10
seen you know, we haven't really seen
20:12
any other Musk impact on
20:14
Twitter yet. I will say that I feel
20:16
like the energy on Twitter has gone down —
20:18
this is just my experience. This
20:19
is not, you know,
20:21
a quantitative not
20:24
based on quantitative data, but I feel like
20:26
people are a little quieter on Twitter right
20:27
now. So you do actually believe that he can
20:30
turn it around? No. I have
20:31
no idea. I mean, he has not
20:33
given us any
20:34
any plan. He says
20:35
he wants to try to
20:38
reduce bots — actually, that would be a great thing. Although some bots are
20:40
fine, you know, just starting with
20:42
labeling bots would be a great thing. He says
20:44
he wants to
20:46
find
20:46
ways for Twitter to make more money, which
20:49
all the previous CEOs have
20:51
basically
20:51
failed to do
20:53
and it's possible there is not a way for that
20:55
Twitter just, you know, that there's not
20:57
revenue streams that nobody's thought of.
20:59
Musk thinks he's gonna do it by charging
21:02
people eight dollars a month to be
21:04
verified, which don't forget
21:06
verification is intended
21:08
to signal that the person tweeting is
21:10
who they say they are.
21:13
Some people perceive it as a status symbol, but
21:15
that's not its intention. So by
21:17
selling verification as if
21:19
it's a status symbol, the
21:21
people like us, I'm not
21:23
gonna I'm verified because I'm a journalist
21:26
and I'm not gonna pay eight dollars
21:28
a month, but yet people
21:30
who are
21:30
frauds can pay eight dollars a month,
21:33
so it's gonna become meaningless. It's
21:35
like the star-bellied
21:36
Sneetches. Yeah. Yeah. It
21:37
seems to me it totally defeats
21:40
the purpose of the blue check
21:42
mark. He doesn't buy it. I
21:44
mean, it's like the Good Housekeeping
21:46
seal of approval. You'd run an
21:48
ad in the magazine. You get one or something.
21:50
I am fascinated by
21:53
the evolution of
21:55
Twitter over the years. You
21:57
were there in two thousand thirteen,
22:00
sort of the aftermath of the Arab
22:02
spring, and there was, you know,
22:04
a a great deal of excitement about
22:07
Twitter that you
22:09
know, spread democracy
22:11
throughout the world. And
22:13
then, you know, over time, things
22:16
changed quite dramatically, actually. Give me a
22:18
sense of the arc of
22:20
this, and I assume when you were
22:22
there, you shared, you know,
22:24
much of the enthusiasm about,
22:26
I believe, a positive role Twitter
22:28
could play. Pro democracy role.
22:32
When did it start to change and
22:34
why? Well,
22:35
there's two concurrent sets of issues. Yes. When
22:37
I joined Twitter, it was
22:38
there was a great deal of excitement about
22:41
Twitter and its positive role
22:43
in the world. And I
22:45
shared that excitement, which was why I was
22:47
so excited to go be part
22:49
of that and to grow
22:51
that
22:51
positive influence in the world.
22:53
So
22:53
that's why I joined I joined Twitter. But
22:56
the
22:56
problems at at Twitter, there
22:58
are two different
23:00
issues that have sort of collided and woven
23:03
together
23:03
over the years
23:04
that have been problematic for
23:07
Twitter. One is and you
23:09
know, this has been well documented, including
23:11
by, you know, about the early days,
23:13
Nick Bilton's fantastic book
23:15
Hatching Twitter. Twitter has always
23:17
been plagued by management issues. It
23:19
is a platform that
23:21
the creators created something and
23:23
I think they didn't I
23:25
mean, it was brilliant, but I think they didn't even
23:28
understand the way people would use it. And the
23:30
users of Twitter turned it into
23:32
that force. And it was always sort
23:34
of like the leaders of Twitter kind of
23:36
chase the followers and figure out how
23:38
to keep up. And they did in many
23:40
ways, but it
23:41
has always had
23:43
management difficulties. And
23:46
now, you know, we see this playing out. I
23:48
think as I understand it, it was
23:50
a pretty chaotic place internally when I was
23:52
there, which is one of the reasons that I left.
23:54
It was very hard to get things
23:57
done. I understand from people that stayed, it got a
23:59
little bit better during
24:00
some of the Jack Dorsey years.
24:02
And now we see where we are. Dorsey always
24:04
seemed to be kind of a weird dude.
24:07
Tell us about your own experiences. I don't feel — But
24:09
I didn't work with him. He was not
24:12
CEO when I was there. Anyway, I didn't
24:14
stay.
24:14
I'm not gonna so
24:16
that
24:17
so one trajectory has been sort
24:19
of the internal dynamics
24:22
of Twitter. And, you
24:24
know, legitimate difficulty
24:26
figuring out how to monetize
24:28
it without destroying what Twitter
24:30
is, which I I don't know the answer to that either,
24:32
so I have incredible sympathy with that
24:35
dilemma. The second issue is the
24:37
world. And
24:37
in two thousand
24:40
fourteen, you know,
24:41
before I left, the issues that were percolating
24:43
up, there was a little bit of bots, but there
24:45
was some some abuse and some hate speech,
24:47
and that was
24:48
a huge game of Whac-A-Mole
24:50
as content moderation
24:52
practices were really getting up to
24:54
speed. But
24:55
after that, as, you know, as we now
24:58
know well, this
24:58
sort of, you know, mis- and disinformation industrial complex
25:00
really kicked in. You know, in the years leading
25:02
up to the twenty sixteen elections, and it's
25:04
only gotten worse. And now we
25:07
have you
25:08
know, an incredibly polarized society,
25:11
societies all over the world. What's the cause and
25:13
effect is a story for
25:15
another podcast? So now you've got a
25:17
situation where Twitter where
25:19
it's just hard to be
25:21
Twitter in a world where
25:23
you're trying to keep
25:25
it as
25:26
a healthy platform that does good in
25:28
the world, given all of the chaos of
25:30
the last
25:31
eight years. So
25:32
those two things together have, I
25:35
think, brought us to where we are today. Vivian,
25:37
you
25:37
were talking about Twitter in the world. A
25:39
little bit before in the conversation, you talked about
25:41
Twitter and free speech, and what does
25:44
Elon Musk really mean by free
25:46
speech? I'm gonna connect these two things
25:48
because the majority of
25:50
Twitter users today are actually
25:52
overseas in international
25:55
markets. And it seems to me that Elon
25:57
Musk has a very kind of
25:59
sort
25:59
of
25:59
almost US-centric view of speech issues, very
26:02
much rooted in our
26:04
politics today. So
26:06
it's about cancel culture.
26:08
and the like. And the idea that people
26:10
on the right are being are
26:12
being silenced. But overseas
26:15
speech is threatened on a
26:17
daily basis by
26:19
dictators and, you know, illiberal regimes.
26:22
And I saw recently that that I think
26:24
that the Turkish courts ordered
26:26
that tweets be taken down
26:29
because they were critical of of
26:31
the leader of Turkey, Recep
26:33
Tayyip Erdogan. I
26:35
wonder how you think Elon
26:38
Musk will handle the
26:40
issue of dissidents. And I know this is something that Isikoff,
26:42
he's gonna ask a question on
26:44
this. But any sense of
26:47
of how this all will play globally?
26:49
Well,
26:49
if Musk doesn't realize it yet,
26:52
he's certainly gonna learn very quickly
26:54
that he's gonna have to comply with
26:56
the
26:56
laws of the countries that he's operating
26:59
in. And every country has its
27:01
limits on
27:03
speech.
27:03
So some of it, you know,
27:05
in Germany, there's a lot of restrictions
27:07
having to do, understandably,
27:10
with speech in support of, you
27:12
know,
27:12
Nazis. And then you have, with,
27:14
you know, with good intentions, obviously. But
27:16
then you have,
27:17
like you said, the autocrats who
27:19
are trying to control the kind
27:21
of information about themselves. And
27:24
so,
27:24
you know, what's going to happen
27:26
with Twitter is I mean, and this is the way every platform
27:28
operates is they try to do
27:30
the best they can. And in the end, they have
27:32
to make a choice. Are they gonna operate in
27:35
the country complying with
27:37
laws or rules they
27:37
disagree with or are they gonna pull out?
27:39
And so, I guess, we will
27:42
see how his free
27:44
speech-all-the-time doctrine,
27:46
how that works out when he's up against
27:49
laws and rigid,
27:51
illiberal autocrats. I've
27:52
got a few questions along those
27:54
lines because it relates to, I think, one
27:56
of the more alarming aspects
27:59
of the new regime.
28:02
and that is the role of
28:04
the Saudis. Now this is something that
28:06
you had some direct experience
28:08
with, as I recall. A
28:10
few years ago, the FBI busted a Saudi
28:14
espionage plot within Twitter
28:16
to steal personal data
28:18
from the Twitter accounts of
28:21
Saudi dissidents and provide
28:23
them to Saudi intelligence services
28:25
for the purpose of
28:27
harassing or worse to those
28:30
dissidents. And the guy, I
28:32
believe you worked with one
28:34
of those guys who was
28:36
a Saudi spy, Ahmad
28:39
Abouammo, or -- Yeah. -- was the guy.
28:41
He was recently convicted in federal court
28:43
-- Yeah. -- of spying for
28:46
the Saudis in exchange for hundreds
28:48
of thousands of dollars from
28:50
MBS's personal
28:53
secretary.
28:53
And as we reported a year
28:55
ago, MBS himself actually
28:58
boasted about how he did
29:00
that. We had our guy at
29:02
Twitter. So the FBI busted the plot. They convict
29:04
Abouammo, a guy you worked with when
29:06
you were at Twitter. And now we
29:08
learned
29:11
that the second largest shareholder
29:15
in Twitter is
29:17
Prince Alwaleed of
29:19
Saudi Arabia, a billionaire investor, a
29:21
guy who was
29:24
imprisoned by MBS at the Ritz
29:26
Carlton a couple years ago,
29:28
emerges you know, months
29:30
later, gaunt, chastened,
29:33
and does MBS's bidding
29:36
sells shares in his company to
29:38
this Saudi sovereign wealth
29:40
fund. It seems like the
29:42
Saudis got what they wanted,
29:44
you know, from the get go
29:46
now with a
29:48
a substantial corporate
29:50
interest in Twitter itself.
29:52
Yeah.
29:53
It's pretty troubling. You know, Ahmed, I
29:55
did work with him. Lovely guy enjoyed
29:57
working with him. Needless to
29:59
say, I would not in a million years
30:02
have imagined that he was using
30:04
the
30:04
internal systems, the open internal
30:07
systems to supply the
30:09
Saudis with the registration
30:12
information of, you know,
30:14
personal data from Saudi dissidents. It's
30:16
horrifying. The systems have been
30:18
more locked down since then — it was pretty
30:20
much the Wild West then.
30:22
Yeah. So now here we are
30:24
again. And you raise a good
30:26
question, which is, what are they
30:28
getting for that investment other than potentially return on that investment
30:30
that I'm sure Elon is hoping to deliver?
30:33
I would like
30:33
to believe maybe
30:36
naively so that they will not
30:38
have access.
30:39
But if I were somebody who
30:42
didn't want
30:42
the Saudis to have personal
30:45
information about me, I might — I might
30:47
take some actions right now. Well,
30:48
what could a government like
30:51
Saudi Arabia or a
30:53
Saudi actor who owns a substantial
30:56
stake in Twitter actually do with it that would
30:58
be damaging. I mean, there there are plenty of
31:00
ways of structuring a corporate
31:02
ownership transaction that wouldn't
31:04
leave that owner with, you know, kind of any
31:06
access to put together
31:08
valuable information. And I
31:10
would have to imagine that that was the
31:12
case. I
31:14
would imagine that there's not
31:16
a clause in their agreement that they can, you
31:19
know, see the
31:19
personal data of
31:21
Twitter users. So That's
31:24
exactly what the Saudis were doing in their
31:26
No. Well, it was. Wonderful. But
31:28
they had to recruit these inside
31:30
individuals. So, yeah, I I mean, I
31:32
think it's gonna
31:33
be a while before if, you
31:35
know, for dissidents or others who
31:37
are operating anonymously, I
31:39
would probably caution them
31:42
about
31:42
their continued use of
31:45
Twitter and take a look at the
31:47
kind of information they provided,
31:49
you know, cell phone numbers, etcetera, when
31:51
they logged in and maybe quit the
31:53
platform,
31:53
honestly. Do you think there
31:55
ought to be a US government review of this?
31:57
I mean, there there's increasing call
31:59
amongst some senators and
32:02
for Hill investigations into
32:04
this or for CFIUS,
32:06
which is the main kind of
32:08
executive branch agency that reviews foreign
32:10
transactions like this. Is is that something you
32:11
think ought to get going? I don't --
32:14
yeah. I'm a little out of my lane. So
32:16
I'm I'm sure they're they're taking a
32:18
look. Senator Chris Murphy, who we
32:20
had on the podcast a couple weeks ago talking
32:22
about the Saudis, has called for
32:25
a CFIUS review of --
32:27
Yeah. -- the Saudi role in
32:29
Twitter. But I just wanna take you back to
32:31
when we were talking about content moderation
32:34
before. And I, you know, you
32:36
said it's this is
32:38
a a tricky issue to say
32:40
the least, but you said there's
32:42
no role for the
32:44
government at all in this.
32:46
And I don't know if you've noticed, but, you
32:48
know, the the intercept published
32:50
a story this week in which
32:52
they reported on communications
32:54
between Department of Homeland Security
32:57
and various social media
32:59
platforms about various items
33:02
that were showing up on Facebook and
33:04
other social media platforms. And they got
33:06
that story wrong. I would really encourage all your
33:08
listeners to read Mike
33:10
Masnick's debunking of that
33:12
Intercept piece. I
33:14
did, and he made some some
33:16
good points. On the other hand,
33:18
it was clear to me that there
33:20
was at least discussions
33:22
among people in the government
33:25
and the social media
33:27
platforms. And I mean, it is
33:29
that there were people at
33:31
DHS and various arms
33:33
were alerting social media companies
33:35
to items that they found
33:37
troubling or problematic. Isn't
33:39
that a role that the government was playing
33:41
and is that appropriate? It's
33:43
making
33:43
recommendations to
33:45
the companies about what they should
33:47
moderate or, in effect, censor.
33:49
Right.
33:50
So CISA, which is part of DHS,
33:52
which is the Cybersecurity
33:53
and Infrastructure Security Agency,
33:56
which was created in
33:58
twenty eighteen by President
34:00
Trump, monitors
34:02
global cybersecurity
34:04
sort of trends, I guess, around
34:06
the world. And when
34:07
they see concerning
34:09
activity, they alert
34:12
companies to risks and
34:15
threats. And that
34:16
could be
34:17
could be across the whole cybersecurity portfolio,
34:20
including coordinated
34:21
campaigns to spread, you know,
34:23
conspiracy theory or propaganda from overseas,
34:26
etcetera, etcetera. They're not
34:28
making content — I mean, they're not
34:30
making content moderation.
34:31
They're not telling
34:33
platforms what content
34:35
to leave up and what to take down,
34:37
they're flagging activities
34:38
that they see as
34:40
a heads up to
34:41
those platforms. But Vivian, let me
34:44
just read you a section from an
34:46
intercept article, and I understand, you know,
34:48
there are legitimate questions about
34:50
aspects of it, but one part of it
34:52
leapt out at me. According to a draft copy
34:54
of DHS's Quadrennial Homeland
34:58
Security Review, their report, it said
35:00
the department plans to target
35:03
quote inaccurate information On a
35:05
wide range of topics including, quote, the
35:08
origins of the COVID-nineteen pandemic and
35:10
the efficacy of COVID-nineteen
35:12
vaccines, racial justice, US withdrawal
35:14
from Afghanistan and the nature of
35:16
US support to Ukraine. How
35:18
is anything about
35:21
US withdrawal from Afghanistan, or the
35:23
nature of US support to Ukraine within the purview
35:25
of DHS to be telling social
35:27
media companies what they should or
35:29
should not publish.
35:31
Well, I don't
35:32
think they are. I don't know. Look, I let
35:34
me just tell you, I'm not inside. I can't
35:36
validate that information. I can
35:38
neither debunk nor validate it.
35:40
But again, I would encourage you
35:42
to read Masnick's interpretation of the underlying documents
35:45
that the
35:45
Intercept was using, but
35:47
it is
35:48
about
35:50
activity that they're seeing around coordinated,
35:52
misinformation campaigns.
35:54
And I am now going
35:58
to the world
35:59
of pure speculation. I would imagine some of those
35:59
are around issues having to do with elections,
36:02
America's withdrawal from Afghanistan,
36:04
etcetera, etcetera.
36:06
No platform is gonna
36:08
take its direction from the US
36:10
government. It's just not gonna
36:12
happen, and it's not the way that
36:14
they work. So I, you know, that's all
36:16
I can say. It just doesn't doesn't
36:18
make sense to me nor should it be the
36:20
role of the US government,
36:22
obviously, to opine
36:24
or try to not I mean, opine about
36:26
the specifics of the content. Only
36:28
the origins
36:28
of the behavior. That's why for
36:31
instance at Facebook, they
36:31
call it CIB,
36:34
coordinated inauthentic behavior.
36:36
There's
36:36
a group that looks
36:37
at the behavior of
36:40
online groups which is separate from looking at the validity of the
36:42
content. Those are two separate things.
36:44
Vivian, I just want to bring the conversation
36:45
down from policy
36:48
level to the product level for a second here because I
36:50
think there are people, curious Twitter users, who
36:52
are curious about what the Twitter product
36:55
is gonna look like as as
36:57
time goes on under Musk. So some of the ideas
36:59
that they talked about are, you know,
37:01
direct messaging to
37:04
VIPs and celebrities,
37:07
videos behind paywalls,
37:09
which I get the feeling
37:11
might mostly be porn. and then,
37:13
you know, bringing back Vine, these, you
37:16
know, short videos on a loop
37:18
for young people. And then
37:20
there's the idea
37:22
of the sort of the everything app, the equivalent of
37:24
China's WeChat,
37:26
where you could, you know, get your Uber
37:28
as you could order food, order on Amazon,
37:31
you know, whatever it is in this on this
37:33
one app. What's your sense of where things are going
37:35
in terms of the product? Look,
37:37
it's
37:37
it this is this
37:39
is the
37:40
the first batter's walking up to the plate at
37:42
the top of the first inning here. It's impossible
37:44
to know. I've heard all that all of
37:46
that speculation. I didn't hear
37:48
the one about DMing celebrities
37:50
and paying for that service. I'm sure that
37:52
celebrities would be super excited
37:54
about that. Look, you know, actually,
37:56
I mean, here, I'm not going to
37:58
Twitter doesn't make you know, Twitter's got
38:00
a revenue problem. And so the
38:03
way to figure out what revenue
38:06
opportunities are is to experiment.
38:08
So, you know, I don't have any particular
38:10
criticism for trying
38:12
different things. I suppose, you know, my, ah, personal desire is
38:14
that they don't mess with the core Twitter
38:16
product other than making content
38:18
moderation
38:18
improvements. but
38:20
it's, you know, he bought the toy so he gets to do
38:22
whatever he wants with it now. And he's
38:24
right that Twitter has
38:25
not cracked
38:27
the code, to this point, on
38:29
trying to figure out a way to have, you know, the kind of revenue growth that
38:31
certainly the Street demanded. Although the Street
38:33
now is not relevant in
38:36
the picture. But
38:37
is it realistic that he can, like, or is
38:39
it that he could, you know, significantly
38:41
reduce his dependency on
38:44
advertising as the main
38:46
driver of — And if he can, isn't that a problem in terms
38:48
of, like — Yeah. Yeah. Isn't
38:50
the pressure from advertisers and
38:53
it used to be from stockholders until he took it
38:56
private. Isn't that isn't that a check
38:58
on his
38:59
his power? I
39:02
don't know
39:02
about that. I think
39:05
advertisers many advertisers
39:07
will flock towards solutions
39:08
that support whatever it
39:10
is they're trying to market. And by the way,
39:12
there was a piece in the New York Times op
39:14
ed page today, which made the point
39:17
that advertisers are a
39:19
significant part of the problem in the sense
39:21
that, you know, they prioritize, you
39:23
know, engagement and engagement
39:25
is often about emotion and
39:27
anger, and that's you know, kind of it. And
39:29
and that was what drives the algorithms and that's part of
39:32
the root of the problem here
39:34
anyway. Yeah. Also, don't
39:36
forget that generally speaking when we're talking about
39:38
advertising on social media.
39:39
It's not just what people
39:41
automatically think of as, you
39:43
know, soap commercials. Advertising is
39:46
basically an advertiser is anybody who
39:48
pays to amplify and
39:50
spread a message to
39:52
specific targeted
39:54
audiences. So that can be a shampoo commercial, but it can also
39:56
be an individual who has
39:58
an unreliable piece of
39:59
content they wanna get in front of certain
40:02
eyeballs. So first of all, we need
40:04
to — I mean, just we need to
40:06
not be careful, but be precise about what
40:08
we mean by advertisers. I don't
40:10
really judge Musk
40:12
for experimenting with various
40:14
new forms of revenue.
40:16
Like I said,
40:17
I would like
40:18
it not to be at the expense
40:21
of the main timeline, you
40:23
know, the main news feed on Twitter,
40:25
and hopefully it won't
40:27
turn — you know, that won't turn that
40:29
into a garbage site. But look,
40:32
he's got to experiment. I get that. I don't
40:34
judge him
40:35
there. Just to wrap up, I'd like
40:37
to sort of take a step back.
40:39
And, you know, we've all been talking
40:42
about various threats
40:44
to our democracy with the coming
40:46
election. What does it say about the
40:48
state of our democracy and
40:50
political culture
40:52
that an eccentric
40:55
billionaire can control
40:57
what a substantial portion
41:00
of the, you know,
41:02
American public sees, and control the
41:04
news that it sees?
41:06
Yes. Yeah. It's it's a
41:08
huge problem. I'm trying
41:10
to think of a word bigger than that. It's a monumental
41:12
issue and it's this is not just about Twitter.
41:15
We've looked at the — I
41:17
mean, much more substantial is the
41:20
control, effectively, even though it's a
41:22
publicly traded company, by a single
41:24
individual, which is Facebook, now
41:26
Meta, you know, it is
41:28
well documented the impact,
41:30
the negative impact that Facebook has
41:32
had, you know, being a spark
41:34
for, you know, for genocide, for
41:36
conspiracy theories, for the polarization of America, you
41:38
know, Twitter at least is generally speaking
41:41
not algorithmically filtered in that
41:43
same way, and it's and
41:46
it's public. So, yeah, this is a big, you
41:48
know, this is why we at Aspen
41:50
and so many organizations and governments,
41:52
frankly, too, around the world, are
41:55
trying to figure out what are the
41:57
ways that we can, you know, put
41:59
the
41:59
I I was gonna say put
42:02
the genie back in the bottle, but — there's no such
42:04
thing as the good old days. You know, people go, oh,
42:06
if only we were back to the days of Walter
42:08
Cronkite, that
42:08
was a problematic era because of all the
42:10
people that were left out of that narrative.
42:12
So let's not do that. But to find a way
42:14
to ensure that
42:16
quality information is available to
42:20
people
42:20
at the local, regional, national,
42:23
and global level.
42:24
And, you know, we're not gonna — we're
42:26
gonna die trying. There's no
42:28
simple answers, but there's a lot of small
42:30
solutions and things that can help.
42:32
It's a big problem. Well, good
42:34
that you are trying
42:36
because somebody's got to
42:38
in any case. Vivian, I
42:40
wanna thank you for a really interesting discussion.
42:43
And, you know, we
42:45
will see what becomes of Twitter
42:47
and perhaps, you know, the future of
42:50
American democracy in in in coming
42:52
weeks. Thanks
42:54
a lot. Yeah. Thank you. Enjoyed being
42:56
with you.