Episode Transcript
0:00
different but you were sort of like
0:02
you've kind of been turned into some
0:04
sort of censorship boogeywoman. I
0:06
never heard boogie, boogeyman turned into
0:08
boogeywoman before, but it makes it
0:10
sound like someone pretty cool actually.
0:43
Welcome back to Posting
0:45
Through It. I'm Jared Holt.
0:48
And I'm Mike Hayden. Thanks
0:50
for joining us another week.
0:52
I really enjoyed our guest
0:54
today, and I learned a
0:56
lot about social media. You
0:58
know, one person who is on
1:00
social media is a fellow named
1:03
Tim Pool, and I just
1:05
wanted to just let our
1:07
listeners know that we are hard
1:09
at work on a who is
1:12
Tim Pool podcast episode, which I
1:14
believe will be out next week
1:16
with a very special guest who
1:19
happens to be a New York
1:21
Mets fan, like myself. And so
1:23
everybody is going to enjoy eating
1:26
their vegetables with today's podcast
1:28
and learn a lot. And then
1:30
we are going to consume a
1:32
large stuffed crust pizza next week
1:35
on Tim Pool. Yeah, we are hard
1:37
at work, you know, down in
1:39
the shafts, if you can conjure
1:41
the image in your mind, listeners
1:44
close your eyes unless you're driving,
1:46
but imagine, you know, the
1:48
mechanic crawling out of the pit
1:50
when you get your oil changed,
1:52
just covered in grease and oil.
1:54
That is us, hard at work,
1:56
preparing the Tim Pool episode. Imagine,
1:58
if you will, a man
2:00
who wears his winter cap in
2:02
the middle of August in the
2:04
hot West Virginia sun. A
2:06
man who I once watched
2:08
a video of this like
2:11
weird alt-right D-list influencer
2:13
pull his hat off his
2:15
head and I legitimately thought
2:17
Tim was gonna like punch him
2:19
in the face, I thought. I
2:21
believe that's Millennial Matt. Yeah. You
2:24
will be learning about all that
2:26
and more. next week, but
2:28
for now a very, very
2:30
in-depth discussion about social media
2:32
and how it impacts the
2:35
brain and our culture. Joining
2:59
Posting Through It now, we have
3:01
Renée DiResta. She's an associate research
3:04
professor at the McCourt School of
3:06
Public Policy at Georgetown. Before that,
3:09
she was the technical research manager at the
3:11
Stanford Internet Observatory. She is the
3:13
author of a book I've really
3:16
been enjoying over the last couple
3:18
weeks called Invisible Rulers:
3:20
The People Who Turn Lies into Reality,
3:22
and I'm really excited to have her
3:24
on the show this week. Renee, how's
3:27
it going? It's good. Thanks for
3:29
having me on. First off, how
3:31
are you holding up? I mean, the
3:33
world is kind of, you know, 10
3:35
kinds of fucked up and crazy right
3:37
now. How are you doing? It is so
3:39
busy. I mean, between my job,
3:42
which keeps me busy, and then
3:44
now there's a whole pile of
3:46
congressional hearings that are starting again,
3:49
that are keeping me busy, and
3:51
then I've got three kids, which
3:53
also keeps me busy, and travel,
3:55
which is busy. I feel like I'm
3:58
getting things done, which I
4:00
think helps me cope with the
4:02
chaos. Yeah, you're like me. It's
4:04
like when the world is burning,
4:06
there is almost a weird sense
4:08
of maybe it's just because I'm
4:10
distracted, but like a mental peace.
4:13
Really? Agency. Like you can get
4:15
something done. Yeah. Which feels wrong
4:17
and backwards, but Mike, do you
4:19
feel that at all? Or I
4:21
don't know. I don't know. Like I've
4:24
been doing nothing but my book edits.
4:26
And so I kind of just feel
4:28
nothing. The book doesn't come out until
4:30
2026, so there's a feeling of just
4:33
being kind of mired in something that
4:35
no one can see yet, right? So
4:37
I don't know, I'm not feeling great
4:39
about it, but next month. That was
4:41
a hard process, like just how long
4:44
it takes between when you feel like
4:46
you're done and then turning in the
4:48
edits and then waiting like almost a
4:50
year until everybody else sees it.
4:53
I remember being surprised by just how
4:55
long that gap was. The gap is
4:57
over. Like I said, I've
4:59
really been enjoying it. It's
5:01
connected a lot of sort of
5:03
loose threads in my head.
5:06
But the synopsis of
5:08
Invisible Rulers, it's just
5:10
this excruciatingly in-depth,
5:12
but somehow still
5:14
digestible, look at
5:16
the way that social media
5:18
is becoming sort of the
5:21
new invisible ruler of society.
5:23
And I think... Borrowing
5:25
the idea of invisible rulers
5:27
is a really interesting way
5:30
to approach it and maybe we should
5:32
start there like the idea of the
5:34
invisible ruler, the ghost in the
5:36
machine, whatever you want to call
5:38
it. Where does that come from?
5:40
And why did you pick that
5:42
to try to explain how social
5:44
media works? Yeah, it comes
5:46
from Edward Bernays. It comes from
5:49
the book Propaganda from the late
5:51
1920s, 1928 or so. I remember
5:53
when I was, so my
5:56
background is in computer science
5:58
and political science. You know,
6:00
I rather infamously, the
6:02
way it came out was sort of infamous,
6:04
but I worked for the CIA when I was
6:06
an undergrad, you know, and so my
6:08
very first
6:11
job was in intelligence, but not in
6:13
propaganda at all and in kind of technical
6:15
work but I'd always been interested in this
6:17
question of you know how people came to
6:19
hold opinions and I got very interested in
6:21
specifically how people came to form opinions about
6:23
vaccines when I was a new mom being
6:25
suddenly deluged with the kind of stuff that
6:28
Facebook throws at you when you're a new
6:30
parent, which is, you know, all of a
6:32
sudden your entire feed changes, your entire experience
6:34
on social media changes the minute that you
6:36
have a kid. And you don't realize it
6:39
until you're there. The minute you post your
6:41
first baby picture, boom, like everything that is
6:43
recommended to you is different. And so I
6:45
got very interested in that question of
6:47
why, right? Why are we shown certain
6:50
things? How do we form opinions? How
6:52
do we form new friend groups? Because
6:54
I really did, right, around that time.
6:56
How do we, how does our identity
6:58
change? How does what is appealing to
7:00
us in our kind of new identity?
7:02
How does that shift happen? Around that
7:04
time that I was paying attention
7:06
to all of a sudden being
7:09
tossed into these, you know, getting
7:11
this constant recommended stream of anti-vaccine content.
7:13
As that was happening to me, the
7:16
conversation about social media was
7:18
also very much focused on ISIS.
7:20
These two things were happening concurrently.
7:22
So as I was studying anti-vaccine
7:24
networks on social media, other people
7:26
were studying ISIS networks on social
7:28
media, and I was really struck
7:30
by the similarities between these two
7:32
completely different groups, but just how
7:34
like the structure of social media
7:36
had fundamentally transformed how they could
7:38
communicate, who they could reach, the
7:40
ways in which networks were forming,
7:42
the ways in which messages were
7:44
being pushed out to people. Unexpected
7:46
things, like if you followed one, you
7:49
know, the recommender system would push you
7:51
more. Just the complete lack of awareness
7:53
from the recommender system of what it
7:55
was suggesting to you, just the absolute
7:57
kind of amorality of the entire thing.
8:00
And I started kind of thinking,
8:02
all right, what is new here?
8:04
And it seemed like the technology
8:06
was very, very new. So I
8:08
was focusing a lot on that.
8:10
But I thought, OK, we have
8:12
a tendency to overreact to new
8:14
technologies as being, OK, this time
8:16
is so different. This time is
8:18
so different. So I got very
8:20
interested in reading about propaganda literature
8:22
from the olden days, literally from
8:24
a century back. And that was
8:26
what took me to Bernays and
8:28
Lippmann and Jacques Ellul and these
8:31
sort of foundational writers that I
8:33
had never read in college actually,
8:35
because I had not really gone
8:37
through that training. And I remembered
8:39
reading the book Propaganda and getting
8:41
to the passage, you know, there
8:43
are invisible rulers who control the
8:45
destinies of millions. And he's talking
8:47
in this passage about the role
8:49
of appealing to people in this
8:51
capacity as members of a group
8:53
identity. And it just kind of
8:55
like really clicked in my head.
8:57
It really stuck with me as
8:59
this, it resonated with what I
9:02
felt like I was seeing, this
9:04
idea that all of a sudden
9:06
we were engaging so foundationally as
9:08
members of groups and thinking, oh,
9:10
this is so interesting. Here is
9:12
somebody a century ago talking about
9:14
the power of being unaware of
9:16
where a message is coming from,
9:18
but feeling it resonates so deeply
9:20
with who we are as people
9:22
and that desire to belong to
9:24
a group that I just got
9:26
very interested in that particular
9:28
body of literature and I really
9:30
started kind of going down this
9:33
rabbit hole. This was maybe eight
9:35
years ago now into trying to
9:37
understand, you know, what was new
9:39
and different, which was the technology,
9:41
but what was foundationally the same,
9:43
which was the psychology of belonging
9:45
and the way that propaganda worked,
9:47
the way that it functioned, and
9:49
why we wanted to pay attention
9:51
to these kinds of messages. So
9:53
the concept of the invisible ruler
9:55
that Bernese lays out and that
9:57
you explain in the book, is,
9:59
you know, there are visible
10:01
rulers. These are our
10:04
politicians, prominent members of
10:06
society, you know, governments,
10:08
major media outlets, and then there
10:10
are invisible rulers, which are the
10:12
kind of force, you know, it
10:14
could be like a force, like
10:17
advertisers. If advertisers pull your funding
10:19
on your cable network after a
10:21
segment, well, they're a kind of
10:24
an invisible ruler, sort of, you
10:26
know. Or the press person
10:28
that gets all
10:31
the media to show up at
10:33
a conference or a ribbon cutting
10:35
like that. That is a type
10:37
of invisible ruler. The person leaking
10:39
information to an outlet is an
10:41
invisible ruler and I just thought
10:44
that was a really interesting
10:46
way to think about today you
10:48
know looking at the Trump
10:50
administration but really just the
10:52
last I mean like decade or
10:54
two of politics as social media
10:57
has become more important,
10:59
more integrated into
11:01
political strategy, it makes
11:04
me wonder like, is the
11:06
Libs of TikTok account
11:08
then like a new kind
11:10
of invisible ruler, you know,
11:12
if outrage generated on
11:14
that account can shake up
11:17
the deliberations of the press
11:19
shop at the White House? What say
11:21
you? Well, that's the argument that
11:23
I make is that the
11:25
influencers are profoundly important, right?
11:27
That content creators and influencers
11:29
are the most effective propaganda
11:32
that we see today in a lot
11:34
of ways because people don't think
11:36
about them in that context. They
11:38
still think about political propaganda in
11:40
particular as being something that comes from
11:42
the top down the sort of old
11:45
Chomsky model of, you know, the mass media
11:47
being in cahoots with the government and,
11:49
you know, taking its cues from advertisers
11:51
and taking its cues, what it can
11:54
and can't say from a, you know,
11:56
from some government leader. What's happening
11:58
now, I would argue, is that the
12:00
information is increasingly, you know, public
12:02
opinion is increasingly shaped from the
12:05
bottom up. And you have these
12:07
accounts that are very, very good
12:09
at understanding how to tap into
12:11
psychology, how to tap into group
12:13
identity, how to tap into a
12:15
desire to belong, to participate, to
12:17
be a part of a movement.
12:19
That's what Libs of TikTok is
12:21
actually good at, right? They take
12:23
something, they turn it into a,
12:26
you know, we
12:28
use the term main character, but
12:30
they take sort of an avatar
12:32
of an extreme opinion held
12:34
by an alternative faction, right? Some
12:36
enemy, some Lib, they make that
12:38
an object of ridicule and that
12:40
act of participating in the ridicule
12:42
really binds these people together. Like
12:44
that's what the other faction is
12:46
getting out of it. It's really
12:49
solidifying that in group cohesion by
12:51
hating these other people, that sort
12:53
of outgroup animosity. That's what really
12:55
kind of brings this, this group
12:57
of people together. And what you
12:59
see is the realization that that
13:01
can then be used as not
13:03
just an online main character moment
13:05
where we're all getting together to,
13:07
you know, to hate this person
13:10
or yell at this person or
13:12
participate in this online trend. It
13:14
actually then is used, it's, it's,
13:16
it's kind of like transmuted into
13:18
a bigger story. Do you remember
13:20
the, the eating the pets thing
13:22
that happened during... Oh, of course.
13:24
Yeah, of course. No one will
13:26
ever forget eating the pets. Of
13:28
course. You're with two people who
13:30
might as well have been eating
13:33
the pets. But
13:35
that, that's another example
13:37
of this, right? You have
13:39
an online rumor, you
13:42
have something that galvanizes participation.
13:44
People are having fun
13:46
with it, right? They're making
13:48
these memes or using
13:50
Grok. They're coming together as
13:52
a community. It's again,
13:54
it's bringing that, that cohesion,
13:57
that movement building. But
13:59
then you see JD Vance
14:01
pick it up, right?
14:03
And all of a sudden
14:05
it's a bigger story.
14:07
Now it's not just about
14:09
this one stupid meme
14:11
or this one weird accusation or
14:14
this one person who regrettably is photographed, you know,
14:16
holding a goose or whatever, that poor guy
14:18
who winds up in the photo, the actual human
14:20
being, who becomes kind of like the
14:22
target of this online mob, J.D. Vance
14:24
then turns it into like, well, this is really
14:26
a story about who should be in America, who
14:28
should be in that community, who that community is
14:31
for. And that's where you see that handoff, right?
14:33
from rumor and community and, you know, socialization and mob
14:35
dynamics to the propaganda campaign around the bigger story
14:37
which is, who is America for? What is the
14:39
deep story about immigration that we're telling here? And
14:41
that's why I think this handoff that happens there
14:43
it increasingly comes from the bottom and is picked
14:45
up by the political elite as opposed to moving
14:48
from the top down where some machine picks
14:50
it up and sells it. Why does it feel
14:52
like reactionaries in
14:54
general or people who
14:56
may even have fascist
14:58
ambitions are able to take
15:01
advantage of this technology
15:03
so much better than
15:05
the left, liberals, whomever, right?
15:07
Even centrists. I've kind
15:09
of wondered about that too. I
15:11
always believed that the
15:14
algorithm sort of kind of
15:16
pulls people to the right, that
15:18
it benefits moneyed interests more than
15:20
anything else and maybe left right
15:22
isn't the right way to even
15:25
look at this I don't know
15:27
but it just seems like those
15:29
things that would
15:31
float to the top of a left
15:33
liberal agenda don't get the same
15:35
kind of juice as something like
15:37
the "they're eating the cats,
15:39
they're eating the dogs" thing. There've
15:42
definitely been you know you
15:44
can definitely point to mob
15:46
dynamics where we've had the you
15:48
know, the weird main character of
15:50
the day situations that are not
15:52
explicitly right-coded, right? You remember
15:54
the, I love drinking
15:56
coffee in the morning with my husband
15:59
and the weird... Oh yeah, the insane
16:01
mob that's brought up on that one.
16:03
The replies would be like, this is
16:05
insensitive to people who can't drink coffee
16:07
in the morning with their husbands. Don't
16:10
have gardens, don't have gardens. There have
16:12
been, or Covington Catholic was another one
16:14
that came to mind that I wrote
16:16
about in the book. The example where
16:18
there was like the kid in the
16:21
MAGA hat and you know that sort
16:23
of situation where people really prejudged that
16:25
situation in very unfortunate ways before the
16:27
actual facts emerged. So you definitely do
16:30
see it. What is curated? That's been
16:32
one of these things where there have
16:34
been some papers. There was one where
16:36
researchers
16:38
at Twitter were authors on the paper,
16:41
I'm blanking on the name right now,
16:43
but that did make this argument that
16:45
actually right leaning content does tend to
16:47
perform better. It was sort of put
16:49
out almost in opposition to the argument
16:52
that right leaning content is censored or
16:54
downranked or deprecated on Twitter. But as
16:56
far as the, you know, one of
16:58
the things that happens I think is
17:01
also just this investment, right? Who is
17:03
really making these, you know, that handoff
17:05
piece, that question of like, when are
17:07
the political elites really grabbing this stuff
17:09
and pulling it up? We saw Donald
17:12
Trump doing this in 2016. I mean,
17:14
you guys remember like MAGA3X, right?
17:16
Or when Donald Trump was running this
17:18
very insurgent style campaign, Hillary Clinton wasn't
17:20
doing that. She wasn't on there
17:23
saying like, what are my meme lords
17:25
making and how can I boost it?
17:27
No, it was, you know, it was
17:29
Pokémon Go to the polls. Yeah, I'm
17:32
just chillin' in Cedar Rapids. So
17:34
it was like that investment, I think,
17:36
wasn't really there. There wasn't that that
17:38
same appreciation for it. Then there was,
17:40
you know, Yochai Benkler and
17:43
the books that looked at the
17:45
sort of investment in the right-wing media
17:47
ecosystem. Even that was not influencer-driven,
17:49
but just the Network Propaganda book that
17:51
looked at the link, you know, the
17:54
sort of link-driven ecosystem of just
17:56
the alt-media apparatus that emerged both on
17:58
Facebook and in the you know in
18:00
the broader web they really invested much
18:03
more heavily there I think also because
18:05
the sense was that mainstream media was
18:07
owned by the Libs so why would
18:09
you invest in this ecosystem when you
18:11
already had this apparatus over here that
18:14
honestly that was the sense I got
18:16
looking at the anti-vaxxers too, right? Like
18:18
That's why they invested because they didn't
18:20
have access to that other ecosystem. So
18:22
they went and built their own using
18:25
the tools that were available to them,
18:27
even as the CDC and the institutionalists
18:29
were like, whatever, we don't need that
18:31
because we have this other thing. So
18:34
part of it was investment, part of
18:36
it was like a conscious decision to
18:38
do this, right, to say like, we
18:40
can make our people feel empowered by
18:42
boosting them. Jared, you must have seen
18:45
this also, right? There would be these
18:47
people who would, their entire Twitter bio
18:49
was like, retweeted by Donald Trump four
18:51
times, Donald Trump Jr. six times, like
18:53
Dan Bongino four times, like, it was
18:56
such a point of pride. It's like
18:58
a trophy, because, yeah, exactly, like, there
19:00
was, like, that sense that, like, the
19:02
elites would boost you. I never, I
19:04
don't think I've ever seen anybody on
19:07
the left do that. Maybe because I
19:09
don't know if that's just not a
19:11
cultural thing or if it's not happening.
19:13
It is considered to be very cringe. It
19:16
is extremely, I mean it is. I
19:18
think there's credit to the left liberal
19:20
side of social media. It's a pretty
19:22
cringe thing to do. But at the
19:24
same time, you know, you do get
19:27
people who you'll see them say like,
19:29
oh my god AOC followed me, right,
19:31
or you know, so and so followed
19:33
me like there is still that sense
19:35
of like, my hero who is good
19:38
at social media is paying attention to
19:40
me. I had that good tweet and
19:42
somebody liked it. People want to be
19:44
recognized as being part of that community
19:47
as having input and insight and I
19:49
think that it's just it's treated very
19:51
very differently and so starting in 2016
19:53
you really saw that culture begin to
19:55
emerge and it was you know they
19:58
really doubled down in 2020. And
20:00
it's kind of continued
20:02
ever since. So I would be
20:04
curious your thoughts on why this
20:06
has been so successful. In the
20:08
book you bring up a couple
20:11
things that I think are interesting
20:13
to get into around that question.
20:15
One is the 90-9-1 problem, which
20:17
I have to say slowly, so
20:20
it doesn't sound like I'm saying
20:22
like one giant messed up number.
20:24
Being that like on a lot
20:27
of social media platforms, like,
20:29
something like 90% of the
20:31
users, fairly consistently across platforms, just
20:33
lurk. They like don't post or
20:35
anything they just check it out
20:38
and leave. Nine percent of
20:40
users will post like sometimes
20:42
and an occasional user, quote-unquote,
20:45
of social media platforms tends
20:47
to be defined fairly liberally.
20:49
It's like did you post this
20:52
week? Okay, you're an occasional user
20:54
and then one percent of social
20:56
media users are just like Well,
20:59
me: just incessant,
21:01
several posts per day. We
21:03
will not stop, we will
21:06
not stop posting kind of
21:08
thing. Then there's also the
21:10
algorithmic sorting of the
21:12
recommendation systems. You know,
21:14
the algorithm has become
21:16
kind of like this like sloppy
21:18
kind of boogeyman in the
21:21
wrong hands when people talk about
21:23
it. But the fact of the
21:25
matter is these platforms recommend you
21:27
content and it seems like in
21:29
recent years are doing it more
21:31
and more. I can't get on
21:33
Instagram without like every third post
21:35
being an advertisement or some kind
21:38
of recommended post that I didn't
21:40
ask for. And the systems that
21:42
recommend that content shape not only
21:44
what we see, but also the kind
21:46
of content that gets created because
21:49
the influencers want to appeal
21:51
to them and like make a
21:53
living or get notoriety or whatever
21:55
goal they might have. So all of
21:57
that's to say what we see online
22:00
is like so far removed oftentimes
22:02
from reality. We interact with
22:04
people in ways we would
22:06
never interact with them in
22:08
person. We see content that
22:10
is just like borderline unnatural.
22:12
You use an example of like guitar
22:14
guy getting erratic, you know, audience
22:17
captured and stuff, which spoke very
22:19
directly to me as somebody with
22:21
like a pedal board full of
22:23
like gizmos and funny boxes
22:26
that make silly sounds. But
22:28
it's just like the content
22:30
almost, it's like it is
22:33
an extension of reality, but
22:35
it is also just so
22:37
like deeply unnatural too. But
22:39
yet it has this vice
22:41
grip on society. I'm curious
22:43
like how you square that in
22:45
your head. What you're describing now and
22:48
the things that you're seeing in your
22:50
feed that you didn't ask for that's
22:52
called unconnected content, and Facebook actually discloses
22:54
what percentage of unconnected content it shows
22:57
you. You can see that in their
22:59
transparency reports quarterly, which is actually kind
23:01
of nice. I mean, most platforms
23:03
won't tell you that. And it's risen
23:06
from about 8% to, the last time I
23:08
did a report on this was like
23:10
March of 2024 so it was around
23:12
24-25% by that point, right. And the
23:14
reason for it is to compete with
23:17
Tiktok. That's that is why they're doing
23:19
it, right? Because Tiktok, their big innovation
23:21
was the realization that people just wanted
23:24
to be entertained. And it actually didn't
23:26
matter if they had a social graph
23:28
that was attached to those, you know,
23:30
to the people whose content they were
23:32
seeing. Whereas Facebook had started in the
23:34
quaint early days of social platforms where
23:36
you followed your friends and you looked
23:38
at like party photos, right? That was
23:40
what we did with it. Same thing
23:42
with Instagram to some extent. As they
23:44
realize that you could build an entire,
23:46
like what they called interest graphs originally,
23:49
and then as they realize that interests
23:51
were actually in some ways quite transient,
23:53
that it could be entertainment and memes,
23:55
which didn't have to be tied to
23:57
some deep interest you had, but could
23:59
just be like... the thing that you
24:01
were obsessed with that week. I
24:03
remember, like, I started, I got
24:05
recommended the Red Bull, like, Dance
24:07
Your Style, like, competition for 2024,
24:09
which actually came out like four
24:11
or five months ago, I think, so
24:14
these are not even new videos,
24:16
but I watched one and I
24:18
was like, hey, this is actually really
24:20
good. These dancers are like amazing.
24:22
I used to dance back in
24:24
the day and so I do watch
24:26
dance content. And I started watching
24:28
more of them and like actually
24:30
then going and looking up some of
24:33
the dancers I liked, and this
24:35
was maybe like two weeks ago,
24:37
my entire YouTube feed is nothing but
24:39
dance videos now. That is it.
24:41
Like that is all I'm getting,
24:43
you know. And I think for the
24:45
probably for like the next couple
24:47
weeks until I hit on something
24:49
else or like go proactively looking for
24:52
something else, like that's what I'm
24:54
going to get. And it's just
24:56
going to be like a deluge. And
24:58
Instagram is like this too.
25:00
Like,
25:02
like, like, like, like, Like, you do
25:04
one thing and then it thinks
25:06
like, okay, boom, this is what
25:08
they want and in order to keep
25:11
them on site, we're going to
25:13
give them just like an absolute
25:15
shit ton of it and like that's
25:17
what you're going to get. And
25:19
it's very hard to break out
25:21
of it. And I remember when I
25:24
was looking at, this unconnected content
25:26
project, we were looking at AI
25:28
Slop, which didn't even have a name
25:30
then. We were just doing this
25:32
project studying, like,
25:34
these garbage AI pages. And that was
25:36
all I was getting and it
25:38
actually rendered Facebook like completely unusable
25:40
for me because once you engaged with
25:43
one of the pages, literally that
25:45
was all that
25:47
you were going to get because the
25:49
like the poll from the recommender
25:51
system. People really were
25:53
curious about AI-generated content and it just
25:55
pushed it and pushed it and
25:57
pushed it. And one of the
25:59
things you used to be able to
26:02
see on CrowdTangle was the
26:04
engagement with these pages after they
26:06
kind of... But what you would see
26:08
is the view counts on this
26:10
content, which was like tens and hundreds
26:12
of millions of views on some of
26:14
the videos or some of the
26:16
images and then all of a
26:18
sudden they changed the recommender system and
26:21
you just see the content engagement
26:23
just flatline just like boom just
26:25
drops off a cliff and it's because
26:27
it's entirely determined by what the
26:29
platform decides to show people it's
26:31
not like organic searches where people are
26:33
going and looking for the page
26:35
so that power of suggestion and
26:38
that nudge is really incredibly powerful and
26:40
it's also completely... So what you're
26:42
seeing and what your neighbor is
26:44
seeing are completely different, every now and
26:46
then there'll be some cultural moment
26:48
that everybody gets, right? Something that
26:50
is like very, very major, but otherwise
26:53
it is so random and so
26:55
determined just by the things that
26:57
you do that it makes it I
26:59
think challenging to feel like we're
27:01
having some kind of shared conversation
27:03
or shared, you know, like national... discourse
27:05
in a way? Well,
27:07
we're, like,
27:09
completely divided, right? Like that's, I mean,
27:12
it used to be people would
27:14
talk about, oh, we're, we're a
27:16
divided country. And now it's like, there's
27:18
no discussion about it anymore. We
27:20
just know that it's like we
27:22
just know that we're just completely divided.
27:24
And I was just going to
27:26
ask about that, the way
27:28
those sort of algorithms work, whether
27:31
this kind of widespread sort of
27:33
fascist streak inside MAGA, or
27:35
MAGA itself being a fascist movement, is
27:37
that the result of these algorithms?
27:39
I mean, because it didn't seem
27:41
like there was an agenda to turn
27:43
the country this far to the
27:45
right or this authoritarian, right? It
27:47
was, there was a pushback against Trump
27:50
from the Republican Party, the minute
27:52
he started running. And a lot
27:54
of people trying to push against it
27:56
and he became a social media
27:58
phenomenon. And it seems like
28:00
now like a decade later I don't
28:02
want to oversimplify but it does feel
28:05
like the algorithm really has taken the
28:07
entire party or half the country on
28:09
this ride that they wouldn't have gone
28:11
on necessarily a decade ago. I think
28:13
it helps things become very normalized. It's
28:15
not only the algorithm. That's the one
28:17
thing, I think. Like, I
28:19
try to make the point even in
28:21
the book: the reason I
28:23
describe it as influencer, algorithm, crowd is
28:25
like you can't have one of those
28:27
things without the other. The algorithm serves
28:29
what it has available. So whether, you
28:31
know, it's showing me these dance videos
28:33
because they're there and I like it,
28:35
right? And it is that because I
28:37
like it piece that actually is, you
28:39
know, one of the things that's there, right?
28:41
There's like curiosity, there's interest, there's something
28:44
that keys off of it. This is
28:46
the same thing with, you know, I'll
28:48
use anti-vaccine content as an example. When
28:50
I was getting pushed it, I would
28:53
click. Now I was like kind of
28:55
hate-clicking, I was curiosity-clicking, but
28:57
the reason I kept getting more of
28:59
it was because I engaged and so
29:02
you know when you mentioned a couple
29:04
minutes ago the influencer's incentive, or
29:06
the guitar guy example that
29:08
I created just to not point to
29:11
one particular person as the audience capture
29:13
avatar. The influencer is creating content for
29:15
the algorithm and the audience and they
29:17
have to always be doing it for
29:19
both. Otherwise, their content isn't going to
29:21
get promoted, which means that nobody's going
29:23
to see it, and they're not going
29:25
to get that initial burst of engagement,
29:27
like the likes and the shares or
29:29
views that are then going to push
29:31
it out to more people. So they
29:33
have to be doing it for the
29:35
algorithm and the crowd. And these things
29:37
always, like, you have to think about
29:39
it basically as like a triangle. And
29:41
so what that means with the kind
29:43
of normalization piece, the thing that's very
29:45
interesting about the influencer is that
29:47
they become that, like, engine
29:49
of normalization. And when
29:52
they realize that the audience
29:54
is receptive to something that
29:56
feedback loop between the influencer
29:59
and the audience means that they're
30:01
essentially getting permission to say the more
30:03
extreme thing because the audience is like,
30:06
okay, cool, right? If the audience is
30:08
like, no, no, no, no, no, what
30:10
are you doing? Then they won't do
30:12
it. But if the audience is like,
30:15
yes, totally I'm with you, let's go.
30:17
Then they do. And that's where you
30:19
see that, like, you know, you don't,
30:22
you can't just blame the tech, right?
30:24
The tech recognizes that
30:26
that that that that cycle is happening
30:28
is happening. But
30:31
there is that piece of, yeah, I'm
30:33
curious, like, yeah, I'm going along for
30:35
the ride with you. Yes, I'm actually
30:38
gonna kind of like, in some cases,
30:40
nudge you in a particular direction. There's
30:42
this, some of the research that like
30:44
Damon Centola does, he's a professor at,
30:47
I think UPenn, if I'm not
30:49
mistaken, I hope I didn't get that
30:51
wrong, where what he talks about is
30:54
this thing called complex contagion, where the
30:56
influencer is almost acting as like the
30:58
gatekeeper in a sense where they're deciding,
31:00
you know, they're seeing their kind of
31:03
fandom talking about all this stuff, they
31:05
can decide whether to pick it up
31:07
and move it along, right, by boosting
31:10
that random thing that they're, you know,
31:12
that their fan is saying, you know,
31:14
like you can see this in the
31:16
election rumor mill, if you pick it
31:19
up and boost it and say, like,
31:21
big if true, you're still kind of
31:23
giving it your credibility,
31:26
you're pushing it out to more people.
31:28
And what's changed
31:30
is that that is a fully normal
31:32
reasonable, completely accepted position to hold in
31:35
that community now in a way where
31:37
even four years ago, they would still
31:39
try to couch it a little bit.
31:42
You know, hey, some people should, you
31:44
know, people should really look into this,
31:46
oh, I don't know, big if true,
31:48
somebody should go check that out. Now
31:51
they're like, look at this, they're doing
31:53
it again, you know, they're stealing it,
31:55
here they are stealing it. They're not
31:58
gatekeeping it anymore. They're
32:00
just like moving it right along and
32:02
that's this shift: they're no
32:04
longer risking their credibility and kind of
32:07
gating it and serving as a limit
32:09
to it, they're essentially just like moving
32:11
it right on through. And I
32:14
think as that audience gets bigger,
32:16
it also has like a, it
32:18
disincentivizes the opposite, right? I
32:20
mean, Mitt Romney and Liz Cheney
32:22
are political has-beens at this
32:25
point. Why? Because they had the
32:27
nerve to stick to their point
32:29
that the Capitol riot was not
32:31
ideal. And they were punished, right?
32:34
So it's like a punishment mechanism
32:36
too. But I, you know, sort
32:38
of last question on the theory
32:40
of understanding part of this before
32:42
we shift gears a little bit
32:45
is, you know, I want to
32:47
get your thoughts on just how
32:49
impactful is this stuff really? Because
32:51
sometimes, you know, like in my
32:54
day job, it can be. I
32:56
sometimes, a post can get like
32:58
10,000 retweets or something like this,
33:00
right? It's like somebody who manages
33:02
an organization who spends like an
33:05
hour a week on the internet
33:07
because they have like a real
33:09
job doing a real thing. But
33:11
you know, sometimes a post can
33:14
get like 10,000 retweets or something.
33:16
But like what's it mean? Nothing,
33:18
right? Or sometimes a post gets
33:20
like a hundred retweets and then
33:22
holy shit. There's like a hate
33:25
group that just showed up at
33:27
your book launch event or something.
33:29
You know, so how powerful is
33:31
social media like really is there,
33:34
you know, any sort of consistent
33:36
identifiers in terms of like once
33:38
this happens or like if it
33:40
does this? You know, it's
33:42
like all media; I've never
33:45
been under the illusion that we
33:47
would like record a podcast or
33:49
I would write a news article
33:51
or a paper and it would
33:54
change the world forever. It's like,
33:56
it's just material and people use
33:58
it and pick it up for
34:00
whatever they might want to do
34:02
or whatever they might want to
34:05
take away from it. But is
34:07
social media in the way it's
34:09
integrated in our lives, like inherently
34:11
impactful in a way that other
34:14
media maybe isn't or are there?
34:16
things about it or like ways
34:18
of interpreting it that can make
34:20
it more or less impactful in
34:22
any given circumstance? The thing that,
34:25
so first of all, I agree
34:27
that oftentimes it's overstated where someone's
34:29
like, look at this one post,
34:31
it got 10,000 likes, oh my
34:34
God, that's such a big deal,
34:36
right? And we've known for a
34:38
long time now, you know, dating
34:40
back to media theory studies and
34:42
from the 1940s that, you know,
34:45
what they called the hypodermic, the
34:47
hypodermic model of media changing people's
34:49
minds, you know, the idea that
34:51
if you saw something on the
34:54
nightly news, your opinion would magically
34:56
be changed. Like, no, that's not
34:58
how it works. But one of
35:00
the things that's interesting about it,
35:02
is that what studies did find,
35:05
even back then, was that it
35:07
was, you know, your opinion would
35:09
be shaped based on the conversations
35:11
that you had with people, right.
35:14
there'd be members of the community
35:16
that were called opinion leaders who
35:18
paid a lot of attention to
35:20
media, and that you would talk
35:22
to those people who paid a
35:25
lot of attention to media, you
35:27
would kind of decide amongst yourselves
35:29
what the facts of the matter
35:31
were. And that was sort of
35:34
how you would over time shift
35:36
your opinion and come to believe
35:38
something or come to like a
35:40
political candidate or whatever it was.
35:42
And what I think is interesting
35:45
about social media and particularly influencers,
35:47
and why I spend so much
35:49
time paying attention to them, is
35:51
that those two things are now
35:54
like merged into one thing, right?
35:56
So they're both media in that
35:58
they have that, like, elevated
36:00
reach, that mass reach, and they're kind
36:03
of just like you and they're talking
36:05
to you. And the community of people
36:07
who kind of grow up around them,
36:09
you know, the sort of friends that
36:11
you make online, are the people that
36:13
you're talking to. So you no longer
36:15
have that maybe more diverse set of
36:17
people that you meet in the real world.
36:20
You're no longer geographically bounded with all the
36:22
different constraints that come with that. Instead, you
36:24
have this very kind of homogenous group that
36:26
you've been sort of shunted into as,
36:28
you know, the People You May Know algorithm
36:30
is like, you're just like these people. You
36:32
should all be just like each other together.
36:34
You know, oh look, you all, you know,
36:36
you all share weird anti-government beliefs. You should
36:38
join this QAnon group over here, right? You
36:40
remember this from the old days of like
36:42
the Facebook
36:44
groups? That sort of dynamic
36:47
of, hey, we're going to put you
36:49
all together. You're all going to spend
36:51
a whole lot of time talking to
36:53
each other. You're going to follow influential
36:55
figures who are kind of just like
36:57
you. They're going to talk very directly
36:59
with you. They're not going to feel
37:01
like media. They're also going to have
37:03
massive reach and all of the sort
37:05
of trappings that go with what we
37:07
used to think of as media. So
37:09
they're this weird hybrid figure. And that's
37:11
where you're going to spend less and
37:13
less and less time with friends in
37:16
the real world, this is going to
37:18
kind of replace your socialization
37:20
too. So I think that it
37:23
makes it an incredibly effective persuasion
37:25
engine just in terms of like
37:27
these different dynamics all coming together
37:30
in one place. That's why I
37:32
think it's actually a really
37:34
interesting thing to be paying attention
37:36
to. I think there's this dynamic
37:39
of like the rumor into propaganda
37:41
cycle, right? Like eating the
37:43
pets was actually a thing. Is
37:45
everything a thing? No. But some
37:47
of them hit. You know, and
37:50
it's understanding how these random moments
37:52
feed into these bigger stories and
37:55
then also the ways in
37:57
which we as individuals actively
37:59
have the power to participate in
38:01
shaping those things now, which makes
38:03
people feel, I think, much more
38:05
invested as opposed to when they
38:07
were kind of sitting on their
38:09
couch just watching it go by.
38:11
So that's what I think is
38:13
different. I have an unscientific theory,
38:15
and I'm curious if you would
38:17
weigh in just to tell me
38:19
whether it has any basis or
38:21
not. So I always thought like
38:23
something about influencers or social media
38:25
was more effective at radicalizing people
38:27
or at least getting people to
38:29
be more you know to be
38:31
more laser focused on one particular
38:33
thing or the other in part
38:35
because influencers access
38:37
you usually completely
38:39
alone and trapped with your phone
38:42
or your device or whatever whereas
38:44
you know the newspaper is something
38:46
that it's like you know it's
38:48
out in the open everybody knows
38:50
that you're reading it and what
38:52
you're reading when you open a
38:54
newspaper and when you like watch
38:56
Better Call Saul with your husband
38:58
or whatever, there's a kind of
39:00
like sit on the couch and
39:02
watch TV together. Go to the
39:04
movie theater and watch the movie
39:06
together. Go to the concert and
39:08
you know, and watch pavement play
39:10
or something like that. I don't
39:12
know. Yeah. So, so these things
39:14
that I just mentioned, but but
39:16
when you're, you're kind of like
39:18
just online. You're kind of in
39:20
this, you know, this sort of
39:22
stereotype, cliche, whatever, of like you're
39:24
just in a dark room and
39:26
Alex Jones is ranting at you
39:28
about, you know,
39:30
what Ukraine is secretly trying to
39:32
do. It has a different, it
39:34
has a different effect, right? And
39:36
then the social media, maybe I'm
39:38
wrong, but like that's just the
39:40
way I've always thought about it.
39:42
It's just, you know, people in
39:44
those kind of misogynistic places and
39:46
stuff like that, where you just
39:48
imagine these guys in these really
39:50
dark rooms, like just their brains
39:52
cooking on propaganda, they're not socializing.
39:54
They are, they're just trapped with
39:56
their influencers. I think they are
39:58
actually. Like, one thing, people
40:00
will take that stuff and they'll
40:02
chuck it into the group chat,
40:04
right, or the discord server, or
40:06
the message board. It's actually like
40:08
really, or the, you know, subreddit
40:11
back in the day when that
40:13
stuff was on Reddit more. I
40:15
think that there's actually a fair
40:17
bit of I saw this thing
40:19
and now I'm sharing
40:21
it. And I think that it
40:23
is actually very much like who
40:25
you share it with. I mean,
40:27
you look at the number of
40:29
random links that just land on
40:31
even the Chan boards, right? Like
40:33
people are, I see this thing,
40:35
I share with my friends, I
40:37
see this thing, I share with
40:39
my friends, like that's the, like
40:41
that's kind of the function in
40:43
the group chat, right? It's to,
40:45
like, we're jointly
40:47
processing this weird thing that we've
40:49
just seen. You can do it
40:51
in public on Twitter, right, and
40:53
you can do it with strangers
40:55
on those places, but you also
40:57
have these places where you have
40:59
that more persistent standing community. That's
41:01
I think the difference between the
41:03
group spaces, like the spaces that
41:05
are group oriented, whether that's Facebook
41:07
groups, which are a little more
41:09
obscured, or the ones where the
41:11
message boards where you can actually
41:13
kind of see it all go
41:15
by, versus the ones where you're
41:17
just talking in public to whatever
41:19
random person happens to be online
41:21
at the same time as you.
41:23
You do see, there was like
41:25
the, I used to
41:27
say it as a joke, but
41:29
I think it's actually kind of
41:31
true, right? Like Twitter makes these
41:33
mobs that kind of come together
41:35
really spontaneously, but Facebook was where
41:37
you would have like the cults
41:40
that would really come out of
41:42
there, right? Like the people who
41:44
were like, we are here, man.
41:46
And like Alex Jones stuff would
41:48
get tossed in there all the
41:50
time. It was just, this is
41:52
where we
41:54
are.
41:56
The standing communities will be a
41:58
little bit more like that. So
42:01
the government has taken some
42:03
interest in this, our good
42:05
friends, the US government, avid listeners
42:07
to the podcast. I learned in
42:10
a very strange way that I
42:12
may retell on a later episode
42:15
of the show, but the Republican
42:17
Party particularly has
42:19
taken a pretty heavy interest in
42:21
social media. Like you said, in
42:24
the early days it was all
42:26
ISIS. It was like, how is
42:29
ISIS using Twitter to
42:31
get towns to empty out before
42:33
anybody even sets a foot there,
42:36
right? Or like, how are they
42:38
using it to recruit? Which, like,
42:40
even if it was a small
42:42
number of people, that's a huge,
42:44
you know, it doesn't take very
42:47
many people to make a huge
42:49
impact on something, right? So it
42:51
started on that, and then,
42:53
like, some steps happened, and
42:55
now it's like Jim Jordan
42:57
really grilling Mark Zuckerberg about the
43:00
Diamond and Silk Facebook page and
43:02
why it's not getting as many
43:04
likes as it should or like
43:07
Donald Trump Jr. on Instagram being
43:09
like my post only got 10,000
43:11
faves. Obviously I'm being censored
43:14
here and like that became
43:16
that sort of partisan calling
43:18
card on the flip side,
43:20
especially after the election of
43:22
Trump and you know as
43:24
Democrats tried to hit their head
43:26
against the brick wall figuring out
43:28
what happened. You know, one of
43:30
the early theories was like, well,
43:33
it's the internet. It's Russian trolls
43:35
on the internet and bots on
43:37
the internet, which had a grain
43:39
of truth to it. It's not
43:41
the whole story, but like, even
43:43
a little bit, but in the
43:45
course of all of that, people,
43:48
including yourself very personally, have
43:50
been turned into, like, you
43:52
said it earlier, like artificial
43:55
boogeywomen, you know,
43:57
which which I agree Mike
43:59
sounds incredibly cool. But
44:01
like I don't imagine that experience
44:03
felt incredibly cool for you Renee.
44:05
The first time I ever heard
44:08
of you was in the context
44:10
of hearing how horrible a person
44:12
you are from
44:14
a person who was genuinely horrible
44:16
to be clear. You know I
44:18
can't remember what Nazi was
44:20
throwing that out. So the government
44:22
went from being like kind of
44:25
interested vaguely like supportive on quote,
44:27
unquote national security grounds of like
44:29
this sort of research into how
44:31
does the internet influence the world,
44:33
misinformation, disinformation, blah, blah, blah. Me
44:35
too. Me too. Me too. That's,
44:37
if it, I realize I speak
44:39
monotone, but like, there's contempt in
44:42
my voice. So they went from
44:44
being vaguely interested to then just
44:46
like going on the offensive against
44:48
researchers and institutions that look at
44:50
this stuff. You had a very
44:52
unfortunate front row seat to this
44:54
kind of through the whole life
44:56
cycle. What changed? Why did they
44:59
get so fucking mad at everybody
45:01
who tries to understand how the
45:03
internet works? Yeah, because some of
45:05
us said the 2020 election was
45:07
free and fair and that was
45:09
an inconvenience to them. Ah, that
45:11
sucks. Yeah, that's your first mistake.
45:13
Yeah, I know, I know. Troublesome
45:16
facts. No, so in, first of
45:18
all, there were some really clear,
45:20
bright lines. Obviously, ISIS was extraordinarily,
45:22
you know, unambiguously clear, like we
45:24
don't like terrorists on our social
45:26
platforms. Though, actually, funny enough:
45:28
If you go back in time
45:30
and you pull up the old,
45:33
you know,
45:35
media articles about
45:37
that, Twitter was
45:39
for a while there making the
45:41
one man's terrorist is another man's
45:43
freedom fighter kind of argument wondering
45:45
like is it a slippery slope
45:47
if we start taking these accounts
45:50
down and like where will it
45:52
end? So for a while, they
45:54
would be called, you know, like
45:56
unintentionally based. But,
45:58
you know, it was
46:00
like the Bataclan attack and
46:02
a couple of other things that
46:04
really shifted the narrative as people
46:07
began to realize that for our
46:09
earlier chat, propaganda does work in
46:11
fact, and at some point, some
46:13
susceptible people commit atrocities and, and
46:15
that is, you know, the, the
46:17
trade off of that, you know,
46:19
they began to, you know,
46:21
to take down some of that
46:24
content. What happened with Russia then,
46:26
you know, this was one of
46:28
these things that was a little
46:30
bit frustrating for me because I
46:32
was asked to run one of
46:34
the research teams that analyzed the
46:36
data sets that were turned over
46:38
to the Senate Intelligence Committee. So
46:41
there were data sets that the
46:43
platforms attributed to the Internet Research
46:45
Agency, just to be clear, I
46:47
did not attribute. I was just
46:49
given the data that they attributed.
46:51
And the Internet Research Agency is
46:53
like the firm that's kind
46:55
of widely thought of as like
46:58
executing a lot of like online
47:00
information operation kind of things.
47:02
Yeah, they would use these these
47:04
trolls when they wanted to you
47:06
know, like kind of change the
47:08
information space ahead of invading Crimea
47:10
or you know, manipulating the American
47:12
discourse. They didn't do it solely
47:15
to interfere in the election. They
47:17
did it far more broadly than
47:19
that. But they did it through
47:21
the 2016 election, which was how
47:23
it came to be very caught
47:25
up in the Hillary Clinton Donald
47:27
Trump discourse. My report on that
47:29
was actually, it's funny because I
47:32
re-read it again recently as I
47:34
was fighting with Matt Taibbi about
47:36
it. It's fairly boring to be
47:38
honest. It is extremely descriptive because
47:40
I wanted it to be as
47:42
neutral a description of what happened
47:44
as possible. Nothing that we were
47:46
given would have enabled us to
47:49
answer the question, did this swing
47:51
the election? Right? So I just
47:53
wanted to say, like, here is
47:55
what it is, here is how
47:57
it worked, here is what it
47:59
did, and you know, here's how
48:01
it intersected with these various communities
48:03
and here's why we should be
48:05
paying attention to it. However, because
48:08
of the broader conversation about Donald
48:10
Trump and collusion, which was wholly
48:13
outside of what I looked at,
48:15
it got caught up in that,
48:17
you know, that thing that came
48:19
to be called Russiagate, which no
48:22
one can define, but you understand
48:24
it, you know, I too became
48:26
part of the quote unquote Russiagate
48:28
collusion hoax. Right, because anybody who
48:31
did any research, even vaguely touching
48:33
Russia, was rebranded as this Democrat
48:35
operative trying to, you know, like
48:38
allege that Donald Trump's election was
48:40
illegitimate. So that was
48:42
a very frustrating thing. But again,
48:44
it was, you know, it's not
48:46
really the end of the world.
48:49
And to their credit, the Trump
48:51
administration did actually institute a series
48:53
of counter foreign disinformation efforts within
48:55
the FBI, within DHS, within
48:57
the intelligence community, within ODNI.
49:00
I feel like I've just
49:02
left one out. Oh, and
49:04
within the State Department, the
49:06
Senate expanded the mandate of
49:08
the Global Engagement Center to
49:10
counter foreign propaganda. That had
49:12
actually been established to counter
49:14
ISIS propaganda, terrorist propaganda; now it
49:16
was expanded to counter Russian
49:18
and Chinese propaganda as well.
49:20
So around this time, China
49:23
had also begun interfering, Iran
49:25
had also begun interfering, and you
49:27
know, and as we found out,
49:29
like the US Pentagon was running
49:32
influence operations, it became table stakes,
49:34
but there was this like really
49:36
clear line again still around foreign
49:38
versus domestic. The problem
49:40
was what happened in 2020 was
49:43
that a lot of the
49:45
same kinds of dynamics of
49:47
wildly viral false
49:49
accusations intended to
49:52
manipulate and mislead
49:55
people began to emerge
49:58
from domestic American
50:00
influencers. And what we started to
50:02
see was that those narratives, false
50:05
and misleading claims, intended to manipulate
50:07
audiences, which do, in some capacities,
50:09
some specific ones, did meet the
50:11
definition of disinformation campaigns, were being
50:13
run by domestic American influencers. And
50:15
this created a real problem. So
50:17
those of us who were studying
50:20
the 2020 election, like, also at
50:22
Stanford Internet Observatory, Kate Starbird's team
50:24
at University of Washington Center for
50:26
an Informed Public, DFRLab. I
50:28
think you might have been a
50:30
DFRLab at the time, right?
50:32
I was. I didn't
50:34
work on EIP. I was like
50:37
deep in militia land at the
50:39
time. Right. Which, you know, looking
50:41
back was maybe a blessing. Yeah,
50:43
you escaped. But, you
50:45
know, EIP, this is,
50:47
for the listeners,
50:49
what we called
50:51
the sort of inter-institutional consortium that
50:54
came together. Graphika was the fourth
50:56
org. We came together again largely
50:58
because we thought we would see
51:00
a lot of foreign interference. And,
51:02
you know, Russia, Iran, China did
51:04
mess around, but the most impactful
51:06
claims, both rumors and, you know,
51:09
these allegations that like Dominion voting
51:11
machines were changing votes and that
51:13
the election wasn't free and fair
51:15
and had been stolen. All the
51:17
things that we chronicled, the really
51:19
impactful stuff was not coming from
51:21
foreign trolls, it was coming from
51:23
Donald Trump and his inner circle.
51:26
And as we chronicled that, as
51:28
we documented it, as we occasionally
51:30
engaged with tech platforms and said,
51:32
hey, you know, some of these
51:34
posts appeared to violate your terms
51:36
of service, as we engaged with
51:38
state and local election officials and
51:40
said, hey, like this rumor about
51:43
Sharpie markers, this is kind of a
51:45
big deal, you should probably respond.
51:47
That work that we did, which
51:49
we did in full view of
51:51
the public, we wrote a 250
51:53
page final report on it, was
51:55
recast two years later when in
51:58
2022 the House flipped and Jim
52:00
Jordan got his gavel. It was recast
52:02
by the Weaponization Committee as
52:04
a vast effort not to track
52:06
what was happening in the 2020
52:08
election, but to somehow mass censor
52:10
the 2020 election in real time.
52:12
And those narratives that I've described,
52:15
Dominion, Sharpie-gate, these, like, massive
52:17
viral things that we watched go
52:19
by and that we chronicled as
52:21
they went by, they alleged that
52:23
we somehow censored as they went
52:25
by, which is complete bullshit because,
52:27
anybody listening, like, Fox News paid
52:29
out $700-and-something million dollars
52:32
over that Dominion claim, but the
52:34
allegation that Jim Jordan wants you
52:36
to believe is that we somehow
52:38
censored it, right? Nonetheless, that was
52:40
really the tipping point whereby they
52:42
began to make this argument, particularly
52:44
because Donald Trump was also deplatformed
52:47
after January 6th, that academic research
52:49
into rumors and propaganda was really
52:51
a vast plot to censor conservatives.
52:53
And that's because... the vast majority
52:55
of people who were moderated, you
52:57
know, or who had their accounts
52:59
taken away after January 6th, were,
53:01
you know, the QAnon accounts
53:04
and some of the conservative election
53:06
deniers. I remember, you know,
53:08
some of the, I believe, your
53:10
reports, or the reports that
53:12
you were working on, and other
53:14
folks at the time,
53:16
and they were like, the
53:18
Dominion pushers were, of course, Trump
53:21
and Don Junior and people
53:23
like that, but also, like, Tim
53:25
Pool. And these are people who
53:27
all have some connections to Russia
53:29
as well, right? And one thing,
53:31
where, like, Tim Pool was recently found
53:33
out in that scandal, like, that's Tenet
53:36
Media. Right, yeah. So I mean,
53:38
is it, like, when we say
53:40
that they're, like, US influencers, I'm
53:42
not, I am not saying that
53:44
it's, like, another Russiagate or
53:46
whatever buried inside the US thing,
53:48
but, like, have the lines broken
53:50
down so much between our American
53:53
influencers, and, like, are the goals
53:55
so closely aligned between MAGA and
53:57
the Kremlin that it almost doesn't
53:59
matter anymore? There's no need for
54:01
a conspiracy anymore because they're
54:03
just kind of, I mean, look
54:05
what's happening, right? You see what's
54:08
happening with Trump and the Trump
54:10
administration and getting rid of everything
54:12
that the Kremlin would want them
54:14
to get rid of. Bottom text.
54:16
We wouldn't make that attribution at
54:18
Stanford Internet Observatory. We kept those
54:21
lines very bright because, in our
54:23
opinion, we would talk about the narrative
54:25
overlap as, like, one very, very, very
54:27
small part, and that's because even when
54:29
I did the work for the Senate
54:32
on the, you know, the data
54:34
set that covered from 2015 to 2017,
54:36
like, the Russian trolls that were
54:38
targeting the right would just go pick
54:40
up Turning Point USA memes, and they
54:43
would literally cover the Turning Point logo
54:45
with their own logo, and they
54:47
just plagiarized. And that's because it's always
54:49
been true. Well, I mean, they would
54:51
do this with like a whole lot
54:53
of different communities. You know, they would
54:55
target the black community and they would
54:57
pull from like black media and plagiarize
54:59
from there. They would plagiarize from news
55:01
media. They had this really terrible, you
55:03
know, like the live-laugh-love, like,
55:05
that horrible shit, the Pinterest-lady
55:07
stuff. Like, they had a Being Liberal
55:09
one that targeted women that used that
55:11
kind of stuff, which I'm sure they
55:13
just pulled from Etsy. They didn't have
55:15
to come up with new stuff because
55:17
Americans do it to ourselves. And if
55:19
you want to create identity and divisive
55:21
content, you just go and take what's
55:23
already available to you. And so when
55:26
we would talk about foreign interference,
55:28
it was very important to us that
55:30
we weren't relying just on content, that
55:33
we were always looking at either some
55:35
sort of evidence of, like, a network,
55:37
you know, a network in play, or
55:39
explicit accounts where we could say this
55:41
is demonstrably tied to some sort of
55:43
foreign network, some sort of foreign actor.
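(An aside to make the "network, not just content" standard concrete: one simple coordination signal is a burst of accounts pushing the same URL within seconds of one another. The sketch below is illustrative Python, not the Observatory's actual tooling; the data, window, and threshold are all invented, and in practice content overlap alone would never support an attribution.)

    # Hypothetical coordination check: flag URLs that a cluster of accounts
    # posts within a short window of one another. All data and thresholds
    # here are invented for illustration.
    from collections import defaultdict

    posts = [  # (account, url, unix_timestamp)
        ("acct_a", "http://example.com/story", 1000),
        ("acct_b", "http://example.com/story", 1003),
        ("acct_c", "http://example.com/story", 1005),
        ("acct_d", "http://example.com/other", 5000),
    ]

    WINDOW_SECONDS = 30
    MIN_ACCOUNTS = 3  # arbitrary threshold for the sketch

    by_url = defaultdict(list)
    for account, url, ts in posts:
        by_url[url].append((ts, account))

    for url, events in by_url.items():
        events.sort()
        first_ts = events[0][0]
        burst = [acct for ts, acct in events if ts - first_ts <= WINDOW_SECONDS]
        if len(burst) >= MIN_ACCOUNTS:
            # prints: possible coordination on http://example.com/story: ['acct_a', 'acct_b', 'acct_c']
            print(f"possible coordination on {url}: {burst}")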
55:45
So we did in our, you know,
55:47
in our interest to kind of like
55:49
keep those two things separate, like we
55:51
wouldn't have made that argument, but you
55:54
certainly can make the point that there
55:56
is a lot of ideological overlap and,
55:58
you know, I read the Tenet Media
56:00
indictments also, there is clearly a realization
56:02
that that kind of stuff can happen.
56:04
I mean, they would also, you know,
56:07
they ran some, they called it Peace
56:09
Data, I think, was the name of
56:11
it, if I recall correctly, like a
56:14
burning-left kind of garbage site that
56:16
they tried for a while. So, you
56:18
know, any time they want to appeal
56:21
to some sort of, you know, niche
56:23
identitarian politics, they're going to go and
56:25
find it and find someone who can
56:28
boost it. You know, I've no doubt
56:30
that they're going to continue to do
56:32
it, particularly now that, again, I mentioned
56:35
kind of to its credit, most of
56:37
that election defense infrastructure was built and
56:39
established during the first Trump administration. In
56:42
the first six weeks of the second
56:44
Trump administration, it's all been destroyed, just
56:46
to be clear; all of it is
56:49
gone. The Foreign Influence Task Force was dismantled.
56:51
All those employees at CISA who had
56:53
the audacity to say that the 2020
56:56
election was free and fair have been
56:58
placed on administrative leave, right? The ODNI,
57:00
a lot of those people
57:03
have been, the political appointees are gone.
57:05
It's not totally clear what they're going
57:07
to staff there. That's always hard to
57:10
follow because it's, you know, within the
57:12
intelligence community. And then the Global Engagement
57:14
Center was defunded back in December. It
57:17
seems like they're quite well set up
57:19
to just, you know, shoehorn in whatever
57:21
foreign influence things are going to
57:24
help them and just allow it to
57:26
happen. Well, you know, I mean,
57:28
it's that the justification for
57:31
it is, like, Russiagate, like,
57:33
that's the, which, again, I can't
57:35
define Russiagate for you
57:38
if I tried at this
57:40
point. It's like, you know,
57:42
The other piece of it that's kind
57:45
of fascinating though is like where the
57:47
social media platforms are going to come
57:49
down, because there has never been any
57:52
kind of regulatory requirement for them to
57:54
do those investigations. So, as
57:56
I mentioned, you know, Jim Jordan got
57:59
his gavel, and he decided that
58:01
the Election Integrity Partnership and all of
58:03
the work that we had done was
58:06
part of this vast cabal. And the
58:08
argument that they made was that we
58:10
did our work at the direction of
58:13
the deep state because remember, Trump ran
58:15
the government, Trump appointees ran the government
58:17
during the 2020 election. So somehow, even
58:20
though this letter that we get from
58:22
Jim Jordan alleges that he's investigating the
58:24
Biden censorship regime, the people that we
58:26
were talking to worked for the Trump
58:29
administration. So we find ourselves in this
58:31
like, sort of surreal, Kafkaesque, you
58:33
know, universe in which we're being accused
58:36
of being told by the government to
58:38
tell the tech companies to take down
58:40
conservative tweets. And they come up with
58:43
this number like 22 million tweets. And
58:45
they get that number 22 million by
58:47
going to our report, which again has
58:50
been on the internet for two years
58:52
by this point, and in our table
58:54
of the most viral narratives, we had
58:57
like the top 10 most viral narratives
58:59
that we had followed, which we calculated
59:01
after the election, right? We added up
59:04
all of the numbers, like 9 million
59:06
for Dominion and, I don't know, 700,000
59:08
for Sharpie Gate, whatever it was. And
59:11
at the bottom of the column, it
59:13
sums up to 22 million. And so
59:15
the number that was the top, you
59:18
know, the sum total of the most
59:20
viral stories was 22 million, and they
59:22
just reframed it; they literally just took
59:27
that number.
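(A note on the arithmetic, since it is the crux of the claim: the 22 million was the column total of per-narrative engagement counts in the public report's table of most viral narratives, not a count of anything removed. A minimal sketch of that sum; apart from the roughly 9 million Dominion figure, the roughly 22 million total, and the 4,700 flagged URLs that come up in this conversation, every number below is invented so the column adds up.)

    # Hypothetical reconstruction of the report's "most viral narratives" table.
    top_narratives = {
        "Dominion voting machines": 9_000_000,   # figure mentioned above
        "Sharpie-gate": 700_000,                 # figure mentioned above
        "eight other narratives (combined, invented)": 12_300_000,
    }

    column_total = sum(top_narratives.values())
    print(f"sum of engagement counts: {column_total:,}")  # 22,000,000

    flagged_urls = 4_700  # what EIP actually flagged, per this conversation
    print(f"'censored tweets' claimed vs. URLs flagged: {column_total:,} vs. {flagged_urls:,}")
    print(f"difference: {column_total - flagged_urls:,}")  # 21,995,300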
59:27
And Matt Taibbi and Michael Shellenberger sat under oath in a congressional
59:29
hearing and said Stanford Internet Observatory and
59:32
the Election Integrity Partnership, like, censored 22
59:34
million tweets. Just a fucking insane allegation,
59:36
you know, and I was like, this
59:39
is so surreal. If I was
59:41
that sloppy, and it was,
59:43
like, clear that I was that sloppy,
59:46
it's so crazy because like when I
59:48
was in journalism school, I was, like,
59:50
reading Taibbi in Rolling Stone, and, you know,
59:53
with my, like, long-form investigative professor
59:55
just being like, this guy is amazing,
59:57
I want to be him and if
1:00:00
I had made the kind of blunders
1:00:02
he seems to make every week or
1:00:04
every time he tries to talk about
1:00:07
this, I would have, like, just
1:00:09
shot off to an island somewhere and
1:00:11
like, never shown my face again. You
1:00:14
wouldn't have been able to work. No,
1:00:16
I wouldn't have a job. You would
1:00:18
have faced repercussions because of your ideology.
1:00:21
You can do almost anything
1:00:23
on the right and get away with
1:00:25
it. This is really
1:00:28
important, I think, as a point:
1:00:30
like, I edited that report,
1:00:32
just to be clear, and it was
1:00:35
a massive team effort,
1:00:37
you know, but
1:00:39
I was the one who edited, you
1:00:42
know, like, aggregated it, pulled it all
1:00:44
together. Like, Kate's team wrote one chapter.
1:00:46
People at Graphika wrote the foreign influence
1:00:49
section. You know, different people wrote different
1:00:51
subsections. And I was the person who
1:00:53
was doing the, you know, the big
1:00:55
final aggregation. And I mean, I cannot
1:00:58
tell you how many rounds of editing
1:01:00
and legal review, and just, like,
1:01:02
the anxiety I had for every fucking
1:01:05
sentence in that report, checking the citation,
1:01:07
going back, checking the citation, did we
1:01:09
get it right, right? Like, I think
1:01:12
I found one mistake at one point
1:01:14
where, you know, we issued a correction
1:01:16
and, you know, and put something out
1:01:19
after the fact, if I recall correctly,
1:01:21
it'd be up in the Stanford library,
1:01:23
you know, the stacks, just like a
1:01:26
URL that was not correct or something
1:01:28
along those lines. But the, um... You
1:01:30
know, I feel that way about everything,
1:01:33
even when I did the book, I
1:01:35
mean the sheer number of like hours
1:01:37
I spent, like both hiring fact checkers
1:01:40
and fact checking it myself and just
1:01:42
the, literally like the anxiety over not
1:01:44
wanting to be wrong because like your
1:01:47
credibility is on the line with this
1:01:49
stuff. Even the Senate report, man, just
1:01:51
knowing how many people were going to
1:01:54
go through that thing with a fine
1:01:56
tooth comb. And the one thing I
1:01:58
will say. is even after with all
1:02:01
the bullshit smear campaigns that Congress came
1:02:03
at us with they were never actually
1:02:05
able to find anything in it that
1:02:08
we didn't stand behind. They went with
1:02:10
smears and lies where they took these
1:02:12
numbers and turned them into bullshit because
1:02:15
they couldn't actually find anything that was,
1:02:17
like, wrong to use to discredit us,
1:02:19
so they went with lies instead,
1:02:22
or with, like, we don't like your
1:02:24
speech, so we're going to complain about
1:02:26
your speech, because we can't actually find
1:02:29
evidence, you know, and all the emails and
1:02:31
things. So what happens is, you know, the subpoenas
1:02:33
begin to come in, and of course we have
1:02:35
to turn over all of our documents, all of
1:02:37
our work product, all of the things, you
1:02:39
know, all the emails that we're being asked to
1:02:41
turn over, both with the government and with the
1:02:44
tech companies, and of course there are no
1:02:46
emails in which the government is telling
1:02:48
us to tell the platforms to do
1:02:50
anything. But what happens is, like, you
1:02:52
can't exonerate yourself. This is the thing
1:02:55
I think that people really need to
1:02:57
understand, right? When you're hauled in front
1:03:01
of an investigation this stupid, you actually
1:03:03
can't exonerate yourself. You have to think
1:03:06
about it as, like, the House Un-
1:03:08
American Activities Committee. Like, that is
1:03:08
the parallel here. You will never be
1:03:10
able to actually prove that you didn't
1:03:12
do the thing. Because when there were
1:03:14
not 22 million tweets, when you know,
1:03:16
I was like, let Elon turn over
1:03:18
the 22 million tweets, like, they claim
1:03:20
that they're the Twitter Files boys, like,
1:03:22
where are the 22 million tweets? You
1:03:25
know, when they don't have them, they
1:03:27
just move the goalposts. Oh, well, yeah,
1:03:29
maybe the government didn't actually
1:03:31
send you anything, but some analyst
1:03:33
flagged this tweet and we don't
1:03:35
like that you flagged it. And I'm
1:03:38
like, okay, but that's a complaint
1:03:40
about my speech. That's a complaint
1:03:42
about my First Amendment protected speech.
1:03:44
That's not censorship. That's my speech.
1:03:46
You know, and you just wind
1:03:49
up in this surreal universe where
1:03:51
nobody remembers the accusation that was
1:03:53
actually made about you and instead
1:03:55
they just complained about something else
1:03:58
and that is, like, that is the process
1:04:00
here. And for the point about
1:04:02
the sloppiness of the Twitter Files
1:04:04
and the reporters: they're never actually
1:04:06
held to account for it, because
1:04:08
they were actually asked by the
1:04:10
minority on the weaponization committee to
1:04:12
refile their testimony, because Mehdi Hasan
1:04:14
actually did, funny enough, go and
1:04:16
read the report and he had
1:04:18
Matt Taibbi on for an interview
1:04:21
about the Twitter Files, and he
1:04:23
pointed out that he was wrong,
1:04:25
you know, that we had
1:04:27
flagged 4,700 tweets, 4,700 URLs
1:04:29
in total. And he said, you
1:04:31
know, you were off by, like,
1:04:33
21,995,300 or something.
1:04:35
And why aren't you correcting this?
1:04:37
And they just never bothered to
1:04:39
correct it. So like that too
1:04:41
is, you know, sort of a
1:04:43
surprise, right? You can perjure yourself
1:04:45
if you're, you know, on the
1:04:47
right. If I can suggest a
1:04:49
title for the next report, if
1:04:51
that media ecosystem can offer us
1:04:54
any tips, it's maybe you should
1:04:56
call it Big If True. And
1:04:58
that should help you wiggle. The
1:05:00
art of the shameless, right? Yeah.
1:05:02
The tech platforms have also sort
1:05:04
of changed their stance on this.
1:05:06
You know, a platform like Facebook
1:05:08
still has some teams, not as
1:05:10
big, not as much resources as they used to,
1:05:12
you know, hunting for potential
1:05:14
foreign influence campaigns. But their external
1:05:16
funding of that kind of research
1:05:18
is pretty much gone. They're rolling
1:05:20
back a ton of content moderation.
1:05:22
Mark Zuckerberg wrote a letter to
1:05:25
Jim Jordan and was like, yes,
1:05:27
it's true, mean, mean, Joe Biden,
1:05:29
you know, threatened us if we
1:05:31
didn't pull content. And a lot
1:05:33
of these platforms are dealing with
1:05:35
European regulators, who actually have a
1:05:37
fucking backbone and, like, want to
1:05:39
regulate it. Is the approach
1:05:41
perfect? Is it correct? I don't
1:05:43
know, I don't want to take
1:05:45
a moral position, but they are
1:05:47
doing something, where the US government
1:05:49
has done, like, historically, not a
1:05:51
lot. And now it's like, after
1:05:53
they've spent multiple years pummeling these
1:05:55
platforms, pummeling researchers, and Facebook, I
1:05:58
would expect, behind the scenes, potentially,
1:06:00
you know, TikTok as it tries
1:06:02
to escape being banned in the
1:06:04
US, have these platforms sort of
1:06:06
changed their attitude, and sort of
1:06:08
not given up the fight, but
1:06:10
really just deprioritized it a lot?
1:06:12
The tone of Jim Jordan has
1:06:14
been like, oh, come here, sweet,
1:06:16
sweet, sweet, Mark Zuckerberg, my boy.
1:06:18
What are the Europeans doing to
1:06:20
you? Oh my God. And it's
1:06:22
like, now that they don't have
1:06:26
Biden to blame, they're planning to
1:06:26
go on the offensive. So I'm
1:06:29
just curious, how much are these
1:06:31
platforms just capitulating? Yeah, well, there's
1:06:33
a, I mean, there's so much
1:06:35
there. The first is that content moderation is
1:06:37
both hard and not always well
1:06:39
done, right. And that's the truth
1:06:41
of it. Content moderation in most
1:06:43
of what we've talked about has
1:06:45
been, you know, state actors and
1:06:47
political propaganda and what is the
1:06:49
correct response to political propaganda. So
1:06:51
one of the things that we
1:06:53
argued, I shouldn't even say we,
1:06:55
like that I argued actually, is
1:06:57
just that takedowns don't really work
1:07:00
a lot of the time. I
1:07:02
think people have different opinions on,
1:07:04
depending on what you study, right?
1:07:06
For people who study, maybe militia
1:07:08
groups, maybe violent speech, I think
1:07:10
people have different arguments about. platforming.
1:07:12
For a lot of the stuff
1:07:14
that I look at, the vaccine
1:07:16
speech, political speech, it just doesn't
1:07:18
work. It just goes somewhere else.
1:07:20
It doesn't actually make a difference.
1:07:22
And so a lot of what
1:07:24
I argued for over the years
1:07:26
was the labeling, was the community
1:07:28
notes, actually the community notes are
1:07:30
good, was the giving users more
1:07:33
control, was making recommendation engines better
1:07:35
on a variety of different axes.
1:07:37
And this is where, you know,
1:07:39
some of my current work, actually
1:07:41
looking at Bluesky as an
1:07:43
architecture that you can build on
1:07:45
top of, is actually trying to
1:07:47
get into that question of, when
1:07:49
we say better, what do we
1:07:51
mean, and how, and trying to
1:07:53
actually do those experiments.
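(To make "build on top of" concrete: on Bluesky's AT Protocol, a custom feed generator answers the getFeedSkeleton call with an ordered list of post URIs, which makes the ranking rule itself swappable. The sketch below is illustrative Python in that spirit, not her actual experiments; the post fields and both scoring heuristics are assumptions invented for the example.)

    # Illustrative only: two swappable ranking rules of the kind a custom
    # feed could serve. A real feed generator would return the ordered URIs
    # as a getFeedSkeleton response: {"feed": [{"post": uri}, ...]}.
    from typing import Callable

    Post = dict  # e.g. {"uri": str, "likes": int, "reply_depth": int, "followed": bool}

    def engagement_rank(p: Post) -> float:
        # Classic virality-first ordering: raw engagement wins.
        return p["likes"]

    def calmer_rank(p: Post) -> float:
        # Down-weight pile-ons (deep quote/reply chains) and up-weight
        # accounts the user actually follows.
        return p["likes"] / (1 + p["reply_depth"]) + (5 if p["followed"] else 0)

    def build_feed(posts: list[Post], scorer: Callable[[Post], float]) -> list[str]:
        return [p["uri"] for p in sorted(posts, key=scorer, reverse=True)]

    posts = [
        {"uri": "at://alice/1", "likes": 900, "reply_depth": 50, "followed": False},
        {"uri": "at://bob/2", "likes": 40, "reply_depth": 1, "followed": True},
    ]
    print(build_feed(posts, engagement_rank))  # alice's pile-on ranks first
    print(build_feed(posts, calmer_rank))      # bob (40/2 + 5 = 25) now beats alice (900/51 ≈ 17.6)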
1:07:55
When we talk about the shifts that Facebook
1:07:57
has made and the capitulations like
1:07:59
with that letter to Jordan, the
1:08:01
thing that was so interesting about
1:08:04
that letter to me was that
1:08:06
it was a response to
1:08:08
jawboning. So we went through this
1:08:10
entire series of like, you know,
1:08:12
the arguments in which Big Meanie
1:08:14
Joe Biden and the Biden censorship
1:08:16
regime, even during the time it
1:08:18
was run by the Trump government,
1:08:20
was, you know, was demanding all
1:08:22
of these things. He made that
1:08:26
capitulation, rumor was, for a while,
1:08:26
to try to get out of
1:08:28
a couple hearings and subpoenas himself.
1:08:30
I don't know how much it's
1:08:32
really going to pay off for
1:08:35
him. The FTC is still kind
1:08:37
of saber rattling over this. So
1:08:39
is the FCC. There's, you
1:08:41
know, Josh Hawley, I don't know
1:08:43
how much people pay attention to
1:08:45
hearings in the way that I
1:08:47
do, but Senator Schmitt, who actually,
1:08:51
you know, launched the Murthy v.
1:08:53
Missouri court case, which was one
1:08:53
of these kind of canonical cases
1:08:55
that SCOTUS actually wound up tossing
1:08:57
that made the allegations of the
1:08:59
censorship industrial complex and, you know,
1:09:01
that the Biden censorship regime was
1:09:03
trying to silence all of the
1:09:05
speech. SCOTUS found no evidence of
1:09:08
it. Tossed the case. Anyway,
1:09:10
Schmitt had a hearing on Tuesday
1:09:12
of this week. Performative
1:09:14
nonsense. The most interesting point of
1:09:16
the hearing was actually Josh Hawley
1:09:18
saying, even, you know, they had
1:09:20
these journalists, these right-wing journalists,
1:09:22
talking about being censored, and then
1:09:24
some free-speech-focused professors on
1:09:26
the left, for the minority, talking
1:09:28
about how the real censorship was,
1:09:30
you know, some of the things
1:09:32
that the administration is currently doing.
1:09:34
Anyway, Hawley came out and just
1:09:36
said like, look, for all of
1:09:39
what we're talking about, Mark Zuckerberg
1:09:41
may need to be held accountable. So
1:09:43
it's interesting to hear some of
1:09:45
the senators still saying, like, for
1:09:47
all of the capitulation and groveling
1:09:49
you have attempted to do, like,
1:09:51
we are still going to regulate
1:09:53
you. And it is interesting to
1:09:55
see that dynamic continuing to
1:09:57
come from some of the lead
1:09:59
figures in the administration. So I
1:10:01
think Zuckerberg is trying to buy
1:10:03
himself friends, but it's not clear
1:10:05
that that has been successful. On
1:10:07
the European front, you mentioned like
1:10:10
regulators with spines. This is going
1:10:12
to really be a test of
1:10:14
those spines, because if you followed
1:10:16
what JD Vance has gone over
1:10:20
to Europe to do recently, a
1:10:22
lot of the rhetoric has been that
1:10:22
content moderation of speech, even in
1:10:24
the European market, where they do
1:10:26
not have the First Amendment, where
1:10:28
they have different views of free
1:10:30
speech, right? Nazi speech is prohibited,
1:10:32
for example. Vance has made
1:10:34
the point that European content moderation
1:10:36
rules are censorship and that American
1:10:38
companies shouldn't have to abide by
1:10:40
them. And so the question is...
1:10:43
to what extent are the European
1:10:45
regulators really going to take a
1:10:47
stand and actually demand that the
1:10:49
American tech companies comply with their
1:10:51
laws, given that this could trigger
1:10:53
one of these stupid tariff wars
1:10:55
or the ways in which the
1:10:57
administration has been rather provocatively picking
1:10:59
fights with purported allies. It's quite
1:11:01
clear that Vance and others in
1:11:03
the administration don't have a particularly
1:11:05
high regard for Europe, quite a
1:11:07
lot of contempt there. And so
1:11:09
it's going to be very interesting
1:11:11
to see whether these laws are
1:11:14
actually, like whether Europe fights on
1:11:16
that sovereignty principle. I think you're
1:11:18
going to see Brazil encounter the
1:11:20
same dynamic as Musk and Zuckerberg
1:11:22
seemingly expect the administration to fight
1:11:24
this fight for them. I've been
1:11:26
talking a lot. Mike, do you
1:11:28
want to wind us down a
1:11:30
bit? Maybe we can talk about
1:11:32
like what's the answer to it.
1:11:34
Oh, no, I want to ask,
1:11:36
I want to ask a question,
1:11:38
which is, what's it like interacting
1:11:40
with Matt Taibbi? Which is, because
1:11:42
I've seen your Substack. And I
1:11:45
found it very entertaining in a
1:11:47
kind of gossipy media way because
1:11:49
just it's fun to see people's
1:11:51
private emails and like the attitude
1:11:53
they take. And I just found
1:11:55
him very petulant and ridiculous, is
1:11:57
the way I would describe it. Seems
1:11:59
like it's just a ridiculous person
1:12:01
who's kind of clinging on to
1:12:03
some idea that he's still a
1:12:05
journalist, but not really. How would
1:12:07
you describe these interactions where he's,
1:12:09
you know, back and forth reaching
1:12:11
out for comment and getting, you
1:12:13
know, and kind of getting fact-
1:12:15
checked by you, and vice versa?
1:12:18
You know, I think
1:12:20
dumbfounded is the word that comes
1:12:22
to mind every time I get
1:12:24
an email from him, but it's
1:12:26
been like this since the
1:12:28
very first email I got, which,
1:12:30
a long time ago, he wrote
1:12:32
me a note after I'd written
1:12:34
an article talking about Elon Musk
1:12:36
acquiring Twitter and he wrote me
1:12:38
this random email asking, like, how
1:12:42
dare I write an article critical
1:12:42
of Elon Musk? And I thought
1:12:44
it was just the most bizarre
1:12:46
thing I'd ever seen. And he
1:12:49
said, you know, why didn't you
1:12:51
reference and then made an allegation
1:12:53
about a past company I'd worked
1:12:55
at and something that they had
1:12:57
done wrong and why didn't I
1:12:59
disclose it or something? And I
1:13:01
said, you know, can you give
1:13:03
me the rules like? You write
1:13:05
about Title IX issues and you
1:13:07
got tagged for sexual harassment. I
1:13:09
don't see you disclosing that at
1:13:11
the top of every article you
1:13:13
write about it. So like, can
1:13:15
you lay out the rules? Like,
1:13:17
what are they? That's my very
1:13:20
first interaction and then it only
1:13:22
got better from there, I guess.
1:13:24
The, you know, when he started
1:13:26
the Twitter Files, he just sent
1:13:28
me a note that said, I
1:13:30
need you to comment on the
1:13:32
fact that you worked for the
1:13:34
CIA. And that was the only,
1:13:36
there was nothing like, there was
1:13:38
no question in this. Just comment
1:13:40
on the, just comment on the
1:13:42
fact. Yeah, that was literally it.
1:13:44
It was like, comment on this
1:13:46
publicly available information. Yeah, and I
1:13:48
thought like, what the hell does
1:13:50
that have to do with anything?
1:13:53
Like, yeah, when I was 20.
1:13:55
The smears started, and the way
1:13:57
that I have chosen... And I
1:13:59
was, like, maybe new to bad-
1:14:01
faith attacks, like, I'd gotten a
1:14:03
couple, but now, now that
1:14:05
I am an old pro at bad-
1:14:07
faith attacks after the last two
1:14:09
years, my response is
1:14:11
actually just to publish them,
1:14:13
right? And just to publish the
1:14:15
full, the full, unvarnished interaction, including
1:14:17
my responses so that people can
1:14:19
see what I said, what they
1:14:21
said, and that other people can
1:14:24
make up their mind. And my
1:14:26
frustration with Matt is that I
1:14:28
had requested corrections on many things
1:14:30
that he has gotten wrong. He
1:14:32
has taken quotes that I have
1:14:34
given and cut them in half,
1:14:36
just foundationally changing the meaning of
1:14:38
them. He has, again, like, just
1:14:40
been factually wrong. Not opinion, like,
1:14:42
I don't dispute his characterizations. I
1:14:44
get it. He wants to, you
1:14:46
know, smear me, spin things. Okay,
1:14:48
this is the way that shitty
1:14:50
media works. Okay, I get it.
1:14:52
But just on the factually
1:14:54
wrong things, like getting the dates
1:14:57
of my employment wrong, who I
1:14:59
worked with wrong, projects, he, you
1:15:01
know, he kept trying to act
1:15:03
as if I worked on Hamilton
1:15:05
68. I did not; I had
1:15:07
nothing to do with it. These
1:15:09
sorts of things, where I'm like,
1:15:11
can you just, like,
1:15:13
just tell the truth? You know
1:15:15
what's interesting also? The Elon Musk
1:15:17
aspect of it, because this is,
1:15:19
this guy, like, really kind of
1:15:21
presents himself as, like, you know,
1:15:23
Jimmy Journalist speaking truth to power,
1:15:25
and it's like, is there any,
1:15:28
is there, have
1:15:30
we ever encountered anyone during our
1:15:32
lifetimes as powerful as Elon Musk?
1:15:34
I mean, the guy literally just
1:15:36
bought his way into being president,
1:15:38
you know, and, you know, in
1:15:40
so many words, I can't imagine
1:15:42
anyone as powerful as Elon Musk,
1:15:44
and yet, the richest man
1:15:46
on the planet, our Elon Musk,
1:15:48
in your little press cap, being
1:15:50
like, yeah, I'm, you know, sticking
1:15:52
up for the little guy and
1:15:54
doing my, you know, you know,
1:15:56
I'm inflicting, like, pain
1:15:59
on the powerful, and, like,
1:16:01
taking orders from this, like, cosmic
1:16:03
freak. How dare the richest man
1:16:05
on the planet face criticism from
1:16:07
the towering behemoth Renée DiResta?
1:16:09
It's amazing. I did see
1:16:11
that a lot. Pick on
1:16:13
someone your own size, Renée,
1:16:15
right? It wasn't even
1:16:17
a main article in the
1:16:20
Atlantic. It was, like, fairly
1:16:22
fair, I thought. When you
1:16:24
report on the kind of, the
1:16:26
sort of, the MAGA right, or
1:16:29
even just white nationalists or
1:16:31
whatever. There's a lot of
1:16:33
effort to sort of portray people
1:16:35
in our position as being
1:16:37
just overwhelmingly powerful. It's like
1:16:40
it blows my mind, you
1:16:42
know, considering the type of
1:16:44
very powerful people that they fluff
1:16:46
up all the time. I just really
1:16:48
felt like, look, if you want to,
1:16:50
if you want to criticize the
1:16:52
work that I did, I would just
1:16:55
like you to actually criticize the work
1:16:57
that I did, not make up 22
1:16:59
million tweets or a, you know, or
1:17:01
a, in this particular case, in my
1:17:03
most recent engagement with him, he was
1:17:05
trying to discredit the idea that Russia
1:17:07
interfered in the election by insinuating
1:17:10
that the number that Facebook had
1:17:12
put out that 126 million people
1:17:14
had seen and engaged with the
1:17:16
Russian content was really a number
1:17:18
that could be dismissed because it
1:17:20
came from me and my report.
1:17:22
And I kept saying it doesn't come
1:17:24
from me, it doesn't come from
1:17:26
my report, it comes from Facebook.
1:17:28
And I pointed that out with
1:17:30
links, with dates, with like, you
1:17:33
know, it's unambiguously true. There is
1:17:35
no universe in which it's not true,
1:17:37
and I just could not get over
1:17:39
the fact that I had to have
1:17:41
six back and forths about this, and
1:17:43
that rather than just correcting it,
1:17:45
he then turned it into, like, well, DiResta
1:17:47
quibbled with the number. And I
1:17:50
didn't, I didn't quibble with the number
1:17:52
at all, actually; I took it at
1:17:54
face value because it was a reasonable
1:17:56
assessment. But I quibbled with their Instagram
1:17:59
number, because it wasn't. And, you know,
1:18:01
it was just, like, it's just
1:18:03
an extraordinary thing to pick a
1:18:05
fight over, and, you know,
1:18:07
it, like, ate up a Saturday,
1:18:09
and then I thought, okay, I
1:18:11
can't do anything more with this,
1:18:13
and I'm just gonna go on
1:18:15
with my life. Well, that's what
1:18:17
I wanted to know, I just
1:18:20
wanted to know what it's like
1:18:22
to interact with Matt
1:18:24
Taibbi, particularly this new sort of
1:18:26
mutated version of him that is,
1:18:28
like, now just, you know, completely
1:18:30
beholden to... billionaire money. I think
1:18:32
all you can do is
1:18:34
put out the full exchange
1:18:36
when you deal with bad-faith
1:18:38
media. There are plenty, and I
1:18:40
don't, I don't even, I deal
1:18:42
with conservative media and I get
1:18:44
plenty of reasonable inquiries, and if
1:18:46
I don't feel like it's an
1:18:48
immediate bad-faith inquiry, I don't
1:18:50
screenshot it and toss it on
1:18:52
the internet. But at this point,
1:18:55
whenever something comes in from him,
1:18:57
it's a screenshot and toss it
1:18:59
on the internet; there's no way
1:19:01
it's going to go well. I'm
1:19:03
excited to see if this podcast
1:19:05
episode makes it into Taibbi's lore.
1:19:07
Anyway, Renee, thanks so much for
1:19:09
spending some time with us today.
1:19:11
Listeners, go check out Renee's book.
1:19:13
It's called Invisible Rulers: The People
1:19:15
Who Turn Lies Into Reality. I
1:19:17
can't say enough good things about
1:19:19
the book. It really is great,
1:19:21
Renée, do you have anything else
1:19:23
people should check out? Bluesky,
1:19:25
mostly, and I haphazardly write newsletters sometimes.
1:19:27
Oh yeah, where can people see
1:19:30
the Matt Taibbi email exchanges? Oh,
1:19:32
that's on Substack, because that
1:19:34
actually gets good SEO as opposed
1:19:36
to newsletters. But yeah, so just
1:19:38
Substack, and Renée DiResta. Maybe,
1:19:40
Jared, can we link to it?
1:19:42
Yeah, yeah, we'll put a link
1:19:44
down in the episode description. Yeah.