Episode Transcript
0:00
This is somebody who has been making videos about
0:02
me for months. You know, the next thing I
0:04
knew someone was posting pictures of the inside of
0:06
my apartment. Remember
0:10
Gamergate? Women speaking out online
0:12
have also been met with attacks
0:14
via social media using the hashtag
0:16
Gamergate. What started as an online
0:19
spat about the ethics of gaming
0:21
journalism quickly escalated into a full-blown
0:23
culture war. It helped catalyze the
0:25
alt-right movement, which secured power and
0:27
prominence during the 2016 election cycle.
0:29
Well, it's happening again. Back in
0:32
March, journalist Alyssa Mercante wrote a
0:34
story about how Sweet Baby Inc.,
0:36
a video game consultancy, had been
0:38
helping gaming companies attract more diverse
0:40
audiences. Alyssa's story was deeply reported
0:42
and factually accurate, but the reaction
0:44
to it quickly took on a
0:47
life of its own. Within days, Mercante
0:49
had been threatened, doxed, and was the
0:51
target of a vicious hate and harassment
0:53
campaign that's become known as Gamergate 2.0.
0:55
Today I'm talking to Alyssa about what
0:57
it's like to be at the center
0:59
of an organized smear campaign, how she's
1:01
fighting back, and why the mainstream media
1:04
continues to be complicit in these attacks.
1:06
Hi Alyssa, welcome to Power User. Hi,
1:08
thanks for having me. All right, so
1:10
for people who don't know who you
1:12
are, you're a pretty well-known online culture
1:14
video game journalist. How did you get
1:16
started in this space? I've been writing
1:18
about video games for my whole life,
1:20
but in terms of it being my
1:22
career, it's been the last five or
1:24
six years. I was at GamesRadar,
1:27
and then I went to Kotaku, which
1:29
is owned by G/O Media, and my
1:31
reporting has always been on video games
1:33
and the culture surrounding them.
1:35
So you wrote this article about a
1:37
company called Sweet Baby that really blew
1:39
up and proved to be very divisive
1:41
on the internet. It also seems to
1:43
have sparked this months long harassment and
1:45
abuse campaign. Tell me about the article
1:47
and kind of what you hope to achieve
1:49
by writing it. What was the crux of
1:51
that story? Sure. So in the darker corners
1:53
of the internet and message boards that probably
1:55
the average internet user would never come across,
1:57
I noticed a conspiracy theory was kind of
1:59
making the rounds about this particular consulting
2:01
group and what they do in the
2:04
video game industry. Consulting groups in the
2:06
video game industry are a lot like
2:08
how they are in the movie industry.
2:10
They'll get called in by studios making
2:12
games and do a pass on the
2:15
script or they'll offer insights into a
2:17
character. Sometimes they do sensitivity reads. It's
2:19
kind of like they wear many hats
2:21
and they do very very different things.
2:24
This one consultancy group Sweet Baby Inc
2:26
was being accused of forcibly injecting
2:28
woke ideologies and characters into contemporary big
2:30
budget video games. And it was being
2:33
used as sort of a cudgel as
2:35
to why there was a black lead
2:37
character in, you know, an award-winning game
2:39
from that year. And I started looking
2:42
into the people who had created this
2:44
group ostensibly monitoring any game that this
2:46
consultancy firm Sweet Baby had been involved
2:48
in and found that they had their
2:50
own Discord, which is like an instant
2:53
messaging service commonly used in the games
2:55
industry, used by people who are looking for
2:57
people to play other games with, and
2:59
that Discord was full of racist vitriol,
3:02
homophobic slurs, misogyny, all the kinds of
3:04
stuff that I hadn't really seen at
3:06
that level online in a while. So
3:08
you notice that Sweet Baby, which is
3:11
this consulting firm, had been increasingly involved
3:13
in all of these video games that
3:15
were being, it seems like the target
3:17
of backlash, like you mentioned. So you
3:20
kind of dug into what was fomenting
3:22
that backlash and found this group of
3:24
people that were really dedicated to tracking
3:26
this consultancy firm, Sweet Baby, and it
3:29
seems like producing coordinated harassment campaigns against
3:31
any game that I guess deigned to
3:33
partner with this consulting firm. Yeah, and
3:35
honestly, once the piece that I published
3:37
went live, and that piece also included
3:40
interviews with the people who work for
3:42
Sweet Baby, Inc., because the goal of
3:44
the piece was to sort of tell
3:46
you exactly what a consultancy firm does
3:49
and what it doesn't do, and try
3:51
and debunk this theory in the clearest
3:53
terms possible. Unfortunately, the moment that piece
3:55
went live, a new theory cropped up,
3:58
which was that games journalists were running
4:00
defense for this company and then it
4:02
just kind of the attacks spun off
4:04
of that and just started getting worse
4:07
and worse. So any game that was
4:09
even remotely connected to Sweet Baby Inc,
4:11
anybody who worked at Sweet Baby Inc,
4:13
a lot of people who were just
4:16
women or people of color or queer
4:18
folks who were writing about this or
4:20
talking about this online, they were just
4:22
the target of this harassment campaign and
4:24
it has not stopped. It's been non-stop
4:27
since March actually. As you mentioned, there
4:29
are so many of these consultancy groups
4:31
that go around and consult on
4:33
things like diversity and also just helping
4:36
these game companies, like you mentioned, appeal
4:38
to different audiences, right? You don't want
4:40
to just appeal to one demographic
4:42
when you're making video games. Why do
4:45
you think Sweet Baby specifically has become
4:47
such a target? I think it's a perfect
4:49
storm of stuff. I think it doesn't
4:52
help that Sweet Baby is run by
4:54
a black woman named Kim Belair. I
4:56
don't think it helps that... the video
4:59
game industry which grew exponentially during the
5:01
height of the pandemic where more and more
5:03
people who had never been involved in this industry
5:05
before were buying game consoles and playing games,
5:08
and all of the video game studios
5:10
started hiring more people looking to make more
5:12
money which resulted in mass layoffs the last
5:15
two years. I think people have also noticed
5:17
that there is a perceived decrease in quality
5:19
of some types of games, you know, games
5:21
that are made with microtransactions so that you'll
5:24
buy the game but then you'll keep buying
5:26
stuff in that game because you want to
5:28
wear a cool outfit or you want to
5:31
have a cool weapon. And those games are
5:33
launching and then failing, kind of miserably. And
5:35
Sweet Baby Inc. was involved in one or
5:37
two games that did not do very well.
5:40
But unfortunately what happened with Sweet Baby is
5:42
it just became this kind of catch-all term
5:44
for anything that they deemed to have any
5:47
sort of... DEI in it, diversity, equity, and
5:49
inclusion. So if there was something that had
5:51
a black character or a female lead, it
5:53
was sweet baby treatment. Sweet baby must have
5:56
touched it. So now it's kind of spun
5:58
out into a lore that, in many
6:00
cases, is completely untrue; this development
6:02
company hasn't even worked on most of
6:04
these games. About a decade ago, there
6:06
was this movement online called Gamergate,
6:08
which obviously you and I
6:11
are very familiar with. It was this
6:13
really coordinated harassment campaign against a bunch
6:15
of women in video games journalism, bad
6:17
faith attacks, and smears. And this
6:19
ultimately served as a sort
6:21
of blueprint for how the far right
6:23
would come to weaponize the internet. That
6:25
received quite a bit of attention after
6:28
the fact. There was a bunch of
6:30
think pieces about it in 2017,
6:32
2018, 2019, like wow, wasn't that bad,
6:34
right? Good thing that was over. It
6:36
does seem like, though, things kind of
6:38
went underground or at least weren't so
6:40
bad for a while. In the past
6:43
year, really, with this new sort of
6:45
campaign against Sweet Baby and some of
6:47
the journalists involved like you, it seems
6:49
like we're experiencing almost a Gamergate
6:53
2.0. What do you think has
6:53
led to this resurgence? There's so many
6:55
things and I keep saying perfect storm
6:57
and it feels like a cliche that
7:00
I've repeated so many times when talking
7:02
about this, but it is sort of
7:04
that, it being an election year, and
7:06
all of the kind of political disagreements
7:08
and conversations that happen around an election
7:10
year certainly make the internet an interesting
7:12
place. I mean we know that the
7:14
far right and the people who helped
7:17
prop up Donald Trump's first run for
7:19
president were directly pulling from the playbook
7:21
of Gamergate. And I don't think it's a
7:23
coincidence that we started seeing it kick
7:25
up again during an election year. And
7:27
you can see that there are people
7:29
who are involved in the current presidential
7:31
election saying very similar things to
7:34
what the Gamergate 2.0 people, or the
7:36
people who are inciting this harassment campaign
7:38
say in terms of, you know, this
7:40
company's infested with DEI or we need
7:42
to end wokeness. And the issues that
7:44
the gamers have with video games and
7:46
the diversity of the industry can easily
7:48
be applied to larger political issues that
7:51
they may have. And this sort of
7:53
belief that there's this massive, you know,
7:55
countrywide conspiracy to elect people or put
7:57
people into jobs who don't deserve it
7:59
because they're a minority or to cancel
8:01
people because they don't think that that's
8:03
fair. I think that combined with the
8:05
way that Gamergate just kind of
8:08
eventually disappeared also meant that a lot
8:10
of the major players in the video
8:12
game industry didn't really have to acknowledge
8:14
it. You didn't have to have Microsoft
8:18
say, this is screwed up and you
8:20
shouldn't do this, or Sony say,
8:23
we stand with the people
8:25
who you are accusing of not being
8:27
ethical. These companies just stayed quiet,
8:29
praying that they wouldn't get the eye
8:33
of Sauron on them, that they wouldn't
8:35
start to get these people looking at
8:37
them and attacking them. And unfortunately, that
8:40
ends up being a sort of tacit
8:42
approval of the way that these people
8:44
are behaving, or at least giving them
8:46
the space to feel like, well, they
8:48
can continue to get away with this.
8:50
It seems like this campaign has obviously
8:52
spanned the internet. It's all over YouTube,
8:54
Twitch, every social platform, but so much
8:57
of this coordinated harassment and attack stuff
8:59
seems to be happening on Twitter, or
9:01
X rather now, especially under Elon. What
9:03
role do you think has Elon's X
9:05
and its sort of toxic algorithm played
9:07
in boosting this Gamergate 2.0? There's a
9:09
couple of things. First of all, the
9:11
terms of service that X slash Twitter
9:14
has in place are clearly not being
9:16
policed or monitored as well as they
9:18
should be or as well as they
9:20
were before Elon Musk took over. There
9:22
are clear-cut violations of terms of
9:24
service. I mean, people with slurs in
9:26
their names. But it's not just the
9:28
terms of service, right? Like I guess
9:31
what role do you think the actual
9:33
algorithm plays and the fact that you
9:35
have these right-wing accounts amplifying outrage and harassment. I
9:37
mean, you can buy a blue check
9:39
now and you can farm engagement off
9:41
of that. You can pay to
9:43
have your account have a blue check
9:45
next to it and earn money off
9:48
of all of the engagement on your
9:50
tweets. And so if you tweet something
9:52
incredibly controversial or something that's going to
9:54
get a lot of interactions, you're going
9:56
to get more money. And those things
9:58
are also boosted through the algorithm to
10:00
people's home pages. I mean, I obviously
10:03
have another Twitter account that I used
10:05
to monitor things, and that account's
10:07
entire home page is just filled
10:09
with absolute horrible, racist, homophobic, transphobic, harassment,
10:11
abuse, all of these things, and it's
10:13
just not stopping, it's not being taken
10:15
down, and it's being pushed to the
10:17
top of the page and pushed into
10:20
people's eyes, and they're rewarding that by
10:22
giving people monetary payouts for when their
10:24
tweets hit big. Talk to me about
10:26
the structure of these harassment campaigns because I
10:28
feel like there's a few big people at
10:30
the top that are these mainstream content creators
10:32
that are sort of instigators, but they love
10:35
to wipe their hands of these things and
10:37
say, well, I didn't do anything. I
10:39
didn't personally call for X, Y, Z.
10:42
Can you sort of walk me through
10:44
the hierarchy of how these different sort
10:46
of harassers operate and where they land?
10:48
There are usually, in these kinds of
10:51
harassment campaigns a few leaders who are
10:53
most of the time very careful
10:55
with what they say and what
10:57
they share. It's rare that you're
10:59
going to see someone who has
11:01
150,000 Twitter/X followers outright incite
11:03
violence or threaten someone or
11:05
anything like that. But what they
11:07
will do is they will point
11:09
ceaseless harassment at or they will
11:11
point endless fingers towards specific people
11:13
in a way that they think
11:15
is coy, by saying, oh, well,
11:17
I know this person and here's
11:20
a picture of clearly a person
11:22
of color is involved in this
11:24
game that didn't do well and they
11:26
probably don't like you the gamer and
11:28
they rile up a base of smaller
11:30
content creators who tend to be a
11:32
little bit more loose with the stuff
11:34
that they say a little bit less
11:36
careful with what they're inciting, a little
11:39
bit more harassy with their videos and
11:41
their comments, and then below that you
11:43
get sort of like anonymous foot soldiers
11:45
that are just in droves. And those
11:47
are the people who more often than not
11:49
are the ones in my inbox or
11:51
in other people's inbox with outright threats
11:54
or in my emails, in my
11:56
Instagram, finding my family's information, those
11:58
anonymous people who are watching all
12:00
of this content, taking in all of
12:02
this content, and being pointed towards people
12:05
that they should be angry towards, for
12:07
some reason. They're not sure why. And
12:09
that's when you end up getting this
12:11
sort of wave and crest of harassment,
12:13
because the leaders are just innocently pointing
12:15
out people, but the undertones are very
12:17
clear what they want to happen there,
12:19
and it's the foot soldiers that will
12:22
always carry that out. So you wrote
12:24
this story on Kataku, and it kind
12:26
of blows up. It becomes this viral
12:28
piece of journalism that a lot of awful people
12:30
hate. Tell me what happened next
12:32
to you. Part of the reason why
12:34
it got so bad for me in
12:37
terms of harassment was because I refused
12:39
to back down and I refused to
12:41
apologize for nothing that I had done
12:43
wrong, but just for being who I
12:45
was. And in the immediate aftermath, there
12:47
was a lot of finger pointing that
12:49
this piece was untrue or purposefully leaving
12:52
things out or none of this is
12:54
correct. And when I was pushing back
12:56
at that, that started kind of the
12:58
waterfall of harassment. So I started becoming
13:00
this kind of stand-in for everything that
13:02
was wrong with games journalism. My face was
13:04
plastered over hundreds of YouTube videos, watching
13:07
my every move. I was getting threats
13:09
in my work email and my personal
13:11
email. I had to warn my parents
13:13
to be careful because their address was
13:15
circulating the internet. And it was just
13:17
kind of your bog standard harassment campaign.
13:19
They go after every piece of social
13:22
media you have. They try and find
13:24
every bit of personal information they can,
13:26
kind of obsessing over everything you do.
13:28
I was joking that they like probably
13:30
had my menstrual cycle charted, but it
13:32
really feels like that at a certain
13:34
point. Like it is nonstop and it's
13:37
every breath you take. Somebody is archiving
13:39
it and putting it in these darker
13:41
corners of the web, which eventually bleeds
13:43
into the more typical places that people
13:45
would go like Twitter and YouTube. And
13:47
how has that affected your ability to
13:49
do journalism and to further cover these
13:52
issues? I mean it does feel in
13:54
some ways like you're a pariah, like
13:56
there's a lot of people in the
13:58
industry who have been incredibly supportive
14:00
to me in private but not as
14:02
supportive as I'd like them to be
14:04
in public and you know my company
14:07
when I was at G/O Media was very
14:09
clearly unhappy with what was going on
14:11
and very clearly unhappy with the way
14:13
that I was reacting
14:17
to it and it ended up reaching
14:19
a head where I could no longer work
14:22
there. It wasn't working for anybody, and
14:24
now I'm in this sort of in-between
14:26
where I really want to keep reporting
14:28
on this stuff. It's why when people
14:30
have been making videos about me, they've
14:32
said, well, why didn't you just go
14:34
the way of Zoe Quinn and Anita
14:36
Sarkeesian, who were two women that were
14:39
kind of focal points of the first
14:41
Gamergate, who drastically decreased their internet
14:43
presence over the last decade.
14:45
They want you to go away
14:49
and they want to scare you off
14:51
the internet. And I just kind of
14:54
am refusing to go away. And I've
14:56
seen it in a small scale with
14:58
my recent freelance work at Rolling Stone. They're
15:00
just, anytime they see my name anywhere,
15:02
they're jumping on it, like dogs with
15:04
a bone. And it really is upsetting
15:06
because at the end of the day,
15:09
you know, I was doing my job
15:11
and I was doing it well,
15:13
and I think being
15:19
put in a position where you feel
15:21
like you might not be able to
15:24
do it anymore is a complete bummer.
15:26
I want to ask about the media and online culture, as you mentioned, and video games,
15:28
because I think these are very hyper-online,
15:30
engaged, radicalized communities. Why do you
15:32
think media organizations are so scared? Like
15:34
why do you think they have learned
15:36
nothing from Gamergate 1.0?
15:39
Like was there any improvement here? Because
15:41
it seemed like there was this whole
15:43
reckoning around it, but now we have
15:45
it happening again and those same media
15:47
companies that were publishing those think pieces
15:49
are staying quiet. It comes down to
15:51
what's going to affect your bottom
15:54
line. And that's also why I
15:56
think the video game industry leaders and
15:58
these companies and publishers stay quiet as well
16:00
because I think they worry that the
16:02
moment that you take a stance against
16:04
something like this you alienate a part
16:06
of your audience and in a media
16:09
landscape where websites are shuttering left and
16:11
right and shutting down or selling or
16:13
people are getting laid off I think
16:15
there's a fear that you would alienate
16:17
people by loudly standing behind your employees
16:19
who are getting harassed. In some cases
16:21
I also think there are some people
16:24
who are at the tops of these
16:26
businesses that aren't necessarily all that worried
16:28
about the way that these people behave
16:30
and maybe they don't necessarily disagree with
16:32
some of the things these people say.
16:35
The people who are in positions of
16:37
power at these places are there for
16:39
a reason and I don't think they
16:41
want anything that could upset that power
16:44
balance. And I think seeing that somebody
16:46
is getting sort of endless abuse for something
16:51
that is a fairly left-leaning but
16:53
innocent stance on something scares people who
16:55
are just worried about their KPIs.
17:00
They're really just worried about their page
17:02
views. They're worried about how much money
17:04
they can make for the site. And
17:06
they think if we make a statement
17:08
saying, stop harassing someone, that section of
17:10
the internet will no longer entertain us
17:12
and no longer read us. And I
17:15
think it really just is, it's cowardice,
17:17
pure and simple. In some cases, I
17:19
think people are, like I said earlier,
17:21
afraid of that eye turning towards them
17:23
and attacking them. But that's why I've
17:25
called for video game industry leaders and
17:27
companies to issue a statement saying they don't stand with these bad
17:30
actors online. It could just be something as
17:32
simple as, the video game industry is diverse,
17:34
the people writing about it are diverse. There's
17:36
no conspiracy theory here. Please stop harassing people.
17:38
We don't stand with it. Signed, every big
17:40
leader in the industry, but that's not happening.
17:42
I feel like there's also, you know, what
17:44
these people fundamentally prey on is also the
17:46
public's lack of media literacy. These independent content
17:49
creators are able to kind of go out,
17:51
they're able to weaponize the traditional media, because
17:53
they know the traditional media is certainly not
17:55
going to cover these campaigns. The traditional media
17:57
doesn't really cover harassment campaigns against marginalized people
17:59
generally. So they're able to kind of push
18:01
these narratives unchecked. There's also this idea from
18:03
a lot of the same people in positions
18:05
of power of don't feed the trolls, right?
18:07
You hear that saying all
18:09
the time, they say, don't feed the trolls.
18:11
And it seems like that kind of ignores
18:14
the way the internet is currently set up,
18:16
right, where there are these people that have
18:18
these powerful megaphones that can push these really
18:20
dangerous and defamatory narratives about someone. And if
18:22
you don't respond, I feel like you just
18:24
create this vacuum for these narratives to thrive.
18:26
So I guess how do you feel about
18:28
that "don't feed the trolls" advice and how
18:30
have you sought to fight this misinformation and
18:32
smear campaign against yourself? It's so funny that
18:34
you say that because that was actually something
18:36
that was said to me multiple times by
18:38
some... people who I will not name when
18:40
I was still working at Kotaku, and it
18:44
frustrated me twofold. One, exactly what you're saying:
18:46
the conversation of, well, this is defamation against
18:48
me. It's also insinuating or saying outright that
18:51
my journalistic practice is unethical or that
18:53
I'm lying throughout a published piece, and not
18:55
saying anything about that kind of stuff,
18:57
again, is a little bit of a tacit
18:59
approval. It's a little bit of, ooh,
19:01
I don't want to say anything
19:01
because I don't want them to pick that
19:03
up and run with it, but when you
19:05
don't say anything, they're running with it anyway.
19:07
And I've learned that even when I don't
19:09
say something that could piss them off, they
19:11
will find something to get this outrage machine going.
19:13
So I believe you have to be smart
19:15
with what you're pushing back on. And in
19:17
some cases at first, I kind of felt
19:19
like I was Captain America in that one
19:21
scene, whatever movie it is, where he fights
19:23
like eight guys in an elevator. I kind
19:25
of felt like I had to just go
19:28
at everybody who was saying everything ridiculous about
19:30
me because the ridiculous level got to something
19:32
that was, I mean, it was comical at
19:34
a certain point if it wasn't tied to
19:36
such scary violent threats, but it got exhausting:
19:38
things that are just
19:40
really easily disproven. Things
19:42
that these big leaders
19:44
are saying about how
19:46
Call of Duty added
19:48
bullets that had the
19:50
trans flag on them
19:52
so you can kill
19:54
people as a trans
19:56
person. And I had
19:58
to debunk that because
20:00
I was like that's
20:02
an easy thing to
20:04
knock down and say
20:07
this is completely untrue.
20:09
But what's frustrating is
20:11
the outraged people, the
20:13
people who are consistently
20:15
feeding this outrage machine,
20:17
you can prove that
20:19
something's untrue unequivocally and
20:21
they'll just move on
20:23
to the next thing.
20:25
And so I don't
20:27
really know what else
20:29
can be done aside from kind of
20:31
constantly going, that's wrong, that's wrong,
20:33
because without these people being deplatformed from
20:35
places like Twitter or YouTube for spreading
20:38
disinformation or other things that obviously get
20:40
into illegal territory, they're just gonna keep
20:42
doing this because they're still getting the
20:44
views and they're still getting the likes.
20:46
But the other thing that I thought
20:48
about don't feed the trolls is how
20:50
inherently violent it feels to tell a
20:52
woman being incessantly harassed online that their
20:54
behavior has something to do with why
20:56
that's happening. It really feels like when
20:58
someone says that they were sexually assaulted
21:00
and somebody asked them well what were
21:03
you wearing or you shouldn't have had
21:05
so much to drink. It just feels
21:07
like your mere existence and the fact
21:09
that you don't want to lay down
21:11
and take it is why it's happening
21:13
to you. And it's so upsetting and
21:15
frustrating. And also if you do lay
21:17
down and take it, it's worse right?
21:19
Like I think that's what especially these
21:21
sort of people in positions of power
21:23
that make that comment. They think that
21:25
if you don't respond it goes away.
21:28
That's not true. They just don't have
21:30
to see your response to it, right?
21:32
It's out there. It's manipulating people's perceptions
21:34
of you. I also think it's a
21:36
huge problem. I mean you mentioned that
21:38
you had to leave your job and
21:40
stuff that people that run these media
21:42
companies and are in positions of power,
21:45
they will throw any woman right under
21:47
the bus if there's any hint of
21:49
controversy whether it's deserved or not. And
21:51
I think that that is really scary
21:53
because we see men court controversy all
21:55
the time. We see straight men able
21:57
to ride these cycles out and they're
21:59
like, well, you know, he's really bombastic
22:02
and he's a fighter. You know, they're
22:04
never framed as, like, controversial.
22:06
Men can use controversy and it is
22:08
seen as like them standing up for
22:11
themselves or them taking control of the
22:13
narrative or you know fighting back bravely
22:15
against you know these bad actors when
22:17
women try to do anything like that
22:20
they're chastised. And like you said, these
22:22
media companies will run scared, they will
22:24
throw you under the bus so fast
22:26
because they believe falsely that throwing you
22:29
under the bus will quell the controversy.
22:31
In fact, it's like
22:33
throwing, you know, red meat to the
22:35
sharks, right? Absolutely. I mean, I've said
22:38
this before, like, cancel culture does exist,
22:40
just not in the way that the
22:42
right wing people would have you think
22:44
it exists, because you can see someone
22:47
like Johnny Depp and that entire case
22:49
against Amber Heard, oh no, it's gonna
22:51
ruin his career, oh, she's trying to
22:53
cancel him. Well, she's now living a quiet life in another
22:56
country with their kid. And I think,
22:58
you know, I saw that firsthand. I
23:00
saw how quickly this bombardment of people
23:02
sending emails to my job and sending
23:05
emails to places that I'm freelancing at,
23:07
this clear attempt to deplatform me, which
23:09
is so ironic considering how often the
23:11
right wing screams about cancel culture and
23:13
how it's ruining everything. But the only
23:16
time I've ever seen someone actually suffer
23:18
from cancel culture or have any sort
23:20
of ramifications for anything, has been a
23:22
woman. or a person of color or
23:25
a queer person because they dared to
23:27
push back against the ruling elite and
23:29
the other people around them. And I
23:31
just think it's ironic that we're sitting
23:34
here worrying and bleating about that when
23:36
the only people who have suffered from
23:38
it are the ones who are already
23:40
in a bad enough position. It's incredibly
23:43
frustrating. Yeah, well I think one way
23:45
at least I found to kind of
23:47
help mitigate these harassment campaigns is to
23:49
cover them and to educate people about
23:52
them so that when they see a
23:54
campaign like this, they say, oh, that's
23:56
what this is. Oh, this is coordinated
23:58
networked harassment, and recognize it for what it
24:03
is instead of just saying, why is
24:05
Alyssa so controversial or whatever. But I've
24:05
noticed that the mainstream media, traditional media
24:07
generally, doesn't... cover these things. Like, you
24:10
know, for someone like you, right? Like,
24:12
where do you even go to get
24:14
your story out? It's not like there
24:16
are journalists in these traditional outlets that
24:19
even talk about these massive campaigns that
24:21
are happening online. Generally, the traditional media
24:23
tends to ignore the internet, I think,
24:25
unless they're writing about how, like, Among
24:30
Us, you know, is a problem. That's
24:30
the extent of the games journalism. How
24:32
do you educate people about these campaigns?
24:34
How can we educate the public on
24:37
how to recognize coordinated, you know, smear
24:39
campaigns. You're absolutely right. I spoke to
24:41
the CBC, the Canadian Broadcasting Corporation, like
24:43
immediately after my piece went live and
24:46
the harassment started in March. It was
24:48
an incredible interview. I was really proud
24:50
of it. We talked about the harassment
24:52
campaign and then it was very clear
24:55
that some people did not want me
24:57
to continue to talk about this anymore.
24:59
So I kind of went quiet in
25:01
terms of going and looking for places
25:03
to have this conversation, but I kept
25:06
having it on TikTok or on my
25:08
Instagram stories, just showing people, just kind
25:10
of the massive amounts of harassment and
25:12
abuse that I would get on the
25:14
daily. I mean, today, somebody sent me
25:16
a picture of their penis in an
25:18
Instagram message saying, you know, since you're
25:20
such a whore, you're gonna have this
25:22
in your mouth soon. And I think
25:24
people really don't realize that when it
25:27
happens like this and when it happens
25:29
to you, it is kind of a nonstop
25:31
barrage. Any time you pick up your
25:33
phone, there's a very likely chance that you
25:35
have something in one of your inboxes that
25:37
would make my grandmother faint at the dinner
25:39
table. I would love for traditional media to
25:41
cover this more. I've seen actually a bunch
25:44
of the content that you've posted. I think
25:46
you posted a great TikTok recently sort of
25:48
documenting your emotions and going through this over
25:50
the past few months, and I feel like
25:52
you've done a lot to raise awareness about
25:54
this on your own channels. You're also pursuing
25:56
legal action, which I think is something that
25:58
most people don't do. Tell me about your lawsuit
26:01
and, you know, your decision
26:03
to pursue justice through that system.
26:05
Yeah, I mean, the first time
26:07
Gamergate happened in 2014, the internet
26:09
was such a foreign concept to
26:11
most lawmakers, judges, attorneys, that most
26:13
of the women who kind of
26:15
raise these issues even with their
26:17
police departments when they would call
26:19
and say, hey, somebody's threatening to
26:22
come to my house. The overall
26:24
response would just be kind of
26:26
one of confusion. What do you
26:28
mean there's someone online is threatening
26:30
you? They're not there in person,
26:32
right? So you're fine, right? It's
26:34
not an issue, right? On a
26:36
legal end, it was very difficult
26:38
to kind of nail down exactly
26:40
what these people were doing that
26:43
was illegal because there just weren't
26:45
very many instances of lawsuits happening
26:47
or proceedings going through that made
26:49
it very clear that this kind
26:51
of behavior was unacceptable. That has
26:53
changed somewhat in the last 10
26:55
years. It's gotten a little bit
26:57
easier to pinpoint illegality on the
26:59
internet in the obvious ways of
27:01
death threats and things like that.
27:04
I kind of agree with you
27:06
in some of these points, just
27:08
dealing with my own harassment, like,
27:10
sure, police understand the concept of
27:12
the internet a little bit better.
27:14
You might get a judge that
27:16
understands how Twitter works, but I
27:18
don't think there's any sort of
27:20
real comprehensive legal precedent for holding
27:22
these types of campaigns accountable. So
27:25
how are you approaching it with
27:27
this suit? So I'm suing one
27:29
of the people who have been
27:31
making content about me since March.
27:33
His name is Jeff Tarzia. He's
27:35
a content creator who primarily makes
27:37
videos on YouTube. I'm suing him
27:39
for defamation, stochastic terrorism, organized harassment,
27:41
a couple of things that we
27:43
kind of went through at length,
27:46
you know, the benefits to any
27:48
of these points. Alyssa Mercante has
27:50
filed a lawsuit against me. It's
27:52
a very frivolous 54-page attempt at
27:54
doxing me and attacking me and
27:56
silencing me that isn't gonna work
27:58
and it never does. This is
28:00
somebody who has been making videos
28:02
about me for months, many of
28:04
which include thumbnails of me and
28:07
interesting photos taken from, I don't
28:09
know where he found them online
28:11
and a lot of them contain
28:13
pointed language that is technically legal, but some of them
28:15
contain outright defamation and obviously he is one of those below the
28:17
big, big guys in terms of the organizers of this harassment campaign,
28:19
but he is someone that has quite a large following. And a
28:21
lot of the people who have seen the videos he's made about
28:23
me have echoed the things that he said about me in my
28:25
DMs and elsewhere. And it's just been very clear that there has
28:28
been a lot of slandering of my name and my character and
28:30
my ability as a journalist. And so for
28:32
me and my legal team, it was the
28:34
best case scenario of who we should try
28:36
to have some modicum of a consequence for.
28:38
There are clear-cut examples of what we
28:40
believe is defamation, and there is a
28:43
laundry list of content that this person
28:45
has been making about me for months
28:47
and making money off of for months. So
28:49
I'm hoping, I'm hoping, I don't know
28:51
how confident I feel because the legal
28:53
system I still feel is very far
28:55
behind, but I'm hoping that this could
28:57
be something that people could look towards
29:00
as a rare example of somebody latching
29:02
on to the back of a harassment
29:04
campaign and benefiting financially off of it
29:06
while, you know, seriously affecting the life
29:08
of one person in their sights. You
29:10
know, I think my life has changed dramatically
29:13
because of all of this, and I
29:15
don't think that's fair. So we'll see.
29:17
Yeah, I think the profit is such
29:19
a key part of it. In the
29:21
meantime, I guess, what can people do
29:23
to support those who are targets of
29:25
this type of stuff? I mean, to
29:27
support your work and stuff now that
29:29
you are independent, like, I don't
29:31
know if you can pay your legal
29:33
fees, like how it's going. How can
29:35
people get involved who want to put
29:37
a stop to these types of cycles?
29:39
There's a couple of things. Obviously,
29:41
I started a Patreon and I regularly
29:43
stream on Twitch now. It's twitch.TV slash listener.
29:45
I'm trying to do kind of a hybrid
29:48
gaming and covering like tech and video game
29:50
news. Just kind of testing it out to
29:52
see if that medium is a good place
29:54
to have these kinds of conversations. And I
29:57
have my Patreon, which I'm doing some original
29:59
writing on. Some behind a paywall, some
30:01
not. I'm applying for jobs, you know,
30:03
obviously trying, but in terms of support,
30:05
I mean, just getting the word out
30:07
there. And also, honestly, it seems so
30:09
silly, but reaching out to me and
30:11
just like giving me a virtual pat
30:13
on the back or an I've got
30:15
your back is huge. And also for
30:17
the larger people, the people who are
30:19
not, you know, just us lowly journalists
30:22
that are, you know, struggling to make
30:24
rent in New York City or elsewhere,
30:26
but the people who are leaders at
30:28
these companies: vocal support for not just
30:30
me but the other people who are
30:32
facing harassment, I think, would be huge.
30:34
Again, that vacuum of not saying anything
30:36
leaves a lot of space for people
30:38
to continue to pile on. And also,
30:40
I would say, if you come across
30:42
something that is so blatantly violating terms
30:44
of service, at least give it a
30:46
report. Sometimes it works, sometimes it doesn't,
30:48
but it's better than just me feeling
30:50
like I'm constantly monitoring my own harassment,
30:52
which gets exhausting. Well, it's traumatizing, right?
30:55
I mean, you're forced to, because, and
30:57
I hate this idea as well where
30:59
it's like, well, just don't read it
31:01
or whatever, for your own safety you
31:03
do need to read what people are
31:05
saying, again, to report threats, to escalate
31:07
things. And there is no service
31:09
that can just go in and, I've
31:11
always wanted something like this, just, like,
31:13
monitor all of your mentions, right, like
31:15
celebrities have, I guess, and, like, deal
31:17
with things and escalate things. So you
31:19
manually have to read the most vicious
31:21
stuff about you. I mean, I think
31:23
it would lead any normal person to
31:25
have severe PTSD and trauma. You tell yourself that you're
31:27
okay and then you just kind of
31:30
have like a moment where either you're
31:32
just so tired that you break down
31:34
or for me I would just read
31:36
these things all day and almost like
31:38
a car crash obsession at a certain
31:40
point where I was like, wow, this
31:42
can't get any worse, right? And then
31:44
it would get worse and worse and
31:46
I think I started noticing that like
31:48
even the most innocuous comment after me
31:50
reading something terrible all day would be
31:52
the thing that broke me and I
31:54
think it was just the volume after
31:56
a while. And the fact that it
31:58
doesn't stop is pretty wild. I mean,
32:00
you know, my old company did offer,
32:03
um, DeleteMe, which is a service
32:05
that's supposed to keep all of your
32:07
personal information offline. But these people who
32:09
are the worst actors in this space,
32:11
and I mean, these are the ones
32:13
who are gathering in these really strange
32:15
dark corners of the internet. These people,
32:17
this is their entire life, finding information
32:19
about people. And so having that service
32:21
only kept them away from finding my
32:23
information out for a little bit and
32:25
then they found it anyway. They found
32:27
it through some sort of loophole or
32:29
some mistake that somebody had made and
32:31
you know the next thing I knew
32:33
someone was posting pictures of the inside
32:35
of my apartment. And so it is
32:38
scary. It is overwhelming. And it's also
32:40
very difficult to explain it to someone
32:42
who's not always online because there is
32:44
that kind of, well, just turn
32:46
off the computer. It's not that simple.
32:48
There's also the idea of where there's
32:50
smoke there's fire and I think we
32:52
see this especially when we look at
32:54
women and marginalized people. It's this idea
32:56
of I like Elissa and her reporting
32:58
seems good but she's so controversial. They
33:00
just say this about everyone. Obviously,
33:02
I get it too, like, we
33:05
all see this happen.
33:07
It's how they drive people, marginalized
33:09
people, women, like out of the
33:11
industry by stigmatizing them. And if
33:13
you respond, you're asking for it.
33:15
And if you don't respond, you're
33:17
just going to be steamrolled and
33:19
absolutely have put up no fight
33:21
while your entire career is destroyed.
33:23
And the narrative about all of
33:25
this hard work that you've done
33:28
for years is twisted. And you
33:30
just become sort of known for your
33:32
controversy, which is what they want. They don't want
33:36
you to be known for your journalism.
33:36
They want you to be known as
33:38
controversial because it makes it easier for
33:40
you to be dismissed. Absolutely. And it
33:42
has been incredibly difficult, especially since we
33:44
filed the lawsuit. Last week we just
33:46
filed it, it was in the works
33:49
forever, it's been incredibly difficult for me
33:51
to sort of take measured
33:53
approaches to this kind of stuff because
33:55
it is very frustrating to see, especially
33:57
within the last two days since the
33:59
filing, even more defamation, lies, and slander
34:01
spread that I just kind of have
34:03
to sit on and that is something
34:05
that I understand, and I know
34:07
that it's short-term pain for potentially long-term
34:09
gain or at least some type of
34:11
consequence for somebody who has been doing
34:13
this but I think I prepared myself
34:16
for it but I don't think I
34:18
prepared myself enough because now it's it's
34:20
gotten to another level that I'm like
34:22
can this go any higher? Like,
34:24
the limit does not exist. Like,
34:26
why is this happening? I completely empathize.
34:28
Yeah, I'm sure you do. I'm so
34:30
sorry, you know, for what you've gone
34:32
through, but your work is amazing. I
34:34
hope people follow you, watch your streams
34:37
on Twitch, support your Patreon, and I
34:39
hope that more online culture, you know,
34:41
writers, pay attention to this, and I
34:43
hope that these mainstream media companies that
34:46
continue to throw journalists under the bus,
34:48
pay attention to what they're doing, because
34:50
I think it's, you know, it's been
34:52
very concerning the past couple years, watching
34:55
these media companies turn on women, on people of color, on anybody
34:57
deemed controversial because they think that that
34:59
will sort of distance themselves from controversy.
35:01
In fact, what they have shown is
35:03
that they are willing to bend the
35:06
knee to these really nefarious people online.
35:08
And once those people recognize that these
35:10
media companies will cave to that sort
35:12
of external pressure, it gets even worse,
35:15
right? They get even more power. They
35:17
don't stop at one journalist or one
35:19
woman. No, unfortunately, as much as you
35:21
and I try, we can't fight
35:23
every fight and make these massive,
35:26
massive strides without support of larger
35:28
media organizations or even louder voices
35:30
in the industry, because yeah, it's
35:32
very easy to push one of
35:34
us or both of us off
35:36
to the side and write us
35:38
off as controversial figures.
35:40
And yeah, I think the more that
35:50
that happens the more that this will continue. Well, Alyssa,
35:53
thank you so much for sharing your story and chatting
35:55
with me today. Thank you for having me, it was
35:57
a great time. That's all for the show. You can
35:59
watch full episodes on my YouTube channel
36:02
at Taylor Lorenz. Power User is produced
36:04
by Travis Larchak and Jalani Carter. Our
36:06
executive producer is Zach Mac. If you
36:08
like the show, give us a rating
36:10
and review on Apple Podcasts, Spotify, or
36:12
wherever you listen. In the meantime, subscribe
36:14
to my tech and online culture newsletter
36:16
at usermag.co. That's usermag.co. See you next
36:19
week.