Episode Transcript
0:01
As regular control alt-speech listeners will know,
0:03
Mike, we start every week of the
0:05
podcast with a phrase or prompt that
0:07
you might see if you open an app
0:10
or a kind of web service. And we've
0:12
been doing this because we think these little
0:14
bits of text and taglines prompt us and
0:16
make us interact and engage. And this week
0:18
we could not not do Signal, right? Everyone has
0:20
been talking about Signal. Even if we've done
0:23
it before, I can't remember. Signal
0:25
doesn't have a prompt or tagline
0:27
in the same way. So we're
0:29
stealing a bit of marketing material
0:32
from one of their online pages.
0:34
And so this week I'm prompting
0:36
you to speak freely. Well I
0:38
would just like to say that if any
0:41
random CEOs of tech companies want to
0:43
accidentally add me to their group chats
0:45
and tell me what they're thinking about
0:47
content moderation these days, I would not
0:49
be opposed to that. We're going to
0:52
talk about a few of the CEOs
0:54
today actually. I think so, I think
0:56
so. And what about you Ben? Can
0:58
you speak freely for me please? Well
1:00
I was speaking freely at the Marked
1:02
As Urgent event that we had in London
1:05
last night. I'm a little bit worse
1:07
for wear this morning as a result.
1:09
It was a super fun evening,
1:11
but I managed to incorporate, you'll
1:13
be unsurprised Mike, my renovation, into
1:16
my presentation about content moderation. So I
1:18
spoke freely both about moderation and renovation
1:20
and yeah I got a few comments
1:22
and a few jokes about it but
1:25
it was fun. And you didn't you
1:27
didn't end up winning the Twitter sign
1:29
from last week's discussion for as part
1:31
of your home renovation? Not this time,
1:33
not this time unfortunately. Hello
1:43
and welcome to Control Alt Speech,
1:45
your weekly roundup of the major
1:47
stories about online speech, content moderation
1:49
and internet regulation. It's March 28th
1:51
2025 and this week's episode is brought
1:53
to you with financial support from the
1:56
Future of Online Trust and Safety Fund.
1:58
This week we're talking about Elon Musk's
2:00
courting of strongmen, porn appearing on
2:02
platforms you might not expect, and rage
2:04
baiting. My name's Ben Whitelaw, I'm the
2:06
founder and editor of Everything in Moderation,
2:09
and I'm with a man who's in
2:11
a lot of signal chats and may
2:13
be lurking in one that you're in
2:15
too, Mike Masnick. Please, please accidentally add
2:17
me to your signal chats. Yeah, I'm
2:19
presuming you've never had this done to
2:22
you before. I was going to ask
2:24
you. I have never had anything. I
2:26
am in, as you know, I am
2:28
in quite a lot of signal chats.
2:30
That is, it appears to be the
2:33
group chat app of choice for lots
2:35
of people, but I have never been
2:37
accidentally added to a group chat. What
2:39
a wild story of the nature of,
2:41
yes, the US government right now. Yeah,
2:44
such a shame, such a shame. So
2:46
yeah, if anyone's listening and wants that,
2:48
Mike, feel free to do so. I've
2:50
got some congratulations to give to you,
2:52
Mike: a special award this week. Well,
2:55
I think I think technically I don't
2:57
get the award until October. Oh, okay,
2:59
okay. But it's been announced. There's
3:01
been an announcement. Okay, talk to us.
3:03
The Protector of the Internet Award. Yeah,
3:06
yeah, I've won the Protector of the
3:08
Internet Award. I've asked them to give
3:10
me a shield to go with the
3:12
award. It's from the Internet Infrastructure Coalition,
3:14
which is a really good group. I've
3:16
done various... projects with them in the
3:19
past. I've spoken at their conferences in
3:21
the past and they, I think only
3:23
recently, have started doing this. This is
3:25
not the first year that they've done
3:27
the Protector of the Internet Award, but
3:30
they've been doing it for a little
3:32
while. They have a big thing where
3:34
they fly a bunch of their members
3:36
into DC and they meet with Hill
3:38
and they have an award ceremony
3:41
where they give awards to a few
3:43
people. And this year, very... It's a
3:45
nice, nice, you know, little thing. Yeah.
3:47
And yeah, so that'll be in October.
3:49
So you get to dress up and
3:52
receive your shield. Yes, yes. Excellent. Well,
3:54
you might remember last week that we
3:56
had a call out for some podcast
3:58
reviews. And I'm glad to say, Mike,
4:00
we've got some very funny listeners. Because
4:03
the bar for submitting the review was
4:05
not hating the podcast. And this is,
4:07
let's be clear here, this is what
4:09
you told them. You told people, if
4:11
you don't hate the podcast, please review
4:14
it. And our listeners took us up
4:16
on that with that specific prompt. Yeah
4:18
with glee with actual glee we had
4:20
three reviews which is three more than
4:22
we've had in the last six months
4:24
so thank you to those three listeners
4:27
and all of them said in their
4:29
review in some form I don't hate
4:31
this podcast which is which is and
4:33
I'm very glad to hear that so
4:35
thank you to those people who've left
4:38
the review. For those of you who
4:40
have left a review or who left
4:42
a review a long time ago and
4:44
want to update how you feel about
4:46
the podcast, please go and leave a
4:49
few words and a star rating on
4:51
your podcast platform of choice. When you
4:53
can, it really helps us really reach
4:55
our tentacles out into the wider world.
4:57
And if you want to leave any
5:00
other kind of coded messages within your
5:02
reviews, please do. And we'll try and
5:04
discern them and figure out what the
5:06
secret language is. Really, really
5:08
appreciate it and with that note we'll
5:11
crack on with today's stories. We're going
5:13
to start with a familiar figure Mike
5:15
but in a context that is slightly
5:17
unfamiliar to him and to us you
5:19
found a few stories about Elon Musk
5:22
and his courting of a couple of
5:24
strongmen. Yeah, this has been interesting.
5:26
I think, you know, last week we
5:28
tried to avoid talking about Elon, but
5:30
you can only avoid talking about Elon
5:32
for so long before he must enter
5:35
the conversation. I think it's part of
5:37
the, you know, contract he has with
5:39
the world at this point. He must
5:41
be the main character. And so, yeah,
5:43
there are a couple interesting stories, and
5:46
it sort of started with the fact
5:48
that X decided to... file a lawsuit
5:50
in India basically complaining about the nature
5:52
of the content moderation demands from the
5:54
government. And this struck me as interesting
5:57
on a few levels. First being that
5:59
we had spoken about the fact that
6:01
old Twitter had also sued the Indian
6:03
government over its demands and in fact
6:05
that had been sort of an ongoing
6:08
fight and at one point the Indian
6:10
government had raided Twitter's offices in India.
6:12
Nobody was there because I think it
6:14
was in the middle of COVID, and
6:16
there's all this sort of back and
6:19
forth and pressure and when Elon Musk
6:21
had taken over Twitter and declared that
6:23
the old regime was all into censorship
6:25
and stuff and then almost immediately he
6:27
started obeying a bunch of the orders
6:30
coming from the Modi government to take
6:32
down speech of critics of that government
6:34
and we called out the fact that
6:36
hey this looks bad if you're going
6:38
around saying that you're against censorship and
6:40
then you're obeying the censorship commands from
6:43
the government, it raises questions about how
6:45
committed you are to that, especially when
6:47
the former administration that you claimed was
6:49
anti-speech was willing to fight them in
6:51
court. And then there were a bunch
6:54
of stories for the next two years,
6:56
basically, of X being willing to take
6:58
down critics of Modi over and over
7:00
again. There was a documentary. There was
7:02
a few other things along the ways.
7:05
So just the fact that he is
7:07
now stepping up and decided to sue
7:09
is interesting and different. because that's a
7:11
big shift. And as we've seen over
7:13
the last few years, the thing that
7:16
became clear was that mostly, Elon was
7:18
willing to fight when he disagreed publicly
7:20
with the government. So if it was
7:22
a more left-leaning government, then he would
7:24
go on Twitter or X and declare
7:27
himself to be a free speech martyr
7:29
and talk about how, oh, you know,
7:31
the awful Brazilian government is trying to
7:33
get me to take down stuff. But
7:35
when it was a more right leaning...
7:38
authoritarian government, Turkey and India being the sort
7:40
of classic examples, he seemed to be
7:42
willing to go along with it. So
7:44
it was really interesting to see him
7:46
shift in India at this moment too,
7:48
because this was also a moment where
7:51
his other companies are making headway in
7:53
India. Yeah. So before we get into
7:55
that, just talk us through the kind
7:57
of actual case that he's bringing against
7:59
the Indian government and what it leads
8:02
into. So this is actually really kind
8:04
of interesting. So Indian internet law has
8:06
been kind of back and forth over
8:08
the last like decade or so. And
8:10
there had been lawsuits on this and
8:13
I had written about this years ago
8:15
where they have these different IT acts
8:17
that sort of lay out the intermediary
8:19
liability questions around content moderation and for
8:21
a while it actually looked like the
8:24
Indian law was actually going to kind
8:26
of match Section 230, but then that
8:28
upset some people in the government who
8:30
wanted content to be more easy to
8:32
take down. And so it shifted in
8:35
a pretty drastic way and made it
8:37
so that the government had a lot
8:39
more power to sort of order content
8:41
to be taken down. And so what
8:43
X is now doing is challenging the
8:46
sort of latest version of the law.
8:48
It's Section 69A of the IT Act.
8:50
And they're saying that that violates free
8:52
speech rights because it basically creates a
8:54
way for the government to send information
8:56
to the platforms that they say have
8:59
to be taken down. And the way
9:01
the law is currently being interpreted, and
9:03
again this is after a few different
9:05
court cases and a few different challenges
9:07
and changes, the way it is being
9:10
interpreted is that if the government sends
9:12
you... content that they believe should be
9:14
taken down, that you really will get
9:16
in trouble if you don't. And we
9:18
saw that again with the previous regime
9:21
and Twitter who did try to fight
9:23
this in court and eventually ended up
9:25
losing. Now the government has responded to
9:27
this and I thought this was really
9:29
interesting and this response came out just
9:32
before we started recording, to be honest. And
9:34
they're presenting it as a very different
9:36
thing. So in the lawsuit... Elon is
9:38
referring to, or not Elon, but X
9:40
and their lawyers. I'd love it if
9:43
he wrote his own lawsuits. Yeah, that
9:45
would be, that would be quite something.
9:47
But they're referring to it as a
9:49
censorship portal, basically saying the system that
9:51
is set up to handle the government's
9:53
requests is a censorship portal, which
9:56
it probably is actually a fairly accurate
9:58
description of it. The government is pushing
10:00
back and saying, no, no, this is
10:02
not a censorship portal. This is just
10:04
a website that allows us to notify
10:07
you of harmful content. And this struck
10:09
me as really interesting because this has
10:11
been the debate that we've had in
10:13
other countries and in particular in the
10:15
US and the whole thing with the
10:18
Twitter files a few years ago when
10:20
Musk took over, which is that Twitter
10:22
has set up various portals to allow
10:24
governments or government officials to alert them
10:26
to content that they believe might be
10:29
violating the terms of service. And this
10:31
is, there's like a very specific distinction
10:33
in here that is important, which is
10:35
that the US system and, you know,
10:37
the way it works is that certain
10:40
actors have the ability, they have access
10:42
to a portal where they can submit
10:44
stuff, but it still is up to
10:46
Twitter, or the company, you know, to
10:48
decide, does this actually violate our rules?
10:51
The point is, it's like flagging content,
10:53
and it is a more trusted flagger
10:55
because it's coming from the government, it
10:57
will be reviewed in order to determine
10:59
whether or not it actually violates. And
11:01
as we saw with the actual details
11:04
that came out later of the Twitter
11:06
situation, Twitter often would reject those and
11:08
say this doesn't actually violate our terms
11:10
of service and we will reject them.
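To make the distinction concrete, here is a minimal sketch of that trusted-flagger flow. All the names here are hypothetical; this is not Twitter's or X's actual code, just the shape of the process being described.

```python
from dataclasses import dataclass

@dataclass
class Report:
    content_id: str
    reason: str
    from_government: bool  # arrived via the official portal, not an ordinary user flag

def violates_terms(content_id: str) -> bool:
    """Stand-in for the platform's own policy review."""
    return False  # stub; in reality a human or automated reviewer makes this call

def handle_report(report: Report) -> str:
    # A government submission is a trusted flag: it may be reviewed sooner,
    # but the decision still turns on the platform's rules, not the flagger.
    if violates_terms(report.content_id):
        return "remove"
    # As described above, Twitter often returned exactly this answer to officials.
    return "reject: does not violate our terms of service"

print(handle_report(Report("post-123", "claimed misinformation", from_government=True)))
```

The complaint X is making about the Indian portal is, in effect, that the "reject" branch stops being realistically available once legal compulsion attaches to each request.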
11:12
And so the Indian government is sort
11:15
of presenting this as the same thing,
11:17
like this is just a way for
11:19
us to alert you. And I think
11:21
they are deliberately sort of mimicking the
11:23
language that was used in the US
11:26
to present it as not as threatening
11:28
or problematic, whereas what X is claiming
11:30
and what I actually think is probably
11:32
more accurate is that under Indian law,
11:34
unlike in the US, when you get
11:37
a request from the government through this
11:39
particular portal, you feel very strongly compelled to
11:41
remove that content. Okay. And so I
11:43
find it interesting that India is sort
11:45
of now using the language that was
11:48
used to describe the American situation where
11:50
it wasn't censorship, even though some people
11:52
claimed it was censorship, including Elon Musk,
11:54
to now reflect the situation in India,
11:56
where there is much more clear government
11:59
coercion as part of this process. And
12:01
now we'll see how it plays out
12:03
in the courts, though it is very
12:05
strange that at the time when Elon
12:07
is seen as fairly close with Modi
12:09
and has been doing deals for his
12:12
other businesses including both Tesla and Starlink
12:14
for him to suddenly decide that this
12:16
is a fight worth fighting. Yeah, that's
12:18
what's really interesting is that Elon Musk
12:20
and Modi met in Washington in February
12:23
and I remember the photos there's a
12:25
bunch of photos that were published of
12:27
him with Modi and some of his
12:29
kids as well. It was a very
12:31
kind of odd photo op, I don't
12:34
know what they were trying to achieve,
12:36
but clearly a kind of photo moment,
12:38
and then as you say, he's trying
12:40
to kind of expand his business interest
12:42
there. Do you think that this court
12:45
case is being used as leverage in
12:47
the push to expand into India with
12:49
Tesla and increase the market share of
12:51
Starlink? I have no idea. I mean,
12:53
it sounded like he was getting those
12:56
deals anyway, so I'm not entirely sure
12:58
that this is beneficial. So I'm honestly
13:00
a little confused by it. I'm wondering
13:02
if more information will come out at
13:04
some point, that something else came up
13:07
within this process. It strikes me as
13:09
a weird way to have leverage because...
13:11
I'm sure Modi is the stronger player
13:13
in this situation. You know, Musk needs
13:15
access to those markets for his other
13:17
companies. So it strikes me strange, just
13:20
also the fact that, you know, for
13:22
the last two years, he's been willing
13:24
to go along with Modi's demand. So
13:26
I'm not sure what pushed him over
13:28
the edge here. And in fact, that
13:31
leads into one of the other stories
13:33
that we did want to mention here,
13:35
which is that in Turkey, which is
13:37
the other place where Musk has shown
13:39
a willingness to roll over and do
13:42
what was demanded, he's been taking down
13:44
the accounts of various activists and opponents
13:46
to the Erdogan government. And so... Again,
13:48
we've seen it over and over again
13:50
when Musk is in agreement and aligned
13:53
with the governments of these authoritarian countries
13:55
He seems to have no problem pulling
13:57
down content that the government is criticizing
13:59
and he did so again this week
14:01
in Turkey But in India suddenly he's
14:04
challenging it. So it's it's a very
14:06
strange situation and I really have no
14:08
idea why yeah, it's funny in a
14:10
way that our main story this week
14:12
is the fact that Elon Musk doesn't
14:15
have a consistent approach to speech. It's
14:17
a story that isn't a story in
14:19
a way. But yes, there's enough there
14:21
I think to kind of bring it
14:23
to listeners and explain that Musk continues
14:25
to be very hard to predict when
14:28
it comes to understanding why he says
14:30
one thing and does something else. And
14:32
not only that but why that seems
14:34
to change from almost month to month.
14:36
Yeah, I mean there's clearly no consistency
14:39
and so I'm sure there's some other
14:41
reason why X suddenly decided it was
14:43
worth fighting this. But this one doesn't
14:45
seem to fit the same pattern, where
14:47
you can sort of easily slide it
14:50
into, well, he likes this government, he
14:52
doesn't like that government, or he needs
14:54
some other thing here. This one is
14:56
just surprising. I almost wouldn't be surprised
14:58
if this lawsuit gets dropped very quickly,
15:01
if because of other back channel discussions,
15:03
Musk is like, hey guys, knock it
15:05
off. I mean, maybe is it that
15:07
he was distracted because he's running the
15:09
US government and wasn't the one to
15:12
make this decision, and then once this
15:14
gets back to him, he'll change his
15:16
mind. I don't know, but it is
15:18
a slightly surprising development. Yeah. It's not
15:20
the only story in which he appears
15:22
this week, Mike, and we both spotted,
15:25
as is always the case, we both
15:27
spotted a story in the verge this
15:29
week that told of Musk back channeling,
15:31
again, the CEO of Reddit, Steve Huffman,
15:33
about some issues that he had about
15:36
moderators on Reddit taking down and blocking
15:38
links to X. And so this is
15:40
a kind of story that picks up
15:42
from last year but has been recently
15:44
reported this week about how the essentially
15:47
these two CEOs were texting each other
15:49
probably sat back, chilling
15:53
out, after a long day of
15:53
thinking about how they were going to
15:55
take over the US government. And he
15:58
was kind of complaining about this development
16:00
and subsequently Huffman banned some of the
16:02
subreddits, deleted all of the comments. And
16:04
so what did you make of this
16:06
as an indicator of Musk's, again, inconsistent
16:09
approach to moderation? It's yet another example
16:11
of his pure hypocrisy, right? And so,
16:13
you know, Huffman and Musk have been
16:15
friendly for a while, and Huffman in
16:17
the past has clearly been inspired by
16:20
some of Musk's actions and sort of,
16:22
you know, freeing up other tech CEOs
16:24
to be a little more aggressive in
16:26
their viewpoints. But this is just crazy,
16:28
right? Because we know that Elon and
16:30
X have blocked links to all sorts
16:33
of competitors for various reasons. Sometimes for
16:35
a while, right? They were slowing down
16:37
links to Substack for a while.
16:39
They were blocking links to Substack
16:41
for a while. They were blocking links
16:44
to Mastodon. They were closing accounts of
16:46
people. Overall, X has completely downranked links
16:48
because Musk wants to keep people on
16:50
the platform. And so he's admitted. He
16:52
finally admitted it. People sort of recognized
16:55
it. That posts with links don't get
16:57
rated as highly in the algorithm. You
16:59
know, so he's clearly done things to
17:01
try and keep people within his platform
17:03
to then go and complain to Huffman
17:06
that a few subreddits had decided as
17:08
a kind of protest that they were
17:10
no longer going to include X links
17:12
or links to X would no longer
17:14
be allowed in those subreddits that this
17:17
was some sort of major breach that
17:19
needed Huffman to step in. It suggests
17:21
a level of hypocrisy which is not
17:23
uncommon with Elon Musk, but it does
17:25
seem notable. The other element of it
17:28
was it wasn't just the blocking of
17:30
links to X that he was concerned
17:32
about. He was also concerned about people
17:34
calling out DOGE employees, the various kids
17:36
that Elon has brought into the federal
17:38
government who are wreaking havoc all over the
17:41
place. And he was upset that some
17:43
of those the people were being named
17:45
or talked about. There's a claim that
17:47
it was like advocating violence against them,
17:49
though I think that was somewhat exaggerated.
17:52
People are saying like even naming them
17:54
is advocating violence against them, which is
17:56
not accurate. And so he seemed to
17:58
be partly upset about that. And then
18:00
as part of that discussion also apparently
18:03
upset about some of the subredits and
18:05
the moderators within the subredits deciding that
18:07
they weren't going to allow links to
18:09
the former Twitter. Yeah, it does make
18:11
me think that I should be throwing
18:14
my weight around via text a lot
18:16
more. Apparently, that's the way. I never
18:18
really think of it as a weapon
18:20
in my armory, but it's just making
18:22
me think that there's maybe some people
18:25
that I can get to do something
18:27
for me via text. Well, see if
18:29
you can get into the chat with
18:31
the officials from the US government, you
18:33
start throwing your weight around there. that
18:36
would make for a good podcast next
18:38
week. Yeah, telling JD Vance to shut
18:40
up would be a start. Yeah, bring
18:42
me a coffee, JD. So we're going
18:44
to talk a bit more about CEOs
18:46
of platforms now because I've kind of
18:49
been listening to a couple of podcasts
18:51
with some CEOs of platforms this week
18:53
and there's a really interesting difference in
18:55
how they talk about content moderation. Let's
18:57
start with Evan Spiegel, the CEO of
19:00
Snapchat. who was on diary of a
19:02
CEO with Stephen Bartlett this week and
19:04
talked at length about a whole range
19:06
of different issues with a little segment
19:08
about both Snapchat's approach to content moderation
19:11
and also other kind of tangential issues
19:13
like how meta is thinking about content
19:15
moderation as well and We're going to
19:17
play a bit of a clip for
19:19
you. This is kind of technical wizardry
19:22
that we haven't tried on Control Alt
19:24
Speech before, but you're now going to
19:26
be able to hear a little bit
19:28
of that interview with Steven Bartlett, because
19:30
I think it's a really good response
19:33
and a really good interview around some
19:35
of the issues we talk about here
19:37
all the time on Control Alt Speech.
19:39
And there's a great example, I think,
19:41
of a CEO who gets it to
19:44
a large degree, so have a listen.
19:46
really matters right and that's why we
19:48
have content guidelines because we want people
19:50
to feel like they're an environment where
19:52
they can express themselves and I think
19:54
some of the the conversation about different
19:57
content guidelines or having content guidelines or
19:59
not having them has been really interesting
20:01
because I think people are missing the
20:03
broader point if you have a platform
20:05
with no content guidelines and it's full
20:08
of people yelling at each other or
20:10
saying really mean or offensive things or
20:12
posting a lot of pornography that's a
20:14
really uncomfortable thing for most people. Right?
20:16
That's uncomfortable. You say, maybe this platform
20:19
isn't for me. Maybe I don't feel
20:21
comfortable expressing myself here because all the
20:23
stuff I'm seeing isn't really appropriate or
20:25
aligned with my values. And so one
20:27
of the things we discovered really early
20:30
on is if you want to create
20:32
a platform where people feel comfortable expressing
20:34
themselves, feel comfortable communicating with their friends
20:36
and family, having content guidelines is really
20:38
helpful because it means that the content
20:41
experience is one that feels more comfortable.
20:43
But isn't that, people would say, well,
20:45
that's censorship? I'm thinking now of the
20:47
video that Mark Zuckerberg released about Meta's
20:49
changes to their moderation systems, moving to
20:51
Texas, realizing that, I think he said
20:54
that they'd over-indexed with their moderators in
20:56
terms of left-leaning politics. So a lot
20:58
of the right leaning content had been
21:00
censored. What do you make of that
21:02
argument for content moderation? That we don't
21:05
want to censor people. I think it's
21:07
a misunderstanding of the First Amendment and
21:09
how it applies. If we look at
21:11
our country, the way, you know, at
21:13
least here in the United States, with
21:16
the First Amendment, that really focuses on
21:18
the way that the government interacts with
21:20
content creators or content publishers. And it
21:22
says, hey, it's not okay for the
21:24
government to interfere with individuals or publishers'
21:27
self-expression, right? That's not allowed. But one
21:29
of the things the First Amendment also
21:31
does is say... you know, platforms or
21:33
individuals can make choices about what sort
21:35
of content they want to promote or
21:38
want to have on their platform. That's
21:40
part of the First Amendment. You can't
21:42
force the Wall Street Journal to, you
21:44
know, put this article or that article
21:46
or accept any article from any author
21:49
all around the world. The Wall Street
21:51
Journal as a paper can decide what,
21:53
you know, what authors it wants to
21:55
include on its pages, and that's part
21:57
of the protected First Amendment expression we
21:59
have here in this country. So this
22:02
whole notion of censorship doesn't apply to
22:04
companies that are private businesses that actually
22:06
have a First Amendment right to decide
22:08
what content is on their platform. And
22:10
they may want to decide we're open
22:13
to literally anything. Anything goes, no problem.
22:15
And it seems like some platforms are
22:17
making that choice. But other platforms like
22:19
ours say, hey, in order to have
22:21
a healthy set of discourse across our
22:24
platform, in order to make sure people
22:26
feel comfortable when they're viewing content on
22:28
our platform, we don't want people to
22:30
come across pornography, for example, or violent
22:32
content, or hateful content. That's not something
22:35
that makes people feel good. We want to
22:37
make sure that that content isn't on
22:39
our platform because it doesn't comply with
22:41
our our guidelines. And that may be
22:43
one of the reasons why in some
22:46
of these studies it shows that people
22:48
feel better when they use Snapchat because
22:50
they're not encountering you know really violent
22:52
content when they're using Snapchat. So listeners
22:54
have heard Evan speak there about how
22:57
Snapchat approaches content moderation Mike. I mean
22:59
I think his response to Steven Bartlett
23:01
around the First Amendment was particularly interesting
23:03
and I wanted to note the fact
23:05
that Bartlett has said a bunch of
23:07
things in the recent memory around content
23:10
moderation in relation to Meta. I don't know
23:12
if you remember, he posted on his
23:14
LinkedIn page, which has many, many millions
23:16
of followers, about the fact that Meta's
23:18
move away from fact-checking and its changes
23:21
to its content moderation policy, represented one
23:23
of the most important videos that people
23:25
will see this year and a course
23:27
correction to... what he seemed to suggest
23:29
that it was slightly coded, but seems
23:32
to suggest was kind of overreach around
23:34
content moderation. And I've listened to a
23:36
few kind of Steven Bartlett podcasts. I
23:38
don't love the guy, but he has
23:40
a slight tendency to kind of veer
23:43
into the manosphere, I find. And so
23:45
it's interesting that he quizzes Spiegel quite
23:47
openly about his content moderation. And
23:49
Spiegel has a really good response, I
23:51
felt. Yeah. Were you surprised by how
23:54
well he handled that and the way
23:56
he seemed to understand it? Yeah, I
23:58
mean, partly, and partly not, I think.
24:00
Partly not because it was a great
24:02
answer. I mean, it's an absolutely fantastic
24:05
and very thoughtful and correct answer, understanding
24:07
the things, that this is not a
24:09
First Amendment issue, that values determine what
24:11
kind of community you want to build,
24:13
and that is what users appreciate as
24:15
well, and that there are reasons to
24:18
do this that have nothing to do
24:20
with censorship, but just what kind of
24:22
community you're trying to build. I think
24:24
it was a fantastic answer. Spiegel's been
24:26
the founder and CEO of Snapchat for
24:29
a while. He's gone through a bunch
24:31
of these fights and arguments and they've
24:33
been involved in some of them and
24:35
I feel like he has a really
24:37
deep grasp. So I'm not surprised in
24:40
that he gets it right. The only
24:42
thing I'm surprised in is that like
24:44
feels like every other CEO in the
24:46
tech space no longer does. Yeah. And
24:48
if anything like Spiegel had the reputation
24:51
historically and this is probably unfair that
24:53
he was you know, he was a
24:55
little bit more of like a frat
24:57
boy not really deep in the policy
24:59
weeds on these things. And yet this
25:02
answer suggests someone who's really thought deeply
25:04
about these things and actually has a
25:06
deeper understanding of it and is willing
25:08
to explain it clearly and not, you
25:10
know, what a lot of CEOs do
25:13
is kind of deflect and mislead and
25:15
sort of dance around it. And he
25:17
was just very direct. He's just like,
25:19
this is not a First Amendment issue.
25:21
It's a values thing. We want to
25:23
build a community. This is what our
25:26
people expect. This is what our users
25:28
expect. And this is the kind of
25:30
thing that we've decided that this is
25:32
what our values are based on. And
25:34
it was, you know, it's fantastic and
25:37
clear and appreciable. And you can hear,
25:39
I really have, you know, I think
25:41
I've maybe heard of Bartlett, but I've
25:43
never seen any of his videos before.
25:45
I was not really familiar with him.
25:48
And I only watched
25:50
this one little
25:52
section, you know, a
25:54
little bit longer than the clip that
25:56
we played of him talking about the
25:59
content moderation stuff, where he suggested they were taking down too
26:01
much conservative speech and they had to
26:03
move to Texas for it, which is
26:05
like we know is not true, but
26:07
he seemed really bought into the narrative
26:10
of what happened and so it was
26:12
really nice to see Spiegel just kind
26:14
of push back on him. Yeah, and
26:16
Bartlett has a kind of Zuckerberg aesthetic
26:18
doesn't he? He's got the kind of
26:21
black t-shirt, he doesn't quite have the
26:23
gold chain, but you know he needs
26:25
the Latin phrase on the t-shirt. Yeah,
26:27
exactly, exactly. And Bartlett has had
26:29
some focus on
26:31
him
26:34
for showcasing and highlighting kind of health
26:36
misinformation on the podcast as well. So
26:38
he's incredibly, in the UK at least
26:40
he's incredibly well known, he's incredibly well
26:42
listened to, he's got various books out,
26:45
and he does seem to kind of,
26:47
I would say, spotlight, some slightly odd
26:49
health experts, inverted commas. And so, again,
26:51
you might expect Spiegel or anybody on
26:53
the podcast to somewhat side with him.
26:56
and actually I thought Spiegel did a
26:58
great job of kind of standing his
27:00
ground. One guy who didn't do a
27:02
very good job of that was our
27:04
next CEO who was on a podcast
27:07
this week, Neil Mohan of YouTube. He
27:09
was on the Semafor podcast Mixed Signals
27:11
with Ben Smith and Max Tani and
27:13
we're going to play a bit of
27:15
a clip now of how he
27:18
responded to some questions about YouTube's content
27:20
moderation policy and some of the tensions
27:22
around the US administration and the kind
27:24
of clash in ideals, you'll notice I
27:26
think in this clip a very different
27:28
sound, a very different tone and he's
27:31
not only defensive I'd say but also
27:33
quite evasive in his answers. You've said
27:35
that the number one priority for YouTube
27:37
is the safety of YouTube's ecosystem and
27:39
we're in a moment when that's actually
27:42
like a slightly unusual thing to say
27:44
and a lot of platforms are really
27:46
backing off anything like content moderation, probably
27:48
because of pressure from this White
27:50
House and this administration and I wonder
27:53
if you feel like there's tension
27:55
between you and the administration
27:57
on, I guess, particularly issues around public
27:59
health. I'll say a
28:01
few things. First, and probably most important
28:04
and kind of really at the top
28:06
is everything that we talk about, everything
28:08
we just talked about in terms of
28:10
the business, how content works on our
28:13
platform, etc. is back to our mission
28:15
statement, which is to give everyone a
28:17
voice and show them the world. And
28:19
the first half of that mission statement
28:22
is really about free expression and freedom
28:24
of speech and I can say for
28:26
myself and I know for many of
28:28
the colleagues that I work with every
28:31
single day that's why we come to
28:33
work like that's the power of YouTube
28:35
right like that if you have an
28:37
idea you have a thought and you
28:39
want to share it with the world
28:42
then YouTube is a place where you
28:44
can go and share it without somebody
28:46
telling you that you don't sound the
28:48
right way or you don't look the
28:51
right way or you're saying the wrong
28:53
thing or what have you and that
28:55
is core to our mission and everything
28:57
that we do is ultimately, frankly, in
29:00
service of that. And so it's the
29:02
reason why actually I think we've had
29:04
community guidelines from the very early days.
29:06
And in order to allow creators to
29:09
be able to share their ideas have
29:11
this free sort of voice freedom of
29:13
expression and to earn a sustainable living
29:15
from it we also have rules of
29:18
the road in terms of how our
29:20
platform works right like no porn or
29:22
adult content or financial scams or what
29:24
have you right like back to the
29:27
question around like when you turn on
29:29
the TV like that's not what consumers
29:31
are looking for when they turn it
29:33
on and our advertisers right the brands
29:36
that support that content aren't looking for.
29:38
And so our approach to responsibility is
29:40
with all of that in mind, right?
29:42
But ultimately towards this goal of freedom
29:45
of expression, that's how I've always looked
29:47
at it. You know, and even in
29:49
years past when... You and I've talked
29:51
about it, hopefully I've been consistent in
29:54
terms of that sort of core thesis.
29:56
So yeah, Mike, what did you think
29:58
about how Mohan responded to Ben Smith's
30:00
kind of very pointed questions? Yeah, I
30:03
mean this is the more typical unfortunately
30:05
the more typical CEO's response that we
30:07
hear on these kinds of questions Where
30:09
someone doesn't want to come out and
30:12
say anything that then will be taken
30:14
up by, you know, probably a bunch
30:16
of idiots on various social media platforms
30:18
taken out of context and presented as
30:21
something to rally around and so he
30:23
says a lot of nothing and he
30:25
does so in a very sort of
30:27
defensive way and doesn't directly address the
30:30
issue and he could have right I
30:32
mean it would have been great to
30:34
have him come out and respond the
30:36
same way that Spiegel did. And so, you
30:39
know, I think this, the contrast between
30:41
Mohan and Spiegel is really, really notable.
30:43
And it's, you know, it's unfortunate for
30:45
Mohan that they both came out around
30:48
the same time, but it's just such
30:50
a different answer to effectively the same
30:52
question. Yeah. And also I noted the
30:54
fact that Mohan's responses have changed significantly
30:57
in the last two to three years
30:59
as well. It was, I think,
31:01
in a
31:03
Platformer
31:06
interview
31:08
in which
31:10
he talked about working with other partners
31:12
in the space, civil society organizations, non-profits,
31:15
kind of experts to help remove borderline
31:17
content. And that, probably at that time,
31:19
made sense. You can tell from the
31:21
clip that that idea is dead.
31:24
You know, he's very much just doing
31:26
the bare minimum and no more. Well,
31:28
I think it's in some sense it's
31:30
even worse than that, right? Because this
31:33
kind of answer, it's a not trying
31:35
to say anything answer, it's an answer
31:37
that's designed to try not to get
31:39
anyone upset by not actually saying anything.
31:42
And it's a lost opportunity. It's an
31:44
opportunity where he could come out and
31:46
say the same things that Spiegel said,
31:48
which is that it's our place and
31:51
we get to determine how things are.
31:53
And like, yes, we're trying to enable
31:55
free speech. But one of the best
31:57
ways to do that is to have
32:00
a set up that reflects values and
32:02
that people don't feel harassed or feel
32:04
that there's misinformation flowing there and he
32:06
didn't say that. And so I think
32:09
it's a lost opportunity. But maybe an
32:11
opportunity gained in the sense that you
32:13
can now send this to folks in
32:15
the Republican Party in the US and
32:18
justify, and justify it like: I've said publicly
32:20
something that you agree with, and therefore...
32:22
But it's not, right? I mean, it's
32:24
not even, it's not even saying that,
32:27
right? It's not even saying what the
32:29
Republicans want them to say. It's saying
32:31
nothing. That's the problem, right? You know,
32:33
I mean, with Zuckerberg at least, I
32:36
mean, okay, so yes, you could say
32:38
that like this isn't like the complete
32:40
capitulation that Zuckerberg going on Rogan and
32:42
saying a bunch of nonsense was like
32:44
just obviously... completely ridiculous untrue fantasy land
32:47
stuff that the Republicans got excited about.
32:49
Nobody's going to get excited about this.
32:51
What he said was nothing. It was
32:53
empty. Yeah, but sometimes the bare minimum
32:56
is all these guys want to do,
32:58
isn't it? Yeah, I mean, sure, he
33:00
can point to it. But I don't
33:02
think this would satisfy anyone on any
33:05
side of this debate because it's not
33:07
the full-throated endorsement of any particular position.
33:09
It is clearly like trying to tiptoe
33:11
around landmines. Do you think that... the
33:14
US administration will have the potential to
33:16
go after Snapchat or put pressure on
33:18
YouTube because they're not saying what they
33:20
would like them to you. They're not
33:23
kind of towing the line or... We'll
33:25
see. I mean you never know who
33:27
the next target is going to be.
33:29
Right. Obviously there are investigations going on
33:32
now with the FTC that we've talked
33:34
about where they're, you know, they want
33:36
to go after tech but we don't
33:38
know who it is they're going after
33:41
because most of the tech companies have
33:43
sort of toed the line. But I don't
33:45
know, they haven't really been, you could
33:47
see where it's like, it's been really
33:50
easy because Snapchat is often considered one
33:52
of the ones that kids use. And
33:54
so whenever there are kids' safety discussions, Snapchat
33:56
will often come up. So I could
33:59
totally see a kind of moral panic.
34:01
around Snapchat and they'll say that they're
34:03
not handling kids well and that they'll
34:05
do stuff around that but I don't
34:08
know it's impossible to predict with this
34:10
administration. Yeah nonetheless I think it's interesting
34:12
to see two CEOs of two major
34:14
platforms talking about content moderation in the
34:17
same world and it does have the
34:19
sense of these platforms trying to shape
34:21
the narrative and be on the front
34:23
foot with how they talk about this
34:26
topic. Spiegel talks about proactively scanning for
34:28
pornography in a way that Spotify, in
34:30
our next story might have something to
34:32
learn from. You found this story about
34:35
Spotify being unprepared, let's say, for some
34:37
graphic content on the platform. Yeah, this
34:39
is almost hilarious, right? So anyone who's
34:41
been in this space for any amount of
34:44
time knows that if you do
34:46
any kind of user-generated content, like at
34:48
some point you're going to have to
34:50
deal with pornographic content and have a
34:53
clear policy and a way to enforce
34:55
it. That is even true with text,
34:57
but as soon as you get to
34:59
video or imagery, you know, so Spotify
35:02
has always been audio for the most
35:04
part, and music, and so that was
35:06
less of an issue. They've gotten bigger
35:08
into podcasts, and there's been some controversy
35:11
there. But now, sort of realizing how
35:13
much stuff is video and how many
35:15
podcasts are now video, they've sort of
35:17
moved into the video space. And apparently,
35:20
even though their policies are... that they
35:22
will not allow pornography. They were unprepared
35:24
for sexually explicit material suddenly showing up
35:26
and getting very very popular. And so
35:29
they had the top business podcast... their
35:31
listing of top business podcasts apparently included
35:33
some fairly pornographic material, which is not
35:35
normally what I associate with business content.
35:38
And they were sort of taken by
35:40
surprise and had to respond and said,
35:42
oh, of course, you know, it wasn't,
35:44
we didn't intend for that and that
35:47
violated the rules and they eventually took
35:49
it down once it was called out.
35:51
But it suggests that they may have
35:53
moved into the sort of video market
35:56
without preparing their trust and safety folks
35:58
for the level of pornography and the
36:00
ways that people are going to attack
36:02
things like the trending lists and the
36:05
top lists. Yeah, indeed. And it's surprising
36:07
because it wasn't that long ago that
36:09
they were embroiled in the kind of
36:11
Joe Rogan scandal and they faced a
36:14
whole bunch of heat for their trust
36:16
and safety approach, I would say. They
36:18
then bought a... company called Kinzen, which
36:20
actually did some smart work to identify
36:23
misinformation and kind of false narratives deep
36:25
within podcasts and you know kind of
36:27
often hidden in far-reaching corners of the
36:29
platform. Full disclosure I did some work
36:32
for Kinzen at some point before they
36:34
were acquired by Spotify. That's how I
36:36
know about them. So you'd think they'd be tooling up or maybe
36:38
skilling up around this. Well, it's been something
36:41
that has been known for a long
36:43
time. Everybody knows. You don't want to
36:45
make too big of a deal of
36:47
it, right? Because like, again, content moderation
36:49
scale is impossible, right? There's always going
36:52
to be something that slips through, people
36:54
are going to make mistakes, things are
36:56
going to get missed. So I don't
36:58
want to make too big of a
37:01
deal of it, but it is noteworthy
37:03
that like, you would think any platform
37:05
as they're expanding into video and pushing
37:07
video heavily, they have to realize. And
37:10
in fact, the video that sort of
37:12
made it, somebody
37:14
just ripped it, put it in there,
37:16
and was able to get a bunch
37:19
of downloads. But the fact that it
37:21
made it into the top business podcast
37:23
lists, you would feel that there would
37:25
be a little bit of extra review
37:28
before something gets to that level, and
37:30
it's just sort of noteworthy that it
37:32
appears that at least this particular attempt
37:34
to get pornography seen on Spotify made
37:37
it past the guards. Did Techdirt ever
37:39
have a porn problem? In the comments?
37:41
Not a bad one. I mean, we
37:43
certainly had the issue of spam, right?
37:46
And so, and a lot of spam
37:48
is sort of like linking to pornographic
37:50
content. And so we definitely have had
37:52
that. We did have, at one point,
37:55
we had this weird thing where somebody
37:57
was showing up and basically writing a
37:59
novel in the comments to a very
38:01
old Techdirt post. The comments are probably
38:04
still there and it was like, I
38:06
mean, just reams and reams of text.
38:08
Well, it had nothing to do with anything. Like
38:10
an erotic novel? Yeah, yeah, it
38:13
may still be somewhere in the textured
38:15
archives Because I don't think we pulled
38:17
it down. It was like it was
38:19
on like a really old post so
38:22
like nobody was reading it. It wasn't
38:24
interfering with anyone. Yeah, and so I
38:26
can't remember if we left it up
38:28
or if we pulled it down. It
38:31
was, and they were like coming back
38:33
every so often and just like adding
38:35
another chapter. It was massive. It was
38:37
massive. We've had some weird things happen
38:40
in the comments over time, especially like
38:42
older comments. Yeah. That just go back
38:44
ages and people put in all sorts
38:46
of weird stuff. Maybe we'll do a
38:49
special recording of the podcast and if
38:51
we read out the erotic novel written
38:53
on it in the comments under the
38:55
Techdirt post decades ago. I don't even
38:58
know if I can find
39:00
it again. We were looking recently,
39:02
we're somewhere at 82, 83,000 articles
39:04
on Techdirt at this point. Wow. And
39:07
so, yeah, even finding that. And over
39:09
two million comments, so it's, we've got
39:11
a fair bit of content. Yeah, good
39:13
luck finding that. It's worth saying that
39:16
Mike and I have been thinking about
39:18
accompanying the audio version of Control Alt Speech
39:20
with a video version. And... We have
39:22
been umming and ahhing about it. We
39:25
would love to hear- We need to
39:27
get to the top of the business
39:29
list on Spotify. Seems like we have
39:31
a way to do that. But yeah,
39:34
if listeners, we'll keep it clean, I
39:36
promise. If listeners have a view on
39:38
whether they would listen and watch a
39:40
control or speech podcast on the platform
39:43
of their choice, drop us a note:
39:45
podcast at ctrlaltspeech.com, that's CTRL alt speech.
39:47
And give us your thoughts. It might
39:49
spur us on to do a version
39:52
where we have both of our big
39:54
heads on a screen together. Talking of
39:56
adult sites, the kind of other story
39:58
that I noted this week was one
40:01
about OnlyFans and it's in relation
40:03
to Ofcom. We're not starting an
40:05
OnlyFans, but revenue diversification, Mike. We're going
40:07
to make this podcast pay somehow. You
40:10
promised. Ofcom has handed a one million
40:12
pound fine to OnlyFans for failing
40:14
to supply accurate information about how it
40:16
prevents underage users from accessing explicit content.
40:19
And this is a long-standing investigation which
40:21
has kind of finally come around this
40:23
week. It's a bit of a gaffe,
40:25
really. OnlyFans had told Ofcom that
40:28
its challenge age, so the age at
40:30
which it prompts users to tell it
40:32
how old they are, was 23, only
40:34
to find out from the tech provider
40:37
that provided that service, that it was
40:39
actually set to 20. So there was
40:41
a kind of gap of three years
40:43
between what it told Ofcom and what
40:46
was actually true. And that has led
40:48
it to be given, and to accept,
40:50
a one million pound fine. That in
40:52
itself is interesting.
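To make that three-year gap concrete: these systems typically compare an estimated age against the challenge threshold, and the margin above 18 is what absorbs the estimator's error. A minimal sketch, using the numbers from the story; the code itself is only an illustration, not OnlyFans' actual system.

```python
REPORTED_CHALLENGE_AGE = 23  # what OnlyFans told Ofcom it used
ACTUAL_CHALLENGE_AGE = 20    # what the tech provider had actually configured

def needs_further_checks(estimated_age: float, challenge_age: int) -> bool:
    # Anyone estimated below the challenge age is asked for stronger proof of age.
    return estimated_age < challenge_age

# A 17-year-old whose age is over-estimated as 21:
print(needs_further_checks(21, REPORTED_CHALLENGE_AGE))  # True:  gets challenged
print(needs_further_checks(21, ACTUAL_CHALLENGE_AGE))    # False: slips through
```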
40:54
But I think what's really fascinating to me, Mike, is the
40:57
release of this story and the timing
40:59
of it. Okay, so last week we
41:01
talked about the Online Safety Act, the
41:03
brand new but long in the making
41:06
legislation in the UK finally being rolled
41:08
out and intermediaries in the UK being
41:10
liable under the OSA. Only a week
41:12
into that being true do we have
41:15
this announcement. And the announcement is actually
41:17
not related to the online safety act
41:19
at all. It's in relation to a
41:21
regulation that predates the OSA. It was
41:24
in existence before the OSA came to
41:26
being. So basically Ofcom could have brought
41:28
this and did obviously bring it against
41:30
only fans at any point. But it
41:33
waited until the OSA had been rolled
41:35
out. I think to potentially give the
41:37
impression of Ofcom kind of doing its
41:39
job and being the kind of enforcement
41:42
power that it wants to be seen
41:44
to be and I read around the
41:46
reports about this story; the Guardian, the
41:48
FT and others don't mention the OSA
41:51
or the kind of more niche regulation
41:53
that this enforcement is brought under so
41:55
it kind of gives the impression to
41:57
the unsuspecting eye that actually this is
42:00
related to the OSA and that Ofcom
42:02
is suddenly kind of doing its job.
42:04
So the kind of cynical journalist in
42:06
me thinks that this has been timed to coincide
42:09
very well with the OSA last week.
42:11
It's actually nothing to do with that.
42:13
But I have heard on the grapevine
42:15
that there is some enforcement being prepared
42:18
around the OSA and there's some naturally,
42:20
as we've talked about on the podcast,
42:22
there are some platforms that have been
42:24
closely looked at. What did you make
42:27
of the kind of timing of this?
42:29
Are you cynical as I am? Yeah,
42:31
I mean, I don't know. The timing
42:33
might be right just based on like
42:36
apparently OnlyFans alerted Ofcom to the
42:38
error in January of 2024, which you
42:40
know, a little over a year ago,
42:42
there was the investigation in the back
42:45
and forth and then sort of figuring
42:47
out what it was going to be.
42:49
The timing seems about right. I mean,
42:51
having it come out now is not
42:54
like totally out of the ordinary. Oh,
42:56
look at you being all friendly
42:58
to Ofcom. What have I made
43:00
you do? But yes, I mean it
43:03
is entirely possible that the exact timing
43:05
of the release may have been let's
43:07
say push back a few weeks or
43:09
a month or something Recognizing that the
43:12
OSA was about to go into effect
43:14
and that everybody would be looking to
43:16
Ofcom and to see how and when
43:18
they actually started enforcing things under the
43:21
OSA and so you know It wouldn't
43:23
surprise me if the timing was massaged
43:25
in some way to make this work
43:27
But, you know, we'll see when the
43:30
actual... enforcement's come out, but yeah, it'll
43:32
be interesting to see and to see
43:34
whether or not this has any impact.
43:36
And if people think like, oh, okay,
43:39
Ofcom is actually trying to enforce stuff,
43:41
it's possible. I do wonder, I mean,
43:43
you could argue too that Ofcom was...
43:45
man, I'm going to defend Ofcom again.
43:48
They might reasonably have been concerned that
43:50
if they had announced this, you know,
43:52
three weeks ago, that it would confuse
43:54
people, because like it was pre-OSA.
43:57
And so it's like, well, wait, I
43:59
thought this law isn't going into effect
44:01
for two more weeks. So, you know,
44:03
why are they enforcing it now? And
44:06
so there may be some reasons where
44:08
it actually did make sense just to
44:10
keep everybody else from being too confused
44:12
by it. Yeah, it's just the way that the coverage
44:15
has transpired and I know it's difficult
44:17
for journalists to necessarily know every nut
44:19
and bolt of every single piece of
44:21
legislation and what it refers to but
44:24
it was something that pricked my ears
44:26
at least. Let's round up Mike on
44:28
a slightly kind of more interesting quirky
44:30
story that you found: a platform that
44:33
we may end up referring to in
44:35
a future episode of the podcast in
44:37
the opening section. We have to see
44:39
what the prompt is. Tell us about
44:42
Sez Us, this new platform. I
44:44
had heard about it, I think late
44:46
last year, there was some talk about
44:48
it, and it is a sort of
44:51
another Twitter-like platform. It was created by
44:53
Joe Trippi, who is a sort of
44:55
semi-famous political consultant figure, sort of became
44:57
famous in 2004 as the campaign manager
44:59
for Howard Dean and his sort of
45:02
upstart internet-fueled campaign. And then ever since
45:04
then has been sort of in and
45:06
around specifically democratic politics in the US.
45:08
And so late last year, there was
45:11
some talk about how he wanted to
45:13
set up his own social media platform
45:15
and he wanted to do something different
45:17
and it was going to be more
45:20
respectful. And, to some extent, like
45:22
we've heard all that before you know
45:24
like lots of people said that especially
45:26
after Elon Musk took over, it
45:29
was this idea that like oh and
45:31
usually presented in a way where you're
45:33
just like, wow, this person is incredibly
45:35
naive about the realities of human beings,
45:38
and let alone getting a bunch of
45:40
them together. But there do seem to
45:42
be, as the app is now officially
45:44
launched, they do seem to have created
45:47
a few interesting elements to it that
45:49
I think will be worth seeing how
45:51
well they catch on. And so in
45:53
particular, rather than just relying on like
45:56
a... team of trust and safety officials
45:58
to determine who is violating the rules
46:00
and who isn't, there is an element
46:02
of sort of crowdsourcing stuff where they've
46:05
built in, they call it a reputation
46:07
engine, where users themselves get to
46:09
rate other people's posts. And so it's
46:11
sort of a mix of like community
46:14
notes and Reddit upvotes and
46:16
downvotes and even a little bit
46:18
of like Wikipedia elements to it where
46:20
it's like sort of crowdsourcing reputation, and
46:23
the idea being that users who have
46:25
a high score, their content rises to
46:27
the top, users that have a low
46:29
score, their content will not be as
46:32
readily visible. It'll still be there, but it
46:34
won't be worked into algorithms or be as
46:36
readily shared and viewable. And, you know,
46:38
it's an interesting idea. I have questions
46:41
about how well it'll work in practice.
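As a purely illustrative sketch of that mechanic (the names, weights, and formula here are invented, not Sez Us's actual reputation engine), crowdsourced, reputation-weighted ranking looks roughly like this:

```python
from dataclasses import dataclass, field

@dataclass
class User:
    handle: str
    reputation: float = 1.0  # nudged up or down as others rate this user's posts

@dataclass
class Post:
    author: User
    text: str
    ratings: list = field(default_factory=list)  # e.g. +1 / -1 from other users

def rate(post: Post, score: int) -> None:
    """Record a community rating and adjust the author's reputation."""
    post.ratings.append(score)
    post.author.reputation = max(0.1, post.author.reputation + 0.05 * score)

def ranked_feed(posts: list) -> list:
    # Low-reputation authors aren't removed; their posts just surface less readily.
    return sorted(posts, key=lambda p: p.author.reputation * (1 + sum(p.ratings)), reverse=True)
```

The Digg story later in this discussion is the cautionary version of the same idea: once some users' ratings count for much more than others', that rating power itself becomes something worth trading on.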
46:43
You know, as soon as you get
46:45
into this kind of thing, you worry
46:47
about things like brigading and deliberate attacks
46:50
on certain kinds of speech and how
46:52
it can be abused and how, I
46:54
mean, there's always fears about things like
46:56
echo chambers and stuff which I think
46:59
might be a little bit overblown. But
47:01
it's an interesting and different approach. And
47:03
right now, the one thing that I
47:05
do believe very strongly, is that the
47:08
more experiments the better. And so I'm
47:10
happy to see an experiment. I'm happy
47:12
to see how it works. I might
47:14
be a little skeptical that this will
47:17
work out as well as they sort
47:19
of think it will. But again, what
47:21
we need right now is experiments, we
47:23
need differentiation, we need people to try
47:26
different things. And so I'm excited to
47:28
see them enter the space. Yeah. I
47:30
think it's an interesting feature and an
47:32
interesting idea that might kind of inversely
47:35
affect how platforms moderate and act as
47:37
a... way of slowing down harmful or
47:39
egregious content. It kind of made me
47:41
think a bit about the challenges of
47:44
breaking through on a platform like that
47:46
though. If somebody has a reputation and
47:48
they have built a reputation on a
47:50
platform, does that mean that a kind
47:53
of other voices can emerge and content
47:55
can emerge in a way that... You
47:57
definitely have a fear of like this
47:59
becomes a sort of winner-take-all situation and
48:02
the people with the most clout and
48:04
the most power sort of stay that
48:06
way. You have a little bit of
48:08
that no matter what on any social
48:11
media platform. Obviously people with bigger audiences
48:13
just have bigger audiences as this is
48:15
the natural way things are. But yeah,
48:17
there is a concern that this leads
48:20
into there's going to be a strata
48:22
of users who are like the royalty.
48:24
and have all the power. And we
48:26
have seen how that has failed on
48:29
other platforms. And so most notably, Digg,
48:31
which is now coming back, apparently, you
48:33
know, which was like the early version
48:35
of Reddit, where people would vote up
48:38
and vote down stories. They had some
48:40
sort of ranking system where particular users,
48:42
their votes, if they were considered good
48:44
signalers, their votes counted more. And that
48:47
got to a point where it was
48:49
actually kind of crazy where like the
48:51
leading users on Digg. We had this
48:53
happen to us. In fact, someone came
48:56
to us and said, hey, I have
48:58
like strong power signal on Digg. Do
49:00
you want me to promote Techdirt articles?
49:02
And I know that in some cases,
49:04
there were people who were like. sort
49:07
of selling their ability to do that.
49:09
That wasn't the person who approached us,
49:11
was just like, I like Techdirt articles,
49:13
I would vote them up. And I actually
49:16
told him no, because I felt that
49:18
that was, I felt like cheating to
49:20
rely on someone like that. But like,
49:22
once you have that kind of power,
49:25
then there's corruption potential there. Yeah, I
49:27
remember talking to a guy called Rob
49:29
Allam, whose username on Reddit is
49:31
GallowBoob, and he had a ton
49:34
of karma. He's like one of the
49:36
most kind of
49:38
like... largest Reddit users for karma. He also
49:40
used to get loads of approaches from
49:43
companies and brands, in which he was
49:45
kind of invited to basically shill on
49:47
behalf of the company and he was
49:49
pretty principled about it. But there are
49:52
all these kind of unintended consequences of
49:54
focusing on reputation, and it relies on
49:56
having mitigating systems in place to, I
49:58
guess, avoid that. So really interesting experiment
50:01
will be interesting to see if that
50:03
can scale and if people enjoy or
50:05
see if those ideas proliferate onto other
50:07
platforms as well. Yeah, and I'll note
50:10
too that I do appreciate the fact
50:12
that they're trying to build this in
50:14
a decentralized way, using the decentralized social
50:16
networking protocol, which is the Project Liberty
50:19
Project, and there's a few different social
50:21
media apps that are using that. as
50:23
someone who believes in protocols over platforms
50:25
and decentralization. I'm excited that they're doing
50:28
that rather than trying to build up
50:30
a brand new thing from scratch. Yeah,
50:32
indeed. Great. Thanks, Mike. That brings us
50:34
to the end of today's episode. Thanks
50:37
to our listeners for tuning in. If
50:39
you enjoyed today's episode or if you
50:41
didn't hate it, you know what to
50:43
do. Let's raise the bar, okay? Last
50:46
week, it was, it was, didn't hate
50:48
it. This week, this week. Let's get
50:50
some, we really, really like the podcast.
50:52
Okay, if you really, really like the
50:55
podcast, I think these three reviews have
50:57
gone to your head. But yeah, okay,
50:59
if you really, really like the podcast,
51:01
leave us a review in which you
51:04
tell us that you really, really like
51:06
it and we will really, really like
51:08
you. And if you like the podcast
51:10
enough to sponsor an episode, we are
51:13
in the market for sponsors, you get
51:15
a mention at the end of the
51:17
podcast and an excellent 10-minute interview with
51:19
one of us and our listeners are
51:22
growing all the time and we're getting
51:24
lots of really great feedback so get
51:26
in touch: podcast at ctrlaltspeech.com. Thanks
51:28
for your time as ever Mike. It's
51:31
great to chat to you. Thanks for
51:33
all the listeners tuning in and we'll
51:35
speak to you next week. Thanks for listening
51:37
to Control Alt Speech. Subscribe now to
51:40
get our weekly episodes as soon as
51:42
they're released. If your company or organization
51:44
is interested in sponsoring the podcast, contact
51:46
us by visiting ctrlaltspeech.com, that's CTRL
51:49
alt speech.com. This podcast is produced with
51:51
financial support from the Future of Online
51:53
Trust and Safety Fund, a
51:55
fiscally sponsored multi-donor fund at
51:58
Global Impact that
52:00
supports charitable activities to
52:02
build a more
52:04
robust, capable, and inclusive
52:07
trust and safety
52:09
ecosystem.