Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
Mike, it's not a website that I go to very often,
0:02
okay? But the Public Access
0:04
to Court Electronic Records site,
0:07
better known as PACER to its fans, I know, is
0:10
one that you go to a lot, and we'll talk about
0:12
why shortly. But when you go to the website,
0:15
it prompts you at the top of the page by
0:17
saying, what can we help you accomplish? So
0:20
today I ask you, what can we help you
0:22
accomplish?
0:23
Well, in an ideal world, you
0:25
would find me nine better justices
0:28
for the Supreme Court who actually understand
0:30
the Constitution.
0:33
Oh God. Okay. That's,
0:35
that's a foretelling if ever I heard one,
0:37
Yes. What about you? What can we help you accomplish
0:39
today, Ben?
0:40
Well, my New Year's resolution, Mike, was to
0:42
be much more Zen and to be much
0:45
more, you know, calm
0:47
and to, you know, meditate from
0:49
time to time and to really kind of just
0:51
take things slowly and
0:53
this week has not helped
0:56
at all. And so it's
0:59
testing my resolve. And to be honest, I don't think you can help
1:01
me at all. I'm beyond help. Let's
1:03
get started. Hello
1:13
and welcome to Ctrl-Alt-Speech, your
1:15
weekly roundup of the major stories about online
1:17
speech, content moderation, and internet
1:20
regulation. It's January the 10th, 2025.
1:23
And this week's episode is brought to you with financial support
1:25
from the Future of Online Trust and Safety Fund. This
1:28
week, we're talking about Meta's big policy announcement,
1:31
TikTok's oral arguments in the Supreme Court.
1:33
And much, much more. My name is
1:35
Ben Whitelaw. I'm the founder and editor of
1:37
Everything in Moderation. And I'm
1:39
back with Mike Masnick founder
1:42
of Techdirt, and it's 2025. Mike,
1:45
congratulations for making it this far. Happy
1:48
new year.
1:49
yes.
1:49
Is it too late to say that?
1:50
No, no, no. It is not too late to say Happy
1:53
New Year. Happy New Year to you. Happy New Year to all of our
1:55
listeners. Welcome back. We
1:57
are, I guess, glad to
1:59
be back. It's
2:02
like, there's a lot to talk about
2:04
and, uh, not all of it is,
2:07
uh, good news. So, happy New Year. But
2:09
yeah, it is a new year, and it's
2:11
going to be one heck of a year, Ben.
2:14
Don't make me look up "glad" in the dictionary.
2:16
I'm not sure "glad to be here" is
2:19
the thing, but no, it's lovely to be back. It's lovely
2:21
to be talking to the listeners again. We had
2:23
a nice break over Christmas. People continue
2:25
to listen to old episodes of the podcast, which
2:27
is great. So we saw a lot of listeners
2:30
tuning in between the Christmas dinner
2:32
and new year celebrations.
2:33
And I was just going to say, you know, we
2:35
always say to rate, review, subscribe,
2:38
but I'm going to add one other thing to
2:40
that, which is tell other people about the
2:42
podcast. This is, this is one of the things
2:44
that I think, you know, word of mouth goes a really long
2:46
way in helping to
2:48
spread podcasts and, let's get beyond
2:50
just the rate, review, and subscribe requests.
2:52
And if you like the podcast, please tell
2:54
other people about it. We really appreciate it. We
2:57
know that is how a lot of people find out about
2:59
it. And, uh, it always helps.
3:01
Yeah, I was having a chat with a tech lawyer
3:03
in London earlier today. And she said
3:05
that she recommends the podcast to all of her team,
3:07
Mike. Um, and so whenever
3:10
new starters come into the firm, she
3:12
recommends Ctrl-Alt-Speech. So, if other
3:14
people can do the same, we'll be much happier for
3:16
it. So yeah, great to be back. And,
3:18
uh, yeah, really excited about the start
3:20
of this year. As well as the podcast returning,
3:23
you know, we've got lots of stuff going on ourselves.
3:25
Everything in Moderation and Techdirt have
3:27
big years ahead of them. And we'll
3:29
talk a bit about that over the coming weeks. I
3:31
will flag that EIM is doing its
3:33
first in-person meetup.
3:36
Do you have plans on the 30th
3:38
of January, Mike?
3:41
I will have to check my calendar, but I think
3:43
I will be half a world away from you in London,
3:45
unfortunately.
3:46
The QE might be a lot,
3:49
but, um, you are invited to the
3:51
tech policy event that I'm hosting with,
3:53
Mark Scott from Digital Politics and
3:56
Georgia Iacovou from the excellent newsletter,
3:58
Horrific Terrific. That's going to be
4:00
a kind of very informal look ahead to what's happening
4:02
this year in the tech policy online
4:05
speech space. We'll be doing a bit of a Q&A.
4:07
Lots of interesting folks have already signed
4:09
up. Hopefully people who
4:11
are listening to the podcast and are near London or
4:13
in the UK can come and join us. Even
4:16
if you can't,
4:19
check. I probably will not be able to
4:21
grab a quick flight to London, unfortunately.
4:24
Well, don't, you know, just think about it. You
4:26
don't have to commit now. We should note right
4:29
at this stage that there are some very
4:31
interesting and impactful oral
4:33
arguments happening at the Supreme Court right now around
4:35
the TikTok ban and you are getting kind
4:37
of messages live on
4:39
your screen, Mike, right from people who
4:41
are tuning in. This is being recorded
4:44
as it all happens.
4:45
Yeah. Yeah. And so, so we're not
4:47
going to go too deep into that, but yeah, I,
4:50
there is a sort of
4:52
group chat of some First Amendment lawyers
4:54
I know who have been listening
4:57
and sending a bunch of messages about it. And
4:59
so I think it's just concluding kind
5:01
of as we're recording this, so
5:03
obviously we're not going to go too deep on it. Just
5:06
from the impression that I've gotten from
5:08
the little bits of the oral arguments that I've heard
5:10
or what I've read this morning, I don't think
5:12
it's gone particularly well for
5:14
the TikTok side of things. I
5:17
do get a sense that,
5:19
as I alluded to
5:21
in the opening, the justices
5:24
seem a bit confused.
5:26
and I will note that literally
5:28
the day after Christmas we filed
5:31
a brief in the case, and
5:33
we were trying to argue specifically
5:36
for the Supreme Court to understand the First Amendment.
5:38
And I don't think they got
5:40
the message. There were obviously lots of
5:42
other briefs from other amici,
5:44
as they are called, many of whom were
5:46
arguing the same thing, or not the same
5:48
thing; we had a slightly different argument, but
5:51
many are arguing along these same lines about the importance
5:53
of the First Amendment. And it does not appear that the message
5:55
got through, to very many
5:57
of the justices. It's
5:59
always a little difficult to sort of read the tea
6:01
leaves from the justices' oral
6:04
arguments. And they
6:06
basically did what they often do, which is
6:08
push back on everyone who was speaking.
6:10
Because there was TikTok, there was a representative,
6:13
the lawyer for the users, who spoke, and then
6:16
the U.S. government. And they did push back
6:18
on the U.S. government, the solicitor
6:20
general in ways that suggest like maybe
6:22
they are skeptical of some of her arguments
6:24
But the real focus seemed
6:26
to be on... there's always been this
6:28
conflict between the data privacy
6:31
concerns and the speech concerns
6:33
and the people pushing for the law have
6:35
always done a really good job of conflating
6:37
them so that if you start complaining about, well,
6:40
the data privacy stuff, you say something like, well,
6:42
why don't you pass an actual data privacy law? Then
6:44
they will immediately jump to, but the Chinese
6:46
propaganda issue. And then you're like, but that's a free
6:48
speech issue. And then they'll say, but the
6:51
national security concerns about the data privacy. So
6:53
there's like this weird dance where they're always kind of
6:55
switching back and forth. And one of the hopes
6:57
was that, at least at the Supreme Court,
6:59
that these nine justices would
7:02
be able to separate out those two issues
7:04
and the early impressions from the
7:06
oral arguments was that they were having a really difficult
7:08
time and they were falling for that trick
7:10
where, if you push on
7:13
one side, the data privacy or
7:15
the speech, they jumped to the other.
7:17
And that, to me, is really problematic
7:19
because it's like, okay, these are two separate issues and you can
7:21
separate them out and look at each of them independently
7:24
and then have arguments about each of them. But
7:26
the trick where you talk about
7:28
one and when you begin to realize like the argument
7:30
is falling apart, you immediately jump to the other.
7:33
Feels like a really dangerous dodge
7:35
and it feels like it was working on the
7:37
Supreme Court. And so that, that's a big concern
7:40
for me.
7:40
Interesting. And so you were going to go deeper
7:43
on this next week, because this
7:45
is all coming to a head pretty quickly. Just remind us of
7:47
the timings of this, Mike.
7:48
Yeah. So the ban is supposed to go into
7:51
effect on the 19th, which is
7:54
I guess, the Sunday of the following week.
7:56
And so the Supreme Court effectively
7:58
needs to rule in some way or
8:01
not before that. There is the slight
8:03
possibility, which was raised during the oral
8:05
arguments, that they could abide
8:07
by what Donald Trump asked
8:09
them to do, which is a whole other issue,
8:12
which was to just sort of put the
8:14
whole thing on hold until he was in charge,
8:16
which is one day after
8:19
the deadline. And so that is a possible
8:21
resolution, but the more likely thing is that
8:24
sometime next week, probably
8:26
just as we sit down to record just
8:28
to mess with us,
8:29
It's always the way, isn't it?
8:31
the Supreme Court may come out with its ruling
8:33
on this, in terms of, you know, what
8:35
happens. And if they rule
8:37
that the law is valid, which now seems like
8:40
a decent possibility,
8:42
TikTok could effectively be turned off. There's,
8:44
there are all sorts of questions about what does that
8:46
actually look like, because the real
8:49
legal mechanisms for how that works are actually
8:51
much more complicated. And it actually depends on like,
8:54
Apple and Google no
8:56
longer allowing you to download new versions
8:58
of it, but people who still have the app might still
9:00
have it. And there's a question of whether or not other
9:02
ISPs have to block it in the interim,
9:04
which is not entirely clear, and I've heard
9:07
arguments going both ways on that. So,
9:09
you know, during the oral arguments, TikTok's
9:12
lawyer effectively said, like, they would turn it
9:14
off, that's the way this goes, but
9:16
that's not clear that they actually have to turn it off,
9:18
and he wasn't totally committing to it. So,
9:22
You know, not entirely clear, but
9:24
TikTok could go away in 10
9:26
days from now or nine days from
9:28
now. And so we don't know
9:30
for sure. There's
9:32
a lot up in the air right now. And,
9:34
and, you know, my biggest concern about
9:37
this, and this is why we filed
9:39
an amicus brief in the case, is that
9:41
the ruling, because it's so sort of
9:44
mixing these issues of the speech and the data
9:46
protection and China and propaganda,
9:49
means that the actual First Amendment concerns
9:51
get lost in that. And the ruling that comes out of
9:53
this could really, really undermine
9:56
the First Amendment in very, very significant
9:58
ways, no matter what you think of TikTok
10:01
and ByteDance and its connection to China.
10:04
And the oral arguments this morning did not
10:06
give me any reason to feel better
10:08
about that. We'll see what the final ruling is,
10:10
but I'm deeply concerned about
10:13
the larger impact of the ruling, not
10:15
specifically the impact on this one particular
10:17
app.
10:18
Yeah. Okay. That's a helpful summary
10:20
based upon what is a very live story.
10:23
You mentioned the U.S. government setting
10:25
a kind of dangerous precedent, Mike, that is a very
10:27
helpful segue into
10:30
our first story, which is something that everyone
10:32
listening to this podcast will no doubt have heard a little bit
10:34
about this week, which is the
10:37
Meta announcement, Mark Zuckerberg's
10:39
now-famous five-minute video.
10:41
Did they say something?
10:42
They, I don't know if you heard. Yeah, yeah, yeah.
10:45
Yeah. He's got a new watch. Did you
10:47
Oh, oh, oh,
10:49
I hope it's an expensive watch.
10:51
Yeah. I can see you've got your watch
10:53
on there. I'm guessing that's not
10:56
$900,000 worth.
10:56
No, no, this was a free
10:58
watch, to be honest. It
11:01
was definitely not $900,000.
11:04
Well, Mark Zuckerberg wore his
11:06
special watch for his big announcement
11:08
this week. The kind of summary,
11:10
the headline, was "more speech, fewer
11:12
mistakes." And he
11:15
set out a new vision, Mike,
11:17
as you would have seen for Facebook, going back
11:19
to what he described as its kind of core principles
11:22
of free expression and speech, and
11:24
he laid out a kind of five-point plan
11:26
for how he planned to do that. So for people
11:29
who maybe were hiding under a rock and didn't
11:31
necessarily hear the announcement, I'm
11:33
going to quickly kind of run through those
11:35
five points, six points if you include
11:38
what I think is one of the most insane aspects of it. And,
11:40
and rather than doing it in order, Mike, I'm going to suggest I
11:43
do it on a sliding scale of insanity,
11:46
if you'll allow me to do that,
11:48
Yes, please, please.
11:50
for all intents and purposes, and it was an announcement
11:52
that, you know, riled people up for
11:54
lots of reasons and got an awful lot of news
11:56
coverage. And I don't necessarily think we
11:58
should take it all seriously. So I'm going to kind of do
12:00
it in order of what I think is the kind of least insane
12:03
to the most insane. And then I want to get your thoughts on
12:05
this, listeners. If you're listening to this
12:07
and you have thoughts on the order, get
12:09
in touch with us at
12:11
podcast@ctrlaltspeech.com as well. We want to hear
12:13
from you. We'll share back some of your thoughts next
12:15
week. So in order, Mike, okay.
12:18
The five things, six things were
12:20
replacing fact checkers, simplifying
12:22
policies, reducing mistakes,
12:25
bringing back civic content, moving
12:27
trust and safety to Texas, and working
12:30
with Donald Trump to push back against governments.
12:33
Okay. Those are the six things that he did, in
12:35
order. My order is
12:37
thus. Okay. The first
12:39
one was bringing back civic content.
12:42
And this is the idea that people all of a sudden
12:44
want politics on the platform.
12:47
They decided in 2021
12:49
that actually politics wasn't for them; civic content,
12:52
as they kind of termed it, actually was causing division
12:54
and users were feeling stressed by it. All
12:57
of a sudden, surprisingly,
12:59
this is being brought
13:00
no, no stress at all anymore about politics.
13:03
Yeah, they, like me, have solved their
13:05
meditation and, you know, state
13:08
of mind issues and they're bringing
13:10
back civic content. This is probably the
13:12
least insane, still a bit insane, but the least insane
13:15
because news and politics content
13:17
is important for people to navigate their lives.
13:19
And that was the big criticism back in 2021. You
13:22
know, so there's a case for bringing this back. There's
13:24
clearly a political element to this and,
13:26
you know, as we'll see when we marry it with other
13:29
parts of the announcement, actually,
13:31
having particular political
13:33
outlets and political speech on the platform
13:35
is good for a certain president-elect.
13:39
So that's why it's, it's
13:41
insane, but the least insane.
13:43
Okay. Stick with me.
13:44
Okay.
13:46
Number two is replacing fact checkers
13:48
and bringing in a community notes system
13:51
to help fill the gap. Now, fact checking
13:53
is debated widely for its
13:55
efficacy. People criticize it.
13:57
People have said that it's slow, that it doesn't necessarily do
14:00
the job that you would like it to,
14:02
and since it was brought in around 2016,
14:04
post the last Trump presidency,
14:07
it has received a lot of flak. I personally
14:10
think that you don't necessarily know the effects
14:12
of fact checking until it's gone.
14:15
And, I'm interested to see how this will pan
14:17
out, but actually, you know, the
14:19
insane part of this for me is the idea that you can get a system
14:21
like Community Notes to come in and do
14:23
as good a job because there's a whole raft
14:26
of issues with X slash Twitter's Community
14:28
Notes product. Again, it's
14:30
very slow; there is some research that says
14:32
it is effective in part, but it's not the
14:34
kind of panacea that I think Zuckerberg is painting it out
14:36
to be. So that's why it's my number
14:38
two on the insanity spectrum. Number
14:40
three is reducing mistakes.
14:43
Okay. So this is Zuckerberg saying
14:45
that he was going to essentially catch less bad
14:47
stuff. That's literally how he put it:
14:49
by changing the filters and
14:52
what the AI systems and the
14:54
automated systems were going to catch. And
14:56
it's going to focus particularly on the most egregious stuff
14:58
now. So less of the kind of lower-level
15:00
harms that perhaps it did in the past. And,
15:04
you know, suppression of speech via these automated systems
15:06
has been something that has been in the news a lot.
15:08
It particularly affects underrepresented groups.
15:10
There was a huge report put out by
15:12
BSR a few years ago about
15:14
the Palestine-Israel conflict, where
15:17
automated suppression of speech was a huge issue. Meta
15:19
has been criticized for this significantly, and
15:22
in and of itself, this isn't really an issue. However,
15:25
when you combine this with the policy
15:27
changes that we'll talk about in a second and
15:29
the civic content changes as well, I think
15:31
this is going to be a really
15:33
serious issue. So that's why it's number
15:35
three. Number four, Mike, is
15:38
moving moderators to Texas,
15:40
or specifically, if you really tune
15:42
into what he says, moving trust and safety
15:44
and moderation out of California,
15:46
he doesn't say where to, and moving content
15:48
reviewers to Texas. Aside
15:50
from the fact that there has been lots of content moderation
15:53
done in Texas for a long time, and we know
15:55
that because there was a class action brought by
15:57
moderators in Texas against Meta, this
15:59
is just a giant signaling move. And,
16:02
I don't know if you saw the Lawfare
16:04
webinar with Kate Klonick, Daphne
16:07
Keller and others, but they made the point that this
16:09
is just this kind of giant anti-California,
16:12
hand-waving message, like anti
16:14
the coastal areas. Sorry, sorry to
16:16
cause offense. And
16:19
so again, kind of insane in its own right,
16:21
just like a complete signaling move. Number
16:24
five, and I'm getting into the kind of serious
16:27
insane territory now. This is, we're talking
16:29
batshit levels, was this point around
16:31
working with Trump to push back on
16:33
governments going after US companies
16:36
and, quote, censoring more. Lots
16:38
of people have quoted him accusing
16:40
the EU of institutionalizing
16:43
censorship. I can't even say it without laughing
16:46
and the, quote, secret courts
16:48
in Latin America, which are a clear reference
16:50
to the issues in Brazil
16:52
that Elon Musk has faced and,
16:54
you know, again, insane,
16:56
you know, whose idea was it to set
16:58
up a system in which he's siding
17:01
with the U.S. government in order to bring about more
17:03
free speech? It doesn't make any sense.
17:05
And we can talk more about that. And then lastly,
17:08
but you know, clearly most egregiously
17:10
is the simplifying policies element
17:13
of this whole announcement. On the face of
17:15
it, you know, simplifying policies, not a bad
17:17
thing, but he calls out immigration
17:19
and gender. He flags
17:22
the fact that transgenderism is something that he's
17:24
kind of looking to address. The language
17:26
is very, very coded and very, very
17:28
specific. And since then, we've seen
17:30
some of those policies start to be announced
17:32
and leaked to the press.
17:35
And some of the examples
17:37
now in the policies are abhorrent,
17:40
you know, they are calling out that trans
17:43
people are now allowed to be deemed
17:45
unreal, you know, the worst kind of dehumanizing
17:48
language you can come up with, you can
17:50
now say on the platform, according
17:52
to these leaked documents. And so, what
17:54
I've tried to do there, Mike, is give a summary of all
17:56
of those mini-announcements in the order
17:59
in which I think they're the most maddening.
18:01
Yeah,
18:06
this is, I mean, this is the problem with
18:08
all sorts of things these days, which
18:10
is that, there is a lot of complexity and nuance
18:13
in here, and so much of just
18:15
the levels of bullshit that exist
18:17
are really wrapped around some kernel
18:19
of accuracy or truth so
18:22
that, like, if you attack it, people will say, yeah,
18:24
but there's a real problem here. And yet
18:26
it's presented in a way that is so misleading
18:28
and so twisted. So I
18:31
think that's true here. And so I think your
18:33
order is more or less
18:36
reasonable, but all of this
18:38
is under the backdrop that
18:40
so much of this is done for completely
18:42
nonsense reasons and is all designed
18:45
to do that. So, like for years
18:47
I've called out their suppression
18:50
of civic content or political
18:52
content. And so the reversal
18:54
of that, sure, you can say like, yeah,
18:56
that makes sense. They never should have done that in the first
18:59
place. That was always a mistake, but
19:01
you look at like, you don't even have to go
19:03
that deep. Just look at the dates of
19:05
when they started this policy and when they ended
19:07
it. They started it right after Biden
19:10
won. They ended it right
19:12
after Trump won again. Right.
19:14
So it's like, so obviously
19:17
political. And, you know, the thing that
19:19
really gets me, this comes
19:21
after the backdrop of, earlier,
19:24
in the summer of 2024, you
19:27
know, Zuckerberg sent this like groveling
19:29
message, which we talked about, to Jim Jordan.
19:31
And then there was this New York Times
19:34
article with the headline,
19:36
"Mark Zuckerberg is done with politics."
19:39
And it's like, what all of this makes clear is
19:41
like, no, of course he's not done with politics.
19:43
He's done with Democratic
19:45
politics, but he's happy to suck
19:47
up to Republican politics. So
19:50
when you look at it in the backdrop that
19:52
way, it reminds me of something.
19:54
This is not exactly the same thing,
19:56
but, like, in the copyright fights
19:59
going back to the earlier part of the
20:01
2000s, there's a wonderful
20:04
congressional representative, Zoe Lofgren,
20:06
from California. She's not my representative,
20:08
but nearby. And she was always very
20:11
good on copyright issues. And based
20:13
on the way seniority worked at one point in
20:15
the 2000s, she was
20:17
lined up to head the
20:20
IP subcommittee for
20:22
the Judiciary Committee. But because
20:24
she's actually good on copyright issues, or
20:26
in agreement with me, so I will, you
20:28
know, subjectively say that she's good
20:30
on it, they killed that subcommittee
20:33
as soon as she was up to head it. And
20:35
then as soon as the
20:37
next person was up to head it, they brought
20:40
it back. And this strikes
20:42
me as the same thing. It's like, you look at the timing
20:44
of when you kill a program or when
20:46
you start the program. And if it is
20:48
clearly designed to like stop a certain thing
20:50
from happening, there's this bad reason
20:52
behind it. So even though I think the policy
20:54
was dumb, changing it for
20:57
this reason is still insane. So
20:59
even if that's your least insane, same thing with the fact
21:01
checkers. You know, from the beginning,
21:03
I've always said like I think fact checking
21:05
as a concept is important, but
21:07
the setup of the way that social
21:09
media companies have done fact checking, I think, has been
21:12
pretty much ineffective for
21:14
a variety of reasons. And we don't need to go
21:17
into the deeper reasons for why it exists.
21:19
I don't think it's bad that it exists. I
21:21
just don't think it's all that effective. And
21:24
it sort of created this weird vector where
21:26
everyone got so focused on the fact checking
21:28
that the attention
21:30
driven to it and the hatred towards it generated
21:33
a lot more heat than was useful
21:36
for anyone. And so, you
21:38
know, again, like, I don't think it's that big
21:40
of a deal that they're like moving away
21:42
from the fact checking program other than as
21:44
a signal. And so again, against the backdrop
21:47
of everything else that they did, this
21:49
is the one that seems to have gotten people the most worked up.
21:51
And I saw somebody, I've actually now seen
21:53
it twice where people have referred to it as an
21:55
existential threat to truth and
21:57
it's like, no, it's not an existential
21:59
threat to truth. If you don't have a fact checker,
22:01
like other people exist who can fact check
22:04
it, just because you don't have this sort of official
22:06
fact check. So I don't think it's that
22:08
big of a deal. But I
22:10
also feel like the fact checking one in particular
22:13
was sort of used as a bit of misdirection
22:16
because Meta and Zuckerberg
22:18
knew that everyone was going to focus on that.
22:20
So let's throw that out there. Everyone's going to get mad
22:22
about that. And then we're going to do, as you noted,
22:25
like a whole bunch of much more crazy
22:27
shit in the background that is
22:29
way worse and way more concerning
22:31
in the long run. And so, yeah,
22:34
a signal, but like as an
22:36
effective tool, I don't think the fact checking has
22:38
been all that big of a deal. Then
22:41
we move on to where it starts to get really,
22:43
really crazy. Right. And so the
22:45
mistakes thing, we've discussed this
22:47
just recently, like, you know, on the podcast, we've had
22:49
these examples of the really stupid stuff: any
22:52
mention of Hitler, even to say, like, Hitler's bad,
22:54
was getting blocked, or the whole
22:56
Cracker Jack story that came up
22:58
where just saying Cracker Jack or Cracker
23:01
was getting banned. And my one take
23:03
on this, which nobody else has really picked up
23:06
on, but, for all the talk of how
23:08
great the AI is and
23:10
their automation systems are,
23:12
like, this seems to be an admission that,
23:15
no, we're not that good at this. Meta
23:17
has always had problems with content moderation at scale,
23:20
even though they're the biggest and they've had the most experience
23:22
with it. They've always made silly
23:24
mistakes like this all the time. So there
23:27
is this element of like, if this was just
23:29
a recognition, like, yeah, our automated
23:31
systems are bad and we're not
23:33
really good at this. Again, that would
23:35
be interesting. And that would be an interesting
23:37
admission, but against the backdrop of everything else,
23:39
it is still crazy. And so it is still
23:41
this kind of like, yeah, good
23:44
for them to admit that, but they didn't admit it in a way
23:46
that was thoughtful or transparent
23:48
or useful to the world.
23:50
It was done in a way to say, we're
23:53
going to allow a lot more, really horrible
23:55
shit on the platform.
23:56
Yeah.
23:57
so, that's where we start to get into the really
23:59
crazy stuff. The moderators to Texas
24:01
thing. Just even the way that
24:03
Zuckerberg phrased it was basically
24:06
like, we're moving people away from California
24:08
to Texas to stop bias,
24:11
which, in what world
24:13
do you think that people in Texas are less biased
24:15
or like, you know, there's not this sense that
24:17
like people in Texas are neutral
24:19
Well, yeah,
24:20
or unbiased, like, what? Like, no,
24:23
no one believes that. And
24:24
I was listening to this Lawfare webinar,
24:27
and people in Austin apparently
24:29
are not like your kind of, you know, typical
24:32
Republican, you know, like
24:35
Austin is famously not
24:37
like that, yeah.
24:38
Right. So it's, like, a completely arbitrary
24:41
distinction between California and Texas. And
24:43
clearly it doesn't mean anything, but that's
24:45
the level that he was working at.
24:47
Yeah, I mean, it's funny
24:49
because another thing that Zuckerberg
24:51
wrote on Threads in response to some people
24:53
talking about this, he was like, Oh, I forget
24:55
exactly the way he phrased the first part of it, but
24:57
it was something to the effect of: we honestly think
24:59
that this will make the platform better and that will make
25:01
more people use us. Yes. Some
25:04
people might leave the platform,
25:06
due to virtue signaling, but
25:09
blah, blah, blah, blah, blah. And I
25:11
responded to Zuckerberg directly on Threads.
25:13
I don't know if he saw it, but I was like, look,
25:15
this is a fucking tell. Like, you're
25:17
admitting, like just using the phrase virtue
25:20
signaling, first of all, almost everyone,
25:22
I won't say everyone, but almost everyone who uses
25:24
that phrase is using it to be an asshole. And
25:27
take a step back, everything
25:29
that this announcement did and everything
25:32
that Meta has done this week has
25:34
been signaling. I wouldn't call it virtue
25:36
signaling. It's perhaps the opposite
25:38
of that, but to use
25:40
that to sort of dismiss the
25:42
people who might be concerned about
25:44
these changes is just an
25:47
absolute insult on top of all of the other
25:49
things that he's doing.
25:51
Yeah. And they'll be put in danger by them as well. Because,
25:53
you know, this is going to have serious effects, you
25:55
know, particularly the simplification of policies,
25:58
which, to call it simplification
26:00
is mad, because it's
26:02
not just a simplification. It's like a degradation,
26:04
it's a kind of dismantling
26:06
of policy that has been created
26:09
over years.
26:10
It's not. And so there
26:12
is an argument. You can make an argument that
26:14
like, yes, Meta's policies
26:17
probably are way too complex. There's this
26:19
really fantastic episode, if people haven't listened
26:21
to it; it goes back a few years, though they did
26:23
an updated version. The podcast
26:25
Radiolab did this amazing one where they sort
26:28
of embedded a little bit. I think Kate Klonick is in
26:30
it a lot, embedded with
26:32
the Meta, I think at the time
26:34
Facebook, moderation policy
26:36
team. And they walk through in such
26:38
a good way. I mean, just the
26:40
challenges and the nuances and like,
26:42
Oh, we created this rule, but now we have this
26:44
exception and then, Oh, but there's that
26:46
exception. And they talk about the rule book
26:48
and how they have to sort of keep adding in clauses.
26:51
Like, yes, this, but not in this case,
26:53
but if this, and you just have
26:55
to keep writing the rules in different
26:58
and more involved and complex ways.
27:00
And they talk about this process, how the rule book
27:02
just grows and grows and grows because
27:04
you have these exceptions and edge cases and
27:07
all this stuff. And it's fascinating just as a way
27:09
to think through these issues, because people on the outside never
27:11
think through how many of these things involve
27:13
edge cases and stuff. But you can see how
27:15
over time that collects a debt
27:17
of like just complexity and problems
27:19
that lead to other kinds of problems in terms
27:21
of actually how you enforce the rules. And
27:23
so there is this element again, where you can take
27:26
this back into a serious realm and say like, yes,
27:28
I am sure that the rule book at Meta
27:31
is way too complex and could benefit
27:34
from some simplicity, other than the fact
27:36
that there are reasons why all of those exceptions
27:38
come into play and there are all these issues
27:40
involved. But the reality is,
27:42
because we're starting to see... So first there's the public
27:45
policies that are available for people
27:47
to see, that some people called out. Wired
27:49
was the first one to call out some of the changes in there.
27:52
And now what we've seen is what
27:54
I believe are very angry people within Meta
27:56
releasing the internal version of the rule
27:58
book and sending those to various reporters who
28:00
are all rushing to publish them. So we're seeing all sorts
28:02
of stuff about what's happening inside. What
28:04
is happening is not what I would call
28:07
a simplification of the rules.
28:09
What is really happening is very
28:11
clear exceptions written for
28:14
specific culture war issues
28:17
that the MAGA world
28:19
believes are really important for them to be
28:21
able to say things that
28:23
are insulting and harmful and
28:26
targeting specifically marginalized
28:28
people. And what Meta
28:30
is doing is not a simplification of the rules, which
28:33
would be an interesting project to talk about,
28:35
but rather a: we are writing in
28:37
exceptions for the people who are mad
28:39
at us.
28:41
And in that kind of analysis
28:43
we've done there, Mike, we've tried to engage in relatively
28:46
good faith with what was said, because this is a, you know,
28:48
online speech podcast after all, you've got to engage in what
28:50
the platform is doing, but I think you're right. You
28:52
know, there's a lot of this which is not worthy
28:55
of being engaged with in good faith and, like, represents
28:57
the wider company values,
29:00
I think. And we don't want to go into those too much,
29:02
but we should talk a bit about them because I think
29:04
that's really what you're saying there, is
29:06
that these changes, the way that
29:09
Zuckerberg did a five-minute video
29:11
that sat on the top of a blog post that was
29:14
authored by Joel Kaplan, the new
29:17
head of global policy
29:18
the new Nick Clegg.
29:20
The new Nick Clegg, God rest his soul.
29:22
I thought this podcast was going to be
29:24
about Nick Clegg last week when he resigned,
29:26
little did I know what was going to happen in
29:28
the subsequent days. Anyway. Yeah.
29:31
So, the video with the blog post,
29:33
and obviously Kaplan's connections to the Republican
29:35
party, we have to really talk
29:37
about it as a wider, not
29:40
just about speech, kind of announcement,
29:42
don't we? You can't really do
29:44
it in any other way.
29:45
Yeah, the context, as
29:47
with everything and almost everything that we talk
29:49
about, and I try and do, the context matters.
29:51
The context always matters. And it's
29:54
very easy to sort of simplify a bunch of these things
29:56
down, but the larger context
29:58
really matters. And it's not just the switch
30:00
from Nick Clegg to Joel Kaplan, but
30:02
also the new appointments to the board of Meta,
30:04
which also came out last week, including Dana White,
30:07
who's the head of UFC and is a close
30:09
personal friend of Donald Trump and has been
30:11
really engaged in policy issues
30:13
for the sort of MAGA movement. There's
30:15
like this clear declaration that we're now
30:18
going MAGA. And obviously, like, the moving
30:20
people to Texas stuff, like all of
30:22
this nonsense. But the reality is, again,
30:24
when you look at the changes for
30:27
all the talk of, oh,
30:29
the rules are biased against conservatives,
30:32
which has never really been true, you
30:34
know, there's all sorts of research on this and we've
30:36
talked about it, and all this kind of
30:38
evidence and stuff, all that talk, what
30:40
they've done now is bias the rules
30:42
specifically in favor of MAGA
30:45
culture war talking points. The changes
30:47
to the rules are not simplifying. They're
30:49
not clarifying. They are: we are
30:51
creating explicit exemptions
30:54
for the kind of awful
30:56
speech that you want to use to target
30:59
certain communities. And that
31:01
needs to be called out and it needs to be
31:03
really clear because for
31:05
all the talk of, like, oh, all of this has been
31:07
"working the refs," which is kind of the framing
31:10
that comes up a lot, all of the complaints
31:12
about the way that different platforms moderated,
31:14
saying, like, oh, you're biased against
31:16
conservatives, which has never actually been
31:18
true. Now, what they're doing is
31:20
they are biasing the way their moderation
31:23
policies work explicitly,
31:25
like, the language is so clear, I don't
31:27
even want to repeat it, because they have this
31:29
horrifying language of this is what is allowed
31:31
now. And it is clearly
31:34
biased, bigoted speech towards
31:36
certain marginalized communities. That
31:39
will lead to harm and will lead to
31:41
problems. And a lot of this
31:43
is legal speech, and there are arguments for, you know, Meta
31:46
can do what it wants, but the
31:48
signaling here,
31:49
Yeah.
31:50
for all of Zuckerberg's sort of talk
31:53
of virtue signaling, he is
31:55
signaling with this loud and clear
31:57
saying, we want the MAGA
31:59
community to be here and to
32:01
use our platforms to spread
32:03
their hatred.
32:04
Yeah. I'm a fan of saying that moderation
32:07
is political and politics is moderation.
32:09
And this is the kind of week that has, I think, summarized
32:12
that better than any other. How do you think, Mike,
32:14
the virtue signaling, as
32:16
Zuckerberg would call it, is going to play out?
32:18
Do you think people are going to kind of vote with their feet
32:20
and stop using the platforms,
32:22
or do you think the network effects are so big? What
32:25
ramifications are there likely to be?
32:27
Well, I mean, who knows, right? And this is,
32:29
this is the big unknown, right? There is
32:31
the argument that like, this is the playbook
32:33
that Elon Musk tried. And
32:36
it may have been successful in other ways, in
32:38
terms of like electing
32:40
a U. S. president and being close to
32:43
him, but it has not been successful
32:45
for the platform, X in particular;
32:47
It has lost users. It has lost a ton of advertising.
32:50
It has been very unsuccessful as a business
32:53
strategy in
32:55
that realm. And in fact, like,
32:57
it felt like Zuckerberg recognized
32:59
that, because after all, he launched Threads
33:02
as, like, a sanely
33:04
run competitor to
33:07
Twitter slash X. And so
33:09
there was a moment where he recognized that what
33:11
Elon was doing was driving away users. And
33:13
yet now he's kind of doing the same thing.
33:16
And so it will be interesting.
33:19
Somebody pointed out, which I thought was interesting
33:21
and I forget who, and I apologize if you are
33:24
a listener and I am ignoring your contribution
33:26
to this; there's been so much this
33:28
week that I don't remember exactly who
33:30
said what. For all
33:32
of this new policy and big
33:34
changes to the system, it was all
33:37
done through Mark Zuckerberg's post
33:39
and the blog post and the Joel Kaplan
33:41
announcement, but there was no notification
33:43
for users. If you logged
33:45
into Facebook or Instagram, there was
33:47
no pop-up saying our policies have changed.
33:50
So true. Yeah.
33:51
And so that's kind of interesting
33:54
and a little bit problematic. And so you do wonder,
33:56
like for people who don't follow all this stuff,
33:59
how many of them even realize
34:01
this is happening. But I think in
34:03
the longer run, if this does lead to
34:05
what it seems likely to lead to, which is a lot
34:07
more just angry,
34:09
hateful, garbage
34:11
kinds of speech, I feel like,
34:13
people will start to look for alternatives.
34:16
And so it strikes me, as was the
34:18
case with Elon taking over Twitter, as
34:20
an opportunity for third parties to
34:22
come in and sort of try and take that audience.
34:25
Yeah, and we'll talk a bit about a piece
34:27
that Renée DiResta has written
34:30
about some of that. But it's a reminder, I think,
34:32
for all of us, particularly for me, about how
34:34
companies such as Meta are really just vessels.
34:37
They claim to have values
34:39
that they hold, but actually
34:42
they can be filled up with whatever values
34:44
are around at that time. And for a while that was
34:47
certainly more democratically inclined values.
34:50
It was one, you know, that cared about speech
34:52
and emphasized fact checking, and now
34:54
that is a very different set of values. And I think
34:56
that, and trust and safety is a way that those values
34:58
are manifested. You know, Alice wrote
35:00
a really interesting piece about this for EIM
35:03
a few months back, which I'll link to in the show notes. If
35:05
your values change, then naturally your
35:07
trust and safety and your content moderation
35:09
and your speech policies are going to change with
35:11
it. And I think that's what we're seeing here.
35:13
Yeah, I do want to raise one issue, which that
35:16
just reminded me of. It's a little bit different, which is, there
35:18
is this framing in all this which really frustrates
35:21
me. Also, I mean, a lot of this is obviously frustrating
35:23
me, but like a lot of the framing
35:25
of this was Zuckerberg
35:28
and Joel Kaplan saying, like, this is bringing
35:30
Facebook, Meta, back
35:32
to who they were, being about free speech.
35:35
And that is absolute nonsense on
35:37
multiple levels. One, as we said, like the
35:39
policies are not really about free speech.
35:41
They're specifically exceptions
35:44
to allow for really problematic
35:46
speech. But the bigger
35:48
thing is that, like, Facebook was
35:50
never a free speech platform,
35:53
from its earliest days, it had pretty
35:55
heavy moderation and
35:57
pretty specific rules such that they didn't allow
36:00
certain kinds of behavior and certain kinds of speech.
36:02
They never allowed anonymous accounts. They
36:04
always wanted you to use real names. They've
36:06
always had like the no nudity policy.
36:09
They have always been pretty restrictive
36:12
from the beginning. And this idea
36:14
that Facebook was ever
36:16
involved in the free speech project
36:19
strikes me as complete nonsense.
36:21
And why was that, Mike? Just kind of
36:24
journey back through history. Like, what was the reason
36:26
why that happened in the first place? Do you remember?
36:28
I mean, I think it was just sort of like, Zuckerberg
36:30
wasn't in this for free speech. It was never
36:32
about that. I mean, he was trying to build a business
36:35
and, to him, there was no underlying
36:37
like moral imperative to try and help speech.
36:39
I don't think that was true. I think the Twitter people,
36:41
the original Twitter people did believe in
36:43
this kind of, like, ethos of free
36:46
speech and using the internet to enable
36:48
more speech, but Zuckerberg never
36:50
seemed to express that kind of view.
36:52
He was trying to build the biggest business that he could.
36:55
And as we've discussed, like one of the
36:57
ways that you build a big business is by
36:59
having a platform that is safe for brands,
37:02
for example, and others. And
37:04
so that was really the focus of what
37:06
he was doing. So this idea that they're suddenly like,
37:09
we're going back to our roots as a free speech
37:11
platform is definitely
37:13
a historical revision
37:15
of reality.
37:17
Yeah. Okay. Historical revision of reality
37:19
feels like a neat way to summarize
37:22
that. Thanks, Mike. And there
37:24
are other stories that happened this week;
37:26
none of them are quite as big
37:29
as the Meta announcement, but we'll do
37:31
a bit of a review of those other ones. And the
37:33
next one also looks at CEOs of
37:35
social media platforms, Mike, that are working in
37:37
cahoots with Republican government officials. So,
37:40
I'll hand over to you for this because you,
37:42
for some reason, were looking at government
37:45
documents on New Year's Eve. Explain
37:49
that for us. First of all,
37:50
Yeah, I was writing an amicus brief on Christmas
37:52
and on New Year's Eve, I was looking at congressional
37:55
documents. My life is so exciting,
37:57
Ben.
37:59
we're grateful for it.
38:00
Yeah, you know, honestly, I think this is kind of a continuation
38:03
of the same story in some way, which is that on
38:05
New Year's Eve, Representative Jerry Nadler,
38:08
who's the ranking member of the House Judiciary
38:10
Committee, that is, the top Democrat on the Judiciary
38:12
Committee, released a report, which was basically
38:15
the Democrats on the Judiciary Committee
38:17
releasing this report called "The Delusion
38:19
of Collusion: The Republican
38:21
Effort to Weaponize Antitrust and Undermine
38:24
Free Speech." And it's a really great
38:26
report that for no
38:28
good reason was released on December
38:30
31st to guarantee
38:33
that it would get the least attention possible. There's
38:35
been no news coverage of
38:37
this document, as far as I can tell, other than
38:40
a Techdirt post that I published this morning,
38:43
right before we started recording,
38:44
Go and read it. Go and read it.
38:46
And it is
38:48
a systematic and thoughtful breakdown
38:50
specifically of how
38:53
Jim Jordan, who runs the
38:55
Judiciary Committee,
38:56
Good friend of the podcast.
38:58
yes, has weaponized
39:00
the government specifically to help
39:02
Elon Musk go after
39:04
advertisers who pulled their advertising from
39:07
Twitter X. And this is a story that
39:09
I've been telling for a long time, and
39:11
I felt like I was the only one. I
39:14
was sort of screaming into the wind, and we've obviously
39:16
discussed it here about, you know, everything that happened specifically
39:18
with GARM, which is the, you know,
39:20
small nonprofit that was trying
39:23
to work with platforms and advertisers
39:25
to figure out how to keep brands
39:27
safe if they were going to advertise on these
39:30
platforms, which also,
39:32
you know, Twitter slash
39:34
X had excitedly rejoined
39:36
a week before Jim Jordan came out with this report,
39:39
calling it like an antitrust violation
39:41
because they were organizing a boycott
39:43
of Twitter, which was never
39:46
actually true in any real sense.
39:48
And finally, the Democrats come up with this report,
39:51
basically calling bullshit on everything that
39:53
Jim Jordan said, which since turned into
39:55
a lawsuit that Elon Musk filed against
39:58
Garm and a bunch of advertisers, and
40:00
which led to Garm being shut down by
40:02
the World Federation of Advertisers, but
40:04
here's this report. From the Democrats,
40:06
which got no attention, which calls out
40:09
all of this, that there were legitimate reasons.
40:11
There were legitimate brand safety concerns
40:13
that Elon Musk did a whole bunch of things
40:15
that were really bad for brands on Twitter.
40:18
None of this is surprising to you or I, or anyone
40:20
listening to this. I'm sure that there were perfectly
40:23
legitimate reasons that Garm was
40:25
just there trying to sort of help everyone, but
40:27
had no real impact. Oversight, you know,
40:29
over where people put advertising
40:31
had no control over that. Advertisers
40:33
were making all of their own independent decisions.
40:36
There was no collusion. There was no
40:38
coordination effort. There was no official
40:40
boycott or anything. There were just a bunch
40:42
of advertisers who realized that,
40:44
like having your ads next to Nazi content
40:47
is probably not good for your business.
40:50
Probably. I mean, Mark Zuckerberg seems to be betting
40:52
otherwise, but you know, and
40:54
so there was a reason why they did this.
40:57
And then calling out Jim Jordan specifically
40:59
for cherry-picking quotes, for quoting things
41:01
out of context, for making arguments that were clearly
41:03
untrue, and doing all of that in order to
41:06
suppress the speech of these advertisers,
41:08
of GARM, of others, and
41:10
basically trying to force them to act
41:13
in the service of helping Elon Musk,
41:15
the wealthiest man in the world, and a big
41:18
funder of Republican causes, to
41:20
be able to go after advertisers who choose not to
41:22
advertise on the platform.
41:23
Yeah. And there's this great line in it as well, the
41:25
report, which is definitely one to go and read, about how
41:28
Jim Jordan's interim report was like
41:30
a design for an audience of one,
41:32
AKA Elon Musk.
41:34
And there's something like you say about the similarities
41:37
between the first story and Meta's announcement and this
41:39
one, which is that, you you have Republicans
41:41
and CEOs essentially kind of writing love
41:44
letters to each other by
41:46
the form of blog posts and reports
41:48
and letters. And that's kind of what I felt when I
41:50
was looking at the Zuckerberg video, and he was
41:52
kind of awkwardly explaining, you
41:55
know, what he was going to do. I was like, this is cringeworthy.
41:57
Like, why don't you just FaceTime
42:00
Donald Trump and tell him yourself,
42:02
you know, keep it private. And
42:04
I think, you know, this report lays out a similar
42:06
thing, which is that you had Jordan and Musk
42:08
essentially working together on this, to
42:11
the ends that we saw in the election.
42:13
Yeah, and there's a lot in there. It's, it's
42:15
a 53-page report, I think, and
42:18
it's worth reading. And again, it got no attention because
42:20
the Democrats are totally incompetent at how they
42:22
promote this kind of stuff.
42:23
Right. Right. Yeah. And so we're
42:26
doing our best to bring it some readership,
42:28
Mike. We'll go on now to other
42:30
stories that we've noted, and we'll
42:32
stick in the realms of kind of government regulation
42:35
to begin with, and just note, as
42:37
part of our kind of quick story roundup, that actually
42:39
Elon Musk has been in the news, but just not
42:41
quite so much as his counterpart
42:43
over at Meta. So you might've seen over
42:46
Christmas, between filing your
42:48
amicus brief and, you know, checking
42:50
the government website for new reports,
42:53
that Elon Musk was tweeting furiously about
42:55
a lot of things, including, and
42:57
particularly, about the AfD, the far-
42:59
right party in Germany, and giving his support
43:02
for it. That, according to Bloomberg,
43:04
has triggered a new
43:06
surge of activity around
43:08
the European Commission's investigation
43:11
of X/Twitter, which was announced,
43:14
I think, over 12 months ago and is still
43:16
running in the background. Bloomberg announced
43:18
this week that Henna Virkkunen,
43:21
who is your friend Thierry Breton's replacement
43:24
in the European Commission and is kind of heading
43:26
up a lot of the DSA work, and
43:28
Justice Chief Michael McGrath have
43:31
sent a letter to European election
43:33
officials saying that they were moving forward energetically
43:36
on the investigation. I thought energetically
43:39
was a weird word to use, Mike. I know it's a
43:41
small point, but would
43:43
you have used a different word than that?
43:45
I don't know. I don't know. I
43:48
mean, it's, they're trying, this
43:50
is all signaling in some way or another,
43:52
I guess this is the point of the podcast and
43:54
so they're signaling they're going to do something.
43:56
I would note that this came out a day
43:59
after Le Monde in
44:01
France had an article which sort
44:03
of claimed that the
44:05
Europeans were actually backing off of their
44:07
investigations and that
44:09
the EU Commission president,
44:11
Ursula von der Leyen, was
44:14
putting on hold all of these investigations
44:16
and refusing to start new ones. It was very
44:18
weakly sourced and done in a way
44:20
that really appeared like someone
44:23
was trying to shake things
44:25
up, maybe to get a response like we are energetically
44:27
pursuing this. So I
44:30
do wonder if the Bloomberg
44:32
piece is sort of a response to
44:34
the Le Monde piece.
44:36
yeah,
44:36
And so there's something going
44:38
on behind the scenes where some people are saying, like,
44:40
maybe we should hold off. And the argument
44:42
that was made in the Le Monde piece was that certain European
44:45
leaders are more supportive of
44:47
Musk. And so you have, like, Viktor
44:49
Orbán, obviously, and Giorgia Meloni
44:52
in Italy, who are sort of Musk supporters,
44:54
so maybe they're sort of pushing back on these investigations.
44:57
And then you have another wing of EU folks
44:59
who are obviously keen, eager,
45:01
and, I guess, energetic
45:03
about going
45:06
after these platforms. And so
45:08
I think this is a statement that the EU is like,
45:10
look, okay, the US project
45:12
is in trouble right now,
45:14
and we are going to continue with our regulations,
45:17
which does raise one point, which we have
45:19
left out so far about the Meta story,
45:21
which is kind of important, which
45:23
is that they very quickly clarified, Meta
45:25
did, that these new policy
45:28
changes do not apply to the EU. Do
45:30
not worry; in the EU, we're not doing
45:32
any of this.
45:34
Well, yeah, to join the dots a little bit,
45:36
the part of the announcement around
45:39
working with Donald Trump and the US government
45:42
felt to me like a fear
45:44
of regulation, particularly in the
45:46
EU. And you're right, you know, there was a clear explanation
45:49
of how this was US-only; it
45:51
might be rolled out elsewhere in the future,
45:53
but I got the sense that it was a slight
45:55
fear of the kind of, you know, EU
45:58
regulatory regime. Did you get that sense as
46:00
well?
46:00
It didn't strike me as fear so
46:02
much as like opportunistic.
46:04
Right. I mean, there's this recognition that Donald
46:06
Trump is very much the bull-in-a-china-shop
46:08
kind of politician who just
46:10
sort of screams about what he wants. Right. I mean, we're,
46:13
you know, we're here about to take over Canada
46:15
and Greenland and all that. Uh,
46:18
and so, I think it was kind of
46:20
like, Oh, here's an opportunity to
46:22
do what was politically impossible
46:24
before. Like, the Biden administration
46:27
was never going to go argue about
46:29
the excesses of the DSA. And
46:31
obviously I've been a critic of elements of the DSA.
46:34
And I wish that the U.S. government was actually more
46:36
vocal in sort of criticizing some aspects of
46:38
the DSA and some of how it's problematic. And
46:40
I think that Zuckerberg
46:43
sees this as an opportunity where it's like, he knows that
46:45
Donald Trump's not going to care about that. So
46:47
here's a chance for him to like, maybe go
46:50
out and say like, Oh, the DSA is this horrible
46:52
thing. And like, if you don't change the DSA, like
46:54
we're going to cut off all trade to Europe. I don't know what he's
46:56
going to do. Right. I mean, it's like, you know,
46:59
it's either like, fix the DSA or we're
47:01
going to invade Iceland. I
47:03
don't know. Like none of this matters
47:05
anymore. Like nothing makes sense. So I think
47:07
it's just an opportunity for him to try and
47:09
get Trump to push back on the DSA.
47:11
Yeah. Okay. And I mean, the
47:13
idea of speech as a form of trade is something that
47:15
I'd love to dig back into because I think that's, that's
47:17
a
47:18
There's, there's, there's a big history there
47:20
that, yeah, yeah. That's, that's, we're not
47:22
going to do that in the last few minutes of this podcast.
47:25
Okay, cool. But you know, Elon Musk is
47:27
also in the news for other reasons. You wrote
47:29
a Techdirt post about some
47:31
contradictory behavior that he exhibited this
47:33
week,
47:34
Yeah,
47:35
unlike him.
47:36
yeah, it's like, how do we do this story so quickly,
47:38
but like, so there's been this story for a long
47:40
time that there's this guy, Adrian Dittman, who
47:42
people believe is an Elon
47:44
Musk alt account, and he's shown
47:47
up in Spaces, there was like a
47:49
Twitter/X Spaces with Elon Musk, and they
47:51
sound identical. This Adrian Dittman
47:53
person sounds exactly like Musk. He's
47:55
been a huge Musk fan. He's always supporting
47:57
him. He talks about what a great father he is. At
48:00
one point I think he talks about, like, how much
48:02
sex Elon Musk has. I mean, it was like ridiculously
48:05
fawning fan behavior, but
48:07
because his voice sounds just like Musk,
48:09
everyone's like, this is clearly just Musk and,
48:12
and like you know, there've been all these attempts to sort
48:14
of prove it, and a lot of people are totally convinced of it.
48:16
And The Spectator came out with this article that's basically
48:18
like, no, there is this real dude named Adrian
48:21
Dittman who has this weird global
48:23
history that kind of explains
48:25
why he would have a similar accent and kind of
48:27
explains why he would be like hugely supportive
48:30
of Elon Musk. And it's a real guy.
48:32
And there was joking about
48:34
it because, like, Elon and Adrian,
48:36
assuming they're different
48:38
people, have both always played coy about
48:41
this question of whether or not they're the same person. I think
48:43
they both sort of get off on the fact that
48:45
a lot of people think they're the same person.
48:47
Yeah. It's a funny thing, isn't it?
48:48
yeah. And so, then when the
48:50
Spectator article came out, even Elon posted
48:53
like, all right, it's time to admit it. Like I
48:55
am Adrian Dittman, even though the article sort of proves
48:57
that he's not. But then Twitter
49:00
slash X banned the article,
49:02
banned the authors of the article, banned
49:05
the authors of a study that was used as the
49:07
basis of the article,
49:08
Yeah,
49:10
and did all that. And it reminded
49:12
me, because I'm the only one who remembers history,
49:14
that Elon Musk was furious
49:18
that Twitter banned the New York
49:20
Post for posting the story about
49:22
the Hunter Biden laptop and blocked
49:24
that link for 24 hours before they admitted
49:26
that was probably a mistake and went back on it. And
49:28
in fact, Elon Musk has said
49:30
that the former people at Twitter probably
49:32
deserve to go to jail for blocking the New York
49:34
Post story, and saying that the free
49:36
speech platform should never block news
49:38
stories, and yet here he is blocking
49:41
a story from The Spectator.
49:42
that was 2024, Mike. This is 2025.
49:45
You forgot.
49:48
In 2024, he did
49:50
the same thing with the revelation of the JD
49:52
Vance dossier by Ken Klippenstein.
49:54
you're right. Yeah.
49:56
There's, there's just this level of hypocrisy
49:58
here that I think is worth calling out, even though nobody
50:00
else seems to care about it, that
50:03
He's doing exactly the same thing, but worse,
50:05
in a more extreme manner. The reasoning
50:07
for it is they're claiming that it violated the doxing
50:09
policy, which is nonsense. It is not
50:11
doxing to say like this person who calls himself
50:14
Adrian Dittman is actually Adrian Dittman. That
50:16
is not doxing. That is like, this
50:18
guy is who he says he is.
50:20
Identification.
50:21
Yeah. Um, but I
50:24
thought it was worth calling out because again, like,
50:26
I feel like, well, I'm not saying nobody, I'm
50:28
being a little, you know, hyperbolic here, but like,
50:30
most people were not calling out the hypocrisy there,
50:33
and I thought it was worth mentioning.
50:34
Yeah. Okay. that is worth mentioning. And I think, you
50:36
know, maybe something that the European
50:39
Commission are interested in as they do their investigation,
50:41
who knows, um, the
50:43
only other story, Mike, I'll flag, just before we
50:45
round up today is a really great piece by
50:47
Renée DiResta, like I said, which is
50:50
published in the online magazine Noema. And
50:52
it's titled The Great Decentralization.
50:55
Renée has written lots of great essays like this
50:57
in the past, but it's a really nice
50:59
look at the really the history
51:01
of the march from one-size-fits-all
51:03
platforms to decentralized
51:07
and federated spaces. And
51:10
she kind of rounds up really nicely what
51:12
the drivers were that kind of led us
51:14
from the big platforms like
51:16
Facebook and Twitter to
51:18
the, you know, Mastodons and the
51:20
Blueskys that we're seeing. Disclosure:
51:23
Mike is on the board of
51:25
Bluesky, um, just to note that quickly.
51:27
Um, so, so yeah, it's a really nice
51:30
look through, the history books
51:32
as to how that happened. she makes some really
51:34
great references to kind of working
51:36
the refs and referees generally, which I think you mentioned
51:38
as well, when we're talking about Facebook, is
51:40
a really helpful lens,
51:42
I think, through which to see everything that's happening right
51:45
now. And it reminded me actually of
51:47
a Michael Lewis podcast, Against
51:49
the Rules, which talks about referees. Again,
51:51
lots of great stuff. I don't know if you've listened to it. Um,
51:54
big fan of that. And she also covers the potential
51:56
downsides of this, you know, of moving
51:59
to smaller, potentially less moderated
52:01
spaces in some cases and what that could
52:03
mean for polarization and for
52:05
society as a whole. And we don't
52:07
know the full extent of that, but Renée summarizes
52:10
it nicely.
52:10
Yeah, it's a really good piece, especially
52:13
if you haven't been paying as much attention to sort
52:15
of the alternative spaces and kind of how
52:17
we got here and why they've been successful.
52:20
And I think it also does a really good job,
52:23
frankly, of raising the question
52:25
of trade-offs in terms of how trust
52:27
and safety is handled, both on the centralized
52:29
platforms and the decentralized
52:31
platforms and how there are pros
52:34
and cons to these approaches. I mean, I think
52:36
most of the rest of our podcast today
52:38
has been about some of the cons of
52:40
centralized moderation, when they get
52:42
into the hands of people who have different
52:45
viewpoints on things. So that will be
52:47
my diplomatic version of it. Um,
52:49
but there are also real challenges with the
52:51
decentralized systems. And
52:53
a lot of people sort of view them as like, oh, it's just
52:55
the same thing, but, you know, it's
52:58
just a new version, someone trying to do it differently,
53:00
but the underlying frameworks
53:02
and the whole like protocol concept of
53:04
a decentralized system creates
53:06
different affordances. Some of which
53:08
I, I personally think are really, really beneficial
53:10
and that's why I've been a huge fan of them. That's why I ended
53:13
up on the board of Bluesky. But some of them
53:15
also have like different challenges and I
53:17
think that the piece that Renée wrote
53:19
really lays them out very clearly and does a
53:21
great job of it. and so I think
53:23
it's just a useful piece for everyone to kind of understand
53:26
this moment that we're in and what
53:28
may be possible and what may be
53:30
the challenges of trust
53:32
and safety on these more decentralized platforms
53:34
going forward.
53:35
Indeed, really nice kind of weekend
53:38
read, I'd say, to have with your coffee or,
53:40
you know, whatever your drink of choice is this
53:42
weekend. that takes us to the end of our
53:44
episode this week, Mike. we spent a lot
53:46
of time talking about Meta. We touched on TikTok.
53:48
We touched on decentralization at the end
53:50
there. I hope our listeners feel like we
53:53
covered the full gamut of stories that
53:56
have emerged this week. you look tired.
53:58
Ha ha ha ha
53:59
You look like you need a rest and it's only, it's only
54:01
January the 10th. So
54:03
gosh.
54:04
buckle up. Um, it's going to be a wild year,
54:06
I think, but, thanks everyone for listening
54:09
and appreciate you tuning in. If
54:11
you have any feedback about today's episode, drop us
54:13
a line at podcast@ctrlaltspeech.
54:15
com. We'd love to hear from you. Give
54:17
us your thoughts on the kind of additional
54:20
analysis around Meta. Was it worthwhile?
54:22
Would you like us to do that again? we respond
54:24
to all of the emails that get sent in and
54:27
that rounds us up for this week. Thanks very much for listening. Take
54:29
care. I'll see you soon.
54:33
Thanks for listening to Ctrl-Alt-Speech.
54:36
Subscribe now to get our weekly episodes
54:38
as soon as they're released. If your
54:40
company or organization is interested in sponsoring
54:43
the podcast, contact us by visiting
54:45
ctrlaltspeech.com. That's
54:47
C-T-R-L Alt Speech dot com.
54:50
This podcast is produced with financial support
54:53
from the Future of Online Trust and Safety Fund,
54:55
a fiscally sponsored multi donor fund
54:57
at Global Impact that supports charitable
54:59
activities to build a more robust, capable,
55:01
and inclusive trust and safety ecosystem.