Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
Support for the show comes from
0:02
Attentive. Imagine getting a message from
0:04
your favorite brand that feels like
0:07
it was created just for you.
0:09
Chances are, they're using Attentive, the
0:11
SMS and email platform that helps
0:14
marketers transform every interaction into a
0:16
personalized experience. It uses an AI-powered
0:18
identity solution that lets you target
0:21
accurately and optimize when and how
0:23
you message. So for the marketers
0:25
and decision makers out there, try
0:28
Attentive for messaging that performs and
0:30
results that transform. Visit
0:33
attentive.com/decoder to learn more.
0:35
Support comes from ServiceNow. We're
0:37
for people doing the creative work they
0:39
actually want to do. That's why this
0:41
ad was written and read by a real
0:44
person, and not AI. You know what people
0:46
don't want to do? Boring busy work. Now
0:48
with AI agents built into the ServiceNow
0:50
platform, you can automate millions of repetitive
0:52
tasks in every corner of your business,
0:55
IT, HR, and more, so your people
0:57
can focus on the work that they
0:59
want to do. That's putting AI agents
1:01
to work for people. It's your turn.
1:04
Visit servicenow.com. Support for Decoder comes
1:06
from Arm. Have you ever
1:08
wondered what's powering your smartphone
1:10
and other devices we interact
1:13
with daily or what lies
1:15
at the heart of life-saving
1:17
drug discoveries and robotic surgeries?
1:19
The answer is Arm. Arm
1:22
technology is moving the world
1:24
forward, enabling AI to create
1:26
a more meaningful, more connected
1:28
life for everyone, everywhere. Arm
1:31
believes the future isn't about
1:33
technology. It's about people, and
1:35
the possibilities technology
1:38
can offer us all. The future
1:40
is built on Arm. You can
1:42
discover more at arm.com/Discover. Today I'm
1:44
talking to Verge policy editor Adi
1:46
Robertson about a bill called the
1:48
Take It Down Act, which is
1:50
one of a long line of
1:52
bills that would make it illegal
1:54
to distribute non-consensual intimate imagery, or
1:56
NCII. That's a broad term that
1:58
encompasses what people used to call revenge
2:01
porn, but which now includes things like
2:03
deepfake nudes. The bill was sponsored
2:05
by Democrat Amy Klobuchar and Republican Ted
2:07
Cruz, and it just passed the Senate.
2:09
It would create criminal penalties for people
2:11
who share NCII, including AI-generated imagery, and
2:14
also force platforms to take that imagery
2:16
down within 48 hours of a report
2:18
or face financial penalties. NCII is a
2:20
real and devastating problem on the internet.
2:22
It ruins a lot of people's lives,
2:24
and AI is just making it worse.
2:26
There are a lot of good reasons
2:29
you'd want to pass a bill like
2:31
this, but Adi just wrote a long
2:33
piece arguing against it, saying that giving
2:35
the Trump administration new powers over speech
2:37
in this way would be a mistake.
2:39
Specifically, she wrote that passing the Take It
2:41
Down Act would be handing Trump a
2:43
weapon with which to attack speech and
2:46
speech platforms he doesn't like. At a
2:48
high level, Adi's argument is that Trump
2:50
is much more likely to wield a
2:52
law like this against his enemies, which
2:54
means pretty much anyone he doesn't personally
2:56
like or agree with, and much more
2:58
likely to shield from its consequences the
3:01
people and companies he considers friends. And
3:03
we know who his friends are. It's
3:05
Elon Musk, who now works as part
3:07
of the Trump administration, while at the
3:09
same time running the X social network, which
3:11
is full of NCII. Now, Adi and
3:13
I have been covering online speech and
3:16
talking about how it works and how
3:18
it's regulated for about as long as
3:20
The Verge has existed. And she and
3:22
I have gone back and forth about
3:24
where the lines should be drawn and
3:26
who should draw them about as many
3:28
times as two people can
3:30
over the years. But our conversation and
3:33
our coverage has always presupposed a stable,
3:35
rational system of policymaking that's based on
3:37
the equal application of law. Here in
3:39
2025, Trump has made it clear that
3:41
he can and will selectively enforce the
3:43
law, and that changes everything. Once you
3:45
break the equal application of law, you
3:48
break a lot of things, and there
3:50
is just no evidence that Trump is
3:52
interested in the equal application of law.
3:54
You'll hear us really wrestle with that
3:56
here. The problem doesn't go away just
3:58
because the solutions are getting worse, or
4:00
that the people entrusted with enforcing the
4:03
law are getting more chaotic. So in
4:05
this episode, Adi and I really get into
4:07
the details of the Take It Down Act,
4:09
how it might be weaponized, and
4:11
why ultimately we can't trust anything
4:13
the Trump administration says about protecting
4:15
the victims of abuse. Okay, the Take It Down
4:17
Act, and its collision course with
4:20
our constitutional crisis. Here we go. Adi
4:41
Robertson, welcome to Decoder. Hey,
4:43
let's talk about the Take It
4:45
Down Act. This is a bill
4:48
that would solve the problem of
4:50
AI-generated deep fakes of what people
4:52
had been calling revenge porn. Now
4:54
we call it non-consensual intimate imagery,
4:57
which I think is a
4:59
much better name. What is the Take
5:01
It Down Act? The Take It Down
5:03
Act is one of several bills
5:05
that have been, as you said,
5:08
meant to address AI replicas and
5:10
non-consensual intimate imagery. It sort of
5:12
has two parts. The first part
5:14
is that it's criminalizing NCII, including
5:17
digital forgeries. And this is a
5:19
part that is important and has
5:21
gotten somewhat less discussion because it
5:23
is not the most controversial part.
5:26
The controversial part is the taking
5:28
it down part, which is that
5:30
if it's passed, then web platforms
5:32
that focus on user-generated content are
5:34
going to have to create a
5:37
system by which people can report
5:39
intimate visual depictions in the bill's
5:41
language and those depictions have to
5:43
be taken down within 48 hours
5:45
at the risk of the FTC
5:47
stepping in and imposing penalties. So this
5:49
is a bill that's kind of landed
5:51
in the Federal Trade Commission? Yes, well
5:53
the criminal provisions of it get
5:55
just enforced through sort of criminal
5:58
enforcement channels, but the FTC... is
6:00
the one that's responsible for enforcing
6:02
the part against tech platforms. Is that
6:04
new? That feels like a new
6:06
power for the Federal Trade Commission. You
6:08
and I have covered a lot
6:10
of tech platform laws and policy ideas.
6:12
The notion that it's the FTC
6:15
that's going to show up and fine
6:17
Meta if they don't comply with
6:19
some content moderation rule, that seems new.
6:21
I can't say whether it is completely
6:23
new, but it does seem like kind
6:25
of a novel interpretation. It's basically
6:27
unfair competition law. And so they're
6:29
going to define that, I guess,
6:32
to include NCII, which as you mentioned,
6:34
it's a law that can stretch
6:36
pretty far, but it is a little
6:38
unusual, I think. I want to
6:40
come back to that because the question
6:42
of who gets to enforce this
6:44
law and who might gain leverage over
6:46
the platforms is very important to
6:49
your overall thesis, which is that this law,
6:51
in this administration, is just a
6:53
cudgel. It might solve the problem or
6:55
it might help us in a
6:57
policy making framework think about ways to
6:59
solve the problem, but the reality
7:01
of it is that you're going to
7:03
give a pretty gangster-like Trump administration
7:06
just something to beat platforms over
7:08
the head so they comply with other
7:10
speech ideas. So I want to
7:12
come back to the FTC of it
7:14
because that seems important to me,
7:16
but I just want to stay in
7:18
the sort of practical part of
7:20
the problem. Who gets to decide if
7:23
any of this imagery is inappropriate
7:25
or intimate? Is there a definition we
7:27
get to use? Is it just, here's
7:30
a picture of Taylor Swift, she obviously
7:32
didn't consent to it, she gets to
7:34
say take it down? The interesting thing
7:37
that the Electronic Frontier Foundation and some
7:39
other people have pointed out is that
7:41
there are, it seems like different standards
7:43
for what happens if something is counted
7:45
as criminal versus what has to be
7:47
taken down through these systems. Everything is
7:49
based on the idea that there's a
7:52
definition in the law of an intimate
7:54
visual depiction, which is sort of what
7:56
you might expect. It's someone who is
7:58
engaged in sexual activity, there's nudity, whatever, a
8:00
variety of things, sort of a constellation of things
8:02
there. The part of it that criminalizes
8:04
it talks about these other sets of
8:07
conditions that have to be
8:09
met. So for instance, there's a carve
8:11
out if something is of public interest
8:13
and you have to meet these conditions
8:16
that go beyond the idea that it's
8:18
just a sexual image of someone. The
8:20
take it down part of it, at
8:22
least in the EFF's and some other
8:25
folks' interpretation, doesn't actually have those limits.
8:27
It really is just: is something
8:29
an intimate or sexualized depiction of someone?
8:32
Okay, you have to take it down.
8:34
Which, funnily enough, there is a
8:36
Trump-related example of recently, which is
8:39
that someone in the federal government
8:41
was protesting DOGE by creating an
8:43
AI-generated video of Trump licking Elon
8:46
Musk's feet. And there was debate
8:48
on Bluesky in particular about
8:51
whether this could constitute NCII that
8:53
should be taken down. Bluesky
8:55
took it down, but then put it back,
8:58
because it's in this place where
9:00
yes, it's a sexualized image, it's
9:03
also a sexualized image that is
9:05
not only of a public figure in
9:07
a way that is specifically related to some
9:09
news, but that is framed in the
9:11
context of it's not just that this
9:14
image exists, it's that it... is specifically
9:16
part of a news story about government
9:18
employees doing something that is incredibly noteworthy.
9:20
So you could probably come down on
9:23
either side of whether that is inappropriate
9:25
or not, but it's clearly a situation
9:27
that's unique and that goes just far
9:30
beyond the idea that this is a
9:32
sexualized image of someone. And it doesn't
9:34
seem like the Take It Down Act
9:36
really accounts for that. The Bluesky example
9:39
is particularly interesting because the
9:41
video that Bluesky took down
9:43
was not the video itself. It was a
9:45
video of the monitors at the
9:47
Department of Housing and Urban Development,
9:49
where employees had hacked all the
9:51
displays and started playing this video.
9:53
So the video itself was newsworthy.
9:55
And so that is just a
9:58
layer of complexity and complication and
10:00
nuance that I think is not
10:02
in the law as we see
10:04
it, it's hard for the platforms
10:06
to make determinations, and then you
10:08
have our current set of platforms
10:10
which do not seem well suited
10:12
to making nuanced moderation decisions in
10:14
the current administration, which is just
10:16
sort of constitutionally allergic to nuance.
10:18
How do you think that all
10:20
plays together? Is it just... We
10:22
know it when we see it,
10:24
which is like the classic line
10:26
people use about sexualized imagery. Is
10:28
it we just get to decide?
10:30
Is it famous people are going
10:32
to get protected and regular people
10:34
are going to get washed away
10:36
in the fray? We've just been
10:39
spending decades trying to work this
10:41
out as a legal framework and
10:43
a moderation framework even before AI.
10:45
There were issues where, say, Facebook
10:47
decides, all right, there's no nudity
10:49
on Facebook, but all right, there's
10:51
this, the very famous napalm girl
10:53
photograph from the Vietnam War, so
10:55
does that get taken down? There's
10:57
just this incredibly complicated dance and
10:59
all these incredibly complicated questions about,
11:01
yes, should public figures get more
11:03
protection, less protection? It's something I
11:05
don't think... I have a clear
11:07
answer for and I don't think
11:09
anyone does. It kind of boils
11:11
down to, you know, you know it
11:13
when you see it. There are
11:15
many situations where it is just
11:17
clearly unambiguous. There are a bunch
11:19
of websites and there are a
11:21
bunch of services whose deal is
11:23
allowing people to make non-consensual intimate
11:25
images or post them specifically because
11:27
they are sexualized images of women
11:29
that we hate. And they are
11:31
women that we personally know and
11:33
we want to humiliate. There's not
11:36
any nuance about whether there's value
11:38
to that; it's bad. There are situations
11:40
where I think you could enforce
11:42
a law that says this is
11:44
bad and we don't have to
11:46
worry about that. I think that
11:48
the problem with the Take It
11:50
Down Act is that it includes
11:52
really none of that focus, even
11:54
in the takedown provisions. It really
11:56
is just a very
11:58
broad net that, even if we
12:00
weren't under the Trump administration, would be
12:02
causing all of these problems and
12:04
questions about you're just building this
12:06
system that's very open for abuse.
12:08
And we especially have an example
12:10
of that working already, which is
12:12
the DMCA. So copyright, if you're
12:14
listening to this, you've probably heard of
12:16
copystriking. It's very obvious that
12:18
when you create something that is,
12:20
while not legally mandated, really required
12:22
to get safe harbor protection under
12:24
copyright law, then you're making this
12:26
big, very blunt instrument and you
12:28
have to weigh the potential good that
12:31
it can do against the harm that
12:33
clearly is just undeniably
12:35
happening with something like the DMCA.
12:37
You know, what's interesting about the
12:39
copyright example is that it is
12:41
such a powerful weapon on the
12:44
platforms that in the creator economy
12:46
there exists an entire parallel set of
12:48
norms that culture has developed about how
12:50
nuclear it is to issue a copy
12:52
strike. You see it play out in
12:55
all these ways that I don't think
12:57
the framers of the DMCA could ever
12:59
have contemplated. I don't think
13:01
you can do that with non-consensual
13:04
intimate imagery. I don't think you get
13:06
to have a big normative argument with
13:08
a person who feels wronged because there's
13:10
a sexualized AI depiction of them. This
13:13
seems even worse in that way. Yeah,
13:15
I think there are a couple of unintended
13:17
consequences, which is that part of the
13:19
reason why it's such a big deal
13:21
in copyright is that the law just
13:23
gets used for things that it was
13:25
never meant to be used for. Like
13:27
it's not just that someone says this
13:29
is copyright infringement, there is an entire
13:31
extortion industry that is just
13:33
based around the idea that you'll be
13:36
fraudulently accused of copyright infringement. And if
13:38
you don't pay up, then they're going
13:40
to use this blunt instrument against you.
13:42
So it first of all erodes the
13:44
idea that the law itself is worthwhile
13:46
and is addressing that thing. And I
13:49
think that while this should not stop
13:51
people from trying to stop NCII, it
13:53
also creates the scenario where if this
13:55
law and this blunt instrument gets used
13:57
in a way that is not meant to
13:59
actually address the problem, it sort of
14:02
devalues the problem. You suddenly
14:04
get to this point where I
14:06
think if it's like copyright, people
14:08
stop taking the idea of NCII
14:10
accusations seriously because you look at
14:12
this, oh well it's just clearly
14:14
this person trying to cause drama
14:16
in the community or take something
14:18
down for reasons that have nothing
14:20
to do with NCII. And so
14:22
the actual conversation about people who
14:24
are being hurt here can get
14:26
lost if you create this system
14:28
that doesn't really target it well.
14:30
I can guarantee you, there's someone
14:32
who's listening to this right now
14:34
who is saying, this is so
14:36
hard, why even try? And I
14:38
get that. There's a nihilism, I
14:40
think, to the current moment in
14:42
policy making, there's a nihilism in
14:44
the reaction to the Trump administration.
14:46
There's a kind of nihilism embedded
14:48
in the Trump approach to policy
14:50
that says, this is too hard.
14:52
Why even try? People will just
14:54
sort of get tough. But it's
14:57
not actually the case that it's
14:59
too hard, right? Haven't states covered everything from the people creating this imagery to the responsibilities
15:01
of platforms to whether or not
15:03
it should be left up or
15:05
taken down? What does this look
15:07
like across the states right now?
15:09
At this point, I think 48
15:11
states have some kind of NCII
15:13
law. Mostly the laws tend to
15:15
focus on the people who are
15:17
creating it. I don't think there
15:19
are that many laws that go
15:21
after the larger tech platforms, which
15:23
I think is just the point
15:25
at which it goes from here's
15:27
a person committing a crime, to
15:29
here is this absolutely massive system
15:31
that you have to navigate in
15:33
a way that creates these huge
15:35
risks. And recently, like you said,
15:37
we've sort of been moving toward
15:39
deep fakes, I think around 14
15:41
states currently have mostly just laws
15:43
that add digital replicas to this
15:45
kind of existing NCII framework. A
15:47
lot of the problem with deepfakes
15:49
so far, though, is that there
15:51
are all these other issues that
15:53
get wrapped up into it. So
15:55
there's NCII, but then there are
15:57
also attempts to make laws that
16:00
will fight, say, AI-generated imagery in
16:02
election information, which is obviously an
16:04
issue, but it is a somewhat
16:06
different issue that raises a whole
16:08
bunch of different constitutional questions and
16:10
harm questions. There's issues that are
16:12
basically the equivalent of copyright infringement.
16:14
There's the Elvis Act, where the
16:16
goal isn't really NCII. It's we
16:18
have to stop artists from getting
16:20
their livelihoods appropriated, which again, serious
16:22
problem, completely different like threat matrix.
16:24
So I think that the whole
16:26
AI discussion is still really confused.
16:28
We spent last year talking about
16:30
the Kids Online Safety Act that
16:32
had a lot of ideas in
16:34
it. It went nowhere. It appears
16:36
to be stalled out completely now.
16:38
But Melania Trump is basically advocating
16:40
for, hey, we should do the
16:42
Take It Down Act. Like, I'm famous, there
16:44
are nudes of me on the
16:46
internet, I don't want there to
16:48
be AI-generated NCII. Here's a
16:50
bill, like, here's just a solution.
16:52
Let's have it. And that feels
16:54
like it's very narrow, but also
16:56
just ill-considered. There have been several
16:58
bills that try to address this
17:01
and some of them have been
17:03
a lot more limited and a
17:05
lot less controversial as a result.
17:07
So the Defiance Act, which passed,
17:09
I believe, out of the Senate
17:11
last year, but didn't end up
17:13
ultimately passing, is something that adds
17:15
AI-generated imagery essentially to existing civil
17:17
penalties for NCII. So back in
17:19
2022, the Violence Against Women Act
17:21
was amended to include civil action,
17:23
which again means like you can
17:25
sue someone for NCII. And the
17:27
Defiance Act kind of bolts, again,
17:29
as many places have done,
17:31
AI-generated imagery into that. It doesn't
17:33
include the kind of take it
17:35
down provisions that have proven really
17:37
controversial. There is also the Shield
17:39
Act, which has been reintroduced, which
17:41
introduces criminal penalties. I think that
17:43
there are a bunch of efforts
17:45
to individually criminalize or create civil
17:47
penalties against the creators of this
17:49
thing. And I think that there
17:51
are then these huge problems when
17:53
you try to expand that to
17:55
we have to make anyone on
17:57
the internet who is unknowingly hosting
17:59
it, remove it. And that's the
18:02
shift to the platform, right? That's
18:04
saying, okay, Facebook and YouTube and
18:06
TikTok are now going to be
18:08
responsible for what's on their platforms.
18:10
I just, one more distinction I
18:12
want to make about the various
18:14
state approaches to this in the
18:16
pre-AI era is that they
18:18
were often rooted in copyright law,
18:20
right? Like there would be some
18:22
non-consensual intimate imagery or, you know,
18:24
people had taken photos and then
18:26
one partner would have them and
18:28
distribute them eventually and there's a
18:30
copyright interest, right? You'd like made
18:32
the photo together and that provided
18:34
the basis for some of this
18:36
imagery to come down. I'm not
18:38
sure where that comes from with
18:40
the AI generated stuff. So are
18:42
we just in a totally new
18:44
realm of where the authority to
18:46
take things down comes from? Copyright
18:48
even for non-simulated NCII was a
18:50
nightmare. So the problem with copyright
18:52
is that you have to have
18:54
created the image. And so it
18:56
applied to selfies. If you took
18:58
a picture of yourself and you
19:00
send it to someone else and
19:03
it spread, okay, you own that
19:05
photo. The problem is, a bunch
19:07
of NCII isn't that.
19:09
Even if it is
19:11
something that was consensually taken, it
19:13
wasn't taken by you. So you
19:15
don't own the photograph. It's something
19:17
that a partner took.
19:19
And so copyright either
19:21
means it doesn't really apply to
19:23
those things, or it means you're
19:25
creating this really weird copyright exception
19:27
that causes all of these other
19:29
problems. Like say there have been,
19:31
this is not related to NCII,
19:33
but lawsuits around whether a paparazzi
19:35
photo can be then claimed by
19:37
the person who was in the
19:39
photo, which just causes all these
19:41
other problems. Yeah, the reason I
19:43
asked that question is the idea
19:45
that the government can look at
19:47
a picture and declare that it's
19:49
illegal or should be taken down
19:51
is very complicated. It requires some
19:53
framework, it requires some rigor, it
19:55
requires some due process that people
19:57
can understand and argue against. And
19:59
then... making that bigger so that
20:01
the responsibility also lies with the
20:04
platforms like YouTube or TikTok or
20:06
Instagram seems even more complicated. And
20:08
I think that's where you get to the
20:10
Take It Down Act, because that's the big
20:12
step in the Take It Down Act, right?
20:14
Saying, okay, the Federal Trade Commission is going
20:16
to be able to fine Instagram if this
20:18
imagery appears on Instagram and Instagram
20:21
doesn't take it down immediately.
20:23
And that seems like a lot of leverage
20:25
for our government to get over these
20:27
platforms. The 48 hours, the take it
20:29
down immediately, is also a problem
20:31
there because if say you sue
20:33
someone and you go through an
20:35
entire case about whether something is
20:37
NCII at the end of that
20:39
you have say a court pretty
20:41
clearly considered whether it counts and
20:43
the 48 hours issue is just
20:45
creating the situation where not
20:47
only is it a lot of
20:49
power the government has, you're probably
20:52
not getting the same level of
20:54
consideration. You have a bunch of
20:56
moderators having to make these extreme
20:58
snap judgments without really that much legal
21:00
guidance necessarily. And so that, yeah, it's
21:03
not only a lot of power, it's
21:05
a lot of power without the kind
21:07
of consideration that we tend to try
21:10
to give the government when it
21:12
is making calls on speech. We need
21:14
to take a quick break. We'll
21:16
be right back. Support
21:18
for the show comes
21:20
from AlixPartners. Disruption
21:22
is the new economic driver.
21:25
The days of predictable
21:27
business cycles are over.
21:29
For over 40 years
21:31
AlixPartners has helped
21:33
companies develop winning business
21:35
strategies amidst uncertainty. One
21:38
of today's greatest challenges:
21:40
the rise of AI.
21:42
As AI reshapes the
21:44
tech landscape, AlixPartners
21:46
is committed to helping
21:48
your company thrive. In
21:50
their sixth annual
21:53
AlixPartners Disruption Index, a
21:55
global survey of 3,200 senior
21:57
executives, 65% of executives believe
21:59
AI and machine learning provide
22:01
positive opportunities for their companies.
22:04
And 62% of CEOs expect
22:06
significant business model changes in
22:09
the next year. In the
22:11
face of disruption, businesses trust
22:14
AlixPartners to get straight
22:16
to the point and deliver
22:19
results when it really matters.
22:21
Read more on the latest
22:23
trends and C-suite insights at
22:26
disruption.alixpartners.com. Support
22:30
for this show comes from Liquid
22:32
IV. It's the middle of winter.
22:34
The air is dry, your radiator
22:36
is blasting, your humidifier ran out,
22:38
and you wake up parched. Sure,
22:40
you can keep a glass of
22:43
water by your bed, but sometimes
22:45
you're so bone dry, you feel
22:47
like you're about to crumble to
22:49
dust like a cartoon skeleton. When
22:51
you need extraordinary hydration quickly, there's
22:53
Liquid IV. They say that just
22:55
one stick in 16 ounces of
22:58
water can hydrate better than water
23:00
alone. Liquid IV is powered by
23:02
something they call LIV hydroscience, an
23:04
optimized ratio of electrolytes, essential vitamins,
23:06
and clinically tested nutrients that turn
23:08
ordinary water into extraordinary hydration. Plus,
23:10
they're easy to take on the
23:12
go, so you can feel hydrated
23:15
after a long flight before a
23:17
workout or when you just feel
23:19
dried out. You can enjoy one
23:21
of the delicious flavors like White
23:23
Peach or Açaí Berry and feel
23:25
hydrated and healthy quickly. Treat yourself
23:27
to extraordinary hydration from Liquid IV.
23:29
Get 20% off your first order
23:32
of Liquid IV when you go
23:34
to liquidiv.com and use code Decoder
23:36
at Checkout. That's 20% off your
23:38
first order with Code Decoder at
23:40
liquidiv.com. For people with sensory sensitivities, everyday experiences like a
23:42
trip to the dentist can be especially
23:44
difficult. In fact, 26% of sensory
23:47
sensitive individuals avoid dental visits entirely.
23:49
In Sensory Overload, a new documentary
23:51
produced as part of Sensodyne's Sensory
23:53
Inclusion Initiative, we follow individuals navigating
23:55
a world not built for them,
23:57
where bright lights, loud sounds, and
23:59
unexpected touches can turn routine moments
24:01
into overwhelming challenges. Burnett Grant, for
24:04
example, has spent their life masking
24:06
discomfort in workplaces that don't accommodate
24:08
neurodivergence. I've only had two full-time
24:10
jobs where I felt safe, they
24:12
share. This is why they're advocating
24:14
for change. Through deeply personal stories
24:16
like Burnett's, sensory overload highlights the
24:18
urgent need for spaces, dental offices
24:21
and beyond, that embrace sensory inclusion.
24:23
Because true inclusion requires action with
24:25
environments where everyone feels safe. Watch
24:27
Sensory Overload now, streaming on Hulu.
24:29
At the federal level, there has
24:31
simply never been a good solution
24:33
for regulating non-consensual intimate imagery. It
24:36
is either too broad, which creates
24:38
potential civil liberties violations, or too
24:40
narrow, in that it covers too
24:42
little of the problem while the
24:44
problem is still evolving. That's what
24:46
we're seeing today, with AI making
24:48
NCII much more complicated. The Take
24:50
It Down Act seems to be
24:53
firmly in the too broad category,
24:55
which raises all kinds of problems.
24:57
But we're not evaluating all this
24:59
in a vacuum. The states have
25:01
had a patchwork of laws trying
25:03
to cover the abuses of NCII
25:05
for years now to mixed results.
25:07
And as you've heard Addie and
25:10
I talk about, copyright law has
25:12
been one of the only effective
25:14
ways the government has been able
25:16
to curb some of this. So
25:18
where does this
25:20
all leave us? And what
25:27
about the current Trump administration has
25:29
Adi concerned that this new bill
25:31
might be weaponized in ways that
25:33
severely undermine its goals? So in
25:35
a normal environment, maybe this law
25:37
passes, maybe there's a bunch of
25:39
chaos, there's a bunch of lawsuits,
25:42
a bunch of... platforms might issue
25:44
some policy documents and we would
25:46
slowly and somewhat chaotically stumble towards
25:48
a revised policy, right? Maybe the
25:50
law gets amended, maybe there's an
25:52
enforcement regime that builds up around
25:54
the law, something. Frankly, the most
25:56
likely outcome is that someone takes
25:59
this law to court and a
26:01
lot of this is declared unconstitutional.
26:03
Sure. In a functioning system and
26:05
then... maybe part of the law
26:07
stands and maybe hopefully it's a
26:09
good part that isn't open to
26:11
abuse, but good chance it would
26:14
just get overturned. Right, and even
26:16
in that process, I think Congress
26:18
would look at that and say,
26:20
okay, this is a problem, we're
26:22
going to have some solutions for
26:24
the back end of this, win or
26:26
lose, right? Like, you can see
26:28
how the normal policymaking legal judicial
26:31
process might otherwise play out. We
26:33
have a lot of history with
26:35
that. Your piece is titled, The
26:37
Take It Down Act isn't a
26:39
law, it's a weapon, and your
26:41
thesis is that we do not
26:43
live in a normal world, and
26:45
the Trump administration in particular is
26:48
so sclerotic and so addicted to
26:50
selective enforcement that what they're really
26:52
going to do is pass this
26:54
law and then use it as
26:56
a cudgel to beat platforms into
26:58
submission. The process we've been talking
27:00
about this whole time just assumes
27:03
there's a functioning government, there's a
27:05
hard problem, everybody in the government
27:07
fights about this problem, civil society
27:09
does, people play their part, but
27:11
everyone's kind of acting in good
27:13
faith, everyone does actually care about
27:15
stopping NCII, they do recognize that
27:17
there are problems with overbroad restrictions
27:20
on speech, and everyone's trying to
27:22
work toward a solution because they
27:24
believe that laws are things that
27:26
should be applied evenly, and that
27:28
fundamentally work with the Constitution. The
27:30
Trump administration just doesn't believe in
27:32
the rule of law. It doesn't
27:34
think that laws are things that
27:37
you should apply to everyone in
27:39
the way that they are meant
27:41
to be applied by Congress. What
27:43
it believes is that laws are
27:45
things that you apply to the
27:47
people that you hate in any
27:49
way that can hurt them and
27:52
you don't apply them to the
27:54
people that you like. The way
27:56
that you apply them is not
27:58
actually in a way that stops
28:00
the problem they're meant to address,
28:02
it's a way that gets you
28:04
the thing you want, which probably
28:06
has nothing to do with that.
28:09
So we've seen this play
28:11
out. The TikTok ban might
28:13
be the most absolutely egregious example,
28:15
which is that while I don't
28:17
agree with the ban, it was
28:19
something that was passed with a
28:21
bunch of bipartisan support. It was
28:23
passed after years and years of
28:26
working with TikTok. It was then
28:28
sent up to the Supreme Court
28:30
and the Supreme Court upheld it.
28:32
It is hard to find a
28:34
law that was more rigorously vetted.
28:36
And then Trump takes office a
28:38
day after it goes into effect, and he
28:41
says, well, specifically, I like TikTok
28:43
because TikTok got me elected and
28:45
also TikTok has been saying I'm
28:47
really great. So what I'm going
28:49
to do is I'm going to
28:51
sign an executive order. The executive
28:54
order doesn't make an argument for
28:56
why I have the power to
28:58
extend this deadline. It doesn't make
29:00
any kind of argument for why
29:02
this is compatible with the law.
29:04
What it says is don't enforce
29:06
the law. And there is absolutely
29:09
no reason to do this that
29:11
is compatible with the thing that
29:13
Congress and the Biden administration and
29:15
the Supreme Court did, because he
29:17
doesn't care about the law. What
29:19
he cares about is getting the
29:21
law to do what he wants. And the
29:24
Trump administration is staffed
29:26
with folks who believe this, who act
29:28
this way. We talked about Brendan Carr
29:30
a lot at the FCC, who
29:32
uses his enforcement power or his
29:34
merger review power to push broadcasters
29:36
into doing whatever speech he wants
29:39
or punish them for news coverage
29:41
he doesn't like. There's Elon who seems
29:43
like an important character in all this
29:45
because he runs a platform. There's
29:48
Mark Zuckerberg who seems more amenable to
29:50
making deals with the Trump administration on
29:50
moderation. Okay, we have
29:52
this bill that says if you don't
29:54
take down this imagery in 48 hours,
29:56
the FTC can fine you. Is that
30:00
just another way for Trump to say,
30:02
I could destroy your company unless you
30:05
do what I want or I can
30:07
tell the FTC to hold off? Yeah,
30:09
there are two sides to this and
30:12
one of them is the side that
30:14
we talk about often, which is what
30:16
if this gets weaponized against people that
30:18
the government doesn't like? And then there's
30:21
the other side that I think less
30:23
often is raised before Trump, which is
30:25
even if you take this law seriously,
30:28
you're not going to get it applied
30:30
against the people that are actually hurting
30:32
NCII victims because, again, the administration doesn't
30:35
even care about applying the law to
30:37
people that it should be used against.
30:39
Elon is maybe the clearest example of
30:41
that, which is just, let's take the
30:44
extreme view that it is worth doing
30:46
anything to get NCII off the internet.
30:48
A place this would come into play
30:51
is X, formerly Twitter, which has had
30:53
probably the biggest NCII scandal of the
30:55
last several years, which is that a
30:58
bunch of Taylor Swift sexually graphic images
31:00
were posted there and spread there, and
31:02
it did very little to stop them.
31:05
It eventually kind of blocked searches for
31:07
Taylor Swift. If you're looking at major
31:09
platforms, it's the first one you think
31:11
of. You cannot enforce this law against
31:14
X. It is almost literally inconceivable because
31:16
Elon Musk runs the department that governs
31:18
whether the FTC has money and people
31:21
who work there. The week before I
31:23
wrote this, we broke a story that
31:25
said that someone, very likely DOGE, had
31:28
cut about a dozen people from the
31:30
FTC. I'm trying to imagine a scenario
31:32
where X completely ignores the law and
31:35
says, well, screw you, Taylor Swift, I
31:37
don't like you. In what world does
31:39
the FTC do anything? I can't think
31:41
of a way where it would act
31:44
in any way in the interests of
31:46
NCII victims. Right, you can just make
31:48
the comparison to the TikTok ban. Congress
31:51
passes a law, it goes to Supreme
31:53
Court, as I think the Take It
31:55
Down Act would immediately go to the
31:58
Supreme Court, some version of the
32:00
law remains or is thrown out, who
32:02
knows? And then you have a law
32:04
where the president can say, I'm telling
32:07
my FTC not to enforce this law
32:09
as it relates to X. But at
32:11
the same time, he might say, go
32:14
push Mark Zuckerberg. I want to make
32:16
sure I'm the most popular person on
32:18
Facebook today. And so if that doesn't
32:21
happen, we know for a fact that
32:23
this imagery exists on these platforms because
32:25
platforms at scale always have this imagery,
32:28
and we're going to find some way
32:30
for the FTC to punish them. Elon,
32:32
go get them. And you can
32:34
just see that play out pretty simply.
32:37
I don't think you need to be
32:39
very imaginative to get to that scenario.
32:41
Is there any provision in this law
32:44
that would stop it? So Ted Cruz
32:46
and Amy Klobuchar are the primary sponsors
32:48
of the bill, and I asked them,
32:51
do you think that X has any
32:53
way that they could be dinged for
32:55
this? They haven't gotten back to me.
32:57
I don't really know how you would
33:00
build that because the point of laws
33:02
is that Congress writes them and it
33:04
says here's what's supposed to happen and
33:07
the executive branch makes it happen. Like
33:09
the original sin here is that Congress
33:11
has now allowed the executive branch to
33:14
just decide that it doesn't pass laws
33:16
anymore. Like Congress isn't real. And there's
33:18
nothing that Congress can do inside one
33:21
individual bill to solve the fact that
33:23
it has ceded all its authority. The
33:25
thing it has to do is get
33:27
that back and say, you have to
33:30
do what we want. So we have
33:32
to be able to write laws again.
33:34
So that problem is playing out, I
33:37
think, across the entire government. That's the
33:39
constitutional crisis that everyone is always talking
33:41
about, that we are always writing about.
33:44
But I just want to stay focused
33:46
on this sort of easy to grasp
33:48
notion of selective enforcement. In a world
33:50
where Donald Trump says there's illegal imagery
33:53
on YouTube, And I'm shutting down Google.
33:55
I'm imposing fines so high on Google
33:57
that it effectively can't run. And we're
34:00
not doing that for X. That's a
34:02
lawsuit, right? Like Google shows up and
34:04
goes to court and says this is
34:07
selective enforcement. There's some interest on the
34:09
other side of that that might reconcile
34:11
that, but that all feels like the
34:14
elephants are dancing and the regular people
34:16
who are actually the victims in this
34:18
imagery have no ability to stop the
34:20
bad thing from happening. Does that feel
34:23
like regular people who are actually the
34:25
victims in this situation have any recourse
34:27
at all? First of all, there's the
34:30
whole part where you can try to
34:32
directly go after the people who are
34:34
posting this stuff and making it, but...
34:37
In terms of the larger platform stuff,
34:39
you can probably file a lawsuit that
34:41
says this law is not getting followed.
34:43
And then that's good for you. You
34:46
do not have the power of somebody
34:48
like Google. You don't have the legal
34:50
resources. There are non-profits that will probably
34:53
back you, and it's worth a try,
34:55
but it is not something that regular
34:57
people should have to do, or that
35:00
regular people are probably the best equipped
35:02
to do. So that's just the sort
35:04
of graspable issue here. You have the
35:07
selective enforcement. You have massive disparities in
35:09
legal ability and resources and financing between
35:11
the platforms and regular people. You have
35:13
a constitutional crisis. Everyone can see that.
35:16
I really don't think it takes a
35:18
lot of imagination to see all of
35:20
that play out in the context of
35:23
this law and this administration. Then there's
35:25
one turn down the road, where I
35:27
think you do have to see some
35:30
farther consequences. You wrote in your piece,
35:32
there are concerns that this law, the
35:34
Take It Down Act, could be used
35:37
to undermine end-to-end encryption, or to somehow
35:39
go after Wikipedia. How would that work?
35:41
The end-to-end encryption is another kind of
35:43
thing that would be a problem even
35:46
outside Trump, which is that it's just
35:48
not necessarily clear that having a service
35:50
where you can't see what's on it
35:53
doesn't still mean you're in breach of
35:55
the law because you don't know whether
35:57
there's something that you're supposed to take
36:00
down. So say you're running Signal or
36:02
iMessage, and somebody says, well, there's
36:04
this person forwarding this image. And you
36:06
don't as the company by design have
36:09
access to that, or have the ability
36:11
to stop what people send through your
36:13
service. So are you then liable under
36:16
the FTC? This is just a problem
36:18
that comes up with all kinds of
36:20
rules about takedowns. It's a huge issue.
36:23
And then we get a little more
36:25
to the selective enforcement where it's always,
36:27
again, a problem, but we have never
36:30
had such a clear indication that a
36:32
presidency is going to abuse it. Trump
36:34
has publicly said to Congress, well, I
36:36
think I'm going to use this law
36:39
too because nobody gets treated as badly
36:41
on the internet as me. And like
36:43
everything, he kind of frames it as
36:46
maybe a joke, but there is no
36:48
reason to believe that he's joking. He
36:50
has extorted millions of dollars from platforms
36:53
that banned him because he filed these
36:55
specious lawsuits and he's very powerful. So
36:57
you could really see a world where
36:59
he does decide that's not a joke.
37:02
I'm going to go after... any platform
37:04
that I think treats me badly. And
37:06
we also then have, like you've mentioned,
37:09
the Elon of it all, Elon has
37:11
made a really clear public stance against
37:13
how much he hates Wikipedia, which is
37:16
a platform full of user-generated content that
37:18
while it is carefully moderated could potentially
37:20
have a problem where bad actors egged
37:23
on by Trump or a functionary or
37:25
one of the many public outlets that
37:27
supports him, tries to get it punished
37:29
by the FTC for say somebody's... spamming
37:32
NCII on it and it's trying to
37:34
create a takedown process but that doesn't
37:36
stop the FTC from claiming that it's
37:39
violating this process and then they try
37:41
to just drain its resources with a
37:43
lawsuit that say it can fight but
37:46
it's just plausible enough that then courts
37:48
have to go in and try to
37:50
work through it. And that's assuming you
37:52
get a judge who is acting in
37:55
good faith, which there is pretty good
37:57
evidence that there are some Texas judges
37:59
that Elon Musk has worked with that
38:02
are not acting in good faith, that have
38:04
allowed things like his lawsuit against Media
38:06
Matters, which is just absolutely ridiculous, to
38:09
proceed in a way that has caused
38:11
it to lay off staff and that
38:13
has just drained it even
38:15
if it doesn't ultimately
38:18
lose. We need to take another
38:20
quick break. We'll be right back.
38:26
Sometimes a single performance can define
38:28
an artist's legacy. Think about Hendrix's
38:31
fiery Woodstock national anthem or Beyoncé's
38:33
Homecoming at Coachella. Coming up on
38:35
Switched on Pop, we're exploring artists
38:37
who've had recent transformative live shows.
38:39
First is Missy Elliott, who recently
38:41
put on her first world tour
38:43
where she taught everybody to get
38:45
their freak on. And then there's her
38:50
collaborator Timbaland, who recently evolved from
38:52
beat maker to orchestra conductor at
38:54
the Songwriters Hall of Fame. And
38:56
then Lady Gaga, whose Chromatica Ball
38:56
featured a theatrical museum of brutality
38:59
revealing the darker side of
39:01
Gaga's Mayhem. Listen to these
39:03
live moments on Switched on
39:05
Pop, wherever you get podcasts.
39:07
Brought to you by Defender. Paramount Plus
39:09
celebrates Women's History Month with
39:11
the women who move mountains
39:14
collection. You ready? For the women
39:16
who break boundaries, like Zoe Saldaña
39:18
in Lioness. Who
39:20
are unapologetically themselves, like
39:23
Kathy Bates in Matlock.
39:25
Nobody sees us coming. And
39:27
who forge ahead, like Christina
39:29
Ricci in Yellowjackets. I
39:31
thought you'd be more excited
39:33
to see me. Explore the
39:36
Women Who Move Mountains collection
39:38
on Paramount Plus. Stream now.
39:40
Canva presents the killer of
39:42
productivity. It was an ordinary
39:44
workday until... Oh no! This
39:47
meeting! Could have been an email!
39:49
Run! Canva had a creative solve.
39:51
Forget email. I'll just put what
39:53
the team needs in a
39:55
Canva doc, and I'll make it
39:57
visual with images, charts, and graphs.
40:00
Bring productivity killers to justice with creativity.
40:02
Love your work at canva.com. We're back
40:04
with Verge policy editor Adi Robertson. Before the
40:07
break, we were diving into Adi's thesis
40:09
around the Take It Down Act and
40:11
how it might be weaponized by the
40:13
Trump administration. What makes it worse is
40:16
the fact that this Congress has ceded
40:18
so much of its authority to the
40:20
executive branch in a way that puts
40:22
us in a very precarious position when
40:25
it comes to preventing presidential overreach. So
40:27
what happens next? And more importantly, is
40:29
there a way to actually tackle the
40:32
problem of NCII in a meaningful way
40:34
at the federal level? Or are the
40:36
victims here just caught in a political
40:38
power struggle as the problem keeps getting
40:41
worse? Litigation might be able to solve
40:43
some of these problems, but it is
40:45
costly and slow going and by no
40:48
means certain. And that feels like a
40:50
thing that the Trump administration has not
40:52
realized, right? That flooding the zone with
40:54
all of these actions, with all these
40:57
executive orders, maybe they'll lose even a
40:59
majority of them, but the... The concept
41:01
of action actually brings people into line.
41:04
Do you think this is part of
41:06
that trend? You mentioned Amy Klobuchar is
41:08
one of the sponsors here. Is this
41:10
truly a bipartisan effort or is this
41:13
a bunch of people want something to
41:15
happen and this is the thing that
41:17
seems most likely to happen? I think
41:20
this is bipartisan in the sense that
41:22
these laws have been coming up for
41:24
years now. The Take It Down Act is
41:26
part of a long line of internet
41:29
safety bills. Those bills are bipartisan because
41:31
this is an issue that a lot
41:33
of people care about and genuinely do
41:35
want to stop. But I think that
41:38
Democrats in Congress have just done an
41:40
almost incredibly bad job of responding to
41:42
the threat of the Trump administration. And
41:45
it feels almost like this is just
41:47
inertia of this is a thing that
41:49
maybe you could have done under another
41:51
administration and maybe you could have had
41:54
these fights that we talked about to
41:56
try to make it better, and we're
41:58
not in that world, and they don't
42:01
recognize it. You and I have covered
42:03
attempts to regulate content on the internet,
42:05
attempts to regulate internet providers for a
42:07
decade now together, maybe more, which is
42:10
a little scary. And in that time,
42:12
I feel like my personal pendulum has
42:14
swung back and forth, right, to, well,
42:17
maybe we should do some rules for
42:19
platforms because the market is not providing
42:21
any incentive. for platforms to do this
42:23
stuff, right? Like in a sane world,
42:26
the platforms themselves would have gotten way
42:28
out ahead of we should not allow
42:30
sexualized AI-generated images of people, and they
42:33
would have stopped it. But instead, they're
42:35
going the other way, right? They're moderating
42:37
less and less. For some reason, maybe
42:39
to please Trump, maybe because it's cheaper,
42:42
who knows? And so it feels like,
42:44
okay, the government should set some rules.
42:46
And we see that happen in other
42:48
countries. And then it swings back to: government speech regulation is bad because
42:51
it will just be weaponized by a
42:53
corrupt administration. And I don't know where
42:55
that pendulum will ever land. I don't
42:58
know if it will ever stop swinging
43:00
for me. I'm curious where you are
43:02
because again, you and I have been
43:04
doing this together for so long. Yeah,
43:07
I think there are a few questions
43:09
for me. The first question is how
43:11
much laws could address big platforms at
43:14
all. There's the theorem that Mike
43:16
Masnick came up with, which is just
43:18
that good content moderation at scale
43:20
is impossible. So, for instance, Meta's platforms,
43:23
they do have rules against NCII. They
43:25
have systems they've created that are meant
43:27
to take it down. They're just a
43:30
gigantic platform and for them to moderate
43:32
at the level that they probably would
43:34
need to to actually comply with, say,
43:36
just promptly taking things down, would have
43:39
to be just massive. So there might
43:41
just be something to bigness that makes
43:43
it inherently impossible. So that's the first
43:46
problem. Then the second problem right now
43:48
is, yeah, as you mentioned, for a
43:50
while it did actually seem like it
43:52
was the government versus big tech. So
43:55
at the very least, you had, if
43:57
you didn't like what these companies were
43:59
doing, the government was at least targeting
44:01
them and was trying to do something.
44:04
And we're just, I think, at the
44:06
other side of the tech lash now
44:08
because at this point, tech companies have
44:11
gotten a friendly administration. And so the
44:13
battle lines just aren't even drawn in
44:15
the same way, which means that you're,
44:17
I think, you can't trust Congress and
44:20
you can't trust the administration to the
44:22
same extent. And so pragmatically, even if
44:24
you think these laws are good in
44:27
theory, they're just less likely to make
44:29
sense and work, and that you also
44:31
have now this at least partial movement
44:33
to create alternative platforms that I think
44:36
is more successful than it's been in
44:38
the past. It's mostly come up through
44:40
microblogging, with, say, Bluesky and
44:43
Mastodon being serious attempts at contending with
44:45
these big platforms, and those places are
44:47
clearly more vulnerable. So the kind of
44:49
threat that I think sometimes seemed really
44:52
hypothetical in previous years, which is well,
44:54
these big platforms are going to be
44:56
fine, but the little guys
44:59
45:01
are going to be hurt, which made
45:03
less sense when say you didn't know
45:05
where the little guys were and the
45:08
big platform seemed like they were going
45:10
to get hurt. We're just in that
45:12
hypothetical situation now. Like the stuff that
45:14
sounded to me maybe kind of like
45:17
I'm being paranoid here. It's less. Yeah,
45:19
we can just read about it every
45:21
day. And I think one of the
45:24
interesting things about your piece was that
45:26
even some of the folks that we've
45:28
covered, that we've written about, that we've
45:30
interacted with, who have made different tradeoffs,
45:33
who said, actually this problem is so
45:35
bad, the speech tradeoffs might be worth
45:37
it, are agreeing with you that this
45:40
bill is a weapon that the Trump
45:42
administration could use. Mary Anne Franks, who is
45:44
someone who takes a
45:46
different stance on the First Amendment in
45:49
general than me and a variety of
45:51
people that are similar to me, has
45:53
still said, yeah, I wish it weren't
45:56
true that the Trump administration's probably going
45:58
to weaponize this in a way that
46:00
doesn't necessarily help NCII victims, but it
46:02
is. This is a crisis that a
46:05
lot of people think is unfortunately just
46:07
going to skew the battlefield. And I
46:09
think that comes back to you can
46:11
have a lot of smart people with
46:14
different views on where the line should
46:16
be. That's civil society. That's what you're
46:18
talking about. That's that system that is
46:21
built up, right? Here are the think
46:23
tanks, here are the policymakers, here are
46:25
the academics who are going to argue
46:27
about how to make policy and what
46:30
the tradeoffs are and whether these ideas
46:32
worked. And usually that leads you to
46:34
some rational refinement over time. But in
46:37
this case, I think that whole ecosystem,
46:39
that whole set of people, is looking
46:41
at bills like this, looking at the
46:43
Trump administration, saying, maybe we shouldn't give
46:46
them more power, because there isn't this
46:48
check on it. There isn't this refinement
46:50
process that will occur. And I'm just
46:53
not sure how we get back to
46:55
it. That seems like the thing that
46:57
has stopped everybody in their tracks, right?
46:59
We all know this is bad. Even
47:02
though the problem is happening to Trump himself,
47:04
it's happening to his wife, it doesn't
47:06
seem like they're motivated to stop it,
47:09
right? Or if they're given the tools
47:11
to stop it, that they will use the
47:13
tools to actually stop it. What do
47:15
you think is going on there? It
47:18
has just never been clearer that this
47:20
is a group of people here who
47:22
care about what happens to them, they
47:24
don't care about what happens to anyone
47:27
else, and they also have spent... an
47:29
extraordinary amount of time and energy signaling
47:31
that they do not care about women,
47:34
that in fact they support men who
47:36
are accused of assaulting women, who are
47:38
accused of sex trafficking women, women make
47:40
up the vast majority of NCII victims,
47:43
and that this is part of their
47:45
attempt to establish an anti-woke culture that
47:47
this is a way to, as JD
47:50
Vance puts it just sort of more
47:52
broadly, that we need to protect masculinity,
47:54
that we need to let men be
47:56
men again. And I think that it
47:59
is rare to see someone so blatantly
48:01
tell you that he does not care
48:03
what happens to women as long as
48:06
it's the women he doesn't like. And
48:08
I think that you should absolutely not
48:10
trust anything that anyone in the Trump
48:12
administration says about protecting women because it
48:15
is only a way to get to
48:17
the people that he thinks shouldn't be
48:19
allowed to abuse women because he doesn't
48:22
like them. His cabinet is full of
48:24
men who have been fairly credibly accused
48:26
of abuse and assault and harassment. He
48:28
has recently, allegedly, stepped in to free someone
48:31
who a Republican attorney general has called
48:33
an admitted sex trafficker. I think that
48:35
we can't trust him. Women should not
48:37
trust him. No one who cares about
48:40
this issue should trust him. I mean
48:42
that is as clear of a statement
48:44
about the Trump administration as there's ever
48:47
been. It's obvious now in a way
48:49
that it was maybe subsumed in the
48:51
first Trump administration, but now it's right
48:53
there on the surface. I think in
48:56
the context of a law like this,
48:58
which is ostensibly meant to protect people,
49:00
but can actually be used as a
49:03
weapon against companies and people the administration
49:05
doesn't like, it's worth saying out loud.
49:07
What happens next? Is this law going
49:09
to pass? Is it going to get
49:12
signed by the president? Congress seems like
49:14
it's mired in dysfunction. What do the
49:16
next steps here look like? After KOSA,
49:19
which came within like one vote of
49:21
passing and then failed after everyone on
49:23
earth in Washington said they were going
49:25
to support it, I don't really know
49:28
what happens. It seems like anything could
49:30
fail now. This has advanced pretty far
49:32
and obviously it has the backing of
49:35
the President and First Lady, so I
49:37
think it's definitely a real threat. I
49:39
think that at this point... maybe congressional
49:41
dysfunction could still save us. I think
49:44
that maybe the best hope is that
49:46
Congress does manage to pass something that
49:48
is like the Defiance Act that has
49:50
broad support and that really does create
49:53
an actionable way to help this problem
49:55
that is less clearly weaponizable, and I'm
49:57
just hoping for that. I'd
50:00
like to thank Adi for joining me on
50:02
the show, and thank you for listening. I hope you
50:05
enjoyed it. If you'd like to let us know what you thought about this episode or anything else,
50:07
you can email us at decoder@theverge.com. We do read
50:09
all the emails, and I will tell you, in
50:11
the last week, we got one email saying an
50:13
interview was the most boring ever, and another email saying
50:15
the interview was the best we'd ever done.
50:17
So keep them coming. You can also hit me up on
50:19
Threads, and we have a Decoder account on
50:21
Instagram.