Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
Alaska Airlines
0:02
and Hawaiian Airlines have come together,
0:04
bringing you more destinations and even
0:06
more rewards. Now your miles add
0:08
up no matter which airline you
0:11
fly. Head to Hawaii with Hawaiian
0:13
or explore Mexico, the Bahamas, the
0:15
East Coast, and beyond with Alaska.
0:17
Your loyalty just got a major upgrade.
0:19
More flights from the West Coast, more
0:21
perks, and more ways to earn. I'm
0:26
Leon Neyfakh, and I'm the host of
0:28
Slow Burn: Watergate. Before
0:30
I started working on this show, everything
0:32
I knew about Watergate came from the
0:34
movie All the President's Men. Do you
0:36
remember how it ends? Woodward and
0:38
Bernstein are sitting with their typewriters
0:40
clacking away. And then there's this
0:43
rapid montage of newspaper stories, about
0:45
campaign aides and White House officials
0:47
getting convicted of crimes, about audio
0:49
tapes coming out that proved Nixon's
0:51
involvement in the cover-up. The last
0:53
story we see is Nixon resigns. It
0:56
takes a little over a minute in
0:58
the movie. In real life, it took
1:00
about two years. Five men were arrested
1:02
early Saturday while trying to install eavesdropping
1:04
equipment. It's known as the Watergate incident.
1:06
What was it like to experience those
1:08
two years in real time? What were
1:10
people thinking and feeling as the break-in
1:12
at Democratic Party headquarters went from a
1:14
weird little caper to a constitutional crisis
1:16
that brought down the president? The
1:18
downfall of Richard Nixon was stranger,
1:21
wilder, and more exciting than you
1:23
can imagine. Over the course of
1:25
eight episodes, this show is going
1:27
to capture what it was like
1:29
to live through the greatest political
1:31
scandal of the 20th century. With
1:33
today's headlines once again full of
1:36
corruption, collusion, and dirty tricks, it's
1:38
time for another look at the
1:40
gate that started it all.
1:42
Subscribe to Slow Burn now, wherever
1:44
you get your podcasts. At this
1:46
time, I impose that sentence to
1:48
cover all 34 counts. Sir, I
1:51
wish you Godspeed as you
1:53
assume your second term in
1:55
office. Hi and welcome
1:58
back to Amicus. This is Slate's
2:00
podcast about the courts and the
2:02
law and the Supreme Court and
2:04
what the law once was and
2:06
where it's headed now and how
2:09
all of that changes our lives.
2:11
I'm Dahlia Lithwick. I cover the
2:13
courts and the law at Slate
2:15
and on Friday morning, the US
2:17
Supreme Court heard what can only
2:19
be described as the most important
2:21
case pitting free speech rights against
2:24
national security in the modern era.
2:26
In taking up Congress's impending TikTok
2:28
ban and doing so at lightning
2:30
speed, the High Court has really
2:32
inserted itself into this huge global
2:35
debate about media and technology and
2:37
publishing and security, and also, as
2:39
you're about to hear, my very
2:42
favorite... pasta, the court is also
2:44
poised to decide whether you are
2:46
going to literally be able to
2:49
open TikTok in eight days' time.
2:51
Slate senior writer Mark Joseph Stern
2:53
and I listened to the TikTok
2:56
arguments together and we're going to
2:58
be digging into those in a
3:00
couple of minutes. Hey there
3:03
Mark. Hi Dahlia! And
3:05
before we get to
3:08
TikTok, in other news,
3:10
the Donald J. Trump
3:13
accountability train made
3:15
its final stop in
3:18
Judge Juan Merchan's Manhattan courtroom
3:21
on Friday, where
3:23
the president-elect was
3:25
finally sentenced by
3:27
way of Zoom.
3:30
He is a
3:32
felon. As this court
3:34
has noted, the defendant's conduct,
3:36
quote, constitutes a direct attack
3:38
on the rule of law
3:40
itself. They all said, this
3:43
is not a case that
3:45
should be brought. The protections
3:47
afforded the office of the
3:49
president, are not a mitigating
3:51
factor. They do not reduce
3:54
the seriousness of the crime
3:56
or justify its commission in
3:58
any way. It's been
4:00
a political witch hunt. It
4:02
was done to damage my
4:04
reputation, so that I'd lose the
4:06
election, and obviously that didn't
4:08
work. After careful analysis and
4:11
obedience to governing mandates
4:14
and pursuant to the rule of law,
4:16
this court has determined
4:18
that the only lawful
4:20
sentence that permits entry
4:22
of a judgment of
4:24
conviction, without encroaching upon
4:26
the highest office in
4:28
the land, is an
4:30
unconditional discharge, which the
4:32
New York State legislature
4:34
has determined is a
4:36
lawful and permissible sentence
4:38
for the crime of
4:40
falsifying business records in
4:42
the first degree. Therefore,
4:44
at this time, I impose
4:47
that sentence to cover all
4:49
34 counts. Sir, I wish
4:51
you Godspeed as you assume
4:53
your second term in office.
4:55
Thank you. Now we knew
4:57
this was going to be an
4:59
unconditional discharge. There's no time. There's
5:02
no fine. So maybe the thing
5:04
to start with right now is
5:06
the fact that this sentencing went
5:08
ahead at all. Yes, this was
5:10
in many ways a huge
5:12
defeat for Donald Trump. I
5:14
think he expected the sentence
5:17
to be halted. He had
5:19
asked the Supreme Court in
5:21
a rather extraordinary and unusual
5:23
order to freeze the sentencing
5:25
and on Thursday night by
5:27
a five-to-four vote,
5:29
the Supreme Court refused with
5:31
Chief Justice John Roberts and
5:33
Amy Coney Barrett joining the
5:35
three liberals and presumably Justice
5:37
Samuel Alito haranguing them behind
5:39
the scenes and screaming at the
5:42
top of his lungs that they
5:44
were betraying their god emperor and
5:46
how dare they. And so on
5:49
Friday morning we were able to
5:51
listen as Trump who is about
5:53
to become president once again was
5:56
sentenced to multiple felony convictions, albeit
5:58
with no jail time or fine
6:00
or probation, and the man who is
6:02
soon to become leader of the free
6:05
world officially, formally and at long last,
6:07
got a rap sheet that will follow
6:09
him into the White House. And I
6:12
guess I should flag for listeners
6:14
that we spoke to Andrew Weissmann
6:16
on Thursday as we waited to
6:18
hear whether the Supreme Court was
6:20
going to allow the sentencing to
6:22
go forward. Slate Plus members, if
6:24
you have not listened to this
6:26
week's bonus conversation yet, it is
6:29
really worth your time. And if
6:31
you are not a Slate Plus
6:33
member yet, you can subscribe directly
6:35
from the amicus show page on
6:37
Apple Podcast and Spotify, or you
6:39
can visit slate.com/amicusplus
6:41
to get access wherever you listen.
6:44
And when Andrew and I recorded,
6:46
we didn't know what the Supreme
6:48
Court was going to do. And
6:50
Mark, I just want to say
6:53
again, the fact that this was
6:55
five to four came down late
6:57
Thursday night was a little tiny
6:59
bit crazy. Crazy that there were
7:01
four justices on the Supreme Court
7:04
who are going to say, hey,
7:06
this is a state matter, it's
7:08
a criminal matter, it's not resolved
7:10
or final in any way, but
7:13
Trump is awesome. Yeah, I mean, of
7:15
course it should have been nine
7:17
to zero, right? And as the
7:19
court pointed out in its very
7:21
short order, Trump still has appeals
7:24
available to him through the regular
7:26
appellate process. What he was seeking
7:28
here was an incredibly bizarre and
7:30
unusual kind of exception from the
7:32
rules to just run over to
7:35
his friends in black robes at
7:37
SCOTUS and say, I don't want
7:39
to have to do all that
7:41
later, just freeze the sentencing now.
7:43
and effectively put it off indefinitely,
7:46
maybe forever. And the majority decided
7:48
we're not going to take that
7:50
step. It gave these two reasons.
7:52
First, again, because the sentence can
7:54
be appealed. He still has all
7:56
these avenues available to him in
7:59
New York court. And second, because the imposition
8:01
on his duties as president-elect is
8:03
incredibly minor. All he had to
8:05
do was zoom into the courtroom
8:08
and sit there for a few
8:10
minutes. And he ended up saying
8:12
something, but he could have pretty
8:14
much just stayed silent and stared
8:16
at Judge Merchan. And as the
8:18
majority explained, even if we accept
8:21
the premise that there is some
8:23
constitutional power of president-elects who are
8:25
not yet president, who do not
8:27
yet have Article II authority, to
8:29
escape from a court proceeding or
8:31
a criminal prosecution because they're preparing
8:34
to become president, that this minimal
8:36
burden on Trump does not meet
8:38
that standard. So it was a
8:40
blast of reason from five justices,
8:42
a blast of crazy from the
8:44
four dissenters.
8:47
And at a bare minimum, I
8:49
guess I have to say something
8:51
nice about Roberts and Barrett, which
8:53
is that they at least drew
8:55
a line in the sand and
8:58
declined to extend the terrible presidential
9:00
immunity decision from last summer to
9:02
an entirely new set of facts
9:04
and circumstances. They cabined it within
9:06
some reason, unlike the four dissenters,
9:08
Thomas, Alito, Gorsuch, and Kavanaugh, who
9:10
would have just turned it into
9:12
a cudgel to let Trump get
9:14
out of any accountability in any
9:16
circumstance whatsoever. Yeah, I mean
9:18
on the one level it
9:20
feels a little bit like,
9:23
you know, John Roberts who
9:25
wrote that immunity opinion and
9:27
Amy Coney Barrett, who would
9:29
have really cabined it and
9:31
was clear about that at
9:33
the time, are slightly walking
9:35
back the absolutely sweeping decision.
9:37
But there's a way in
9:39
which this didn't matter. I
9:41
mean, it matters. It's incredibly
9:43
consequential that at minimum, there
9:45
was a sentencing, at minimum,
9:47
there was a moment at
9:49
which some accountability, possibly the
9:51
only accountability, was leveled. But
9:53
Mark, I gotta say, to
9:55
give them plaudits for
9:57
being like, hey, we were
10:00
super crazy, but we weren't
10:02
that crazy. Just feels very
10:04
on the nose. It also
10:06
feels, I gotta tell you,
10:08
extremely on the nose that
10:10
in the sort of reality
10:12
show "you're fired" version of
10:14
how we do law now,
10:16
we essentially had a sentencing
10:18
hearing in which it was
10:20
pronounced: yes, you're a felon;
10:22
yes, you did all this
10:24
really bad stuff; and yes,
10:26
it doesn't matter, congratulations, next
10:28
week you're the president. It
10:30
feels very, very, very of
10:32
this moment. But let's go on
10:35
to the sentencing hearing itself because
10:37
having allowed the thing to
10:39
go forward, it did in fact
10:42
go forward on Friday morning. Dahlia,
10:44
I understand that the sentencing was
10:46
in some ways an anti-climax, right?
10:49
For him to be given no
10:51
real punishment except the label of
10:53
felon after all this, it feels
10:55
a bit like a letdown, but
10:58
I guess what it... That's not
11:00
the right word. What tantalizes me is
11:02
the possibility that maybe John Roberts and
11:04
Amy Coney Barrett aren't going to be
11:06
in his camp for every single little
11:08
favor that he asks them for over
11:10
the next few years. Like I think
11:13
there's a theory of the chief especially
11:15
that he is a reactive justice, that
11:17
he is a pendulum that swings back
11:19
and forth depending on who's in the
11:21
White House. With Biden in the White
11:24
House, he swung far to the right.
11:26
Maybe with Trump in the White House,
11:28
he'll swing a little bit more toward
11:30
the center. And so while I get people
11:32
sort of mourning the death of the rule
11:34
of law that Trump effectively walks away without
11:37
any meaningful penalties, I guess I'm just going
11:39
to keep my eye on the chief. Maybe
11:41
I'm Lucy with the football. You can tell
11:43
me that I am. But I will hope
11:45
and pray that he has recovered some of
11:48
the faculties that somehow escaped him through four
11:50
years of Joe Biden's presidency. Again, my
11:52
friend, I think I just
11:54
want to say having been the
11:56
prime mover behind all three,
11:58
all three Trump cases last year
12:01
and written the opinions, the idea
12:03
that John Roberts is like, wait,
12:05
what, this sentencing hearing, this is
12:07
a bridge too far, absolute immunity,
12:10
totally okay, but this, too much.
12:12
Okay, I'm going to give you,
12:14
you're not Lucy, there's no football,
12:16
but I'm just saying I am
12:18
not, I am not changing my
12:21
orientation toward what happened last June
12:23
based on the like very trivial
12:25
win that was that an entirely
12:27
discredited ridiculous Hail Mary to try
12:30
to get this sentencing to not
12:32
happen. By one vote. Thank
12:34
you, John Roberts. Okay, Mark, let's
12:36
talk. I deserve that. No, let's
12:38
talk. We're talking about these last
12:41
gasps of criminal accountability. And now
12:43
we're not even talking about that.
12:45
We're not, we're now talking about
12:47
the last gasp of historical accuracy,
12:49
which is now in the hands
12:52
somehow of some like, like, mélange
12:54
of the 11th Circuit, Aileen Cannon,
12:56
and possibly the Supreme Court. Can
12:58
we just. get a reading on
13:00
whether you and I and the
13:03
rest of the people who want
13:05
to know the things will ever
13:07
get to see the Jack Smith
13:09
reports, part one, part two. Mark,
13:11
you wrote so eloquently earlier this
13:14
week about what the hell was
13:16
Judge Cannon doing with this case.
13:18
Where do we stand as of
13:20
this moment? So I think someone
13:23
must have switched my allergy medication
13:25
with Hopium because I think that
13:27
we will see the Jack Smith report
13:29
that involves Trump's attempted election
13:31
subversion on January 6th. To recap,
13:34
Trump and his former co-defendants had
13:36
asked both Judge Cannon and the
13:38
11th Circuit to block the Justice
13:40
Department from releasing any part of
13:42
Jack Smith's report, even though under
13:45
Justice Department regulations when a special
13:47
counsel has wrapped up their case, they
13:49
are supposed to transmit that report
13:51
to the Attorney General, who gets
13:53
to decide whether to release it
13:56
to Congress and the public. Judge
13:58
Cannon jumped ahead of the 11th
14:00
Circuit and issued a remarkable injunction
14:02
that prohibited the Justice Department from
14:05
issuing the report, even though she
14:07
no longer had jurisdiction over the
14:09
case, which had been appealed, and
14:11
was literally powerless to act. That
14:13
has never stopped her before. It
14:16
didn't stop her this time. She
14:18
acted anyway. She issued an injunction.
14:20
But then on Thursday night, the
14:22
11th Circuit stepped in and said,
14:24
you know what, we don't think
14:27
so. The 11th Circuit denied that
14:29
same request from Trump and his
14:31
co-defendants. The 11th Circuit said, we
14:33
will not be blocking the Justice
14:35
Department from releasing any part of
14:38
the Smith report. And this is
14:40
the real icing on the cake.
14:42
The 11th Circuit invited the Justice
14:44
Department to appeal Judge Cannon's injunction
14:46
and with a very big wink,
14:49
basically suggested that it will overturn
14:51
that injunction ASAP. So this is
14:53
a big vindication for Jack Smith and
14:55
Merrick Garland, who has indicated that he wants
14:58
to release this report about election
15:00
subversion to the public as quickly
15:02
as possible. Now there's a second
15:04
volume of the report that is
15:06
all about the classified documents case.
15:09
Garland says he doesn't want to
15:11
release that to the public yet
15:13
because the co-defendants still face criminal
15:15
charges and it could prejudice the
15:17
case against them, but he wants
15:20
to release it to Congress. And
15:22
I think if he releases it
15:24
to Congress, there's a very good
15:26
chance that the public will eventually
15:28
see it. All that really matters
15:31
now is that when Trump and
15:33
his co-defendants ask the Supreme Court
15:35
to step in and block the
15:37
release of the report, the Supreme
15:40
Court doesn't run out the clock,
15:42
it acts promptly, it denies the
15:44
request, and then even if it's
15:46
at 11:59 a.m. on January 20th,
15:48
we are getting our hands on
15:51
at least some of Jack Smith's final
15:53
report. It's happening. We are going
15:55
to take a short break, and
15:57
when we come back, Mark and
15:59
I will be joined by Gautam
16:01
Hans, professor, First Amendment expert, and
16:03
signatory to one of the amicus
16:06
briefs supporting TikTok in the giant
16:08
case that the court heard on
16:10
Friday. This podcast is brought to
16:12
you by Progressive Insurance. Do you
16:14
ever think about switching insurance companies
16:16
to see if you could save
16:18
some cash? Progressive makes it easy.
16:20
Just drop in some details about
16:23
yourself and see if you're eligible
16:25
to save money when you bundle
16:27
your home and auto policies. The
16:29
process only takes minutes and it
16:31
could mean hundreds more in your
16:33
pocket. Visit progressive.com after this episode
16:35
to see if you could save.
16:37
Progressive Casualty Insurance Company and affiliates.
16:40
Potential savings will vary, not available
16:42
in all states. Hi,
16:48
I'm Courtney Martin from Slate's advice
16:50
podcast, How To. Online scams seem
16:52
like they're everywhere these days, and
16:54
our elders are particularly vulnerable. A
16:56
few months ago, my parents were
16:58
the victims of a scam, and
17:00
thankfully at that time, they did
17:02
not lose any money, but we're
17:05
really scared about what could happen
17:07
in the future. That's our listener,
17:09
May. Check out her conversation with
17:11
a fraud prevention expert on our
17:13
new episode. It's filled with tips
17:15
and resources that could help you
17:17
and your loved ones steer clear
17:19
of online scams. Listen to how
17:21
to wherever you get your podcasts.
17:27
Okay, so now we are going
17:29
to turn to One First Street
17:32
and the First Amendment and TikTok
17:34
and Mark you and I listened
17:36
to the arguments on Friday morning
17:38
and so did Gottam Hans who
17:40
joins us now Gottam is a
17:42
clinical professor of law and founding
17:45
director of the Civil Rights and
17:47
Civil Liberties Clinic at Cornell Law
17:49
School. He analyzes through research and
17:51
advocacy how new and developing technologies
17:53
implicate law, privacy, and data protection
17:56
and public policy. Gautam, we want
17:58
to just start by welcoming you
18:00
to Amicus. Thank you for joining
18:02
us to thrash out whatever it
18:04
was that happened in Friday's TikTok
18:06
case. Thank you both for having
18:09
me. I'm a long-time listener, but
18:11
a first-time caller, and looking forward
18:13
to discussing Friday morning's mess. So
18:15
I wondered if you would just
18:17
start for our listeners who maybe
18:19
are not as read into what
18:22
this is all about with a
18:24
tiny bit of table set. And
18:26
can you describe the ban and
18:28
describe the divestiture obligation component of
18:30
the ban and just tell us
18:33
sort of what Congress was thinking
18:35
and when all this is meant
18:37
to go into effect, which is
18:39
like Pro Tip, it's really soon.
18:41
So this issue has a fairly
18:43
long gestation because some followers of
18:46
the TikTok federal government interaction may
18:48
remember that President Trump tried to
18:50
ban the app or do something
18:52
similar to divestiture in his first
18:54
term. That was blocked by the
18:56
courts because it was a purely
18:59
executive action. It was not authorized
19:01
by Congress. It was an overstep
19:03
according to the courts. We have
19:05
a change of administration. President Biden,
19:07
that administration, initiated some discussions with
19:10
the company to try to lead
19:12
to a divestiture of some sort
19:14
or some kind of remedy for
19:16
the problems. There was a long
19:18
negotiation between the executive branch and
19:20
the company unsuccessfully. Then about a
19:23
year ago in early 2024, Congress
19:25
steps in and enacts the law
19:27
that's at issue in the case
19:29
that was argued on Friday. The
19:31
law has a really long name,
19:33
which I... never remember; we just
19:36
call it the TikTok ban. That law
19:38
was supported on a bipartisan basis.
19:40
The claim was that there were
19:42
national security concerns about TikTok's
19:44
ownership structure, because TikTok US
19:46
is controlled by a Chinese-based company
19:49
and as we know the Chinese
19:51
government has a lot of obligations
19:53
on its companies to share data
19:55
in ways that the US was
19:57
concerned about as affecting national security.
20:00
So the law sort of talks
20:02
about foreign adversaries and social media
20:04
apps or technology companies that are
20:06
controlled by foreign adversaries as a
20:08
general matter, but as a specific
20:10
matter, for TikTok, requires a divestiture
20:13
of the company by January 19th,
20:15
2025, and if that divestiture doesn't
20:17
happen, then what we, you know,
20:19
call the ban basically will go
20:21
into effect. And maybe just one
20:23
more beat on the table set,
20:26
which is just describe who challenges
20:28
the ban and what happens at
20:30
the DC circuit below. Right. So
20:32
there are two groups that are
20:34
challenging the ban. The first is
20:37
TikTok itself. The company argues
20:39
various things in the DC circuit.
20:41
One thing that's I think a
20:43
bit relevant here is that this
20:45
law is a little weird in that
20:47
it allowed for an expedited review.
20:50
There were no trial court findings
20:52
in this case. The statute authorized
20:54
the company to challenge it immediately
20:56
in the court of appeals in
20:58
the DC circuit. So this happened
21:00
really fast without any fact finding.
21:03
The other group that is challenging
21:05
the law are the users of
21:07
TikTok, the users, the creators, they
21:09
have First Amendment interests in sharing
21:11
and experiencing content on the app,
21:14
and so they have a separate
21:16
claim based on the First Amendment,
21:18
and because of the sort of
21:20
foreign versus domestic dynamics that ended
21:22
up getting some airtime in the
21:24
argument, those cases were consolidated, the
21:27
arguments were done in the fall
21:29
of 2024, the decision issued in December
21:31
of 2024, the ban was supposed to
21:33
go into effect, which led to the
21:35
very expedited review in the Supreme Court,
21:37
with briefs, unfortunately, for the parties
21:40
and for me, as an amicus
21:42
supporting the company, due at
21:44
the end of December, and oral
21:46
argument at the beginning of January
21:48
2025. And the DC Circuit, Mark,
21:51
you can explain really quickly, but
21:53
the DC Circuit, a pretty cross-ideological
21:55
panel just straight up thumbs up
21:57
for the Biden administration. Yeah, that's
21:59
right. It was a three-judge panel,
22:01
two conservative judges, one liberal judge,
22:04
unanimously sided against TikTok, and the
22:06
majority, the two conservative judges, said
22:08
even if strict scrutiny applies, the
22:10
highest standard of judicial review, this
22:12
law survives strict scrutiny. The concurring
22:14
judge, the liberal judge, Judge Sri
22:17
Srinivasan, said, well, I think that
22:19
we should apply intermediate scrutiny, which
22:21
is a little bit more relaxed,
22:23
and the law clearly survives intermediate
22:25
scrutiny. Across the board, you know,
22:28
very strong opinions for the Biden
22:30
administration, which at least for the
22:32
next like nine days is responsible
22:34
for defending this law, and TikTok
22:36
appealed to the Supreme Court, which
22:38
very, very quickly took up the
22:41
case and set it on its
22:43
rocket docket. It acted much more
22:45
quickly on this than on, say,
22:47
the Trump immunity case, which shows
22:49
that when the justices really want
22:51
to decide something fast, they know
22:54
how to do it. Yeah, one
22:56
of the ironies of listening to the
22:58
arguments was that they were able
23:00
to fast track... this oral argument
23:02
and get the briefing in and
23:05
do it all really quickly. And
23:07
yet there are no time limits
23:09
left at oral arguments. And so
23:11
it sprawled on for what felt
23:13
like 200 million years. I spent
23:15
a good amount of the argument
23:18
Googling: how long did Bush v. Gore
23:20
really take to be argued? I
23:22
just want to flag for listeners
23:24
who was arguing the cases. TikTok
23:28
was represented by former Solicitor General
23:28
Noel Francisco, the content creators, in
23:31
other words, people who say, look,
23:33
I am a content creator, I
23:35
need TikTok to get my message
23:37
out, I can't be nearly as
23:39
salient and forceful and important on
23:42
other platforms, that was represented by
23:44
Jeff Fisher, and the Biden administration
23:46
was represented by Solicitor General Elizabeth
23:48
Prelogar in her last oral argument.
23:50
I want to add just one
23:52
more kind of prefatory, what does
23:55
this all mean, and this is
23:57
for you, Gautam, but I
23:59
think that part of what is
24:01
really, really complicated in this case
24:03
is that there's two different justifications
24:05
for the law. And it's sort
24:08
of looming over everything. And one
24:10
is this national security justification. Look,
24:12
this is a foreign entity. It
24:14
is an enemy of the United
24:16
States. We know what they're doing
24:18
with our data. And then there's
24:21
this other kind of very squishy
24:23
First Amendment one. We kind of don't
24:25
like this stuff that they may
24:27
be encouraging. And I would love
24:29
it if you could sort of
24:32
parse through for us because I
24:34
think as you suggested this was
24:36
a drumbeat throughout the entire argument
24:38
that these two justifications don't necessarily
24:40
kind of rub along together with
24:42
equal force. My main takeaway from
24:45
this argument is that this is
24:47
why like traditional deliberation is a
24:49
virtue in many situations. There are
24:51
so many complicated issues in this
24:53
case that I think really deserved
24:55
a lot of time, preferably not
24:58
in a two-hour, 40-minute oral argument on
25:00
an expedited basis. And there are
25:02
important governmental interests. I think basically
25:04
everyone concedes. National Security is one.
25:06
What is the tailoring that's appropriate?
25:09
Would a law of general applicability
25:11
be better or worse? Because as
25:13
I mentioned, this law specifically names
25:15
TikTok in its legislative language. This
25:17
is not a law that applies,
25:19
you know, across the board. Do
25:22
we think that there are real
25:24
problems with the data economy of
25:26
collecting massive amounts of data and
25:28
not really deleting it? How do
25:30
we think about corporate structures? You
25:32
know, does the foreign ownership dynamic
25:35
versus domestic ownership make a difference?
25:37
Is that why the users maybe
25:39
had a better shot? I think
25:41
during the argument the lawyer for
25:43
the users Professor Jeff Fisher at
25:46
Stanford I think got a better
25:48
treatment than Noel Francisco, the former
25:50
solicitor general under President Trump when
25:52
he was representing TikTok? This is
25:54
a big swirl and I get
25:56
it. You know, this is a
25:59
complicated case, but as a really
26:01
baseline matter, my feeling is that
26:03
I started out as a privacy
26:05
attorney. I am really concerned about
26:07
data collection. I understand that national
26:09
security is an issue. I understand
26:12
the foreign political economy. Those are
26:14
all good justifications, but this law
26:16
doesn't do a great job of
26:18
answering the moment, and that's where
26:20
I think the First Amendment problems
26:23
reside. So can I just sort
26:25
of distill from that? I mean,
26:27
I agree completely those were all
26:29
of the issues swirling around. I
26:31
think the Solicitor General tried to...
26:33
reach out and grab all of
26:36
these disparate justifications and issues and
26:38
put them into two buckets. There
26:40
was a first justification for the
26:42
law that said this is about
26:44
data protection, data security, that the
26:46
Chinese government has access to TikTok's
26:49
data and TikTok is collecting a
26:51
massive amount of data from its
26:53
users about their age and location
26:55
and activity and all of this
26:57
really sensitive stuff. And the government
27:00
of America is worried that the
27:02
government of China is going to...
27:04
that data and somehow weaponize it.
27:06
Maybe blackmail users, maybe turn them
27:08
into spies against the US government.
27:10
And then there's the second justification,
27:13
which is the one that I
27:15
think gives us all the heebie-jeebies and
27:17
definitely disturbed the justices, which is
27:19
the idea that the Chinese government
27:21
can manipulate the speech on TikTok
27:23
in order to further the interests
27:26
of the Chinese Communist Party and
27:28
somehow undermine America. And so the
27:30
second justification here seems to be
27:32
aimed at suppressing speech, right, aimed
27:34
at saying we don't like this
27:37
speech and we want less of
27:39
it or we want to manage
27:41
it differently or we want to
27:43
ban it and tell me if
27:45
I'm wrong, but I felt like
27:47
that got more air time and
27:50
seemed to rankle several of the
27:52
justices across ideological lines. I think
27:54
that is correct. I think that
27:56
part of the challenge with this
27:58
law and this issue is that
28:00
it's all spiraled up with politics.
28:03
Now, I'm of the opinion that
28:05
most laws are rolled up
28:07
with politics, but we have, during
28:09
the time of the enactment, former
28:11
representative Mike Gallagher, saying, this is
28:14
digital fentanyl, they're coming for our
28:16
kids, won't someone please think of
28:18
the children, we have other Congress
28:20
people complaining or stating that the
28:22
app is biased in favor of
28:24
Palestinians, you know, these are all
28:27
classic First Amendment problems, and
28:29
so what you get is... I
28:31
think some really politically inflected concerns
28:33
about China, and also in addition
28:35
to some politically inflected concerns about
28:37
China, larger sort of who can
28:40
influence the American populace. There was
28:42
a joke about people not really
28:44
agreeing very much anymore these days.
28:46
Oh, go ahead. I was just
28:48
going to say, did I understand
28:50
you to say a few minutes
28:53
ago that one problem is that
28:55
ByteDance might be,
28:57
through TikTok, trying to get Americans
28:59
to argue with each other. That
29:01
it might be just trying to
29:04
fill in. If they do, I
29:06
say they're winning. I'll just say
29:08
in regards, Mark, to your first
29:10
point about the Chinese data collection,
29:12
I think the Solicitor General conceded
29:14
or admitted that some of this
29:17
is future-looking, right? This idea that
29:19
someday a future president, like the
29:21
fact that they were, you know,
29:23
liking things on TikTok that
29:25
maybe were a little prurient, is
29:27
going to be concerning. What happens
29:30
if Iran buys
29:32
Facebook? Right, or some other, or
29:34
we go to war with Sweden
29:36
and Spotify Wrapped, you know, ends
29:38
up being something that's humiliating. I
29:41
think those, those future concerns, like
29:43
I understand it, but what's the
29:45
limiting principle and why are we
29:47
singling out one company when we
29:49
could be doing some like, I
29:51
think, a better thought out, more
29:54
general regulation. I agree with you
29:56
on the second point about the
29:58
concerns about foreign propaganda manipulation. It's
30:00
just editorial discretion by another name.
30:02
I agree with that and I
30:04
think that the challenges here are
30:07
going to be, as they often
30:09
are, in First Amendment cases, line
30:11
drawing ones, and the lines I
30:13
think are not nearly as stable
30:15
as the court might want us
30:18
to believe. Right, this is in
30:20
a weird way and the argument
30:22
really felt like this. It was
30:24
like a national security case and
30:26
a First Amendment case walk into
30:28
a bar. Like they were two
30:31
totally different cases and you could
30:33
feel everybody toggling back and forth
30:35
depending on which of those they
30:37
thought they were going to win
30:39
in the moment. But I think
30:41
all three of us agree that
30:44
the Biden administration won, and you
30:46
could see that in you know
30:48
all the real-time kind of reflections
30:50
on this, this felt like a
30:52
rout, and at least for a while
30:55
both Francisco and Fisher were arguing
30:57
the case it felt like people
30:59
were like is this eight one
31:01
or seven two like this is
31:03
you know, TikTok, it's over, and
31:05
yet what was really interesting and
31:08
these are the sort of confounding
31:10
questions you're both raising. The court
31:14
certainly roughed up Elizabeth Prelogar. They
31:14
were not, I think, in the
31:16
main, mollified by the answers she
31:18
was giving. And I think that
31:21
it suggested to me, and maybe
31:23
this is part of what you're
31:25
trying to tell us, Gautam, is
31:27
that this might come out as
31:29
a very, very simple answer to
31:32
a set of problems that the
31:34
court was clearly uncomfortable with both
31:36
sides. I agree with your view
31:38
that the company is going to
31:40
lose, which is what I thought
31:42
beforehand. I thought it would be
31:45
closer. It was not. I mean,
31:47
there's lots of clues as to
31:49
why the company is going to
31:51
lose, but one that I think
31:53
is really telling for the First
31:55
Amendment piece, even though I agree
31:58
that Solicitor General Prelogar had a
32:00
tougher time at points than one
32:02
might have guessed based on the
32:04
company and creator arguments, is that
32:06
there was very little discussion about
32:09
the slippery slope, which is a classic
32:11
First Amendment concern. You know, what
32:13
we do in this case is
32:15
going to reverberate along for many
32:17
other speakers. Were the
32:19
company and the creators going to
32:22
prevail, there would, I think, be
32:24
more discussion about why they have
32:26
to win because if they don't,
32:28
everyone else is going to lose
32:30
in the long run. You didn't
32:32
really get that. You didn't really
32:35
get the troubling hypotheticals for the
32:37
solicitor general. Well, like, if you
32:39
do this, what's going to stop
32:41
Congress from doing that? You know,
32:43
the people remember probably from the
32:46
Trump immunity cases, the SEAL Team
32:48
six assassinating political enemies as the
32:50
really scary hypothetical. There was none
32:52
of that in this argument, which
32:54
I think demonstrates that the
32:56
justices don't see a slippery slope and
32:59
will just create an easy answer.
33:01
Probably based on the foreign ownership
33:03
dynamic is my immediate guess, but
33:05
even that too I think has
33:07
some slippery slope dynamics that didn't
33:09
really get addressed and I think
33:12
should concern people who have fears
33:14
about what a Congress that wants
33:16
to designate people as national security
33:18
threats or organizations or foreign governments
33:20
as national security threats where that
33:22
limit might lie. My sense was
33:25
that certainly through Noel Francisco's argument
33:27
and through Jeff Fisher's argument, there
33:29
was this kind of like shadow
33:31
of this ownership problem of, you
33:33
know, this is just a foreign
33:36
corporation and this is just, you
33:38
know, a subsidiary and there's no
33:40
free speech rights that attach to
33:42
that, like, this is all really
33:44
simple. There's no First Amendment issue
33:46
here. That took up a lot
33:49
of oxygen for at least the
33:51
first half of the arguments. Mark.
33:53
Yeah, I mean, it's confusing because
33:55
TikTok is the named plaintiff and
33:57
TikTok is an American company, but
33:59
TikTok is owned by ByteDance, which
34:02
is a Chinese company, and the
34:04
Supreme Court has said that foreign
34:06
citizens and foreign corporations operating on
34:08
foreign soil don't have First Amendment
34:10
rights that they can vindicate in
34:13
federal court here in the United
34:15
States. And so Clarence Thomas asked
34:17
right out of the gate, and
34:19
I think it's an important question.
34:21
Is TikTok, an American company,
34:23
really just trying to vindicate the
34:26
free speech rights of ByteDance,
34:28
a Chinese company? You're converting the
34:30
restriction on ByteDance's ownership
34:32
of the algorithm and the
34:34
company into a restriction on
34:36
TikTok's speech. So why can't
34:38
we simply look at it
34:41
as a restriction on ByteDance?
34:43
And then Justice Ketanji Brown
34:45
Jackson asked a similar series
34:47
of questions about whether TikTok
34:49
is sort of defending its
34:51
right to associate with a
34:53
Chinese-owned foreign company. And I
34:55
don't think we have really
34:58
good answers to those questions
35:00
because the facts of the case are
35:02
so curious here. I mean, it does
35:04
seem to me that an American company
35:06
should not forfeit its First Amendment rights
35:09
just because it's owned by a foreign
35:11
company. And this was something that Justice
35:13
Neil Gorsuch brought up again and again.
35:15
He had his libertarian hat very much
35:18
on Friday, and I think it's a
35:20
genuine concern. On the other hand, this
35:22
is, I think, a unique feature of
35:24
the threat here, which is that Chinese
35:26
law requires Chinese companies to share data
35:29
upon demand. And so even though TikTok
35:31
is an American corporation, we all know
35:33
corporations are people. No one is really
35:35
disputing that here. No one is really
35:38
focusing on the corporate speech aspect. We
35:40
all believe in Citizens United now. You
35:42
know, everyone agrees TikTok can speak. But
35:44
if TikTok is speaking and simultaneously turning
35:46
over data to ByteDance, which is
35:49
turning over data to the Chinese Communist
35:51
Party, then that is a problem. And I
35:53
don't know that I saw the justices feel
35:55
their way out of this maze. Gautam,
35:57
I'm curious if you did. I feel like
35:59
it was this huge unresolved question that
36:01
a bunch of justices, you know,
36:03
Thomas, Jackson, Gorsuch, Justice Amy
36:05
Coney Barrett as well, kept poking
36:07
at to not very much success.
36:09
Yeah, I agree with that. This
36:11
came up a little bit when
36:13
it comes to sort of the
36:16
least restrictive means component and narrow
36:18
tailoring of First Amendment analysis. And
36:20
I think that the company would
36:22
say, well, you know, the issue
36:24
here is that even if you're
36:26
concerned about this data sharing question,
36:28
and even if you don't think
36:31
that a generally applicable law covering
36:33
all data-collecting companies would work,
36:35
then the limit is to prevent the
36:37
company from sharing information or use
36:39
that national security agreement negotiation that
36:42
the Biden administration failed to come
36:44
to agreement on with the company.
36:46
There are other methods of doing
36:48
this. I think that the foreign
36:51
ownership piece, you know, I... made the
36:53
wise decision, maybe, or unwise, of having
36:55
never taken corporations in law school, so
36:57
this is not my area. But I
36:59
was trying to think of some
37:01
hypotheticals and there was some talk of
37:04
this about the you know foreign communist
37:06
party you know and what came to
37:08
mind for me and this is not
37:10
fully baked so I might revise it
37:12
with some more rest. What if the
37:15
US government said, well Ethel Rosenberg is
37:17
an agent of the USSR and the
37:19
Soviet Communist Party, and yet she might be
37:21
an American citizen, but really we all
37:23
know that she's just some, you know,
37:26
planted by the Soviets and they're controlling
37:28
her? So we're going to sentence her
37:30
to death or we're going to put
37:32
her in civil confinement. That seems to
37:34
me maybe an analogous situation that we,
37:37
I think from a First Amendment matter,
37:39
would have some concerns about, you know,
37:41
whether or not corporations are people, they
37:43
seem to be for now. And so,
37:45
for me, the sort of ByteDance–TikTok US relationship
37:47
is a bit of a red herring, or
37:50
at least it's a path to go down
37:52
that I think is hard to disentangle, which
37:54
is maybe why, I think, Dahlia, you're
37:56
right to say, there's going to be an
37:58
off-ramp that's very... I think limited in
38:01
scope and just says maybe we
38:03
didn't want to get into this
38:05
mess and like what's the fastest
38:07
way off the freeway because we
38:09
don't actually have the time or
38:12
the capacity or the record to
38:14
really delve into the sort of
38:16
Gordian knot that this case and
38:18
these issues present. We are
38:20
going to take a short break. Your
38:22
data is like gold
38:24
to hackers. They're
38:27
selling your passwords, bank details, and
38:29
private messages. McAfee helps stop them.
38:31
Secure VPN keeps your online activity
38:34
private. AI-powered text scam detector spots
38:36
phishing attempts instantly. And with award-winning
38:39
antivirus, you get top-tier hacker protection.
38:41
Plus, you'll get up to $2
38:43
million in identity theft coverage, all
38:46
for just $39.99 for your first
38:48
year. Visit McAfee.com. Cancel any
38:51
time, terms apply. I
38:53
can say to my new
38:55
Samsung Galaxy S25 Ultra, find a
38:57
keto-friendly restaurant nearby and text it to
38:59
Beth and Steve. And it does without
39:02
me lifting a finger, so I can
39:04
get in more squats anywhere I can.
39:06
One, two, three. Will that be cash
39:09
or credit? Credit. Four. Galaxy S25
39:11
Ultra. The AI companion that does
39:13
the heavy lifting, so you
39:15
can do you. Get yours
39:17
at samsung.com. Compatible with select apps; requires
39:19
Google Gemini account; results may
39:21
vary based on input. Check for accuracy.
39:24
This episode is brought to
39:26
you by Shopify. Forget the
39:28
frustration of picking commerce platforms
39:30
when you switch your business
39:33
to Shopify. The global commerce
39:35
platform that supercharges your selling
39:37
wherever you sell. With Shopify,
39:39
you'll harness the same intuitive
39:42
features trusted apps and powerful
39:44
analytics used by the world's
39:46
leading brands. Sign up today
39:48
for your $1 per month
39:51
trial period at shopify.com/tech.
39:53
And we are back
39:56
talking TikTok and National Security
39:58
and the First Amendment. Gautam
40:00
Hans. So, Gautam, you actually
40:02
mentioned that the really hinky
40:05
issue here is the First
40:07
Amendment content-based problem. You can't get
40:10
away from it. And I
40:12
just want to play a
40:14
little bit of audio. Here
40:17
is Justice Kagan saying to
40:19
Solicitor General Elizabeth Prelogar, more
40:22
or less: Dude, you keep
40:24
talking about content-based manipulation. That's
40:27
a content-based rationale. I think
40:29
you've just given your thing
40:31
away, because content manipulation is
40:34
a content-based rationale. We think
40:36
that this foreign government is
40:38
going to manipulate content in
40:40
a way that will, that
40:42
concerns us and may very
40:45
well affect our national security
40:47
interests. Well, that's exactly what
40:49
they thought about Communist Party
40:51
speech in the 1950s, which
40:53
was being scripted in large
40:56
part by international organizations or
40:58
directly by the Soviet Union. I
41:00
mean, this is in some
41:02
sense, as you've said from
41:04
the jump here, the strongest
41:06
rationale, right? You can't say,
41:08
we don't like the content-based
41:10
manipulation that is coming from
41:12
the platform and we're going
41:14
to pretend that this is
41:16
a content-neutral decision, right? Yeah, which
41:19
I think is... Not really supported
41:21
by the precedent, most notably the
41:23
cases from last year involving Texas
41:25
and Florida's social media laws, the
41:27
NetChoice cases, which I also
41:29
was involved in as an amicus
41:31
sign-on. I think that the NetChoice
41:34
cases made editorial discretion for
41:36
internet companies pretty clearly protected by
41:38
the First Amendment. The implications of
41:40
that holding in NetChoice, I
41:42
think, are relevant to the TikTok
41:44
case and yet I don't think
41:46
the justices were there yet in
41:48
their questioning. So what that means is
41:50
that maybe upon further reflection, although when
41:53
that happens and with what deliberation remains
41:55
unclear, that should come to light. I
41:57
think that the sort of insight that
41:59
going after content manipulation or
42:01
sorting is actually a content-based distinction
42:03
was starting to come to light
42:05
for the justices, which I think
42:07
either means that it will get engaged
42:10
with on a meaningful level or maybe
42:12
more likely that we'll get a short
42:14
opinion that defers for another day the
42:16
complexities that we've been grappling with today.
42:19
Can I say I did have a
42:21
little bit of a problem with Justice
42:23
Kagan's hypothetical here? I mean I get
42:25
the appeal of this analogy but it
42:28
reminds me a bit of when gun
42:30
extremists compare muskets to AR-15s in Second
42:32
Amendment cases and say, well, if they
42:35
didn't ban muskets then we can't ban
42:37
assault rifles the Soviets were pushing
42:39
American communists in the 1950s to
42:41
publish certain pro-Soviet stuff in magazines.
42:43
Okay, how many people really read
42:45
those magazines? What was their reach?
42:48
We're talking here about a social
42:50
media platform that people are addicted
42:52
to, that 170 million some odd
42:54
Americans use, and that has an
42:56
extraordinary and immediate reach. And we
42:58
also know that there are connections
43:00
between social media and a whole
43:03
lot of problems in our society.
43:05
To pick one out of thin air, eating
43:07
disorders, right? We know there can
43:09
be a connection between eating disorders
43:11
and social media use and social
43:13
media glorifying eating disorders and unrealistic
43:16
body image. What if the Chinese
43:18
government said, one way we're going
43:20
to undermine our adversary America is
43:22
to try to accelerate the epidemic
43:24
of eating disorders? And they just
43:26
pummeled TikTok by tweaking the
43:29
algorithm with content that glorified eating
43:31
disorders all aimed at teenage girls.
43:33
And they know which of their
43:35
users are teenage girls. And they
43:37
just direct all of this content that
43:39
is known to or very likely to
43:42
encourage disordered eating at a specific subset
43:44
of users in order to
43:46
ruin their lives and harm their families
43:48
and ultimately destabilize America. I know it
43:50
sounds a little weird, this whole case
43:53
is about hypotheticals, but there is no
43:55
comparison between that and what the Soviet
43:57
Union could have done with crappy little
43:59
magazines. like door-to-door by pathetic communists
44:01
in the 1950s who really wanted
44:04
a revolution that was obviously never
44:06
going to happen. I just don't
44:08
see it. And so I feel
44:10
like this brings us to the
44:13
limit of analogies between modern technology
44:15
and the old speech cases. It
44:17
seems to me we're just in
44:19
a different world. And at a
44:22
certain point, the speech issues become
44:24
so fundamentally distinct that they can't
44:26
be reasoned out by analogy to
44:29
pen-and-paper publications that
44:31
just do not apply to
44:33
the digital age. I am
44:35
sympathetic with the dynamic that
44:37
you're describing. The limits of
44:39
analogy are, you know, really
44:41
contested in the scholarly
44:43
space and, you know, we could
44:46
teach a seminar on this together,
44:48
Mark, if you want to start commuting
44:50
to Ithaca. But I think that as
44:52
an analytical matter and a doctrinal matter,
44:55
this is not the world in which
44:57
we are in. And welcome to what
44:59
it means for me as a progressive
45:01
civil libertarian, which is to say that
45:04
I live in a world where I
45:06
disagree heavily with a lot of what
45:08
the First Amendment doctrines are, and yet
45:10
I have to litigate and analyze and
45:13
think about that world. I would love
45:15
for there to be a moment
45:17
in which we radically rethink the
45:19
First Amendment architecture in which we
45:21
all are forced to live. This
45:23
case should not be that moment
45:25
for many reasons, one of which
45:27
is the fact that we're talking
45:29
about disfavored speakers when I mentioned
45:31
the Israel-Palestine issue that was raised,
45:33
one of which is the alacrity
45:35
with which this case came to
45:37
us, and one of which is
45:39
the... speed at which the court's
45:41
going to have to make a
45:43
decision, and I think that the
45:45
real hesitance to go into the
45:47
deep end of this pool means that we're
45:49
not going to get that kind of really
45:52
thoughtful, necessary re-architecting of free speech
45:54
that I think we all agree
45:56
needs to happen. So the world
45:58
in which we're in, I think
46:00
we have to figure out what the
46:02
right answer is within that world.
46:04
It's tempting. There were so many
46:06
moments in the case where I
46:08
thought of the famous garage door
46:11
opener tech case, right? Like where the
46:13
justices are like, I don't know what
46:15
I'm talking about right now. And there
46:17
were certainly moments today where it seemed
46:19
as though sort of the scope, the
46:21
tech scope of what was going on
46:24
was elusive for some of them,
46:26
whereas the national security scope was
46:28
really clear. And I wonder if
46:30
that's why like when you get
46:32
a national security case and a
46:34
First Amendment tech case walking into
46:36
a bar and you kind of
46:38
don't understand some fundamentals, that
46:40
it gets easier to sort of
46:43
put your thumb on the scale
46:45
of national security. I do want
46:47
to sort of at least say
46:49
that I think, you know, if,
46:51
as Mark suggested up top, you
46:53
know, there are two rationales here: there's
46:55
an easy one and a hard
46:57
one and the easy one is
46:59
oh my god they're collecting our
47:02
data and using it in nefarious
47:04
ways that we may or may
47:06
not be able to explain or
47:08
know about and then this really
47:10
really very hard one which is
47:12
this is kind of a content
47:14
moderation case that is about not
47:16
liking the speaker and it feels
47:19
to me as though at some
47:21
point Brett Kavanaugh kind of challenged
47:23
Elizabeth Prelogar to say, like, more
47:25
or less, and we'll just play
47:27
it here. I don't know how
47:29
you want us to think about
47:31
those two rationales. They're two completely
47:34
different things. Which one wins in
47:36
a foot raise? How are we
47:38
supposed to think about the
47:40
two different rationales here and
47:42
how they interact? The data
47:44
collection rationale, which seems to
47:46
me at least very strong.
47:49
The covert content manipulation rationale,
47:51
as the hypotheticals have illustrated,
47:53
raised much more challenging questions
47:55
for you about how far
47:57
that goes and if that
47:59
alone... If you didn't have the
48:01
data collection piece, you only had
48:03
the covert content manipulation piece. And
48:05
then Mr. Fisher's point, Mr. Francisco,
48:07
is that Congress would not have
48:09
enacted this just based on the
48:11
data collection rationale alone. Just your
48:14
understanding of how the two arguments
48:16
fit together. Mark, how does this
48:18
shake out? I think it shakes
48:20
out in Solicitor General Prelogar tacitly
48:22
acknowledging maybe that the data privacy
48:24
rationale is stronger. I think that
48:26
for institutional reasons, she couldn't pick
48:28
a favorite child here. But I
48:30
do think that throughout the course
48:32
of her argument, she was leaning
48:34
more and more toward that rationale.
48:36
I just want to add like,
48:38
I think that there there's an
48:40
open question of whether the law
48:43
can be sustained on one ground
48:45
alone. TikTok and the content creators
48:47
argued vociferously that Congress relied on
48:49
both of these rationales and if
48:51
one of them is illegitimate, then
48:53
the law has to fall and
48:55
that they're sort of intertwined with
48:57
each other. I don't know if
48:59
I agree with that. I'm curious
49:01
what Gautam thinks. I mean, to
49:03
me, that would be a strong
49:05
argument if this were like a
49:07
regulation published in the Federal Register
49:09
by the Department of Commerce, but
49:12
this is an act passed by
49:14
Congress, signed by the president, duly
49:16
passed into law, that reflects the
49:18
bipartisan concerns of the lawmakers, who
49:20
we have elected to represent us
49:22
and to protect our country. And
49:24
I'm not one for mindless deference
49:26
to the political branches, and I'm
49:28
not one for endless flag waving
49:30
in the face of serious free
49:32
speech concerns. But it does seem
49:34
to me that if these two
49:36
rationales can be disentangled plausibly, and
49:39
one of them stands up on
49:41
its own, then the court has
49:43
got to let it stand up
49:45
and not try to sort of
49:47
bootstrap the forbidden justification to kill
49:49
the whole law. I mean, Gautam,
49:51
disagree with me
49:53
if I'm wrong, but it seems
49:55
to me that at least some
49:57
deference here is owed to the
49:59
decisions of Congress and that we
50:01
shouldn't be encouraging five lawyers in
50:03
robes to overrule their decision that
50:05
easily. Yeah, I think that the
50:08
disentangling question is hard because, you
50:10
know, think back to the Obamacare
50:12
cases when there is the debate
50:14
about the commerce clause power versus
50:16
the taxing power. Those were very
50:18
discrete, understandable, doctrinally distinct justifications, and
50:20
ultimately the taxing power justification
50:22
won out and the Commerce Clause
50:24
one did not, wrongfully in my
50:26
view, but that's what happened. I
50:28
don't think this case is like
50:30
that. I think that separating this
50:32
is not like, you know, picking
50:34
the nuts out of the salad.
50:37
We're talking about some like sad
50:39
frittata that none of us really
50:41
want to be eating right now.
50:43
You know, you can't just get
50:45
the, you know, get the cheese
50:47
out and be like, oh, I
50:49
don't like eggs today. For me,
50:51
I think at a fundamental level,
50:53
I would really love the court
50:55
to be more thoughtful about not
50:57
overruling Congress or... an agency all
50:59
the time just because they don't
51:01
like it. This is not the
51:04
case where I want them to
51:06
start doing that because what's going
51:08
to happen is what we fear
51:10
about in the First Amendment, which
51:12
is where the court is, oh,
51:14
which speaker do we like today
51:16
and which do we not? Do
51:18
I think the court does that
51:20
all the time? Absolutely. I'm only
51:22
somewhat of a fool. I understand
51:24
how these things work, but I
51:26
don't think we should validate that
51:28
kind of ad hoc decision making
51:30
when it comes to sort of
51:33
feelings. And we know that's where
51:35
we are, but we don't have
51:37
to agree that's where we should
51:39
be. I do think that for
51:41
me, there are more or less
51:43
doctrinally justified ways that the court
51:45
could go. I've always thought that
51:47
they're not going to go the
51:49
way that I want them to,
51:51
and then there's the question about
51:53
how do we stanch the bleeding?
51:55
My sense is that at the end
51:57
of the argument, there was a
51:59
little bit more awareness of that
52:02
dynamic, that there might be some,
52:04
like I said, cleaner off ramps
52:06
that would be better for the
52:08
messy sort of doctrinal questions that
52:10
didn't really get the development over
52:12
the course of years that this
52:14
case would have taken in a
52:16
normal situation. So, you know, we'll
52:18
see if five people want to
52:20
be prudent in the coming days
52:22
and weeks. Because, as we also
52:24
have, I think, some understanding, the
52:26
timeline here is like really fast
52:29
but also unclear as to how
52:31
this is going to play out
52:33
and when we can expect any
52:35
kind of stay or decision or
52:37
further action from the court. Right
52:39
that's exactly where I think we
52:41
should end on this question of
52:43
the court's going to do what
52:45
the court's going to do and
52:47
then in a very compressed amount
52:49
of time, Donald Trump is going
52:51
to do what Donald Trump is
52:53
going to do. And this case
52:55
all sort of slams right into,
52:58
you mentioned this at the beginning,
53:00
Gautam, but you know, Trump
53:02
was for the ban and then
53:04
he was against the ban and
53:06
then he filed a brief saying
53:08
he wanted to fix all the
53:10
things with his like deal making
53:12
ability. So he's everywhere and nowhere
53:14
in this case in some ways.
53:16
How is this going to play
53:18
out in the next two weeks?
53:20
What is going to happen? Pragmatically,
53:22
almost regardless, and there was a
53:24
lot of conversation about, how is
53:27
this going to go? Could they
53:29
even divest? What happens if they
53:31
don't divest? Do we get the
53:33
pause? How does this play out
53:35
in the very, very uncertain transition
53:37
to an administration that really feels
53:39
differently about this, at least today?
53:41
My guess, and we're really
53:43
in the realm of guesswork, so
53:45
if and when I'm wrong, I'm
53:47
happy to admit it, is that
53:49
I expect that we will get
53:51
a decision on the stay request
53:54
from TikTok and the creators pretty
53:56
quickly to say that the stay
53:58
is not going to be granted.
54:00
The reason I think that that
54:02
is probably going to happen pretty
54:04
quickly is that I don't think
54:06
there are the votes for a
54:08
stay. There weren't before the argument
54:10
and there certainly aren't after. So
54:12
the close to zero possibility of
54:14
a sale, I think still is
54:16
close to zero, if not zero,
54:18
and the court maybe wants the
54:20
company to continue to explore this
54:23
divestiture avenue because that would prevent
54:25
the shutdown. A decision could happen
54:27
at any point with more reasoning,
54:29
depending on how much depth the
54:31
court wants to go into. And
54:33
I think, you know, we're still
54:35
unclear as to what they would
54:37
do on that front. I have
54:39
still sort of guessed that we
54:41
might get a short, relatively short,
54:43
unsigned per curiam opinion next week before
54:45
the ban would go into effect
54:47
that would give a better explanation
54:49
as to why a stay wasn't
54:52
going to happen. Trump's amicus, I
54:54
think the technical term was messy
54:56
and confusing. And I don't think
54:58
there was a lot of appetite
55:00
for that at oral argument. I
55:02
don't think that the future Trump
55:04
administration's claims that they wouldn't enforce
55:06
it, or might not enforce it, mean very
55:08
much because there are other companies,
55:10
Oracle, Apple, Google, that are implicated
55:12
by this law. And I don't
55:14
think that those companies are going
55:16
to... rely upon the claims of
55:19
a justice department that they're not
55:21
enforcing it given what it would
55:23
mean to be out of compliance
55:25
with the federal law that rests
55:27
on national security grounds. So I
55:29
think with all those sort of
55:31
politics pretty obvious even to the
55:33
justices I would expect something that
55:35
makes the future options for the
55:37
company clear to the company so
55:39
that they can figure out what
55:41
if anything they want to do.
55:43
Same question to you, Mark, just
55:45
to play us home here. Am
55:48
I going to be... able to
55:50
open my TikTok app? Gautam
55:52
talked briefly about, like, nuts and
55:54
salad and there was some cheese
55:56
there and now all I can
55:58
think about is TikTok pasta with
56:00
feta. Am I going to be able to
56:02
get my recipe next week, Mark?
56:04
I think that you will not.
56:06
I think that what will happen
56:08
is, as Gautam said, the court
56:10
will issue a short order or
56:12
a per curiam opinion that allows the
56:14
ban to take effect on January
56:17
19th. And you know, the way
56:19
the law actually operates is that
56:21
it prohibits app stores from continuing
56:23
to carry TikTok. There's an open
56:25
question of whether you'll be able
56:27
to use it for a limited
56:29
period of time even after the
56:31
app stores shut it down, but
56:33
I think it will die fairly
56:35
quickly if the court allows it
56:37
to. Then the big question, again,
56:39
as Gautam said, is, well, what
56:41
if Trump comes in and says,
56:44
surprise, TikTok is back, I'm ignoring
56:46
this law. There were some gestures
56:48
toward that possibility at oral arguments.
56:50
I think the answer, again, is
56:52
nobody knows, but that the app
56:54
stores and big tech companies are
56:56
going to be very hesitant to
56:58
potentially subject themselves to penalties if
57:00
Trump tries to suspend the law
57:02
and the courts later shoot him
57:04
down. So I would say everybody,
57:06
I know that Instagram Reels is
57:08
an inferior product. I know that
57:10
you will never get as good
57:13
a recipe for feta pasta, especially
57:15
one-pot feta pasta on Reels
57:17
as you did on TikTok, but
57:19
start making the transition if you
57:21
haven't already because otherwise if you
57:23
haven't switched to Instagram by January
57:25
19th, then you might not have
57:27
a video app at your disposal
57:29
at all and you might actually
57:31
have to read a book. And
57:33
that is the worst fate of
57:35
all. I would just say that
57:37
that situation is probably what's going
57:39
to happen, but that remains a
57:42
First Amendment problem for the users
57:44
who I think really can't be
57:46
overlooked in this context. We have
57:48
over a hundred million people who
57:50
use it on a monthly basis.
57:52
And I agree that sure, there
57:54
are other venues for experiencing some
57:56
other front-facing camera content. But just
57:58
as I think Mark and Dahlia,
58:00
you would not say that I'm
58:02
going to get the same experience
58:04
listening to Strict Scrutiny or 5-4.
58:06
That also, I think, applies to
58:09
TikTok versus Instagram and YouTube, too.
58:11
Different speakers and different platforms have
58:13
their own First Amendment environments, and
58:15
a switch does have First Amendment
58:17
costs. That I think is what
58:19
we're going to have to keep
58:21
in mind as we see how
58:23
this fast-moving situation plays out over
58:25
the coming weeks. Man, we were
58:27
just hoist on the petard of
58:29
a rival Supreme Court podcast, Mark. I'm
58:31
just going to sit in the
58:33
nut-based reality of that. Gautam Hans
58:35
is a clinical professor of law
58:38
and founding director of the Civil
58:40
Rights and Civil Liberties Clinic at
58:42
Cornell Law School. He analyzes, through
58:44
research and his advocacy, how new
58:46
and developing technologies implicate constitutional law,
58:48
privacy and data protection, and public
58:50
policy. And I really, really want
58:52
to thank you for being here
58:54
with us. It is clear we're
58:56
all just formulating our thoughts on
58:58
what just happened. And Mark Joseph
59:00
Stern is, of course, my co-pilot
59:02
on Amicus. Mark and Gautam, thank
59:04
you both very, very much. This
59:07
has been quite a marathon as
59:09
oral arguments go, and there is
59:11
nobody I would have preferred to
59:13
talk about it with. Thanks. Thanks,
59:15
Dahlia. Thanks to you both, and
59:17
see you on the internet. And
59:21
that is all for this
59:23
episode of Amicus. Thank you
59:25
so much for listening in.
59:27
Thank you so much for
59:29
your letters and your questions.
59:31
You can keep in touch
59:33
with us at Amicus at
59:35
slate.com or you can find
59:37
us at facebook.com/Amicus podcast. As
59:39
we said on our plus
59:41
episode that was taped on
59:43
Thursday night, our hearts and
59:45
our thoughts are with our
59:47
friends in LA in Southern
59:49
California. We hope you're okay
59:51
and that your family. are
59:53
okay, and that your families are okay. This week's Amicus Plus
59:55
bonus episode is already in
59:57
your feeds. I am talking
59:59
to the great Andrew Weissmann
1:00:02
about the Jack Smith
1:00:04
reports and about the Supreme
1:00:06
Court's relationship with Donald J.
1:00:08
Trump. You can always subscribe
1:00:10
to Slate Plus directly from
1:00:12
the Amicus show page on
1:00:14
Apple podcasts and Spotify, or
1:00:16
you can visit slate.com/Amicus Plus
1:00:18
to get access wherever you
1:00:20
listen. Sarah Burningham is Amicus's
1:00:22
senior producer. Our producer is
1:00:24
Patrick Fort. Alicia Montgomery is
1:00:26
vice president of audio at
1:00:28
Slate. Susan Matthews is Slate's
1:00:30
executive editor and Ben Richmond
1:00:32
is our senior director of
1:00:34
operations. We will be back
1:00:36
with another episode of Amicus
1:00:38
next week. Until then, hang
1:00:40
on in there. My podcast,
1:00:42
The Queen, tells the story
1:00:44
of Linda Taylor. She was
1:00:46
a con artist, a kidnapper,
1:00:48
and maybe even a murderer.
1:00:50
She was also given the
1:00:52
title, The Welfare Queen, and
1:00:54
her story was used by
1:00:56
Ronald Reagan to justify slashing
1:00:58
aid to the poor. Now,
1:01:00
it's time to hear her
1:01:02
real story. Over the course
1:01:04
of four episodes, you'll find
1:01:06
out what was done to
1:01:08
Linda Taylor, what she did
1:01:10
to others, and what was
1:01:12
done in her name. The
1:01:14
great lesson of this for
1:01:16
me is that people will
1:01:18
come to their own conclusion
1:01:20
based on what their prejudices
1:01:23
are. Subscribe to the Queen
1:01:25
on Apple Podcasts or wherever
1:01:27
you're listening right now.