E Pluribus Chaos

Released Friday, 7th March 2025

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

So Kat, I will admit that I've

0:02

always thought it would be cool to do

0:04

journaling, daily journaling, and keep

0:06

up with all the stuff that I'm doing, and I've never

0:09

actually done it. However,

0:11

I know that Apple has a journaling

0:14

app, and if you log in and

0:16

use that journaling app, it gives you various

0:18

prompts to try and inspire you,

0:20

unlike me to actually do some

0:23

daily journaling. So I'm going to ask you one

0:25

of those for the start of today's podcast,

0:27

and that is what's something that made

0:29

you smile today?

0:31

So I know this prompt because

0:33

I know every journaling app on the planet

0:36

because I have unsuccessfully

0:39

attempted to be a journaler for,

0:42

lo, so many years, and it still doesn't

0:44

work no matter, no matter what. And

0:47

so, you know what I'm gonna say? What's making

0:49

me smile today right now is

0:51

the memories of how many

0:53

times I have tried

0:56

and then failed and tried, like with such heartfelt

0:58

intention. Really. I mean, just

1:00

such deep intention to be

1:03

one of those people that writes and reflects

1:05

on a daily basis and really, and catalogs

1:08

how my life is going. and

1:10

I, I never make it past maybe

1:12

two weeks, but

1:15

I have, I have one or two weeks of journaling

1:17

every few years going back

1:19

to like the nine, like early nineties.

1:22

Well, well, well, well, maybe this will push you over

1:24

the edge and convince you to finally start

1:26

journaling.

1:27

It turns out there's an app for that. Right. What

1:31

about you? What made you smile today?

1:33

Oh, I think having you on the podcast as our

1:35

guest host today has made me smile, so I

1:37

think we'll have lots of, lots

1:39

of fun discussions.

1:41

I was once described to someone who didn't know

1:43

me but had to meet me. A mutual friend said

1:45

she's brunette and very smiley, and

1:50

that was like all they got and they found me right

1:52

away.

1:53

There we go. There we go. Hello

2:03

and welcome to Control Alt Speech,

2:05

your weekly roundup of the

2:07

major stories about online speech, content

2:09

moderation, and internet regulation.

2:12

Today is March

2:14

6th, 2025, and on this week's

2:16

episode, which is brought to you

2:18

with financial support from the Future of Online

2:20

Trust and Safety Fund. We will be

2:22

trying to take a broader look at, well,

2:25

I think everything that's going on in

2:27

the world and what that means

2:29

for the online speech space. as

2:31

you can tell, Ben is away

2:33

this week, and so we have

2:35

our wonderful guest host, Kat

2:38

Duffy, who you have just been introduced to

2:40

as the smiling brunette. She

2:44

is also the senior fellow

2:46

for digital and cyberspace policy

2:49

at the Council on Foreign Relations

2:51

and the CEO of Right

2:53

Stuff Strategies. And so,

2:55

welcome to the podcast. We are very excited

2:58

to have you.

2:59

Thanks, Mike. I'm super excited to be here. Any,

3:01

any afternoon where I get to chat with you is a good one.

3:05

Excellent. So with that sort of happy,

3:07

fun, introduction and, getting

3:09

started, let's, let's dive into

3:11

the darkness.

3:12

Let's move directly into existential doom, shall we?

3:15

Yeah. So we were sort

3:17

of talking about what, what things we wanted to talk

3:20

about this week. and we have a bunch of

3:22

stories, but I think there was some framing

3:24

that we wanted to put on this. I,

3:27

I, I think was the way that we were talking about, because

3:29

both of us in the last week or so,

3:32

have written pieces that sort of try and explore

3:34

the moment we're in and sort of the, broader

3:36

impact of it is that a fair assessment?

3:39

I, no, I think so. I think those of

3:41

us who come from this space, there's a lot of,

3:44

dot connecting that we feel

3:46

is imperative to do right

3:49

now, because we're seeing so many,

3:51

folks who don't come from

3:53

the space, maybe not, putting things

3:55

together in quite the way that we've

3:57

seen it play out. And this is where,

4:00

you had sent some, you know, some articles

4:02

and things like that. The, the things that are happening this week,

4:04

and I had also been looking at, some things,

4:06

but yesterday on, you know, on LinkedIn,

4:09

it's your post on

4:12

why Tech Dirt basically has

4:14

to be a political blog right now, even if

4:16

it doesn't want to be, that I put

4:18

out on LinkedIn is like, this is my top read of

4:21

the day and I'm willing to bet it will be my top read

4:24

of the week. and so I'm gonna take

4:26

guest host privilege here to

4:28

embarrass you because,

4:31

uh, I actually think it

4:33

was one of the clearest,

4:35

most important and most succinct articulations

4:38

of this moment and what those

4:40

of us who come from tech and law and policy

4:43

and decades of it have

4:45

learned, that we need to be applying

4:47

Yeah.

4:48

right now to help explain the severity

4:50

of the sort of crisis that we're in and the

4:52

stakes of it. So I think

4:55

that we should start by you

4:57

talking a little bit about what you wrote,

4:59

because it is truly the best articulation I've seen

5:01

to date. Far better I think,

5:04

than mine, which came from a different lens, but.

5:06

we will get to yours as well, but thank you. That's,

5:08

that's very kind. and so I'll, I'll give

5:10

you just a little bit of the background. I

5:12

think it'll be interesting to sort of talk through kind

5:14

of my thinking on this, which is,

5:17

you know, and, Ben and I have certainly talked

5:19

about on the podcast, stuff that is

5:21

happening, stuff that has been happening ever since the election,

5:23

and obviously ever since the inauguration. And

5:26

how that is impacting

5:28

other stuff, and so I've been writing

5:30

a lot about what is happening

5:32

with the US government these days? Uh,

5:35

and some of it is Donald Trump

5:37

related, but obviously a lot more of it I think is Elon

5:39

Musk related and sort of the position that he's

5:41

in, which is he is effectively running the government,

5:44

which is problematic. And, and I've had

5:47

a few people. A lot of people have reached

5:49

out and been very supportive of what I've

5:51

been writing, but a few people have said

5:54

like, well, what, what is Tech

5:56

Dirt now? And we've always been a

5:58

very broad publication. Going

6:00

back decades, people have yelled at me, stick to tech

6:03

when I was writing about policy

6:05

issues. and you

6:07

know, the argument has always been that I

6:09

started Tech Dirt many, many years

6:11

ago, with the belief, a very

6:13

optimistic belief in technology

6:17

and its ability to do good in the world. And

6:19

I still have that underlying belief

6:21

in me. But the thing that

6:23

I realized very early on, and

6:25

this does go back to really sort of some of the earliest

6:28

parts of Tech Dirt, was that to

6:30

make that reality, to make the technology

6:32

and the innovation be in a world

6:34

where it can do good and where it can

6:36

create a better world for everyone.

6:39

you need other things. You

6:42

need structures and institutions

6:44

and frameworks in which they fit, so that you,

6:46

you don't have chaos, that you have stability

6:48

and you have understanding and especially

6:51

with the internet the sort of global

6:53

nature of the internet, you have

6:55

to have that sort of global perspective as well,

6:58

and that you want sort of a global stability

7:01

to enable all of the other stuff. I

7:03

would love to just be

7:05

writing about technology. You

7:07

know, now we've never been sort of like

7:09

a gadget blog or like a, you know,

7:12

review this internet service kind

7:14

of blog that was never, that was never a part

7:16

of Tech Dirt. I would love if we could get there,

7:19

I would love if the world were

7:21

so boring and there was nothing

7:23

else going on that I could just say like,

7:26

oh, you know, here's this new AI tool.

7:28

Right. I would love to do that and just, look

7:30

at those things, but we can't

7:33

do that. If we don't have all

7:35

this other stuff in place. And

7:37

so I was sort of thinking about that

7:39

and thinking about people saying, what is Tech Dirt

7:41

covering now if you're covering all this stuff that is sort

7:43

of US government based and, I just

7:46

put it into this article that basically said, this is

7:48

the story. We can't have those discussions.

7:50

We can't have any of those other things. The

7:53

entirety of the US

7:55

government is being dismantled, all

7:57

the other stuff falls. And it's not, it's, it's

7:59

not saying like, well, it is

8:01

partly saying that this is a priority, but

8:03

it's not saying like, we're ignoring these other things.

8:06

We're saying that without

8:08

a stable US government, without

8:10

stable institutions, without a stable

8:12

understanding of a global

8:15

world working together on

8:17

important projects for humanity

8:19

and human rights or whatever, the other

8:21

stuff doesn't, it doesn't come into being,

8:23

Mm-hmm.

8:24

there are people currently

8:26

who supported this regime and

8:28

still do. And you know, some

8:30

of the, the Doge folks who seem to think

8:33

that the institutions and

8:35

the global infrastructure and all of that

8:37

don't matter and that because

8:40

they're uniquely brilliant,

8:43

this is very much in their own heads, you

8:47

have to wipe away the institutions because

8:49

the institutions are holding them back and

8:52

then they, through their unique, lone

8:55

inventor, lone genius

8:58

ability build the better world.

9:00

Mm-hmm.

9:01

And that's not how the world works. And

9:03

so I thought that was just sort of important

9:06

to call out. Then with

9:08

it was the idea that, the folks

9:10

who have been doing the best coverage

9:12

of this, of the moment that we're

9:14

in by far, have been the people

9:17

who have been in the sort of tech, tech policy

9:19

legal worlds.

9:21

Mm-hmm.

9:22

Because we've been watching some of this play

9:24

out, most specifically with

9:26

Twitter, right. and what Elon Musk

9:28

did to Twitter over the last three years. We

9:31

saw this and that's why like the reflection

9:33

and a lot of people pointed this out, I'm certainly not the only

9:35

one to have seen this. Like he is using

9:37

the Twitter playbook, play by play.

9:40

Exactly. Except that when

9:42

he did with Twitter, he owned Twitter.

9:44

Mm-hmm.

9:45

And it was frustrating because I liked Twitter

9:47

and I found it useful back in the day.

9:49

I no longer do. But you know, whatever

9:52

it goes away. Other stuff

9:54

comes up, we can do that. The

9:57

US government is different. One, he doesn't own it.

10:00

Two, he has no, no, authority

10:02

to do this no matter what people say.

10:04

and then three, it's, it's the US government,

10:07

it's, it's not, it's not a social media network.

10:10

and,

10:11

it's absolutely, and it's not. And it's not just

10:13

what's happening, it's not simply the domestic

10:15

ramifications of it, right? It's that the US

10:18

being a functional, trustworthy

10:21

country where rule of law is respected

10:23

and democracy is intact, and checks

10:25

and balances exist, that

10:27

is a linchpin to an international

10:30

order in which business can

10:32

thrive and in which innovation can

10:34

occur. And you know, I

10:37

for many years, I think, in looking at

10:39

the sort of, in particular the kind of rise

10:41

of the tech billionaires. I

10:43

have seen in this sort of increasingly

10:46

libertarian anti,

10:48

it's not even just anti governance, but anti-government

10:51

focus, right? Is that,

10:54

if you're building a company, if you're,

10:56

you know, an investor, if you're private equity, if you're a corporation,

10:59

your incentives are investor risk. They're

11:01

essentially managing for investor risk. Government's

11:04

job is to manage for societal risk,

11:06

Right.

11:07

and managing for investor risk and managing

11:09

for societal risk do not necessarily align

11:12

Right.

11:13

nicely. And so

11:16

the people who get hurt are

11:18

the individual, that every day,

11:20

you and me folks, right? Everyday

11:23

Americans, because you need

11:26

government in there, making sure that American

11:28

innovation and American business

11:30

is going to serve both

11:32

American interests, but ideally also

11:34

democratic norms. I think about

11:36

it sometimes as sort

11:38

of an aquifer, that the way

11:41

that we think about democratic principles,

11:43

the way that we think about rule of law, a relative

11:45

lack of a kleptocracy, right?

11:47

There's, you know, some grifts that, you

11:49

know, we can talk about campaign finance all day, but,

11:52

you know, overwhelmingly

11:53

not perfect.

11:55

the system is not perfect, but overwhelmingly,

11:58

other countries can trust, for example, that

12:00

American courts will

12:02

operate the way that American courts are supposed to

12:04

operate and other countries can trust

12:06

that America does have checks and

12:09

balances and will have swings and vagaries

12:11

in its political system, but will not

12:13

overnight trash decades,

12:15

if not centuries of political alliances

12:18

and trust. Right.

12:20

They could believe that.

12:21

yeah, and, and suddenly, you know, become

12:23

a hostile actor towards, you know,

12:25

our longstanding partners, allies.

12:29

And so when you think about this

12:31

sort of, I think of the aquifer

12:33

of democratic norms of rule

12:35

of law, of constitutionality, like

12:37

the political and economic stability

12:39

that is the bedrock of America's

12:42

success in the world. America's geopolitical

12:44

primacy, America's ability to

12:47

bring in talent, America's ability to

12:49

support capital flows, to support innovation.

12:52

All of that is based on

12:54

this aquifer underpinning

12:56

it of our core democratic

12:59

principles. And the more that

13:01

we pump those out.

13:03

Yeah.

13:04

Right. We just, I just feel like they're getting siphoned

13:06

out and just thrown away. The

13:08

more that I, I worry that that bedrock,

13:11

that political and economically stable foundation

13:13

that is getting shakier and shakier and shakier,

13:16

and we're starting to see sinkholes. And

13:18

what I really struggle to understand

13:20

in this moment is the degree to which American

13:23

business in particular, and not

13:25

like, the tech, ad,

13:27

data harvesting economy, like

13:30

not those global digital platforms, but

13:32

business writ large. I don't understand

13:35

why businesses writ large are

13:37

not standing up much

13:39

more forcefully right

13:41

now and saying, we

13:43

need that bedrock in

13:46

order to continue to survive and

13:48

thrive and this

13:51

degree of instability is not only

13:53

threatening it, but is, riddled with

13:55

unforced errors that aren't a result

13:57

of strategy, but I would say of

13:59

optics and, vengeance

14:01

and, um, yeah,

14:03

it's, you know, it's a, if this

14:06

is masculine energy, like no thank

14:08

you and I'm not,

14:10

I'm not here for it. Like find me

14:12

some other energy. Uh, right.

14:16

And so. you know, one of the things

14:18

that I had talked about in the article

14:20

that I wrote is how this attack on foreign assistance

14:23

and diplomacy and shrinking

14:25

America's scale in that regard

14:27

is gonna fundamentally hurt our ability

14:29

to deploy AI and be a first mover

14:32

in terms of building AI markets around

14:34

the world. Now, I come from foreign assistance,

14:36

and I come from diplomacy. I come from human

14:38

rights, I come from humanitarian rights. If

14:41

you had ever told me that I would be defending the

14:43

virtues of, right,

14:45

of the importance of foreign assistance and

14:47

diplomacy because of AI market

14:50

deployment, I would've said, well, there's a lot of

14:52

other reasons that are more important to

14:54

defend it. But not

14:56

withstanding, we have

14:58

just violated agreements

15:02

and undercut trust in

15:04

200 countries around

15:07

the world, an enormous range

15:09

of markets. We're pulling out of all sorts

15:12

of different elements and components of the international

15:14

order. We've pulled out of the World Health

15:16

Organization. Right. and

15:18

so how exactly do you, for

15:20

example, if you're American Pharma, right?

15:22

And you're really trying to work

15:25

on like getting your new vaccine out

15:27

to the world, for example, and America's not

15:29

even a partner in the World Health Organization. What

15:32

does that mean for you and

15:34

what sort of opening does that create for China, which

15:36

is invested dramatically in achieving at

15:39

equal or greater scale

15:41

in terms of its own development model

15:43

and its own diplomatic model?

15:45

China has worked very hard for the last 15

15:48

years to overtake America on

15:50

that front. And when China

15:52

does go into those markets, it

15:54

doesn't go in only to sell

15:56

a product. It also goes into

15:58

broker influence. And increasingly

16:01

with the us, absent China

16:03

will be able to exert influence that says,

16:05

and you can't buy American products and you can't

16:08

deal with American companies. And,

16:10

and, and, and so we're

16:12

ceding the ground for American

16:14

innovation that has been laid over

16:16

decades of work in

16:18

such a needless fashion. And I think

16:20

for someone who comes,

16:23

for folks like Elon or even the large global

16:25

digital platforms, because they

16:27

built out so many of their operations,

16:30

globally, digitally, without boots on the ground, without

16:32

having to have local licenses, local markets

16:34

that telcos on the other hand really understand

16:37

the importance of those local relationships,

16:39

right? As do you know, supply chain businesses.

16:41

But I think the global digital platforms

16:44

underestimate how important

16:46

that underpinning infrastructure is, and

16:48

that trust in American business is,

16:51

that our reach, our scale in

16:53

terms of diplomacy in particular,

16:56

gave the United States. And so this

16:58

to me is something where I, I think this is an incalculable

17:01

loss for American

17:03

entrepreneurs, for investors,

17:05

for multinationals, for

17:07

those who are striving to be multinationals,

17:10

their work just got much, much harder.

17:12

Yeah. and, we'll put it in the show notes, but

17:14

your piece was in Foreign Policy magazine,

17:16

and it's, it's really good. It's really looking at, in

17:18

particular the impact on the AI

17:21

market, of sort of killing off USAID.

17:23

You know, one of the things,

17:25

this is also frustrating, but,

17:29

but, I mean, you talked about sort of the

17:31

difference between like investor risk and

17:33

societal risk, but I, I think

17:35

the point that both of us are making that is so important

17:37

is that like, if you destroy

17:40

societal risk, that's not good for investors,

17:42

right? and. it's especially not

17:44

good for global, any kind

17:46

of global business, any business that wants to be global.

17:49

And, you know, one of the amazing

17:51

things about the internet, and

17:53

I joke about this, like the first year

17:55

of Tech Dirt in 1997

17:58

through 1998, again,

18:01

because I'm old, was basically

18:04

just stories about like, huh,

18:06

this internet thing raises

18:08

all sorts of jurisdictional questions

18:10

Mm-hmm.

18:11

because it's this, it's a global thing.

18:13

And unlike, unlike almost

18:15

every other kind of business where like expanding

18:18

globally takes boots on the ground,

18:20

as you said, or, or some sort of effort, to

18:22

go globally with the internet, you were able

18:24

to go global immediately. And

18:27

historically the US

18:29

in particular had been

18:31

a very strong defender of a global

18:33

open internet. The State Department,

18:36

for years has been, you know, a big,

18:39

did amazing things that got no credit

18:41

for

18:42

Yep. And across every administration,

18:44

this is a completely bipartisan, including

18:46

the first Trump administration. This has been a completely

18:49

bipartisan, longstanding

18:52

area that the United States has championed a free,

18:55

secure, open, interoperable internet.

18:58

We are the OGs of that concept.

19:01

and certainly like other countries have challenged

19:03

that and China being a big one with, with

19:05

its sort of great firewall and, other countries

19:08

over the last few years. I mean, I think the fracturing

19:10

of the global open internet has been

19:13

a concern, like a major concern.

19:15

But at least historically

19:18

you could sort of rely on the US government to

19:20

at least fight for it. and to, to raise

19:22

the issue of why this was so important. and

19:25

yes, like some of it was, because it helped

19:27

US businesses, right? I mean, so many of

19:29

these businesses that we're talking about are these

19:31

giant US companies, and

19:33

people have concerns about how big they are and how powerful

19:35

they are, and that's, that's reasonable. But,

19:37

what is shocking to me to bring

19:39

this back around is like, how come

19:41

those businesses are, are on board with this?

19:44

do they not realize, I mean, Elon

19:46

Musk I don't think understands anything at this

19:48

point, but does Mark Zuckerberg

19:51

and Jeff Bezos not

19:53

realize that doing this

19:55

undermines the business? They rely

19:58

on a global internet. They are global

20:00

internet businesses. Do they

20:02

not realize how completely

20:04

dismantling this substrate,

20:07

bedrock, whatever you wanna call it, that

20:09

it undermines their ability to have

20:12

global internet businesses.

20:13

it feels to me that there has been

20:16

an increasing and rather

20:18

astonishing degree

20:20

of hubris, you

20:24

know, to some degree, if you have a

20:27

company the size of Meta or the size

20:29

of Amazon, or, and I don't wanna put

20:31

Google and Microsoft in these same categories because those are

20:33

much more mature companies. I,

20:35

I would argue they, they have

20:37

come with these issues with greater seriousness

20:39

and maturity. But you know, when

20:42

you look at the Bezoses and you look

20:44

at the Zuck and you look at the Elons

20:46

and you're looking at what Bezos is doing with the

20:48

Washington Post right now, my gut feeling

20:51

is that they think that they have enough money and enough

20:53

power and enough scale

20:55

Right,

20:55

that they essentially are techno states and

20:57

they can go their own way. I mean, meta was, you know,

20:59

Facebook was trying to lay its own cable,

21:01

right? All around Africa, so

21:03

that it would just have its own pipelines to go

21:06

over. So they have sort of converted

21:08

themselves in their brains into their own

21:10

techno states, and they're the CEOs of, you

21:12

know, they're the presidents of their own techno states and

21:14

it is their job to essentially

21:17

pursue their company's interests

21:19

and that operates in a vacuum. What

21:21

is really interesting to me is

21:24

the VCs and the major investors

21:27

who have invested so much in building

21:29

out new companies, right? Trying

21:31

to find new unicorns. Those,

21:34

you know, quote unquote little tech. I mean, little

21:37

tech, billions of dollars, right? But

21:39

all of those companies

21:43

benefit from taxpayer

21:45

dollars going to

21:47

a US global presence that

21:49

promotes US industry

21:52

promotes US business and helps

21:54

create glide paths for its

21:56

deployment across markets. And

21:58

so to me, I understand sort of why the

22:00

incumbents are, bending

22:02

the knee to the degree that they are. What I don't

22:04

understand is the

22:06

many, many, many other actors where I

22:08

don't really actually see where their financial incentives,

22:12

or their power,

22:13

Right.

22:14

is aligned with the current

22:16

state of affairs. I

22:18

just, I don't, I don't get it.

22:20

Yeah, I mean the framing that you hear from them.

22:23

and I unfortunately have spent a

22:25

little too much time paying attention to what they're saying.

22:27

and I regret it every time. But I feel like I need

22:29

to, I, I need

22:31

to understand what they're saying. You know, their

22:34

framing is... this is maybe

22:36

gonna sound like a tangent, but I, I sometimes

22:38

wonder if any of these people have ever played chess.

22:40

Right. So, like, I'm not a good

22:42

chess player. my kid is a very good

22:44

chess player, and I sort of stopped once

22:47

he was able to, to beat me consistently. Um,

22:50

and embarrassingly

22:51

Way to, way to way to be a role model Mike way

22:53

yeah.

22:55

way to model as parents, right?

22:57

Just, oh, you got better at me. I'll quit.

22:59

Oh man, there's, there's the internet. You

23:01

can, you can play on chess.com and

23:04

you'll find people who, who are your level.

23:07

'cause I I continued

23:09

to play for a while, but it was, it was when

23:11

it got embarrassing where, where every

23:13

game was basically him

23:16

saying, how did you not see that as

23:18

he destroys me in like five

23:20

moves.

23:21

Yeah.

23:21

Anyways, so I'm not good at chess, and so I'm

23:23

not claiming to be good at chess. But there, you know, the

23:25

most basic concept of how you play chess

23:28

is that you look multiple moves

23:30

ahead, right? and there was a great

23:32

book, which I gave to my son

23:34

when he first started playing chess, which

23:36

is why he's now much better than me. That, the whole

23:39

point of it is, teaching you to look ahead.

23:41

And everybody always says like, oh, like the great chess players

23:43

can look like seven moves ahead, which apparently is garbage.

23:46

Like nobody can actually do that. But this book

23:48

really focused on like learn just

23:50

to look 1.5 moves ahead.

23:52

That was like the whole framing of the book. you're

23:54

not gonna be able to look multiple moves ahead,

23:57

but before you make a move, figure

23:59

out what the response is going to be and

24:02

like, that's it. that's the starting point. This is

24:04

a basic, beginner-level chess

24:06

book. And the thing that I'm seeing

24:08

with all of these is that none of

24:10

them seem to be able to think what is

24:12

the blowback? what is the response to this?

24:14

and think in a, in a broader way. And

24:16

so the stories that they tell are

24:19

very much, that. regulations

24:22

are holding us back, or antitrust

24:24

law is holding us back, which is garbage. Like none

24:27

of the, VCs are being held back by antitrust

24:29

law. Uh, you know, maybe at

24:32

the very, very extremes, they could argue

24:34

because of, greater antitrust enforcement,

24:37

the buyers of their, not super

24:39

successful companies are limited.

24:41

You know, Meta can't buy

24:43

the company. That didn't become the next meta.

24:46

Google can't buy whatever. but you

24:48

know, that is a minor thing. In

24:50

the big scheme of things, these VCs are always

24:52

going for, they want their companies to become

24:54

the next meta or the next Google or whatever,

24:57

but they seem to think that, like, wiping

24:59

out all regulations will help make

25:01

that possible. And I understand the sort of

25:04

libertarian mindset, you know, sort

25:06

of anti-regulation stuff and like, my

25:08

role on this podcast when Ben is here

25:11

is normally to be the guy pushing back on

25:13

bad regulations. I will

25:15

speak out about bad regulations, but

25:18

that doesn't mean that like you

25:20

want no regulations at all. Like

25:22

there are important regulations and there are

25:24

important infrastructure. Like I view it

25:26

as infrastructure that give you the rules

25:29

of the road that explain to you how these things

25:31

work. And they seem to have taken

25:33

this idea that was like, especially

25:35

in like the cryptocurrency space

25:38

and in the AI space. and

25:40

like there were some politicians who I think went

25:42

overboard on both of those and said like,

25:44

well, we have to, like, prevent

25:47

anything bad from happening. and I think

25:49

a lot of that was an overreaction

25:51

to, I think they felt that they didn't

25:53

do enough with social media. Social

25:56

media went bad and became

25:58

evil or however they wanna put it. And

26:00

so now we have to be much more proactive.

26:03

the new round of technology

26:05

needs to have like strong regulatory

26:08

safeguards in place from the very beginning. and

26:10

I, I understand that, that, you know, I

26:12

don't think that makes sense, because I think it

26:14

helps to understand, you learn how the technologies

26:16

work and you learn where the problems are if

26:18

you allow them to, to be created, but like recognizing

26:21

that you should go about it thoughtfully is

26:24

important. And a bunch of these

26:26

VC guys were just like, no. Anyone

26:28

trying to tell us any sort of regulatory approach

26:31

or even thinking about will this harm

26:33

people? You know, will this

26:35

cause other problems? Like we shouldn't even

26:37

have to think about any of that and

26:39

therefore, we're just gonna take a stance

26:41

that allows us to do anything. And

26:43

at best, they think that with a

26:46

sort of Trump and, sort

26:48

of running the White House, that

26:50

they can just push any, any sort of regulatory

26:53

oversight that might come their way out

26:55

of the way and, you know, just build

26:57

Right. But again, that just ignores all

27:00

of the important other stuff that we've been talking about.

27:02

Well, it's also, it's always interesting to

27:04

me the degree to which people talk about regulation

27:07

as a proxy for constraint, as

27:09

if all that governance can do

27:11

is to limit or to constrain as opposed

27:13

to empower. You know, when you look at the CHIPS

27:15

Act for example. you know, I think

27:17

most people would argue that is, that is regulation

27:20

or legislation, I should say, that is

27:23

incredible for a lot of American business

27:25

and would really support American innovation

27:27

and would help the supply chain and will fuel,

27:30

you know, billions of dollars more money into research.

27:32

you know, again, then that's part of the ecosystem

27:34

that we've talked about, which then, builds both

27:36

talent and expertise that can get piped

27:39

into the private sector and fuel the innovation

27:41

economy. And so, this

27:43

idea that somehow

27:46

governing is antithetical

27:48

to progress. I would

27:50

argue that all creative processes

27:53

are served by some boundaries.

27:58

You waste less time. There's more

28:00

Yeah.

28:00

efficacy. Right? Like taking, taking

28:02

away the societal impacts. Taking away

28:05

the risks. I do think it's,

28:07

fair to note that America is a

28:09

very weird duck because

28:11

of federalism. So you

28:13

like, the fact that we have states' rights and federal

28:15

rights, that is a very distinct

28:18

aspect of the US that

28:20

I think we tend to gloss over

28:23

a little more than we should. Taking

28:25

that away, let's just imagine

28:28

how much easier it would be to build

28:30

a business if the United States

28:32

had had strong regulation

28:35

in place for decades around cybersecurity

28:38

and cybersecurity controls? Think

28:41

about, I mean, I think we hit 8.3 trillion

28:43

in ransomware last year.

28:45

Think about the amount of money that

28:48

our companies are spending patching,

28:51

looking for VMs, trying to do cybersecurity

28:54

defense, you know, trying to bury

28:56

ransomware attacks, trying to

28:59

prevent phishing, like there's so many,

29:01

you know, updating routers, not

29:03

knowing, which hardware or or

29:05

software is gonna be safe enough,

29:08

not truly being able to balance their risks.

29:10

Boards still don't fully understand how

29:12

to take on cybersecurity

29:15

risks, right? When they're thinking about broader

29:17

risk assessment. We would have

29:19

so much less wasted money and time

29:21

and so much less accumulated risk

29:24

had we had stronger cybersecurity protections

29:26

over the years that could serve as a baseline.

29:29

And so this is where, you know, you see this in aviation,

29:32

you see it in car safety. I mean, we

29:34

have a global financial system because

29:36

banks are probably the most regulated industry

29:39

on the planet. And banks

29:41

came together with governments, everyone

29:43

agreed like, you know, we're not super into money

29:45

laundering. we don't love it. It doesn't serve our

29:47

collective interests. And so now we

29:49

have a pretty effective and pretty

29:52

robust system that

29:54

operates across global financial markets

29:56

to counter money laundering. Does it prevent

29:58

all money laundering? No, of course not. Right.

30:01

But does it significantly

30:03

increase the amount of effort and time

30:05

and expertise required to

30:08

do it? It does. And does it significantly

30:10

increase the accountability, right,

30:12

or the punishment if you have engaged in it? It

30:14

does. It also improves your ability to have,

30:17

extradition agreements,

30:19

law enforcement cooperation,

30:21

data sharing, right? All

30:23

of these other things that are consistent discussions

30:26

within the tech space, like

30:28

banks have been navigating that forever

30:31

and they work pretty much all over the

30:33

world

30:33

Yeah,

30:34

like.

30:34

though I, I mean, I'll revert to my,

30:36

my usual role and push back a little bit on

30:39

this, which is that also, though, like, banks

30:41

are not particularly innovative these days. Right?

30:44

No, no, no. Absolutely not.

30:46

And so there is like, this is part of the trade-offs

30:48

that we're thinking about in these things,

30:51

but you, you know, there is this middle ground,

30:53

right? And,

30:54

Well, this exactly I, this is like with crypto,

30:56

I think this is a very, I take

30:59

and respect the point from a lot of

31:01

people who are pro-crypto,

31:03

Yeah.

31:04

that it opens up a new world

31:07

of possibility, and it releases

31:09

entrenchment over different financial markets

31:12

and models. but also those

31:14

believers, the ones that I can engage with the

31:16

most heartily are also the ones who are the first

31:19

to acknowledge that you need.

31:21

Some controls, some agreement,

31:24

right? Some understanding if people are going

31:26

to rely on that technology and build on

31:29

top of it. And so it is this, there's this healthy

31:31

middle ground. I don't think it's a neither or

31:34

I think it's a yes and

31:35

Right. Yeah. And I mean, again, it's like

31:38

the internet exists because of the government

31:40

in the first place. Right.

31:41

yeah,

31:42

and there are all kinds

31:44

as does AI.

31:45

Yes. Right. And, and there are all kinds

31:48

of regulations and, and subs that help

31:50

create that groundwork. And I would argue for example,

31:52

because I'm like the biggest Section

31:55

230 fanboy, that there is

31:57

like, you know, I'm a big supporter

31:59

of Section 230. That is a regulation, but

32:01

that created the framework

32:03

for how internet companies

32:05

could moderate and could do trust

32:07

and safety and do so without facing lawsuits

32:10

for everything that they did. So you had

32:12

this sort of government regulation that,

32:14

set the, playing field and made it level

32:17

and made it so that you could have competition

32:19

and you could have experimentation. but

32:21

that takes some level of forethought.

32:24

Well, and, and also, and I, and then

32:26

I would counter that by saying like, yes.

32:28

And can you imagine how different

32:30

the attention economy would look if

32:32

we had had a federal data privacy law early

32:35

on,

32:35

Yeah.

32:36

right? Like, and had not had an entire

32:39

economy that was based on data harvesting and then

32:41

consequently engagement.

32:42

Yeah. Though though, again, I

32:44

like to, to be clear, like this is really, really difficult

32:47

stuff. I do think if we got a data

32:49

privacy law in like, let's

32:51

say 2004, it probably would've

32:53

been a terrible law because, we

32:56

wouldn't have even realized what it was that, like,

32:59

how to put in place the sort of proper regulations

33:01

for that. And so that's, that is

33:03

This is so, it's, it's so hard. Yeah.

33:05

No, I totally agree with you on the, the way

33:07

that. Deliberative governance,

33:10

deliberative and informed governance, and

33:12

the speed at which technology moves

33:15

like those two things are somewhat antithetical

33:17

Yeah. Right. I mean, this is, this

33:19

was like another point that I, I, other people

33:21

have raised, this is not a unique to me

33:24

viewpoint, which is like the way that the tech

33:26

world thinks is like, launch

33:28

and iterate and you're sort of continually iterating

33:30

and you launch and you realize there are bugs and you realize

33:32

things are gonna go wrong and break. Government

33:34

doesn't work that way. Now that is

33:37

apparently some of the way that Elon Musk

33:39

is thinking about government because he's just breaking

33:41

stuff and like, we'll patch it later

33:43

right,

33:44

without realizing like, you can't quite

33:46

do that for some of the

33:48

stuff that you're trying to do. But I understand

33:51

where that mindset is coming from because that is the tech

33:53

mindset. It's

33:54

yeah. But when you cut off the, you know, when you cut off

33:56

a Twitter corporate card versus

33:58

when you cover, when you cut off the corporate

34:00

cards of, of people who have to pay

34:03

for dry ice to keep

34:05

cooling specimens

34:07

that are, you know, key to

34:09

decades of scientific research. The

34:12

impacts of that decision are wildly

34:14

different. I don't particularly

34:16

care if anyone's corporate card at Twitter or

34:18

X gets cut off. Like,

34:21

I can, I can roll with whatever those ramifications

34:24

are. and so I, I think just to

34:26

go back to the, broader point in

34:28

terms of looking at the bigger picture,

34:31

this narrative that we're hearing come out

34:33

of the administration that is so, so, so anti

34:36

governance and regulation is also

34:38

very misaligned with where the rest

34:40

of the world sits. Uh,

34:44

and so, you know, you see this,

34:46

there was an article that we were talking about, you know, this

34:48

week that came out around the criminal

34:50

rings that have been, you know, extorting

34:53

teenage boys who have then committed suicide,

34:55

but also these criminal gangs that

34:57

have been trafficking folks, you know,

34:59

they've been being held in Thailand and

35:01

Myanmar. There's been an outpost in Ecuador.

35:04

there is a real thirst within the international

35:06

community for normative consensus

35:09

on some of these issues. Right.

35:11

you saw this with The real uptick

35:13

on the UN Cyber Crime Convention,

35:16

which I think frankly the first

35:18

Trump administration was very wise to nip in the

35:20

bud. And then the Biden administration is

35:23

the one that took that on, because

35:25

it had been posited by Russia

35:28

and they were attempting to make sure

35:30

that Russia didn't run the table there. I know where they

35:33

were coming from, but what we ended up with is

35:35

a convention that doesn't actually serve business

35:37

or really do much to prevent

35:40

cyber crime.

35:41

can, can just, just for our listeners

35:43

who haven't followed, I've written about it

35:45

a few times and there there's some other writeups, but

35:47

can you just really quickly summarize what the

35:49

UN cyber crime, treaty is?

35:51

Are you suggesting that not everyone is paying attention

35:54

to UN treaties?

35:55

it is, it is possible. Uh,

35:57

so if, if we just had like a, the, the quick,

36:00

version of what it is, because it is really

36:02

important and it has sort of flown under the radar,

36:04

unfortunately. and it is very serious,

36:07

and I do not, I still do not understand

36:09

why the Biden administration went for it, but just

36:11

give a quick, quick summation of it.

36:13

Yeah, I would say this is, um, the

36:16

cyber crime convention

36:18

was, it's very law enforcement

36:20

driven. And the idea was essentially

36:23

that you need to be able to

36:25

have a treaty that allows countries to work

36:27

with each other on, battling

36:29

cyber crime. So battling ransomware,

36:31

battling fraud, right? Uh,

36:33

and that would include data agreements, extradition

36:36

agreements. All of these things sound

36:38

sort of like they make sense, but the

36:40

terms that are in there are very fuzzy

36:43

and they expose, in particular a lot of American

36:45

companies and a lot of American multinationals

36:47

to a really insane

36:49

amount of risk. And so in 25

36:51

years of doing tech policy, I have never seen,

36:54

such an unlikely

36:56

assortment of individuals

36:59

and companies who were against this particular

37:01

convention. Like when, when Maria Ressa

37:03

and Meta agree on the same thing. That's

37:06

a sign, right? and so you

37:08

had literally, you

37:09

And just really, really quickly, again, for people

37:11

who don't know, Maria Ressa is a journalist in

37:13

the Philippines, who has been a, a huge

37:16

critic

37:16

huge critic

37:17

of Meta. But anyways,

37:19

And of Meta. But, so anyway,

37:21

It's rare in the United Nations

37:23

for a treaty to move quickly. This

37:26

happened in just a couple of years, which

37:28

again, for the United Nations is like warp speed.

37:30

It, it's very much on the pathway to getting

37:33

adopted. It's going to significantly

37:35

increase a lot of risk for American

37:37

businesses. America will never ratify it, but

37:39

lots of countries where American businesses have offices

37:41

will, and especially

37:44

in the age of ai, it creates a lot of risks

37:46

for people like AI researchers and

37:48

AI safety mechanisms, that

37:51

I think weren't foreseen in the drafting

37:53

of it. What's been really interesting

37:56

about that convention though, is

37:58

that it's really coming from

38:00

this thirst that so many different

38:02

governments have to be able to deal

38:05

with cyber crime. Like there is uniform

38:08

consensus across governments

38:10

that this is not something that is healthy

38:12

or that most governments want to

38:14

support. and so there is a lot

38:16

that we could be leaning into

38:18

And yet,

38:20

And yet, instead, what we're doing is,

38:22

burning trust and burning

38:24

bridges with nations all over the world. And

38:26

you can agree or disagree that the United Nations

38:29

is like where things happen. But

38:31

for the G 77 or for like the lower

38:33

and middle income countries, in particular,

38:36

the United Nations is very much the place

38:38

where they feel like they have a voice.

38:40

Right.

38:41

and because it is to some degree, it's, you know, you're

38:43

whipping votes, right? And it's a vote count.

38:46

One of the things that China has leveraged particularly

38:48

well, over the past many years,

38:51

and Russia has to some degree, but China

38:53

in particular, has really

38:55

done a great job of building up its

38:57

alliances and its strength inside

38:59

that institution. And where you tend to see

39:01

American businesses kind of poo-pooing

39:04

the UN as it gets super slow and it doesn't

39:06

achieve anything and it's not important. I

39:08

think what that perspective misses is that

39:10

the UN is an incredibly easy space to co-opt

39:13

if you are strategic and if you are thoughtful

39:15

and intentional about doing so. And

39:17

so, we ignore it at our peril, not

39:19

because of what it can achieve, but

39:22

because of what can be achieved through

39:24

it by other actors who

39:26

are not acting in alignment with American interest.

39:28

And, and there have been a lot of efforts to do

39:31

so, I mean, beyond the cyber crime treaty, there

39:33

was, there were efforts, you know, a few

39:35

years ago through the ITU to effectively

39:37

sort of take over governance of

39:39

the internet in a process that was

39:41

really led by China and Russia as

39:43

well.

39:44

and you're seeing it now. I mean, there's,

39:46

there's something called WSIS, uh,

39:49

which is happening this summer. everyone

39:51

have fun with that acronym. Uh,

39:54

but essentially this is a global

39:56

gathering where, you know, historically you

39:58

sort of set the standards for the

40:00

next 10 years of internet and internet

40:02

governance. And I can tell you that right

40:05

now, the United States and most

40:07

of the, you know, most of the democracies, most of the countries

40:09

that champion a true interoperable internet,

40:11

they're nowhere to be seen, right? They're too busy

40:13

dealing with everything that's, happening

40:16

with this administration. And so

40:18

they are not coming in with a strategy. Meanwhile, China

40:20

is very much driving a

40:22

strategy with the G 77

40:25

to show up in force and

40:27

in a way that risks a

40:29

much more splintered internet and internet

40:31

where the way that we've thought

40:33

about having a digital footprint,

40:36

is catastrophically undermined, and

40:38

people are not paying attention to that at all.

40:40

And so it's, we, we ignore these

40:42

things at our peril because of,

40:44

I think because we've gotten complacent. Frankly,

40:47

and we forget how hard the United States

40:49

worked to protect that for

40:52

decades as a top priority

40:54

in foreign policy.

40:56

speaking of which, we'll shift gears a little

40:58

bit, but it is kind of the same story in

41:00

a slightly different way. One of the other

41:02

things that you, pointed out was that

41:05

Jim Jordan, who Ben insists

41:07

is my best friend, has

41:10

sent subpoenas to a bunch of

41:12

the tech companies specifically

41:15

about their communications with

41:17

foreign countries regarding content

41:20

moderation on their services. Talking

41:22

about the eu, the uk, Brazil,

41:25

Australia, and have sent these

41:27

subpoenas to basically

41:29

all the, well, all the big

41:31

companies and a few small companies. So there's

41:33

Google, Amazon, Apple, Meta,

41:35

Microsoft, X also,

41:38

which is interesting, TikTok, and

41:40

then also Rumble, which is the,

41:42

um, YouTube for MAGA, basically

41:44

if, if

41:45

Surprisingly not to Truth Social.

41:47

surprisingly not to Truth Social,

41:49

yeah, I wonder, I wonder why, um.

41:52

So basically demanding, making

41:55

a bunch of claims which are obvious nonsense,

41:57

claiming that Jim Jordan himself,

42:00

proved that the US

42:02

government and the Biden administration pressured

42:05

the companies into, taking down

42:07

conservative speech, which I keep

42:10

repeating, like, this went to the Supreme

42:12

Court last year, and Amy Coney Barrett

42:14

wrote, over and over again, there is no

42:17

evidence. You have no evidence

42:19

to support this.

42:20

not only did she write that, she wrote a footnote

42:23

that was so scathing

42:27

that every single, you know, all of us

42:29

with law degrees were like, oh my God, this is

42:31

like my nightmare. That somebody

42:33

that a Supreme Court justice would write a footnote

42:36

like that about, like, that's like you wake

42:38

up in sweats in law school fearing

42:41

that something like that would happen. It was,

42:42

I don't even remember the, the exact footnote. Was this

42:44

the, like the lack of candor or there was something along

42:47

those lines right there. I forget the

42:49

was basic. It was basically the footnote that was

42:51

essentially like, normally

42:53

we wouldn't question appellate decisions unless

42:55

it's like such an

42:57

Right, right, right.

42:58

Like oversight. That in fact, no,

43:01

like this is indepen.

43:02

so he's now sent subpoenas and, and basically

43:05

said, companies have to share the,

43:07

communications and basically like, you

43:09

have to prove that you, stood

43:12

up against these censorship

43:14

demands, is the way it's framed.

43:17

And it'll be interesting to see,

43:20

like in the subpoenas, he, highlights

43:22

of course, X as being like the gold standard,

43:24

which is garbage. and just,

43:27

disconnected from reality, but then

43:29

also uses Zuckerberg's,

43:32

spinelessness in making

43:34

claims now that Yes. Oh, the Biden

43:36

administration was too tough on him. that,

43:38

okay, those guys are standing up to them now

43:41

and you have to prove to Jim

43:43

Jordan now how these other companies are

43:45

supposedly standing up to, foreign attacks

43:48

on speech. And so this is kind

43:50

of mind blowing in its own way. The

43:52

fact that a US official

43:55

thinks that they can get the communications

43:58

between foreign governments related

44:00

to content moderation stuff, and

44:02

that it's framed in this way that is, again, just

44:04

completely disconnected from reality.

44:06

No, it's, I mean, you know, why oh why, oh why did

44:08

he ever leave Ohio? I

44:11

mean, this is, uh, I

44:14

say as a mid-westerner, um,

44:16

it was really funny to me in the sort of press release

44:18

from the committee. There was a phrase,

44:21

you know, X has pushed back against lawless

44:23

judicial orders in Brazil and Australia

44:25

mandating global content takedowns. Now

44:28

you can absolutely

44:31

have a debate about whether

44:34

Brazil or Australia

44:36

have the right that a company

44:39

is required essentially to do a global

44:41

content takedown based on any one jurisdiction.

44:44

That is a completely fair discussion

44:47

to have. Right. But the idea

44:49

that they're lawless, judicial orders, you, you

44:51

can't have a law

44:53

Right.

44:54

like,

44:55

by definition,

44:56

just because you don't like the law doesn't make

44:58

it lawless. Right. And

45:00

so, you know, one of the things that I find really concerning

45:03

about this as well is also looking

45:05

at what happened with the information research

45:07

space and looking at how

45:10

Jim Jordan in particular and his

45:12

committee have sort of weaponized their

45:14

subpoena powers in order to force

45:16

discovery, and essentially escalate

45:19

harassment. There is a very McCarthy

45:21

era

45:22

Oh yeah.

45:22

vibe to

45:23

is, he is the, the, the modern

45:25

McCarthy, right? I mean.

45:27

and this is something again where, you know,

45:29

you would hope, you know, the companies historically,

45:32

I, I can't speak as much for Amazon,

45:34

or Rumble. But at least

45:36

for, you know, for Meta, for Microsoft,

45:39

for Alphabet, I mean, I served on the board of a,

45:41

you know, a multi-stakeholder initiative called the Global Network Initiative

45:43

for five years, which is all about voluntary

45:46

principles. and I have reviewed

45:48

hundreds, if not thousands of pages of confidential

45:50

reports in, independent audits, going

45:52

through how the companies have responded

45:55

to takedown requests,

45:57

data requests from different governments

46:00

and historically where, you know, these companies

46:02

have complied with the voluntary principles and these companies

46:04

have said, meta very famously

46:07

agreed to censor more information in

46:09

Vietnam, and in its public statements,

46:11

explained that the Vietnamese government was either gonna

46:13

kick them out or they had to agree to censor more

46:16

content. And so meta agreed to censor more

46:18

content because they thought that, at

46:20

a net level, free expression

46:22

in the country would be served by them being a

46:24

platform in the country, rather

46:26

than not being in the country at

46:28

all. Now, also, it's a revenue generating

46:30

market for them. So, you know, there's that, but

46:33

that idea of like, well, at a net

46:36

level, free expression will be protected.

46:39

That's really where the discourse used to sit

46:42

And that's, that's an interesting place for the discourse

46:44

to sit. Like what is,

46:45

that's an important discussion, right? And I think it's,

46:47

it's a principled one. This,

46:50

you know, as you know, I like, I know you

46:52

had Renee DiResta on recently, and I think

46:54

the way that Renee has talked about the memification of

46:56

free speech is such a, great

46:58

description of what has happened with this concept

47:01

and this slogan. And so we've ceased to be able

47:03

to have a principled debate about this,

47:06

that's based in either the first amendment or in,

47:08

article 19 from a human rights lens.

47:10

And instead we're having a, political

47:14

theatrics conversation that isn't

47:16

at all about free speech, but about

47:18

your power to control

47:21

that speech, which you don't like. And

47:23

there was also something about, the extent

47:26

to which the Biden Harris administration aided

47:28

or abetted, you know, these efforts,

47:30

again, you know, this and the, like, may

47:33

I redirect everybody to the

47:35

jawboning executive order that the Trump

47:37

administration pushed out like well before

47:40

Biden. these companies are not neophytes,

47:44

you know, who are, who are being abused.

47:46

many of them have incredibly sophisticated

47:49

legal teams. They have incredibly sophisticated

47:51

processes. they're not fussed

47:54

by any particular country's

47:56

official, yelling at them,

47:58

Right. and a lot of them have, built

48:00

up, tremendous, skills and,

48:02

experience in pushing back on

48:04

countries that are doing things. You know, and

48:07

there are interesting conversations to be had about

48:09

how do you deal with that? And obviously like global takedowns,

48:12

like I've supported, companies pushing back

48:14

on global takedowns,

48:15

and interesting conversations too around like,

48:17

there's so much focus right now on the DSA, right?

48:20

And the DSA is also kind of, in

48:22

foreign affairs, becoming a point

48:24

of real tension in transatlantic relations

48:27

and where it used to be a point of tension

48:29

between the companies and Europe. It is

48:31

now a point of tension between the American government and

48:33

the eu, which is a new, issue

48:36

or a heightened issue, I should say now. And

48:38

what's interesting to me is that, again, the DSA is

48:41

really leveraging the, political

48:44

Trojan horse of the free speech arguments

48:47

and debates like those have become

48:49

a hook

48:50

Yes.

48:51

that people are landing on when

48:53

in, fact, again, if you were looking at this from a principled

48:55

stance in terms of risks to American industry,

48:57

arguably the, DMA, the Digital Markets

48:59

Act is, going to be far more impactful than

49:01

the Digital Services Act. Would be, but

49:03

the Digital Services Act is a lot easier to sort

49:05

of package, and to put into

49:08

this free speech

49:10

container, which is actually

49:12

now being used to hold all

49:14

sorts of other fights and power

49:17

dynamics under the auspices

49:19

of a foundational right.

49:20

Yeah. Alright. I think

49:22

on that happy note, I

49:26

think we can wrap this one up. you

49:28

know, this is a little bit of a different episode than we normally

49:30

do. We did, touch on a bunch of stories and we'll have, links in

49:32

the show notes, but I think we did

49:34

think it was important to understand the

49:37

framework in which we normally are thinking about

49:39

online speech and online regulations

49:42

and trust and safety and all of that is:

49:45

We're entering a different world, and it's important

49:47

to understand that world and how that matters.

49:49

And so I think this conversation was really useful

49:51

and at least for me, hopefully for

49:54

our listeners as well in, in

49:56

thinking through all of that. So, Kat, thank you so much

49:58

for stepping into Ben's shoes. you

50:00

didn't bring the British accent, but, uh, a

50:03

lot, a lot of, a lot of insight.

50:06

I did not, I won't, I won't sound if I,

50:08

if I could only have one, I would sound so much smarter,

50:10

you know.

50:10

There we go. That, that's the trick. The

50:12

British accent always helps. But, but, but thank

50:14

you so much for joining us and, thanks for a really interesting

50:17

conversation. And, uh, Ben will

50:19

be back next week and, we'll have another

50:21

one of these. So

50:22

Well, thanks for having me. I hope you go find other things

50:24

to smile about today.

50:26

There we go. All right.

50:27

All right. Have a good one.

50:32

Thanks for listening to Ctrl-Alt-Speech.

50:34

Subscribe now to get our weekly episodes

50:37

as soon as they're released. If your

50:39

company or organization is interested in sponsoring

50:41

the podcast, contact us by visiting

50:43

ctrlaltspeech.com. That's

50:46

C T R L Alt Speech dot com.

50:49

This podcast is produced with financial support

50:51

from the Future of Online Trust and Safety Fund,

50:54

a fiscally sponsored multi donor fund

50:56

at Global Impact that supports charitable

50:58

activities to build a more robust, capable,

51:00

and inclusive trust and safety ecosystem.
