Self-Driving Government - Meta Torrents Books, Sideloading TikTok, Xbox Sales

Released Monday, 10th February 2025

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

It's time for TWiT, This Week in Tech. Lisa Schmeiser is here from No Jitter, Daniel Rubino from Windows Central, attorney Cathy Gellis. We just learned the United Kingdom is asking Apple for a backdoor into its end-to-end encryption. What will Apple do? We're going to talk about the KOSMA bill, which is a plan by Congress to ban social media for all people under 13. And we'll talk a little bit about what DOGE is doing and whether it's legal. That and a whole lot more, coming up next on TWiT.

0:46

Podcasts you love, from people you trust. This is TWiT, 2025: "Self-Driving Government."

0:58

It's time for TWiT, This Week in Tech, the show where we cover the week's tech news. Doing it a little early this week; apologies to people who like to watch the stream live, I hope you got the message. The memo arrived because there's apparently a football game on later in the day. Lisa Schmeiser, are you all excited about the big Super Bowl? I like it better when the 49ers play, and the reason I like it better is because the roads empty out and I can go anywhere in the Bay Area without traffic. It's true. This one is going to be so low-rated, I think, because it's a replay of two years ago; I think the hype is off. Editor-in-chief at No Jitter, it's great to see you. And I see more and more people using the Bluesky handle nowadays: lschmeiser on Bluesky. Hello, good to see you.

1:56

God, your hair looks good. What's your conditioning routine like? I don't know, it is a complicated... Is it? Well, you know, you gotta have a regimen, right? You know, I just found out you can get shampoo in a bar, like soap. Oh. And it saves plastic. I know. Just a little tip for you guys: they make shampoo, conditioner, and body wash, five-in-one, baby. Mine was a two-in-one. Irish Spring five-in-one, it cleans everything, including grout.

2:36

Good to see you, Daniel. Thank you for being here. Also our attorney at large, IP attorney, a contributor to Techdirt; you've probably been reading her somewhat upset (what's the word for her posts on Techdirt... critical?) at cgcounsel.com. Great to see you, Cathy. It's always a question. Now I realize: C-G-C-O-U-N-S-E-L. Oh, we left out a C in there? No, she's not a council; she's a counsel. That's right. I also recommend never getting a domain name that is, oh, I forget if it's a homonym or a homophone, but it was a bad idea. You have to get all the other spellings, C-O-U-N-C-I-L and all of that. M-O-U-S-E-O-N, that too.

3:26

Big story today. There are a lot of big stories today, actually. We've got a jammed show, so we're going to jam along. The UK has ordered Apple to give it access to encrypted cloud data. This is from the Financial Times: the British government has ordered Apple to grant secret access (that's the interesting part; how do we know this? I think there was a leaker) to its customers' encrypted cloud storage. This comes from the Snoopers' Charter, the UK Investigatory Powers Act. Last month Apple received a, quote, "technical capability notice" requiring them to create a backdoor to iCloud. Now, the government has not said this, and Apple has not said this; they're not allowed to. It's one of those "according to people familiar with the matter" stories. The move would enable UK... well, you know, I started saying UK, but it isn't just the UK. The move would enable global law enforcement and security services; everybody would be affected. It's extraterritorial, which means UK law enforcement could access the encrypted data of Apple customers anywhere in the world, including in the United States.

4:39

You know what? This has been going on for years. The FBI's been asking Apple to do it; Apple said no. Australia's tried this. But this is the first time a nation-state has actually, as far as I know, told Apple explicitly: we need a backdoor. I feel like it had happened before, a number of years ago, and I thought it was kind of put to rest back then, so I was surprised to see this news, where it just kind of came to the fore: oh yeah, we're revisiting this issue and ignoring every reason anyone ever told us for why it was a terrible idea. So, wasn't there a big deal about this with the San Bernardino shooter? Yeah, and Apple said no to the FBI.

5:29

Yeah, yeah. Go ahead, Daniel. I'll just say, I think they used to have, like, kind of holes in their operating system where third-party software was able to access this stuff, and then Apple patched all that. Yeah; well, so Apple has in the past been able to do this. In fact, they told the FBI, in the case of the San Bernardino terrorist, that oh, if you had just taken his phone home, it would have uploaded the contents to iCloud and we could have given it to you. So as recently as whenever that was, eight years ago, ten years ago, Apple would and did do that. They've done it in other situations as well. Trump's campaign chief was using WhatsApp, which was encrypted, to message (this is in the 2016 election), but WhatsApp uploaded unencrypted text to Apple's iCloud, and the government got it. We know that happened because, if you read the indictment, they quoted his WhatsApp messages from iCloud. But Apple patched that hole. As far as we know, they fully encrypted everything.

6:44

In order for iCloud to work, they have to have a key, for a variety of reasons. If you have fully encrypted cloud storage, there are a lot of abilities that go away, which is why most cloud storage companies, including I think Microsoft, offer (correct me if I'm wrong, Daniel) kind of an enclave, an encrypted enclave, and then for the rest of it, well, you know, we have the keys to it. Isn't that right? Yeah, it comes as, like, vaults. Yeah. Apple does not offer that. Apple just says: we have the keys, but we're not going to give them to anyone.

7:20

This is a huge issue, as we know from backdoors in the past. It's not been so long ago that it was revealed that Salt Typhoon, Chinese hackers, had access to the phones of many high officials in government. Because, going way back to the '90s, the government mandated that there would be wiretap capability in digital technologies. At the time, the FBI director said, don't worry, because this backdoor will never leak out. Well, as a result, we've got Chinese hackers in our phone network, and we can't do much about it, incidentally. So this is always a bad idea. I think every real security guru says there's no way backdoors stay secret. And the security folks this week, they're having a bad week. It's a bad week for security in general.

8:16

Yeah. Go ahead, Daniel. Sorry, I was going to say: you know, what's also scary about this (and I mentioned this earlier) is that if they do this, then they can access data from Americans indirectly, right? And this is what the NSA has always done. The NSA can't spy; under the Patriot Act, it can't spy directly on Americans. But if they're spying on someone else in another country who happens to have access to Americans, that data gets collected as well. So it's an indirect way. So I wouldn't even be surprised if the US government is, like, shrugging their shoulders: you know, we're not saying you guys shouldn't do this, because it would probably help them as well. They wanted the Five Eyes, you know.

9:06

A consultant advising the United States on encryption matters, according to the Washington Post, who broke this story, said Apple would be barred from warning its users if this happens. Apple, you know, had rolled out a couple of years ago (and this is what started this) Advanced Data Protection: a switch you could turn on. Most people don't, because it eliminates, as I said, some of the features of iCloud and other things. But they offered it, and they specifically encouraged government officials to use it, because Apple didn't even have the keys to this; it was fully encrypted. Apple had tried to do this a couple of years earlier and backed off, according to the Post, after objections from the FBI; this was back in the first term of President Trump. The service is an available security option for Apple users in the US and elsewhere. I'm not sure if it's available in the UK. Probably what will happen is Apple will make it no longer available in the UK. But that doesn't solve the problem in the US.

10:15

Senator Ron Wyden of Oregon, a Democrat on the Senate Intelligence Committee, said the US has to dissuade Britain: Trump and American tech companies letting foreign governments secretly spy on Americans would be "unconscionable" and an "unmitigated disaster" for Americans' privacy and national security. I know why you're laughing; he knew what he was saying. Signal president Meredith Whittaker said using technical capability notices to weaken encryption around the globe "is a shocking move that will position the UK as a tech pariah rather than a tech leader"; if implemented, the directive will create "a dangerous cybersecurity vulnerability in the nervous system of the global economy." You know why Meredith Whittaker is commenting on this? Because Signal could be next. Because there's no backdoor in Signal, and the UK's obviously... Yeah, she's fantastic. She writes a lot of great stuff on this topic. Yeah.

11:17

This, you know, given all the news coming out of DC, will probably not get the coverage it should. But if you're watching our shows, you understand why this is such a big deal. I think it's up to Apple to somehow figure out how to resist on its own. We don't have a foreign policy at the moment; we barely have a domestic policy at the moment. So the conventional wisdom about how we would approach this is not available to us. This is not something where you get together, you speak to your government, and you sort of point out: okay, I know you're tempted, but here are the consequences if this happens. This is not a government that is equipped to consider the consequences of decisions that it makes. And at this point, given how much power is embodied in any particular administration and getting exercised in any particular way, it's probably all for incursions on, you know, privacy; that seems to be the administration's position in an awful lot of technology areas these days.

12:25

Android is also at this point end-to-end encrypted in its backups. It does not have the keys. So remember that this came out because of a leak. It's very likely that the same order went to Google, and went to Signal, and went to Meta for WhatsApp; because, I mean, why would they just target Apple? Google's been encrypting backups for Android phones since 2018. Google declined (this is interesting) to say whether any government had sought a backdoor.

12:59

Google has turned into such a, like, weird company, right? They were, like, the good guys for so long, and then of course they had that "don't be evil" thing, and they got rid of that, and then they were like... And now this week they're like: we think it's fine to use AI to make super-smart weapons. Yeah, yeah, that's a line of revenue. They got rid of DEI. Well, maybe a little evil; now we're all about the evil. Define evil? Well, to be fair, getting rid of DEI seems to be the thing. It's interesting because it's in anticipation of trouble from Washington; nobody's told them to do that yet. They're complying with Washington in advance, and it's missing the states.

13:47

So what I'd really like to gently push back on is: instead of calling it DEI, which reduces it to an acronym and a buzzword, let's just point out what companies are voluntarily doing right now, which is downplaying or limiting efforts to make the workplace more diverse, more equitable, and more inclusive. That's it. Companies are explicitly saying: we no longer value diversity, we no longer value equity, we no longer value inclusion. Those are the words; let's use the words, not the acronym. No, I agree with you. But why (and this is a softball) is it important that a company like Google be diverse? McKinsey has done tons... Okay, sorry, I'm gonna pop off for a minute, and I hope that's okay, everybody. That's why I asked the softball question, Cathy and Daniel.

14:37

McKinsey has nearly a decade of research examining the performance of publicly traded companies when you have a diverse workforce, especially diverse leadership from management up to C-level and board of directors, compared to companies with a homogenous culture. And the difference in profit is in the double digits, percentage-wise, as is the difference in employee retention and the difference in overall productivity. It's actually in the best interest of any shareholder to demand DEI, because it boosts overall company performance and the bottom line. Why somebody would attack diversity, equity, and inclusion and say "we would prefer to take these out of our company": you're basically saying we'd like a company that's less profitable. And I think if we're going to talk about this, we need to talk about McKinsey's research. There's been research from Pew, there's been research from other institutes, focused on taking a look at the participation of populations in the workforce and what the net effect on the bottom line is. And the research, if you do a... what is it called when you take a look at multiple studies and come to a conclusion? A meta-analysis.

15:51

Yeah, it's fairly unambiguous. Diversity, equity, and inclusion brings to bear a wide variety of talents, of strengths, of viewpoints, of perception of market opportunities that homogenous populations simply don't have. It's better for business, it's better for society; it's a win-win. And that's why it matters. And our Discord says that this administration is exchanging DEI for Homogeneity, Exclusion, and Inequity in Labor, which, interestingly, is HEIL. I wonder if that's pointed, satirical commentary. But somebody also was saying: no, without DEI we'll just have merit. And I think what we need to push back on is this idea that DEI was foreclosing merit. DEI was a way of making sure you captured merit. Yeah, so that's the point they're making: you should promote people, you should hire people, based on ability alone. Talent. Yes, and that's how you get it, by making sure that you're tapping into it from the full pool.

16:42

I would also raise the point that if you're going to develop AI and you do it only with white men, you're going to have the same problem you have if you make games and you only have white men writing them. Well, we already know what problems you have. In fact, Gamergate has now won; it's seven years later, and Gamergate won. Well, you take a look: I can remember going to CES a couple of years ago. And by the way, none of the companies I'm going to mention (I'm not going to mention names) are in business; I've checked. But I can remember going and taking a look, just out of curiosity, at somebody who's like: I have a smart sensor for a diaper. When you attach it to a diaper, an alert goes off on your app, and this way you know to change the baby. Because you're in the other room? In another state? You're out of town? First of all, babies are not hesitant about letting you know; trust me, I know. I said: so does this sensor send an alert to every caregiver responsible for the baby? And they said: no, it's a closed ecosystem, so it's one sensor, one app, one phone. If you want multiple users, you have to attach multiple sensors. And I was like: ever been in a household that functions with multiple people? Is that baby walking funny? Well, she's got four sensors in her diaper. Yeah, no, this guy thought this was a perfectly reasonable solution. "Guy" is the operative word. And I was like: you don't have kids, do you? Or a partner? Have you ever changed a diaper in your life? Well, there was a lot of personal technology that solved a very specific problem for a very specific demographic, or there was somebody who perceived a market need, but because they didn't have lived experience, or access to people with lived experience, they went ahead and wasted time and money and effort creating something that was completely unusable for the target audience. Amazing. This is a core problem when you have a homogenous workforce (that's a great example): they won't be able to perceive anything outside their experience, and they won't even know how to get outside their experience after a while.

18:37

after a while. I want to make

18:39

sure that I get out the whole

18:41

idea about why merit and DII are

18:43

synonymous and not paradoxical. Because somebody is

18:45

suggesting, well, skin pure pigment has nothing

18:47

to do with merit. Well, A, D,

18:50

I has more to do than just

18:52

skin pigment. But sure, of course not.

18:54

None of these qualities that D, I

18:56

make sure, are mixed up in our

18:58

company so that you have all sorts

19:00

of people from all sorts of walks

19:02

of life, physiology, culture. religion, etc. The

19:04

fact that they can all come together

19:06

and bring their contributions is important because

19:08

it also produces as Lisa was saying,

19:11

a more powerful company because it knows

19:13

how it has a bigger bag of

19:15

tricks to pull from to be able

19:17

to produce to do its business. But

19:19

also the reason why these efforts are

19:21

important is because there are structural issues

19:23

that prevent some of these populations from

19:25

getting in the door to be able

19:27

to make those contributions. And DEA I

19:29

was about removing those barriers to make

19:32

sure that those people could be included.

19:34

because we need

19:36

them to be included

19:38

because it will

19:40

have better business. Well,

19:42

I mean, selfishly,

19:44

the businesses will make

19:46

more money if

19:48

they have access to

19:50

more people with

19:53

more ideas. And there's

19:55

all sorts of

19:57

things. When there's things

19:59

interfering with that,

20:01

it's good to knock

20:03

down the things

20:05

that are interfering with

20:07

it, just because

20:09

we want a better

20:11

culture in country

20:13

that might be more

20:16

mixed up and

20:18

equal. What is the

20:20

percentage of women

20:22

CEOs in this country?

20:25

I don't know, but not high. It's very

20:27

low. Single digit. So

20:30

So the other thing some have pointed out, and I agree with them, is that a lot of the companies that had so-called DEI initiatives, it was just lip service. Basically, companies are companies, right? They don't have a heart. They don't have a conscience. They're there to maximize profits; that's almost their responsibility, their corporate responsibility. So when the wind blows one way and everybody says, oh, you've got to do DEI, they write up a thing and they put it in their mission statement, and done. They find the stock art with smiling people of different colors. Yeah, every front page of every website has a black person, a white person, a woman, and a Chinese person. It's just what you do. But is it lip service, or is it genuine? I think in Google's case it was somewhat genuine. I think in a lot of the companies it was somewhat genuine. Yeah. I mean, even in Target's case it was genuine. They had programs, they had shirts, they did LGBT outreach. They understood one of the other things about being a business that values DEI (where it really is values, not just an acronym), which is that these are your customers, and they've got money, and they can give you the money if they think you're selling something worth giving you money for. I remember Martin Luther King Jr.'s son on TV saying: well, if you're not going to include us, we're not going to include you. And I think that's a very important point, too, for a company that very publicly says: we're going to roll back our hiring goals.

21:59

And this is what Google did. On Tuesday they removed the language in their 10-K to the SEC which said, quote (and this is all they're saying), "We are committed to making diversity, equity, and inclusion part of everything we do and to growing a workforce that is representative of the users we serve." That seems pretty anodyne. It doesn't seem... No, because that tells you who their target audience, the users, are. I think you need to pay attention to that phrase. So, okay, that's interesting. By the way, Google's efforts did increase the number of women and the number of people of color working at Google by a few percentage points. It didn't overnight make it equitable. I think one of the things you're going to have to look for in the next year or two... They've had a return-to-office push too, haven't they? Yeah.

22:48

One of the emerging trends in workplace research (Slack has shown this; I wish I could pull the names of the vendors off the top of my head), something that's coming up as people take a look at the impact of return-to-office policies, is that the impact disproportionately affects women and parents. There is a suggestion that these policies are discouraging equitable participation in the workplace, because as people come back to the office, it's not as productive or merit-based an environment as it was when you were purely producing in a collaborative remote environment; there are external social stressors, especially if you're a parent and you're responsible for child care, things like that. So this is something I'd be very, very curious to look at with Google over the next few years: they might be coasting on numbers that were boosted by previous diversity, equity, and inclusion initiatives. Let's see what those numbers look like in 2026 and 2027, after a full year of RTO and after a cultural downplaying of diversity, equity, and inclusion initiatives. Because I think I can see a change.

24:08

Both Google and Microsoft have said hybrid's not going anywhere. Yeah. But Google does say you've got to come to work three days a week. And I think Microsoft... is it the same at Microsoft, Daniel? Do you know? Because remember, Amazon said five days a week. Amazon said: you're back in the office, you're sitting at that desk. Great way to shake people out. Well, I wonder how many people they lost; what is the cost of losing the best people before you shake out the people you want to shake out? I think that's why Google and Microsoft both kept hybrid. Yeah, the great thing about a layoff is you can target your cuts, whereas when you're trying to shake people out by attrition, the best people leave. The best people say: no, I'm not coming back. Well, the most hireable. That's right; that's what I mean. Yeah, I'm not sure that equates, but yes: it's the people who can get another job, who can get another job where they don't have to work in the office.

you look at, okay, this is a collapse

25:07

of a whole bunch of things that should

25:10

not have collapsed, but let's look at

25:12

where the pushback power is. And

25:14

so one of the places that

25:16

there's pushback power is among consumers

25:18

and the public and enough people outraged

25:21

about it that it can certainly create

25:23

some market force pressure and we should

25:25

look into doing that. But the other

25:28

thing high as a lawyer I want

25:30

to know is these companies who thought it

25:32

was in their interest to do

25:34

it. I don't know if that

25:37

quite plays out because even if

25:39

they thought that their business depended

25:41

on the benevolence and quotes of

25:44

Trump, that's not the only

25:46

authority that governs them. They

25:48

are still exposed to states

25:51

and states have laws. They

25:53

have enforcement powers and they've

25:56

got courthouses that private aggrieved

25:58

people who believe they've been

26:00

discriminated against can still access. So

26:02

all they've done is in trying

26:04

to protect themselves against whatever they

26:07

thought was gonna happen or improve

26:09

their position with respect to the

26:11

Trump administration, they've really undermined their

26:13

position with respect to every other

26:15

power and authority and I don't.

26:17

quite know why they thought that

26:19

the you know when they run

26:21

the math on that that they're

26:24

going to be better off this

26:26

way and I think it's time

26:28

to maybe make them see they

26:30

are not in fact better off

26:32

this way. All right. I had

26:34

All right. I had this conversation with a friend this week, because they were like: oh, I'm so worried that Apple is still sticking to DEI, and so is Costco. And I was like: don't be. It's a numbers game, and clearly they've run the math and decided that visibly sticking to this stance and appealing to a customer base is going to work out better for them than anything the Trump administration could possibly do. Like, as Cathy pointed out, there's a cost-benefit analysis to this, and it is a little bit astonishing that a lot of these places aren't taking a look at how their public posturing is going to land with different state enforcement, or even international enforcement; because, I mean, the EU is not shy about regulation.

27:23

I have to say, there's a case to be made that EU regulation, especially their anti-tech regulation like this UK law, is harmful to the EU in terms of being an innovator. I think the EU is behind. Let's go back to England for a moment. The British public, 87% of the British public, would back a law requiring AI developers to prove their systems are safe before release, which, by the way, eliminates every single public AI that's out there, because all of them have been jailbroken. Sixty percent are in favor of outlawing the development of smarter-than-human AI models. Good luck. Well, good luck in developing the models, too, I would say. Well, we can dispute that. I think it's pretty clear to me that by the end of this year, AI coding will be equivalent to the best coders out there right now. It will replace coders; coders will become something else. We'll see. You know, we'll see. That's the low-hanging fruit. I think there's a lot of things that AI can do better than a human. Maybe not in Britain.

28:29

Leo, when you launch the AI show, feel free to be honest. We did; we have. We took This Week in Google and renamed it Intelligent Machines, and every week we're going to have a guest from the AI world. This week we had a guy who was the head of sales at OpenAI. He was very aggressively in favor of AI and AGI and all of that. We're going to have, in a couple of weeks, Ray Kurzweil, who of course wrote The Age of Intelligent Machines and has been saying the singularity is near for decades. I do not disagree with him at this point. I think we've seen amazing progress toward human-level intelligence, don't you think? And the thing about the AI stuff is, once it gets to the point where it can recreate itself and create better AI from AI... That's why an AI coder might be the first step towards a singularity, because then it can write its own code, and it can iterate much faster.

29:29

Yeah, the issue with all this is China, right? Because the EU, the UK, can talk about: we need restrictions, we need laws, we need to slow this down, we've got to monitor it. Fine. Sounds great. The United States is a little bit in between there, because we stand the most to, you know, gain from having unrestricted AI. But China's just out there doing whatever it wants. And that's the problem. If you're going to have a system where everybody is like, we need to have rules and regulation, and China is like, go ahead, I can tell you, they're just going to go ahead and do their thing. They're going to end up winning (whatever winning is here, you know, the more advanced AI system); they're just going to get there first. And it's considered to be like the next nuclear arms race, right? Who's going to get the best AI? Up to this point, the United States has been the leader in tech, and Europe is far behind. And China is rivaling the United States at this point; they have some tricks up their sleeves, apparently. Now, we can debate about how much of that was stolen or, you know, some smoke and mirrors, but at the same time, they created a pretty impressive generative AI system pretty quickly. Well, that's a point. And it's a very, very attractive destination for researchers at this point, since we in the US are currently embracing an administration and a culture of devaluing education and research investment. So where do you think the brains to develop this stuff are going to go?

31:01

Well, I'm realizing there is a ceiling on what AI can do. And I really want to push back on the term "best AI." You know, it could produce certain amounts of code that will run sort of efficiently in a couple of contexts, but that doesn't make it best if the UI is terrible, if it doesn't understand all sorts of contextual parameters, if it doesn't understand how it ultimately interfaces with human behavior. Like, you need the humans to be able to sort of direct it. We've kind of treated it as a magic wand: you just press this little button, and all the problems that human beings couldn't quite solve are going to magically be solved by a computer. And it can do some stuff, but it can't solve them, because where is it even going to have learned the understanding that billions of people haven't managed to figure out? So I think there's a ceiling.

31:58

Who makes the most profit from it? Right now the systems aren't very profitable, right? But that's what's interesting about China, right? What's been growing is enterprise AI: there's been two years' worth of dumping all of this money and all these resources into copilots and AI assistants and automated workflows and now so-called agentic AI, but no one's buying, and workers are super resistant to using AI. So that's another part of the story, too: it's not doing what people have been promised.

32:28

If you're not understanding the problems to solve, you're not going to solve them. And I shouldn't say none, because there are certain AI applications that have taken the trouble to figure out: here is the problem, and here's how AI solves it. I don't want to stand in the way of those. But that is not the buzz. That is not the thing people are talking about. That is not where all the money is getting dumped, and all the, essentially, hysteria. We are growing the tulip bulbs around AI, which is not designed to be problem-specific; it is designed to be magic-wand-specific. And if you think you've conjured one, I think you're going to be wrong. You're not going to solve the problems, because you never understood the problems you were supposed to be solving in the first place.

33:21

The framing I'd put on AI is Bloom's taxonomy of learning, where you move from simply being able to remember and retain a piece of knowledge, to being able to apply that knowledge in contextually appropriate situations, to doing more cognitively complex things which depend on application, synthesis, analysis, and contextual flexibility. And Cathy, I think we're in a little bit of agreement here: AI doesn't show a lot of ability to discern and adapt to specific contexts, or to even understand that problems exist in different contexts and therefore would have different solutions depending. It's super great for highly structured, automated tasks, or for queries where you need to find patterns, and looking at those patterns helps you find a solution. But with really complex things, I don't know that we're to the point where you have a set of artificial intelligence tools that are able to take a look at a problem like "how do we free up more nitrogen for growing agriculture again," understand the constituent parts, break it down, do the brute-force computing, and then put it all together.

34:29

And how do you abdicate so much control over our world to something that doesn't get hungry, doesn't get sad, doesn't grieve, doesn't love, doesn't have anything that is part of the human existence? It's going to be creating solutions where the solution may be offing, like, the entire population; but human beings, in theory, would react to that and say: we've got a problem here. It's interesting, that goes full circle back to our DEI conversation, except this time we're going to have to tell the AIs to start including us.

35:01

I need to take a break. We have a great panel and lots to talk about. Lisa Schmeiser is here, it's so nice to have you, from nojitter.com, where she's the editor-in-chief. Windowscentral.com is where you'll find Daniel Rubino, also editor-in-chief. Good to see you. Off the rails this week... I mean, no, what's a good way to say that you have just been knocking it out of the park, talking about what's going on with DOGE? We'll get to that in a little bit. Lots more to talk about. You're watching This Week in Tech.

35:39

Our show today is brought to you by ThreatLocker. Love these guys, because they are solving a massive issue with security. Imagine hardening your security and never having to worry about zero-day exploits or supply-chain attacks or ransomware ever again. That's what ThreatLocker can do. Worldwide companies like JetBlue trust ThreatLocker to secure their data and keep their business operations flying high, if you will. Now, look, every security company says: oh, you never have to worry about threats again. What makes ThreatLocker different? It's something called zero trust, and it really, really works. Imagine taking a proactive (and this is the key, these three words) deny-by-default approach to cybersecurity. What does that mean? It means every action is blocked, every process is blocked, every user is blocked, unless explicitly authorized by your team. It means nobody can get inside your network and do whatever they want; they have to be approved. ThreatLocker helps you do it, and (this is so important nowadays) gives you a full audit of every action. That's great, of course, for compliance, but it also helps you with risk management, because if something does happen, you know exactly where, when, and how. And ThreatLocker's 24/7 US-based support team fully supports you in getting the whole thing set up, from onboarding and beyond. This is so key: you can stop the exploitation of trusted applications within your organization. Keep your business secure; keep it secured from ransomware. Organizations across any industry will benefit from ThreatLocker's Ring Fencing: it isolates critical and trusted applications from unintended uses or weaponization, and limits attackers' lateral movement within the network. And by the way, it works on heterogeneous networks as well; it works for Macs and PCs. And it's very affordable, that's the other thing. You might say, well, that sounds like big enterprise, big iron. No: every business needs ThreatLocker. Get unprecedented visibility and control of your cybersecurity, quickly, easily, and extremely cost-effectively, with ThreatLocker's Zero Trust Endpoint Protection Platform. You need this. Visit threatlocker.com; you get a 30-day trial, and you can learn more about how ThreatLocker can mitigate even zero-days, even completely unknown threats, and help you with compliance. threatlocker.com. We really like these guys, and I think you need to visit threatlocker.com. And if you're going to Zero Trust World, I hope you have a great time in Orlando this week; I wish I could be there. They're putting on a big conference with lots of great and interesting stuff. You can find out more about it at Zero Trust World. ThreatLocker. All right, thank you, ThreatLocker, for supporting TWiT.

38:32

Super Bowl later today. I thought this was really interesting; I tried to sign up. So, as you may or may not know, only 38 states allow sports gambling, betting on sports. We had a referendum in California, and it failed. I thought it would win for sure, because everybody likes to bet on games, and if you see the ads on the Super Bowl today, and every football game, every sporting event, it looks like DraftKings is legal everywhere. Not exactly. You have to go to Las Vegas, and I think it was actually Las Vegas that got it overturned in California; they want you to come there. Well, the crypto folks have found a way around it. I don't know how legal this is. Crypto.com and Kalshi have both done an end-around on the regulations to allow you to bet on the Super Bowl, because they call it a trade. Robinhood thought about doing this and decided not to. Crypto.com, a Singapore-based company, was inviting users in the US to, quote, "trade their own prediction" (no, the word "betting" is never used) on sports events, including who will win the Super Bowl. So what happens is, crypto.com has effectively created a contract, a swap contract. There's a market for yes or no positions on the outcome of the NFL playoffs and college football bowl games. For every yes, there's a corresponding no, and prices constantly move. Sounds a little bit like betting. In January, for example, a yes for the Kansas City Chiefs to win this afternoon's Super Bowl cost $56.75, and if they do win, you get $100. The no side cost $46.75, and you get $100 if the Eagles win. And crypto.com gets the vig (I'm sorry, that sounds like a betting term); they get $3.50. Fees. It's just fees.
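For listeners who want to check the math, here is a minimal back-of-the-envelope sketch of the event-contract pricing described above. The prices are just the episode's January example; this is not crypto.com's actual API or fee schedule.

```python
# Event-contract arithmetic from the episode's example (illustrative only).
PAYOUT = 100.00      # each contract settles at $100 if it hits

yes_price = 56.75    # "Chiefs win" contract, January price
no_price = 46.75     # the corresponding "no" contract

# A binary contract's price divided by its payout is roughly the
# market's implied probability of that outcome.
implied_yes = yes_price / PAYOUT   # ~0.57
implied_no = no_price / PAYOUT     # ~0.47

# The implied probabilities sum to more than 1; the excess is the house's
# cut per matched yes/no pair -- the "vig," or in crypto.com's language, "fees."
house_take = yes_price + no_price - PAYOUT   # $3.50

print(f"implied P(Chiefs win) = {implied_yes:.2%}")
print(f"implied P(Eagles win) = {implied_no:.2%}")
print(f"house take per matched pair: ${house_take:.2f}")
```

That $3.50 on every matched $100 pair is the same structural edge a sportsbook takes, whatever the product is called.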

40:24

I have two comments to make. One: a lot of these "make all this money with this one weird trick" things... all these people keep thinking they've discovered something, and it turns out to be securities fraud. Not anymore, not anymore; we got rid of Gary Gensler. I don't think the SEC is going to get involved. I don't know; this is what they're doing, right? They started this in December, right after the election. Right, well, so maybe nobody enforces it, but basically there's definitely this ethos of, like: oh, look, I have innovated a solution. No, you've basically figured out something that already was against the law; you just didn't know what the law was. Well, I think that's why Robinhood decided not to, right? That's possible. Maybe it's more of a PR hit, because I honestly don't think that the SEC as now constituted is going to do anything. Well, that's a separate problem. In fact, I'm wondering if I should even file taxes this year. Yeah, that's a separate problem, too. But I had that thought, and again, what's going to happen if I don't? A little too soon to make that call. Maybe wait a year? Okay, I'll try next... No, no, I mean, wait even a week and see where we are. Will there be an IRS on April 15th? It's unknown at this point. Do we have a Treasury Department, I think, is the bigger question. Well, the IRS is part of the Treasury, right? Well, they give the money to the Treasury. What department is the IRS under? It's probably Treasury, but I don't know; I didn't realize I needed to know. Anyway, that was one comment. I'm sure we'll swing around back to that. But the other comment (I'll definitely get back to that; you notice I'm burying it a little bit, because I don't want people to tune out): it's the Back to the Future one where they go off the rails. It's dark Biff Tannen, and he's basically made all this money illicitly because he was... Biff might be the CEO of crypto.com, I don't know.

42:49

don't offer sports betting products. We

42:51

offer tradable cryptocurrency commodities and tradable

42:53

financial products which differ from products

42:56

offered by sports books. That's

42:59

my story and I'm sticking to

43:01

it. By the way, this is

43:03

the number one betting day on

43:06

the US calendar. Estimated $1.5 billion

43:08

in legal wagers on the Super

43:10

Bowl. The Supreme Court struck down

43:13

a federal prohibition, you might remember,

43:15

in 2018, creating this $15 billion

43:17

industry. 38 states in Washington DC

43:20

allow it. And Americans now legally

43:22

wage your more than $12 billion

43:24

a month. Yeah, the social effects

43:26

Yeah, the social effects are really disruptive. I have to think that gambling addiction is going to become a real problem. And you know what is interesting? This came from England, from the richest woman in England. I've talked about this before. She got rich because her dad owned... remember, in England you have these betting shops, right? Bookie shops, where you can go and bet. Her dad owned a bunch of them, and she decided: well, let's go digital. But she did something super smart that ended up making her the richest woman in England, richer than J.K. Rowling or the Queen; I guess the Queen is no longer with us, but the King. Denise Coates is trying to get into the United States, but she created the prop bet. And you've seen, if you watch the ads on the football games, you'll see DraftKings and all these others, Kevin Hart talking about how you can bet on whether that kick is going to be good, you can bet on whether the snow will hit the ground. You can bet on almost anything, instantly, and get an instant reward, and to me this is a recipe for disaster. That's always been the story of somebody with a real gambling problem: they'll bet on anything, anything, anything at all.

44:59

You know... well, look at the positive side: the younger generation has no money to really bet, so... They're doing it on credit and debt. That's even worse. Oh my God. Do you want to see the graph? This is the graph of legal wagering. Yeah, that's a big problem. There are very strong whiffs of things going very wrong with the sports competitions themselves. Yeah. You know, my son, who is a huge fan of the Green Bay Packers, is not going to watch the Super Bowl today. He says it's all rigged. Now, whether it is or not, the fact is that the NFL profits from sports gambling. Remember, Pete Rose never got into the Hall of Fame because he bet on baseball. I think your son's a really handy bellwether for how gambling is just going to undermine the nature of sports fandom, and further separate it from being a community-oriented thing into a come-on-make-me-some-money thing. Well, and that's something to worry about. One of the things I was also thinking, with that prop-bet thing: you already have problems even with the stock markets, which in their own way are their own form of betting, but at least there's some form of asset underneath it. You have issues where the speed at which the information can get exchanged and the bet can be placed has distorting effects, and there is money to be made by arbitraging the advantages that somebody has based on the speed at which they're able to get things done; and we're talking speed in terms of split seconds. So for all of these things, it's not that you're truly dealing with natural odds; you're dealing with something where those odds are flexing based on built-in advantages that people aren't calculating for and are not getting calculated for. So they aren't fair bets.

47:01

Well, I mean, that's also the stock market. It is, but it becomes a big problem, and it becomes a problem that the stock market actually has to try to solve for. Well, that's true. And I'm just saying these companies are regulated by the commodities... the CFTC; what does that stand for? The Commodity Futures Trading Commission, which is a separate governmental organization, which, by the way, Elon Musk is trying to dissolve right now. The Commodity Futures Trading Commission did actually sign off on what they call event contracts, allowing people to trade on whether Taylor Swift would announce her next album, or whether a new movie would tank at the box office. Crypto.com submitted filings to the CFTC. They didn't ask permission, by the way; it doesn't work that way. They informed the commission of their intent, and then there's a kind of fast-track process: these companies get to self-certify their derivatives contracts, and then the CFTC, if it decided to, could shut it down. Nobody did, of course. The firm filed its paperwork five days before Christmas, probably thinking, you know, nobody's going to be there. Actually, it was the day before the threat of a big government shutdown. "They picked the timing pretty wisely," said Peter Malashev, a partner at a DC law firm: right before Christmas, after the elections.

48:30

right before Christmas after the elections

48:32

So anyway I thought well, let

48:35

me see if I can

48:37

get a crypto.com account quick so

48:39

that I can just you know

48:41

show you what it looks like

48:44

to Trade on the result

48:46

of the Super Bowl unfortunately after

48:48

submitting a picture of my driver's

48:51

license, a video of me turning

48:53

my head left and right,

48:55

giving him my cell phone number,

48:58

they said, we'll get back to

49:00

you in one to five business

49:02

days. So I don't think I'm

49:05

going to get to bed on

49:07

the Super Bowl today. I just

49:10

thought it's got that. So there's

49:12

some security. I'm safe. Yeah. Hey,

49:14

Hey, good news: there are going to be some good ads on the game, the big game. OpenAI is doing its first ad; there were, I think, three different AI ads last year, including Anthropic, but no OpenAI. Google, of course, talked about its AI. This year, Google was going to run a Super Bowl ad (I think they still are) that said that Gouda was the world's most popular cheese, according to Gemini. It is not. It does not make up 50 to 60% of the world's cheese consumption. That is not true. Next it's going to tell you how many rocks to eat, or that glue belongs on pizza. Right, this is almost the glue thing all over again.

50:01

almost glue anywhere. Google. has edited

50:03

Gemini's AI response in a

50:05

Super Bowl commercial, this is according

50:08

to the verge, to remove that

50:10

incorrect statistic, the ad shows the

50:12

small business owner using Gemini

50:14

to write a website description about

50:17

Gouda. In the edited video, Gemini's

50:19

response now skips over the specifics

50:22

and just says Gouda is one

50:24

of the most popular cheeses in

50:26

the world. At this point I

50:29

doubt that, but yeah. Well, you

50:31

know why? Because you know those

50:34

little red wax covered baby

50:36

bells? That's Gouda. Ah, well, there

50:38

you go. And it's delicious, by

50:40

the way. And you're accepting sponsorship

50:43

offers from Baby Bell. I

50:45

would love some cheese. I'll take

50:47

my money in cheese. Google Cloud Apps

50:50

president Jerry Dischler said, it's not

50:52

a hallucination. That's not, it

50:54

reminds me, remember Martin Short used

50:56

to do that lying lawyer, he

50:59

was smoking all the time, he

51:01

had this sweaty lip, I knew,

51:04

I knew that, I always knew,

51:06

it's, it's not a hallucination, it's

51:08

grounded in the web. Apparently, this

51:11

came from a website called cheese.com,

51:13

which is filled with, according

51:15

to the verge, what seems to

51:18

be SEO optimized blogs. Gouda is,

51:20

according to the E.V. Baker Professor

51:22

of Agricultural Economics at Cornell,

51:24

most assuredly not the most widely

51:27

consumed cheese in the world. So

51:29

they've edited it out. So you

51:32

will see the ad on

51:34

the Super Bowl if you watch

51:36

that, but no. In fact, that

51:38

business owner was real, the website

51:41

was real, and he has now

51:43

removed the Gouda claim as well.

51:46

The camera wasn't on me,

51:48

but I'm sure our listeners could

51:50

hear my eyes roll when you

51:53

were talking about this and

51:55

the SEO-optimized blogs. Yeah,

51:57

well, Yeah, cheese.com, apparently not the

52:00

cheese authority you might have thought.

52:02

A disclaimer beneath Gemini's response

52:04

says, it's not intended to be

52:06

factual. So maybe you're right about

52:09

AI. Why would you? Reminds me,

52:11

the Humane. Remember, Humane had

52:13

their Pin. And they made

52:16

a prediction, it had something

52:18

to do with the stars, or

52:21

an eclipse or something,

52:23

and, and,

52:25

oh, wrong. Well, right. Google rolled

52:28

out, The demo had incorrect

52:30

facts. Gemini hallucinated in

52:33

its rollout. Yeah, Gemini

52:35

2 just came out this

52:37

week and people are saying

52:39

great things about it. I don't

52:41

know. See, I just, I don't want

52:43

to poo-poo you guys. We will

52:45

find out, but I honestly think

52:48

that we are seeing such

52:50

amazing progress with AI that

52:52

it is almost inevitable that

52:55

we are going to see

52:57

human level intelligence in the

52:59

next few years, maybe sooner

53:01

or later. I'm not entirely sure

53:03

what that something is. Oh, I

53:05

could be wrong. Something we should

53:07

actually want. Yeah, I can be

53:10

as confidently wrong as people are.

53:12

From a pure computing

53:14

standpoint, this stuff is cool, like,

53:16

and it is amazing; we've

53:18

not seen anything like this. But

53:21

in terms of, is it

53:23

anything that can substitute for intelligence

53:25

is it anything that is useful

53:27

valuable, and also not dangerous?

53:29

I would not put you know

53:31

all my chips on that prediction

53:33

I'm not putting all my chips on

53:35

it but I am I feel fairly you

53:38

know. Whereas now, I wouldn't put any

53:40

chips on it. I keep showing this. This

53:42

was announced at CES:

53:44

this is the Bee computer. It's recording

53:46

this conversation and everything that happens to

53:48

me. And then it gives me an

53:50

AI summary at the end of every

53:52

day with action items and all that

53:55

stuff. And yeah, it makes a lot of

53:57

dumb mistakes when I was watching a movie

53:59

the other night, and it thinks I'm

54:01

rehearsing for a role in Richard III.

54:03

So it's a little confused, because I'm

54:06

not. But there's some stuff in it

54:08

that's kind of amazing, and I feel

54:10

like it's making my journal for me.

54:12

I honestly wish I'd had this for

54:15

my whole life, because it would be

54:17

so cool. But there are mistakes in

54:19

it. You just, you have to, I

54:22

think you have to understand how to

54:24

use it, really. I don't mind it

54:26

as a tool that the humans wield.

54:29

What I bristle at is this idea

54:31

that it is going to be the

54:33

tool that can just have a certain

54:36

degree of autonomy that humans do not

54:38

need to wield it anymore. And especially

54:40

when it comes to atomic weapons. Well,

54:43

you also talk about transparency should be

54:45

part of that too, at least. with

54:47

a human workflow you do have certain

54:50

levels of built-in accountability either in an

54:52

implicit, social way or in an

54:54

explicit organizational way and one of the

54:57

things that's super disturbing about the way

54:59

we're mainstreaming AI utilization in general is

55:01

the whole, it's a black box,

55:04

don't you worry your pretty little monkey

55:06

heads about how it works right no

55:08

I agree with you but it's well

55:11

it's kind of a black box transparency

55:13

and accountability and not in a give

55:15

us your trade secrets way but rather

55:18

when you do have a screw up

55:20

like glue pizza, what are you doing

55:22

about this? How did it happen? How

55:25

can we be sure it will never

55:27

happen again? No, I think that's the

55:29

wrong attitude to be honest. I respectfully

55:32

disagree because that, first of all, you

55:34

can't fix that. It is a black

55:36

box even to the guys who are

55:39

writing this stuff. They don't know exactly

55:41

how it comes to these conclusions. And

55:43

I think AI safety is a mistake.

55:46

What we, you can't expect the, first

55:48

of all, we know it doesn't work.

55:50

Every AI has been jailbroken. We did

55:53

a whole show on it on Tuesday

55:55

with Steve Gibson. It's almost impossible to

55:57

have AI safety. What it requires is

56:00

human intelligence intermediating it. So I agree

56:02

with you, you should not give control

56:04

of our nuclear arsenal to an AI.

56:07

There's got to be a human in

56:09

between the AI and the launch button.

56:11

But I don't think you can make

56:14

the AI not make mistakes, not hallucinate,

56:16

or be safe. I think that that's-

56:18

There just needs to be a built-in

56:21

level of review and accountability. That's currently lacking.

56:23

Yeah. But I think what we have

56:25

to do is train humans, train ourselves

56:28

to use AI appropriately. I really like

56:30

the idea of DEI for AI, by

56:32

the way. I'm going to... Diversity, inclusion

56:35

for AI. It comes down to data

56:37

sets. Well, who's responsible for picking the

56:39

data sets for training? Right. And more

56:42

importantly, how is their work being checked?

56:44

But this was the breakthrough, I think,

56:46

that made this all possible, is that

56:49

we have... I remember 30 years ago,

56:51

I did stories on it. There was

56:53

a woman who was trying to create

56:56

an AI and had... an army of

56:58

100 stenographers typing in every fact she

57:00

could find. Well, in the intervening thirty

57:03

years, we did it for her, we

57:05

created the internet and put everything we

57:07

could think of into it, and then

57:10

the AI has access to it. So

57:12

I don't think it's a question of

57:14

data sets as much. I mean, it

57:17

might be for face recognition, things like

57:19

that. Generally, yeah, it's whatever they

57:24

can get their hands on. Right. Yeah,

57:26

yeah. Well, actually, that's one of our

57:28

stories: Meta, yeah, with Meta and

57:31

the books. Sure thing. So, Meta,

57:33

what's the story here? There's a

57:33

really really great book that I want

57:35

I want to recommend to anyone who

57:38

wants to talk about AI, called Code

57:40

Dependent, by Wired reporter Madhumita Murgia, and

57:42

she travels the country and the globe

57:45

and goes to different countries where AI

57:47

sweatshops are set up, where people are

57:49

specifically tagging images or specifically tagging different

57:52

pieces of data to train the AI

57:54

and she points out that

57:56

unless you have people who dig into

57:59

this, you have no idea what

58:01

data is being used to train. You

58:03

have no idea how it's being tagged,

58:06

you have no idea how it's being

58:08

structured, or if it can be even

58:10

used as we get to more and

58:13

more sophisticated iterations of AI, because structure

58:15

versus unstructured data. I think we've run

58:17

out of, I think we've actually, the

58:20

biggest problem is we've run out of

58:22

stuff to train AI with. Yeah. Well

58:24

then let's get to your next story,

58:27

which is a doozy. Oh my gosh.

58:29

Oh, we got big. You know what?

58:31

I'm, I, this is, this like show

58:34

is like a mountain. We've started at

58:36

the little tip of the iceberg. We're

58:38

getting to the middle. There's, there is

58:41

some stuff to talk about. What's going

58:43

on in Washington DC is somewhat shocking.

58:45

Kathy, you've written quite a bit about

58:48

that. We have lots more to talk

58:50

about. It's good to have you, Kathy

58:52

Gellis, writing for Techdirt. Read

58:55

her articles this

58:57

week. I'm amazed that you're not

58:59

like. bouncing off the walls here. I

59:02

am apoplectic. I'm just amazed that I

59:04

am still sitting here, but give it

59:06

time. I can see the apoplexy forming

59:09

also here from Windows Central Daniel Rubino

59:11

editor-in-chief. Always a pleasure to have you

59:13

on. Thank you, Daniel. And of course,

59:16

Lisa Schmeiser from NoJitter.com. Our show

59:18

today brought to you by Coda. Have

59:20

you used Coda? Coda is so cool.

59:23

turning your back of a napkin idea

59:25

into a billion dollar startup. You know,

59:27

have you ever thought about that? It's

59:30

going to take countless hours of collaboration,

59:32

of teamwork. And it's hard to build

59:34

that team. It's a team that's aligned

59:37

on everything from values to workflow. But

59:39

there is a tool that helps. It's

59:41

exactly what Coda was made to do.

59:44

Coda, it's an all-in-one collaborative workspace,

59:46

started literally itself as a napkin sketch,

59:48

and now in the five years since

59:51

launching in beta, Coda has helped 50,000

59:53

teams all over the world get on

59:55

the same page. With Coda, you get

59:58

the flexibility. Actually, let me give you

1:00:00

the address right now, because I know

1:00:02

you want to just look at it,

1:00:05

and you can't. Go to coda.io,

1:00:07

coda.io/Twitter. And while you're looking,

1:00:09

let me explain. With Coda,

1:00:11

you get the flexibility of

1:00:13

docs, the structure of spreadsheets,

1:00:15

the power of applications, and

1:00:17

the intelligence of AI, all

1:00:19

together in a seamless workspace

1:00:21

built for enterprise. It

1:00:23

facilitates deeper collaboration, quicker

1:00:26

creativity, gives you more time

1:00:28

to build. You've got to take a

1:00:30

look at this. It's really mind-blowing. It's

1:00:32

incredible. If you're a startup team and

1:00:34

you're looking to increase alignment and agility,

1:00:36

it's a really hard thing to do,

1:00:38

especially, I mean, you can say return

1:00:40

to office, but frankly, most of us

1:00:42

are still remote, right? Coda brings you

1:00:45

all together: instead of a physical

1:00:47

office, you have a virtual workspace

1:00:49

that everybody is involved with, is

1:00:51

attached to, is

1:00:53

utilizing. Coda can help you move

1:00:55

from planning to

1:00:57

execution in record time. You've got

1:00:59

to try it for yourself. Here's

1:01:01

the good news. Now you're

1:01:03

at coda.io/Twitter, right? You're

1:01:05

looking, this is interesting. How

1:01:08

about six months free? Six months

1:01:10

free of the team plan for

1:01:12

startups. What about that? Six

1:01:15

months, half a year, free. C-O-D-A dot I-O

1:01:17

slash Twitter. That's enough to get

1:01:19

that napkin pretty far off

1:01:21

along the line to a

1:01:23

startup, to a unicorn. Get

1:01:25

started for free, get six

1:01:27

free months of the team

1:01:29

plan. That's the best offer

1:01:31

ever. coda.io/Twitter. We love

1:01:33

these guys. Thank you, Coda,

1:01:35

for supporting us. Thank you,

1:01:37

Coda. So let's talk about

1:01:39

this meta story. I'm not

1:01:41

really kind of sure what

1:01:43

happened. Of course, meta, like

1:01:45

every other AI company is

1:01:47

trying to train on as much

1:01:49

data as possible. There is a

1:01:51

rich trove of data that isn't

1:01:54

in fact on the internet. It's

1:01:56

called books, and if

1:01:58

you could ingest a library,

1:02:00

right, it'd be good for

1:02:02

your AI to train on

1:02:04

every possible book out there

1:02:07

and of course authors and

1:02:09

publishers are not too happy about it.

1:02:11

There is a copyright case against

1:02:14

Meta, raised by book

1:02:14

authors. They allege Meta illegally

1:02:18

trained its AI models not

1:02:20

on books it bought but

1:02:22

on pirated books. Last month,

1:02:25

Meta admitted to torrenting a controversial

1:02:27

data set known as LibGen, which

1:02:29

has tens of millions of

1:02:32

pirated books. We kind of

1:02:34

knew this because you were

1:02:36

able to find bits and

1:02:39

pieces of those books in

1:02:41

Meta's Llama. Yesterday, Meta's,

1:02:43

well actually it was I

1:02:46

think Thursday, meta's unredacted emails

1:02:48

were made public for the

1:02:51

first time. The evidence is

1:02:53

meta torrented 81.7 terabytes

1:02:56

of data through the site

1:02:58

Anna's Archive, including 35.7

1:03:00

terabytes of data from Z

1:03:03

Library and LibGen, the court

1:03:05

filing said, Meta also previously

1:03:07

torrented 80.6 terabytes of data

1:03:10

from Libgen. All onto a

1:03:12

laptop! Torrenting from a corporate

1:03:15

laptop doesn't feel right, says

1:03:17

Nikolay Bashlykov, a Meta research

1:03:20

engineer, writing in an April

1:03:22

2023 message and

1:03:25

adding a smiley emoji. In

1:03:27

the same message he expressed

1:03:30

concern about using Meta IP

1:03:32

addresses to load through torrents

1:03:34

pirated content. By September, Bashlykov

1:03:37

dropped the emojis, consulting

1:03:39

the legal team directly and

1:03:41

emphasizing in an email that,

1:03:43

quote, using torrents

1:03:46

would entail

1:03:48

seeding

1:03:50

the files.

1:03:52

That is, sharing the content outside. Now, you

1:03:54

don't have to do that, but that's kind

1:03:57

of the, you know, polite thing to do.

1:03:59

This could be... He said, he wrote,

1:04:01

legally not okay. Man, this is why

1:04:03

companies don't want you to put this

1:04:05

stuff in emails. Oh,

1:04:08

emails prove that Meta knew

1:04:10

it was illegal. Bashlykov's

1:04:14

warnings landed on deaf ears. The

1:04:18

authors who were suing say the evidence

1:04:20

showed Meta chose instead to hide its torrenting

1:04:22

as best it could while downloading and

1:04:24

seeding. They did in fact say, you don't

1:04:26

have to seed, by the way, if

1:04:28

you download. But they did, seeding terabytes of

1:04:30

data from multiple shadow libraries as recently

1:04:32

as April of last year. They

1:04:36

didn't use Facebook's servers while

1:04:38

downloading the dataset to, quote,

1:04:40

avoid the risk of anyone

1:04:42

tracing the seeder downloader. This

1:04:44

is from an internal message

1:04:46

from Meta researcher, Frank Zhang.

1:04:50

He described the work as being in stealth

1:04:52

mode. They modified

1:04:54

settings so that the smallest amount

1:04:56

of seeding possible could occur.
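A quick aside on the mechanics being described: whether a BitTorrent client seeds at all is just a setting. Here is a minimal sketch of the "smallest amount of seeding possible" idea, assuming Transmission's settings.json location and its ratio-limit key names (other clients expose the same knob under different names, so verify against your client's docs):

```python
import json
from pathlib import Path

# A sketch of the "don't seed" checkbox: stop uploading as soon as the
# upload/download ratio reaches 0, i.e. immediately on completion.
# Assumes Transmission's settings.json location and key names.
settings_path = Path.home() / ".config" / "transmission-daemon" / "settings.json"

settings = json.loads(settings_path.read_text())
settings["ratio-limit"] = 0             # target upload/download ratio
settings["ratio-limit-enabled"] = True  # enforce the ratio limit
settings_path.write_text(json.dumps(settings, indent=4, sort_keys=True))
```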

1:04:59

Oh, well. I

1:05:02

think they're busted. Mark

1:05:04

Zuckerberg said, I didn't do it.

1:05:06

I didn't know anything about it.

1:05:08

Come on, Mark, where's that masculine

1:05:10

energy to meet the

1:05:12

moment. He claimed to have no

1:05:15

involvement in the decisions to use

1:05:17

libgen to train AI models. Well,

1:05:19

it wouldn't really matter, I think,

1:05:21

whether he did or he didn't.

1:05:23

He is the CEO. He's so much

1:05:25

the person of the, there's some

1:05:27

open questions whether the persona of

1:05:29

the company really can be separate

1:05:31

from some of its leaders, especially

1:05:33

when their personal actions are so

1:05:36

closely tied to them in a

1:05:38

way that corporate structures are not

1:05:40

really operable. I think for him,

1:05:42

he's in better shape than, say,

1:05:44

Musk up the road, but that

1:05:46

raises a question. I don't think

1:05:48

somebody will, I'm sure, try to

1:05:50

pick at that. I have a

1:05:52

couple of concerns about the story

1:05:54

and things that are in the

1:05:57

story. This is, by the way,

1:05:59

I'll give credit to Ashley Belanger,

1:06:01

senior policy reporter at Ars Technica, writing this story.

1:06:03

Go ahead. I, so if they seeded, I

1:06:05

do kind of face palm at them,

1:06:07

but a couple of, I don't

1:06:09

necessarily have the same concern if

1:06:11

they downloaded because I don't think

1:06:13

there should be a distinction... Is

1:06:16

that the right to read you've

1:06:18

spoken about? It would still go into

1:06:20

that, like if you were allowed to...

1:06:22

Does it change it that it's pirated

1:06:24

material? No, because it... I don't think

1:06:27

it's something that it essentially would be

1:06:29

that the copyright holder could say no

1:06:31

you can't read this book unless you

1:06:33

paid but the copyright really only applies

1:06:36

to making copies of it itself. Maybe

1:06:38

it kind of is. An AI ingesting

1:06:40

something, making a copy. Isn't that making

1:06:42

a copy? I don't I don't think so

1:06:44

and all and the lawyers who are defending

1:06:47

these companies are very clear that what you

1:06:49

really have is not making a copy you

1:06:51

really have it learning and just storing

1:06:53

it, no differently than me reading and remembering something

1:06:55

I read from a book. I'm not

1:06:57

making a copy of my brain. Yeah,

1:06:59

the lawyers are arguing that, but I

1:07:01

think based on how the technology actually

1:07:04

behaves, that it's really more of a

1:07:06

learning function, in which case, if you're

1:07:08

trying to create an artificial intelligence, it's

1:07:10

going to function and learn the way

1:07:12

a human intelligence would and store information

1:07:15

that it's gleaned in some way that

1:07:17

it can use it again. So in

1:07:19

that sense, I think the downloading is

1:07:21

not particularly dispositive. You know, I take,

1:07:23

I don't, I don't represent meta so

1:07:26

I can throw stones at it as

1:07:28

much as I want, but. Oh yeah, my

1:07:30

legal senses are like, oh, seriously,

1:07:32

dude, like, okay, fine, if I

1:07:34

had to defend it, I'd defend

1:07:36

it, but I think it's a harder,

1:07:39

it's a harder lift. I also, I

1:07:41

have, I have torrented for years,

1:07:43

you don't have to seed. There's

1:07:46

a checkbox in the client. As

1:07:48

the lawyer, I don't want to

1:07:50

blur these two things together. I

1:07:53

think the legal analysis of whether

1:07:55

downloading was okay is a different

1:07:57

legal analysis from whether seeding was. What

1:08:00

bothers me about this case, there's some

1:08:02

earlier reporting where the, um, some of

1:08:04

these emails had been privileged and the

1:08:07

privilege was essentially punctured through assertions of

1:08:09

the crime fraud exception to attorney client

1:08:11

privilege and I'm a little bit uncomfortable

1:08:13

with and maybe more than a little,

1:08:16

but I wish there were more details

1:08:18

about that happening. You're an attorney and

1:08:20

of course, you're thinking like an attorney.

1:08:22

And that is important in the trial,

1:08:25

but it's not important to our discussion,

1:08:27

right? Well, but it kind of is.

1:08:29

Well, no, let's stipulate that they did,

1:08:31

in fact, BitTorrent a bunch of

1:08:34

pirated books. I mean, the discussion we're

1:08:36

having is, is that okay or not?

1:08:38

You say it's okay. No, I think

1:08:40

it does matter, because the whole way

1:08:43

that you go about puncturing privilege is a

1:08:45

really. No, I understand. From a legal

1:08:47

point of view, that's important. But it

1:08:49

doesn't matter. We can still talk about

1:08:52

it. But it's predicated on the idea

1:08:54

that it is so wrong what was

1:08:56

being protected by this privilege that you

1:08:59

could then have this. That could be

1:09:01

the consequence of it. And I'm thinking

1:09:03

that I'm not sure it is wrong

1:09:05

enough that it should have had that

1:09:08

collateral consequence. Well, it may get them

1:09:10

off the hook. We're not debating whether

1:09:12

they should be what should happen to

1:09:14

the trial. We're just talking about the

1:09:17

fact that they did it and whether

1:09:19

that's okay. Do you think it's okay,

1:09:21

Lisa? You know, I

1:09:23

am just going to be quiet

1:09:25

and let the lawyer and let

1:09:27

the Leo talk to each other

1:09:29

about it. How about you Daniel?

1:09:31

No, this is this is a

1:09:33

real lesson for me. This is

1:09:35

this is a master class in

1:09:37

legal excellence. I'm just here to

1:09:39

learn. Yeah, I've learned from Kathy

1:09:41

about this so-called right to read,

1:09:43

which I think is very interesting.

1:09:45

And if you say an AI

1:09:47

is is not copying but learning

1:09:49

from something, that's something we do.

1:09:51

Well, Kathy, is it illegal for

1:09:53

me to read a pirated book?

1:09:55

I don't know what right the

1:09:57

copyright holder would be able to

1:09:59

assert against. you. They might try

1:10:01

to say that your personal... They

1:10:03

could go after the piracy site.

1:10:05

Yeah, I think a lot of

1:10:07

the file sharing litigation originally had

1:10:09

been more that making available was

1:10:12

the thing that stepped on the

1:10:14

copyright. So going after the reading

1:10:16

is a lot harder. Right. They

1:10:18

never went after people for listening.

1:10:20

Right. And it's a much harder

1:10:22

reach to be able to do

1:10:24

that, because that's not

1:10:26

controlling the copying in the same

1:10:28

way. Right. It's more... With

1:10:30

video games, everything that was torrented back in the

1:10:32

day, they never went after the people

1:10:34

who were just downloading it. And,

1:10:36

you know, getting back to Leo,

1:10:38

you have pointed out the

1:10:40

seeding part. That's the uncomfortable issue,

1:10:42

right? It's just like your ISP,

1:10:44

if it catches you using bit

1:10:46

torrent, it's not so much because

1:10:48

you're downloading, it's because you're seeding

1:10:50

and you're sharing copyrighted material.

1:10:52

That's the illegality. So when it

1:10:54

comes to this issue here with

1:10:56

meta and Facebook and books, it's

1:10:58

the seeding that feels a little

1:11:00

uncomfortable that they're doing, but the

1:11:02

ability to, and I think the

1:11:04

courts have very kind of upheld

1:11:06

this a little bit, just came

1:11:08

back to that, you know, ability

1:11:10

to read. It's true. AI. If

1:11:12

you could ask AI, like co-pilot,

1:11:14

like co-pilot or Gemini, or Gemini,

1:11:16

be like, you know, you know,

1:11:18

print up the entire book for

1:11:20

Stephen King, his latest book, and

1:11:22

it gave you the entire thing.

1:11:24

Okay, that would be an issue,

1:11:26

right? Because that would be then

1:11:28

distributing an entire book for free,

1:11:30

you know. But it

1:11:32

doesn't do that. I mean, really,

1:11:34

that a lot of... New York

1:11:36

Times says that OpenAI

1:11:38

did in fact regurgitate New York

1:11:40

Times articles. But in order to

1:11:42

do that, they had to jump

1:11:44

through hoops. They had to paste in

1:11:46

the first two paragraphs and say,

1:11:48

what would the next paragraphs

1:11:50

be? I'm not convinced that

1:11:52

AI is there to, or is

1:11:54

even able to make, you know,

1:11:56

you can't get War and Peace

1:11:58

out of an AI. It's also

1:12:00

worth talking a little, well actually,

1:12:02

I guess it was, the copyright

1:12:04

office came out with a study

1:12:06

on its second level of study

1:12:08

on AI and where copyright law

1:12:10

needs to be, although I think,

1:12:12

actually the one that just came

1:12:14

out wasn't on liability for output,

1:12:16

I think it was based on

1:12:18

potential protection available for output. And

1:12:20

it basically said, it's gonna be

1:12:22

case specific and you'll have to

1:12:25

look at how much originality went

1:12:27

into causing it to generate something.

1:12:29

Just fascinating stuff. By the way,

1:12:31

Amazon is going to have a

1:12:33

big event revealing something we have

1:12:35

all been waiting for. February 26th,

1:12:37

you're gonna see Panos Panay,

1:12:39

the former beloved devices guy from

1:12:41

Microsoft, show the new Amazon echoes

1:12:43

with AI, I think, right? This

1:12:45

is, Panos, how many people has

1:12:47

Panos stolen from Microsoft? Didn't more

1:12:49

people follow him over to Amazon?

1:12:51

Yeah. Ralf Groene recently went over

1:12:53

there, their head designer. So we

1:12:55

should see more German brutalist designs

1:12:57

for Alexa devices, which I'm all

1:12:59

for. He'd make an Echo Show

1:13:01

square. It's like, how is that

1:13:03

bad? That sounds pretty awesome. It'd

1:13:05

be three by two, because

1:13:07

that's the proportion. Awesome designer. Yeah,

1:13:09

I love that Surface look. I

1:13:11

think they're great. Yeah, this is

1:13:13

interesting only because Panos Panay and

1:13:15

Ralf Groene are now part

1:13:17

of that division because they do

1:13:19

have some amazing ideas and they

1:13:21

tend to be very innovative and

1:13:23

if anything you know Amazon and

1:13:25

Alexa needs a real kick in

1:13:27

the pants because they have some

1:13:29

real issues their products are fine

1:13:31

but the whole point of theory

1:13:33

which isn't saying much yeah which

1:13:35

isn't so much, but the whole

1:13:37

point was to get you to

1:13:39

use Alexa to buy more stuff

1:13:41

from Amazon or supposed to be

1:13:43

a shopping system. Everybody just ends

1:13:45

up using it as a timer

1:13:47

to play music and they're losing

1:13:49

money on all the hardware. So

1:13:51

there's no reason to use it.

1:13:53

Billions. No reason to use it. Yeah.

1:13:55

They've lost, I think,

1:13:57

10 billion or

1:13:59

something over the years. Yeah. So they

1:14:01

got to figure out how to

1:14:03

make this interesting enough,

1:14:05

especially to charge for it.

1:14:07

But they've explored charging people $5

1:14:09

to $10 a month for that.

1:14:11

That was the rumor. Yeah. Yeah.

1:14:13

So like, they got to monetize

1:14:15

this, right? So how useful is

1:14:17

this going to be in your

1:14:19

house versus what's on your smartphone?

1:14:21

Which is real, you know, that's

1:14:23

where the real war has been.

1:14:25

But if Android has theirs, and

1:14:27

of course Apple has

1:14:29

theirs, but there's no room there

1:14:31

for Amazon, just like there was

1:14:33

no room for Microsoft,

1:14:33

anyways. I won't pay for a smarter

1:14:35

Amazon Echo. I would pay

1:14:38

zero dollars for an Echo because

1:14:42

I don't want an echo. You

1:14:44

don't have any echoes anyway. I

1:14:46

don't have any smart home devices

1:14:48

because I don't trust the companies

1:14:50

enough in terms of protecting my

1:14:52

data or respecting my privacy. So

1:14:54

every room in my house has

1:14:56

a Siri, has an echo, and

1:14:58

has a Google voice assistant. Every

1:15:00

room in my house. And I

1:15:02

think that's great. And I wear

1:15:04

this bracelet that's sending everything that

1:15:06

happens to some unknown AI in

1:15:08

the sky. I don't even know

1:15:10

where it goes. We're gonna, by

1:15:12

the way, interview the creators of

1:15:14

this on the 19th of February.

1:15:16

And that's the first thing I'm

1:15:18

asking of is, where's this going?

1:15:20

I think the privacy thing is

1:15:22

always. Go ahead. Just because the

1:15:24

privacy thing is an interesting discussion

1:15:26

because it is in theory extremely

1:15:28

important. In reality though, it's not.

1:15:30

We've seen over and over again

1:15:32

after years reporting on this stuff

1:15:34

that people really just don't kind

1:15:36

of care. They'd rather have convenience

1:15:38

or I don't care. They already

1:15:40

put all their information out there

1:15:42

online. There's not a week that

1:15:44

goes by where most of us

1:15:46

don't get a letter in mail

1:15:48

saying your data's been breached somewhere,

1:15:50

right? Like this is just like.

1:15:52

I stand in front of my

1:15:54

echo show and dance naked just

1:15:56

for the fun of it. Right.

1:15:58

Yeah, exactly. Hoping that somebody in

1:16:00

China is forced to watch it.

1:16:02

There's a joke, right, with younger

1:16:04

generations. They put so much out

1:16:06

there when they're young, that like,

1:16:08

like, you know, for myself, if somebody

1:16:10

came out with photos of me

1:16:12

in college, it would be a little

1:16:14

embarrassing, right? Because of my generation.

1:16:16

But if you're younger, you really

1:16:18

put everything out there and exposed

1:16:20

yourself. There's nothing left you can

1:16:22

really do. But even, you know,

1:16:24

nudity is not that big of

1:16:26

a deal anymore. So it's like,

1:16:28

there is an interesting counter argument

1:16:30

to this, which is if you

1:16:32

completely expose yourself as much as

1:16:34

possible, there really is no risk

1:16:36

of your data leaking ever or

1:16:38

privacy concerns, right? You just let

1:16:40

it all go. Now, you

1:16:42

know Amazon's been talking about this

1:16:44

for a while it's been slowed

1:16:46

down because according to reports it

1:16:48

was incredibly stupid and just really

1:16:51

was like so bad that they

1:16:53

couldn't release it according to Reuters

1:16:55

executives have scheduled a go/no-go

1:16:57

meeting for putting AI into

1:16:59

Echo for Valentine's Day. Reuters says

1:17:01

there they will make a final

1:17:03

decision on the street readiness of

1:17:05

Echo's generative AI revamp, according to

1:17:07

the people and an internal planning

1:17:09

document seen by Reuters. So it

1:17:11

isn't yet known whether they'll release

1:17:13

this and there have been you

1:17:15

know there have been reports of

1:17:17

it not being very good. It

1:17:19

couldn't get worse than Siri. Siri

1:17:21

has literally gotten worse with Apple

1:17:23

intelligence than it was before. John

1:17:25

Gruber and another blogger

1:17:27

asked Siri to tell

1:17:29

it who won every Super Bowl

1:17:31

from zero through 60 and it

1:17:33

was no it was horrible it

1:17:35

was terribly wrong whereas in the

1:17:37

past it would have just said

1:17:39

I don't know but look here's

1:17:41

what I found on the web

1:17:43

about that which would have been

1:17:45

accurate there's a lot that's gotten

1:17:47

bad as AI has been, at

1:17:49

minimum sucking all the oxygen in

1:17:51

the room, but now also

1:17:53

getting embedded in all sorts of

1:17:55

software. I think if you counted

1:17:57

what has not gotten worse, let

1:17:59

alone what has actually gotten better,

1:18:01

the count, you know, very little

1:18:03

has not gotten worse. from all

1:18:05

this. Well, remember, you were, you

1:18:07

were talking about eating rocks and

1:18:09

putting Elmer's glue on pizza. That

1:18:11

was a Google search result. Google

1:18:13

has now again started testing a

1:18:15

new search AI mode internally. And

1:18:17

the reason is there's intense pressure.

1:18:19

I don't use Google anymore. I

1:18:21

use perplexity. I use AI search.

1:18:23

And I bet a lot of

1:18:25

people do. The search results are

1:18:27

better. Google's got to see this

1:18:29

as an existential threat to its

1:18:31

business. Yet Google's existential threat is

1:18:33

making its search engine which had

1:18:35

been industry leading now crap. Well

1:18:37

that's true they did it to

1:18:39

themselves didn't they did it themselves

1:18:41

they were doing they produced a

1:18:43

really good product for years and

1:18:45

years and years and then decided

1:18:47

Let's not. Let's go change it.

1:18:49

Let's give up on everything that

1:18:51

made it good. And instead of

1:18:53

trying to make it better, we

1:18:55

will just make it different. And

1:18:57

they've made it different in a

1:18:59

way that just can't compete with

1:19:01

what they originally had. But they

1:19:03

keep drawing anyway. And it's terrible.

1:19:06

And people just lose trust with

1:19:08

the company. And we don't like

1:19:10

it as an AI company. And

1:19:12

we don't even like it as

1:19:14

a search engine anymore. From 9

1:19:16

to 5 Google. Here here's a

1:19:18

screenshot. How many boxes of spaghetti

1:19:20

should I buy to feed six

1:19:22

adults and ten children and have

1:19:24

enough for seconds? Are they American

1:19:26

or European children? Yeah, really. Americans,

1:19:28

you need like 80 boxes. Adding

1:19:30

the children's portions and the adults

1:19:32

portions gives you a total estimate

1:19:34

of 38 to 54 ounces. Increasing

1:19:36

this by 25 to 50% for

1:19:38

second servings puts you in the

1:19:40

range of 47.5 to 81 ounces.

1:19:42

Most boxes of spaghetti are one

1:19:44

pound. Anyway, it says you should

1:19:46

get three to five boxes of

1:19:48

standard-sized spaghetti to feed six adults.
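For what it's worth, the arithmetic in that quoted answer does hang together; here is a quick sketch reproducing the numbers above (the ounce estimates are Gemini's own, and the one-pound box comes straight from the quoted text):

```python
# Reproducing the spaghetti arithmetic quoted above.
low_oz, high_oz = 38, 54          # Gemini's combined estimate, in ounces
low_oz *= 1.25                    # +25% for second servings -> 47.5 oz
high_oz *= 1.50                   # +50% for second servings -> 81.0 oz
box_oz = 16                       # "most boxes of spaghetti are one pound"

print(low_oz / box_oz, high_oz / box_oz)  # ~2.97 to ~5.06: three to five boxes
```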

1:19:50

This is an example, yeah. Or

1:19:52

compare wool, down, and synthetic jackets

1:19:54

in terms of insulation, water resistance,

1:19:56

and durability. You know, that is

1:19:58

the kind of search we would prefer

1:19:58

to do, rather than, you know, where

1:20:02

can I buy down jackets? Yeah,

1:20:04

or what's the best down jacket,

1:20:06

which we know is a terrible

1:20:08

search. Really good. There's an,

1:20:10

you know, you point that out

1:20:12

though, and now I'm thinking there's

1:20:14

like wire cutter and Tom's guide

1:20:16

and all of these other sites

1:20:18

that are making their bones on

1:20:20

offering that kind of a value

1:20:22

of buying advice as journalism. But

1:20:24

they should be expert, based on

1:20:26

expert research. Yeah, no, that's right.

1:20:28

Yeah, because this AI will combine

1:20:30

all of those results. It might

1:20:32

give... It will take their content

1:20:34

and match it up and spin

1:20:36

it out. Yeah, but badly. So

1:20:38

actually, as long as it's bad,

1:20:40

no, they shouldn't be scared, except

1:20:42

it's very annoying because it's hard

1:20:44

to find. My favorite way of

1:20:46

searching is Google smart enough to send

1:20:48

me to the site that best

1:20:50

answers the natural language query. How's that

1:20:52

working for you? Reasonably well. I

1:20:54

mean, maybe I'm only using a

1:20:56

narrow range of questions and it

1:20:58

seems to go better with some

1:21:00

than others. And sometimes that may

1:21:02

be based on what is available.

1:21:04

So I think the more interesting

1:21:06

thing is, what if there's

1:21:08

something not on point. If there's something

1:21:10

on point, I really don't want

1:21:12

Google to get in the way

1:21:14

of it. If there's nothing on

1:21:16

point, then the question is, what

1:21:19

should Google do, if anything, to

1:21:21

gapfill that? Right. The other issue

1:21:23

that's gonna come up with something

1:21:25

like this, though, is you're gonna

1:21:27

see a, I mean, you're already

1:21:29

seeing a flood of AI glurge

1:21:31

in terms of content writing, where

1:21:33

there are companies devoted to using

1:21:35

generative AI models to pump out

1:21:37

things that can rank very highly

1:21:39

in search results and say absolutely

1:21:41

nothing of use or value. There's

1:21:43

no primary sources. There's no reporting

1:21:45

and there's no nothing. And with

1:21:47

this kind of search result

1:21:49

that you've put in there, Leo,

1:21:51

I didn't see any sources for

1:21:53

citations. I didn't see any facts

1:21:55

that backed it up. And it

1:21:57

actually was a screenshot from Google

1:21:59

by the way. But no, no,

1:22:01

I'm just saying. It would be

1:22:03

super easy to game the results

1:22:05

just by, you know, just by

1:22:07

having AI generate all sorts of

1:22:09

content that gets

1:22:12

tossed into the pool of training

1:22:14

data or gets tossed into the

1:22:16

pool of results that any sort

1:22:18

of AI is going to use

1:22:20

to try to aggregate synthesize and

1:22:22

put back a good answer. I

1:22:24

just saw an ad. You have

1:22:26

to take a look at the

1:22:28

data and see where it's coming

1:22:30

from. You really do. I've got

1:22:32

Fox running in the background. I

1:22:34

just saw an ad for a

1:22:36

company called Ramp featuring Saquon Barkley.

1:22:38

And the whole thing it does

1:22:40

is it looks at all the

1:22:43

things you do. Like in his

1:22:45

case he had a leg day.

1:22:47

And it said, well, we didn't

1:22:49

get a receipt. Give me the

1:22:51

receipt for that leg day. And

1:22:53

Saquon took a picture of it

1:22:55

and said, okay, got it. And

1:22:57

that's done with AI. But it's

1:22:59

more than just AI. It's AI

1:23:01

that's also watching you every moment

1:23:03

of the day to see what

1:23:05

you're up to. So it can

1:23:07

say, hey, you just took an

1:23:09

Uber. Where's that receipt? This is

1:23:11

the world we're getting into. And

1:23:13

I think you were right when

1:23:15

you said, Daniel, people don't

1:23:17

care. I hate doing expenses.

1:23:20

Yeah. If it asks, if it

1:23:22

asks, if my phone says, hey,

1:23:24

where's the receipt for that? And

1:23:26

I take a picture. I'm happy.

1:23:28

Yeah. Yes. Right. Yes. I need

1:23:30

someone to nag me on certain

1:23:32

things in life and I'll gladly

1:23:34

take that. Oh my gosh. Let's

1:23:36

take a break. I gotta take

1:23:38

a break. We have more. You're

1:23:40

going to have plenty of time

1:23:42

to smash AI. We got lots

1:23:44

more AI coming up, including. Well.

1:23:46

I'll let you stay tuned. We

1:23:48

still haven't talked about TikTok. We've

1:23:50

got a lot of talking to

1:23:52

do. We've got talking to do...

1:23:55

Cybersecurity. Senator Josh Hawley has proposed

1:23:57

jail time for people who download

1:23:59

Deep Seek. Not an insignificant amount

1:24:01

of jail time. It's the... Decoupling

1:24:03

America's artificial intelligence capabilities

1:24:05

from China Act. D-A-A-A-I, no, you

1:24:07

could have done better, Josh. You

1:24:10

could have done better, there's so

1:24:12

many ways. So many, so many

1:24:15

ways, you could be fined not

1:24:17

more than a million dollars

1:24:19

and imprisoned for not more

1:24:21

than 20 years. Oh, well, how merciful

1:24:24

of him. No more than 20

1:24:26

years, okay. Just for downloading Deep

1:24:28

Seek, which I have done now

1:24:30

on many of my devices. That's

1:24:32

the Chinese AI that apparently Josh

1:24:35

is very worried about. Great to

1:24:37

have you all. I wish,

1:24:39

this should be an eight-hour show.

1:24:41

There is so much to talk

1:24:43

about. Last night I said, oh, do

1:24:45

I have enough stories? And then I

1:24:47

looked and I went, oh my God.

1:24:50

And I want to get out of

1:24:52

here in an hour because there's some.

1:24:54

I don't know, some football game going

1:24:57

to happen a little later on. You're

1:24:59

watching this week in tech with Lisa

1:25:01

Schmeiser of No jitter.com, Daniel Rubino, Windows

1:25:04

Central, and our own personal attorney,

1:25:06

Kathy Gellis, from Techdirt. Good to have

1:25:08

all of you. By the way, not

1:25:10

just Techdirt, rstreet.org, right? You wrote

1:25:13

a nice piece for them. That

1:25:15

was really good. I was an erstwhile fellow

1:25:17

with them, and I wrote a white paper.

1:25:19

Yeah, I didn't realize you were a fellow

1:25:21

there. Yeah, partially. I don't know

1:25:23

if I have a permanent title, but

1:25:26

I did it under the auspices of

1:25:28

a fellowship. I wrote a white paper

1:25:30

about jaw boning and the DMCA. By

1:25:32

the way, apparently jaw boning,

1:25:34

whatever that is, has a little

1:25:36

bit to do with the TikTok

1:25:38

case as well. I read your

1:25:40

article. We'll talk about it when

1:25:42

we return. But first, a word

1:25:44

from our sponsor. A sponsor, I'm

1:25:47

very proud to say I not

1:25:49

only use, Steve Gibson uses, and

1:25:51

we heartily recommend: Bitwarden, the

1:25:53

best password manager. Well, more

1:25:55

than a password manager, they're a

1:25:57

trusted leader in passwords, secrets, and

1:25:59

pass key management. By the way,

1:26:02

I love pass keys. I wish

1:26:04

more sites used it. When it

1:26:06

first came out, you had pass

1:26:08

keys on your phone, you had

1:26:10

pass keys on your computer. Now

1:26:12

that bit warden supports pass keys,

1:26:14

wherever you have bit warden. You

1:26:16

have your passkeys and that has

1:26:18

been a big improvement in how

1:26:20

passkeys work for me. Bitwarden now

1:26:22

has 10 million users in 180

1:26:24

countries. This is an open source

1:26:26

success story. Over 50,000 business customers

1:26:28

too. In fact, they've entered 2025

1:26:30

as the essential security solution for

1:26:32

businesses of all sizes, consistently ranked

1:26:34

number one and user satisfaction by

1:26:36

G2. recognized as a leader by

1:26:38

Software Reviews Data Quadrant. Bitwarden continues

1:26:41

to protect businesses worldwide. One of

1:26:43

the reasons I love Bitwarden, and

1:26:45

I think it's because they're open

1:26:47

source. They're always expanding their capabilities.

1:26:49

They're always improving. Recently they announced

1:26:51

the general availability of their native

1:26:53

mobile apps for iOS and Android.

1:26:55

I did not realize this. I'd

1:26:57

been using Bitwarden for years, but

1:26:59

it wasn't a native app on

1:27:01

iOS. I had no idea. And

1:27:03

it's made a huge difference, you

1:27:05

know? I guess it was an

1:27:07

electron app or a view, a

1:27:09

webview app. Key benefits of the

1:27:11

native mobile app include on iOS

1:27:13

and Android, faster load times, improved

1:27:15

overall app functionality. iOS and Android

1:27:17

platform-specific design, so it really, you

1:27:20

know, it looks like it belongs

1:27:22

on the platform. Much better hardware

1:27:24

integration, in fact, that includes biometric

1:27:26

authentication and multi- device support. So

1:27:28

that's really enhanced usability. I do

1:27:31

the Face ID on my Bit

1:27:32

warden and it's unlocked. They have

1:27:34

strengthened their password manager now with

1:27:36

SSH. This is really important. So

1:27:38

if you use SSH as I

1:27:40

do, and you use a

1:27:42

key to log into SSH. You

1:27:44

can now store your authorized SSH

1:27:46

keys in bit warden. This is

1:27:48

huge. to 90% of authorized SSH

1:27:50

keys in large organizations just go

1:27:52

unused because it's not convenient. Well

1:27:54

now this new update centralizes cryptographic

1:27:57

key management, not just SSH, but

1:27:59

in general, enabling secure storage, import,

1:28:01

import, import, you can even generate

1:28:03

your SSH keys directly within the

1:28:05

bitward and vault, which is a

1:28:07

huge improvement for developers and IT

1:28:09

professionals. I'm thrilled. I mean, I

1:28:11

used to have to do a

1:28:13

whole bunch of command line foo.
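For a sense of the command line foo being replaced, here is a minimal sketch using OpenSSH's own ssh-keygen rather than any vault's API; the file path and comment are illustrative placeholders:

```python
import subprocess
from pathlib import Path

# Generating an Ed25519 keypair with OpenSSH's own ssh-keygen.
# The path and comment are illustrative placeholders.
key_path = Path.home() / ".ssh" / "demo_ed25519"
subprocess.run(
    [
        "ssh-keygen",
        "-t", "ed25519",       # modern, compact key type
        "-f", str(key_path),   # private key file; ssh-keygen adds .pub for the public half
        "-N", "",              # empty passphrase, fine for a throwaway demo key
        "-C", "demo@example",  # comment embedded in the public key
    ],
    check=True,
)
print(key_path.with_suffix(".pub").read_text())  # the line you'd copy to a server
```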

1:28:15

What sets Bitwarden apart is that

1:28:17

they prioritize simplicity because they understand

1:28:19

if a security tool isn't easy

1:28:21

to use, isn't simple, people aren't

1:28:23

going to use it, right? Bitwarden

1:28:25

setup only takes a few minutes.

1:28:27

You can easily import from most

1:28:29

password management solutions. And of course,

1:28:31

because it's open source, that means

1:28:33

Bitwarden's source code can be inspected

1:28:36

by anyone. They undergo regular

1:28:38

third party audits and, very importantly,

1:28:40

they publish the full report. So

1:28:42

you can see everything going on

1:28:44

within Bitwarden and be reassured that

1:28:46

it is absolutely secure. Look, your

1:28:48

business deserves a cost-effective solution for

1:28:50

enhanced online security. See for yourself.

1:28:52

Get started today with Bitwarden's free

1:28:54

trial of a teams or enterprise

1:28:56

plan or get started for free

1:28:58

across all devices as an individual

1:29:00

user. bitwarden.com. Let me underscore this,

1:29:02

because they're open source, if you're

1:29:04

an individual user, it's free, and

1:29:06

you get all the capabilities, unlimited

1:29:08

passwords, unlimited devices, iOS, Android, Mac,

1:29:10

Windows, Linux, you also get pass

1:29:12

keys, you also get hardware security

1:29:15

keys, all of that in a

1:29:17

free version, free, forever, for life.

1:29:19

And that's nice. bitwarden.com slash Twitter.

1:29:21

Since that's still an ongoing thing

1:29:23

right? I mean we never resolved

1:29:25

it. We just did something really

1:29:27

stupid, and here we are.

1:29:29

So just to fill you in,

1:29:31

this is the best. Previously, this

1:29:33

week in tech. Kathy is a

1:29:35

adamant supporter of the First Amendment,

1:29:37

as we all should be, and

1:29:39

says that TikTok is protected by

1:29:41

the First Amendment. Not just TikTok

1:29:43

itself, but every user of TikTok

1:29:45

is exercising their right to free

1:29:47

speech. There are many who say

1:29:49

that or believe that TikTok is

1:29:52

a security problem because it is

1:29:54

a Chinese company, which

1:29:54

presumably gives access. In effect, we

1:29:56

had an OpenAI guy on

1:29:58

Wednesday on the show who said nobody

1:30:00

should have it: I don't have TikTok

1:30:02

on my phone, you'd be crazy to

1:30:04

have TikTok on your phone, or

1:30:08

Deep Seek or any of these

1:30:10

apps. And I'm not disputing that.

1:30:12

I don't have it either and

1:30:14

for similar reasons. So let's just,

1:30:16

you know, stipulate that Tiktok is

1:30:18

as much of a privacy menace

1:30:20

as it's been accused. That notwithstanding,

1:30:22

the question is, what does the

1:30:24

government get to do about that?

1:30:26

And I think the First Amendment

1:30:28

says, not this, but the Supreme

1:30:31

Court disagreed, although they disagreed very

1:30:33

narrowly. But it was nine, nothing,

1:30:35

right, were you surprised that it

1:30:37

was unanimous? We should mention Kathy

1:30:39

was in the court during the

1:30:41

oral arguments. We had you on

1:30:43

shortly after that. She's admitted to

1:30:45

the Supreme Court to the board.

1:30:47

So I wrote an amicus brief

1:30:49

in the case arguing this is

1:30:51

not how anything works. And I

1:30:53

attended the oral argument listening to

1:30:55

it. I was cautiously optimistic because

1:30:57

it's sort of, I think they

1:30:59

kind of understood what the First

1:31:01

Amendment issues were. But the whole

1:31:03

thing was a mess. Oh, it's

1:31:05

even messier now. It's even messier

1:31:07

now. And so you're asking about

1:31:10

the nine nothing. And I think

1:31:12

the nine nothing is kind of

1:31:14

a byproduct of they just wanted

1:31:16

to get it off its docket

1:31:18

for whatever reason. And one of

1:31:20

the things that concerns me is

1:31:22

I don't know why they thought

1:31:24

they were unable to stop the

1:31:26

clock on this and

1:31:28

enjoin things, and then have things

1:31:30

being briefed at a proper pace

1:31:32

when they could read all the

1:31:34

briefs and get all the full

1:31:36

argument to get all the amici

1:31:38

to have time not over their

1:31:40

Christmas break to be able to

1:31:42

weigh in. So they ended up,

1:31:44

because they didn't do that, they

1:31:47

ended up with everything on a

1:31:49

very accelerated schedule, and then we're

1:31:51

kind of like, well, we heard

1:31:53

the case, we got to do

1:31:55

something. And I have a sense

1:31:57

that the 9-0 decision that they

1:31:59

came out with was the only

1:32:01

thing they could kind of get

1:32:03

everybody to agree on to get

1:32:05

something out the door. So it's

1:32:07

got a lot of framing to

1:32:09

say, be really careful, we're only

1:32:11

speaking to this particular situation as

1:32:13

it applies to TikTok. You know,

1:32:15

don't read too widely we want

1:32:17

to be cautious about not you

1:32:19

know hurting technology this and the

1:32:21

other thing. They kind of acknowledged

1:32:23

that they might be wrong but

1:32:26

they were wrong and they made

1:32:28

a mess and the only thing

1:32:30

that I think is somewhat good

1:32:32

about it is, I think, they

1:32:34

could have made a worse mess

1:32:36

depending on what their reasoning was

1:32:38

but they made a pretty big

1:32:40

mess anyway both legally in terms

1:32:42

of how do we interpret First

1:32:44

Amendment law going forward, but also

1:32:46

in terms of what now happens

1:32:48

to TikTok itself and all of

1:32:50

its users, and that is a

1:32:52

practical mess that has not been

1:32:54

resolved, but basically it is unavailable

1:32:56

in the United States and American

1:32:58

users can't use it, but some

1:33:00

of them are, well, they're doing

1:33:02

something. Didn't the president say we

1:33:05

can? He can't say that. It

1:33:07

doesn't help and... He did. Well,

1:33:09

okay. Oh, I got it on

1:33:11

here. I think it's okay. He

1:33:13

is capable of saying many things.

1:33:15

There are many words that he

1:33:17

is able to project from his

1:33:19

mouth. But in terms of any

1:33:21

sort of lawful beneficence that he

1:33:23

is legally and constitutionally and lawfully

1:33:25

allowed to bestow? No. And

1:33:27

it doesn't solve the problem because...

1:33:29

As you are alluding to, the

1:33:31

whole TikTok ban is yet another

1:33:33

example of jaw boning because one

1:33:35

of the things that does... What

1:33:37

is jaw boning? Jaw boning is

1:33:39

going after, instead of going after

1:33:42

the actual... speaker that you're unhappy

1:33:44

with and trying to regulate, which

1:33:46

you may not be able to

1:33:48

do thanks to the First Amendment,

1:33:50

it's to put pressure on an

1:33:52

intermediary they depend on and make

1:33:54

it so that that intermediary can't

1:33:56

do business and and support the

1:33:58

actual speaker. So it's a way

1:34:00

of hurting a speaker by sticking

1:34:02

it to somebody in the middle.

1:34:04

And that is basically what jawboning

1:34:06

is. That's what they were complaining

1:34:08

about had happened with the Biden

1:34:10

administration allegedly leaning on the platforms

1:34:12

to delete some speech

1:34:14

and delete some users that whole

1:34:16

bit of if you didn't like

1:34:18

it, you didn't go

1:34:18

after those users, because you

1:34:21

couldn't, because that would be unlawful,

1:34:25

unconstitutional so instead you went after

1:34:27

the platforms and told them to

1:34:29

do it. And

1:34:31

they did. Both Apple and Google

1:34:33

have removed TikTok, and

1:34:35

all the other ByteDance apps

1:34:37

for that matter from their stores

1:34:39

TikTok is now telling Android users

1:34:41

you can side-load the app. You

1:34:43

can't get it on a Mac,

1:34:45

I mean on an iPhone in

1:34:47

any form or fashion. Right. So the

1:34:49

question, so Tik-Toc can't do stuff

1:34:51

in the United States, but nobody

1:34:53

in the United States can help

1:34:55

TikTok do anything, including the app

1:34:57

stores who had been providing it.

1:35:00

Couldn't, though, couldn't Apple say, well,

1:35:02

the president said it's okay, so

1:35:04

we're going to put it back

1:35:06

in the store? I would not

1:35:08

if I'm their in-house counsel, look

1:35:10

at all of that: there's no

1:35:12

way he could actually

1:35:14

make it so that it is lawful.

1:35:16

The law, which was

1:35:18

upheld by the Supreme Court has

1:35:20

pretty substantial fines, it has pretty

1:35:20

substantive penalties. He can't forgive

1:35:22

them of those fines; he doesn't

1:35:26

have that power but who's gonna

1:35:28

prosecute them? You are

1:35:30

playing a lot of Russian roulette

1:35:32

if you basically rely on Trump's

1:35:34

representation and the fact that you

1:35:37

think that Pam Bondi is going

1:35:39

to just sort of let you

1:35:41

get away with it and that

1:35:43

may be true as a practical

1:35:45

standpoint now but it's a rather

1:35:47

corrupt result so you know I

1:35:49

yeah by the way i understand

1:35:51

why Apple and Google aren't, because

1:35:53

why take the chance, right? There's

1:35:55

no,

1:35:57

there's no downside to not having it

1:35:59

in the store. There's definitely a

1:36:01

potential downside for having in the

1:36:03

store. And it's not just Google.

1:36:05

The app stores are the easiest

1:36:07

target, but it also targets any

1:36:09

of their web hosts. So there's

1:36:11

also the issue of the CDN's

1:36:13

and stuff. Although I thought maybe

1:36:16

one of them did come back

1:36:18

online to help TikTok, but I

1:36:20

think it's a really questionable

1:36:22

move. And it was based on

1:36:24

a promise that Trump had at

1:36:26

least originally made before you would

1:36:28

even inaugurated. And download this, you

1:36:30

have to do it on an

1:36:32

Android device because it allows you

1:36:34

to side load the Apple iPhones

1:36:36

do not. Presumably there's a web

1:36:38

host somewhere, I don't know if

1:36:40

it's in China, there's a CDN

1:36:42

probably as well, I don't know

1:36:44

where it is. Well I think

1:36:46

what kids today are doing is

1:36:48

VPNing and trying to

1:36:50

get it from Canada, but you

1:36:52

have to do a lot of...

1:36:55

Well to do that you have

1:36:57

to go through a lot of

1:36:59

hoops to be able to do

1:37:01

it but in theory all of

1:37:03

this is unlawful because it's not

1:37:05

just the app stores but also

1:37:07

anybody who's involved with hosting also

1:37:09

is running afoul of the law

1:37:11

and they also had to go

1:37:13

down even if they could rely

1:37:15

on Trump's representation. He made the

1:37:17

representation before he was technically president

1:37:19

so they were facing potential liability

1:37:21

oh come on now that's a

1:37:23

technicality you know if his presidency

1:37:25

doesn't last four years you may

1:37:27

be inside the statute of limitations

1:37:29

you know it's a bet the

1:37:32

company decision on something that just

1:37:34

really isn't worth it but it

1:37:36

also is reason why the law

1:37:38

was jawboning and bad in that

1:37:40

regard because the way we decided

1:37:42

to solve the problem of impinging

1:37:44

on certain user speech is to,

1:37:46

or even TikTok itself, like the

1:37:48

people that we wanted to go

1:37:50

after, we did it by going

1:37:52

after middlemen to make it impossible,

1:37:54

you know, to just mess with

1:37:56

the people that we were really

1:37:58

going after. So I just did

1:38:00

a whois on TikTok, and

1:38:02

the registrar is Gandi.net,

1:38:04

and Akamai is their

1:38:06

CDN.
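For anyone who wants to reproduce that lookup: WHOIS is just a plain-text query sent over TCP port 43. Here is a minimal sketch in Python, assuming the Verisign registry server for .com domains (a real whois client, like the command-line whois tool, also follows referrals to the registrar's own server):

    import socket

    def whois(domain: str, server: str = "whois.verisign-grs.com") -> str:
        """Minimal WHOIS query: open TCP port 43, send the domain, read the reply."""
        with socket.create_connection((server, 43), timeout=10) as sock:
            sock.sendall(domain.encode() + b"\r\n")
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode(errors="replace")

    # The "Registrar:" line in the reply names the registrar of record.
    print(whois("tiktok.com"))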

1:38:08

I think Akamai is playing chicken with the law, but at this

1:38:11

point this is not the biggest

1:38:13

kettle of fish for anybody

1:38:15

to worry about. I just don't

1:38:17

think Pam Bondi is gonna. I don't

1:38:19

think you can rely on anything

1:38:21

and including like I think I

1:38:23

made the argument in my Techdirt

1:38:25

post to say, okay, let's look

1:38:27

at what the math would be

1:38:29

for how much you would potentially

1:38:31

owe if you were found liable. Oh,

1:38:33

it's outrageous.
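For a sense of scale, a back-of-the-envelope sketch of that math, assuming the widely reported $5,000 civil penalty per US user under the divest-or-ban law and TikTok's commonly cited 170 million US users (both figures are approximations, not a legal calculation):

    # Rough exposure for a platform that keeps distributing TikTok.
    # Assumes the reported $5,000-per-US-user penalty and ~170M US users.
    penalty_per_user = 5_000          # dollars per user (reported figure)
    us_users = 170_000_000            # commonly cited approximation

    print(f"${penalty_per_user * us_users:,}")  # $850,000,000,000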

1:38:35

Yeah. So in theory, they can extract enough money. Even

1:38:37

a dollar less than that is

1:38:39

a huge amount of money,

1:38:41

and just for this. And this

1:38:43

is one law what's the next

1:38:45

law going to look like? And

1:38:47

this idea that you could basically

1:38:50

like hope for their benevolence when

1:38:52

this is really an extremely corrupt

1:38:54

offer to have been made and

1:38:56

the idea that oh we can

1:38:58

rely on this and will be

1:39:00

okay it's not rational and quite

1:39:02

frankly if I'm a shareholder I'm going

1:39:04

to start wondering what these companies

1:39:06

were thinking of. This is, I

1:39:08

don't think this level of decision

1:39:10

making is cleared by the business

1:39:12

judgment rule. Daniel:

1:39:14

Should Microsoft buy TikTok? By the

1:39:16

way, they're one of the companies

1:39:18

that's being mentioned. Microsoft, Oracle. Elon

1:39:20

says, no, I don't want it.

1:39:22

So he's, we can't. But you

1:39:24

never know with Elon. He

1:39:27

didn't want

1:39:29

Twitter either, so you never know.

1:39:31

Whether

1:39:33

it's good or bad depends on

1:39:35

your opinion of TikTok, if you

1:39:37

like TikTok. But you wouldn't get

1:39:39

the algorithm so that's part of

1:39:41

the problem the Chinese government says

1:39:43

the algorithm is not for sale

1:39:45

and apparently they control that so

1:39:47

you wouldn't really. You'd get

1:39:49

the users though, and that's not

1:39:51

insignificant. They're gonna be

1:39:53

looking for something yeah yeah you're

1:39:55

not getting AI would be huge

1:39:57

you're not getting the users. I have a number

1:39:59

of posts in my head, which

1:40:01

have not actually been published on

1:40:03

Techdirt, because I have to finish

1:40:06

writing them. I really think there's

1:40:08

a question about how alienable a

1:40:10

platform actually is, about how, when

1:40:12

you sell it, how much can

1:40:14

you sell? Because platforms are really

1:40:16

not just a corporate asset. They

1:40:18

are also, they are essentially the

1:40:20

community and the community is not

1:40:22

ownable. And all those choices. Yeah, but

1:40:24

if, you know, to jump

1:40:26

off of Kathy's point, we

1:40:28

don't have, in 20-odd years

1:40:31

of internet community aggregation and

1:40:33

dissipation, social platforms rising

1:40:35

and falling. We have

1:40:37

no good example of

1:40:39

any company successfully acquiring

1:40:41

or monetizing a social

1:40:43

network and having it remain

1:40:46

intact. Elon Musk bought Twitter and

1:40:48

got all of the Twitter users.

1:40:50

Well, Daniel, I could argue, though,

1:40:52

that LinkedIn, it's a social network

1:40:54

in the capacity that people use

1:40:57

it to network with each other

1:40:59

to get jobs. But with something

1:41:01

like MySpace or something like Twitter

1:41:04

or something like TikTok, the

1:41:06

value was in being part

1:41:08

of what felt like a

1:41:10

bigger community of

1:41:12

affinity or a bigger community

1:41:14

of lifestyle or a bigger

1:41:16

community of social connection and The

1:41:18

shutdown and revival of TikTok has already

1:41:20

rocked people where they're like, oh, this

1:41:23

isn't a tenable way to make a

1:41:25

living, this isn't a tenable way to

1:41:27

have a community, what have I been

1:41:29

doing with my time? Like, the bubble

1:41:31

has already burst socially speaking, all of

1:41:33

these people are going to be looking

1:41:35

for new places to go. And even

1:41:37

if we accept LinkedIn as a rare

1:41:39

counter example, I would even say it's

1:41:41

not a counter example because a lot

1:41:43

of what LinkedIn has grown to be

1:41:45

now may have been... while it's been

1:41:47

a Microsoft property anyway, that we probably

1:41:49

did see it, but to the extent

1:41:51

that it's something interesting and viable now,

1:41:53

it's interesting and viable under Microsoft Stewardship.

1:41:55

And if Microsoft divested and sold it

1:41:57

to somebody else, we would see a

1:42:00

shakeout and lose something really integral,

1:42:02

not just in terms of what

1:42:04

we can count on the platform

1:42:06

operator to deliver, but that the

1:42:08

community itself will be shattered and

1:42:10

lose the value proposition for why

1:42:12

they're there in the first place.

1:42:14

So what about what's happened with Instagram?

1:42:16

Yeah. People aren't building

1:42:18

communities of affinity on Instagram the

1:42:20

same way they were. It's a

1:42:22

business channel now, which is fine.

1:42:24

Well, that's how it's been mismanaged

1:42:26

by meta. I don't think that's

1:42:28

because meta bought them. But that's

1:42:30

always the risk. The issue with

1:42:32

TikTok is mostly about, it's about

1:42:34

where they're storing the data. Right.

1:42:36

Yeah. And that's the whole argument

1:42:38

here is that they want the

1:42:40

data stored in the United States.

1:42:42

So the US has control over

1:42:44

consumer data due to security reasons.

1:42:46

JD Vance has been put in

1:42:48

charge of the acquisition or sale

1:42:50

or whatever it is. Trump has

1:42:52

talked about creating a sovereign wealth

1:42:54

fund. The US does not have

1:42:56

one. Saudi Arabia has one. Shouldn't

1:42:58

be given the time of day.

1:43:00

It should. All these things. Who's

1:43:02

going to stop him? Well, that's

1:43:04

a separate problem. But assuming we

1:43:06

still have any sort of constitutional

1:43:08

order and functioning law, we don't.

1:43:10

Well, I'm not willing to surrender

1:43:12

that yet. I'm not either, but

1:43:14

what are you going to do

1:43:16

about it? We're going to look

1:43:18

for what vectors of power

1:43:20

we still have to exert over

1:43:22

things and just sort of saying,

1:43:24

well, sounds good. No, it's more

1:43:26

than the federal marshals going to

1:43:28

do it. We have. We have

1:43:30

things like. We still have the

1:43:32

ability to protest. We still have

1:43:34

the ability to put in market

1:43:36

power. So what happens if a

1:43:38

court rules, if a court rules

1:43:40

something and then President Trump says,

1:43:42

yeah, no, and AG Bondi won't

1:43:44

do anything about it? Well, then

1:43:46

say goodbye to the union,

1:43:48

but we are not there yet.

1:43:50

And I think there's a lot

1:43:52

of people whose allyship they're counting

1:43:54

on who are not going to

1:43:56

necessarily go along with that. But

1:43:58

we'll see. But one of the

1:44:00

other things. So we have, you

1:44:02

know, just, we still have power

1:44:04

as a people, we have power

1:44:06

to protest, we also have state

1:44:08

power, and then we also have,

1:44:10

and so this was the other

1:44:12

bits that I was writing on

1:44:14

Techdirt, there is the ability, there

1:44:16

are other avenues to, if we

1:44:18

act soon. And we start to

1:44:20

impose, and there's vectors of doing

1:44:22

this, legal liability on the people

1:44:24

that Trump and Musk are counting

1:44:26

on to do their bidding, they

1:44:28

may see that it's not going

1:44:31

to be worthwhile because if they

1:44:33

have liability judgments against them, even

1:44:35

if the courts can't, even if

1:44:37

the federal courts can't enforce it

1:44:39

themselves, those judgments are enforceable in

1:44:41

states where they have assets. And

1:44:43

that is a really critical leverage

1:44:45

vector of power that we still

1:44:47

have over these people. They can

1:44:49

only take our country if they

1:44:51

have enough people willing

1:44:53

to help them do it. And

1:44:55

if we act now, I think

1:44:57

there's some avenues of attack that

1:44:59

we can make it not worth

1:45:01

the while of the people that

1:45:03

they're depending on to do it.

1:45:05

So Congress has introduced something called

1:45:07

we had KOSA, now we have

1:45:09

KOSMA. This law came

1:45:11

out of committee, I don't know

1:45:13

whether it has much chance of

1:45:15

surviving Congress, I have a feeling

1:45:17

it might in the current climate,

1:45:19

preventing anybody under the age of

1:45:21

13 from using social media, just

1:45:23

as it is in Australia, period.

1:45:25

This also is applicable to a

1:45:27

case that the Supreme Court just

1:45:29

heard oral arguments about in the

1:45:31

state of Texas: Free Speech Coalition

1:45:33

versus Texas AG Paxton. Texas passed

1:45:35

a law requiring the age gating

1:45:37

of certain internet sites. Do you

1:45:39

think the Supreme Court will

1:45:41

overturn that law which would probably

1:45:43

then make moot any attempt to

1:45:45

block social media for people under

1:45:47

13? Is this the Texas law

1:45:49

that you just had? Yeah. So

1:45:51

one interesting thing is just the

1:45:53

other day they managed to get

1:45:55

at the district court an injunction

1:45:57

of yet another age-gating law that

1:45:59

came out of Texas that had

1:46:01

been challenged by, I think,

1:46:03

NetChoice was probably the plaintiff, with

1:46:05

I think the same lawyers.

1:46:07

The thing that's at the Supreme

1:46:09

Court right now is I do

1:46:11

not like the level of comfort

1:46:13

that too many of the judges

1:46:15

seem to have about whether age-gating

1:46:17

was appropriate for the

1:46:19

internet. But what Justice Sotomayor

1:46:21

pointed out was that is not

1:46:23

the issue before them. The issue

1:46:25

before them was that when it

1:46:27

went to the Fifth Circuit Court

1:46:29

of Appeals, the Fifth Circuit used,

1:46:31

I think, a rational basis test

1:46:33

for it. Maybe it was intermediate

1:46:35

scrutiny, but it was definitely not

1:46:37

the level of scrutiny that attacks

1:46:39

on First Amendment rights normally require.

1:46:41

And all we need the Supreme

1:46:43

Court to do right now is

1:46:45

say it was supposed to be

1:46:47

strict scrutiny, go back Fifth Circuit,

1:46:49

try again. Now you review the

1:46:51

district court and use the right

1:46:53

standard because they didn't do that

1:46:55

with TikTok. They didn't do it

1:46:57

in, well, so they sort of

1:46:59

did it with TikTok, kind of.

1:47:01

So what happened at the district

1:47:03

court at the appeals court and

1:47:05

TikTok, which is where it started

1:47:07

because Congress short-circuited the whole path

1:47:09

and it began the challenge at

1:47:11

the court of appeals, was that

1:47:13

court said yeah first amendment rights

1:47:15

are implicated and we will presume

1:47:17

without deciding that strict scrutiny the

1:47:19

highest level was appropriate to have

1:47:21

to use to decide whether the

1:47:23

ban was constitutional or not but

1:47:26

then, allegedly using that

1:47:28

standard they then decided yeah it

1:47:30

was totally fine which basically guts

1:47:32

the utility of this very strict

1:47:34

standard because very little should be

1:47:36

able to leap over those obstacles.

1:47:38

I don't want to have people

1:47:40

that have to have a lot

1:47:42

of degrees to understand the conversation,

1:47:44

so we'll kind of hold it

1:47:46

at that point. But something we've

1:47:48

talked about quite a bit in

1:47:50

the past with you, Kathy, and

1:47:52

I think I'm starting to understand

1:47:54

it, but... These are legal niceties

1:47:56

and I don't know how much

1:47:58

they have to do with what's

1:48:00

happening in the political sphere. But

1:48:02

they do have a lot. Well

1:48:04

I do, I understand, but I

1:48:06

don't know if that's gonna make

1:48:08

any difference. Because of

1:48:10

the timing of when we all

1:48:12

shrug and say, oh my gosh,

1:48:14

we're screwed. There may be a

1:48:16

point where that is true, but

1:48:18

it is really important not to

1:48:20

just shrug too soon and give

1:48:22

up when we still have legal

1:48:24

organs and legal principles that may

1:48:26

still be operative. So the details

1:48:28

do matter because they also matter

1:48:30

for when and how and where

1:48:32

to get upset

1:48:34

and how best to apply

1:48:36

that being upset. I am not

1:48:38

going marching downtown with a sign

1:48:40

that says, apply strict scrutiny. I

1:48:42

probably would, but I mean, that

1:48:44

may not be the thing that

1:48:46

is necessarily going to say, look,

1:48:48

the TikTok thing is also from

1:48:50

the aftermath of the end of

1:48:52

the Biden administration, and now we

1:48:54

are kind of in a

1:48:56

whole new world. So if you

1:48:58

want to say like, does the

1:49:00

TikTok ban matter, I mean, so

1:49:02

many other things have overtaken it

1:49:04

where we have bigger fish to

1:49:06

fry. But under the old fish

1:49:08

market rules, the old fish market

1:49:10

is they really made a mess

1:49:12

of things and that was a

1:49:14

sort of mess that is why

1:49:16

things like what we're seeing

1:49:18

now are more able to take

1:49:20

root, but at least like we're

1:49:22

able to talk to each other,

1:49:24

organize, still have access to the

1:49:26

internet to do these things because

1:49:28

the First Amendment wasn't completely undermined

1:49:30

when it

1:49:32

came to internet speech. And so

1:49:34

that's why, like, those details really

1:49:36

matter, because if we were still

1:49:38

living in the old fish market,

1:49:40

I really want to hold the

1:49:42

line to not make it too

1:49:44

easy for the things the government

1:49:46

wants to do, to be too

1:49:48


1:49:50

easy for it to do, because

1:49:52

if it can do them, then

1:49:54

Trump legitimately can do them to

1:49:56

us. Right now, he's doing them

1:49:58

illegitimately, and that is really a

1:50:00

significant difference in terms of what

1:50:02

we're fighting for and what tools

1:50:04

we have

1:50:06

to do that fighting with. The

1:50:08

Kids Off Social Media Act was introduced

1:50:10

on Wednesday

1:50:12

to ban children under 13

1:50:14

from social media. Lisa, you've got

1:50:16

a young young person in your

1:50:19

home. Should kids be blocked from

1:50:21

social media under 13? The reason

1:50:23

I'm hesitant to say blanket yes

1:50:25

or blanket no is because I

1:50:27

think it

1:50:30

would be a disservice to any

1:50:32

young person to block them and

1:50:34

as opposed to teaching them responsible

1:50:36

use and modeling responsible use. And

1:50:38

who better than the parents to do

1:50:40

that? 13 is an arbitrary number.

1:50:42

A parent knows best what a kid

1:50:45

can and cannot handle. Also, you have

1:50:47

to point out a lot of

1:50:49

parents are not responsible social media users.

1:50:51

Well, whether that will work well or

1:50:54

not, I don't know. But one

1:50:56

of the things I would point out

1:50:58

is a lot of apps and services

1:51:00

aimed explicitly at children are designed

1:51:02

to boost engagement and addictive behavior. And

1:51:05

we banned advertising to children on

1:51:07

children's TV shows. We've done a lot

1:51:09

of stuff. I was sitting in a

1:51:11

briefing with the head, I was

1:51:13

sitting in a session one time where

1:51:16

the person who was then the head

1:51:18

of the YouTube Kids app was

1:51:20

happily chattering about how great it was

1:51:22

that they had figured out a way

1:51:25

to boost engagement so kids just

1:51:27

kept clicking on videos and watching. And

1:51:29

to them, this was a sign that

1:51:31

the app was succeeding. Like, it

1:51:33

didn't matter what was getting into

1:51:36

the heads of the kids. It didn't

1:51:38

matter that there's a clear trade

1:51:40

off. And if you're spending this much

1:51:42

time online, consuming other people's content, that's

1:51:45

time away. Like, they were like,

1:51:47

we've succeeded with this app. I understand

1:51:49

that, but I don't think government

1:51:51

should get involved. No, I do. You

1:51:54

think though that you've posed an interesting

1:51:56

question, which is if you have

1:51:58

apps that are specifically designed to boost

1:52:00

engagement independent of what's actually being consumed,

1:52:03

is there a point where there

1:52:05

should be regulation or there should be

1:52:07

limits? The same way there were limits

1:52:09

on advertising to kids. or the

1:52:11

content of children's cartoon programs. But this

1:52:14

is why when I get to talk

1:52:16

about strict scrutiny, it matters because

1:52:18

if you're going to speak in broad

1:52:20

terms of is there a point, those

1:52:23

levels of scrutiny help you figure

1:52:25

out where constitutionally you can get to

1:52:27

that point. Because sometimes things can survive

1:52:29

strict scrutiny where we have a

1:52:31

compelling state interest, a compelling reason for the government

1:52:34

to act and they've narrowly tailored the

1:52:36

way they're going to act so

1:52:38

that it like best deals with the

1:52:40

problem they're trying to solve and

1:52:42

without collateral damage to the rest of

1:52:45

the rights. That's why strict scrutiny, when

1:52:47

it comes to the First Amendment

1:52:49

and free speech principles is so important.

1:52:51

And that's why I'm like apoplectic as

1:52:54

the word is about what happened

1:52:56

with TikTok, because at the DC Circuit

1:52:58

they said they were using strict scrutiny,

1:53:00

but they used something that was

1:53:02

much, much lower. They're like, yeah, the

1:53:05

government had a reason. I'm sure everything

1:53:07

they did was fine and they

1:53:09

didn't look for any narrow tailoring. And

1:53:11

then what the Supreme

1:53:14

Court did is they even said, we

1:53:16

don't know what the reason is, but

1:53:18

I'm sure it's a good one. Yeah,

1:53:20

I mean, they did a little

1:53:22

bit better defending it, but I don't

1:53:25

want to give them that many,

1:53:27

many props. And then what the Supreme

1:53:29

Court did, which I

1:53:32

think is both good and bad,

1:53:34

is this:

1:53:36

They still allowed it, but they

1:53:38

used intermediate scrutiny. And I'm really

1:53:40

concerned that they used the lesser standard

1:53:43

because I think the situation really

1:53:45

called for the higher standard, but

1:53:47

they didn't ruin the higher standard by

1:53:49

doing what the DC Court of Appeals

1:53:52

had done, which was to use

1:53:54

the very difficult standard and then just

1:53:56

make it open season, in which case

1:53:58

we don't have a really strict

1:54:00

standard anymore. The bill was introduced by Ted Cruz and Hawaii's

1:54:03

Brian Schatz, a Democrat. And Schatz said,

1:54:05

when you got Ted Cruz and

1:54:07

myself in agreement on something, you pretty

1:54:09

much captured the ideological spectrum. It's a

1:54:12

sign that he should be rethinking

1:54:14

what he's doing. No, no, I'm afraid

1:54:16

not. I honestly think, I think

1:54:18

this law should go into effect.

1:54:20

People want it, and it's going to

1:54:23

help parents out. I don't see

1:54:25

a problem with it. We have arbitrary

1:54:27

laws for youth all the time.

1:54:29

They can't buy

1:54:31

cigarettes. They can't join the army. They

1:54:34

can't drive a car. We have tons

1:54:36

of these restrictions in place for

1:54:38

youth. So if people want this. One

1:54:40

of the things that I find really

1:54:43

interesting, the feedback that I get

1:54:45

from kids, is that they're actually kind

1:54:47

of relieved when teachers take their phones

1:54:49

in school. And the feedback that

1:54:51

my daughter and her friends had with

1:54:54

TikTok ban was good. The people we

1:54:56

know who are always on TikTok

1:54:58

are super unhappy anyway. Maybe this gives

1:55:01

them a chance to reset. It was

1:55:03

really interesting to me that the

1:55:05

law, the proposed law seems to be

1:55:07

more vibes rather than data, like,

1:55:09

you know, with Schatz and Cruz, not

1:55:12

supposed to be how to run the

1:55:14

railroad. Yeah, it's more vibes than

1:55:16

data, but I would love to see

1:55:18

some data that backs it up. When you

1:55:21

have the people who are being

1:55:23

targeted by the law saying, yeah, get the

1:55:25

phones out of my hands, I want

1:55:27

an excuse not to have to

1:55:29

have them in school. Find us the

1:55:32

data that explains why the vibe. Why

1:55:34

the vibe? It is there. We

1:55:36

do have the data. We have the

1:55:38

data that suggests that the negativity

1:55:41

is maybe overstated and we also

1:55:43

have the data to suggest that there's

1:55:45

a ton of positive. And so basically

1:55:47

now we're looking at a season

1:55:49

where we are about to lose our

1:55:52

constitutional democracy and everybody needs to

1:55:54

be able to kind of come together

1:55:56

and push back against it. And we're

1:55:58

talking about taking away the tools

1:56:00

to do it. That's a problem. If we

1:56:03

think dictatorship is bad, why would we

1:56:05

make it that the dictator can

1:56:07

lawfully suppress the ability of the people

1:56:09

to come together and speak out against

1:56:12

it? And that's what the law

1:56:14

is. Even if you're under 13? Well,

1:56:16

it isn't about under 13. It's about

1:56:18

13 to 18. So you actually

1:56:20

are dealing with people who are growing

1:56:23

into adults, very imminently adults. There's no

1:56:25

gradation between ages. And at 13

1:56:27

are... world-aware enough to be able

1:56:29

to start to make decisions both about

1:56:32

their own well-being and also the

1:56:34

world that they're living in. Australia's law

1:56:36

is 16 and under, and I'd be

1:56:39

very, it goes into effect later

1:56:41

this year, I'd be very interested to

1:56:43

see what happens there. Australia does

1:56:45

not have a First Amendment or anything

1:56:47

like it, so a lot more can

1:56:50

happen. Do they have strict scrutiny?

1:56:52

No, they don't even have like an

1:56:54

embodied right of free speech. They have

1:56:56

wallabies though, that's... What's with the

1:56:58

First Amendment here, though? Like, I don't

1:57:01

understand. Is it for, we're talking about

1:57:03

First Amendment for the corporations or,

1:57:05

because as a 14-year-old, I'm not sure,

1:57:07

what right do you have to post

1:57:10

on Instagram? 14-year-olds have First Amendment

1:57:12

rights. This is not new ground

1:57:14

to tread. But to post on Instagram?

1:57:16

Sure. But to post on Instagram,

1:57:18

I have a right to moderate that free speech

1:57:21

then. There's also First Amendment rights to

1:57:23

moderate that speech. That the platform

1:57:25

is not. At the end of the

1:57:27

day, though, their speech is moderated,

1:57:29

which is not protecting their personal speech. It's

1:57:32

moderated by private parties. That's the difference.

1:57:34

Not by the government. Right. The

1:57:36

First Amendment speaks to the government: it cannot

1:57:38

come in and decide what speech is

1:57:41

good and what speech is bad.

1:57:43

But a private party, like I just

1:57:45

had to delete a troll on my

1:57:47

Facebook page. Like I get to

1:57:49

do that because I'm a private person.

1:57:52

right that would be uh... that's against

1:57:54

the Constitution of the United States,

1:57:56

such as it is, if you can keep

1:57:58

it. You are allowed to card

1:58:01

me, but he's not allowed to come

1:58:03

in and card me. And he's not

1:58:05

allowed to say if he's carded me

1:58:07

that, oh, you, you. Is Elon Musk

1:58:09

allowed to come in here? Now, there's

1:58:11

a really interesting thing about whether

1:58:13

he has now merged himself in

1:58:15

a way that he is a

1:58:17

state actor. So, well, no, he

1:58:19

wouldn't. I hope he's a state actor,

1:58:21

because if he's not, how come he

1:58:23

has access to our private information

1:58:26

in the Treasury Department

1:58:28

and the OPM. I hope he's

1:58:30

a state actor. You think he's not

1:58:32

a state actor? I don't know if

1:58:34

I necessarily want him to be a

1:58:36

state actor. I don't know if he

1:58:39

necessarily is, but even if he's a

1:58:41

state actor, he's not state actor enough.

1:58:43

If he's not, then he's in violation

1:58:45

of the Computer Fraud and Abuse Act,

1:58:47

as you point out. I have been

1:58:50

touting that as a legal theory. And

1:58:52

one of the reasons why I've been

1:58:54

pushing it is because, if

1:58:57

we can impose civil liability and get

1:58:59

money damages from any of the

1:59:01

people who are helping with this nonsense

1:59:03

that are then enforceable not in federal

1:59:06

court but in state courts, including blue

1:59:08

states where they may have assets that

1:59:10

can scare the crap out of them

1:59:12

hopefully enough that they back off. Hold

1:59:15

this thought, because I do want to talk

1:59:17

about Doge in just a minute. We got to

1:59:19

take a break. Do we have to? Not for

1:59:21

long. You can exercise your free

1:59:24

speech rights. Just a little bit.

1:59:26

We don't have to. We could

1:59:28

skip right over it. I feel

1:59:31

like there is some, there is

1:59:33

a tech angle. A little,

1:59:35

a huge tech angle. Absolutely.

1:59:37

We'll stick with that part.

1:59:40

We won't, we won't talk

1:59:42

politics so much. Kathy Gellis

1:59:44

is here. She is an

1:59:46

impassioned advocate of

1:59:49

strict scrutiny. Okay, for

1:59:51

fundamental rights like

1:59:53

free speech. Yeah, no. And so

1:59:55

as I understand it now, strict

1:59:58

scrutiny really is: you have First

2:00:00

Amendment rights, but there are some

2:00:02

things that might curtail those rights,

2:00:05

and it seems sensible that that

2:00:07

should be a very high bar,

2:00:09

right? Exactly. And that's all you're

2:00:12

saying. Strict scrutiny is the highest

2:00:14

possible bar before you take away

2:00:16

somebody's free speech rights. Right, because

2:00:19

we don't want to just say,

2:00:21

oh, well, we, like, you know,

2:00:24

I was kind of where the

2:00:26

discussion was going. Well, we've made

2:00:28

exceptions for advertisement length and this,

2:00:31

then the other thing, and it's

2:00:33

like the fact that we have

2:00:36

made exceptions doesn't mean we just

2:00:38

get to always ignore it. What

2:00:40

it means is that we looked

2:00:43

at each exception and kind of

2:00:45

quizzed it for what is the

2:00:47

reason and is it narrowly tailored

2:00:50

enough and then based on what

2:00:52

the answers to those questions were

2:00:55

decided whether no, unconstitutional you don't

2:00:57

get to do it or okay

2:00:59

fine we will make this very

2:01:02

very narrow exception but just because

2:01:04

we made an exception once doesn't

2:01:06

mean that we always get to

2:01:09

make exceptions. They all have to

2:01:11

be tested. I think that's a

2:01:14

very good way to put it.

2:01:16

Thank you, Kathy. Also here Daniel

2:01:18

Rubino, who thinks kids are stupid

2:01:21

and should be off social media

2:01:23

right away. Do you have kids,

2:01:26

Daniel? Oh no. I'm quoting that

2:01:28

whole mess. Editor in Chief, smart

2:01:30

man, Editor in Chief, Windows Central.

2:01:33

You know, I mean, your argument,

2:01:35

completely credible. This is why this

2:01:37

is so tough this stuff. It

2:01:40

is not an easy thing to

2:01:42

understand or act upon. And that's

2:01:45

why we, you know, we chew

2:01:47

it out here. That's the whole

2:01:49

point of this. Lisa Schmeiser is

2:01:52

also here from No, no, excuse

2:01:54

me, No Jitter. No, no, no,

2:01:56

no, no, I had a little

2:01:59

jitter in my No Jitter, a

2:02:01

little jitter, just because of this

2:02:04

Camoocchio, I'm very happy to say,

2:02:06

ExpressVPN. Have you ever browsed

2:02:08

in incognito mode? Probably not as

2:02:11

incognito as you might think. In

2:02:13

fact, Google even admitted it. They

2:02:15

just settled a $5 billion lawsuit

2:02:18

after being accused of secretly tracking

2:02:20

users in incognito mode. Google's defense?

2:02:23

Oh incognito doesn't mean invisible. Well

2:02:25

in fact all your online activity

2:02:27

is still 100% visible to third

2:02:30

parties unless you use ExpressVPN.

2:02:32

The only VPN I use and

2:02:35

trust. You better believe when I

2:02:37

go online especially when I'm traveling

2:02:39

in airports, coffee shops, in other

2:02:42

countries, ExpressVPN is my go-to. Why

2:02:44

does everyone need Express VPN? Well,

2:02:46

without ExpressVPN, third parties can

2:02:49

still see every website you visit,

2:02:51

even in incognito mode. That means

2:02:54

your ISP, your mobile network provider,

2:02:56

the admin of that Wi-Fi network

2:02:58

you've connected to. Why is ExpressVPN the

2:03:01

best? Because it hides your IP

2:03:03

address, rerouting 100% of your traffic

2:03:05

through secure encrypted servers. Easy to

2:03:08

use, just fire up the app,

2:03:10

click one button to get protected,

2:03:13

it works on all devices: iPhones,

2:03:15

Android phones, laptops, tablets, and more.

2:03:17

So you can stay private on

2:03:20

the go. And it's rated number

2:03:22

one by top tech reviewers like

2:03:24

CNET and The Verge. Protect your

2:03:27

online privacy today. Visit expressvpn.com/twit. That's

2:03:29

E-X-P-R-E-S-S-V-P-N dot com slash twit. You get an

2:03:32

extra four months free when you

2:03:34

buy a two-year package. expressvpn.com/twit,

2:03:36

and we thank you so much

2:03:39

for supporting this show. This contentious

2:03:41

program actually it's really good. I

2:03:44

don't I hate it when a

2:03:46

show everybody just kind of yeah,

2:03:48

well, whatever I think it's really

2:03:51

important to fight for these issues

2:03:53

We really have a fight on

2:03:55

our hands right now, and

2:03:58

we need to and we need

2:04:00

to kind of think about it

2:04:03

and talk about it There's a

2:04:05

little I have a little concern

2:04:07

about the young people Elon is

2:04:10

putting in our government institutions.

2:04:12

Now some of them are really

2:04:14

geniuses. One

2:04:17

of the Doge interns is the

2:04:19

kid who decoded those scrolls. Remember

2:04:22

there were, it was

2:04:24

the $700,000 Vesuvius Challenge ancient scrolls

2:04:26

that had been buried in volcanic

2:04:29

dust, the Herculaneum scrolls, and he

2:04:31

used AI and scanning techniques to

2:04:34

actually read those. I mean that's

2:04:36

a pretty smart kid, 23-year-old Luke

2:04:38

Farritor. So there's a guy, smart

2:04:41

guy, but some of the team

2:04:43

is not maybe the best in

2:04:45

the world. One of the Doge

2:04:48

teens is a former, this is

2:04:50

from Krebs on Security, a former denizen, that's

2:04:53

a loaded word, of the Com,

2:04:55

which is an archipelago of Discord

2:04:57

and Telegram chat channels, Brian Krebs

2:05:00

writes, that function as a kind

2:05:02

of distributed cyber criminal social network.

2:05:04

And in fact, there's some evidence

2:05:07

that this kid, Edward Coristine, who

2:05:09

was, maybe you've seen him in

2:05:12

the news as Big Balls, he

2:05:14

founded, when he was 16 years

2:05:16

old, Tesla.Sexy LLC, which

2:05:19

controls dozens of web domains, including

2:05:21

at least two Russian registered domains.

2:05:23

One of those domains, which is

2:05:26

still active, offers a service. This

2:05:28

is the kid who's in the

2:05:31

Treasury Department right now, offers a

2:05:33

service called Helfie, which is an

2:05:35

AI bot for Discord servers targeting

2:05:38

the Russian market. It's something that

2:05:40

would have come up in a

2:05:43

security clearance review, but of course

2:05:45

there are no security clearance reviews

2:05:47

for the Doge kids. He also,

2:05:50

someone using a Telegram handle tied

2:05:52

to him, solicited a DDoS-for-

2:05:54

hire service in 2022, and he

2:05:57

worked for a short time, got

2:05:59

fired from as it turns out,

2:06:02

a company that specializes in protecting

2:06:04

companies from DDoS attacks,

2:06:06

but this company, Packetware,

2:06:09

or DiamondCDN,

2:06:11

was actually full of

2:06:13

former hackers. DDoS experts, in

2:06:15

fact, because they had

2:06:18

set up quite a few

2:06:20

DDoS operations. Coristine's

2:06:23

LinkedIn profile said he

2:06:25

worked at an anti-DDoS

2:06:28

company called Path Network.

2:06:30

Wired, you might have read

2:06:32

the Wired article, described it

2:06:35

as a network monitoring firm

2:06:37

known for hiring reformed

2:06:39

black hat hackers. Anyway,

2:06:41

it goes on and on.

2:06:44

I recommend reading the Krebs

2:06:46

article because there's a

2:06:48

lot of stuff that would be

2:06:50

at the very least cause for

2:06:52

concern. All of it. All of

2:06:54

it. I think there's I think

2:06:56

there's nothing that should be, you

2:06:59

know, if they just the fact

2:07:01

that they didn't maybe smash everything

2:07:03

is no relief. The entire incursion

2:07:05

and the mode and the method

2:07:07

that the incursion was made, if

2:07:09

not outright unlawful, was at least

2:07:11

unwise and, I think,

2:07:13

unlawful because we have obsessed about

2:07:15

keeping our most sensitive systems as

2:07:17

protected as possible. We've passed laws

2:07:19

that tried to punish incursions into

2:07:21

them that were unauthorized. And what

2:07:23

we have basically done is handed

2:07:25

the keys to the kingdom of

2:07:27

our most sensitive systems and our

2:07:29

most sensitive data, and we've handed it

2:07:31

to people who did not have appropriate

2:07:34

authorization in the way the law requires.

2:07:36

That is bad. That is a problem.

2:07:38

It has caused harm, and we just

2:07:40

don't necessarily know yet the full measure

2:07:42

of that harm, but we know it's

2:07:44

accrued, and we know it's accruing.

2:07:47

To add to that, we

2:07:49

had heard that the resignation,

2:07:52

the fork in the road

2:07:54

resignation emails would not apply

2:07:56

to CISA staffers. In

2:07:58

fact, it did. And this is

2:08:00

CISA, of course, very

2:08:02

important part of the

2:08:04

Department of Homeland Security,

2:08:07

the Cybersecurity and Infrastructure Security Agency.

2:08:09

Initially excluded from those fork-in-the-road deferred

2:08:11

resignation offers. However, on Wednesday, some

2:08:13

CISA staffers were given the offer,

2:08:15

that gives them one day to

2:08:17

decide, by the way, just hours,

2:08:19

in fact, to decide whether to

2:08:22

accept it. This is according to

2:08:24

three sources who spoke to NPR

2:08:26

and condition of anonymity. Sisa,

2:08:28

I think everybody who

2:08:31

listens to us knows

2:08:33

is extremely important to

2:08:36

national security. And perhaps

2:08:38

a problem for those

2:08:40

Doge staffers who've been

2:08:43

entering the State Department,

2:08:45

the Treasury Department,

2:08:48

OPM, USAID. Now they're

2:08:50

GSA and NIH and I

2:08:52

think their sights are set

2:08:55

on basically everything. Yeah, I mean, so

2:08:57

remember that the Doge, no

2:08:59

one associated with Doge has

2:09:01

any loyalty to national security

2:09:04

whatsoever. They're not, they're not

2:09:06

operating in the interest of

2:09:08

the country. They're operating in

2:09:10

the interest of the whims

2:09:12

of Elon Musk. And what

2:09:15

he's doing is the same

2:09:17

thing that was extensively documented

2:09:19

in the excellent Ryan Mac,

2:09:21

Kate Conger book, Character Limit,

2:09:23

where they talked about how

2:09:25

when he took over Twitter,

2:09:27

the primary driving force

2:09:30

behind everything he did

2:09:32

was to reduce, to eliminate,

2:09:34

to get rid of things

2:09:36

and call it change and

2:09:38

call it transformation, when instead

2:09:41

all it was just breaking

2:09:43

things with no understanding of

2:09:45

what he was breaking and no

2:09:47

concern for consequences. That's what's happening here too.

2:09:49

And we can talk about the kid

2:09:51

who decoded the scrolls, but just because

2:09:54

you happen to be very good

2:09:56

at one specific data problem doesn't

2:09:58

make you smart at solving

2:10:00

a question that shouldn't have

2:10:02

to be asked, which is how can

2:10:04

we eliminate entire departments of the

2:10:06

government? Yeah, exactly. Yeah. So we

2:10:08

have people in our chat room who

2:10:11

say they're acting by banning TikTok, by

2:10:13

bending the Constitution possibly to a breaking

2:10:15

point because we were worried about national

2:10:17

security and China slurping up its data. And

2:10:20

so then when we just went and

2:10:22

handed all of the data to the

2:10:24

gang of every nation, Nazis, you know, exactly.

2:10:26

Yeah. So we have people in our chat

2:10:29

room who say they're acting in

2:10:31

our best interest despite your disapproval.

2:10:33

They say, it's, you know, it's nothing

2:10:35

scares Democrats more than full transparency.

2:10:38

And I don't think I would,

2:10:40

I would, but what about this

2:10:42

is transparent? Exactly what, what in

2:10:44

any part of this decision process

2:10:46

has been transparent. Also, it's a

2:10:48

fairly ridiculous, the government is

2:10:51

wasteful, screaming about AI and... The government's

2:10:53

not wasteful though. If you take a look

2:10:55

at it, the one part of the government

2:10:57

that consistently fails audit is the Department

2:11:00

of Defense. That's where your waste is.

2:11:02

That's where the lack of transparency is.

2:11:04

I don't see those sweeping through there.

2:11:06

They're going after places that are teeny

2:11:09

tiny percentages of the U.S. budget. I disagree.

2:11:11

But I would stipulate the government can

2:11:13

be wasteful and has been wasteful and

2:11:15

undoubtedly there are government programs that are

2:11:18

pork that are boondoggles that Congress

2:11:20

put in there to benefit, you

2:11:22

know, their constituency. But there's a

2:11:24

legal way, there's a legal way.

2:11:27

to go through these. And there's

2:11:29

an illegal way to go through

2:11:31

these. This is what happens when

2:11:33

you have 40 years of Democrats,

2:11:35

Republicans, exploiting government, growing the federal

2:11:38

government, despite the fact Republicans forever

2:11:40

were like against federal expansion, you know,

2:11:42

against large government, and then they

2:11:44

had gone to power, of course,

2:11:46

and they spent like crazy as

2:11:48

well and continued to grow. And we

2:11:51

grew these institutions and we grew these

2:11:53

bureaucracies and the number one job of

2:11:55

a bureaucracy is to protect itself, to

2:11:58

make sure it's still needed. Are you

2:12:00

in favor of this? Is this a

2:12:02

good way to cut the fat? I'm

2:12:04

in favor of it theoretically. Like this

2:12:06

idea of going through and cleaning out

2:12:08

and getting rid of all stuff in

2:12:10

the government, just being super aggressive,

2:12:12

I'm for it. The way they're doing

2:12:14

it now? No, I'm not. The way

2:12:16

that I don't care for Elon Musk,

2:12:18

I don't trust him. I feel like

2:12:20

a lot of the things that they're going after

2:12:22

are self-serving. And I'm really worried about

2:12:24

that. Firing the director of the FAA,

2:12:26

for instance, because he was the one

2:12:28

who stopped SpaceX's launches because they were

2:12:30

unsafe. Congress is the problem. Oh,

2:12:33

sorry, go ahead. I was just

2:12:35

going to say, the issue here

2:12:37

is, one thing I learned, you

2:12:39

know, major in political science many

2:12:41

years ago, but you study the

2:12:43

French Revolution, the problem is is

2:12:45

whenever you have a system going

2:12:47

to one extreme for a very

2:12:49

long time, is there's always a

2:12:51

counter-revolution to it, which is often

2:12:53

just as extreme and negative. And

2:12:55

that's what we're seeing. Yeah, the

2:12:57

Reign of Terror was not a

2:12:59

great improvement over Louis XVI. Right.

2:13:01

I wouldn't stipulate to that pretense that

2:13:03

that's what's going on here. Even if

2:13:06

Congress was wrong, its job is to

2:13:08

raise that money and decide where it

2:13:10

gets spent and that's supposed to happen

2:13:12

until we decide to elect different members

2:13:15

of Congress. We didn't, we didn't, we

2:13:17

did that slightly, but we didn't do

2:13:19

that very much. But that's only part of the

2:13:21

thing, because you're talking about, oh,

2:13:24

well, maybe we've overspent, but we're also

2:13:26

looking at Keynesian economics, which talks about

2:13:28

that what we spend also has beneficial

2:13:30

effects on the economy, and the real

2:13:32

question really needs to be whether in terms

2:13:35

of when we evaluate whether we're overspending or

2:13:37

not, is whether we're getting value for the

2:13:39

money, and whether it's producing value for America's own

2:13:41

interest. I mean, even if you just do

2:13:44

it with how much we spend on, you

2:13:46

know, we pay American farmers for their rice

2:13:48

so that we can feed hungry people around

2:13:50

the world, which not only gives money to

2:13:53

the farmers, but also make sure that

2:13:55

we don't have starving people and people like

2:13:57

America a little bit more. So the only

2:13:59

question of whether it's waste is, are we

2:14:01

spending money and getting no value

2:14:03

out of it? But we're clearly

2:14:06

spending the money and getting value

2:14:08

out of it. So the whole

2:14:10

idea of reviewing this- All right,

2:14:12

I don't want to debate that

2:14:14

because honestly now this is political.

2:14:16

So let's go back to the

2:14:18

technology. Thomas Shedd, who was appointed

2:14:20

Technology Transformation Services director and ally

2:14:22

of Elon Musk, told GSA workers

2:14:24

that the agency's new administrator is

2:14:26

pursuing an AI-first strategy. It's

2:14:28

not a question of AI coming

2:14:30

in and finding where the fat

2:14:32

is. It's actually AI running the

2:14:34

agency. The GSA, the General Services

2:14:36

Administration, what does it do? It

2:14:39

does a lot. In what he

2:14:41

described as an AI-first strategy,

2:14:43

sources say Shedd provided a handful

2:14:45

of examples of projects the GSA

2:14:47

acting administrator is looking to prioritize,

2:14:49

including the development of AI coding

2:14:51

agents that would be available for

2:14:53

all agencies. GSA provides government services,

2:14:55

right? He made it clear that

2:14:57

he believes much of the work

2:14:59

at TTS and the broader government,

2:15:01

particularly around finance tasks, could be

2:15:03

automated, automate the accounting, a cybersecurity

2:15:05

expert told Wired on Monday, eh,

2:15:07

this raises red flags. Automating

2:15:09

the government's not the same as,

2:15:11

well, I don't know, self-driving cars.

2:15:14

People, especially people who aren't experts

2:15:16

in the subject domain, coming into

2:15:18

projects, often think this is dumb,

2:15:20

and then find out how hard

2:15:22

the thing is, really. Honestly, I

2:15:24

wouldn't let Tesla drive me around.

2:15:26

I definitely don't want a self-driving

2:15:28

government. No, I mean, we're government

2:15:30

by the people for the people

2:15:32

and this is... We had systems for

2:15:34

how it was supposed to work

2:15:36

and we have systems for how

2:15:38

to change it if we don't

2:15:40

like the way it's working and

2:15:42

none of them are this. This

2:15:44

is something else. This is a

2:15:47

lot of power that was usurped

2:15:49

by people who don't have the

2:15:51

authority to usurp as much power

2:15:53

as they've helped themselves to. Can the

2:15:55

president give them that authority? No.

2:15:57

I think the answer is no,

2:15:59

there are statutes that constrain his

2:16:01

power, plus the text of the

2:16:03

Constitution itself, which tells him that

2:16:05

his job is to enforce the

2:16:07

law, not to ignore it, between

2:16:09

the constitutional text and the law

2:16:11

that's actually been written that prescribes

2:16:13

how he can use his power,

2:16:15

he doesn't have the power to

2:16:17

say, yeah, you guys are fine,

2:16:20

go do whatever you want, because

2:16:22

there are laws that say that's

2:16:24

not how it works, because the

2:16:26

Congress, representing the people, is supposed to

2:16:28

be able to... observe and keep

2:16:30

track of what's going on and

2:16:32

make sure that the power is

2:16:34

being wielded in ways that the

2:16:36

people approve of. It is concerning

2:16:38

that it is happening invisibly. It's

2:16:40

happening invisibly. Without any transparency or

2:16:42

without any transparency, and

2:16:44

that's what the allegations that Doge

2:16:46

is illegal is that there were

2:16:48

rules about if you wanted to

2:16:50

empower certain people to do certain

2:16:52

things. There were rules that had

2:16:55

to be followed manifest in different

2:16:57

laws about which people and how

2:16:59

do you empower them and what

2:17:01

boxes do you have to check

2:17:03

and it wasn't just bureaucracy it

2:17:05

was, it was to make

2:17:07

sure that there would be that

2:17:09

supervisory capacity to figure out how

2:17:11

power was getting wielded and none

2:17:13

of what Doge is empowered to

2:17:15

do complied with any of those

2:17:17

rules and that matters for what

2:17:19

I wrote about on Techdirt, which

2:17:21

gets back into the technical which

2:17:23

is I think that the authority

2:17:25

that the Doge brothers have wielded

2:17:28

in these systems potentially makes them

2:17:30

personally liable for Computer Fraud and

2:17:32

Abuse Act violations because I think

2:17:34

it is inherently without authorization that

2:17:36

they've been in there and demanded

2:17:38

that access, because the

2:17:40

power that was given to them

2:17:42

was not lawfully given to them

2:17:44

in which case they're there like

2:17:46

any other wrongful hacker would be

2:17:48

there they just happened to have

2:17:50

gone through the front door and

2:17:52

if that It's true. The problem

2:17:54

with all this, though, is it's

2:17:56

perception. There's a reason why people

2:17:58

aren't in the streets protesting all

2:18:01

this right now is because no

2:18:03

one likes the federal government. And

2:18:05

a lot of people don't believe

2:18:07

it works for them. And this

2:18:09

is a perception and messaging issue, right,

2:18:11

because people aren't upset that a

2:18:13

lot of these institutions are being

2:18:15

undermined, destroyed. Granted, we haven't seen

2:18:17

the ramifications of it which I

2:18:19

think could be definitely they might

2:18:21

have a different opinion when FEMA

2:18:23

doesn't kind of bail them out

2:18:25

especially for the people who are

2:18:27

Especially for people who are at

2:18:29

the lower rungs of society, I

2:18:31

think will be mostly affected. But

2:18:33

the reason why this stuff is

2:18:36

happening is because Democrats have not

2:18:38

done anything the last 20 years

2:18:40

to help stem government growth, make it

2:18:42

more efficient, bring it down in

2:18:44

size. There's a lot of talk.

2:18:46

And people kept electing Democrats. And

2:18:48

the thing is they were in

2:18:50

the street. There was a protest

2:18:52

in front of Treasury that took

2:18:54

place in the street. And with

2:18:56

Congress trying to get in the

2:18:58

door and being shut out of

2:19:00

a building that Congress funds to

2:19:02

shut out the officials that we've

2:19:04

duly elected to run our country

2:19:06

and not be allowed to be

2:19:09

in the building that they fund.

2:19:11

Like that is a problem there

2:19:13

and you have people on the

2:19:15

street and we're... Daniel has a

2:19:17

point. There has been a huge

2:19:19

outcry about this. There's been, you

2:19:21

know, working on it. But I

2:19:23

think Daniel makes a really good

2:19:25

point where there's a... public perception

2:19:27

that the Democrats are the problem

2:19:29

because they made this government that's

2:19:31

not working so oh thank God

2:19:33

there's change right any change would

2:19:35

be better. People just want

2:19:37

action, kind of. But you

2:19:39

know it goes back to the

2:19:42

whole thing in Character Limit

2:19:44

where Elon Musk is like look

2:19:46

at me I'm changing Twitter you

2:19:48

are but you're not improving it

2:19:50

you're just taking things away and

2:19:52

then claiming that that's an improvement

2:19:54

those are two completely different things. I

2:19:56

do not like Trump, okay, but

2:19:58

no, no, no, no, this is his

2:20:00

second term in office here.

2:20:02

He's done a lot. Now, it's

2:20:04

been very busy. We can

2:20:06

say that's a judgment call, whether

2:20:08

it's good or bad but he's

2:20:10

creating action, and for

2:20:12

a lot of Americans they've just

2:20:14

seen no matter who we've elected

2:20:17

year after year it's always the

2:20:19

same they don't see change in

2:20:21

their lives and I think that's

2:20:23

what people are voting for or

2:20:25

wanting to see now whether it's

2:20:27

going to happen it's going to

2:20:29

benefit them I'm you know real

2:20:31

skeptical of that right I think

2:20:33

this could be going in

2:20:35

a lot of bad directions. But

2:20:37

that's why people are going along.

2:20:39

I completely agree with you. There's

2:20:41

a complete dissatisfaction with government and

2:20:43

a sense of helplessness. And it's

2:20:45

pretty universal. I don't blame the

2:20:47

Democrats, by the way. There were

2:20:50

a lot of elections won by saying,

2:20:52

you know, that

2:20:54

the worst scariest words in the

2:20:56

English language are, I'm from the government,

2:20:58

and I'm here to help, which is

2:21:00

a funny and probably popular thing

2:21:02

to say, but I got to

2:21:04

tell you if Hurricane Helene devastated

2:21:06

your community and FEMA came to

2:21:08

help feed you and get you

2:21:10

housed, that is help. And you're

2:21:12

right, Daniel, I understand the complete dissatisfaction

2:21:14

and I think you're exactly right.

2:21:16

I also think I fear the

2:21:18

consequences. We'll find out. We're going

2:21:20

to find out. We're going to

2:21:22

get to see. We're going to

2:21:25

take a break. Enough of that.

2:21:27

We have other things to talk

2:21:29

about, including Xbox sales. What the

2:21:31

hell man? I threw that in

2:21:33

for you Daniel. You're watching this

2:21:35

week in tech. Lisa Schmeiser, Daniel

2:21:37

Rubino. Kathy Gellis, great to have

2:21:39

all three of you. This has

2:21:41

been an excellent conversation. I think

2:21:43

a very important one too. Our

2:21:45

show today brought to you by

2:21:47

NetSuite, a name you probably know

2:21:49

well. What does the future hold?

2:21:51

I mean we're just talking about

2:21:53

it. Especially if you're in business.

2:21:55

Ask nine experts, you're gonna get

2:21:58

10 answers. Rates are gonna rise.

2:22:00

Oh no, wait, but maybe rates

2:22:02

are gonna fall. Oh, inflation's up.

2:22:04

Oh no, inflation's down. Can someone

2:22:06

please invent a crystal ball? Until

2:22:08

then, over 41,000 businesses have

2:22:10

future-proofed their business with NetSuite

2:22:12

by Oracle, the number one cloud

2:22:14

ERP. It brings accounting, financial management,

2:22:16

inventory, and HR into one fluid

2:22:18

platform. With one unified business management

2:22:20

suite, there's one source of truth,

2:22:22

giving you the visibility and control

2:22:24

you need to make quick decisions

2:22:26

and with real-time insights and forecasting

2:22:28

you're peering into the future with

2:22:31

actionable data. It is a crystal

2:22:33

ball. When you're closing the books

2:22:35

in days, not weeks, you're spending

2:22:37

less time looking backward, more time

2:22:39

on what's next. Whether your company's

2:22:41

earning millions or even hundreds of

2:22:43

millions, NetSuite helps you respond

2:22:45

to immediate challenges and seize your

2:22:47

biggest opportunities. If I had needed

2:22:49

this product, it's what I'd use.

2:22:51

Speaking of opportunities, you can download

2:22:53

the CFO's guide to AI and

2:22:55

machine learning. Now this is something

2:22:57

you need, and it's free at

2:22:59

NetSuite: netsuite.com/twit. Get the CFO's

2:23:01

guide to AI and machine learning;

2:23:03

this is something we all need

2:23:06

to understand better, at netsuite.com/twit. Thank

2:23:08

you, NetSuite, for supporting the

2:23:10

show and thank you dear listener

2:23:12

dear viewer, for supporting us

2:23:14

by going to that address, netsuite.com/twit.

2:23:16

A hearty thank you to all of

2:23:18

our ClubTwit members who make this

2:23:20

show and all the shows we

2:23:22

do possible. Yeah, we have ads,

2:23:24

but ads don't cover all the

2:23:26

costs. And that's where ClubTwit really

2:23:28

makes a big difference. We're so

2:23:30

glad to have 12,000 plus members.

2:23:32

It's only $7 a month. The

2:23:34

reason they're members, you get ad

2:23:36

free versions of all the shows.

2:23:39

You're giving us money. We don't

2:23:41

have to show you ads. You

2:23:43

wouldn't even see this little... little

2:23:45

moment of begging. You also get

2:23:47

special programming that we don't do

2:23:49

anywhere else. We had a great

2:23:51

Chris Marquardt photo segment last week

2:23:53

on Thursday and that's on the

2:23:55

Twit Plus feed available to club

2:23:57

members. We've got other events

2:23:59

coming up, Michael's Crafting Corner, Stacey's

2:24:01

Book Club, I think it's a

2:24:03

great way to join a fantastic

2:24:05

community. And don't forget the Discord

2:24:07

place you can hang out with

2:24:09

all the other ClubTwit members.

2:24:12

It really is a good place

2:24:14

for conversation, even when the shows

2:24:16

aren't on, about all of the

2:24:18

things geeks are interested in. If you're

2:24:20

interested, I hope you are. It

2:24:22

makes a big difference to us.

2:24:24

Keeps the shows flowing. It keeps

2:24:26

our staff paid. Keeps the lights

2:24:28

on. twit.tv/clubtwit.

2:24:30

And thank you in advance.

2:24:32

Because of our club members,

2:24:34

we're able to stream our

2:24:36

live shows on eight different

2:24:39

platforms now: the ClubTwit

2:24:41

Discord, YouTube, Twitch, LinkedIn,

2:24:43

TikTok, and Kick, and I'm

2:24:45

missing one. Did I say TikTok?

2:24:47

Did I say X? Yeah, everywhere. So

2:24:49

watch the shows live if you

2:24:52

want. This show is

2:24:54

Sunday afternoon. Usually 2

2:24:56

to 5 p.m. Pacific.

2:24:58

We started a little

2:25:00

early today because of the

2:25:02

Super Bowl. That's 5

2:25:05

to 8 p.m. Eastern. That's

2:25:07

2200 UTC. If you want to

2:25:09

watch live, of course you don't

2:25:12

have to. You can always download

2:25:14

a copy of the show. Uh,

2:25:16

crypto. Huh? Did you say kick?

2:25:18

Yeah, kick. Well, where all the

2:25:21

Nazis are. So, uh. Funny. Let

2:25:23

me see if I can get

2:25:25

it right. Discord, YouTube, Twitch, I

2:25:28

think I left out, Twitch, TikTok,

2:25:30

x.com, LinkedIn, Facebook, and Kick. Yeah, there's

2:25:32

the eight. Wow. But basically we stream anywhere

2:25:34

that allows us to, you know, to

2:25:37

connect our pipe to their stream, because

2:25:39

we, you know, and we see the

2:25:41

chat, I see the chat from all

2:25:43

the different places. We don't get a

2:25:46

lot of people from Kick saying hi

2:25:48

in the chat, but I see a

2:25:50

lot of people from YouTube. It's great.
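
For anyone curious how one feed reaches that many platforms at once, the usual trick is to encode once and fan the stream out to several RTMP ingest points, which ffmpeg's tee muxer can do. What follows is only a minimal sketch with placeholder URLs and keys, an assumption about the general approach, not TWiT's actual pipeline:

    import subprocess

    # Hypothetical ingest endpoints; each platform issues its own stream key.
    OUTPUTS = [
        "rtmp://a.rtmp.youtube.com/live2/PLACEHOLDER_KEY",
        "rtmp://live.twitch.tv/app/PLACEHOLDER_KEY",
    ]

    # The tee muxer writes one encoded stream to every output at once.
    tee_spec = "|".join(f"[f=flv]{url}" for url in OUTPUTS)

    subprocess.run([
        "ffmpeg",
        "-i", "source.mp4",                # stand-in for the live capture feed
        "-c:v", "libx264", "-c:a", "aac",  # encode once for all destinations
        "-map", "0:v", "-map", "0:a",
        "-f", "tee", tee_spec,
    ], check=True)

The appeal of this design is that the encoding cost stays constant no matter how many destinations you add; only upload bandwidth grows.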

2:25:53

Right now, as for the

2:25:55

figures, oh, 1,249 people it

2:25:57

looks like on those eight

2:25:59

platforms. So the vast majority

2:26:01

of the audience doesn't watch

2:26:03

live, but if you want

2:26:05

to, it's nice. We get the,

2:26:08

I like the interaction. It's

2:26:10

very useful. Speaking of X, German

2:26:12

civil activists have won a

2:26:14

victory against Elon Musk's X.

2:26:16

They sued, saying we want information

2:26:18

so we can track the

2:26:20

spread of election-swaying information on

2:26:23

the network. This is an

2:26:25

urgent filing because Germany has a

2:26:27

national election February 23rd. And as

2:26:30

you know, Elon has kind

2:26:32

of weighed in on the election

2:26:34

in favor of the right-leaning,

2:26:36

the far-right

2:26:38

AfD party. The Berlin District Court said

2:26:40

you've got to give civil

2:26:42

rights groups the information so that

2:26:45

they can, the data so

2:26:47

they can track misinformation and

2:26:49

disinformation. X did not want to

2:26:51

do that. And if they

2:26:53

continue to not help, I don't

2:26:56

know what's gonna happen. They

2:26:58

have to pay $6,200. That can't

2:27:00

be right. Please tell me that's

2:27:02

not right. I think that's

2:27:04

just the court costs. There is

2:27:07

one court case that has

2:27:09

gone away. President Trump has

2:27:11

ended his legal challenge over his

2:27:13

ban on Twitter after January

2:27:15

6th. This was a long-running legal

2:27:18

fight. The notice was released

2:27:20

late Friday. Doesn't say how

2:27:22

the case was resolved. I guess

2:27:24

it was just one of

2:27:26

those handshake deals. Trump's lawyer declined

2:27:29

to comment. X has declined

2:27:31

to comment. It's over. The

2:27:33

one, you know, from Facebook also went

2:27:35

away with that $25 million.

2:27:37

Yeah, wasn't that something. I don't

2:27:39

think that's an expensive dinner

2:27:41

at Mar-a-Lago that Mark Zuckerberg had

2:27:44

there. I don't think that the

2:27:46

business judgment rule protects Zuckerberg

2:27:48

from shareholder action for that. And

2:27:51

it's a stretch of an

2:27:53

argument, but I think it

2:27:55

will be really interesting to see

2:27:57

if somebody tries it. That seemed

2:27:59

to be a winnable

2:28:01

case for Facebook, and I

2:28:03

don't think they would have spent

2:28:05

vastly more. So was the CBS case.

2:28:08

I mean, they need

2:28:10

the benevolent overlord to bless

2:28:12

their merger, that's what it

2:28:14

is. So does Elon. I mean,

2:28:17

this is because you have a

2:28:19

strong Executive, to say the

2:28:21

least. That is not the word

2:28:23

to use. That is a

2:28:26

lawless executive. And that is

2:28:28

the worst thing. You have a

2:28:30

scary, how about a scary

2:28:32

executive? And then you fight, then

2:28:34

you fight for your business.

2:28:36

Because if you surrender, you

2:28:38

will be paying, it will not

2:28:41

just be the 25 million,

2:28:43

and it will be 25 million

2:28:45

and 25 million, and then

2:28:47

that's gonna come right out

2:28:49

of shareholders' profits and pockets.

2:28:51

Yeah. Well, the shareholders get

2:28:53

to stand up if they want.

2:28:56

Crypto-stealing apps have been found

2:28:58

in the Apple App Store. This

2:29:00

is the very first time.

2:29:02

There has been malware in

2:29:04

Apple apps in the past. Android

2:29:06

and iOS apps contain a

2:29:08

malicious software development kit designed to

2:29:11

steal your Cryptocurrent wallet recovery

2:29:13

phases. Let me say that

2:29:15

again because it came out wrong

2:29:17

completely. Cryptocurrency wallet recovery phrases.

2:29:19

That made a lot more sense.

2:29:22

It's an OCR stealer. So

2:29:24

the whole idea is you

2:29:26

pop up your wallet and it's

2:29:28

looking at the screen and

2:29:30

it's stealing your recovery key. Here's

2:29:32

the scary part. The infected

2:29:34

apps were downloaded more than a

2:29:37

quarter million times on Google Play.

2:29:39

Unfortunately, we don't have numbers

2:29:41

for Apple's App Store, but I

2:29:44

imagine it's a similar number.
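
A bit of concrete context on why OCR is enough here: a wallet recovery phrase is just 12 or 24 ordered words drawn from the standard 2048-word BIP-39 list, so any block of screen text made of those words is a near-certain hit. The SDK's actual code isn't public, so this is only a minimal illustrative Python sketch of that general idea, using a tiny hand-picked sample of the wordlist:

    import re

    # Tiny sample of the 2048-word BIP-39 list, for illustration only;
    # a real checker would load the full standard wordlist.
    BIP39_SAMPLE = {
        "abandon", "ability", "able", "about", "above", "absent",
        "absorb", "abstract", "absurd", "abuse", "access", "accident",
    }

    def looks_like_seed_phrase(ocr_text, wordlist, min_hits=12):
        """Flag OCR'd screen text that is mostly BIP-39 words."""
        words = re.findall(r"[a-z]+", ocr_text.lower())
        hits = sum(1 for w in words if w in wordlist)
        # 12- and 24-word phrases are the common recovery formats.
        return hits >= min_hits

    # A 12-word string built from the sample list trips the check.
    demo = ("abandon ability able about above absent "
            "absorb abstract absurd abuse access accident")
    print(looks_like_seed_phrase(demo, BIP39_SAMPLE))  # True

The same easy pattern match that would help a defender scan for exposed phrases is what makes screenshots of a wallet so valuable to an attacker.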

2:29:48

Anyway, sometimes this stuff sneaks through,

2:29:50

right? We will learn this week,

2:29:52

I think, if Apple's gonna release

2:29:55

a new iPhone SE, that's probably

2:29:57

gonna happen this week, according to.

2:29:59

Mark Gurman, the rumor guru at

2:30:01

Bloomberg. Okay, no, nothing to say

2:30:03

there. This is very popular, I

2:30:06

think, cheap phone. I thought that

2:30:08

was walked by, walked back. Was

2:30:10

it? Yeah. Yeah. Well, it was

2:30:12

just a rumor Apple never announces,

2:30:14

pre-announces anything. So maybe Apple said

2:30:16

it's not going to happen or

2:30:19

did Gurman walk it back? There

2:30:21

are telltale signs. There are telltale

2:30:23

signs. A room, a spokesperson for

2:30:25

Apple declined to comment. Yeah. Oh,

2:30:27

if it was walked back, then

2:30:30

I'm sorry. I was getting all

2:30:32

excited. And here's a nice little

2:30:34

apology from an iOS engineer who

2:30:36

leaked information about coming products to

2:30:38

the Wall Street Journal and The

2:30:40

Information. They

2:30:43

settled with Apple. He's, of course,

2:30:45

not working there anymore. And apparently

2:30:47

he was compelled to post the

2:30:49

following apology on X. I spent

2:30:51

nearly eight years as a software

2:30:54

engineer at Apple. During that time

2:30:56

I was given access to sensitive

2:30:58

internal Apple information, including what were

2:31:00

then unreleased products and features. But

2:31:02

instead of keeping this information secret,

2:31:04

I shared this information with journalists who covered

2:31:07

the company. I didn't realize at

2:31:09

the time, but it turned out

2:31:11

to be a profound and expensive

2:31:13

mistake. Hundreds of professional relationships I'd

2:31:15

spent years building were ruined, and

2:31:18

my otherwise successful career as a

2:31:20

software engineer was derailed and will

2:31:22

likely be very difficult to rebuild

2:31:24

Kids, leaking was not worth

2:31:26

it. I sincerely apologize to my

2:31:28

former colleagues who not only worked

2:31:31

tirelessly on projects for Apple but also worked

2:31:33

hard to keep them secret. They

2:31:35

deserve better. I have to say

2:31:37

that, I'm just going to say

2:31:39

to you, blink twice if you're

2:31:42

being held hostage. And clearly the

2:31:44

lesson is you just have to

2:31:46

be better at leaking materials. Yeah, don't

2:31:48

get caught. Yeah, don't get caught.

2:31:50

I do. I mean, you're journalists,

2:31:53

at least Lisa and Daniel, you

2:31:55

are, I guess you are, I

2:31:57

guess you are too, Kathy. I

2:31:59

don't know if you take tips

2:32:01

or leaks. Why do people leak?

2:32:03

Yeah, I don't, you know what,

2:32:06

I occasionally will get an offer

2:32:08

and I always turn it down

2:32:10

because I don't. I will defend

2:32:12

journalists' ability to promise

2:32:14

anonymity. Daniel, you said you

2:32:17

depend on leaks. Oh

2:32:19

yeah well you know we've exclusively

2:32:21

been doing leaks for over a

2:32:23

decade now I mean that's how

2:32:25

I kind of cut my teeth

2:32:27

in this industry. I used to

2:32:30

leak Nokia stuff all the time

2:32:32

and things for Windows phone back

2:32:34

in the day. It's the right

2:32:36

of consumers to know, right? I

2:32:38

understand why companies don't want them

2:32:41

to know, but if you're about

2:32:43

to buy a phone and you

2:32:45

find out that Tuesday Apple's going

2:32:47

to release an iPhone SE for

2:32:49

instance, that's of benefit to you

2:32:51

and then you have the right

2:32:54

to know that. Why do people

2:32:56

leak to you though, Daniel? Why

2:32:58

does somebody at the company leak,

2:33:00

given the risk? Yeah,

2:33:02

so there's a couple reasons. There's

2:33:05

one reason is simply that an employee

2:33:07

is unhappy with decisions that have

2:33:09

been made maybe for a product

2:33:11

and thinks it should go another

2:33:13

way, right? So they're leaking out of

2:33:15

dissatisfaction, because they know others

2:33:18

will be upset about this and

2:33:20

they're hoping that the blowback kind

2:33:22

of influences decision making. The other

2:33:24

one, of course, is sometimes there

2:33:26

are internal conflicts. on a topic

2:33:29

within a company on a product

2:33:31

or a project. And so one

2:33:33

group is trying to sort of

2:33:35

get favoritism, right? So if something

2:33:37

gets leaked and people are like,

2:33:39

yeah, that's awesome. They should totally

2:33:42

do that. They can literally go

2:33:44

like back to a meeting and

2:33:46

we know this happens and they'll

2:33:48

cite an article of ours and

2:33:50

read the comments and people actually

2:33:53

do really want this, right? So

2:33:55

you're helping shape the narrative there.

2:33:57

Sometimes there are controlled leaks. A

2:33:59

lot of times Apple goes to

2:34:01

the Wall Street Journal, for instance,

2:34:03

it says, you know, don't tell

2:34:06

anybody I said this, but... And because

2:34:08

of that, yeah, they control the narrative

2:34:10

exactly. And some people do it because

2:34:12

they think it's what they do is

2:34:14

cool and they want to share with

2:34:16

other people. It's just literally that, right?

2:34:18

They know, I met a lot of

2:34:21

people, they're just super fans of the

2:34:23

products that they work on and they're

2:34:25

excited about them and they kind of

2:34:27

want to get momentum going. A lot

2:34:29

of times it works in our favor.

2:34:31

Like we reported in 2018 that Microsoft

2:34:33

and its partners are going to start

2:34:35

working on dual screen and foldable laptops

2:34:37

and PCs. And it happened, it

2:34:39

was many years later, of course.

2:34:41

But we helped start shaping that

2:34:43

narrative for them by getting that

2:34:45

info out there. So that when

2:34:48

it happened, it was even like

2:34:50

people were starting to get prepared

2:34:52

for this idea of like there

2:34:54

will be these devices, right? So

2:34:56

there are benefits to this leaking

2:34:58

stuff. I totally understand why, like

2:35:00

I can tell you for a

2:35:02

fact, the Surface division absolutely hated

2:35:04

us for it. I've gotten personal calls from

2:35:06

Microsoft over them. And they are

2:35:08

absolutely not happy. Xbox and

2:35:10

the gaming teams, way cooler

2:35:12

though. You know, they're not.

2:35:14

They're just happy anybody's interested.

2:35:16

Right. Yeah. So you have

2:35:18

different groups that are also,

2:35:20

like I said, you know,

2:35:22

Panos Panay, he's good at

2:35:24

what he does, but he

2:35:26

absolutely does not like or

2:35:28

appreciate leaking stuff to the

2:35:30

media. He very famously,

2:35:32

they were doing

2:35:34

a Microsoft Surface event and

2:35:36

Paul Thurrott tells the story, he

2:35:39

came up to Paul and took

2:35:41

his laptop and just started to

2:35:43

use it. You probably were there.

2:35:45

Yeah. He's a great guy. Oh

2:35:47

yeah he's a fascinating guy. I'm

2:35:50

not crazy about how pumped he

2:35:52

is about everything. I mean he's

2:35:54

a little bit very weird. Yeah.

2:35:56

I've always told people I've had

2:35:59

a personal one. You know how he

2:36:01

does those presentations on stage? I've

2:36:03

had a personal one of those.

2:36:05

It was for the Surface

2:36:07

Pro 7. It was just me,

2:36:09

him, and like 12 other PR

2:36:11

people and handlers in the room. And

2:36:13

he personally gave me the whole

2:36:15

like, Panos experience and it was

2:36:17

awesome. But he's just very passionate

2:36:19

about this stuff. So it's genuine.

2:36:21

Yeah. Oh, 100%. And whenever you

2:36:23

talk to him, he's one of

2:36:25

the most sincere people I've ever

2:36:27

met. And like, talking to

2:36:29

him, it's almost like, it's intense

2:36:31

but like it's all in a

2:36:33

good way, right? It's just someone

2:36:35

who really enjoys his job. I

2:36:37

mean the whole we've reported that

2:36:39

one of the reasons why he

2:36:41

left Microsoft was because they wanted

2:36:43

to get rid of the Surface

2:36:45

Duo and the Neo projects, which

2:36:47

were sort of his, it

2:36:49

was the number one thing he

2:36:51

was actually really, really enthused

2:36:53

about working on, and they

2:36:55

said no, you can't do those.

2:36:57

So he's like, well, if I

2:36:59

can't do what I want, then

2:37:01

I'm going to leave. The Duo was

2:37:03

the phone, right, that had the

2:37:05

dual screen, the hinges. I bought

2:37:07

one. I thought it was really

2:37:09

interesting, but it never reached its

2:37:11

potential. Yeah, and they completely lost

2:37:13

the thread on that, and the

2:37:15

Neo never got released. Right. Yeah.

2:37:17

Another dual screen, right? It was

2:37:19

another dual screen, yes, that was

2:37:21

another dual-screen piece. And with

2:37:23

the phone they were going to

2:37:25

go to a single screen foldable

2:37:27

there like that was the third

2:37:29

iteration they were going to do

2:37:31

and they were improving

2:37:33

dramatically like had they got to

2:37:35

a third one they would have

2:37:37

been in a much better position

2:37:39

but like all things Microsoft they

2:37:41

have really good ideas that get cut

2:37:43

off way too early and they

2:37:45

don't allow them to mature you

2:37:47

know it's a lot of times

2:37:49

they're so far ahead of things

2:37:51

that they cut them

2:37:53

back and then they will miss

2:37:56

the opportunity and then that's the

2:37:58

story of mobile for them period

2:38:00

yeah yeah it's it's really interesting

2:38:02

how Microsoft has just

2:38:04

fumbled mobile, when it has turned

2:38:06

out to be probably the most

2:38:08

important part of computing right people

2:38:10

don't buy PCs anymore. They don't

2:38:12

buy laptops even. They've got a

2:38:14

phone. Although you could argue that

2:38:16

smartphones themselves, the market has flattened

2:38:18

too, right? Because everybody has one.

2:38:20

2.2 billion units. And

2:38:22

that's too bad. Laptops and PCs

2:38:24

still do extremely well because they're

2:38:26

extremely critical to everyday tasks. But

2:38:28

these both are now flat

2:38:30

markets. We're not expecting growth necessarily, although

2:38:32

I would say growth for the PC

2:38:34

market in 2025 will be a thing

2:38:36

just due to refresh cycles for corporations

2:38:38

and enterprise. So that's interesting. We

2:38:40

talk about this a lot on

2:38:42

Windows Weekly and it was a

2:38:44

very kind of anemic growth in

2:38:46

the PC market this quarter. And

2:38:48

it really is Paul's theory that

2:38:50

it's just people aren't buying compute.

2:38:52

Businesses have to still refresh every

2:38:54

five to six years. Yeah, we're coming

2:38:56

on the five-year cycle from the

2:38:58

pandemic use, you know, boom, which

2:39:00

was huge, Windows 10 is expiring,

2:39:02

right? So if you're a consumer,

2:39:04

not necessarily a big deal, if

2:39:06

you're an enterprise, it's a really

2:39:08

big deal, and you can't upgrade

2:39:10

a lot of your computers to

2:39:12

Windows 11. So you have those

2:39:14

two things, and then the more

2:39:16

dubious, I would say still critical

2:39:18

AI PC aspect with NPUs being

2:39:20

more critical to these computers for

2:39:22

enterprise, again, more of a big

2:39:24

deal. So I think you'll start

2:39:26

to see this stuff. Growth in

2:39:28

2025 it won't be huge, but

2:39:30

they are expecting, all the

2:39:32

companies I talk to, positive growth

2:39:34

in 2025 and 2026. Sometimes it's

2:39:36

not a leak. It's a partner

2:39:38

that accidentally reveals things. In this

2:39:40

case, Take-Two, the game publisher,

2:39:42

yeah, kind of revealed a little

2:39:44

something about Xbox sales, right? Microsoft

2:39:46

doesn't report Xbox sales, right?

2:39:48

They stopped

2:39:50

reporting that nine years ago, which

2:39:52

tells you a little bit, a

2:39:54

little bit of something, something there.

2:39:56

Take-Two, in their Q3 earnings

2:39:58

report, said 94 million consoles from

2:40:00

the current generation, not including the

2:40:02

switch, are estimated to have been

2:40:04

sold as of November. Since Sony

2:40:06

has announced that it shipped 65.5

2:40:08

million PlayStation 5s, it would probably

2:40:10

mean that the Xbox has sold,

2:40:12

I don't know, 28 million, about

2:40:15

a third of that total. Is

2:40:17

that a surprise? Not really,

2:40:19

right? No. Microsoft reported a 29%

2:40:21

revenue decline in hardware for games.
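
For the back-of-the-envelope console math from a moment ago, here it is restated in a few lines of Python, using only the figures already cited in the conversation:

    # Take-Two's estimate: 94M current-gen consoles sold (excluding Switch).
    # Sony's announcement: 65.5M PS5s shipped. Both as of November.
    total_consoles = 94.0    # millions
    ps5_shipped = 65.5       # millions

    xbox_implied = total_consoles - ps5_shipped      # ~28.5 million
    share_of_total = xbox_implied / total_consoles   # ~0.30, about a third

    print(f"Implied Xbox sales: {xbox_implied:.1f}M "
          f"({share_of_total:.0%} of the combined total)")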

2:40:23

Yikes. They've been doing an ad

2:40:25

campaign lately saying Xbox is everywhere,

2:40:27

implying that you don't need the

2:40:29

console. You could use it on

2:40:31

your phone, your laptop, a tablet.

2:40:33

Right, cloud gaming for them and

2:40:35

the game pass grew 2% and

2:40:37

they are seeing a lot of

2:40:39

growth in cloud gaming, which is

2:40:41

really kind of, you know, Microsoft

2:40:43

strategy for gaming, I would say,

2:40:45

is more focused on, one, studios,

2:40:47

right? They have Call of Duty now,

2:40:49

which has done huge. So they're

2:40:51

making, they're doing very well with

2:40:53

their publishing arm for gaming due

2:40:55

to the Activision Blizzard acquisition. Their Game

2:40:57

Pass is doing very well too,

2:40:59

which isn't just Xbox, as you

2:41:01

can get it with PC gaming,

2:41:03

which still continues to grow. And

2:41:05

then you have handheld gaming, which

2:41:07

is still very small, but it

2:41:09

is definitely growing. We've seen a

2:41:11

lot of creation of new handheld

2:41:13

gaming systems this year. So there's

2:41:15

still a lot of momentum here.

2:41:17

Consoles themselves are kind of, you

2:41:19

know, pretty old right now,

2:41:21

you know, old in quotes, but

2:41:23

a couple years into the consoles

2:41:25

for this generation. Will there be

2:41:27

another Xbox console, you think? I

2:41:29

think there will be, but I

2:41:31

think it's going to, it might

2:41:33

be conceptually different in terms of

2:41:35

what they're doing with hardware. I

2:41:37

think they started this back with

2:41:39

the Series X, you know, this

2:41:41

idea of merging or moving closer

2:41:43

towards the PC model for console.

2:41:45

I think they really want to

2:41:47

have Xbox not be as distinct

2:41:49

from PC gaming as it has

2:41:51

been historically, which would save them

2:41:53

money on the bottom line. But right,

2:41:55

they're trying to do what's called

2:41:57

like Play Anywhere, this idea of

2:41:59

like a game is coded for

2:42:01

both Xbox as a console, but

2:42:03

also PC. If you buy it

2:42:05

once, you can play on both

2:42:07

systems. So I think that's really

2:42:09

going to be their value. I

2:42:11

think console gaming is, it'll always

2:42:13

be there, but it's always been

2:42:15

razor-thin margins on the hardware

2:42:17

end, right? It's always been about

2:42:19

the game licensing and selling the

2:42:21

games. That's really been where you

2:42:23

make money on consoles. Hi, this

2:42:25

is Benito. There's also always Nintendo.

2:42:27

Nintendo will always make consoles. Sure.

2:42:29

And they're going to make a

2:42:31

new Switch. The Switch will be

2:42:33

coming out in a month or

2:42:36

two. And I'll probably buy it.

2:42:38

Yeah, it gets right on their

2:42:40

IP, right? Yeah. Yeah. Who doesn't

2:42:42

want to play Final Fantasy? Or

2:42:44

in my case, Animal Crossing. I

2:42:46

want to play the Atari 800

2:42:48

I have sitting in a closet.

2:42:50

You have one? I'll buy it

2:42:52

from you. Oh, I grew up

2:42:54

with one, but I can't find

2:42:56

the floppy drive. But otherwise, oh

2:42:58

yeah, I got one. Don't you

2:43:00

have cartridges? I have a few

2:43:02

cartridges, but most of the games

2:43:04

we played were on floppies. And

2:43:06

I think I may have the

2:43:08

floppies themselves, but we couldn't find

2:43:10

the bit in the attic that

2:43:12

had the, how do you actually

2:43:14

get the program off the floppy

2:43:16

and get to play it? So

2:43:18

I may be missing parts. Well,

2:43:20

I'm going to. Go ahead, I'm

2:43:22

sorry. I was going to say,

2:43:24

there's a ton of like, I

2:43:26

remember years ago, I was going

2:43:28

to buy a Coleco Vision, and

2:43:30

it was, people will buy Coleco

2:43:32

Vision, wow. Yeah, and they completely

2:43:34

restore them to take them apart,

2:43:36

clean them, replace the wires. Wasn't

2:43:38

that the one that had the

2:43:40

stringy floppy? Wasn't that the, wasn't

2:43:42

that the storage, it was like

2:43:44

a cassette, but they called it.

2:43:46

Oh, they had a cartridge. That's

2:43:48

right. We had a cassette too,

2:43:50

and then I think I do

2:43:52

have, but it was too slow.

2:43:54

We didn't play the games off

2:43:56

the cassette drive. But yeah, actually,

2:43:58

the option key or the select

2:44:00

key got sticky at some point,

2:44:02

so it will need a little

2:44:04

bit of TLC. So Tuesday, Civilization

2:44:06

7 comes out. and I'm going

2:44:08

to take the rest of the

2:44:10

week off. No, I'm going to

2:44:12

be gone Wednesday through Saturday because

2:44:14

I'm going to the, I'm taking

2:44:16

Lisa to the Tucson International Gem

2:44:18

and Mineral Show, a rock show.

2:44:20

So it's not a rock concert,

2:44:22

it's a rock show. So I

2:44:24

will not be here for Wednesday's

2:44:26

shows, but Michael will be filling

2:44:28

in for me on Windows Weekly

2:44:30

and Twig. And maybe I will

2:44:32

play a little Civ 7 while

2:44:34

I'm gone. I don't know. We

2:44:36

gave it a very good review.

2:44:38

Did you? People are getting really

2:44:40

excited about it. I'm not, I've

2:44:42

never been a Civ player because

2:44:44

turn-based games never, I always like

2:44:46

real-time strategy like I was a

2:44:48

huge Age of Empires fan, but

2:44:50

Benito's convinced me. I have to,

2:44:52

and I do not like that

2:44:55

I cannot play my Age of

2:44:57

Empires, which I lawfully purchased and

2:44:59

will not run on my machine.

2:45:01

I think that's it. Well, you

2:45:03

need an iPad because they just

2:45:05

released it for iOS.

2:45:07

I have to buy it again

2:45:09

to play the software I lawfully

2:45:11

purchased from Microsoft. I bought Age

2:45:13

of Empires many times. Purchased the

2:45:15

license to play that game which

2:45:17

they revoked. Many, many times. Did

2:45:19

they even revoke? Where's my notice

2:45:21

of that revocation? A-O-E-2 is incredible,

2:45:23

yeah. I don't always like the

2:45:25

newfangled animations that some of the

2:45:27

remastered games have. I kind of

2:45:29

like the stuff that's not fully

2:45:31

pixelated, but like closer to that

2:45:33

style, the ones that are like

2:45:35

so detailed. I bought an Age

2:45:37

of Empires 4, and I thought,

2:45:39

oh, if I like 3, I

2:45:41

better like 4, and I returned

2:45:43

it in the return window. I

2:45:45

just found it so garish that

2:45:47

I didn't want to play it

2:45:49

anymore. It gets me because I'm

2:45:51

watching the trailer for Civ7 and

2:45:53

it's all cutscenes. This is not

2:45:55

what the game looks like. Give

2:45:57

me a break. This is really,

2:45:59

this is really, it's kind of

2:46:01

deceptive, really. You wanna know what

2:46:03

the game looks like? Let me

2:46:05

see if I can show this. It's like,

2:46:07

click, it's the top down, you're

2:46:09

looking at the, you know, anyway,

2:46:11

I still look forward to doing

2:46:13

it. That's not it either. There

2:46:15

it is, this is a little

2:46:17

closer to the real thing. Anyway,

2:46:19

thank you Kathy for being here.

2:46:21

Thank you for the work you

2:46:23

do at Tech Dirt. Try

2:46:25

not to bust an aneurysm

2:46:27

or something. Just breathe deep and

2:46:29

go look at the San Francisco

2:46:32

Bay. It's gorgeous today. I really appreciate

2:46:34

everything you do. cgcounsel.com. She's

2:46:36

on Blue Sky, Cathy with a

2:46:38

C, Gellis. Yes. Well, that's

2:46:40

why I write what I write. I

2:46:42

think that a lot of people are

2:46:44

really upset and don't really know how

2:46:46

things work. And if I can explain

2:46:48

how things work, I think that will

2:46:51

help focus people, help them take a

2:46:53

breath, and then organize their strength to

2:46:55

use, and use it in

2:46:57

useful ways. I'm frustrated. I

2:46:59

don't know what to do. I

2:47:02

don't feel like there's much we

2:47:04

can do. But I will keep

2:47:06

reading you. I will read. Keep

2:47:09

calling your Congress people. We live

2:47:11

in California. Our Congress people are

2:47:13

not, you know, they're not the

2:47:15

bad guys. No, but they need to

2:47:18

know. It's data that they can use

2:47:20

and say, hey, call volume has surged

2:47:22

over a hundred calls a day,

2:47:24

right? Or they keep a record

2:47:26

of what matters to you as

2:47:28

well. They hold the line, and

2:47:30

the senators in particular. And yeah,

2:47:32

even if, I'm sure yours

2:47:34

especially needs to hear from you.

2:47:37

Yeah. Okay, well, he's my senator. Yeah,

2:47:39

thank you for being here, Kathy.

2:47:41

Thank you, Lisa Schmeiser of

2:47:43

nojitter.com. You're the editor-in-chief.

2:47:45

Tell me about No Jitter.

2:47:48

The quick elevator pitch is we cover the

2:47:50

technologies that help move information from point

2:47:52

A to point B and allow everybody

2:47:54

to act on that information. So it's

2:47:56

communication and collaboration technologies

2:47:59

from personal workspaces all

2:48:01

the way up to enterprise networking

2:48:03

and contact centers. That's pretty important

2:48:05

stuff it sounds like. It's great

2:48:07

to have you on the show

2:48:09

and I appreciate it and everybody

2:48:11

should immediately hie thee to nojitter.com

2:48:13

you can sign up for the

2:48:15

newsletters and stay on top of

2:48:17

communication in the workplace. Daniel Rubino

2:48:19

editor-in-chief of Windows Central

2:48:21

also a part of my regular

2:48:23

daily news check. Every day something

2:48:25

good at windowscentral.com. Thank you for

2:48:27

being here. Daniel, I really appreciate

2:48:29

it. Yeah. Thanks to all of

2:48:31

you for being here. You can

2:48:33

go watch the Super Bowl now.

2:48:35

I think we got it done

2:48:38

just in time. This is normally

2:48:40

2 p.m. Pacific, 5 p.m. Eastern.

2:48:42

Excuse me, 2200 UTC, to watch

2:48:44

the show on those eight live

2:48:46

streams. But of course,

2:48:48

it's a podcast, which means you

2:48:50

can get a copy of it

2:48:52

at our website, twit.tv, or YouTube,

2:48:54

where you'll see the video. That's

2:48:56

a great way to share clips

2:48:58

from the show. And finally, of

2:49:00

course, the best thing to do

2:49:02

is subscribe in your favorite podcast

2:49:04

player. Just because you have it,

2:49:06

it doesn't mean you have to

2:49:08

listen to it. But it would

2:49:10

be nice if you downloaded it.

2:49:12

And it'd be even nicer if

2:49:14

you listened to it. We appreciate

2:49:16

it. Thanks again to all of

2:49:18

our club members who make the

2:49:20

show possible. Thanks to Benito

2:49:22

Gonzalez, who is our technical director and

2:49:24

producer. Great job, Benito. Kevin King

2:49:26

will be editing the show later

2:49:28

today. Thanks to all of you

2:49:31

for joining us. We're celebrating our

2:49:33

20th year as a podcast. In

2:49:35

fact, April will be the 20th

2:49:37

anniversary of the first twit. And

2:49:39

as I've said, for the last

2:49:41

20 years, it's hard to believe.

2:49:43

Thanks for being here. We'll see

2:49:45

you next week. Another twit is

2:49:47

in the can. Bye.
