Apple's Intelligence beta and more AI chaos

Released Friday, 2nd August 2024
 1 person rated this episode

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:00

Have a question or need how-to advice? Just

0:03

ask Meta AI. Whether you

0:05

want to design a marathon training program

0:07

or you're curious what planets are visible

0:09

in tonight's sky, Meta AI

0:12

has the answers. It can

0:14

also summarize your class notes, visualize your

0:16

ideas, and so much more. It's

0:19

the most advanced AI at

0:21

your fingertips. Expand your world

0:23

with Meta AI. Now

0:26

on Instagram, WhatsApp, Facebook, and

0:28

Messenger. Hello

0:33

and welcome to The Vergecast, the flagship podcast

0:35

of Saturday Samsung. The

0:39

show where Samsung executives who have been forced

0:41

to work six days a week until they

0:43

make more money just have ideas. I'm

0:45

your friend, Eli. David and Alex

0:47

are both on vacation, which I feel is very rude.

0:50

Liam also on vacation. So this is just

0:52

gonna be a wild episode. Allison

0:56

Johnson's here. Hi, Allison. Hello. V.

0:59

Song is here. Hello, let's do

1:01

chaos. We've just been left

1:03

to our own devices. I'm helpless without

1:05

Liam and David and Alex on the

1:08

show at this point. Anything

1:10

could happen. I haven't been left to my own devices

1:12

to run a Vergecast in years. For

1:15

good reason, I think. So we're just

1:17

going to see what happens. Thank you two for joining me.

1:19

We're here for the ride. I'm

1:21

here for Saturday Samsung as

1:23

the as the local Korean gadget

1:26

reviewer on staff. I am so

1:28

here for Saturday Samsung.

1:30

We're going to get to what the

1:32

Saturday Samsung is. This I would say,

1:34

you know, up until now, it's been stuff like: Samsung

1:37

will give you a free TV if you buy a TV.

1:39

It's a real idea generated by

1:41

Saturday Samsung. This week's

1:44

is, I think, bananas in a different way,

1:46

but we'll come to it. There's much other

1:48

stuff going on. The iOS 18.1 developer beta

1:50

with Apple intelligence. Some Apple intelligence features hit.

1:53

Allison, you played with that. The you reviewed the

1:55

Galaxy Watch Ultra or as I call it the

1:58

Apple Watch Ultra. There's

2:00

some other gadget news. There's a Pixel 9 event

2:02

coming up. We caused

2:04

an entire furious news cycle about

2:06

subscription mice. I'm sorry. And then

2:08

because we have been left to

2:10

our devices, I am trying

2:13

a new style of lightning round. Still

2:15

unsponsored. But it's, you

2:17

know. Get out of this, folks. David, Alex, and Liam are

2:19

not around to put me in a box. So

2:22

we're gonna try something new. Chaos.

2:25

It's chaos. I wanna start with, we

2:28

should just call it the chaos round. I think so. One

2:32

of the topics is a little serious. I don't know if we,

2:34

oh, we'll get there. We're gonna get there. We're

2:36

doing it live, everybody. Sure. Let's

2:39

start with, there's big news in the week.

2:42

Just outside of tech, it's the Olympics. It

2:45

feels like the

2:47

experience of watching, consuming

2:49

the Olympics is very different this year

2:52

than in years past. Every other year, we run a story

2:54

that's like, streaming the Olympics sucks. We

2:56

haven't really had to run that story this year. It's

2:59

cause it's on TikTok, baby. I have

3:01

not streamed a single second of

3:03

the Olympics. I'm telling you, I have not

3:06

streamed a single second of the Olympics. Normally,

3:08

I'm all over the gymnastics, but everything I

3:10

know about the Olympics has just come through

3:12

doom scrolling my TikTok feed and just going

3:14

like, oh, actually, that's kind

3:17

of really wholesome. I really love chocolate

3:19

muffin guy. And I can't

3:21

remember his name. Yeah, she was explaining chocolate

3:24

muffin guy. This is a foreign

3:26

concept to me. It's

3:28

just this Norwegian athlete who has

3:30

discovered the chocolate muffin from the

3:33

Olympic village cafeteria,

3:35

and he gets increasingly weird with this

3:37

chocolate muffin. It just appears in his

3:39

bedroom, and he's in a thong, and

3:41

you're like, what is happening? What?

3:44

Oh, that's what I'm gonna say. She

3:46

really loves the chocolate muffin. You

3:49

know what's happening. That's true. Yeah,

3:52

doomscrolling TikTok is

3:54

a weird way to feel about America right now. It's hard to

3:57

know how much American pride you have

3:59

depending on what TikTok scroll you're in.

4:02

Like at this moment, it's like, I have

4:04

immense pride in our gymnasts who have crushed

4:06

it once again. Oh no, he's talking again.

4:09

I have immense pride in this, like it's what

4:11

is happening. That's a

4:14

lot. We have an entire story about the

4:16

Olympics in TikTok. Mia wrote it.

4:19

I actually wanted the title to be the influencer Olympics,

4:23

but I think that's what people think that's Coachella. So,

4:26

but it is true.

4:29

Yeah, I thought it was good fun, but it

4:31

is true that the athletes are

4:33

young by and large, very young, except for the 51

4:35

year old Turkish man who got a

4:38

silver medal in shooting with no equipment whatsoever,

4:40

great photo. But

4:42

it's true that the athletes are by

4:44

and large, very young. The US women's

4:46

gymnastic team, literally after winning gold was

4:48

caught on camera talking about what TikTok

4:50

they wanted to make. Oh yeah. Which

4:52

is incredible. And so you just have

4:54

this like very native

4:56

population to this format in this

4:59

platform in particular. And

5:01

then there's just the universe of piracy that

5:03

happens on TikTok. If you want to

5:05

watch clips of cool sporting events, the

5:08

TikTok community has you covered and TikTok's lawyers

5:10

also apparently have you covered because they don't

5:12

seem to care about it. And

5:14

that's like one really interesting way of consuming

5:16

this, right? You're getting kind of

5:18

the raw feed by people who are on the ground,

5:21

who speak in the language of video, speak in the

5:23

language of social video. And then you're getting a bunch

5:25

of clips of cool stuff curated by a community that

5:28

does not care about NBC's exclusive rights

5:30

to stream. What I will

5:32

say is the commentary on

5:34

TikTok is much funnier because you just

5:36

have people filming their screen while they're

5:39

watching it. Like the men's gymnast, pommel

5:41

horse guy. I don't know any of

5:43

these athletes names. I know them by

5:45

their memes on TikTok, but just like

5:47

I was watching these two women watching

5:49

pommel horse guy and they're like, yes,

5:52

yes, Peter Parker. Yes, you better

5:54

work. You better work. Just having

5:56

this very sassy commentary about this

5:58

man. doing the pommel horse. And

6:00

I was just like, this is

6:03

I'm watching them watch their

6:05

TV screen. And they're

6:07

just better than like the NBC commentary people

6:09

who are just like, Oh yes, he's on

6:11

the pommel horse and he did the skill.

6:13

So serious. So serious. It's

6:16

like, it's like watching

6:18

it with your friends, except my husband doesn't

6:20

give a crap about the Olympics. So I'm

6:22

just like, Oh yeah, pommel horse

6:24

guy, you can get it. Clark

6:27

Kent of pommel horse. So

6:29

I often say that almost every

6:31

experience I have with TikTok should be

6:34

a PhD in media studies. What you are

6:36

describing is a PhD in media studies, right?

6:38

Like the layers of abstraction away from the

6:40

thing itself happening. And then

6:42

the entertainment that random strangers on the internet are providing

6:44

you, all of that is just very

6:46

different. And it feels new this

6:49

time in a way that two

6:51

years ago, three years ago, at the

6:54

last Olympics, it was

6:56

a two years ago, three years ago, the

6:58

last Olympics first. Three. They were in 2021 because of,

7:00

you know, COVID. Events.

7:04

Yeah. Yeah. That's why the Olympics just

7:06

has always felt to me like this

7:09

big, serious, important thing. And like, that's

7:11

kind of part of the appeal, but

7:13

there's something great about like, yeah, having

7:16

it kind of distributed a lot,

7:18

like, you know, from people on

7:20

the ground and having more fun with it.

7:22

Like it doesn't have to be this like

7:25

big booming voice of like NBC

7:27

will deliver you the Olympics and that is

7:29

how you will watch them. Right.

7:31

If you put 18,000 teenagers in the

7:33

middle of France with cell phones, like

7:36

something funny is going to happen. Yeah.

7:38

That's it. That seems correct. You

7:40

just learn more about different sports because

7:42

I don't care about rugby. I love

7:44

Ilona Maher. She's hilarious on TikTok.

7:46

She's just all over. My

7:48

entire feed yesterday was just lesbians crying

7:51

that Ilona Maher was not a lesbian

7:53

because they were just like in love

7:56

with the women's rugby team. And I was like, and that'll be

7:58

it for the chaos Vergecast, everybody. We're

8:01

going to end it here. But that,

8:03

right, the idea that different communities, different

8:06

groups can view the thing together

8:08

in a different way as a community is

8:11

new. That just is a new thing that

8:13

even sports Twitter doesn't accomplish. It's

8:18

the native video communication ability of

8:20

particularly the younger generation that is

8:23

interesting and new. That's one thing

8:25

that's different about these Olympics. The

8:27

other thing that's different is that actually NBC and

8:29

Peacock are doing a good job. I

8:31

must disclose, NBC

8:33

is a minority investor in Vox Media.

8:55

Just go for it with their app. It's not that there aren't

8:58

problems. Random ad breaks

9:00

in the middle of stuff. There's infinite

9:02

complaints about just how overwhelming it is.

9:07

They had every idea and executed

9:09

every idea. If you want to

9:12

just watch highlights of the sports, you can just

9:14

watch highlights of the sports. If

9:16

you want to watch the

9:18

traditional gauzy 1980s style prime

9:20

time broadcast with completely unnecessary

9:23

human interest interludes about

9:25

how the gymnast grew up knowing they would be gymnasts.

9:28

Those stories are always the same. When they were four, they were

9:30

like, I'm going to jump on stuff. Every

9:33

four years you can get that story from NBC if you want.

9:35

That's available to you. They have the gold zone,

9:38

which is basically NFL red zone for the Olympics,

9:40

that they're doing 10 hours a day. Inside

9:43

of the app, when you're watching stuff, you can just be

9:45

like, I just want to keep watching this

9:47

thing that we're whipping around to, which is fascinating. They

9:50

have multi view, which is the more standard, just like

9:52

here's four feeds at once. They

9:55

have Al Michaels doing AI powered highlights that

9:57

you can just listen to. which

10:00

is super weird. Like

10:02

it's just weird to have robot Al Michaels being

10:04

like, the gymnast jumped over stuff

10:06

again. But it's like all the

10:09

ideas. I had someone in my mentions, cause I

10:11

was posting my own threads and they're like, all

10:14

of this is stuff that a normal company would

10:16

like de-scope to hit the deadline. And

10:18

NBC just like did it all. And

10:20

then on top of it, they're like, all of it's like pretty good. Like

10:23

Goldzone is actually really funny. There's like

10:26

very funny NFL drama inside of Goldzone.

10:30

So, you know, there's red zone, where you gotta quip

10:32

around all the games on Sundays. And

10:34

there used to be two red zones. There was the one

10:36

on DirecTV and there was the one that the NFL did.

10:39

Google bought the rights to Sunday ticket. They took

10:41

the NFL red zone and they shut down the

10:44

DirecTV red zone. So Scott Hansen,

10:46

who does the NFL red zone is like gonna do

10:48

it with Google now. NBC realized

10:51

they can't just have Scott Hansen 10

10:54

hours a day, 16 days in a row. So

10:57

they hire Scott Hansen and the guy who used to do

11:00

the DirecTV red zone to Goldzone. So they're

11:02

like back together, not as rivals, but as

11:04

friends. Like all, it's hilarious that they did

11:06

that. This is a PhD. But

11:10

they had to do it. Like somebody at NBC had to

11:12

be like, all right, let's do a red zone for Olympics.

11:14

How will we do that? We should just get all the

11:16

red zone guys. Yeah. They're like, go

11:18

get some contracts together. Like make it happen. And

11:21

it all is kind of working. Like it's working

11:23

better than if I suggested

11:25

to you NBC was gonna

11:27

make an app full of Olympics content.

11:30

Your expectations would be very low.

11:33

And it's called Peacock, but they're actually doing

11:35

good. I

11:38

love that. I'm an Olympics lover. Yeah.

11:40

I love that. You're watching a Peacock or are you like full TikTok?

11:43

So I have to put an asterisk

11:46

that like I'm a winter Olympics girl.

11:48

And I don't know what it is

11:51

this year, but I've been like shouted down by

11:53

the summer Olympics people about how the winter Olympics

11:55

is not the real Olympics. I think

11:57

I just like got tired of it. So I've kind

11:59

of been missing out. on this

12:01

year's games. You can't, there's Snoop

12:03

Dogg. Like in between everything.

12:05

Yeah, oh well I'm getting the, yeah. And like

12:08

the genius decision to have Snoop Dogg and Flavor

12:10

Flav here. I'm like, this is what I

12:12

mean by NBC just went, like somewhat NBC is

12:14

like, is there a Snoop Dogg budget? And they're

12:16

like yes. In addition to that, there's a Flavor

12:18

Flav budget. They just like went

12:21

for it. It's great. I

12:24

would, you should click around. It's

12:26

like actually like fairly entertaining just

12:29

as a tech experience to

12:31

be like, oh they like tried this out. Like

12:33

what if this was all happening? You can

12:35

kind of see there's news this week that

12:37

Venu Sports, which is the big mashup of

12:39

Disney and Fox and everything, ESPN. They're

12:43

gonna price it like $43 a month, which

12:45

is crazy and it's like, what would you

12:47

guess? That's Peloton money. Yeah,

12:50

only you sit around. So it actually might

12:52

be worth $50 a month. But

12:55

it's kind of like, oh, this is the bar. Like

12:57

if you wanna be anywhere close to that much money, you've

13:00

gotta deliver a user experience that

13:02

is at least this good. Because

13:04

if it's just like a minimum

13:06

viable product list of things, like

13:09

no one's gonna pay the money. All

13:11

right, so that's the Olympics, it's going on. I'm curious how people

13:13

are watching it. Send us a note. You

13:15

know, there's more time left in it. One

13:18

thing I'll say is it doesn't seem like the action

13:20

is on Twitter the way it is with other live

13:22

sports. It is on TikTok. And there's

13:24

something happening there that I think is just fascinating.

13:27

All right, we should talk about iOS 18. Allison,

13:30

you played with iOS 18.1, which

13:33

is the developer beta, not the public beta yet. But

13:35

I think that is getting pretty blurry in

13:37

that everyone is playing with it. I mean, no one was gonna stay

13:39

away from Apple Intelligence. So you played with it, what do you think?

13:42

Yeah, it's interesting. I think my

13:44

kind of big takeaway is like,

13:47

well, you get the visuals of the

13:49

new Siri, the like glowing border and

13:51

you can type to talk to Siri.

13:54

And there's a couple things that are new about Siri

13:57

that like it follows context

13:59

better. like between questions, which like

14:01

Google assistant has been doing for a

14:03

little while now, but, so

14:06

that's kind of like setting the stage. And

14:09

there's these other little things throughout

14:11

the UI that are like when

14:14

and if it all

14:16

gets put together, like that could be

14:18

really cool. But right

14:20

now it's just kind of like Siri lights

14:22

up and then you can rewrite

14:24

an email and you'll get

14:26

email summaries in your inbox. It

14:29

was just especially funny like when

14:31

you open the mail

14:33

app instead of the like each

14:36

email has the first couple lines of

14:38

the actual email copy, there's just a

14:40

little summary now and that makes a

14:43

lot of sense for like a long

14:45

email, but most of my emails are

14:47

garbage like promo stuff. So

14:49

it just summarizes the promotional stuff and

14:51

it's like lunch boxes

14:54

ship free. And

14:56

I'm like, didn't

14:58

need a summary of that, but thank you. That

15:00

is very much like what's happening with

15:03

Grok and trending topics on Twitter now.

15:05

Oh no. Where like the AI just doesn't

15:07

know that some things are jokes. And so

15:10

the trending topics are like

15:12

when I turned black in relation to

15:14

like people tweeting about Trump talking about Kamala Harris.

15:16

And it's like, no, that's not right. That's

15:19

not what people mean, that's just a joke. Yeah, you

15:21

didn't get it. Yeah. AI

15:24

is just so sincere about stuff. It

15:27

just like, it's funny. We

15:29

have to laugh about it. So are

15:31

those the only features that are out yet?

15:33

It's the summarizing emails and rewriting. Summarizing.

15:36

I've seen a lot of pictures of people

15:39

intentionally writing very mean notes and trying to get the

15:41

AI to make them nicer, which is fascinating. Oh,

15:43

the one I did, you

15:45

know at the end of Willy Wonka when

15:48

Gene Wilder does that like fizzy lifting drink

15:50

thing like you stole fizzy lifting drink. Yeah.

15:53

I got into a note and rewrite it

15:55

like friendly. And it was like, hey,

15:57

just so you know, you can't be.

16:00

stealing fizzy lifting drinks. Um, so

16:02

that's as far as I got

16:04

with that. Very amusing to me

16:06

personally. Um, yeah,

16:09

there's, there's little bits and bobs here

16:11

and there that are escaping

16:13

me at the moment. It's in the emails.

16:16

No photos. Um, in the photos app,

16:18

you can use like natural language

16:20

search. So you can be like, you

16:23

know, you could search for like food before or someone

16:26

that you've tagged, um, in

16:28

your photos, but you can be like

16:30

this person wearing glasses or the food

16:32

we ate in Iceland. And it's

16:35

like pretty good. And it comes up with

16:37

that stuff. So just getting

16:39

a step closer to like, I

16:41

just search for my license plate.

16:44

Cause I can never remember. So

16:47

like, I have this one picture of a chicken that

16:49

I saw while walking down a street in Brooklyn. So

16:51

would I just be able to, and I can never

16:53

like, no one believes me when I

16:55

told them I ran into a chicken on a sidewalk

16:57

in Brooklyn. I don't believe you now. One

17:02

photo of it. So would I be able to go

17:04

to like the photos app and be like, find me

17:06

the photo of the chicken walking on a sidewalk

17:08

in Brooklyn with my foot in there.

17:11

I think that's the most important

17:13

use case of AI, like

17:15

at all. Or

17:18

it will just generate that photo for you. And

17:20

you'll be able to lie to us. Chicken on

17:22

the sidewalk and then tell people that you saw

17:24

it and be a liar. So

17:27

when we saw it at WWDC,

17:29

iOS 18 was AI everywhere in

17:31

every app and every corner. It's

17:33

just going to make the phone better. And then

17:36

that was like a one set of ideas.

17:39

And then the other idea was, and then Siri

17:41

will become your all knowing assistant. Is

17:44

that the experience of 18.1 in the beta right now? Not

17:47

yet. It's like, yeah. It's just a

17:49

little Easter eggs kind of threw out.

17:51

And then the thing that is missing

17:54

from Siri is like the

17:56

big stuff where it's like, the

18:00

intelligence in the AI, I think, yeah. No,

18:03

the like, it'll understand what's on your screen and you

18:05

can ask for it, you know, you can be like,

18:08

email this photo to my mom and it'll

18:10

just go do it for you. Like that

18:12

is the stuff that's coming. And then the

18:14

third party app stuff because developers are going

18:17

to be able to hook into it and

18:20

let you do Siri stuff in their apps. And

18:23

that's the AI dream

18:25

on our phones. And that's

18:27

still the like, no, it's coming

18:29

because people have to develop it.

18:32

Is the, but they changed the user experience of Siri,

18:34

right? So now when you open it, like flashes the

18:36

thing, you can like, you

18:38

can type to it now. Yeah,

18:41

you double tap the- People have been able to. I

18:44

learned this and I didn't know it, V. What's

18:47

the, how do you do

18:49

this? Oh, it's an accessibility feature where

18:51

you can type to Siri. So you

18:53

can do it even if you don't

18:55

have the beta. I just think it

18:57

looks a lot prettier in the beta

18:59

because this is another example of them

19:01

going like, oh, accessibility features, actually they

19:03

work for everybody. Why don't we just

19:05

like shine them up and make them

19:07

useful? Right, make it

19:10

glow. It is actually really interesting how

19:12

many accessibility features have just made their

19:14

way into the main Apple operating systems.

19:16

Yeah, and gestures, there's other ones,

19:19

but tap to type to Siri is the big one

19:21

with 18.1 because

19:23

it's the thing that takes Siri from

19:25

being timers and music

19:27

player to chatbot,

19:30

to assistant, to intelligent

19:32

agent that is taking action across all of your

19:34

apps. And I, it seems

19:36

like they've added the stuff, but they haven't added any

19:38

of the backend to make it go yet. Yeah,

19:41

like I'm asking it questions and

19:43

it's still just Googling stuff for me. I'm

19:45

like, oh, okay. Well, I could have typed

19:47

this into a Google search bar instead of

19:50

a Siri text box, but. And

19:52

it seems they've added AI call recording

19:55

and transcription, which is the feature that I want the most. Yeah.

19:58

Because if you're a reporter. Not because I'm

20:00

constantly spying on everyone. I'm

20:02

Richard Nixon, everyone. I'm constantly recording everyone.

20:05

No, we're reporters. And AI call recording

20:07

and transcription are legitimately useful in our

20:09

jobs. And we've just been doing either

20:12

people are using their Android phones in our office, which

20:14

lots of people do, or there's a set

20:17

of weird third party stuff. But being able to just use

20:19

my iPhone, which is my main phone, would be really great.

20:22

So that's here now. Has it worked yet? Have you tried it

20:24

out? I have not, because I didn't put

20:26

my SIM card in that phone because of

20:28

eSIM because of nightmares. But

20:31

I did try out, so the

20:33

Voice Memos app will do

20:35

transcriptions now. And that's not an Apple

20:37

Intelligence feature. The recording a

20:40

call, I think, is Apple Intelligence. Could

20:42

be wrong. But the

20:44

transcription is good. I

20:47

am a Pixel Recorder stan. I

20:49

always have a Pixel phone at

20:51

a press event. It

20:55

is just in my bag. And I put it

20:57

next to the iPhone. And

20:59

it was really good. It was up

21:01

there with the Pixel. So I am

21:04

personally excited about that. Yeah.

21:06

Yeah, I think that's the one. It's

21:08

always a grab bag of which features, the one

21:10

that hits. And I just know

21:13

that that feature for at least our little community

21:15

of reporters is going to hit the hardest. Uh-oh.

21:18

I can see people being excited. I'm so

21:20

excited. Can't wait to cancel my otter subscription. Oh,

21:23

no. Poor. I mean, talk about startups getting wiped out.

21:25

There's like two or three that are going to go

21:27

away as soon as this ships. It

21:30

seems like the rest of the features are not

21:32

coming for a while, right? Like Mark Gurman at

21:34

Bloomberg said some of these might not

21:36

come until late 25 or even 26. Yeah,

21:39

I think the last thing. I'm so confused

21:41

on the timeline. But the last thing I

21:44

read, I think, was that they would all

21:46

come to developer betas by the end of

21:48

the year. But then as far as

21:50

being in the public beta, that's 2025. Yeah,

21:54

it's just a really long, staggered

21:56

kind of roll out, it sounds

21:58

like. Yeah, since the

22:01

danger with the Siri that can do stuff on

22:03

your behalf is quite high, you have

22:06

to make sure it works. We're also just not

22:09

sure how any of the

22:12

Apple private cloud stuff

22:14

is going to work. Right

22:16

now, is it clear that all this stuff is happening on the phone

22:18

or in the cloud or where? Or is it

22:20

all ... We know it's all happening on the phone. There's

22:24

a privacy report you can pull down

22:27

now that will show you ... I

22:29

think a lot of this is in

22:31

the cloud. It can't ... I

22:34

wouldn't say that for sure, but you can

22:36

pull this privacy report that shows you everything

22:39

over a certain time period that Apple Intelligence

22:41

has done on your phone. I

22:44

guess it's encrypted because it's just gobbledygook. I

22:46

downloaded it. I

22:49

was like, well, this is secure as hell because I don't

22:51

know what any of this is. Yes,

22:54

still unclear. Yeah,

22:57

I mean, I would just say the fundamental architecture of

22:59

this thing is very new and what

23:01

happens on your phone versus in the cloud, there's

23:04

some stuff that Apple is very clear

23:06

happens on your phone. Like Memoji and

23:08

some of the photo stuff happens on your phone. Some

23:11

of the other stuff, I'm assuming a lot of the Siri

23:13

stuff is going to go to the cloud because Siri goes

23:15

to the cloud right now. It

23:17

just seems like this rollout is

23:20

going to be a lot slower and more deliberate

23:22

than anyone anticipated based on WWDC. It

23:25

might roll all the way ... I mean,

23:27

late 2025 is very close to when

23:31

we would expect iOS 19 to be coming out. I'm

23:36

sort of curious how that will all play, but for now it seems

23:38

like they've sprinkled in some AI in

23:40

the places where you would expect it. Yeah,

23:43

it does seem pretty safe. It's like, okay, we've

23:45

been able to ... It's kind

23:48

of the table stakes stuff of write

23:50

me an email that's friendly or professional

23:52

or summarize this for me. That's

23:55

all kind of ... They're kind of laying the groundwork with

23:57

that stuff. Then we're going to get 18.0 first

24:01

and then 18.1 second. So this is pretty far away,

24:03

it seems. Yeah.

24:05

It's always... All

24:07

right. Well, the AI overlords are coming

24:09

whether we like it or not. All

24:11

right. So that's iOS 18.1. It's interesting

24:13

that's out now and they're actually dual

24:15

tracking it, right? There's iOS 18 in

24:18

the public betas and 18.1 in

24:20

the developer betas. And you get the

24:22

feeling they wanted people to be talking about the AI

24:25

stuff because the Pixel 9 event is

24:27

coming very soon. And we know exactly

24:29

what Google is going to talk about in the Pixel 9 event. But

24:32

we got to take a break. Let's do that, come back

24:34

and let's talk about the Pixel 9 event. We'll be right

24:36

back. Have

24:39

a question or need how-to advice? Just

24:41

ask Meta AI. Whether you

24:43

want to design a marathon training program that

24:46

will get you race ready for the fall,

24:48

or you're curious what planets are visible in

24:50

tonight's sky, Meta AI has

24:52

the answers. Perhaps you

24:54

want to learn how to plant basil when

24:56

your garden only gets indirect sunlight. Meta

24:59

AI is your intelligent assistant. It

25:02

can even answer your follow-up questions, like

25:04

what you can make with your basil and other ingredients

25:06

you have in your fridge. Meta

25:09

AI can also summarize your class

25:11

notes, visualize your ideas, and so

25:13

much more. The best part is you

25:15

can find Meta AI right in the apps you

25:17

already have. Instagram, WhatsApp,

25:19

Facebook, and Messenger. Just

25:22

tag Meta AI in your chat or tap

25:25

the icon in the search bar to get

25:27

started. It's the most

25:29

advanced AI at your fingertips.

25:32

Expand your world with Meta AI.

25:37

All right,

25:40

we're back. So like we were just saying,

25:42

the Pixel 9 event is coming up. It's

25:44

August 13th, basically tomorrow.

25:49

I'm guessing we're just going to see, we've

25:51

already seen the phones. I

25:54

assume Google is just going to talk about

25:56

the software, which is the new

25:58

trend, right? The phones look the same as

26:00

ever. And now we're gonna,

26:02

we're gonna just make you care about

26:04

AI is how these companies are

26:07

going. Yeah, just hit

26:09

us over the head with AI.

26:11

And we've seen every color of

26:14

the phone, every size, like, it's

26:16

bigger, it's smaller, I don't know.

26:20

I'm, I'm just in this, like, all the

26:23

phones have turned into iPhone design,

26:25

which is like, fine. We've gotten

26:27

to this, like, the

26:29

edges are flat. They were colorful. Yeah,

26:32

I mean, sure. But

26:35

like, I'm good with this. We've

26:37

gotten to the same place with

26:39

the phone design, but Google's like,

26:42

I think it's like, Samsung and Google

26:44

are like, how do we make this not an

26:47

iPhone? And the answer is like a weird camera

26:49

situation. Yes. And it kind

26:51

of looks like Google's really leaning into the

26:53

weird camera situation. And I

26:55

don't know how I still feel about it. I

26:58

think these are the Rivian headlights of phone

27:00

cameras. Oh, that's it. Yeah. Do you know

27:02

what I'm saying? Yeah. Like,

27:05

you can love them, you can hate them. They're not changing them. It's,

27:08

you know, and you can try to

27:10

convince me that they look like snake

27:12

eyes. They don't, you know? No. Just

27:15

like you can try to convince me this phone doesn't have a weird forehead.

27:17

It does. It does. It

27:19

does. Super does.

27:23

But it's true. The colors are more vibrant, which is

27:25

getting away from what Apple is doing, or at least

27:27

that's what the leaks indicate. So we're

27:29

expecting a new Pixel 9, a Pixel 9 Pro, a Pixel

27:32

9 Pro Fold. I got these

27:34

names. And then

27:36

a watch, right? Yeah. Pixel

27:39

Buds Pro 2 and a Pixel Watch 3. So

27:41

that's a pretty complete refresh of the

27:43

Pixel line. It feels

27:46

like the Fold should be where the action

27:48

is, but I

27:50

just can't tell how much Google cares about these devices.

27:53

That's the question, isn't it? So

27:57

they moved the, like the Fold

27:59

was on the... the product cycle

28:01

with like, it was at IO

28:03

last time around, right, and then

28:06

they've like scootched it. So it's

28:08

more part of the like main

28:10

refresh cycle, which kind of

28:12

makes sense. It was like, is this

28:14

a flagship phone? It's like the most expensive

28:16

phone you guys sell, but it's a cycle

28:18

behind, you know, with the like camera

28:21

hardware and whatnot. So I think it

28:23

makes sense moving

28:25

the fold into this position, but

28:29

it's a lot of phones. I don't

28:31

know. Do we need this many phones?

28:35

Like, how do you think it'll compare to,

28:38

cause you know, Samsung, you said like their

28:40

foldables are just basically coasting at this point.

28:42

So are we at a point where like

28:44

the Pixel Fold is something

28:46

that is like, yay, competition?

28:49

Or is it just, it's

28:52

here? Yeah, I'm still

28:54

of the mind that the OnePlus

28:56

Open got it right. Like that

28:59

is how a foldable, like a

29:01

book style foldable should be shaped.

29:03

And then the Samsung is

29:05

just committed to this like remote

29:07

control shape thing. Every

29:09

year they're like, it's a few millimeter, like the

29:11

screen is a few millimeters wider and it still

29:14

feels like a remote control to use. So

29:17

I'm curious how different this new Pixel

29:19

Fold is. Cause there's a lot of

29:21

rumors that it's like thinner

29:23

and lighter cause boy

29:26

was the previous one heavy. But

29:29

I do like that, that like

29:31

wider aspect ratio for the cover

29:33

screen. I don't

29:35

think it needs to be like quite as wide

29:37

as the first generation Pixel

29:39

Fold. So we'll see.

29:42

Yeah. And then the

29:44

point of all this is going to be the software,

29:47

right? This

29:49

is pretty iterative hardware across the board.

29:51

Some minor camera tweaks, a

29:54

wacky camera bump. But

29:57

if you're not paying close attention, I think

29:59

it's going to be really hard to distinguish the hardware from

30:01

the previous generation. It's the

30:03

software that's the game. So you can see

30:05

they're adding Gemini sort of to the first

30:08

layer of the interface. They've got this

30:10

new thing called Pixel screenshots, which

30:13

is basically just Microsoft Recall, which everyone

30:15

is scared about, only manual. So

30:17

instead of constantly taking screenshots, you have to manually

30:19

take a screenshot. Very good. A

30:22

very good differentiation. But I'm kind of into

30:24

it, right? You take a screenshot and you're like, what is this? Or

30:26

like save it for later. There's a

30:28

chicken walking down the street, just like hold on to that for

30:30

me. And

30:32

then of course there's circle to search, which Google is

30:34

adding to basically everything at this point. I

30:36

will, I will coyly say that a friend

30:39

of mine just put

30:41

like circle to search me in the face the

30:43

other day. You might imagine who that friend is

30:45

who was showing me an Android feature and he

30:47

dunked on me with

30:49

circle to search, which was pretty funny. But like

30:52

this is them saying, okay, there's a new way

30:54

to think about how this phone works. Right?

30:57

So you can just try, Gemini is like right here and

30:59

then you can just point at things or tell the phone

31:01

to look at things and it will generate information for you,

31:03

which is a very Google approach to it. The

31:07

question is whether that's enough to

31:09

make people switch, not even

31:12

from the iPhone, from their Samsung phones, which

31:14

obviously are dominant. And

31:16

of course, because it's Google, whether they'll stay committed

31:18

to any of these ideas for more than 20

31:20

minutes. And I, I

31:22

just don't know. It's just a confusing

31:25

time. I think even just circle to

31:27

search, like, and I

31:29

use it on the Samsung phones I've been

31:31

testing and the Google phones obviously, but like

31:34

there's just kind of a lot

31:36

of different ways to get at

31:38

circle to search. And you're like,

31:40

so is this replacing these? Like,

31:42

Google Lens is still a thing.

31:45

I can translate something, but like, did I need

31:47

to circle to search that? They

31:50

just have a lot of ideas. Yeah. It's

31:52

like, do you open up Gemini and like take a

31:54

screenshot of something or a picture of the inside of

31:56

your refrigerator? I don't know.

31:58

There's just a lot of, like, input

32:00

methods where it just sort of

32:03

could feel overwhelming. But it's

32:05

like, seems like they're in

32:08

that position that we're

32:10

kind of talking about with Smarter Siri,

32:13

where it's like, okay, like you've made

32:15

the promises, this is gonna make our

32:17

lives easier, blah, blah, blah. Right

32:19

now it's just a bunch of like, you

32:22

can switch the faces in these

32:24

photos, or, you know, put

32:26

a rain cloud in the sky behind you.

32:29

And like, that was all nice and

32:31

everything, but like, let's see something. Like,

32:33

can they, the screenshot thing, sure, that

32:35

would be cool. I'm gonna take a

32:37

screenshot of a recipe and I

32:40

literally have the same, like,

32:42

spaghetti and meatball recipe from America's

32:44

Test Kitchen bookmarked in my Google

32:46

Photos as a like favorite,

32:48

as if it's a photo of my child. Like,

32:51

I just look at it that often.

32:54

I'm like too cheap to pay

32:56

for America's Test Kitchen and

32:58

can't commit to writing it down. So

33:00

I don't know. I think there's a

33:02

real use case for help

33:05

me find this thing in a screenshot. I feel like,

33:07

you know, I work with both of you

33:09

very closely. The audience might not know what it's like to work with

33:11

you, but I bookmarked,

33:13

I favorited a screenshot of a recipe that I make

33:15

all the time versus there's a photo of a chicken

33:17

that I can never find is kind

33:20

of exactly the difference. Yeah.

33:23

A little bit, yeah. Yeah,

33:26

that tracks. It

33:29

tells a whole story. I'm

33:32

just making my spaghetti and meatballs and being

33:34

like, where's that chicken? I

33:36

think about the Brooklyn chicken all the

33:38

time. It's been so long. I'm not

33:40

sure this chicken is alive anymore. I

33:42

may have the only record of this

33:44

sidewalk walking chicken. Like it's important to

33:47

me that this chicken lives in

33:49

my memory. Please

33:51

send us a note and describe

33:53

how your photo storage approach describes

33:56

your personality. Because I think there's a lot in

33:58

there. It's a rich zone to uncover. On

34:00

a scale of like one to chicken. On

34:04

a scale of I'm just scrolling for a photo

34:06

of a chicken endlessly. I swear it's here. Speaking

34:09

of photos, Allison, you've been

34:11

covering what can only be described

34:13

as the photo apocalypse. Somewhat closely.

34:15

You just wrote a piece about the

34:18

Samsung sketch-to-image thing — you sketch a bee and

34:20

they'll put a bee in the photo

34:22

for you. It seems like the Pixel

34:24

9 is just going to continue moving

34:27

inexorably down this path with

34:29

this feature called Add Me that will just add

34:31

you to photos that you're not in. Yep.

34:35

That's what it sounds like. I

34:38

mean, it's only from a leak. It's only from a

34:40

leak. Um, that was

34:42

in a YouTube video that got removed. It

34:44

was like an ad and it got, it was leaked to YouTube and

34:46

it got removed, but it

34:49

just feels like Google has to start communicating

34:51

about how it will keep our

34:55

information environment somewhat pure as it adds

34:58

these features that just pollute

35:02

it more than ever. Yeah. They were

35:04

really like tiptoeing up to the line. I feel

35:06

like last year and it was like, well, this

35:09

is just something that people have been doing in

35:11

Photoshop for a long time. And now more people

35:13

have the tools. And then I feel

35:15

like Samsung kicked the door down with sketch

35:18

to image. It was like, put

35:20

a chicken in your photo, put a

35:22

blimp like whatever you want to do, you can have

35:24

it. And then, yeah, I feel like

35:26

the, um, the floodgates are

35:28

open and I don't know how, like how,

35:32

uh, aggressive Google will

35:34

be. Like the Add Me

35:36

to the photo thing. It

35:38

seems like there could be guardrails on it

35:40

or it could just be like pure chaos

35:43

and it's an election year and

35:45

what could go wrong. So

35:48

many things could go wrong, but I also like,

35:50

it just reminds me of this one family portrait

35:52

I have. So like my family's half split in

35:55

the US versus, uh, Korea. And so we all

35:57

went to Korea to get this family portrait, except

35:59

for one cousin who couldn't go because

36:01

he was a dual citizen and his military

36:03

service time was coming up because all Korean

36:05

men have to do military service before they're

36:07

30. And he's just like, I'm

36:09

just not going to do it. So he had

36:12

to get photoshopped into the family photo. And if

36:15

you look at it from a distance, you can't tell

36:17

that he's been photoshopped in, but if you look up

36:19

close, you're like, well, lighting

36:21

on his hair is wrong. Yeah. Coming

36:24

from a different angle. You can really

36:26

like, you can just, he's just texturally

36:28

not the same as everybody else. So

36:31

you can tell, you can tell. And

36:33

it's just like this huge family portrait

36:35

and Steve who's

36:37

like been photoshopped in. So

36:40

like, not

36:42

really. Cause we can all tell he's not there. So

36:44

it's sort of like, I don't know. I feel like

36:46

my auntie would love to have him

36:51

in this sort of situation, but I also,

36:53

you know, having aunties with the power to

36:55

just add you into anything that seems also

36:57

dangerous. Well, so it's not, it's not like

37:00

an AI Add Me feature, at least from

37:02

the ad that we see. It's

37:04

like a compositing feature, like a

37:07

very manual compositing feature where you take

37:09

a photo, and you

37:11

leave space for literally yourself in the

37:13

photo and then someone

37:16

else holds the camera and it

37:18

shows you the old photo and you

37:20

like go stand where you would have

37:22

been. And then you take a photo and

37:24

it stitches all the photos together. So

37:27

it's very manual in its way. I'm sure

37:29

there's AI in the background to handle

37:32

the stitching and alignment and probably

37:34

the textures and colors and lighting. Too

37:37

many fingers. Yeah. But the

37:39

idea that it's going to, it's not like deep

37:41

faking you to put you in the photo. It's

37:43

taking another photo of you in the same space

37:46

and then like making a composite, which

37:48

is still weird. Yeah. It's

37:51

one of those features that like as a parent

37:53

now I'm like, Oh, I completely

37:55

get that. Like best take

37:57

where it's like you try and take

37:59

a picture where there's more than two

38:01

toddlers in it, like, forget it. Nobody's

38:04

gonna have a good time. But then it's

38:06

like, oh, you have all this data and

38:09

you can just kind of like AI

38:11

it together into one photo that looks right.

38:14

Though like you get to

38:16

be in the photo even though you took

38:18

the photo thing. I'm like, yeah, I'm following

38:20

along with that. I get that. So

38:23

it's basically so you never have to ask someone,

38:26

oh, hey, do you mind taking a photo of

38:28

us? No, you have to ask

38:30

them that, but in a

38:32

much more complicated and ethically challenged way. No, I mean,

38:34

it's in your group. It's in your group. It would

38:36

be in your group. So like, yeah, like you two

38:39

would take a photo and then I

38:41

would take a photo of you two and

38:43

then I would hand one of you the phone and

38:46

I would go stand next to where you had been

38:48

standing and you would take another photo

38:50

and the pixel would composite those photos together. So

38:52

it looks like all three of us are standing

38:54

there. I hate talking to strangers

38:56

and asking them to take a group photo for

38:59

me and my group. So this is really gonna

39:01

help me not grow as a person and not

39:03

talk to strangers more is basically. See, I love

39:05

being asked that question because then everyone's like, oh,

39:07

you know what you're doing, which is great. You

39:10

know what? People don't

39:12

know what they're getting into. I'm like,

39:14

so you've got the pixel phone. Tell

39:16

me about your choices here. They're

39:19

like, who's this lady? The

39:23

second someone hands you a phone and you turn it to landscape,

39:25

they're like, oh my God. We

39:27

got a player here. But

39:33

even that's weird. Like even the thing I'm describing

39:35

where it's pretty manual, right? It's not just AI

39:37

deepfaking you, but you have to, you're making

39:39

a composite photo. What

39:41

is a photo? Like it's not a moment in time. That

39:44

moment quite literally never happened and we're all gonna,

39:47

we're gonna pass it off as a photo. And

39:50

I've been thinking about this parents thing because Allison, every

39:52

time you write about it, you put it in there.

39:54

Like the experience of being a parent is just

39:57

trying to, I

39:59

would like a statistical average of this moment

40:01

in time that makes me feel good, is the

40:03

goal of kid photography.

40:06

Yeah, the reality is too scary. Yeah, you're

40:08

like, if I could just round off the

40:10

edges and be like, this felt great at

40:13

several moments throughout this day, that

40:15

would be fine. I

40:18

don't actually need a pixel perfect recreation of

40:20

this moment in time. And it's

40:22

kind of weird. That is sort of the

40:24

goal, and I understand in particular, my parents

40:26

want that. But

40:28

it's right next to, are you showing somebody

40:30

an image of a thing that actually happened?

40:32

Or are you just creating an illustration? And

40:35

is that illustration like in the

40:37

age of influencers, Instagram, is that illustration

40:39

going to lead to all kinds of

40:42

weird outcomes for what people's expectations

40:44

of their own experiences should be like, because they don't

40:46

know they're looking at something fake. In

40:49

the entire ecosystem,

40:52

we talked about it on the show in the

40:54

context of the images of the Trump

40:56

assassination attempt, the entire ecosystem of validating

40:59

these images, saying what's real, what's not, which

41:02

ones are AI, which ones aren't, is just

41:04

not ready. We've

41:07

talked about it a lot. There

41:09

are initiatives and alliances, and there's

41:11

labels on meta platforms that are

41:13

being applied seemingly randomly. But

41:16

none of it works. No one trusts any of it.

41:19

And we're just barreling towards this

41:21

future of completely synthetic images coming

41:23

off these phones. And

41:25

it doesn't seem like any

41:27

of the companies give a shit. Yeah. I

41:29

feel like there's this just

41:31

shared understanding we have, especially with a

41:34

phone camera. It's like you

41:36

just pointed at a thing, you were there,

41:38

you took a picture, it happened. You see

41:40

it on Instagram. It's like, we just have

41:42

this shared understanding of like, yes, that was

41:44

a thing that happened exactly,

41:47

more or less as we saw it. We

41:50

all understand that people pose for

41:52

photos with big hats on in

41:54

front of gorgeous landscapes or whatever.

41:57

But yeah, we're just, like, coming straight

42:00

at that moment of like, okay,

42:02

we're going to have to start asking some

42:05

different questions when you're scrolling through Instagram or

42:07

just like shifting your mindset. And it's not

42:09

even like when I did the,

42:11

the sketch to image, um, I did,

42:13

you know, mess with a bunch of

42:15

photos, added cats and pirate ships and

42:17

like nonsense to them. And

42:19

I posted a bunch to my Instagram like

42:22

grid in a post, and I went

42:24

through the, um, requirements they have for like

42:26

when you need to add an AI

42:28

tag. And it didn't

42:31

even meet them. It was like, you,

42:34

you can, you know, it was like something

42:36

about, it was my, like my own personal

42:38

photo and I wasn't, um,

42:41

the way it had manipulated it was like, yeah,

42:43

it's kind of up to you, you know, put

42:45

a tag on there or don't whatever. Like, okay,

42:49

that's fine. It's

42:51

all fine. Well, see, the other thing

42:53

I've noticed a lot of, um, is

42:56

even just the basic AI features and

42:58

phones now, they're over

43:00

correcting and over sharpening. And this is

43:02

beyond like, it looks bad. It's like,

43:05

if you zoom in too far on a

43:07

lot of modern smartphone photos now, you

43:10

will see that the text is all

43:12

AI wonky and like

43:14

signs in the background because AI has tried to denoise

43:16

and sharpen and you've gotten some weird stuff, like

43:19

really small faces far away in a

43:21

lot of smartphone photos now look incredibly

43:23

distorted and demonic because there's like

43:26

three pixels and it's like, well, what if I

43:28

put a face here? And it's like, I don't

43:30

really know what a face looks like yet. So

43:32

here's a face, a vaguely face shaped denoised thing.

43:34

I think Samsung phones are, are

43:36

more egregious with this than iPhones, but they're

43:38

all doing it, um, to

43:40

varying degrees. And that is already confusing people.

43:42

Like they're like, these photos are AI generated.

43:45

It's like, you're not wrong, but you're

43:47

also not right. Like that

43:50

moment in time did happen. And then,

43:52

you

43:55

know, a smartphone interpreted this demon

43:57

face instead of just

43:59

not having information there. It's weird

44:01

when it kind of comes to the

44:04

surface where, what was the

44:06

photo of the lady trying on a wedding

44:08

dress? Yeah. Yeah.

44:11

In the mirror, she's making two different poses. Oh, yeah. And

44:13

everybody lost their minds and it turned out to

44:15

be like it was accidentally in panorama mode or

44:18

something like that. But even

44:20

that is like, this stuff has been

44:22

going on behind the scenes and just

44:24

sort of doing everything it

44:27

can with that data to make a photo that

44:29

it thinks you will like. And

44:31

then when it kind of goes sideways and

44:34

is obvious, then it's a real weird

44:36

moment of like, no, yeah, actually your

44:38

phone's been doing this. Like

44:41

you just, it just didn't look like

44:43

a demon until right this

44:45

moment. Yeah. I

44:47

think there's going to be some moment

44:50

where the

44:53

industry — probably not the industry; governments, the

44:56

EU, they're always doing stuff.

44:59

They're going to say, if you

45:01

enable these features, you have to like

45:04

hard watermark these photos as

45:06

being partially generated by AI. Because

45:08

if we don't get all the

45:10

way there, then we're

45:12

going to end up in this extremely

45:14

weird moment where only

45:17

professionals with

45:19

like cryptographic signatures and their

45:21

DSLRs are

45:23

going directly to Getty or the source of

45:25

any truth. And I don't think

45:27

that's, that doesn't seem sustainable, right? Like

45:30

only if you buy this one, like a

45:32

camera and you have a contract with Reuters,

45:35

can your photos be trusted? Like

45:37

$10,000 camera. Yeah. It

45:40

just doesn't seem, that doesn't seem like what

45:42

anybody wants. I'm, I'm hopeful

45:44

that like consumers correct this

45:46

first, but I suspect that

45:49

some government somewhere is going to have to make a

45:51

rule and people are going to have to

45:53

comply, because we're just headed that way. I

45:55

know that Google people are going to yell at me, because

45:57

every time I'm like, the photo apocalypse is

45:59

here because of some Google feature, they're like, no,

46:01

this is just what people want. And I'm like, people also

46:04

want sugar. Like, I love

46:06

cigarettes. Like,

46:11

what do you want me to do? It

46:14

always does feel like they have the best case scenario

46:16

in mind and that humans are not goblins

46:18

who do horrible things to each other sometimes.

46:20

They're like, oh no, everyone will use it

46:22

in the way that we intend, which is

46:25

the nice way. And it's

46:27

just like, no. You

46:30

only need one jerk. You only need one

46:32

jerk to ruin everything. Look

46:35

around. All right, other reviews. V, you reviewed the

46:37

Galaxy Watch Ultra this week. Yes,

46:40

yes. What kind of Apple Watch is it? A good

46:42

one or a bad one? It's

46:45

Apple Watch Ultra. That sure is what it is. But

46:49

actually, what it really is is that if you

46:52

took the Apple Watch Ultra and drew

46:54

it from memory, that's the Samsung Galaxy

46:56

Watch Ultra. Or, it's

46:58

like the Apple Watch Ultra 2,

47:00

but in an Android font,

47:02

is just basically

47:04

what you're doing with this. I

47:08

was writing the review. And then I was like,

47:10

you know what? It's actually just expedient if I

47:12

tell you what's different and bullet point everything that

47:15

is the same because there's too many things that

47:17

are the same. And then I started

47:19

doing the major things that are the same. And

47:21

then I was like, oh, but then there's this.

47:23

And then there's this. And then there's all these

47:25

tiny little features that only a psycho like me

47:27

would know and remember are on each of the

47:29

watches are there. Like, oh, you can run against

47:31

your last race performance if you're a runner. Yeah,

47:33

Apple did that in watchOS 9. And

47:35

now it's here on the Samsung. And why is

47:37

it here on the Samsung? Because I

47:39

don't know. It was on Apple. They have an

47:42

orange shortcut button that is

47:44

the quick button on Samsung and the action

47:46

button on the Apple Watch.

47:48

It's the same thing just

47:50

in a different font. And that's like you

47:52

could sum up the watches that. And also

47:55

it's a squircle. And I don't like it. I

47:58

had mixed feelings. about the squircle going in

48:00

and then after two weeks of wear, I

48:03

don't like the squircle. It is just

48:05

too chunky on my wrist. I can

48:07

fit three chopsticks into the gap between

48:10

the watch and my

48:13

wrist and it's just

48:15

a chonker. It's a very chunky boy and so orange.

48:21

The Apple Watch Ultra was orange. This

48:24

is so radioactively

48:26

biohazard orange. It's a little upsetting

48:28

how orange it is. I

48:32

don't have it on me right now but

48:34

it looks like a Halloween watch. I actually

48:37

ended up appreciating the regular Galaxy Watch 7,

48:39

which I am wearing right now, a whole

48:41

lot more because I was like,

48:43

oh, it doesn't hurt me to wear it because

48:45

it's just too big. It's too big

48:47

for my wrist and the Apple Watch Ultra is

48:49

also big for my wrist. I can stick a

48:51

bunch of stuff in here. There's a gap. Something

48:55

about it being a square just made it a

48:57

lot harder for me to wear. When I was

48:59

testing the sleep apnea feature, it was just like,

49:01

nope, we can't get readings from you. I

49:04

actually had to wear the other one to get the readings

49:06

and test that feature out. It

49:10

makes it sound bad but it's actually a really great Android

49:12

watch. I would say it's one of the best Android watches

49:14

that are available. There's

49:17

a little part of me that hurts that

49:20

Samsung decided to make an Apple Watch

49:22

in order to create this excellent Android

49:24

watch. It just felt wrong. It felt

49:26

wrong to me because I truly do

49:28

love how weird and unique the Samsung

49:31

smartwatches have been with their

49:33

rotating bezels and just Samsung

49:35

being weird about how it does

49:38

health stuff. The

49:40

AGEs metric is just completely bonkers. I don't

49:42

understand why it's there. I was definitely going

49:44

to ask you about this. By

49:47

the way, you said AGEs, and that stands

49:49

for advanced glycation end products. Yes.

49:51

Have you told me yet what that means? There's

49:56

a whole informational section that you can read

49:58

in the Samsung Health app

50:00

about what it is. And it's basically

50:02

measuring compounds when fats and

50:05

sugars in your blood oxidize.

50:08

How is it doing this? Samsung won't say.

50:10

It just says there's a new three in

50:12

one BioActive Sensor with more LED colors and

50:14

whatnot. And there's a little

50:17

spectrum and it goes from low to high.

50:20

And if you're low,

50:22

I guess you're metabolically younger

50:24

than your age. And if you're high, I

50:26

guess you're not your metabolic age.

50:28

They're measuring this through the skin? Yes. Are

50:32

they, they're not like pricking you. The watch isn't

50:34

like taking your blood. No, everything is non-invasive. There's

50:36

not like a, there's not like a chemical sensor,

50:38

like, in your sweat. They're

50:40

just doing the thing that everyone does. They're just

50:43

shining light into your skin. And based on like

50:45

how it reflects back, they're just telling you things

50:48

that have not been FDA cleared. It's very

50:50

experimental. Like I did not know what this,

50:52

this actual feature was. So, you know, when

50:54

I was getting briefed on it and I

50:57

went up to Samsung and I'm like, so

50:59

what is this? And they were basically like,

51:01

well, we have this new sensor. So we

51:03

thought, why not throw this also in here?

51:05

This is the Samsung answer of all time.

51:08

And I was like, oh, okay.

51:10

So like, how, how do you intend people to

51:12

use it? And they're like, actually, we don't know.

51:14

Like we're just kind of, we just kind of

51:16

want to see how people use it. And so

51:18

a lot of the new features are quote unquote

51:20

AI powered and you're getting these AI insights and

51:23

they're telling me things that I don't know what

51:25

to do with because on the

51:27

one hand, the watch was like, Hey, you

51:29

have not been consistent with your sleep little

51:31

lady. You should work on that. And then

51:33

I refreshed the app and it's like great

51:36

job staying consistent with your sleep schedule. And

51:38

I was just like, so which is it? Which is

51:40

it? Which is it? I had one night where I

51:42

didn't like have my sleep schedule completely perfect. So I

51:45

guess that's what it was reacting. That's not helpful. And

51:47

then the AGEs metric. They're like, well, if you

51:49

want to improve your score, I'm smack dab in the

51:51

middle. So I don't know. I guess that means I'm

51:53

my age, uh, metabolically, who knows? Wait, I just, wait.

51:56

Can I just stick with the AGEs thing for one

51:58

second? So they have a metric, AGEs,

52:01

which I think

52:03

most people know

52:05

what a metric called age is meant to measure,

52:07

which is your age. Yes. That's

52:10

a well-known label for a thing. And

52:15

then it measures, first

52:17

of all, let me just quote your own review back to you, V. Meanwhile,

52:20

the AGEs metric is baffling, is what you

52:22

have written here. It

52:24

is baffling. So then it's shining a

52:26

light through your wrist, and

52:29

then somehow, from the

52:32

reflections of that light, it

52:34

is tracking how protein

52:37

and fat are oxidized by sugar, and

52:39

then telling you whether that's a low, medium, or high.

52:42

And then they're saying that is something called age. And

52:44

I just want to point out how deeply, meaningfully

52:47

confusing that is. It's

52:49

your metabolic age. So they

52:51

love to do these things. Get out of

52:53

here. Right. But there's no, but

52:57

it's like a new metric. Is

52:59

this a metric that exists in the literature? It

53:02

is a metric that exists in

53:05

research, and scientists are studying

53:07

it. But in a

53:09

consumer watch, it means nothing.

53:11

It means absolutely nothing

53:13

in a consumer watch. So

53:16

it tells me that I'm kind of

53:18

in the medium, neither low nor high,

53:20

in the yellow section, so to speak.

53:24

So to improve your AGEs metric, here's

53:27

what you do. Does it know how old you are? Yeah,

53:29

I did put my demographic information

53:31

in. So presumably, is it

53:35

all related to your actual age? If

53:38

you're green, is your metabolic age younger than

53:40

your actual age? Yes, that's basically what

53:42

they're telling you. This is the idea. That's so

53:44

messed up. This is the idea. And then

53:47

how do they? Allison, it's not messed up. It means nothing.

53:49

It means nothing. So

53:51

it doesn't have to be messed up.

53:53

It's just baffling. It makes no sense.

53:56

I like new metrics. I'm

53:58

just trying to understand this one. So

54:00

someone somewhere is like a

54:03

25 year old has a metabolism

54:05

of X. And if you

54:07

tell us you're 25 by shining a light

54:09

from a smartwatch into your wrist, we

54:12

can tell you if you're old.

54:14

This is kind of a new thing that they're

54:16

doing in wearables. It's not just Samsung. Like basically,

54:20

Oura recently did a thing where it's like,

54:22

we can tell you what your cardiovascular age

54:24

is and whether it's aligned with your physical

54:27

age or not. And

54:29

so there's just like

54:31

this obsession with telling you whether

54:33

you are like physically speaking aligned

54:35

with expectations for your actual age

54:37

or whether you're quote unquote, physically,

54:40

physiologically younger than your actual age or older. I

54:42

feel like there's a lot of people in Silicon

54:44

Valley who are drinking the

54:47

blood of the young in order to live forever. And I'm not

54:49

trying to draw a straight line to the

54:51

Galaxy Watch Ultra. I'm just saying there's

54:53

a path, perhaps a winding path. There

54:56

is, but it's, I do, it's, I

54:58

do not want a smart ring to

55:00

tell me if my heart is dying

55:02

faster than I am. Listen,

55:05

listen, listen, there are, there are health

55:07

nuts who want this information. I don't

55:09

necessarily think it's actually good for their

55:12

mental health to have this information, but

55:14

the app was just basically like, Hey,

55:16

so here's how you can improve your

55:18

AGEs index metric. That means absolutely

55:21

nothing. And you'll never guess what

55:23

the advice is because it's to eat

55:25

healthy, sleep well and

55:28

exercise. So I'm just like, Oh, thanks.

55:30

It's always that last one it gets you. Never.

55:33

It's always the last one. All

55:36

right. Like

55:38

all Samsung devices, this all works

55:41

best if you're in the Galaxy ecosystem. Yes.

55:43

I'm assuming none of it works on an iPhone. What

55:46

gets worse if you have a pixel or you

55:48

have a OnePlus? Well, would you like to have

55:50

your EKG and AFib detection?

55:52

You will not get that unless you have

55:54

a galaxy phone. Would you like to know

55:56

if you have sleep apnea? You will not

55:58

get that unless you have a galaxy phone

56:00

because that that requires the Samsung Health Monitor

56:02

app, which is separate from Samsung Health, meaning you're

56:04

just not gonna get that. And

56:06

some of the AI features are

56:09

Galaxy Phone only as well, which

56:11

actually I think that's great for

56:13

you because the AI was

56:15

very hit or miss for me. And

56:17

all the advice it gave me, it

56:19

would be like, hey, your AGEs index is

56:21

not ideal, exercise

56:24

more. And then the AI is like, you're

56:26

exercising a little too much. You should rest

56:28

because it's affecting your sleep. And

56:30

also you're sleeping well and not well

56:32

at the same time. So it's like,

56:35

cool, great, thanks. I feel like it's a

56:37

big ask to be like, you need a

56:39

new watch and a new phone. And

56:42

a new ring and all of that stuff.

56:44

So I did last week, we

56:47

asked listeners to send us a note if

56:49

they were all in on the Samsung ecosystem.

56:53

I would, I'm just gonna guess and say that

56:56

if I asked listeners to send us notes about

56:59

why they were all in on the Apple ecosystem, we

57:02

would crash the internet, right? Like we would just get a lot.

57:04

Like, if I say in a breath,

57:07

I don't think CarPlay is very good, and it's emails for

57:09

days. All

57:12

day long, people are like, how dare you? We

57:14

asked for Galaxy ecosystem users, we got

57:16

two emails. Just

57:19

putting out there. So one

57:22

person, Sean, thank you, Sean

57:24

for emailing. They wrote in, they're

57:26

upgrading from a Fold 4 to a Fold 6,

57:30

which is, you know, great. They're doing

57:32

it because the Fold 4 broke, the warranty

57:34

on the device is cashed, and they found a

57:36

way through the cell phone warranty to get

57:38

a Fold 6. Right,

57:41

they try to use this licensed

57:45

Samsung repair company, uBreakiFix, that

57:49

costs money. So then they

57:51

try to use their Amex warranty, they were denied,

57:54

but when Amex called Samsung, they said no, and

57:56

then they went to cell phone insurance, which they

57:58

had, and that got them the Fold 6. So

58:01

that is one way to stay in the Samsung

58:03

ecosystem, is a nightmare repair journey

58:05

because the seals in your Fold 4

58:07

broke. I would not say this

58:09

is the type of email we get when we ask why people

58:11

stay in the Apple ecosystem. So that's one.

58:14

We got another one. Thank you so much for

58:16

emailing from Israel. This

58:20

one says, you asked to hear from

58:22

a Galaxy ecosystem user, I am that

58:24

user, which is great. I truly

58:27

appreciate that line. They wrote to us, I'm typing

58:29

this on my Galaxy Tab S8 Ultra. I upgraded

58:31

from Z Fold 4 to Z Fold 6. I

58:34

use the tablet form factor much more than

58:36

the phone form factor, which means they have

58:39

both a Galaxy tablet and they are constantly

58:41

using their phone in tablet mode. This

58:44

is the most Android tablet usage I've ever heard of in my entire

58:46

life. That's incredible. You're breaking,

58:48

whatever you do, everyone's like, that's 90%

58:51

of the usage. They have a Galaxy Watch

58:53

5 Pro and they're waiting on their Galaxy

58:55

Watch Ultra. Then they have four

58:57

TVs, including a Frame TV, their washer and

58:59

dryer are Samsung. They have the AirDresser, the

59:01

fridge, the oven, a robot vacuum, a

59:03

SmartThings hub, a Blu-ray player, you're my people,

59:06

and then it says, and something else I'm

59:08

forgetting, I'm sure. Are

59:10

they Korean? And then at

59:12

the end it says, my husband uses an iPhone and an

59:14

iPad. Oh,

59:19

wow. There's a case study. So my

59:21

family. I just want to say one,

59:23

thank you. I love this. I'm going

59:27

to write back, but you've neglected

59:30

to say why you've chosen that list. So I

59:32

would love to know why. And if anyone else

59:36

is out there and wants to send us a note with

59:38

their list of why, I'm dying to read them, because we

59:40

know so much about the Apple ecosystem

59:42

and how people feel about it. Trust me, we

59:44

know so much. You are not

59:46

quiet. I'm dying to know how people in

59:48

other ecosystems feel about it. And

59:51

it's really interesting to get these notes. So keep

59:53

writing in. I'm very curious. That said,

59:55

my husband uses an iPhone and an iPad is very

59:57

funny, deeply funny. There's

1:00:00

something going on. I love it. Someone

1:00:04

is very excited about RCS. Oh

1:00:06

my gosh. Oh,

1:00:10

that's so good. Yeah. I

1:00:12

mean, look, there's a

1:00:14

bunch of Samsung devices in this house. Somehow

1:00:16

I've ended up in the LG ThinQ ecosystem because

1:00:19

that's our washer and dryer and fridge.

1:00:21

Oh boy. God only knows why. And

1:00:24

now I'm like, man, I better... I got an

1:00:26

update the other... I got a notification today that new

1:00:28

songs for summer were available for the washer and dryer.

1:00:30

And I was like, yes, this is the dream. Oh

1:00:33

my God. Wait, songs for your washer and

1:00:35

dryer to play? Every quarter LG

1:00:38

sends new little icons for the

1:00:40

seasons and then issues new songs.

1:00:44

Is this why they couldn't... They couldn't

1:00:46

make phones anymore. They've been like, we have

1:00:48

to divert those resources to the

1:00:50

washer and dryer. There's 50

1:00:52

engineers with Casio keyboards being like, and once

1:00:54

a quarter they send them to me. And

1:00:59

it's great. I truly love it. Incredible.

1:01:02

We can't wrap up the Samsung conversation

1:01:05

without talking about Saturday

1:01:07

Samsung. Saturday Samsung.

1:01:10

So if you don't know, this is what

1:01:12

we have started calling Samsung's relentless attempts to

1:01:14

sell more products, which are all

1:01:17

born of the company losing some money,

1:01:20

sales are flat, and they issued

1:01:22

an edict saying everyone had to come

1:01:24

to work six days a week until they recovered.

1:01:28

Crazy. And it's their corporate employees, which

1:01:31

means a bunch of suits are sitting

1:01:33

around Samsung headquarters on Saturday being like,

1:01:35

sell more shit. They're not

1:01:37

making... They're the suits. What are they going

1:01:39

to do? So it's only weird promotions. So

1:01:41

we have covered you get a free TV

1:01:43

if you buy a TV, which is amazing.

1:01:46

We have covered you get a free TV if you buy a phone. Also

1:01:49

amazing. You would think it would continue in this

1:01:51

vein. But no, last

1:01:54

Saturday, I guess, burst

1:01:57

of creativity, and they

1:01:59

decided that... The best way to market

1:02:01

the Galaxy Z Flip in

1:02:04

America would

1:02:06

be to say that the

1:02:08

Z Flip, it's a folding phone. A little

1:02:10

flippy guy is ideal for

1:02:13

busy police officers to wear as a body

1:02:15

cam. That's

1:02:18

so Samsung. It's very good. Oh, Samsung. It's

1:02:21

so bad. I

1:02:23

have put a folding phone

1:02:25

on my shirt. I think

1:02:28

I'm safe to say I'm one of the

1:02:30

few people who's tried that outside of these

1:02:32

police officers. It's a dumb idea. It's

1:02:35

just not a good idea. It just

1:02:37

doesn't seem secure. Also,

1:02:39

don't they wear vests? They

1:02:46

worked, so they didn't just issue

1:02:48

them T-Mobile phones. There's

1:02:52

a post on Samsung's website. It's titled

1:02:54

Samsung Technology is Helping Police Authority Protect

1:02:56

the Public Safety, which is a

1:02:58

lot. They partnered with

1:03:00

a company called Visual Labs, which is,

1:03:03

quote, a leading body camera solution provider.

1:03:06

Capitalism. And then two

1:03:09

police departments in Missouri did a pilot

1:03:11

program with all of this. They have

1:03:13

a customized Z Flip that

1:03:16

has slightly different buttons. The

1:03:19

phones can be set to automatically

1:03:21

begin recording when the phone detects a

1:03:24

pursuit or if you're in

1:03:26

the car when the emergency lights are

1:03:28

turned on and then the video

1:03:30

footage is sent to the Visual Labs cloud. OK,

1:03:33

this is all very good. Looking at

1:03:35

this, that's ridiculous. It's

1:03:38

so bad. That is

1:03:41

so. Well get

1:03:43

ready because 25 more police departments

1:03:45

are going to start wearing Z Flips as body cams.

1:03:48

Oh, Lord. Oh, and I was wrong. There's

1:03:51

a partnership with T-Mobile. So there are T-Mobile

1:03:53

phones. Like, what do

1:03:55

you what do you need the rest of the

1:03:57

phone on your person for? Like, you just need

1:03:59

a camera. I just like the idea that people

1:04:01

are going to flip it open and then make

1:04:03

a TikTok and close it up. Yeah. They're

1:04:06

going to start doing some dances. And

1:04:08

yeah. No, it's not OK. There

1:04:12

is a this whole process is great, but

1:04:15

there's just other benefits, which

1:04:17

are basically this phone has

1:04:19

a camera in it. So

1:04:21

one of the benefits is, in addition to their use

1:04:23

as body worn cameras, Z Flip devices

1:04:26

can help improve evidence gathering and transparency

1:04:28

by clearly documenting details of arrests and

1:04:30

other interactions. It's absolutely not. There's literally

1:04:33

a feature on this phone. That's just

1:04:35

a camera. Draw stuff that wasn't there.

1:04:39

Body cameras do not have that. This

1:04:41

is such a bad idea. Draw

1:04:44

in the drugs. Put

1:04:47

the cocaine here in the dash. Yes,

1:04:51

the Galaxy Z Flip quote, the Galaxy

1:04:53

Z Flip additionally functions as a digital

1:04:55

camera needed for taking pictures of crime

1:04:57

scene evidence and audio recorder for witness

1:04:59

interviews and a personnel locator for tracking

1:05:01

the officer's location through GPS. This is

1:05:03

just a phone. I just

1:05:05

want to be very clear. What they have described is a

1:05:07

phone. Have you thought

1:05:09

about using a phone? This

1:05:12

is body camera. Incredible. It's very good. Draw

1:05:15

the suspect in the bushes. I just

1:05:17

want you all to imagine the high

1:05:19

fives this Saturday afternoon. There's

1:05:22

like one slice of pizza left in the box. The

1:05:25

cheese is getting a little weird. And they're like, what if we just

1:05:27

gave it to cops? You know it

1:05:29

was like a lightning bolt moment. They

1:05:32

were out the door in their cars like, we did it.

1:05:36

Days over. Sales are up. We

1:05:38

sold 25 police departments on these Z Flips.

1:05:42

They still have 25 more phones. It's

1:05:44

very good. It is truly very

1:05:46

good. Saturday, Samsung. It never

1:05:48

gets old. All right, we should take a break. We

1:05:51

are cruising our way over here. We're going to take a

1:05:53

break. We'll be right back. Have

1:05:57

a question or need how-to advice? Just

1:05:59

ask Meta AI. Whether

1:06:01

you want to design a marathon training program

1:06:03

that will get you race ready for the

1:06:06

fall, or you're curious what planets are visible

1:06:08

in tonight's sky, Meta AI

1:06:10

has the answers. Perhaps

1:06:12

you want to learn how to plant basil

1:06:14

when your garden only gets indirect sunlight. Meta

1:06:17

AI is your intelligent assistant. It

1:06:20

can even answer your follow-up questions, like

1:06:22

what you can make with your basil and other ingredients

1:06:24

you have in your fridge. Meta

1:06:27

AI can also summarize your class notes,

1:06:29

visualize your ideas, and so much more.

1:06:32

The best part is you can find Meta AI right

1:06:34

in the apps you already have. Instagram,

1:06:37

WhatsApp, Facebook, and Messenger.

1:06:41

Just tag Meta AI in your chat or

1:06:43

tap the icon in the search bar to

1:06:45

get started. It's the

1:06:47

most advanced AI at your

1:06:49

fingertips. Expand your

1:06:51

world with Meta AI. All

1:06:58

right, we're back. It's time for the lightning round.

1:07:01

As ever unsponsored. I

1:07:03

feel like the rudest thing David did was leave for

1:07:05

the week and not pay to sponsor the lightning round.

1:07:08

Like, honestly, if you're going to

1:07:10

leave me hanging, bro, sponsored

1:07:13

by David Pierce, you know what I'm saying? Like, set

1:07:15

the tone. Sadly, it's still unsponsored. We're still in the

1:07:17

market. We get a lot of weird emails from people

1:07:19

who are like, I'll do my crypto

1:07:22

startup. We're not going to let you do that. Sorry.

1:07:26

That's just not going to happen. But if

1:07:28

you have a real company and a lot of money,

1:07:30

talk to us. Someone

1:07:32

will talk to you. Not me. That's

1:07:34

the other side of the house. But someone will talk to you. All

1:07:36

right. Here's how I want to do the lightning

1:07:38

round this week. We have like three topics. Two

1:07:40

of them just have a lot of headlines. So

1:07:43

I'm just going to read all of the headlines and then we'll

1:07:45

like figure out what's going on.

1:07:48

The first one, though, is just one headline and it's our fault. Logitech's

1:07:52

new CEO, Hanneke Faber, was on

1:07:55

Decoder this week. She's

1:07:57

the new CEO. She just started like late last year. We

1:08:00

had the old CEO Bracken Darrell on several

1:08:02

times. One

1:08:05

of my ideas with Decoder is, there's more than five companies

1:08:07

in the world, and so we should talk

1:08:09

to all the companies and pay attention to them. That

1:08:11

generally goes well. And Logitech is one of those companies that I

1:08:13

think is under covered, right? They're just

1:08:15

around. You don't think about them

1:08:17

very much. And it turned out under

1:08:20

Bracken Darrell, how Logitech worked

1:08:22

was bonkers. He's like,

1:08:24

I have 23 direct reports. Everyone's

1:08:26

just allowed to do whatever they want. I buy companies,

1:08:28

I leave them alone. There's no overlap. It's like everyone

1:08:30

have a good time, which is how you end up

1:08:33

with them buying like 50 headphone companies. And

1:08:35

they're all just kind of like doing their thing. Like, I wonder

1:08:37

if they have a strategy. The answer was they do, which is

1:08:39

leave everybody alone, which worked just

1:08:42

to whatever extent. Bracken leaves Logitech.

1:08:45

He now works for, I think it's

1:08:48

called VF Corp. They own Vans and Supreme

1:08:50

and the North Face. So

1:08:52

the guy who ran Logitech is now the CEO of Supreme. Incredible.

1:08:56

We're gonna see how that goes. I'm

1:08:58

gonna try to get him back to be like, so Supreme. That's

1:09:00

the whole thing. And they have a

1:09:03

new CEO, Hanneke, who came from like Unilever

1:09:05

and Procter & Gamble. She has this background in consumer

1:09:08

goods and like marketing. Really interesting.

1:09:10

And I was like, are you gonna change this wacky

1:09:13

corporate structure? And that was what I thought

1:09:15

Decoder was gonna be about. Truly,

1:09:17

that's what I thought we were gonna talk. And then we did

1:09:19

talk about that for a while. And then she's like, I wanna

1:09:21

build a forever mouse. And this

1:09:24

keeps happening to me on Decoder where I'm like,

1:09:26

what? Explain. And she's like,

1:09:28

I wanna build a really beautiful mouse. It's like

1:09:30

very heavy, very premium. And it will just do

1:09:32

software updates forever. And that will be your forever

1:09:34

mouse. Like you'll just have it forever. And

1:09:37

I'm thinking, well, I've had this mouse

1:09:39

forever. Right. Same.

1:09:43

Right, it's like 10 years old. Are

1:09:46

people replacing mice? No.

1:09:49

Yeah, so very confused. Very confused.

1:09:52

And you can hear, if you listen to it,

1:09:54

the transcript reads one way. If you actually go and

1:09:56

listen to Decoder. I'm just

1:09:58

laughing. Like, I'm like, what are you... talking about.

1:10:01

I'm like, there's only two ways to monetize

1:10:03

hardware over time. It is

1:10:05

subscriptions or it is ads. And she's

1:10:07

like, yep, those are the two ways. I'm like, so

1:10:10

you're going to do a subscription mouse? And she's like,

1:10:12

I am. And I encourage you to go listen to

1:10:14

this, because I'm just losing my mind. What

1:10:17

am I subscribing to with a

1:10:19

subscription mouse? Right. So it's

1:10:22

AI software. No! What

1:10:27

is AI software for my mouse?

1:10:30

What, are you going to change the DPI

1:10:32

based on what I'm doing? No,

1:10:34

no, no, no, no. It's not even that. You

1:10:37

were like, here's an idea. No, no, no, it's worse than that. They're

1:10:40

putting the buttons on the mouse so that when

1:10:42

you see a field, like any text field on

1:10:44

your computer, the mouse

1:10:47

will do the prompting for you. Absolutely

1:10:50

not. So they've already rolled this out in Options

1:10:52

Plus. No. No.

1:10:55

This is a thing in Logitech mice today

1:10:57

for newer mice. Why are older

1:10:59

mice not allowed to have the software? I

1:11:01

didn't even get to, because I was just, what?

1:11:04

You can listen to me like, dewire

1:11:06

my brain in this conversation. So

1:11:09

I was like, so a mouse? And

1:11:12

she's like, yeah, this is how our video conferencing software

1:11:14

works. And I was like, but that's

1:11:17

a service. This is a

1:11:19

mouse. And she's like, yeah, a mouse, like a beautiful, and

1:11:21

I believe she said diamond-encrusted mouse. So that was

1:11:24

Decoder. These things happen to me in

1:11:26

Decoder all the time. I got it. What I

1:11:28

was not expecting was the idea

1:11:30

of the subscription mouse has just

1:11:33

like, it's a news cycle. And

1:11:36

it was just like the new CEO riffing.

1:11:38

But now there's a full news cycle about

1:11:40

Logitech subscription mice. Oh, no. And

1:11:42

I will tell you, not since like

1:11:45

HP was like subscription printers, have people

1:11:47

been this unhappy about any idea that

1:11:49

I've ever heard? I

1:11:51

mean. There are YouTubers making

1:11:53

YouTube videos about it. I've

1:11:55

seen TikToks about it. It's like crazy. It's

1:11:58

like it's living its own little life.

1:12:00

People are. throwing away their Logitech mice.

1:12:02

Like, no, no, we're not throwing away.

1:12:04

This is really going to be my

1:12:06

forever Logitech mouse now. I'm never giving

1:12:08

it up. I will say the mouse

1:12:10

and my other computer there is easily

1:12:12

15 years old. Yeah, like you just

1:12:14

get one. It's kind of gross. I'm

1:12:17

not saying it's not a little gross. They

1:12:19

get disgusting, but like you're just in an office. You

1:12:21

find one out of a box. Someone

1:12:24

put it there 10 years ago. Thumb sweat. You

1:12:26

can see my thumb sweat on this. It's

1:12:29

a radio show. Stop showing people your thumb sweat.

1:12:31

You're telling me. It's

1:12:34

just, you know.

1:12:37

Describe it for the audio listener. Diamond

1:12:41

next to me. I will tell

1:12:43

you, I rarely get deeply surprised

1:12:47

by what happens on Decoder. It's

1:12:50

mostly a show about org charts, as the listeners know. And

1:12:52

then every now and again, these past couple of months,

1:12:54

the Zoom CEO came on and was like, we're

1:12:57

building AI clones of everyone to put in Zoom meetings.

1:13:00

And again, I just encourage you. You can just listen

1:13:02

to me try to respond to

1:13:04

that in real time. And look, I'll take the heat

1:13:06

on Decoder. People are like, you're too nice, whatever. I'm

1:13:09

always trying to be better. I'm always trying to

1:13:11

ask harder questions. Just imagine

1:13:13

what it's like in real time to

1:13:16

be faced with someone being like, the future of

1:13:18

Zoom is AI clones in meetings. And

1:13:20

you have to acknowledge

1:13:22

that, not die, and

1:13:26

then formulate a series of follow up questions that make

1:13:29

any sense, that like tell a little story. It's harder

1:13:31

than it looks, is all I'm trying to say. And

1:13:33

they're serious. And you have to keep them there. Right,

1:13:35

and you have to keep them talking. It

1:13:40

is just much more challenging. And with the subscription mice,

1:13:42

you can hear me just, what are

1:13:44

you talking about? Record scratch, yeah. So

1:13:48

I was not intending to cause a full news

1:13:50

cycle about mice. It was truly

1:13:52

not my intent, but I

1:13:55

hope I was- Yeah. ... that doesn't

1:13:57

do subscription mice. I tried very hard to

1:13:59

be like, I don't think- I think that's

1:14:01

a good idea. Please don't. Yeah. But you

1:14:03

know, Logitech has this really interesting problem. It

1:14:05

is interesting to think of their big problem,

1:14:07

which is if you believe

1:14:09

that AI is going to be important. And we've talked

1:14:12

about it in this episode. It

1:14:14

might be that you just talk to your phone a bunch more. And

1:14:17

it just does stuff for you. It might be that natural

1:14:20

interfaces, natural language interfaces, using the

1:14:22

camera as an interface, all this stuff, starts

1:14:25

to take over from traditional PCs more and

1:14:27

more and more. This has been the dream for a long time. There's

1:14:31

a reason that Google and Microsoft or everybody else

1:14:33

calls AI a platform shift. And

1:14:35

if that happens, my sales go down.

1:14:40

If we use desktop computers less and we talk to our phones

1:14:42

more or whatever, maybe the PC

1:14:45

sales will go down and then my sales go down. And then we

1:14:47

just like, if you're Logitech, you're like, well,

1:14:49

how do I preserve the revenue? And

1:14:51

you're like, gold-plated, diamond-encrusted subscription mice.

1:14:53

And maybe that's not the right

1:14:56

solution. But

1:14:58

you can see the pressure. It was

1:15:00

a good conversation that I was not expecting just

1:15:03

that little bit to resonate as much as it

1:15:05

does. That said, we have to

1:15:07

find the Brother laser printers of mice. Oh,

1:15:11

the other thing, there were TikToks about

1:15:13

subscription mice. And that put me into

1:15:15

people complaining about subscriptions TikTok. And

1:15:17

then I got to one where a very

1:15:20

nice young woman was like bitching about her.

1:15:22

A very nice young woman. It got

1:15:24

me to one where I think an influencer, she

1:15:26

was just complaining about her subscription printer. And

1:15:29

all the comments were like, buy a Brother laser

1:15:31

printer. And I was like, I

1:15:33

did it. The Verge has accomplished its goal. One

1:15:36

thing we stand for. Yeah. Yeah,

1:15:40

buy one piece of hardware that lasts forever. I think the

1:15:42

Verge stands for that. All right, so that's lightning round one. It

1:15:44

feels like we're in a consensus that we're not going

1:15:47

to do AI-powered subscription mice. No. Don't

1:15:49

like it. Don't want it. Absolutely not.

1:15:51

The idea that your mouse is the one

1:15:53

that creates the prompt for OpenAI. It's

1:15:58

bold. You know, it's bold. That

1:16:00

sounds like a mouse wrote it. Yeah. If

1:16:06

you're a diehard Options Plus ecosystem

1:16:08

person, write us a note.

1:16:11

All right, so okay, here's the other two where

1:16:13

I'm just going to read a bunch of headlines for

1:16:15

Lightning Round. The first, very much related to

1:16:18

what we've been talking about with photos, there's

1:16:20

just a bunch of deep fake news this

1:16:22

week. Everyone knows it's a problem. We

1:16:24

know we got to stop it. And

1:16:26

then some people are like, screw it, we're doing

1:16:28

it anyway. And by some people, I mean Elon

1:16:30

Musk, who posted a deepfake video of Kamala Harris

1:16:33

that violates X's own policies

1:16:35

about deepfakes that were implemented under

1:16:38

his ownership. This isn't like

1:16:40

there was an old rule that he disagrees with

1:16:42

and he's bulldozing it. He made a rule

1:16:44

about deep fakes, because everyone knows this problem.

1:16:47

And then he posted this ad with Harris.

1:16:49

It's like Harris's campaign ad with the Beyonce

1:16:51

song, but they replaced

1:16:53

the voiceover with her saying Biden

1:16:55

is too old. Okay, you

1:16:58

know, like in many ways, this is like,

1:17:01

this is just standard political parody. Like

1:17:03

if I hired a Kamala Harris impersonator to do

1:17:05

this, would it

1:17:07

be a problem? I

1:17:10

don't know. But if you

1:17:12

already have the rule, it's like, don't do this. You

1:17:14

should not, you should not, as the owner of the

1:17:16

platform do it. So he doesn't care. He doesn't give

1:17:18

a shit. But that's where we are, right? It's like

1:17:20

already happening. There's already these like weird moments occurring

1:17:23

because just bad faith actors

1:17:25

on both sides, in

1:17:27

particular one side. But

1:17:29

it's already happening. Then here's the rest

1:17:32

of the headlines. Microsoft wants

1:17:34

Congress to outlaw AI generated deep

1:17:36

fakes. Google tweaks

1:17:38

search to hide explicit deep fakes.

1:17:41

So if Google de-ranks

1:17:43

your website or knocks your website for having

1:17:45

deep fakes on it, especially explicit

1:17:47

ones, search won't show them to people

1:17:50

anymore. They're just going to go away

1:17:52

from search. Congress

1:17:54

wants to carve out intimate

1:17:57

AI deepfakes, which is basically what

1:17:59

you'd expect. So there's a

1:18:01

language conversation here about what you should call these. So

1:18:04

you might call these like AI revenge porn. Oh.

1:18:06

Revenge porn is like a very loaded word for a

1:18:09

lot of reasons. So now we're going to try

1:18:11

intimate AI deepfakes. So

1:18:14

you carve that out from Section 230. So

1:18:17

that means right now you post stuff to a

1:18:19

platform like Facebook or X or Instagram or whatever.

1:18:22

Those platforms are not liable for what you post. That's Section 230. It

1:18:25

makes the internet go around. And they're saying, no,

1:18:27

actually, if you allow intimate AI

1:18:29

deepfakes, you are now liable for them.

1:18:32

And that basically means you have to moderate them. So

1:18:35

that's one idea. And then

1:18:37

the Copyright Office just

1:18:40

issued a huge report on AI and copyright, which they've been

1:18:42

working on for quite some time. And their

1:18:44

first chapter of this report was about deepfakes.

1:18:47

And they're there. You can read. It's

1:18:49

very long. It's very good. It's very

1:18:51

easy to read. We'll link to it in the show notes

1:18:53

here. But they're basically like, yeah, copyright law can't do this. We

1:18:56

need a new law. This is a disaster. Here's how we

1:18:58

think the new law should work. So

1:19:00

we're just at this moment where everyone is

1:19:02

seeing the problem very clearly. And

1:19:04

then lastly, in the Senate, the

1:19:07

NO FAKES Act has been introduced. It's not

1:19:09

close to passing, but yet another bill

1:19:12

to ban deepfakes, to regulate deepfakes. So

1:19:14

we're at this point now where everybody

1:19:16

sees the problem. Right? Google

1:19:19

as a platform is doing some stuff to stop it. Microsoft

1:19:22

as a company is asking Congress to outlaw

1:19:24

it. The government is like, we

1:19:26

don't have the laws to do it. Some

1:19:28

parts of the governments are saying we need to do it. It

1:19:32

feels like this is the first big

1:19:34

AI regulation that's going to happen. It

1:19:36

feels like it's also the first one that has to happen. Deepfakes

1:19:39

bad. Deepfakes bad. So yeah.

1:19:42

Yeah. It's like this Spider-Man pointing

1:19:44

at Spider-Man meme. We're like, what's

1:19:46

going on? Who's in charge here?

1:19:50

Elon Musk is making rules about deepfakes and then

1:19:52

not following those rules. Not following them. The government

1:19:54

is like, yeah, we don't know. I

1:19:57

agree. Deepfakes bad, especially the ones that

1:19:59

are being used. We'll

1:28:00

just synthesize the tabs and build a comparison for you,

1:28:02

which is legitimately very cool. Um, and at least you

1:28:04

do have to load the web pages, which

1:28:07

is like a good step. Yeah.

1:28:10

Um, but you see the, the

1:28:12

web is just fully changing. Like

1:28:15

whatever you think AI is doing

1:28:17

to productivity or workers or whatever

1:28:19

right now at this moment, how

1:28:21

we think about the web and

1:28:23

in particular search on the

1:28:26

web, it's already, the past

1:28:28

is past, like it's gone. It's in the rear view

1:28:30

mirror. We might only be a little bit far away

1:28:32

from it, but it's not coming back. Like

1:28:34

we're, we're just in a mode where you're

1:28:37

going to sign up to use a search engine and maybe

1:28:40

we'll have a different set of sources than the next

1:28:42

search engine based on the deals they have made, which

1:28:44

is going to be really weird. It's

1:28:46

just strikes me every time,

1:28:48

you know, Reddit comes up as that, like

1:28:51

we're racing to have, you know, Google

1:28:53

wants to answer your question without you

1:28:55

clicking on anything. Chat bots

1:28:58

want to answer your question or like

1:29:00

robots can answer your question, but

1:29:02

then also like Reddit

1:29:04

is incredibly valuable and it's just

1:29:06

people talking to people. Like that's

1:29:08

the thing that, you know, we

1:29:11

want to get to in search now where

1:29:13

I'm like, you know, I have a question

1:29:15

about my plant or whatever. It's like, I,

1:29:18

I don't want the AI generated answer.

1:29:20

I don't want whatever blurb, like

1:29:24

I want to read an answer from a

1:29:26

person and it's just strange to me that

1:29:28

that's at the center of this. Like, what

1:29:30

if you could just talk to robots? But

1:29:35

the robots need to listen to a person who knows

1:29:37

the answer. Yeah.

1:29:40

I mean, and Reddit is itself being polluted

1:29:42

with AI-generated text. So that's

1:29:45

the next turn. And then on top of it, all

1:29:47

the new information is a bunch

1:29:49

of kids talking on TikTok.

1:29:52

Oh, and like, and we found,

1:29:54

we found out this week, uh, that TikTok is one of

1:29:56

the biggest customers of OpenAI because they're

1:29:58

using all that for the recognition algorithms and

1:30:00

sorting and try like, and you're like, Oh,

1:30:02

this is a weird little circle where it's

1:30:05

there are going to be more silos on

1:30:07

the internet than ever before, because the open

1:30:09

web did not make enough of the

1:30:12

people who make the original content enough

1:30:14

money. And

1:30:16

so like, no one wants to repeat

1:30:18

that. So now it's like, you got to pay us. And

1:30:21

like, whether that actually comes back to individual Redditors

1:30:23

who are contributing remains to be seen. But Reddit

1:30:26

itself, Steve Huffman's like, yeah, I'm just gonna block you. Like,

1:30:28

I got my money from Google and now everyone else is blocked unless

1:30:30

they pay up. And I think he can do it. Because

1:30:33

he just needs Google. Like, Google is going to send

1:30:35

Reddit a lot of traffic and that's fine. You

1:30:37

can block Bing and be like pay up or you're dead. And

1:30:40

I think eventually they're all gonna end up

1:30:42

paying. Because if you have an

1:30:44

AI search product that can't look at Reddit, I think

1:30:47

a lot of people are like, well, where's the good stuff?

1:30:50

And you're just gonna end up in a weird loop. We've

1:30:52

been covering this a lot. Like, for over a

1:30:54

year now we've covered sort of the end of the

1:30:57

SEO industry and what that did to the

1:30:59

web and all this stuff because we wanted to mark like

1:31:01

this is what it looks like now. And

1:31:04

I didn't I honestly did not expect

1:31:06

it to feel as different as

1:31:08

fast as it feels today. Like the

1:31:10

the end of the search era that we knew is I

1:31:12

think just firmly here. And you can see even this

1:31:14

week, it's just like, here's six more

1:31:16

headlines that are just like, here's how it's changing.

1:31:20

Mm hmm. I keep thinking about a friend

1:31:22

who was buying a

1:31:24

new mattress. And she's like, I just

1:31:26

asked Claude. Yeah, Claude did it for

1:31:28

me. Claude was like, yeah. Which is the, which

1:31:32

one is that, Perplexity? It's

1:31:34

Anthropic. Yeah. Oh, God. And

1:31:36

it's just like my eyes get wider. I'm like,

1:31:39

the poor little mattress review dot com

1:31:41

that it was scraping. You know,

1:31:43

I don't know. That's a whole minefield. You know,

1:31:46

my husband does it. My husband is like an

1:31:48

avid ChatGPT-as-a-search-engine

1:31:50

user before it even had this product. And he's

1:31:52

been using Gemini and all the other stuff. And

1:31:54

he'll be like, Oh, yeah, I just asked that

1:31:57

this is how I built some code for this.

1:31:59

And this is how I, like, cause

1:32:01

he just, Google just doesn't give him the answers

1:32:03

that he needs anymore. And so like, I've just

1:32:05

been watching them just completely shift how they use

1:32:07

the internet over the last year. And I'm just

1:32:10

doing the same things that I have always done.

1:32:13

And I just am like, this

1:32:15

feels like getting left behind. Maybe

1:32:17

I should try using it. Maybe

1:32:20

I should try using it more in the ways that they

1:32:22

do, but I just run into the same problem as you

1:32:24

do, Allison, where I'm just like, I just, I

1:32:26

don't like the source that they pull from in

1:32:29

these things. So I'm just like,

1:32:32

oh, that's, I don't trust randoblogsite.com.

1:32:34

Give me the other one. I

1:32:36

started randoblogsite.com. Not all

1:32:39

randoblogsite.com. Respect your

1:32:41

elders. Hey,

1:32:44

that was my first website. You

1:32:48

know, I've been using ChatGPT for Olympics trivia

1:32:50

cause I assume that the, you

1:32:52

know, like, it's like, what's this cable on

1:32:54

the fencer? Like, the answer has been the same for a

1:32:56

long time. So you just like,

1:32:58

I have ChatGPT mapped to the Action Button on my

1:33:00

iPhone 15 Pro and it works. Like

1:33:03

it just delivers an answer to you. And you're like, I don't

1:33:05

even know that's right. Right. It's

1:33:07

close enough. Yeah. And the stakes are

1:33:10

low. Yeah. I

1:33:13

hope that's true. You know,

1:33:15

you're not like judging the

1:33:17

fencing. Like judges are not

1:33:19

using that. Actually we

1:33:22

have a big feature up on the site this

1:33:24

week about AI being used to judge gymnastics.

1:33:26

You should read that. It's very good. I

1:33:28

mentioned that because we are way over. Thank

1:33:30

you. We got to

1:33:32

wrap this up. Thank you for indulging in chaos

1:33:34

for a Vergecast, title to be announced. And that was

1:33:36

great. That was fun. It was good times. David,

1:33:39

if you're listening, you're dead to me. You never come back. That's

1:33:43

the way it goes. Like

1:33:45

a goldfish over here. All

1:33:48

right. That's it. That's the Vergecast. That was really

1:33:50

fun. Thank you for listening. Rock and roll. And

1:33:55

that's it for The Vergecast this week. Hey, we'd love

1:33:57

to hear from you. Give us a call at 8. The

1:34:02

The Vergecast is a production of The Verge and

1:34:04

Vox Media Podcast Network. Our show is

1:34:06

produced by Andrew Marino and Liam James. That's

1:34:08

it. We'll see you

1:34:11

next week. Have

1:34:14

a question or need how-to advice? Just

1:34:17

ask Meta AI. Whether you need

1:34:19

to summarize your class notes or want to create

1:34:21

a recipe with the ingredients you already have in

1:34:23

your fridge, Meta AI has the

1:34:25

answers. You can also

1:34:27

research topics, explore interests, and so

1:34:29

much more. It's the most

1:34:32

advanced AI at your fingertips.

1:34:34

Expand your world with Meta

1:34:36

AI. Now on

1:34:39

Instagram, WhatsApp, Facebook, and Messenger.
