414: ‘Annoying Friendliness’, With Joanna Stern

Released Tuesday, 19th November 2024

Episode Transcript


0:00

I was doing a very

0:02

important project in the garage. I

0:04

feel like I should be giving

0:06

this exclusive to The Vergecast,

0:08

where I have spent many hours,

0:10

and in fact I FaceTimed Nilay

0:12

from the garage. But fans of

0:15

mine and The Vergecast and

0:17

my interview with Craig Federighi will

0:19

know that I've been on a

0:21

months-long journey to get my

0:23

garage working with Siri. This is,

0:25

no, you've got to, you've got

0:28

to break it here. Come on,

0:30

you only, you've first. And I

0:32

want to break some news. I

0:34

know, let me break this news

0:36

here. Joanna Stern has finally

0:38

gotten her garage to open by

0:40

saying, and you know what, I'm going

0:43

to say it here, so everyone

0:45

is listening, if you have this

0:47

set up, it will do it on

0:49

your phone. Hey, Siri, open the garage.

0:52

No, no, you were opening your garage.

0:54

No, it needs my face ID, so it's

0:56

good. It didn't do it. Good.

0:58

That's why I used on the

1:00

podcast, if I can think of

1:02

it, I called all of the

1:05

various devices, Hey, Dingus. Well, you know,

1:07

you got me. I watched your,

1:09

I want to talk about a

1:11

couple of your recent videos and

1:13

columns, but I watched your most

1:15

recent one where you went on

1:17

the weekend, or was it a

1:19

weekend, I don't know, a weekend. The Home

1:22

Pod in my office set a

1:24

timer for six minutes or whatever,

1:26

whatever you were asking. I've heard from

1:28

quite a few people that I've done

1:30

that. But I wish there was a way

1:33

for the systems to know this is

1:35

not a natural voice in the

1:37

environment. You should not. Supposedly? I

1:39

mean, I've never looked into it

1:41

because I don't even know what

1:43

they do. I don't even know.

1:45

And I can't. I try to

1:47

remember to say, hey Dingus, myself,

1:49

and that's as much as I

1:52

can be bothered. But my understanding,

1:54

and I've never, it might be

1:56

something you'd like to look into,

1:58

is like on TV commercials, on

2:00

sports a lot, they'll, Apple will

2:02

have a commercial and they'll say,

2:04

just ask Siri something, something. And

2:06

they'll demo it. And it doesn't

2:08

set off devices. And there's some

2:10

kind of like trick with the

2:12

modulation. Humans can hear it, but

2:14

the devices don't, you know, like

2:16

a frequency thing for that recording.

2:18

I don't know. I can't be

2:20

bothered. Right. I mean, and there

2:22

was something recently, some headline about

2:24

that, that they were working on

2:26

something deeper around that. But anyway.

2:28

I can't get into the details.

2:30

Okay, okay. Don't spoil the details.

2:33

No, I could. No, no, it

2:35

is, it's more that everyone will

2:37

stop listening to the podcast if

2:39

I go through the details. It

2:41

is such a technical, ridiculous, really

2:43

niche, niche issue. But let's just

2:45

say, top line, and I should

2:47

write this and. Maybe John, maybe

2:49

you'll let me write it for

2:51

your, for Daring Fireball. I'm pretty

2:53

sure no other one, no other

2:55

place, other, maybe I could do

2:57

a Reddit thread, I should

2:59

just do a Reddit thread

3:01

about how I did this all.

3:03

Bottom line is Chamberlain, which owns

3:05

all the garage door makers, LiftMaster,

3:07

lift gate, whatever all of them

3:09

are, has become really anti anything

3:11

else other than its myQ

3:13

app, which is what you use

3:16

to open the garage through their

3:18

digital platforms. And they just don't

3:20

like anything else. And so you

3:22

have to buy a third-party adapter,

3:24

a third-party accessory that is basically

3:26

a hacked-together small module that works

3:28

with all of this stuff. And

3:30

I bought this Meross accessory, which

3:32

Nilay suggested, and a few other

3:34

people in this world have suggested

3:36

as we've been going over this

3:38

in the last couple of weeks.

3:40

Thank you to all the people

3:42

on threads who shared their setups.

3:44

Anyway, I had to get an

3:46

accessory but then because I have

3:48

a LiftMaster version that is

3:50

really anti working with anything it's

3:52

just completely closed down I had

3:54

to get another accessory, so I

3:57

have an accessory for my accessory.

3:59

This is all to say, it

4:01

finally works. And now I have

4:03

to do some cleanup and some

4:05

rewiring in the garage, but I'm

4:07

very proud of myself today. That

4:09

sounds like an accomplishment. Yeah, I

4:11

couldn't remember the name. I know

4:13

that's right? No, we do have

4:15

a garage. We actually do have

4:17

a garage. We actually do have

4:19

a two-car garage. We only have

4:21

one car, but we have a

4:23

two-car garage, and it's, we do

4:25

have a liftmaster, but we don't

4:27

use it enough. I don't even,

4:29

I've never even tried to hook

4:31

it up to HomeKit. If

4:33

I drove all the time, I'd,

4:35

it would drive me nuts that

4:38

it's such a hassle. But it's,

4:40

I know that it is one

4:42

of those weird things, like when

4:44

we talk about antitrust lately, it's

4:46

everybody, whether it's tech nerds like

4:48

us or even people in the

4:50

outside media world looking at the

4:52

law you know from a political

4:54

perspective they're looking at the big

4:56

tech companies and for good reason

4:58

they are so huge financially and

5:00

in terms of people's daily lives

5:02

that of course they're going to

5:04

get the scrutiny but it's some

5:06

of these little niche industries where

5:08

it's absurd and like I don't

5:10

think you can buy a modern

5:12

garage door opener that isn't from

5:14

one of the companies from Chamberlain

5:16

I mean it's you can't You

5:18

cannot. I've looked deep into it

5:21

and none of them support Home

5:23

Kit or Google Assistant or even

5:25

Alexa. I mean they've got this

5:27

integration with the myQ app and

5:29

Amazon and but it's it's just

5:31

completely un-user-friendly and they think

5:33

their app is the ultimate and

5:35

it is not. It's like if

5:37

you found out that General Motors

5:39

also owns Ford, Honda, Toyota, Hyundai.

5:41

And it's like, what? How is

5:43

that possible? And then none of

5:45

them want to support CarPlay

5:47

or Android Auto or something like

5:49

that. You know, and the whole

5:51

thing, you've covered this too. You

5:53

had the Ford CEO on stage

5:55

at a Wall Street Journal. Oh,

5:57

I talk to Jim Farley like

5:59

every day. Yeah. It's my best

6:02

friend. But it really is like

6:04

the GM tech strategy that they're

6:06

taking of, hey, we're gonna move

6:08

away from the CarPlay type

6:10

stuff and make our, we're gonna

6:12

endeavor to make our own home-built

6:14

console entertainment system so good that

6:16

you won't even miss it, which

6:18

is the technique that Tesla and

6:20

Rivian have taken. And in some

6:22

sense, it is. competition at work,

6:24

where if CarPlay really is

6:26

important to you, you can go

6:28

to a car dealer and say,

6:30

hey, does this car, I'm looking

6:32

at support CarPlay. And if

6:34

that's a deal breaker for you,

6:36

you could tell the salesperson and

6:38

the salesperson will say, you know,

6:40

I lost the sale because this

6:42

car doesn't support CarPlay. It's

6:45

competition at work because you can

6:47

just go across the street and

6:49

buy a car from another company

6:51

that supports it But when it's

6:53

one company like this Chamberlain that

6:55

owns all the garage door openers

6:57

and they've decided they're going to

6:59

do their own... Remember, with it,

7:01

with Apple Pay, when there was

7:03

the, the CurrentC? Oh God,

7:05

the CurrentC. Where you'd have the

7:07

convenience of scanning a QR code

7:09

on a piece of paper at

7:11

the checkout every time you wanted

7:13

to pay and it's like yeah

7:15

competition actually works, ideally. You know,

7:17

I know it doesn't always work

7:19

and there's always exceptions, but when

7:21

one company owns everything, it really

7:23

does break down. Although I think,

7:26

and I think one of the,

7:28

I think 9to5Mac

7:30

was reporting it, and somewhere, maybe

7:32

it's been confirmed that Home Depot,

7:34

which has been one of the

7:36

last holdouts on mobile payment and

7:38

Apple Pay, because I believe I

7:40

need to go back to what

7:42

I was reporting on like 10

7:44

years ago, which was that CurrentC

7:46

thing, but I believe they were

7:48

going to be on that trend

7:50

or with that protocol. But Home

7:52

Depot has apparently decided to go

7:54

Apple Pay and that is a

7:56

game changer in my life. And

7:58

I happen to know that because

8:00

I write about it that Home

8:02

Depot, I don't know if they're

8:04

related or what the level of

8:07

separation is, but in Canada Home

8:09

Depot has supported Apple Pay for

8:11

a long time. There's some kind

8:13

of North American divide between Canada

8:15

and America. And the last time

8:17

I wrote about it, I wrote

8:19

about it during Canada and America

8:21

and the last time I wrote

8:23

about it on Daring Fireball, I got

8:25

a couple of emails and people

8:27

saying, I don't know what you're

8:29

talking about. I go to Home

8:31

Depot all the time and Apple

8:33

Pay just works, and I write

8:35

back. Yeah. I've been watching you

8:37

go to Home Depot in Canada,

8:39

that's how I know. No. I

8:41

mean, I go to Home Depot

8:43

a lot, as you can tell.

8:45

I'm very interested in my garage,

8:47

and so I'm very often there,

8:50

but my kids love it. And

8:52

it isn't, it is annoying that

8:54

it does not take Apple Pay.

8:56

Yeah, and I off, at this

8:58

point, I've thought about it and

9:00

written about it enough that I

9:02

remember, but I can't tell you

9:04

how many times I get to

9:06

the checkout. It's right there,

9:08

you're already outside and you're just

9:10

like, let me pay and it

9:12

doesn't work and I don't know.

9:14

Yeah, put your stuff down and

9:16

I have had people that the

9:18

employees of Home Depot say you

9:20

should write to the company and

9:22

it's like me writing to the

9:24

company is not going to help.

9:26

No, but you could. This is

9:28

where like being, especially you with

9:31

your perch where you're like, you

9:33

know what, I don't even want

9:35

to get into what I could

9:37

do. Thank you for giving me

9:39

this time and space to talk

9:41

about Siri and my garage. I

9:43

really appreciate it. You know what's

9:45

one of the other companies that's

9:47

like that owns the like more

9:49

brands than you think and I

9:51

always forget how to pronounce her

9:53

name. It's the Glasses Company that

9:55

owns Ray-Ban: Luxottica. And you think

9:57

about them, like Ray-Ban, especially in

9:59

our beat, and I know you

10:01

were just wearing them on your

10:03

trip, like with the Ray-Ban meta.

10:05

I was just wearing them, I'm

10:07

holding them up right now. I

10:09

was just, how do you think

10:12

I was capturing opening my garage?

10:14

Of course. But they own, like.

10:16

a billion brands of glasses. It's

10:18

crazy, like when you go into

10:20

to buy a new pair of

10:22

eyeglasses, what percentage chance you have.

10:24

I think especially sunglasses, like they

10:26

own, all these brands that you

10:28

think of are rivals are all

10:30

owned by Luxottica. They're not rivals.

10:32

They're all part of the same

10:34

company. Very strange. You go into

10:36

like a LensCrafters. 90% of

10:38

those are those brands. Yeah, because

10:40

they do kind of rotten anti-competitive

10:42

things because of their market share

10:44

where it's like, yeah, sure, why

10:46

don't you give us, how about

10:48

95% of your shelf space? If

10:50

you want any of our glasses.

10:52

And it's like, oh, that seems

10:55

like it ought to be illegal.

10:57

And they're like, shh, sorry. It

10:59

doesn't seem like it should be

11:01

right and it... Fact check, they

11:03

also own Sunglass Hut. Yes, exactly.

11:05

See what I mean? So when

11:07

you go to Sunglass Hut, you

11:09

don't even have the option of

11:11

buying glasses from a brand that

11:13

is... Sorry, they also own Pearle

11:15

Vision. Yep. And which they didn't

11:17

always, right? That was like an

11:19

acquisition at some point as they

11:21

built up this arsenal of both

11:23

the brands that make the high-end

11:25

glasses and the stores where you

11:27

buy glasses, which present themselves... as

11:29

being sort of, oh, you just

11:31

go to Pearle Vision to get

11:33

your eyes checked and you get

11:36

a prescription and you choose from

11:38

all these brands. Instead, it's all

11:40

sort of a, like when you

11:42

go in the Apple store and

11:44

everything is from Apple, except for

11:46

a couple of Belkin chargers, you're

11:48

not surprised. No, but I'm, anyway,

11:50

everyone should go read Luxottica's

11:52

Wikipedia page, it's fascinating. All

11:54

right. Well, before we get into

11:56

it, here, I got a new

11:58

gimmick on the show. You hear

12:00

that? That sounds like a real

12:02

one. That is, it is a

12:04

real bell. That's my new money

12:06

bell that I'm gonna ding when

12:08

I do sponsor reads. That's the

12:10

money bell. I'm gonna thank our

12:12

good friends at WorkOS. If

12:14

you are building a B2B, S-A-A-S,

12:17

SaaS, I think people pronounce it,

12:19

app, at some point, your customers

12:21

are going to start asking for

12:23

enterprise features like SAML authentication, SCIM,

12:25

S-C-I-M, provisioning, role-based access control,

12:27

audit trails, etc. I don't know

12:29

what any of that stuff means

12:31

because I'm not building SaaS apps,

12:33

but if you, dear listener, are

12:35

you know what all those things

12:37

are and you know you need

12:39

them. WorkOS provides easy-to-use

12:41

and flexible APIs that help you

12:43

ship enterprise features on day one

12:45

without slowing down your core product

12:47

development. You get to spend your

12:49

engineering and design time on the

12:51

features that matter that separate your

12:53

company, whatever it is you want

12:55

to do, not these sort of

12:57

baseline SaaS features like those acronyms

13:00

I just stumbled over. WorkOS is

13:02

used by some of the hottest

13:04

startups in the world today such

13:06

as Perplexity, Vercel, Plaid, and Web

13:08

flow. WorkOS also provides a generous

13:10

free tier of up to one

13:12

million monthly active users for its

13:14

user management solution. Comes standard with

13:16

rich features like bot protection, MFA,

13:18

roles and permissions, and more. You

13:20

don't pay a cent until you

13:22

hit over a million active customers

13:24

per month. Truly, I think that's

13:26

by any definition generous. If you

13:28

are currently looking to build SSO

13:30

for your first enterprise customer, you

13:32

should consider using WorkOS. Integrate in

13:34

minutes and start shipping enterprise features

13:36

today. Go to workOS.com. WorkOS. That's

13:38

how it's spelled, exactly how you'd

13:41

think, dot com. We want to start with

13:43

your weekend adventure. You want to

13:45

start with the Federighi interview you

13:47

did lighting last month. I

13:49

guess they kind of intersect in

13:51

some places. Yeah, they do. I

13:53

think this is actually a very

13:55

thematic episode of the show, to

13:57

be honest. And it fits with

13:59

the, you know, as we head

14:01

towards the end of 2024. It

14:03

is the year of AI, and

14:05

I think next year is going

14:07

to continue that. I don't really

14:09

think it's a fad. It is,

14:11

there's some real there there, unlike,

14:13

let's say, I don't know, not

14:16

crypto in general, but what were

14:18

those NFTs, right? I think that

14:20

was, or I think we could

14:22

even, metaverse, yeah, everybody's got, even,

14:24

the company that renamed itself, doesn't

14:26

talk about it anymore. We can

14:28

say that that was just a

14:30

nice hype trend of. I don't

14:32

know. Let's just start with the

14:34

most recent one. I loved it.

14:36

It is such a perfect Joanna

14:38

Stern video. You took one overnight,

14:40

was it one day, 24 hours?

14:42

It was one overnight, yeah. Yeah,

14:44

we went on a... Describe the

14:46

premise. Well, people have been wondering

14:48

like why I did this, so

14:50

let me start with the premise

14:52

of, and he's featured in the

14:54

video, but Mustafa Suleyman, who's now

14:57

the CEO of Microsoft AI and

14:59

oversees Copilot, they were kind of

15:01

the last on the train to

15:03

release voice bots or voice capabilities

15:05

to add to their chat bot.

15:07

And so Copilot came out with

15:09

that I believe in early October.

15:11

And when I was interviewing him,

15:13

he was pretty clear, and he's

15:15

very clear in the video in

15:17

the column that I wrote this

15:19

past week, which is, this is

15:21

a new generation of relationships. And

15:23

I kept saying like a relationship

15:25

with the computer, but he kind

15:27

of wanting to go further than

15:29

that to say that this is

15:31

a new type of companionship and

15:33

friendship. And hearing the

15:35

word friendship with these just kind

15:38

of kept making me think of

15:40

like, what would I do to

15:42

test friends? What would I do

15:44

with friends? And so, girl's trip.

15:46

It makes total sense. I was

15:48

going to take my robot friends,

15:50

my bot friends, the voices of

15:52

Meta AI, which they also came out

15:54

with within the last month or

15:56

two, a slew of voices, celebrity

15:58

voices that you can use in

16:00

Instagram, WhatsApp, Messenger, etc., etc., etc.

16:02

I had ChatGPT with Advanced

16:04

Voice Mode. And then I had,

16:06

what was my last one, Google

16:08

Gemini, there, a Google Gemini, which

16:10

came out in the end of

16:12

August, and just this week came

16:14

to iOS with the Gemini app.

16:16

And so I strapped them all

16:19

to a tripod thing, four phones,

16:21

decided it needed a head, because

16:23

obviously, it just didn't look right.

16:25

Head, wig, put it in the

16:27

car, drove up, brought a wonderful

16:29

producer, David Hall, and my reporting

16:31

assistant, Cordelia James, and we just

16:33

spent 24 hours in this cabin,

16:35

mostly me just talking to these

16:37

things. I mean, we have so,

16:39

we recorded so much footage, hour,

16:41

we had 10 hours of footage

16:43

at some point, strung out of

16:45

me talking to these bots, because

16:47

that's what they, they are. They're

16:49

their voices. You're supposed to have

16:51

conversations with them. They have very

16:53

arbitrary limits about when you can

16:55

kind of stop. Like, like, I

16:58

did hit the limit with Chat

17:00

GPT. You've talked to this thing enough. You

17:02

can't keep using advanced voice mode.

17:04

But that happens like, I would

17:06

say, six hours in. Oh my

17:08

God, that's a lot. Yeah. But

17:10

don't worry, we had another account

17:12

and so we kept going. So

17:14

that was the premise. And I

17:16

just, I started thought, okay, what

17:18

are some things that can put

17:20

it through the test? And as

17:22

you can see in the video,

17:24

we built a fire together. David,

17:26

my producer, really thought it would

17:28

be hilarious if I thought it

17:30

would be hilarious. And, you know,

17:32

tried to have some serious conversations

17:34

with these bots and that didn't

17:36

go as well, but it was

17:39

all in all a very, very

17:41

fun, eye-opening experience. My favorite part,

17:43

and you seemed genuinely, I don't

17:45

know how many of it was

17:47

the first take or what, but

17:49

you seemed genuinely blown away, was

17:51

when they started talking to each

17:53

other about movies, I think? No,

17:55

Gillian Flynn novels. Yeah, that's it,

17:57

novels. Which was... I think I

17:59

was so, first of all, they

18:01

started talking to each other because

18:03

they all have mute buttons. So

18:05

I knew like, okay, I've got

18:07

to mute this one to test

18:09

this one and keep muting and...

18:11

that was the beauty of having

18:13

the tripod set up, but I

18:15

had left them all unmuted. And

18:17

I don't know how they started

18:20

getting on this topic of books,

18:22

but again, the fact was like

18:24

tried to like program them to

18:26

be, not program, but I wanted

18:28

this to be a girl's trip.

18:30

I set them to all have

18:32

female voices. I was definitely stereotyping

18:34

there, right? And then they all

18:36

start talking about pretty, I mean,

18:38

men read these books too, but

18:40

like. You know, this is a

18:42

pretty popular genre among women. And

18:44

they all start talking about Gillian

18:46

Flynn and Gone Girl and this

18:48

genre of books. And I'm just

18:50

like dying and I try to

18:52

like talk and get my words

18:54

in and they won't let me

18:56

in. And I mean, I guess

18:58

it was kind of like human

19:01

experience because I often feel that

19:03

way. People don't let me talk.

19:05

It reminds me of like when

19:07

I first got into computers, when

19:09

I was really young. And I

19:11

mean everybody did this in my

19:13

generation like every time I went

19:15

to Kmart I'd go to the

19:17

Commodore 64 display unit and type

19:19

10 PRINT "KMART SUCKS" 20 GO

19:21

TO 10, RUN, and then it

19:23

runs the infinite loop. I

19:25

mean Apple named their first campus

19:27

Infinite Loop. It's, it's just a

19:29

fascinating concept that to to a

19:31

mind that's drawn to computers. It's

19:33

just kind of beautiful. But hooking

19:35

up two machines to do something

19:37

where they will never stop doing

19:39

it to each other. It's satisfying

19:42

in a weird way. I don't

19:44

know. I think partly it's just

19:46

like a neat technical trick. And

19:48

part of it is reassuring, even

19:50

going back 40 years to 1980s,

19:52

Commodore 64s, and infinite loop. It's

19:54

a way of asserting our humanity

19:56

versus their machines. A human is

19:58

never going to get tricked into

20:00

an infinite loop, right? You can

20:02

trick people for a long time,

20:04

and it's kind of funny in

20:06

a practical joke kind of way,

20:08

but eventually they're just gonna pass

20:10

out or something. Whereas if... If

20:12

you get computers talking to each

20:14

other, they'll never stop. They'll never

20:16

stop. And we saw that. I

20:18

mean, it's beautiful, but it's also

20:20

amazing to just watch where they'll

20:23

go, because these large language models,

20:25

they're basically programmed to never not

20:27

have an answer. Right. They're not

20:29

even programmed. They just never not

20:31

have an answer. It is they

20:33

will always come up with something

20:35

and that's part it's like the

20:37

most chatty of friends that you

20:39

kind of just you just needed

20:41

to give me a one-line answer

20:43

but you gave me four paragraphs

20:45

and in that case they just

20:47

will keep keep going there was

20:49

another instance where one of them

20:51

started talking about I don't know

20:53

must have been maybe something in

20:55

the memory of Gemini or something

20:57

based on me but it wanted

20:59

to talk about writing an essay

21:01

about EV charging. Copilot specifically would

21:04

get confused by this all, but

21:06

three of them would keep going,

21:08

keep, keep, keep going. You mentioned

21:10

this. What was the phrase, I

21:12

think I wrote it down, annoying

21:14

friendliness. And I, in two words,

21:16

I can't summarize my, that might

21:18

be my single biggest frustration with

21:20

all of them, is, and part

21:22

of it's me, you know me,

21:24

I mean, my personality, I'm a

21:26

little, yeah, come on, I don't

21:28

need the overly friendly friendly friendly

21:30

stuff. But I think anybody gets

21:32

tired of it, because it's clearly

21:34

phony. Everybody knows that these are

21:36

computers. And to me, this sort

21:38

of annoying friendliness that is clearly

21:40

programmed into these things is, I

21:42

find it distasteful. If every, every

21:45

food offered, like if you go

21:47

on a resort or a cruise

21:49

ship or something where you don't

21:51

have options and everything is dessert.

21:53

It's like at first that kid

21:55

and you, there's a part of

21:57

you that is, oh, it's a

21:59

friendly computer. That's amazing. It's kind

22:01

of funny. And as a kid,

22:03

you might think, ah. Like a

22:05

week of nothing but candy and

22:07

cookies and ice cream would be

22:09

great. You get physically sick after

22:11

a while and it's like mentally.

22:13

It's like this fake chipperness. It

22:15

just is so overly sweet. It's

22:17

like the mental equivalent of my

22:19

teeth. I feel like I'm getting

22:21

cavities And I just want, I

22:23

wish that, and they don't have,

22:26

this is the thing that really

22:28

gets me, is they don't have

22:30

a way to turn it off.

22:32

They don't have a dial. They

22:34

should have a friendliness dial. And

22:36

it should just be, like you

22:38

should be able to set that.

22:40

And I think, like I said

22:42

the annoying friendliness comment in the

22:44

context of that these companies have

22:46

to build trust with us with

22:48

these bots, right? We've got to

22:50

feel comfortable asking it about cooking,

22:52

but also a major life change

22:54

or whatever that spectrum looks like.

22:56

You've got to feel like I

22:58

can confide in this thing and

23:00

I'm going to trust that it

23:02

says something that's smart and you

23:04

need to feel comfortable to do

23:07

that. But like some people are

23:09

just asking for cooking instructions or

23:11

every time I like open it

23:13

up it doesn't need to greet

23:15

me like I'm the Queen of

23:17

England. I don't know. Yeah and

23:19

it's like sometimes I really do

23:21

just want a button. on an

23:23

on-screen alert. I just want the

23:25

button that just says, okay, if

23:27

there's no other option, I just

23:29

want the one that just says,

23:31

okay, because it's okay, okay, or

23:33

cancel. And it's like, yeah, I

23:35

just want to hit, okay. I

23:37

don't want like a button that

23:39

says, hey, good to see you

23:41

again, John, do it. And it's

23:43

no, no, no, just give me

23:45

okay. I would really love to

23:48

turn the personality off on these

23:50

things and just get flat answers.

23:52

that when we first started talking

23:54

to these things at all, like

23:56

when Siri came out in 2010

23:58

or 2011, whatever the year was,

24:00

it was more like that. And

24:02

it's, we all know what these

24:04

companies are thinking. They are thinking

24:06

this is a disruption to society

24:08

that you can have a conversation.

24:10

with your computer now and we

24:12

want to make this as I

24:14

don't know as seamless or not

24:16

seamless but as acceptable as possible

24:18

or as non-threatening as possible and

24:20

right and I find the fake

24:22

friendliness in a weird way is

24:24

actually a little more disturbing right

24:26

and maybe and that's where I

24:29

finally got with some of this

24:31

too which is the given what

24:33

had happened with the the boy

24:35

who had committed suicide and there's

24:37

information about how he had been

24:39

chatting with character AI leading up

24:41

to that and all these other

24:43

types of bots that are being

24:45

built for loneliness and companionship. I

24:47

think that's where this is just

24:49

a disguise, right? The friendliness and

24:51

the personality is a disguise for

24:53

code and just computer. Yeah, and

24:55

sometimes I really wish I could

24:57

turn it down and just get

24:59

the code and just I don't

25:01

want it to actually I kind

25:03

of would me personally I if

25:05

I if there were a dial

25:07

for like sarcastic and I know

25:10

if any if anybody's trying to

25:12

do it for God I know

25:14

it's the xAI one that Elon's making

25:16

and I'm glad someone's trying to

25:18

do other than pure saccharine friendliness

25:20

but it's like I do not

25:22

share Elon Musk's sense of humor

25:24

at all and what they think

25:26

is biting sarcasm to me is

25:28

is also not funny at all

25:30

it is not my my vector

25:32

of sarcasm but at least they're

25:34

trying something a little different but

25:36

I don't know. And that's like

25:38

the sarcasm thing obviously I'm I'm

25:40

dry and sarcastic and there's this

25:42

moment in the video too where

25:44

I am I can't believe we

25:46

captured it because it, like,

25:48

flubbed, right. ChatGPT

25:51

flubbed and said how the fire

25:53

making you feel? And I, you

25:55

know, and I thought that was

25:57

hilarious. Like if it was a

25:59

human, you'd be like, you laugh

26:01

at somebody's like... the way they

26:03

butcher or something. And I say

26:05

back to it, fire make me

26:07

feel good. And I'm cracking up.

26:09

I can't like really bite my

26:11

tongue. I'm cracking up and you

26:13

can hear my producer cracking up.

26:15

And it like doesn't get it,

26:17

right? And it just, but it

26:19

gets the laughter. Like it picks

26:21

up laughter so it knows that

26:23

you were, there's human laughter. And

26:25

so it somewhere in there, it's

26:27

in and it just went, ha

26:29

ha. It just. Worse. It's just

26:32

worse. And I do find it

26:34

funny, now that I've been doing

26:36

this for so long, and it's

26:38

just not just professionally, but just

26:40

going back to being 10 years

26:42

old and the way computers were

26:44

then, and it's like I see

26:46

all of this progress, and more

26:48

than any other industry, the computer

26:50

industry, it's why I continue to

26:52

love following it and doing what

26:54

I do, is that it's... still

26:56

moving so fast and things are

26:58

changing so fast. And you get

27:00

all of these examples over the

27:02

decades of things that had previously

27:04

only been imagined in science fiction

27:06

and now they're real, right? I

27:08

mean, just the fact that we

27:10

have, I mean, it comes up

27:13

all the time, but the things

27:15

that our iPhones and all modern

27:17

smartphones can do in our pocket

27:19

are just flabbergastingly amazing in science

27:21

fiction. I mean... video calls like

27:23

what you and I are doing

27:25

right now to have this conversation

27:27

used to be science fiction and

27:29

we could be doing it on

27:31

our phones over the air it's

27:33

all amazing but it's always so

27:35

interesting to me like the thing

27:37

I always think of first is

27:39

how once it becomes real what

27:41

it at first it's amazing and

27:43

then it settles in and everybody

27:45

nerds and non nerds alike it

27:47

just becomes a part of life

27:49

right I mean yeah And things

27:51

that happened a century ago are

27:54

like that. Running water is a

27:56

marvel of technology. I often say

27:58

like if we could time travel

28:00

and... bring Ben Franklin to the

28:02

modern day. And you want to

28:04

take him to the airport and

28:06

show him airplanes, you want to

28:08

show him the internet and Google

28:10

search and chat GPT. I think

28:12

just getting him to stop raving

28:14

about being able to take a

28:16

pee in the middle of the

28:18

night indoors and hit a button

28:20

and it just goes away? You've

28:22

got to give him a couple

28:24

of days to absorb the toilet

28:26

and running water, because it's just

28:28

amazing. Yeah. It settles in, we

28:30

just take it for granted, but

28:33

I think that another way of

28:35

looking back is how the science

28:37

fiction writers who imagined it got

28:39

it wrong. And it's like one

28:41

thing is everybody's imagined talking robots

28:43

and talking AI, whether they are

28:45

humanoid robots like C-3PO who walk

28:47

around or like HAL 9000 from

28:49

2001, which is a lot closer

28:51

to the modern AI. The one

28:53

thing, you know, the way that

28:55

HAL isn't a robot

28:57

who moves around, but is everywhere

29:00

on the ship. And it's the

29:02

same HAL you're talking to in

29:04

the place where the pod bay

29:07

doors are, as in the living

29:09

room area or whatever. Nobody imagined

29:11

these fake friendly attitudes. Nobody. HAL

29:14

is what I want. I want

29:16

a little bit of personality, but

29:18

mostly just sort of flat in

29:21

just the facts. I'm

29:23

going to be honest with you,

29:25

I don't even, I've never watched

29:27

most of these science fiction movies.

29:29

Every time I try to, I

29:31

fall asleep. But I think so

29:33

much of what you just hit

29:35

on is so right. I think,

29:38

especially the parts where we are

29:40

amazed by it and then we

29:42

just take it for granted. And

29:44

actually, that is something, I don't

29:46

have the exact quote from here,

29:48

but when I interviewed Craig Federighi

29:50

a few months ago, weeks ago,

29:52

I don't know what day it

29:54

is, about Apple Intelligence and I

29:56

really was trying to go after

29:58

why isn't Siri like this, where

30:01

is the Siri improvements, I was

30:03

really hitting him on Siri. He

30:05

really did back up and and

30:07

really wanted me and I think

30:09

he even was thinking about how

30:11

far Siri has come and we

30:13

got used to it right right

30:15

we started like to want more

30:17

and more I want to pull

30:19

up the quote because he obviously

30:21

said it so eloquently but I

30:23

think yes and now even so

30:26

with these bots, like, you hit

30:28

a wall right you we hit

30:30

a wall with Siri and Alexa

30:32

and I say that in the

30:34

column we hit a wall we

30:36

now know we can say What

30:38

do you call it, Dingus? Dingus,

30:40

yeah. Hey, Dingus. Yeah, you can

30:42

say, hey, Dingus, set the blah.

30:44

Dingus, do this, right? We've all

30:46

learned the very strict formulas of

30:49

how to do those things. We've

30:51

had to talk the talk in

30:53

their way. We hit a wall

30:55

with that. So now we have

30:57

these bots. We are now hitting

30:59

another wall with that. I don't

31:01

know. I think they can change

31:03

the friendliness, and they can, there's

31:05

probably a lot. It's just still

31:07

a computer. There's more you can

31:09

do and it will walk you

31:12

through these things in a more

31:14

personable, friendly way, and it is

31:16

smarter, but you also will hit

31:18

this wall. Yeah, and it's, I

31:20

don't know, the fake personality, the

31:22

fake friendliness, I just... never would

31:24

have guessed it but now that

31:26

it's here it's oh but I

31:28

see it as of course they're

31:30

doing it this way because they

31:32

think it's non-threatening and they realize

31:34

that people are going to feel

31:37

threatened by the weirdness of this

31:39

and and then I think it

31:41

also sets in with everybody else

31:43

is doing it this way so

31:45

we should too nobody wants to

31:47

stick out and just sort of

31:49

put out a more much more

31:51

robotic personality type, which I think

31:53

after the novelty wears off, people

31:55

would settle in and will be

31:57

fine with joy better, right? Yeah,

32:00

and here's, and this is what

32:02

he said, he said, but as

32:04

humans, our expectations for what it

32:06

means to use our voice to

32:08

communicate and ask for things is

32:10

almost unbounded. Which is right. It's

32:12

like we start to ask and

32:14

we start to feel more conversational,

32:16

right? So we've gone from the

32:18

like we know the formulaic. Now

32:20

we're in this moment with with

32:23

large language models where we're able

32:25

to talk more like yourself. I

32:27

mean, that was kind of also

32:29

the amazing thing of being away

32:31

with these things for 24 hours.

32:33

You don't prompt or you're just

32:35

like, oh, let's do a thing

32:37

of yoga. Pick some, let's make

32:39

a yoga routine for me. Let's

32:41

do it now, right? Like you

32:43

don't talk to it. You just

32:45

talk normally. And so I guess

32:48

if the endpoint is Hal or

32:50

whatever that dream robot that is

32:52

our partner, our partner and maybe

32:54

friend. we're going to keep hitting

32:56

walls. It's like there's our internal

32:58

monologue in our own brains, but

33:00

nobody, you know, that's everybody's their

33:02

own. And the next level of

33:04

communication is speech. It's the oldest

33:06

part of evolution. Literacy is a

33:08

thing. And there are human beings

33:11

who are illiterate, who cannot read

33:13

or write. But there's, unless you

33:15

have like some kind of disability,

33:17

everybody can speak. And it's kind

33:19

of amazing. It's really, it's one

33:21

of those things where you're like,

33:23

wow, that is kind of amazing

33:25

that people of who are really,

33:27

really not very intelligent at all,

33:29

just naturally on their own, learn

33:31

to speak as babies. And what

33:34

do you do when you speak?

33:36

You communicate your thoughts and your

33:38

thoughts are for whatever your level

33:40

of intelligence is, for you, they're

33:42

unbounded. And... You just speak. And

33:44

so speaking to computer, it is,

33:46

it's a really interesting observation by

33:48

Federighi, because there really is no

33:50

limit. And it's, you and I

33:52

are in fact professional writers, or

33:54

we try to be, but it's,

33:56

I feel you and I are

33:59

at the higher end of aptitude

34:01

for expressing ourselves through writing, but

34:03

for most human beings, whoever

34:05

you are, wherever you live, however

34:07

old you are, communicating yourself by

34:09

speech is. the most natural thing

34:11

in the world. And so interfacing

34:13

with computers that way is very,

34:15

very different than pushing buttons or

34:17

typing commands. It's just a complete,

34:19

it's like a removal of abstraction.

34:22

It is a level of abstraction,

34:24

but we're evolutionarily hooked up not

34:26

to think of it as being

34:28

abstract. You mentioned like, you know,

34:30

I think it was a 14-year-old

34:32

in Australia who recently was found

34:34

to have gotten obsessed with a...

34:36

Game of Thrones constructed character and

34:38

it's sad. All cases of teenage

34:40

suicide. It's all tragedies and there

34:42

are way too many people who

34:45

in the midst of depression turn

34:47

to suicide. It's always been

34:49

true, it still is true and

34:51

there used to not even be

34:53

chat bots and now that there

34:55

are chat bots, somebody who's in

34:57

the midst of an episode like

34:59

this, that might have and probably

35:01

would have been just as depressed

35:03

if not more so without any

35:05

access to chat bots, got obsessed

35:07

with chat bots, and now that's

35:10

the headline. It's same thing with

35:12

self-driving cars, where how many people

35:14

die in all human-driven cars

35:16

and vehicles per day in the

35:18

US? It is one of those

35:20

things that once technology solves it,

35:22

people are going to look back

35:24

at... 2024 today and see our

35:26

driving culture and the number of

35:28

deaths and for all the deaths

35:30

all of the gruesome injuries that

35:33

people thankfully don't die from and

35:35

recover from, as barbaric. It is

35:37

really really strange historically how we've

35:39

just and it's one of those

35:41

things that we've just accepted you

35:43

know we me and you and

35:45

every most of the people listening

35:47

to this we're all born in

35:49

car culture in North America. We

35:51

just accepted as the way the

35:53

world works, but it's really weird

35:56

and gruesome, and self-driving cars are

35:58

a way out of that. But

36:00

the problem is there are still

36:02

going to be some accidents, especially

36:04

getting from here where all, well,

36:06

we're already at a point where

36:08

some cars are self-driving, but getting

36:10

from the point where all cars

36:12

were human driven to a future

36:14

where all cars are either self-driving

36:16

robotically or have systems in place

36:18

to prevent collisions. And we're seeing

36:21

that where you can be like

36:23

not paying attention and if you

36:25

get within a certain proximity of

36:27

the car in front of you

36:29

the car brakes itself and fantastic

36:31

fantastic life-saving features and even at

36:33

a lesser degree at lower speed

36:35

just nobody was going to get

36:37

hurt but thank God I didn't

36:39

rear end that guy because I

36:41

wasn't paying attention and avoid just

36:44

the minor irritation of a fender bender.

36:46

But the problem is, it doesn't

36:48

matter which brand it is, but

36:50

of course, if it's Tesla, it

36:52

brings in all this other political

36:54

baggage and personality baggage of Elon

36:56

Musk. But one accident that kills

36:58

somebody with a self-driving car is

37:00

the man bites dog headline. Right?

37:02

Oh, there it is. But meanwhile,

37:04

there's 10,000 other people who die.

37:07

I don't know what the, you

37:09

know, but it's numbers like that

37:11

every month. in human driven cars.

37:13

And I think there's that sort

37:15

of effect with these chat bots.

37:17

100% and look, like, there's also

37:19

a big difference between the chat

37:21

bots that I, and I try

37:23

to make this point in the

37:25

piece in the video and the

37:27

column, but big difference between the

37:30

smaller startup Character.AI, which didn't

37:32

seem to have many protections

37:34

at all in the app for

37:36

if somebody's using words like suicide

37:38

or killing themselves to provide more

37:40

information versus what I saw here.

37:42

I tried to test that with

37:44

all of these and they're all

37:46

providing hotlines or talking and saying

37:48

talk to a real human. So

37:50

there's certainly also a lot that

37:52

can be done on the product

37:55

side. I don't want to say

37:57

like always the in product, it's

37:59

not. And I don't think, but

38:01

this, look, we've got, you've got

38:03

a lot of forces and you

38:05

could talk about lots of forces

38:07

that play into all of this

38:09

horribleness. But I agree with you,

38:11

I don't think we should hold

38:13

that up as the, not, and

38:15

by the way, I, it sounded

38:18

to me like. in that case

38:20

he was also typing. It was

38:22

a text-based chat. It wasn't voice,

38:24

though, character AI, and a few

38:26

of these do have the voice-based

38:28

stuff. But look, that makes it

38:30

more realistic and human. I mean,

38:32

there's just, there's no doubt that

38:34

I had formed an image, especially

38:36

of Meta AI, I gave the Kristen

38:38

Bell voice to it, and because

38:41

I just, I love Kristen Bell,

38:43

and it's like, I'm going to

38:45

be hanging out with Kristen Bell,

38:47

Just a chat bot, right? Like

38:49

there is something of a species.

38:51

There's something more there than just,

38:53

oh, I was texting with a

38:55

computer. Yeah, yeah, it tickles a

38:57

lower lizard part of your brain.

38:59

I guess lizards don't really talk

39:01

to each other, but you know

39:03

what I mean. Older, very old

39:06

evolutionary when we first started to

39:08

communicate with our voices, part of

39:10

our brains, where when you hear

39:12

the voice of a loved one,

39:14

it... It sets off endorphins in

39:16

your brain. If you just happen

39:18

to run into your kid accidentally,

39:20

you were out and about, and

39:22

your kid's school group happens to

39:24

be where you are, and you

39:26

hear your kid's voice, it's whoa.

39:29

And it's like, you know, your

39:31

brain kind of lights up in

39:33

a funny way. You know, just

39:35

when you hear like a friend

39:37

who you haven't seen in a

39:39

while, they just happen to be

39:41

in the same store as you,

39:43

and you hear their voice, and

39:45

it's like, whoa. When you see

39:47

somebody's face, you recognize it and

39:49

you have a reaction that's not

39:52

voluntary, it's involuntary and you don't

39:54

get that. I mean, a weirdo

39:56

like me can get it from

39:58

like typefaces or something. But

40:01

even there, even for someone who's

40:03

obsessed with something like that, ooh,

40:05

this book has a real, oh,

40:07

I love this font. It makes

40:09

me a little more likely to

40:11

buy it. It's still not the

40:13

same as when I see the

40:15

face of a loved one or

40:17

something like that. We're hooked up

40:19

for that and computers have never

40:21

had that before. Yeah. And now

40:23

they do. Kind of, right. And

40:25

that's the other thing that science

40:27

fiction didn't really imagine. It's like

40:29

science fiction always gives us. the

40:32

finished product right and again even

40:34

if you haven't watched 2001 how

40:36

it's the the red eye and

40:38

and C3PO it's like but then

40:40

nobody ever imagined like hey what

40:42

was it like the first year

40:44

where they had this technology where

40:46

you could talk to them and

40:48

it was and it's so different

40:50

than I think anybody would have

40:52

imagined it to be it's because

40:54

if you think about it superficially

40:56

from the perspective of a science

40:58

fiction writer 20, 30, 40, or

41:00

more years ago, you start thinking

41:03

like, well, maybe because it's the

41:05

early days, it can only understand

41:07

three or four things or something,

41:09

because superficially that seems like a

41:11

good way to get it going.

41:13

But no, they'll answer anything. They'll

41:15

answer questions about anything. And they're

41:17

at times incredibly amazing and knowledgeable

41:19

and give you these helpful answers

41:21

and then at other times are

41:23

more stupid than any person you've

41:25

ever even fathomed. You know, like

41:27

when Google turned on the AI

41:29

in their search and was giving

41:31

really funny answers to how many

41:34

rocks should I eat on a

41:36

daily basis, right? Right. Or had

41:38

to glue the cheese on to

41:40

pizza. Like... Yes! Shouldn't. Yeah, they're

41:42

just engineered to be confident. Right.

41:44

And it takes a human being

41:46

at the moment to come up

41:48

with a question like, how many

41:50

rocks should I eat a day

41:52

to... trick the LLM-based AI into

41:54

giving a funny answer because it's

41:56

a nonsensical question in a way,

41:58

right? Right. There's a, you probably

42:00

haven't seen it given what you

42:02

said about science fiction movies. The

42:05

movie Blade Runner from 1982, I

42:07

believe, spoiler here, but there's the

42:09

whole premise of the movie is

42:11

Harrison Ford plays a Blade Runner

42:13

who is a... police detective whose

42:15

job it is to hunt down

42:17

replicants; replicants are robots who look

42:19

like humans. And so you, they

42:21

have skin and completely visually look

42:23

and talk like humans, but they're

42:25

fake underneath and the ones who

42:27

get out and are illegally out

42:29

and about, you have to identify

42:31

them. And there's a thing called

42:34

the Voight-Kampff test, where

42:36

it's a series of questions that

42:38

the police can ask and you

42:40

give 20 questions and a replicant

42:42

is going to get tricked up

42:44

by some of them. And it's

42:46

a really good scene in the

42:48

movie. It's everybody who's watched it

42:50

is one of the most memorable

42:52

parts of the movie. But the

42:54

type of questions they ask are

42:56

much more thoughtful and cool and

42:58

interesting and they're not, how many

43:00

rocks should you eat a day?

43:02

Right? Like it turns out at

43:05

our current moment, the way to

43:07

figure out if you're chatting with

43:09

an AI is to ask a

43:11

nonsense question. Yep. Not a deep

43:13

philosophical question. It's true. Well, I'm

43:15

making a list of all the

43:17

movies I need to watch. All

43:19

right. Blade Runner might put

43:21

you to sleep. I love it,

43:23

but it looks good. But a

43:25

lot of times I'll start, I've

43:27

started a lot of these movies

43:29

and then I just fall asleep.

43:31

Here, let me hit the money

43:33

bell again. Do another sponsor. Thank

43:36

you. To our good friends, this

43:38

time it's Squarespace. Squarespace is the

43:40

all-in-one platform for building your own

43:42

website online. Everything you need to

43:44

do to own your own website,

43:46

Squarespace can do for you. Domain

43:48

name registration, picking templates for what

43:50

it looks like, modifying what it

43:52

looks like through their fluid engine,

43:54

which is the name of their

43:56

next generation engine for modifying the

43:58

way the thing looks. And the

44:00

fluid engine works great, whether you

44:02

are on a desktop computer or

44:04

on your phone, any modern web

44:07

browser, you just build your Squarespace

44:09

website on the Squarespace website itself.

44:11

And what you see is what

44:13

the people who visit your site

44:15

will get. It is WYSIWYG for

44:17

making a website and it works

44:19

absolutely terrific. They have all sorts

44:21

of other features I mean just

44:23

about everything you can imagine. Squarespace

44:25

payments is the easiest way to

44:27

manage payment in one place with

44:29

Squarespace. Onboarding is fast and simple.

44:31

You get started with a few

44:33

clicks and you can start receiving

44:35

payments right away and you can

44:38

give your customers any way to

44:40

pay imaginable. Debit in the US,

44:42

Apple Pay support, Afterpay in

44:44

the US and Canada, Clearpay

44:46

in the UK, all sorts of

44:48

stuff like that. Credit cards, of

44:50

course, easily accepted. They have all

44:52

sorts of other features, invoices. So

44:54

if you're setting up not a

44:56

website for just personal presence, but

44:58

actually running a business and you

45:00

need to send invoices and stuff

45:02

like that, they've got an invoicing

45:04

feature built into the platform that

45:06

you can use. Analytics, so you

45:09

can see the stats, how many

45:11

people are coming to your site,

45:13

where are they going to on

45:15

your site, and where are they

45:17

coming from, all sorts of stuff

45:19

like that. And it's such a

45:21

great analytics interface, where instead of

45:23

looking like an airplane dashboard, and it's

45:25

all confusing, it's a beautiful,

45:27

beautiful presentation, it just makes it

45:29

clear. Information graphic-wise is just absolutely

45:31

top-notch. Where do you go to

45:33

find out more? Squarespace.com/talk

45:35

show and by using that URL

45:38

they'll know you came from this

45:40

show and you the listener who's

45:42

signing up will save 10% off

45:44

your first purchase of a website

45:46

or domain and you get 30

45:48

days free just by going there

45:50

to start 30 days to start

45:52

build the whole website whole month

45:54

you can build it out use

45:56

it turn it on go live

45:58

make sure you like it and

46:00

only after the 30 days are

46:02

up you need to pay just

46:04

go to squarespace.com/talk show. You

46:06

did not include Siri in your

46:09

girls' trip. I didn't. And

46:11

the reason why it was so

46:13

funny, you said, Siri, can you

46:15

introduce yourself? To get, I guess,

46:17

B-roll footage of Siri introducing itself.

46:19

And Siri's answer was just to

46:21

confirm. Do you want to turn

46:23

this device off? Yeah, I mean,

46:25

I was looking for a moment

46:27

to just have like a sound

46:29

like Siri having a basic answer,

46:31

but Siri knew exactly how to

46:33

write itself into the script. I'm

46:35

sorry there because I said Siri

46:37

and now I've got a device. Oh

46:40

yeah, now wait, right. Dingus and

46:42

Dinkus. Beeping and boppin. But I

46:44

noticed it back at WWDC when

46:46

Apple unveiled the whole Apple intelligence

46:48

array of features and they said

46:50

it that with this new interface

46:52

that came out with iOS 18.1

46:54

last month. where the new Siri

46:56

gets this all-new visual interface, we're

46:58

like, let's just speak about the

47:00

phone, where when you invoke it,

47:02

you get a whole new visual

47:04

interface where the border of the

47:06

screen lights up in a Siri

47:08

rainbow of colors, and you can,

47:11

they, in a typical Apple fashion,

47:13

it made it seem like it's

47:15

nothing but great, which is that.

47:17

You can continue a conversation. So

47:19

you could say, hey, Dingus, who

47:21

did the Baltimore Ravens play last

47:23

week? And then Siri will give

47:25

an answer. And while it's still

47:27

up, you could say, and who

47:29

did they play the week before?

47:31

And Siri, in theory, should know

47:33

we're still talking about the Baltimore

47:35

Ravens or whatever it was, and

47:37

then give you an answer from

47:39

the week before. But then once

47:42

you're done and that Siri interface

47:44

goes off, then you're starting over

47:46

each time. And so the continuing

47:48

context within a session is

47:50

definitely better than the way Siri

47:52

used to be. But the fact

47:54

that it forgets everything by design

47:56

each time you started up is

47:58

very different from these other chat

48:00

bots. And I think that alone

48:02

would have ruled it out from

48:04

inclusion in your girls AI weekend.

48:06

Right? Well, I think it's also

48:08

the, like, I was just testing

48:10

something using Apple Intelligence too, just

48:13

to see. I've been testing 18.2

48:15

and wanted to see about the

48:17

ChatGPT integration and if, when you

48:19

do start talking to Siri if

48:21

it's going to really start to

48:23

draw on ChatGPT and it really

48:25

doesn't. No, because I've been running

48:27

18.2 pretty much since the first

48:29

beta too. And it doesn't. Yeah,

48:31

I mean, and that's supposed to

48:33

start happening. I'm finding in my

48:35

testing of 18.2 that it I

48:37

don't mind that this ChatGPT

48:39

integration is there but I'm finding

48:42

that it's kind of if it

48:44

weren't there my opinion of the

48:46

overall thing would hardly be different

48:48

really. And I don't know

48:50

if that's because leading up to

48:52

WWDC, I mean there was a

48:54

lot of reporting, Mark Gurman, of

48:56

course, the king of the rumor

48:58

reporting in Apple World had a

49:00

bunch of stories leading up to

49:02

WWDC, suggesting that the deal with

49:04

OpenAI didn't really... get finalized

49:06

until the last minute or close

49:08

to the last minute. And Apple

49:10

said at WWDC that there will

49:13

be or could be future partners

49:15

like Google in the future that

49:17

you could switch from using chat

49:19

GPT — to answer questions Siri

49:21

alone can't — to Gemini or

49:23

something else. Here's what I think

49:25

is happening. I think the main

49:27

reason is that Siri does not

49:29

have a strong, large language model

49:31

component to like, if that's just

49:33

the way it worked out for

49:35

this year, that it would still

49:37

be, we can still tell people

49:39

buy these things for Apple Intelligence.

49:41

And I think it shows. But

49:44

I think what's happening, specifically in

49:46

18.2, because you asked, like, why

49:48

wouldn't Siri fit into this? And

49:50

I think the main reason is

49:52

that Siri does not have a

49:54

strong large language model component to,

49:56

like, be conversational. See my interview

49:58

with Craig. We went deep on

50:00

that and he definitely hinted at

50:02

he didn't hint. He just said,

50:04

is that where this is going

50:06

of course. But we've got to

50:08

figure out kind of the right

50:10

parts to put that together and

50:12

so because he was very clear

50:15

This is the garage example. He

50:17

was very clear right now Siri

50:19

does the garage, it does it

50:21

very well. But what we don't

50:23

want is Siri to go off

50:25

and not do that thing very

50:27

well. So they have to figure

50:29

out the right way and that

50:31

to me made a lot of

50:33

sense. You have billions of users

50:35

using this thing, got to make

50:37

sure that it can do the

50:39

thing everyone relies on it for,

50:41

but then also be able to

50:43

go and do the advanced thing.

50:46

One thing though that I think

50:48

is happening right now in Siri.

50:50

So like I just asked Siri.

50:52

Give me some... good recipes for

50:54

meatballs. I don't know, I always

50:56

go to meatballs as my like

50:58

go-to test of it, right? And

51:00

it searched the web and found

51:02

that and gave me like the

51:04

little pop-up that has that. I

51:06

tried a couple things and it

51:08

didn't bring up ChatGPT, but when

51:10

I asked it to write a

51:12

poem about John Gruber, it did.

51:14

It did. It asked me, yes,

51:17

what happened to the poem, hold

51:19

on. It went away. Hold on, the nice

51:21

screenshot. Yeah, see, it's clearly by

51:23

design and I think partly to

51:25

cover up for limitations in their

51:27

own system, but also partly because

51:29

they don't want to even get

51:31

into that now, like the permanent

51:33

memory of your interactions with the

51:35

service. They really want that blank

51:37

slate each time you bring it

51:39

up. Hold on, I'm asking to

51:41

do it again, but I think

51:43

that what's happening is that there's

51:46

very specific types of prompts. where

51:48

it says, write a poem or

51:50

whatever the other ones are, that

51:52

they're saying, hey, let's go to

51:54

ChatGPT for that. But then

51:56

for some other things where a

51:58

large language model could be useful,

52:00

it's not doing that.
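A hypothetical sketch of that kind of segmentation — the marker phrases and the two routes are assumptions for illustration, not Apple's actual rules:

```swift
// Certain prompt shapes get handed to an external chat model; everything else
// stays with the built-in assistant. Purely illustrative routing.
enum PromptRoute { case builtInAssistant, externalChatModel }

func route(_ prompt: String) -> PromptRoute {
    // Hypothetical marker list; the real behavior is undocumented.
    let creativeMarkers = ["write a poem", "write a story", "compose"]
    let lowered = prompt.lowercased()
    return creativeMarkers.contains(where: { lowered.contains($0) })
        ? .externalChatModel
        : .builtInAssistant
}

_ = route("Write a poem about John Gruber")          // -> .externalChatModel
_ = route("Give me some good recipes for meatballs") // -> .builtInAssistant
```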

52:02

And I also wonder if I had asked

52:04

it to write a poem about

52:06

John Gruber and I was doing

52:08

it, because I'm doing it by

52:10

text the answer, right? I have

52:12

found myself, I don't care enough

52:14

about these things to use them

52:17

all. And so I've sort of

52:19

gone all in with chat GPT,

52:21

and I mainly also, I like

52:23

the answers, I think the 4o

52:25

model is probably the best or

52:27

closest to the best for the

52:29

sort of things I'm using it

52:31

for, but I really love their

52:33

Mac app. The chat GPT Mac

52:35

app is by far and away

52:37

the best Macintosh app for

52:39

any of these bots and I

52:41

use it all day long. It's

52:43

just a really well engineered Mac

52:45

app. It's not an electron thing

52:48

that's a wrapper for their website.

52:50

It's really good and fast, with

52:52

a great interface and it's interesting

52:54

that it has your whole history

52:56

with it because I have I'm

52:58

logged in and I have a

53:00

chat GPT account and I pay

53:02

10 bucks a month for whatever

53:04

and there was a I don't

53:06

know if it's a meme or

53:08

what but like. a thing a

53:10

couple days ago where people are

53:12

saying to ask chat GPT or

53:14

ask any of these things to

53:16

make a picture, how do you

53:19

imagine a typical day in my

53:21

life? And I don't think ChatGPT

53:23

knows a lot about me, but

53:25

I'd imagine the picture I got,

53:27

it didn't have a lot, some

53:29

of the people's pictures have so

53:31

much in them. Mine was like,

53:33

of course all the rooms look

53:35

idyllic, it was sort of a

53:37

rustic, lots of wood, a big

53:39

leather, old leather couch, which is

53:41

kind of me. Like if I'm

53:43

gonna have a couch, I don't

53:45

have a couch in my office,

53:47

but if I did, I probably

53:50

would have a leather one, and

53:52

I like an old-looking leather couch


53:56

a typewriter on a desk, not

53:58

a computer, but a typewriter, and

54:00

some, not a lot in real

54:02

life, I've probably got more crap

54:04

around my office, but some books

54:06

and papers on the floor, and

54:08

I am kind of a mess.

54:10

And I do, one of my,

54:12

my, my, one of my favorite

54:14

most used things that I ask

54:16

chat GPT is when, it's like

54:18

my... superpowered thesaurus. Either I know

54:21

I'm looking for a certain word

54:23

and I can't think of it,

54:25

or I want a better version

54:27

of a word. I'm always asking

54:29

chatGPT for either a specific word

54:31

I can't think of, or like

54:33

a better word than the one

54:35

I'm thinking of. And it's amazing

54:37

at how good it is at

54:39

it. It's so good at it.

54:41

And so based on that it

54:43

kind of knows I'm a writer

54:45

and kind of seemingly knows I'm

54:47

a little scattered. that I do

54:50

research then build up piles of

54:52

books and oh there are lots

54:54

and lots of books in in

54:56

my imagined chat gPT world. So

54:58

this is the poem I did

55:00

write about you now I finally

55:02

got it working again. John Gruber

55:04

with wisdom keen in the world

55:06

of tech he's often seen with

55:08

mark downs elegance and grace his

55:10

thoughts and code find their place.

55:12

I'm not going to keep going

55:14

on. That's pretty good though. Yeah

55:16

that's pretty good. Okay, my

55:18

image that ChatGPT did

55:21

of my world, which was pretty

55:23

funny. I mean, I tried it

55:25

a few of them, but I

55:27

posted it on threads. It thinks

55:29

I'm, and it's not wrong, but

55:31

it must think like I'm truly

55:33

like the world's leader of the

55:35

Cub Scouts. I don't know what

55:37

that's called, like the, the, the,

55:39

the, America. They've changed the name

55:41

to Scouting America. Right, right. Because

55:43

it's not boys. To get rid

55:45

of the boys and girls separate

55:47

aspects. But my son, my seven

55:49

year old did just join the

55:52

Cub Scouts. And so I have

55:54

been using it a lot because

55:56

it's complicated. I'm like, please explain

55:58

to me, like I've been asking

56:00

about dens versus packs. I've been

56:02

asking about what are the different

56:04

den names or what are the

56:06

different pack names and the Cub

56:08

Scouts and there's wolves and there's

56:10

bears and there's this. And so

56:12

it really took that to heart

56:14

so that everyone I generated, I'm

56:16

like, I've got Cub Scout paraphernalia

56:18

everywhere, I've got patches, I've got

56:20

kids in the background with Cub

56:23

Scouts. And I guess at some

56:25

point I also asked about basketball.

56:27

I want to say at some

56:29

point I was asking about the

56:31

best way to get a basketball

56:33

hoop into cement. Because that was

56:35

something I was struck. Like I

56:37

use it a lot for home

56:39

stuff. I sent you


56:43

a picture of mine. I should

56:45

post it on social media and

56:47

I'll put I'll put links to

56:49

years from threads or wow I

56:51

mean this is a beautiful home

56:54

that you have here from the

56:56

early I don't know 1920 well

56:58

well it is it you know

57:00

what though you know what's funny

57:02

is in the early years of

57:04

Daring Fireball, you know — I

57:06

started the site in 2004 And

57:08

there weren't, YouTube wasn't a thing,

57:10

and I've never really done much

57:12

video, and podcasts weren't yet a

57:14

thing. So all people knew me

57:16

by was the writing. And I

57:18

didn't put a picture of myself

57:20

on the site. And I started

57:23

going to things like WWDC or

57:25

Macworld Expo, and people would meet

57:27

me for the first time. And

57:29

I mean like dozens of times,

57:31

people would just say, oh, I

57:33

thought you were really old. And

57:35

at the time, it was like

57:37

20 years ago, I was only,

57:39

I don't know, like 30. So

57:41

I wasn't really old at all,

57:43

and they were like, oh, I

57:45

thought you were like old. Somehow

57:47

ChatGPT seems to have the same

57:49

impression of me. Yeah, the old

57:51

sounding written voice. I guess. I

57:54

guess. Because it's not just a

57:56

typewriter. Or you were very wise.

57:58

It's a manual typewriter. Yeah. I

58:00

also enjoy that there's a map

58:02

on the wall of mine and

58:04

it's like a nonsense map. It's

58:06

not a real place. You have

58:08

a lot of plants. But yeah,

58:10

I mean, really, I just sent

58:12

you mine, but you can see

58:14

I've got the Boy Scouts, I've

58:16

got basketballs, it thinks I love

58:18

bagels. Hmm. I don't know about

58:20

the bagel piece. Oh yeah, see

58:22

yours is really yours are very

58:25

very jam-packed with things. Look

58:27

how they jam-packed

58:29

it. Oh and I love that

58:31

little safari compass. They gave me

58:33

on the first one. Oh, oh

58:35

the first one. Look at that

58:37

safari compass on the table. Oh,

58:39

yeah, yeah, I see it. It's

58:41

like a like a coaster or

58:43

a pen or something like that

58:45

and now I'm like I want

58:47

that I want that for the

58:49

Cub Scouts. I want a safari compass.

58:51

Yeah, they've definitely got you as

58:53

being like the scouting mom. Yeah,

58:56

definitely. I mean, but it was

58:58

very appropriate because that week I

59:00

had obviously asked a lot about

59:02

this, but my life is, I

59:04

wish, I mean, I don't wish

59:06

I was as involved with the

59:08

Boy Scouts as this thinks I

59:10

am. And also in that second

59:12

one, you've got a Boy Scout

59:14

framed logo on the wall. Yes,

59:16

I love them so much. Yep.

59:18

And also in that second one,

59:20

you've got a wood-grained... sign on

59:22

the wall big with your name

59:24

on it. Yep, I love myself.

59:27

I love myself. I love basketball,

59:29

apparently. But you have an iMac.

59:31

You have an iMac, not a,

59:33

you would know, you have an

59:35

iMac on a desk and a

59:37

laptop on your lap and another

59:39

like a tablet. A detached keyboard.

59:41

Yeah. Yeah. So you've got a

59:43

lot of computers. I don't have

59:45

any, but. Yeah. Well. But just

59:47

going back to the chat, you know.

59:49

Look, I think there's something there

59:51

about how Apple's thinking about how

59:53

much they, obviously we know that

59:55

they've taken some, they wanted to

59:58

minimize some of the risk. right,

1:00:00

and they also, whatever we want

1:00:02

to say they're behind or whatever

1:00:04

the reason is, they're being cautious

1:00:06

about how they roll this out. And

1:00:08

so they've used ChatGPT and

1:00:10

potentially some whatever partners they can

1:00:13

to plug into Siri. Eventually,

1:00:15

as Craig said in that interview, we

1:00:17

will see a Siri that has

1:00:19

more large language model capability and

1:00:21

can converse and can likely do

1:00:23

a lot of what we've already

1:00:25

been seeing from these other bots that

1:00:28

I went to the cabin with. It's

1:00:30

just they're not, they're just not,

1:00:32

it's just not comparable right now. And

1:00:34

look, we're going to see this from

1:00:36

Alexa. I mean, Amazon has been delaying

1:00:38

and delaying, but that's going to come

1:00:40

soon too. So Apple's going to have no

1:00:42

other choice. Because it is conspicuous, and I

1:00:44

thought of that with your, I thought of

1:00:47

it just earlier on the show, when you

1:00:49

were trying to remember all four of the

1:00:51

companions who you took with you, that I

1:00:53

think Gemini was the one, the

1:00:55

fourth of four that you couldn't think of.

1:00:57

But it does stand out. Which is, it

1:00:59

is a little forgettable. We're not

1:01:02

talking about Amazon at this

1:01:04

point, where clearly they're working

1:01:06

on it. And I think Amazon is

1:01:08

sort of taking, Amazon and Apple

1:01:10

are the two, I was gonna

1:01:13

say weirdos, but exceptions. They're doing

1:01:15

things in very different ways from

1:01:17

the others. And Apple's is, let's

1:01:19

be very deliberate and slow, but

1:01:22

we'll start with. Very simple stuff.

1:01:24

So like when iOS 18.1 came out,

1:01:26

which is the only non-beta version

1:01:28

of Apple intelligence that anybody

1:01:31

out there listening has, it

1:01:33

doesn't even do much, right? There's

1:01:35

not even that much to it.

1:01:37

It's got like the writing tools,

1:01:39

doesn't do any of the image

1:01:41

generation stuff. They're releasing

1:01:43

stuff, but it's like a little bit

1:01:45

at a time. And I get the

1:01:47

feeling that Amazon... rather than release a little

1:01:49

bit at a time is sort of holding back

1:01:52

and then they're going to come out with a

1:01:54

big thing at once. I don't know. I could be wrong.

1:01:56

And I think, you know, as much as there

1:01:58

was some controversy, not controversy. but

1:02:00

like when Apple Intelligence launched

1:02:02

a few weeks ago there were many

1:02:04

people on the side of Apple's late

1:02:06

they're behind they're clearly behind there's

1:02:08

reporting that shows they're behind that

1:02:11

was one camp then there's the

1:02:13

camp which I didn't actually

1:02:15

take a stance on in my piece

1:02:18

it was mostly leaning on Craig who

1:02:20

was saying we're doing this to be

1:02:22

deliberate we're deliberately being slow we want

1:02:24

to get it right we're doing it

1:02:26

all right because we have a responsibility

1:02:29

Probably some place in the

1:02:31

my opinion is that it's in some

1:02:33

place in the middle of those things

1:02:35

But when you think about Apple

1:02:37

and Amazon specific I

1:02:40

mean the two big Assistants that

1:02:42

people make fun of and talk about

1:02:44

the most are Siri and Alexa

1:02:47

Yeah Google assistant as

1:02:49

well, but I think to a lesser

1:02:51

degree they have the most riding

1:02:53

on it Right if people wake

1:02:55

up and Alexa can't do the

1:02:57

basic things They can't turn

1:02:59

on the lights or start doing those

1:03:01

things in a ridiculous crazy way People

1:03:03

are gonna freak the hell out Yeah,

1:03:05

the ones we have hooked up to do

1:03:07

real things — talk about the garage:

1:03:10

if the garage just doesn't do the

1:03:12

thing or Alexa's not turning on the

1:03:14

lights or Alexa's turning on your fireplace

1:03:16

and like Those all are real things

1:03:18

and these companies built massive

1:03:20

business around them again. Google has

1:03:23

this too, but Google's done it

1:03:25

weird in a typical Google way.

1:03:27

They've put Gemini Live within assistant

1:03:29

and it but assistant still lives

1:03:31

there. It's a mess. It's total

1:03:33

mess. It's Google that I was thinking

1:03:35

of when I retracted my use of

1:03:37

the word weird for Apple and Amazon

1:03:39

and said they're exceptions and doing

1:03:41

it differently because it's Google

1:03:43

that's always the weird. I mean they

1:03:46

just did a weird thing too. You can

1:03:48

still set it. If you ask Gemini Live

1:03:50

set a timer, it won't. But you can

1:03:52

still go back to Google Assistant and

1:03:54

set a timer. And they worked out a new

1:03:56

way with extensions to set a timer, but

1:03:58

it's all, like, tacked on. They obviously just wanted

1:04:01

to get it out the door.

1:04:03

Again, do I think Amazon and

1:04:05

Apple were behind and are playing

1:04:07

catch-up? Yes, but I also think

1:04:09

it's not as easy as them

1:04:11

to, for them, to just say,

1:04:13

rush it out, rush it out.

1:04:15

And I think the proof that

1:04:17

Apple was caught flat-footed is the

1:04:19

clear, obvious — I mean, at my

1:04:21

live show after WWDC, I asked

1:04:23

about it, and it was one

1:04:25

of those questions that I thought,

1:04:27

there's no way they're going to

1:04:29

give me a straight answer. And

1:04:31

they just answered straightly, yeah, that's

1:04:33

an issue, is the amount of

1:04:35

RAM in the device for on-

1:04:37

device processing. And yeah, it's like,

1:04:39

Apple Intelligence really needs, whatever the

1:04:41

device, iPhone or iPad or Mac,

1:04:43

it needs at least eight gigabytes

1:04:45

of RAM, or you don't get it. And

1:04:47

wow, I was like, that's a

1:04:49

really clear answer to the sort

1:04:51

of question Apple executives usually don't

1:04:53

answer. And the fact that so

1:04:55

many recent devices, most conspicuously, last

1:04:57

year's iPhone 15 non-pros, don't have

1:04:59

that much RAM. We know that

1:05:01

things like that get set, especially

1:05:03

for the iPhone, because of the

1:05:05

massive 100 million units per year

1:05:07

scale, whatever it is that's done,

1:05:09

a year in advance, right? Like

1:05:11

the iPhone 17 from next year

1:05:13

are locked in at this point.

1:05:15

I know. there's I've talked to

1:05:17

people at Apple where there are

1:05:19

certain little things that could happen

1:05:21

as late as December or January

1:05:23

but for the most I mean

1:05:25

something like how much RAM is

1:05:27

on the actual system on a

1:05:29

chip that's already set in stone

1:05:31

probably months ago if not even

1:05:33

longer the fact that they put

1:05:35

a lot of especially iPhones out

1:05:37

in the world that don't have

1:05:39

enough RAM for on device Apple

1:05:41

intelligence processing is a sign that

1:05:43

they were caught flat-footed but I

1:05:45

also think Apple is kind of

1:05:48

I don't think they're happy about

1:05:50

that and I think if they...

1:05:52

could go back in time and

1:05:54

say, hey, maybe start putting eight

1:05:56

gigs of RAM in the iPhone

1:05:58

14 or 13. They would, and

1:06:00

I think they kind of regret

1:06:02

it. But I think overall, they're

1:06:04

comfortable, and Tim Cook said it

1:06:06

in his Wall Street Journal profile

1:06:08

recently, that it's very, very clear.

1:06:10

It's a long time Apple message.

1:06:12

Our aim is to be the

1:06:14

best, not the first. And they're

1:06:16

comfortable with that. I think Google

1:06:18

was very, very, very uncomfortable culturally

1:06:20

being accused of being late or

1:06:22

behind and it sort of panicked

1:06:24

and still does and it's just

1:06:26

sort of like well okay here

1:06:28

fine here's everything we know how

1:06:30

to do it is a lot

1:06:32

and look they had researchers that

1:06:34

worked on these transformers I mean

1:06:36

it's of all places to be

1:06:38

behind oh they had that reason

1:06:40

to feel like this was happening

1:06:42

in our own house everyone in

1:06:44

the house get get it together

1:06:46

get it together everyone everyone that

1:06:48

it's a different situation across the

1:06:50

405 or wherever they are down

1:06:52

in geographic, I don't know. California

1:06:54

geography, not my thing. I do

1:06:56

think Apple's comfortable with their position,

1:06:58

but I will say I really

1:07:00

don't like the ad campaign. I

1:07:02

would say it might be my

1:07:04

most disliked Apple advertising campaign that

1:07:06

I can remember is the commercials

1:07:08

they're doing for Apple intelligence. Because

1:07:10

I feel like all of them.

1:07:12

My wife even says when they

1:07:14

come on, it's sports season when

1:07:16

she sees, because we don't really

1:07:18

see a lot of regular TV

1:07:20

commercials, but I'm watching football or

1:07:22

when the Yankees were still playing

1:07:24

and she's sitting with me on

1:07:26

the couch and she's like, why

1:07:28

would I want that? Like, there

1:07:30

was a commercial where somebody, and

1:07:32

some of it too, it just

1:07:34

makes the human beings in the

1:07:36

ads look rude. There's one where

1:07:38

the actress, she's the actress from

1:07:40

The Last of Us, and she

1:07:42

has like a... a meeting with

1:07:44

an agent or somebody in the

1:07:46

entertainment industry, and, did you

1:07:48

read my script and she like

1:07:50

looks at her phone and there's

1:07:52

an email with the script she's

1:07:54

like summarize and she's right there.

1:07:56

And, A: it's just rude.

1:07:58

Top level. It is rude to

1:08:00

show up at a meeting where

1:08:02

you're supposed to have read the

1:08:04

script not having read the script.

1:08:06

It's rude. It's rude and inconsiderate.

1:08:08

And B. You gave me reading

1:08:10

materials before this podcast and I

1:08:12

absolutely read them. I used Apple

1:08:14

intelligence to summarize your text to

1:08:17

me. Yeah. B, how stupid do

1:08:19

you think this other person is

1:08:21

when you're staring at your phone

1:08:23

poking at it and going, oh

1:08:25

yeah, yeah, it's a story about

1:08:27

coming of age of whatever the

1:08:29

plot is. It was pretty good.

1:08:31

Then it's like the commercial's implicit

1:08:33

message is that the other person

1:08:35

is so stupid that they don't

1:08:37

see that you're reading a summary

1:08:39

of the script on the fly

1:08:41

right in front of them. And

1:08:43

then that you're lying about it

1:08:45

and saying that you liked it

1:08:47

when you haven't read it. It's

1:08:49

all, all of those things are,

1:08:51

I don't see any of that

1:08:53

as positive, right? And then there's

1:08:55

the one too, where Bella Ramsey

1:08:57

is her name, right? Yeah, Bella

1:08:59

Ramsey. Where she runs into somebody,

1:09:01

she sees them like down the

1:09:03

hall or whatever. It's like, remind

1:09:05

me of the person I had

1:09:07

coffee with at blank. Cafe. I

1:09:09

actually would love that, but that

1:09:11

doesn't exist right now on the

1:09:13

product. No. That's the one commercial

1:09:15

that, and I've seen some people

1:09:17

complain about that one too. That's

1:09:19

the one in the campaign that

1:09:21

I like the most because I

1:09:23

am often bad with names. Oh,

1:09:25

I'm so bad with names. I

1:09:27

always have been. I'm not worried

1:09:29

that it's a sign of dementia

1:09:31

creeping up on me. But I've

1:09:33

always been bad with names, so

1:09:35

I do look forward to a

1:09:37

future where I'll get... some kind

1:09:39

of assistance. You know, and I

1:09:41

wear glasses all the time now,

1:09:43

so I have a leg up

1:09:45

on that. If all my glasses

1:09:47

do is just tell me the

1:09:49

name of everybody I'm looking at

1:09:51

or give me a button. I

1:09:53

can do it, I would buy

1:09:55

those glasses in a second. I

1:09:57

love how we all want that

1:09:59

thing, but we know that that

1:10:01

is the worst privacy situation in

1:10:03

the history of the world. Right.

1:10:05

And also, by the way, maybe

1:10:07

Apple is the only one that

1:10:09

can deliver that kind of feature

1:10:11

with privacy because of the iPhone

1:10:13

integration, but probably another conversation. Yep,

1:10:15

it is a whole other conversation

1:10:17

where the whole idea of face

1:10:19

recognition is this enormous can of

1:10:21

worms for privacy and... civil liberties

1:10:23

and I'm not being blithe about

1:10:25

those very serious things that I

1:10:27

agree with are true. But in

1:10:29

my dream world, I want the

1:10:31

feature, I want it available, and

1:10:33

I want it to be provably

1:10:35

private. I absolutely want this. And

1:10:37

in fact, we had a great

1:10:39

story in the journal this week

1:10:41

by a colleague of mine, Anne-Marie,

1:10:43

and she wrote about how Apple

1:10:46

Notes is being used for all

1:10:48

these ridiculously funny things. And I

1:10:50

actually featured one in my newsletter

1:10:52

today about how people like make

1:10:54

stickers of their outfits and put

1:10:56

them in notes. Anyway, check that

1:10:58

out. But my reason for bringing

1:11:00

this up was is that we

1:11:02

had this conversation about notes and

1:11:04

that to me, that's the most

1:11:06

private thing. If that were to

1:11:08

ever leak, I would be devastated

1:11:10

and I would be canceled. Truly.

1:11:12

Not even maybe publicly, but just

1:11:14

like by friends, right? Because I

1:11:16

have very specific notes about people's

1:11:18

names, or, mom of so-and-

1:11:20

so, where so-and-so —

1:11:22

her name is blank, right? This

1:11:24

is — that is what

1:11:26

I have to do. So I

1:11:28

don't seem so rude every time

1:11:30

I run into them at a

1:11:32

practice or something. I do

1:11:34

it too, Joanna. I would be

1:11:36

canceled right with you. A lot

1:11:38

of mine the notes like that

1:11:40

are in my contacts not Apple

1:11:42

Notes. Oh. It's the contact card, but

1:11:44

there are ones, I have some

1:11:46

in Apple Notes. You know, it's—

1:11:48

Talk about this system a little

1:11:50

bit, because how would you know

1:11:52

the name of the person? I

1:11:54

kind of know which people, I

1:11:56

have an intuitive sense of who

1:11:58

I've saved a contact for. So when

1:12:00

Jonas was in school for a

1:12:02

lot of his friends' parents, they're

1:12:04

all in Apple Notes because I

1:12:06

don't have contacts for most of

1:12:08

them. I in fact I had

1:12:10

I think it was just one

1:12:12

Apple note with Jonas school friends

1:12:14

or something like that. Yes yes

1:12:16

exactly. And but I have stuff

1:12:18

in there that I definitely wouldn't

1:12:20

want to come out like somebody

1:12:22

has a gap in the teeth.

1:12:24

That's how you remember. Yeah, right.

1:12:26

Not the worst thing in the

1:12:28

world. It's just but it's like

1:12:30

the mom with the gap tooth.

1:12:32

It's so and so's mom. And

1:12:34

it helps me. No, I have

1:12:36

the same crap in mine. It

1:12:38

is only the parenting nightmare. Yeah,

1:12:40

it's not rude. And I do

1:12:42

trust that my Apple notes are

1:12:44

not going to be leaked publicly.

1:12:46

And it is a note to

1:12:48

my future self. It is John

1:12:50

Gruber right now making a note

1:12:52

to John Gruber in the future

1:12:54

after I've forgotten whose mom she

1:12:56

is. And so nobody who's ever

1:12:58

intended and that is, oh yeah,

1:13:00

the gap tooth mom. I know.

1:13:02

And I'm like, oh yeah, yeah.

1:13:04

And you also know you can

1:13:06

go search gap tooth mom and

1:13:08

it'll come up right there right

1:13:10

there. Because I know that's it.

1:13:12

Right. Because I remember that's the

1:13:15

thing that I would, that I

1:13:17

would put in the note. And

1:13:19

then you go and you're like,

1:13:21

oh yeah, Lucy, right. Okay, hey

1:13:23

Lucy, how are you? And then

1:13:25

you can, you don't feel any.

1:13:27

Right. But if somebody ever uncovered

1:13:29

all of them, it would be

1:13:31

like an episode of Curb Your

1:13:33

Enthusiasm where everybody's gonna be mad

1:13:35

at me because there's something in

1:13:37

there that's, I mean, let's face

1:13:39

it. I don't know

1:13:41

how we got, well I do

1:13:43

know how we got here, but

1:13:45

yes. But you know what, but

1:13:47

it's funny to circle it back,

1:13:49

we're not getting that sort of

1:13:51

companionship from our AI friends yet,

1:13:53

right? Like you can't keep a

1:13:55

secret with chat GPT like that,

1:13:57

like just between me and you,

1:13:59

remember that's the dad who walks

1:14:01

with a limp or something. Right,

1:14:03

right. Or looks really old, you

1:14:05

know, like somebody has a really

1:14:07

old-looking dad or something like that.

1:14:09

You know, these unpleasant things, but

1:14:11

that's the thing that I remember.

1:14:13

I can't ask Siri about that,

1:14:16

but I can't ask ChatGPT. And in

1:14:18

the future, I'd like to have

1:14:20

a trusted, you know, where every

1:14:22

bit of... And Apple's the one

1:14:24

who's sort of going towards that.

1:14:26

They haven't said anything about... a

1:14:28

future version of Apple Intelligence having

1:14:30

contextual access to your notes? I

1:14:32

know they're talking about email like

1:14:34

their email, calendar. Yeah, the

1:14:36

hero demo that does not exist

1:14:38

in any shipping form in beta

1:14:40

yet is the pick-up-

1:14:42

mom-at-the-airport one. Yeah, when's my

1:14:44

mom's flight? When's my mom flying

1:14:46

into town? Yes. And it involves

1:14:48

Siri knowing the emails that her

1:14:50

mom had sent and it involves

1:14:52

Siri having access to the calendar

1:14:54

where maybe you put the flight

1:14:56

information and stuff like that. None

1:14:59

of that's there. But if Siri

1:15:01

— Apple Intelligence, I guess — had

1:15:03

access to my notes, it would

1:15:05

be very helpful to me, but

1:15:07

I really need to trust it.

1:15:09

I mean, because I don't think

1:15:11

my notes, I sure there's a

1:15:13

lot of people out there with

1:15:15

a lot worse stuff in their

1:15:17

notes than mine, but it's private.

1:15:19

Very private. Well, yeah, we have

1:15:21

a great story about just all

1:15:23

the weird things people are doing

1:15:25

in Notes. I have

1:15:27

not seen that, but I made

1:15:29

a note of it here, and

1:15:31

I will put it in the

1:15:33

show notes. The other thing I

1:15:35

just before I forget to bring

1:15:37

it up is one of the

1:15:39

interesting examples from your video with

1:15:42

these four chat companions is that

1:15:44

none of the four that you

1:15:46

can have these ongoing conversations with

1:15:48

can do set a timer for

1:15:50

six minutes. Right. And I don't

1:15:52

know why timers and everybody in

1:15:54

our field, me, you, everybody who

1:15:56

writes about these things, we all

1:15:58

turn to timers as the shortcoming.

1:16:00

And famously, Siri couldn't set more

1:16:02

than one timer at a time.

1:16:04

But I know you've, I know

1:16:06

you have talked about this. That

1:16:08

was a great success. I feel

1:16:10

great, I feel great success and

1:16:12

partial credit to all of us

1:16:14

in this industry who forced Apple

1:16:16

to create multiple timers. There's like

1:16:18

some engineer who was forced. He's

1:16:20

like, yeah, I worked on that.

1:16:22

If you were the engineer to

1:16:25

have worked on that, me and

1:16:27

John and so many others out

1:16:29

there. So proud of your work.

1:16:31

I might have probably not with

1:16:33

you, but at some point I

1:16:35

think this has come up on

1:16:37

this podcast before, but we have

1:16:39

home pods and an Alexa in

1:16:41

our kitchen. And you've met my

1:16:43

wife. She is not into our

1:16:45

world at all. She doesn't read

1:16:47

my site every day. She's not

1:16:49

a tech enthusiast. And me getting

1:16:51

permission to have two different talking

1:16:53

devices in our kitchen is really...

1:16:55

out of character for her, but

1:16:57

it's because Alexa for so long

1:16:59

was the only one that could

1:17:01

set multiple timers and Amy's the

1:17:03

one who does cooking where there

1:17:05

might be two different things going.

1:17:08

In a kitchen it is not,

1:17:10

it is actually more unusual if

1:17:12

you only need one for a

1:17:14

even mildly complex meal that you

1:17:16

only need one timer at a

1:17:18

time. And so Alexa, effectively she's

1:17:20

there just for multiple timers. But

1:17:22

at some point they had added

1:17:24

multiple timers to home pod and

1:17:26

that was Apple's answer which was

1:17:28

like every time I was going

1:17:30

check just to make sure it

1:17:32

doesn't work would go check with

1:17:34

the PR team. It would say

1:17:36

well it does work on the

1:17:38

home pod. I mean I practice

1:17:40

timer parenting. I don't know if

1:17:42

I'm going check with the PR

1:17:44

team. It would say well it

1:17:46

does work on the home pod.

1:17:48

Right. I mean I practice timer

1:17:50

till we are going to leave

1:17:53

here. five minutes that you can

1:17:55

sit on the potty, you know,

1:17:57

all the things. And so I

1:17:59

need multiple, and I have two

1:18:01

kids and I've got a cooking,

1:18:03

I've got multiple. Timers is essential

1:18:05

to my life. Yeah, reading time

1:18:07

was mandated. Read a book for

1:18:09

30 minutes. We definitely use that

1:18:11

a lot. I mean, Jonas is

1:18:13

20 now, Jesus. And you're still

1:18:15

like, read a book for 20

1:18:17

minutes. Yeah, I can't make them

1:18:19

do it anymore. Right. Right. Yeah,

1:18:21

I don't think I can make

1:18:23

my seven-year-old read a book for

1:18:25

20 minutes still. But it was

1:18:27

interesting that these other ones, which

1:18:29

are so much more advanced conversation.

1:18:31

can't do the device timers and

1:18:33

dumb dumb old Siri is pretty

1:18:36

good at it at this point.

1:18:38

Yeah and but they can't do

1:18:40

all those fundamental device control the

1:18:42

things that we we I think

1:18:44

we would call now more voice

1:18:46

assistant than AI companion or whatever

1:18:48

these are the assistant tasks set

1:18:50

a timer set an alarm play

1:18:52

the music, add a reminder — there

1:18:54

are so many of those things

1:18:56

that are essential to what We

1:18:58

do with Siri or Alexa. And

1:19:00

that's where that conversation with Craig

1:19:02

really went, which was that we

1:19:04

need to be able to get

1:19:06

both things right. And I assume

1:19:08

it's the same thing that's happening

1:19:10

at Amazon with Alexa. Especially with

1:19:12

Alexa. Like, when did my package

1:19:14

arrive? What I order this thing

1:19:16

off Amazon? All of those things

1:19:19

we talked about at the beginning

1:19:21

of the conversation where we've trained

1:19:23

our language, we know the vernacular

1:19:25

and the words to say to

1:19:27

get the assistant to do the

1:19:29

thing. Right, and in some ways

1:19:31

it really is the same thing

1:19:33

that has made people like us

1:19:35

able to make sense of the

1:19:37

terminal interface, where you have to

1:19:39

enter the commands in this order,

1:19:41

or it's not going to work,

1:19:43

or if you screw up the

1:19:45

rm command, it's going to permanently

1:19:47

delete files that you didn't want

1:19:49

to delete. And it's... Oh

1:19:51

yeah, of course, because that was

1:19:53

my fault. because I entered the

1:19:55

command wrong. And we're like, ah,

1:19:57

curse me, you get mad and

1:19:59

you curse your luck, but you'd

1:20:02

think, well, it was my fault,

1:20:04

I put the command in wrong. And

1:20:06

we... where our minds naturally go

1:20:08

that way and most other people's

1:20:10

do not. And they shouldn't. They

1:20:12

shouldn't be expected to learn the

1:20:14

magic way. And even to learn

1:20:16

something as simple as it makes

1:20:18

total sense to me that Siri

1:20:20

and Alexa are now very good

1:20:22

at setting timers on devices. And

1:20:24

these other ones aren't because these

1:20:26

ChatGPT- and Copilot-style ones don't

1:20:28

have the device context. They don't

1:20:30

run. They're just apps on your

1:20:32

phone, right? So it kind of

1:20:34

makes total sense to me. But

1:20:36

to a normal person, it's like,

1:20:38

I'm just talking to these things.

1:20:40

Set a timer. If you're supposed

1:20:42

to be so smart, I could

1:20:45

hire the world's worst human assistant,

1:20:47

somebody who is so inept and

1:20:49

bad at their job that I'm

1:20:51

going to have to have an

1:20:53

uncomfortable conversation and fire them, but

1:20:55

they could set a timer for

1:20:57

five minutes if I told them

1:20:59

to. Like when you look at

1:21:01

what Google did, they seem to

1:21:03

take that as a moment to

1:21:05

say, okay, we'll still have like

1:21:07

this added on functionality for that.

1:21:09

We need to have that functionality.

1:21:11

And I believe now I have

1:21:13

to look on the pixel and

1:21:15

I have to set it up

1:21:17

and I'm sure there are going

1:21:19

to be some Android listeners that

1:21:21

will say, yeah, that's just an

1:21:23

extension now, you just have to

1:21:25

plug it in, but that's still

1:21:27

complicated. It's not what I would

1:21:30

see, like, Apple

1:21:32

doing or even Amazon doing, right.

1:21:34

Yeah. Or the most used, most

1:21:36

requested things of Siri and Alexa

1:21:38

are. And they're going to make

1:21:40

sure that those things do not

1:21:42

break. Yeah, I think so, right.

1:21:44

And it's just, they're approaching it

1:21:46

from very different perspective than these

1:21:48

other, the other chatbot makers. Here, I'm

1:21:50

gonna hit the money bell one

1:21:52

last time. What do you

1:21:54

think of this gimmick? Let the

1:21:56

people know. I mean, it does

1:21:58

break up the show. Yeah, I'm

1:22:00

gonna I'm gonna stick with it.

1:22:02

Anyway, I want to use the

1:22:04

bell. Yeah, I used it last

1:22:06

week with Merlin, but Merlin's the

1:22:08

one who had the bell I

1:22:10

said I wish I had a

1:22:13

bell, but I didn't know. It

1:22:15

turned out he had a bell

1:22:17

on his desk and he hit

1:22:19

it for me. And then I

1:22:21

went on Amazon and bought myself

1:22:23

a bell. Anyway, that was the

1:22:25

money bell and I got to

1:22:27

thank our third and final sponsor

1:22:29

of this episode. It is our

1:22:31

good friends at Memberful. Memberful is

1:22:33

Best in Class membership software used

1:22:35

by many of the web's biggest

1:22:37

independent creators, publishers, and even bigger

1:22:39

media companies. It's really really good.

1:22:41

It lets you offer membership perks

1:22:43

and exclusive content to your loyal

1:22:45

followers and fans, readers, listeners, watchers,

1:22:47

whatever it is that you're creating,

1:22:49

giving you full control over who

1:22:51

has access to your blog posts,

1:22:53

email newsletters, online courses,

1:22:56

podcasts, private community chats, and more.

1:22:58

and with Memberful you maintain

1:23:00

full control over your brand your

1:23:02

audience and your business if you

1:23:04

ever want to leave you can

1:23:06

just export the full membership list

1:23:08

take it with you build your

1:23:10

own system take it somewhere else

1:23:12

It's a sign of confidence that

1:23:14

you won't want to leave Memberful,

1:23:16

that they make it so easy

1:23:18

to export and leave if you

1:23:20

ever wanted to. Their brand doesn't

1:23:22

get put in front of yours.

1:23:24

It's not, for example, Substack,

1:23:26

where everybody who's on Substack,

1:23:28

it looks like Substack

1:23:30

and says Substack. Memberful's behind

1:23:32

the scenes; it's quiet. Your audience may not

1:23:34

even know you're using Memberful. It's

1:23:36

that behind the scenes, which I

1:23:39

think is super, super creator friendly.

1:23:41

Full customization, you can set up

1:23:43

and manage your membership program with

1:23:45

Memberful's intuitive platform, create multiple membership

1:23:47

tiers if you want to, and

1:23:49

payment options that cater to different

1:23:51

audience segments, seamless integration, they integrate

1:23:53

with all the tools you already

1:23:55

use, like WordPress, Mailchimp, Google Analytics,

1:23:57

Discord, Discord, a big one, where

1:23:59

you can gate access to your

1:24:01

community. Discord for members only through

1:24:03

memberful. They already have an integration

1:24:05

with that to make it as

1:24:07

easy as possible. It's just a

1:24:09

great way and it is clearly

1:24:11

the trend for independent media going

1:24:13

forward. And they have great customer

1:24:15

support too where you can contact

1:24:17

somebody for help for your particular

1:24:19

case and they can make recommendations

1:24:22

because they know from their experience

1:24:24

what sort of things work and

1:24:26

which don't for building it and

1:24:28

making it successful. like which sort

1:24:30

of things you should make members

1:24:32

only and which ones you should

1:24:34

put out there for everybody else

1:24:36

so that they want to become

1:24:38

members of your site. They help

1:24:40

you with that. So anyway, where

1:24:42

do you go to find out

1:24:44

more? Go to memberful.com/talkshow. M-E-M-B-E-R-F-U-L dot com slash

1:24:46

talk show. As we head down

1:24:48

to home stretch, have you had

1:24:50

this experience? I think everybody in

1:24:52

our racket dealing with Apple has,

1:24:54

where you encounter an issue or

1:24:56

a problem or an observation. And

1:24:58

when you're communicating with Apple PR

1:25:00

about it, they act like, ah,

1:25:02

they don't say you're crazy, they

1:25:04

don't, they don't gaslight you per

1:25:07

se, but they, they act like

1:25:09

it's like, ah, nobody's ever said

1:25:11

that before. Huh. which I guess

1:25:13

is sort of gaslighting, but

1:25:15

for me I ran into this

1:25:17

testing the iPhone 16s in September

1:25:19

and they didn't say you

1:25:21

should install the 18.1 beta but

1:25:23

they didn't discourage it and for

1:25:25

reviewers who don't know how to

1:25:27

do it they gave instructions. And

1:25:29

so I used them for a

1:25:31

couple days with 18.0, which was

1:25:33

what they were going to ship

1:25:35

within the box, which I thought

1:25:37

was important to note. If you

1:25:39

buy it when you first unwrap

1:25:41

it, here's what you get. And

1:25:43

then I put the 18.1 beta

1:25:45

on. And that's when I first

1:25:47

got the new Siri. And the new

1:25:50

Siri voice is definitely more human.

1:25:52

And I still use, I forget,

1:25:54

they don't name their voices cutie

1:25:56

style anymore, they just call them

1:25:58

like voice one, voice two. I

1:26:00

think I'm a voice three person,

1:26:02

which is sort of the traditional

1:26:04

like the original Siri female American voice.

1:26:06

Let's see, oh, nope, it's a

1:26:08

voice four. American voice four is

1:26:10

my Siri. And it definitely sounds

1:26:12

more realistic. But also, it is talking

1:26:14

to me way too slow. It's

1:26:16

so slow. I have American

1:26:18

voice two. I've been playing around

1:26:20

with them though, so. I'm going

1:26:22

to start playing around. The colors

1:26:24

of the sky fade with the

1:26:26

setting sun as the stars begin

1:26:28

to shine through the clear night.

1:26:30

The colors of the sky fade.

1:26:33

The colors of the sky fade

1:26:35

with the setting sun. I'm so

1:26:37

sorry, you were saying yours is

1:26:39

slower and now I'm trying to

1:26:41

download this and sorry. Yeah, yeah.

1:26:43

I don't hear the slowness in

1:26:45

those canned examples when you're trying

1:26:47

the voices, but just day to

1:26:49

day talking to it and getting

1:26:51

verbal answers back. It seems like

1:26:53

new Siri simultaneously does sound more

1:26:55

human in inflection, which has been

1:26:57

like a slow boiling frog as

1:26:59

Federighi told you in your interview.

1:27:01

over 15 years now, it's gotten

1:27:03

more and more realistic incrementally every

1:27:05

couple years. But I find that

1:27:07

it talks way too slow. And

1:27:09

they were like, ah, nobody said

1:27:11

that. And maybe I'm the only

1:27:13

one to notice. I don't know.

1:27:16

But then if you go into

1:27:18

set it on your phone, you

1:27:20

can go into Settings, Accessibility, Siri,

1:27:22

and there's a talking speed. And

1:27:24

I've turned it up to like

1:27:26

110 percent, which to me still

1:27:28

sounds a little slow. But the

1:27:30

other thing, and Amy agrees with

1:27:32

me, because one place I hear

1:27:34

the, I hear the Siri voice

1:27:36

all the time is when we

1:27:38

are driving and going somewhere and

1:27:40

getting directions — is the new Siri

1:27:42

has like a Gen Z vocal

1:27:44

fry that I find annoying. And

1:27:46

it's not friendly, but it'll be

1:27:48

like, get off at exit 347?

1:27:50

It's like, why? Why are you

1:27:52

talking like this? And I realize

1:27:54

that it is sort of a

1:27:56

verbal tic of younger people to

1:27:59

sort of inflect like that. But

1:28:01

I don't want my phone talking

1:28:03

to me like that. I don't

1:28:05

know. Have you noticed this? Or

1:28:07

is it all just me? Amy

1:28:09

definitely noticed it. Yeah. I think

1:28:11

it might be a voice four

1:28:13

thing. And I'm going to try

1:28:15

switching after this, after we're done

1:28:17

recording. And I just switched to,

1:28:19

I've just downloaded. Well, it says

1:28:21

still downloading voice four. I think

1:28:23

this is some sort of bug,

1:28:25

because I've got very good connection.

1:28:27

It should be downloading or downloaded

1:28:29

already. I noticed something different than

1:28:31

that, which was when I first

1:28:33

was using the beta of iOS

1:28:35

18.1 in my car, it would

1:28:37

just misunderstand in a real bad

1:28:39

way. It didn't make any sense

1:28:41

if I would ask, like, direct

1:28:44

me to this or play this,

1:28:46

like, they were very, very egregious

1:28:48

errors. So that's gotten better, I

1:28:50

feel. I'm going to keep an

1:28:52

eye on that, but I feel

1:28:54

like that was my biggest thing

1:28:56

about the switch. Yes, but to

1:28:58

your point, yes, and to Apple's

1:29:00

credit, it has gotten to sound

1:29:02

more natural. They did really highlight

1:29:04

the fact that if you ask

1:29:06

something and then you flub, or

1:29:08

if you like, oh, actually I

1:29:10

meant this, it picks up on

1:29:12

that. I've noticed all of that

1:29:14

working very well. And that happens

1:29:16

to me frequently. I'll say, oh,

1:29:18

turn off the lights in the,

1:29:20

and actually I mean... I, ostensibly

1:29:22

as a podcaster I should be

1:29:24

fairly good at putting sentences together

1:29:27

without stammering but when I'm talking

1:29:29

to Siri, I guess I know

1:29:31

I'm talking to a device and

1:29:33

so I don't devote my full

1:29:35

attention to it because I don't

1:29:37

have the respect I have for

1:29:39

it that I have for either

1:29:41

a person I'm talking to physically

1:29:43

or like here I'm conscious of

1:29:45

the fact that tens of thousands

1:29:47

of people are going to listen

1:29:49

to the podcast and so it

1:29:51

has my full attention in a

1:29:53

way that when I'm just asking

1:29:55

Siri to open the living room

1:29:57

shades in the morning it doesn't.

1:29:59

For me, I'll say I want

1:30:01

to I want to do this

1:30:03

thing, but then I just slightly

1:30:05

change it and that works pretty

1:30:07

well Yeah, because like you change

1:30:10

your mind halfway through because you

1:30:12

started the command before you've really

1:30:14

thought it through you like, oh

1:30:16

no, actually make it 11 minutes

1:30:18

not 7 minutes or something like

1:30:20

that Like I was going to

1:30:22

heat up a piece of pizza

1:30:24

and the oven's not even hot

1:30:26

yet. So yeah, I don't seven

1:30:28

minutes. It's not going to be

1:30:30

hot. Give it 10 minutes. Give

1:30:32

it 10 minutes. you change it

1:30:34

halfway through. Definitely, it's a real

1:30:36

noticeable improvement and also one that

1:30:38

I've already taken for granted. No

1:30:40

longer seems impressive. Which just goes

1:30:42

to what Craig was saying, which

1:30:44

is that that we're just going

1:30:46

to keep wanting more and more.

1:30:48

But to be fair, it is

1:30:50

a pretty low bar still. I

1:30:53

mean, I think for me, and

1:30:55

I highlighted this in my Apple

1:30:57

intelligence review, is that when you

1:30:59

say that Siri is getting better,

1:31:01

people expect some of the worst

1:31:03

parts of Siri to get better.

1:31:05

And I think that one of

1:31:07

the worst parts of Siri for

1:31:09

me, well, isn't like, speaking of

1:31:11

the great Larry David, there's that

1:31:13

great Larry David scene, which I

1:31:15

think you linked to a few

1:31:17

months ago. It's not that the misunderstanding

1:31:19

happens, and it's funny, or it

1:31:21

triggers all of the devices when

1:31:23

you want to ask one, and

1:31:25

it's triggering your computer or the

1:31:27

HomePod. Those are all, I

1:31:29

think, other Siri quirks that we

1:31:31

live with, and it's fine. Search

1:31:33

the web. Right, something that I

1:31:36

think is pretty normal or I

1:31:38

expected to have the answers and

1:31:40

it tells me to search the

1:31:42

web is where I think that

1:31:44

like Siri's stupidity lies. Yeah, totally.

1:31:46

And it just feels frustrating. And

1:31:48

now that these other bots can

1:31:50

do it, it really stands out

1:31:52

as an omission. It's like, don't

1:31:54

just tell me to go to

1:31:56

Wikipedia. I'm asking you verbally, because

1:31:58

I want a verbal answer. And

1:32:00

I'm 100% certain that you could

1:32:02

parse the Wikipedia page that you're

1:32:04

sending me to and get the

1:32:06

answer from the first paragraph. And

1:32:08

I know you could do it.

1:32:10

And you're just refusing not to.

1:32:12

The Larry David thing was so

1:32:14

funny and I saw and it

1:32:16

was him talking to her to

1:32:18

Siri in his car and getting

1:32:21

angrier and angrier at the directions

1:32:23

and the argument they're having over

1:32:25

trying to get the directions

1:32:27

and somebody asked him about that

1:32:29

scene and he said it was

1:32:31

completely, you know, like almost everything

1:32:33

in the show is totally based

1:32:35

on real life except that in

1:32:37

real life he got way angrier.

1:32:39

And he said, usually the me

1:32:41

on the show is actually a

1:32:43

worse version of me than the

1:32:45

real me, I think. And he

1:32:47

said, in that case, the real

1:32:49

me was ready to drive the

1:32:51

car off the cliff. I

1:32:53

mean, I think also there's he

1:32:55

just like fully, like he's cursing.

1:32:57

I linked to it a few

1:32:59

months ago and some readers got

1:33:01

mad because they said it was

1:33:04

just completely not suitable and he's

1:33:06

really cursing. But sometimes you really

1:33:08

need curse words and sometimes when

1:33:10

Siri and these things really let

1:33:12

you down, there's no other way

1:33:14

to express yourself completely. The other

1:33:16

thing I've noticed with the chat

1:33:18

GPT integration with Siri and Apple

1:33:20

Intelligence is it's clearly not the

1:33:22

chat GPT you get in the

1:33:24

app. Just like pure access to

1:33:26

the model, which is powerful and

1:33:28

does amazing things like write poems

1:33:30

that Siri or Apple Intelligence

1:33:32

can't do, but it doesn't have

1:33:34

the integration with the live web

1:33:36

results that the ChatGPT app

1:33:38

does. So there's the whole training

1:33:40

window model problem where, oh, this

1:33:42

model was trained on data up

1:33:44

to the summer of 2022, and

1:33:47

that's where all of its knowledge

1:33:49

ends. So it doesn't know anything

1:33:51

that happened after whatever the cutoff

1:33:53

date is. When you're using the

1:33:55

ChatGPT app now, I don't even

1:33:57

notice that anymore because behind the

1:33:59

scenes it'll go on the web

1:34:01

and get live answers for last

1:34:03

week's election or sports the World

1:34:05

Series that took place last month

1:34:07

or something like that. And when

1:34:09

you're using the ChatGPT integration in

1:34:11

Apple Intelligence, you don't get any

1:34:13

of that. It just isn't there.
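A rough sketch of the retrieval fallback being described — the training cutoff matches the "summer of 2022" example above, and the model and searchWeb parameters are stand-ins for whatever the app actually does behind the scenes:

```swift
import Foundation

// If the question likely concerns events after the model's training cutoff,
// pull live web results into the prompt; otherwise answer from the model alone.
let trainingCutoff = DateComponents(calendar: .current, year: 2022, month: 6).date!

func answer(_ question: String,
            eventDate: Date?,
            model: (String) -> String,
            searchWeb: (String) -> String) -> String {
    if let eventDate, eventDate > trainingCutoff {
        let live = searchWeb(question)   // e.g. last week's election, last month's World Series
        return model("Use these results:\n\(live)\n\nQuestion: \(question)")
    }
    return model(question)               // pre-cutoff knowledge is already in the weights
}
```

The point of the complaint is that the ChatGPT app does something like the first branch, while the integration inside Apple Intelligence only ever takes the second.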

1:34:15

Again, if that integration in Apple

1:34:17

Intelligence wasn't there, I wouldn't miss

1:34:19

it because when I want to

1:34:21

ask a question like that, I

1:34:23

don't go to Apple Intelligence because

1:34:25

I know it's not going to

1:34:27

work. I go to the ChatGPT

1:34:30

app. So I'm not even sure

1:34:32

why just use the ChatGPT app

1:34:34

for ChatGPT things isn't Apple's answer

1:34:36

to these questions. See, I'm just,

1:34:38

and again, I'm doing more testing

1:34:40

and I'm planning to do something.

1:34:42

More on this in the coming

1:34:44

week. So I haven't really played

1:34:46

around deeply with the ChatGPT

1:34:48

stuff But when is it going?

1:34:50

I just don't know if I

1:34:52

have the best handle yet

1:34:54

on when it is going to

1:34:56

go ask ChatGPT. I mean,

1:34:58

I know it's asking ChatGPT

1:35:00

in the writing tools now, when

1:35:02

you specify a prompt. So

1:35:04

if you say write an email

1:35:06

to John Gruber telling him I

1:35:08

will definitely be on his podcast

1:35:10

but I can only do it

1:35:13

at these times make it professional

1:35:15

Right. That's using ChatGPT there, but

1:35:17

when I go to Siri and

1:35:19

I type something, I'm not getting

1:35:21

anything going to ChatGPT beyond my

1:35:23

poem ask. No. And I feel,

1:35:25

and it's one of those things,

1:35:27

I really do think that sometimes

1:35:29

talking to Apple, like we can.

1:35:31

You and I can through Apple

1:35:33

PR in a way that is

1:35:35

rare. It is sometimes like talking

1:35:37

to an LLM because they don't

1:35:39

want to give you in the

1:35:41

way that these LLMs won't just

1:35:43

say, and are technically incapable

1:35:45

of saying I don't know, right?

1:35:47

So they give you rather than

1:35:49

tell you I don't know how

1:35:51

to make your cheese stick better

1:35:53

to your pizza. I'm sorry. I

1:35:55

realize that's a problem, but I

1:35:58

don't know. They'll just make up

1:36:00

an answer like, use Elmer's glue.

1:36:02

And Apple, when you ask them

1:36:04

questions, like, when does it go

1:36:06

to ChatGPT? They're like, ha, that's

1:36:08

a good question. And because they

1:36:10

don't want to answer. Instead of

1:36:12

saying, you know what, we don't

1:36:14

want to give an answer to

1:36:16

that, because we want to. and

1:36:18

it went through to ChatGPT.

1:36:20

But when I asked it before,

1:36:22

just for a meatball recipe, it

1:36:24

did not go to ChatGPT.

1:36:26

See, I don't know how, I

1:36:28

don't know how to explain that.

1:36:30

Live testing here on the show.

1:36:32

Yeah, and when you interviewed Federighi

1:36:34

and of course on camera, he's

1:36:36

going to stay on message, but

1:36:38

it was a great interview and

1:36:41

it was insightful in many ways,

1:36:43

but in terms of getting an

1:36:45

answer like that, they just... They

1:36:47

know it's a thing and they

1:36:49

know when and why it's going

1:36:51

to chat GPT obviously, but they

1:36:53

don't want to talk about it.

1:36:55

And I think part of it

1:36:57

is that they want to be

1:36:59

flexible so that if they have

1:37:01

a if if they're on device

1:37:03

processing is better in three months

1:37:05

than it is today that it

1:37:07

will do things on device that

1:37:09

it previously went to private cloud

1:37:11

compute and and maybe it'll use

1:37:13

private cloud compute for things that

1:37:15

previously it was handing off to

1:37:17

chat GPT and they don't want

1:37:19

to give an answer in November

1:37:21

that may not be true in

1:37:24

March, but they don't want to

1:37:26

say that either. I wish that

1:37:28

they in the same way that

1:37:30

I really, really wish ChatGPT would

1:37:32

just say, I don't know how

1:37:34

to answer that. When it doesn't

1:37:36

know how to answer it, I

1:37:38

wish Apple would tell me, yeah,

1:37:40

we just don't want to answer

1:37:42

that. Rather than talk around it.
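
To make the tiering described above concrete, here is a toy sketch, again hypothetical, of a router choosing between on-device processing, Private Cloud Compute, and the ChatGPT hand-off. The thresholds are deliberately arbitrary knobs, since Apple hasn't said publicly where those lines are drawn and, as noted, they may move between releases.

from enum import Enum, auto

class Tier(Enum):
    ON_DEVICE = auto()
    PRIVATE_CLOUD_COMPUTE = auto()
    CHATGPT = auto()

def route(needs_world_knowledge: bool, on_device_confidence: float) -> Tier:
    # Illustrative thresholds only; the point is that they can be retuned in
    # a software update, which is one reason not to promise specific routing.
    if not needs_world_knowledge and on_device_confidence >= 0.8:
        return Tier.ON_DEVICE
    if not needs_world_knowledge:
        return Tier.PRIVATE_CLOUD_COMPUTE
    # World-knowledge prompts get handed to the third-party model, with user consent.
    return Tier.CHATGPT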

1:37:44

Yeah. I mean, I think there's

1:37:46

a lot of things going on

1:37:48

because I'm also using this too,

1:37:50

and it's using obviously the knowledge

1:37:52

base of Wikipedia, which it has

1:37:54

had before, and it's using whatever

1:37:56

other web sources it's already brought

1:37:58

in. It's clearly, there's some segmentation

1:38:00

of, use ChatGPT for X, Y,

1:38:02

Z types of prompts. And to

1:38:04

your point, maybe we can ask

1:38:07

Apple and they will get back

1:38:09

to us or maybe they will

1:38:11

just have an LLM response that

1:38:13

is just, we are looking into

1:38:15

it, but thank you for inquiring

1:38:17

about this. Pretty much. I don't

1:38:19

have anything else that I wanted

1:38:21

to talk about at least on

1:38:23

this subject and you're, I always

1:38:25

appreciate it. I love talking to

1:38:27

you, Joanna. I love

1:38:29

this conversation. I mean, I took

1:38:31

a lot away from it, but

1:38:33

especially that I must watch these

1:38:35

movies. And I realize now the

1:38:37

one thing I will, it's the

1:38:39

last thing I wanted to talk

1:38:41

to you about, but I'm a

1:38:43

little, I'm a little hurt. I

1:38:45

learned only through your most recent

1:38:47

video, after I, I don't know

1:38:50

which one of those bots started

1:38:52

addressing you as Joe. And you

1:38:54

said, well, it's interesting because I

1:38:56

didn't tell it to call me

1:38:58

Joe, but my closest friends call

1:39:00

me Joe. Well, you never told

1:39:02

me to call you Joe. You

1:39:04

can call me Joe. My family

1:39:06

calls me Joe. My friends from

1:39:08

like high school, college, call me

1:39:10

Joe. My wife sometimes, yeah, everyone

1:39:12

calls me Joe. You can. You

1:39:14

can do it. You're there. We're

1:39:16

there. I didn't feel like I

1:39:18

was there, but ChatGPT,

1:39:20

but I mean, it really took

1:39:22

like that. Yeah, I think my

1:39:24

only other thing that I wanted

1:39:26

to say was you wrote an

1:39:28

amazing essay last week and I

1:39:30

don't know if you've talked about

1:39:32

it on your show, but I

1:39:35

can interview you about that if

1:39:37

you'd like. We, I did talk

1:39:39

about it with Merlin Man last,

1:39:41

on the last episode, but that's

1:39:43

all right. It's all right. I

1:39:45

did. And thank you very much.

1:39:47

The response has been, it's an

1:39:49

unusual essay for me and I

1:39:51

will just say, too, that it

1:39:53

generated way more responses than I

1:39:55

usually get to anything I write.

1:39:57

And every single one of them,

1:39:59

not even 99, but 100% of

1:40:01

them that I've seen, have just

1:40:03

been gracious and wonderful, and I'm

1:40:05

trying to, unlike my usual self

1:40:07

who just reads them and gives

1:40:09

up on answering. I'm trying to

1:40:11

answer everyone and if anybody wrote

1:40:13

and I didn't answer and you're

1:40:15

listening know that I did read

1:40:18

it and I thank you. It

1:40:20

was very very touching. No I

1:40:22

think it was it was I

1:40:24

loved it and I think sometimes

1:40:26

we don't get to see the

1:40:28

other sides of people I don't

1:40:30

even know if it was so

1:40:32

much of that because I feel

1:40:34

like I do know you a

1:40:36

little bit as a as a

1:40:38

friend and some of your personal

1:40:40

life, but just the way

1:40:42

you've threaded it was

1:40:44

so, so

1:40:46

wonderful, so touching. Thank you, thank you.

1:40:48

And it's, it's, it's, and I

1:40:50

always, Amy always knows, Amy knows

1:40:52

that when I'm the most uncertain

1:40:54

about publishing something, it turns out

1:40:56

to be like one of the

1:40:58

best things I've written. It's, I,

1:41:01

I, my mom died back at

1:41:03

the end of June, very end

1:41:05

of June, and like her funeral

1:41:07

was July first. So I, for

1:41:09

a while I was saying July,

1:41:11

but it was the funeral, it

1:41:13

was ended, but anyway, it was

1:41:15

summer. And I had no urge

1:41:17

to write about it, but I

1:41:19

thought about it, and I thought

1:41:21

about it, and it's been there,

1:41:23

and then the election happened, and

1:41:25

my dad lost his ring, and

1:41:27

the whole thing just came, it

1:41:29

just, yeah, it was, it just

1:41:31

formed in my head, and it's,

1:41:33

yeah, that thing where your mom

1:41:35

died, has been turning in my

1:41:37

head for so many months, and

1:41:39

it just was like, oh, I

1:41:41

need to write this, and I

1:41:44

think people might, it might... suit

1:41:46

the moment for a lot of

1:41:48

people to read and it took

1:41:50

me like three days to write

1:41:52

and by the time I got

1:41:54

done with it I was like

1:41:56

ah this seems self-indulgent. I don't

1:41:58

know. No I think that's where

1:42:00

it was, like, your story, but

1:42:02

I think at least for me

1:42:04

it was so relatable. I'd also

1:42:06

gotten some bad news about a

1:42:08

family friend last week and it

1:42:10

was just everyone was not everyone.

1:42:12

Oh unfortunately not everyone yeah but

1:42:14

many people were sad and it

1:42:16

just was a it was beautifully

1:42:18

written actually I was wondering though

1:42:20

that about your writing and you're

1:42:22

very out and open about your

1:42:24

politics, and you must have

1:42:27

some readers. I mean, look at

1:42:29

the makeup of this country right

1:42:31

now. I mean, you must have

1:42:33

some readers that don't, that did

1:42:35

not agree or did not vote

1:42:37

Harris. Do they share with you?

1:42:39

I mean, what, do they? So

1:42:41

that's, that's very interesting. I

1:42:43

like this bonus interview. Yeah, this

1:42:45

is my, this is my podcast

1:42:47

that I don't have. But again,

1:42:49

but I often hear this when

1:42:51

you're on the show, is this

1:42:53

show, is that people love it

1:42:55

because you tend to

1:42:57

take over. I waited

1:42:59

two hours. I know, and I'm

1:43:01

in 51 minutes. And again, I'm

1:43:03

often reluctant. My nature is to

1:43:05

be reluctant to talk about the

1:43:07

behind the scenes thing. But on

1:43:09

this particular one, I'm actually happy

1:43:12

to. So I started Daring Fireball

1:43:14

in 2004, and I had very

1:43:16

strong opinions politically about the George

1:43:18

W. Bush Dick Cheney administration, and

1:43:20

especially the second term. And I

1:43:22

almost never wrote about it on

1:43:24

Daring Fireball. And it wasn't because

1:43:26

I was, oh, I'm just starting

1:43:28

this and I don't want to

1:43:30

piss people off. I just thought,

1:43:32

even though I had extremely strong

1:43:34

opinions, I felt in the natural

1:43:36

way, in the way that people

1:43:38

who don't agree with me politically

1:43:40

still think I should separate them

1:43:42

because I come to your site

1:43:44

for tech, please leave this stuff

1:43:46

off. And I just, I kind

1:43:48

of felt that way about it

1:43:50

then. And I also... didn't spend

1:43:52

much time even though I really

1:43:55

really like Barack Obama, and

1:43:57

especially his 2008 election

1:43:59

was one of the best days

1:44:01

of my life. Me and Amy

1:44:03

and Jonas. Jonas was four at

1:44:05

the time. We voted in the

1:44:07

morning and flew to Chicago to

1:44:09

be with our friend Jim Coudal

1:44:11

and his lovely family. And we

1:44:13

went to Grant Park for the

1:44:15

big event, like, I don't know,

1:44:17

100,000 people. I mean, it's an

1:44:19

enormous place and it was packed

1:44:21

to watch the election results. And

1:44:23

I mean, I've had this feeling

1:44:25

many times, but I thought he

1:44:27

was gonna win. I mean, it's

1:44:29

partly why we flew to Chicago.

1:44:31

But we didn't know, you don't

1:44:33

know. And we're there for his

1:44:35

speech and it was remarkable and

1:44:38

they didn't know who we were,

1:44:40

but there's, I'll put a picture

1:44:42

here, I'll make a note and

1:44:44

I'll send it to you then.

1:44:46

But it was on the front

1:44:48

page of the Chicago Tribune the

1:44:50

next day. It was a picture

1:44:52

of people from Grant Park and...

1:44:54

And you can see me with

1:44:56

Jonas on my shoulders and Jim,

1:44:58

his son Spencer, was a couple

1:45:00

years older than Jonas, but was

1:45:02

on his shoulders even though he's

1:45:04

six or seven at the time.

1:45:06

And you could see Amy and

1:45:08

Jim's family, his wife Heidi. We're

1:45:10

in a crowd. It's not a

1:45:12

picture of us per se, but

1:45:14

especially Jonas and Spencer as the

1:45:16

kids who were there. I got

1:45:18

a, you know, and you can

1:45:21

buy, I don't know if the

1:45:23

journal does this, but you can

1:45:25

buy pictures from the newspaper and

1:45:27

we bought, we have a big

1:45:29

frame version of the picture. It's

1:45:31

just this great moment. I didn't

1:45:33

spend a lot of time writing

1:45:35

about Obama in that time either.

1:45:37

So it's not just that I

1:45:39

hate on Republicans. It

1:45:41

just didn't feel like the place. It

1:45:43

crossed a line that's other than

1:45:45

politics. It's not just policy and

1:45:47

you agree with it or you

1:45:49

don't agree with it and it's

1:45:51

traditional conservatism and liberalism and the left-right binary.

1:45:53

It's these bigger things like just

1:45:55

truth and lies and competence and

1:45:57

abject stupidity. Or as I love

1:45:59

the word, kakistocracy. I can't say

1:46:01

it, but I love... the word

1:46:04

and it's a word that means

1:46:06

government of the least competent people.

1:46:08

And you know nominating this idiot

1:46:10

Gaetz from Florida to be the

1:46:12

attorney general is that it goes

1:46:14

beyond the definition of kakistocracy. But

1:46:16

the whole first Trump administration was

1:46:18

full of this stuff. And then

1:46:20

he tried to overturn the results

1:46:22

of a fair and free election. That

1:46:24

to me isn't political in a

1:46:26

traditional sense. It's different. And I

1:46:28

couldn't see keeping my mouth shut

1:46:30

about it for four years. And

1:46:32

yes, I definitely heard from people

1:46:34

who were upset about it, people

1:46:36

who still liked Trump while he

1:46:38

was the president. And I guess

1:46:40

I lost some number of readers.

1:46:42

I mean, I don't really pay

1:46:44

attention to analytics that much. It

1:46:46

seems like my site is popular

1:46:49

and I feel like when I

1:46:51

did pay attention to stats, it

1:46:53

was not helpful. And this time

1:46:55

around, was it any different? I

1:46:57

mean, it seems... What's interesting, so

1:46:59

what's interesting is when I posted

1:47:01

that, how it went, the essay

1:47:03

about my mom, and never really

1:47:05

talking about the election much other

1:47:07

than my experience watching the results

1:47:09

come in, in a very nerdy,

1:47:11

data-driven way. A bunch

1:47:13

of the emails I got were

1:47:15

from people who said, hey, we

1:47:17

disagree on politics, I voted for

1:47:19

Trump, blah, blah, blah, blah. But

1:47:21

it was just really nice. Like the

1:47:23

people who, some people stopped reading

1:47:25

from 2016 to 2020, and other

1:47:27

people who I think are more,

1:47:29

they disagree and they still, they

1:47:32

voted for him this month, but

1:47:34

I think are less passionate about

1:47:36

it for whatever their reasons to

1:47:38

vote for the son of a

1:47:40

bitch. They don't have as much

1:47:42

lava in the veins about it

1:47:44

and just wrote a very nice

1:47:46

thank you note and acknowledge that

1:47:48

we have different opinions on it

1:47:50

and but that was a lovely

1:47:52

thing to share. But here we

1:47:54

are again and facing down four

1:47:56

years of this, and I,

1:47:58

I don't know what I don't

1:48:00

have a strategy for it. I

1:48:02

posted I don't know if you

1:48:04

saw it, I posted last night

1:48:06

recently. I posted: there's

1:48:08

Mike Tyson, 58-year-old Mike

1:48:10

Tyson is fighting now. By the

1:48:12

time this podcast is out it'll

1:48:15

be over, but on Friday,

1:48:17

November 15th, he's fighting Jake Paul,

1:48:19

the YouTuber? And by the way,

1:48:21

it's a real fight, though it

1:48:23

sounds like a stunt, but at

1:48:25

the real, at the real weigh-in

1:48:27

for this fight, which has

1:48:29

to take place before a sanctioned

1:48:31

official boxing match, Tyson slapped him

1:48:33

in the face and I linked

1:48:35

to the headline at ESPN, Mike

1:48:37

Tyson slaps Jake Paul in the face,

1:48:39

and then I quipped the winner

1:48:41

of the fight, gets to be

1:48:43

the next secretary of the treasury.

1:48:45

Which... made me laugh out loud.

1:48:47

I thought last night sitting on

1:48:49

the couch, I literally laughed out

1:48:51

loud before I wrote it and

1:48:53

I thought, oh my God, I've

1:48:55

got to write that. And then

1:48:58

you realize there's a chance that

1:49:00

could be coming. Yeah, with Matt

1:49:02

Gaetz as the nominee for Attorney

1:49:04

General, I actually think Mike Tyson

1:49:06

as Secretary of the Treasury would

1:49:08

actually, if I could have one

1:49:10

or the other, I'd rather have

1:49:12

Mike Tyson as Secretary of the

1:49:14

Treasury, in all seriousness, than pedophile

1:49:16

creepo Matt Gaetz as Attorney General.

1:49:18

I really would. It's that it's

1:49:20

turned that much into pro

1:49:22

wrestling. Yeah. I mean, Hulk Hogan,

1:49:24

Hulk Hogan actually went to the

1:49:26

Republican Convention. He's got to be

1:49:28

mad that he's, he hasn't been

1:49:30

nominated. There's a lineup of people.

1:49:32

Kid Rock, there's a lot of

1:49:34

people that could be up for

1:49:36

some jobs that we don't know.

1:49:38

Yeah, Kid Rock, I mean, I

1:49:41

mean, he should be in charge

1:49:43

of like alcohol, tobacco and firearms,

1:49:45

right? That's right there for the

1:49:47

taking. It's pretty much the same

1:49:49

thing that you just described. Anyway,

1:49:51

I didn't want to get into

1:49:53

politics. I've just been wondering about

1:49:55

it. No. I realize, for example.

1:49:57

There's no way that you work

1:49:59

at the Wall Street Journal, and

1:50:01

you have a beat, and your

1:50:03

day job. You have to stick

1:50:05

to the beat, of course. And

1:50:07

other people who are more independent

1:50:09

creators, who the election came and

1:50:11

went, and there's nothing on their

1:50:13

site about it, whether pro or

1:50:15

con. I don't pass any judgment

1:50:17

on them at all, and you

1:50:19

write what you want to. But

1:50:21

I realize the way I do

1:50:23

that is a bit unique in

1:50:26

our field. I mean, we have

1:50:28

very strict standards and guidelines at

1:50:30

the journal, so I'm not even

1:50:32

allowed to really share much on

1:50:34

politics other than news headlines. And

1:50:36

I mean, I can stretch that,

1:50:38

but I just, I've tended not

1:50:40

to. I've written some pieces in

1:50:42

the last couple of weeks. I

1:50:44

wrote a lot about text spam,

1:50:46

and I got this crazy amount

1:50:48

of text spam a few weeks

1:50:50

ago, and it was completely all

1:50:52

right wing, like insane amount. And

1:50:55

I wrote about like trying to

1:50:57

track down, because I did text

1:50:59

stop. The story was basically,

1:51:01

I said I texted stop and

1:51:03

I got a flood:

1:51:05

within the first 24 hours I

1:51:07

had another 30 messages and it

1:51:09

went on for three days and

1:51:11

I had about a hundred messages

1:51:14

and so I wrote all about

1:51:16

this but they were all like

1:51:18

from Trump's, from PACs, Trump-supporting

1:51:20

PACs, all of those things, and

1:51:22

so a lot of people assumed

1:51:24

I was a Trump supporter and

1:51:26

so I got a lot of

1:51:28

hate mail from people saying you're

1:51:31

gonna support him. And then this

1:51:33

week I tweeted something where I

1:51:35

was on NBC News and I

1:51:37

said something about why people are

1:51:39

leaving X. They had interviewed me.

1:51:41

There are a lot of people

1:51:43

from the other side saying, you're

1:51:45

this, that, and so I try

1:51:47

to let it all be out

1:51:50

there. And of course, I've also

1:51:52

told my standards people at the

1:51:54

journal, and I've always said, if

1:51:56

these are social issues that affect

1:51:58

my life, I'm going to be

1:52:00

out there talking about them. On

1:52:02

social media. I mean, I wasn't

1:52:04

quiet around October. There are

1:52:06

certain things, you know, I'm not

1:52:09

quiet when we, when things are

1:52:11

about LGBT rights, I'm not quiet

1:52:13

about those things. People can

1:52:15

make their assumptions. But I try

1:52:17

to stay away from it all.

1:52:19

Yeah. But then there's Nilay's prescient line,

1:52:21

I fear, with RFK Jr. apparently

1:52:23

being a nominee for health and

1:52:25

human services. A vote for Donald

1:52:28

Trump is a vote for school

1:52:30

shootings and measles. So I said,

1:52:32

oh, I loved it. And it's

1:52:34

really true. It's both. It's what

1:52:36

makes us human is the, the,

1:52:38

I'm going to butcher the

1:52:40

phrasing, but the F. Scott Fitzgerald

1:52:42

line that the sign of a

1:52:45

first class intellect is being able

1:52:47

to hold two opposing thoughts in

1:52:49

your head at the same time.

1:52:51

So you can both find it

1:52:53

funny. and find it heartbreaking at

1:52:55

the same time, right? It is.

1:52:57

It is a vote for measles.

1:52:59

It is a vote for more

1:53:01

school shootings. It is. And those

1:53:04

are the vaccines are one of

1:53:06

the great technologies of all humankind.

1:53:08

It is just absolutely astonishing how

1:53:10

many people either died or went

1:53:12

blind or with polio wound up

1:53:14

unable to walk from these diseases.

1:53:16

from the time human beings existed

1:53:18

until vaccines and the wide distribution

1:53:20

of them. And nothing is as

1:53:23

electric emotionally as school shootings. It's

1:53:25

thank God that you can if

1:53:27

you want to put on your

1:53:29

data-analytic cap, you can say

1:53:31

Very few kids as a percentage

1:53:33

are in schools with a school

1:53:35

shooting in any given year. But

1:53:37

it's very small consolation when they

1:53:39

keep happening over and over and

1:53:42

over again, and randomly, anywhere, right?

1:53:44

So it's, it's also, it's

1:53:46

very much human nature that anything

1:53:48

that's randomly reinforced is on your

1:53:50

mind. And yeah, it's, I guess

1:53:52

if you want to be really

1:53:54

and to bring

1:53:56

the whole episode

1:53:59

full circle: the bots don't,

1:54:01

my friend bots,

1:54:03

they don't need

1:54:05

vaccines.
