Could AI Solve The Loneliness Epidemic?

Released Friday, 4th April 2025

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

Welcome to Sad Boyz, a podcast about feelings

0:02

and other things also. I'm Jarvis. I'm Jordan.

0:04

And now what do we do? Do we just do

0:06

a podcast? You're wagging your, you're wagging your finger like

0:08

Dikembe Mutombo. You think it's adding maybe a little

0:10

sauce to the show? Well, I feel like, right, but

0:13

who's, who are you saying that to? You can't do

0:15

that. Do what? Well, there's plenty of people listening.

0:17

Someone's probably doing something. You can't doing something you

0:19

can't do that. You can't do that. You can't

0:21

do that. You can't do that. You can't do

0:23

that. You can't do that. You can't do that.

0:26

No Sam, you can't do the dishes right

0:28

now. Quit, no Sam, stop it, you

0:30

can't do that. I know you're doing

0:32

chores. Every week we have a new

0:34

person who we tell to stop doing

0:37

something. I always picture it as dishes.

0:39

If someone's listening, doing the dishes. Stop

0:41

doing the dishes? It's always dishes. Why

0:43

would you want them to stop doing

0:45

the dishes? Are you a part of

0:47

big, dirty, big dirty? I'm big and

0:49

dirty. Cut it out. What's the... Ever

0:52

since Jordan got in the pocket of

0:54

Big Dirty, the whole podcast is going

0:56

downhill. Everything underneath the clothes, pure dirt.

0:58

Oh man, yeah. What's a, wait, what's

1:00

the default chore in your mind? I go

1:03

to dishes. Trash. Trash. Trash is default, but

1:05

you know I have a history. That's true.

1:07

I was born of the trash. It's a war.

1:09

It's a war at this point. It's a, it's

1:12

the chore war, war, war. This is the first

1:14

place you've lived in the first place

1:16

you've lived in living memory. I

1:18

would say that there is it's possible

1:20

that there is a trash war.

1:23

I just haven't gone public. It's

1:25

kind of like a trash

1:27

cold war. It's a proxy

1:29

battle. I've noticed some other

1:31

people using your trash bins.

1:33

That's all I'll say. The CIA

1:35

over here. Get with his foreign

1:38

information. Messing with our domestic affairs.

1:40

I think he's spreading out. Well,

1:43

sometimes one of my trash pet

1:45

peeves. It's not a big deal.

1:47

is... Those who have listened to the podcast

1:49

for a very long time will

1:52

remember some of my trash

1:54

trials and tribulations TTT. The historians

1:56

of you out there were right

1:58

right but I would say that

2:00

recently, not too many issues, except

2:02

I do have a pet peeve

2:04

when someone will use my compost

2:06

and I don't take my, like,

2:08

it makes, if I haven't composted

2:10

anything, I may not think that

2:12

I need to take it out.

2:14

And then so I have to

2:16

check to see if someone else

2:18

is, because one time someone did

2:21

yard work and then like put

2:23

their stuff in my bin, and

2:25

then I needed to remember to

2:27

take it out because then when

2:29

we were doing yard work it

2:31

was full and I was like

2:33

well egg on my face. More

2:35

fool me. But I would say,

2:37

ethical question because I do think

2:39

there's like some crime right you're

2:41

not supposed to, like, use someone's

2:43

trash. Isn't there a crime somewhere?

2:45

Do not kill. I'm not saying

2:47

that someone else is doing that

2:49

crime to me I'm about to

2:51

admit to a crime. There's like

2:53

illegal dumping, like if you're

2:55

like filling up a dumpster that

2:57

you're not supposed to be using.

2:59

Yeah, so here's my, here's my

3:01

moral conundrum. And it's a question

3:03

for the room. It's midnight, the

3:05

night before trash day, all the

3:07

bins are on the street. I'm

3:09

telling you like a story, like

3:11

a fairy tale. So it's the

3:13

night before trash day. They wrapped

3:15

up through the street. I've got

3:18

one of those sleep caps on

3:20

and I'm bundled up in my

3:22

bed. You already made a book.

3:24

So, and you, through one reason

3:26

or another, have a little bit

3:28

too much trash. So much trash

3:30

that you won't be able to

3:32

put it all in your bin.

3:34

However, the bins that are out,

3:36

the bins that will be picked

3:38

up at six in the morning,

3:40

whose owners are asleep, may have

3:42

some space. Your options are A,

3:44

leave the trash until after the

3:46

trash gets picked up and then

3:48

put it into a trash bin.

3:50

or B, toss the trash bag

3:52

into someone else's, who has room,

3:54

and it'll be unbeknownst to them.

3:56

what they don't know. They won't

3:58

even have the chance to fill

4:00

it. They won't even know because

4:02

it'll get picked up and they'll

4:04

be none the wiser. Unless they're

4:06

up at four a.m.: I've already taken

4:08

the trash out but now it's

4:10

time to throw out the mega

4:12

trash. I've got my Saratoga. I've

4:15

been doing push-ups on my balcony.

4:17

Right. I drank 12 Saratogas just

4:19

now. I need to toss them

4:21

into recycling before. And what's this?

4:23

I've used all my ice. I've

4:25

been able to throw the bag

4:27

away. Yeah, what are you doing

4:29

in that scenario? I'm known as

4:31

the bad boy at the show.

4:33

I'm kind of the renegade. Can't

4:35

hold me back. I'm basically John

4:37

Wick and the whole world killed

4:39

my dog. I'm holding nothing back.

4:41

I'm at war. I think first

4:43

movie is I forget to take

4:45

the trash out. That's scenario one.

4:47

Okay. Okay. So we're kind of

4:49

timeline A almost certainly is the

4:51

case. So in this world. Well,

4:53

let's say you had forgotten to

4:55

take the trash out last week.

4:57

And so now you have too

4:59

much trash. and you've been gaming

5:01

all night so you've been up.

5:03

I'm at my strongest. Yeah, you're

5:05

up till 2 a.m. and you're

5:07

like, I've got double trash. Well

5:09

now I've never been more focused

5:12

in my life than when I'm

5:14

in the game zone. Yeah, yeah,

5:16

and so now the gamer zone,

5:18

you know, a game just ended,

5:20

you just got a high score

5:22

in Balatro. I'm wiping my hands,

5:24

you're, you're blowing them like a,

5:26

like a, like a, like a

5:28

gun. Yeah, I'm trying to call

5:30

my family. They've started screening my

5:32

calls because I keep telling them

5:34

about it, which is a joke I've

5:36

used for a couple of years.

5:38

Right, you were talking about the

5:40

kill screen of Balatro, which is

5:42

reachable. Once you get enough exponents

5:44

or whatever, they can't store the

5:46

number and you start getting not

5:48

a number as your score. Me

5:50

on the phone. Hello? Yeah. So

5:52

what do you do? What do

5:54

you do? Do you use someone

5:56

else's trash can? What they don't

5:58

know won't hurt them?

6:00

I... Who gives a shit? Like,

6:02

just, if... Okay, here... You're shaking.

6:04

I have been walking a dog

6:06

before. Someone's trash cans in front

6:09

of their house. I'm holding this

6:11

poop. I'm just going to throw

6:13

it in their trash. It's in

6:15

front of their house. Crime. Criminal.

6:17

Who gives a shit though? You

6:19

just did. You just gave a

6:21

shit to the trash can. And

6:23

you're right too. You're literally trying

6:25

to... Listen, I'm a socialist and

6:27

I think all trash is for

6:29

the people. Yeah. There's like a

6:31

crowd of people here. Who just

6:33

hate this, like NIMBY people, who

6:35

all they do... The question was

6:37

for Jordan. No, like, you're gonna

6:39

be a frickin' Mrs. Kravitz and

6:41

constantly look out your window and

6:43

get mad if someone puts trash

6:45

in your trash can? Who cares?

6:47

Mr., I don't know, Mrs. Kravitz

6:49

is from an old TV show

6:51

that no one here was alive

6:53

to see, a show from the 50s called

6:55

Leave My Trash Alone. So I

6:57

agree. I am one to give

6:59

a shit to a trash can.

7:01

To gift. Wait, that's where you

7:03

shit in trash cans. Okay, hold

7:06

on. Okay, were you going to

7:08

dox it now? Yeah. There is

7:10

a near my place and the

7:12

exact route we walk the dog.

7:14

There are a collection of trash

7:16

bins that are never full and

7:18

always outside. You're asking for it.

7:20

Right, we are trash victim blaming

7:22

here. No, but I will do.

7:24

I'll admit it. I'll admit it.

7:26

I will go down the street

7:28

and I'll open up the trash

7:30

bins and I'll go, that one's

7:32

pretty empty. This, honestly, this bag

7:34

can fit and not only would

7:36

no one be the wiser, but

7:38

even if they wanted to throw

7:40

something away, they still could. And

7:42

so I'm like, it's actually, usually

7:44

only happens when we would like

7:46

host a big party and then

7:48

the next day, it would be

7:50

like a ton of people worth

7:52

of trash. So it would be

7:54

hard to otherwise get rid of

7:56

it quickly. This has been a...

7:58

We're all wearing different wires. I've

8:00

been an undercover cop the whole

8:03

time waiting for you. It all

8:05

makes sense. I just want to

8:07

hear a good explanation. Like, I

8:09

heard one person one time say

8:11

that they were mad that people

8:13

threw their dog poop in their

8:15

trash can because then when they

8:17

wheel the trash can into their

8:19

garage, it stinks. Doesn't trash normally

8:21

stink? Also, is your trash not?

8:23

Yeah, I mean, how much dog

8:25

poop? I feel like it takes

8:27

a lot of dog poop. I

8:29

will say as long as the

8:31

dog poop is contained. Yeah, if

8:33

it's in a bag. I pride myself,

8:35

this is weird, but I pride

8:37

myself on a clean garbage bin.

8:39

I do clean it out every

8:41

couple months. I will take a

8:43

scrub brush and a hose and

8:45

a hose and clean it out.

8:47

Because when I take the trash

8:49

out. Outside, not the can. Yeah,

8:51

the bin outside. That's crazy. That's

8:53

crazy. Because that's bonkers. I will

8:55

say though, I I was in

8:57

a situation once where I had

9:00

to order new trash bins from

9:02

the city and It was kind

9:04

of awesome like when you get

9:06

a new trash bin you're like

9:08

damn Like like you cleaned up

9:10

nice. I've never seen you like

9:12

this before. Well, it's like where

9:14

where I live like it's In

9:16

a trash can? Yeah, I live

9:18

in the dumpster. Yeah, with my

9:20

roommate Oscar. Yeah, you have a

9:22

comically placed banana peel on your

9:24

head, by the way. Stinklines coming

9:26

over. Right, yeah. Your hat. My

9:28

clothes are like plastic bags. Yeah,

9:30

and your hat is the top

9:32

of a trash bin. It's like

9:34

a jack in the box. Yeah.

9:36

identity. But like, it's an apartment,

9:38

so there's no like singular trash

9:40

bins. We all just throw our

9:42

bags in a dump, a collective

9:44

dumpster. And so it's like if

9:46

I have a lot of trash

9:48

towards the beginning of the week,

9:50

I don't want to throw all

9:52

of it away because I don't

9:54

want to fill it up too

9:57

much. So it's like, oh interesting.

9:59

If you have. like the individual

10:01

trash bins if it's the night

10:03

before like if everyone's trash bins

10:05

were out all week and you

10:07

started doing it like at the

10:09

beginning of the week. I've actually

10:11

only ever done this the night

10:13

before trash pickup because I want

10:15

to because I'm trying to minimize

10:17

inconvenience for for like basically in

10:19

my mind there's zero inconvenience because

10:21

it's not about keeping it a

10:23

secret they won't even know it

10:25

but it's more like it will

10:27

never affect them. And it's not,

10:29

I'm not putting- You're a considerate

10:31

trash bandit. Yeah, and I'm not

10:33

putting like raw poop in there

10:35

or anything like that that's gonna

10:37

dirty the trash bin. Because that

10:39

could be something that- Nobody would

10:41

do that. I've even been known

10:43

to double bag dog poop when

10:45

it's going on someone else's trash

10:47

can. I've even, I've like sometimes

10:49

used a bag. You're always supposed

10:51

to use a bag. Yeah, I'll

10:54

take it from my hand, put

10:56

it in a bag. I think,

10:58

you know, this, for me, it

11:00

feels like a bigger problem of

11:02

like a sense of community. I

11:04

agree. Where it's like, if someone

11:06

is so not in my trash

11:08

can, like, they don't have a

11:10

sense of community with their neighbors,

11:12

and that sucks. Yeah, like my

11:14

neighbor once, and this is a

11:16

situation where they were getting yard

11:18

work done, and they needed more

11:20

compost space. and asked if they

11:22

could use my whole bin and

11:24

I was like of course you

11:26

know and that's the thing it's

11:28

like if anyone were to ask

11:30

me I don't mind but it's

11:32

like a communal give-and-take type situation

11:34

like if there is a person

11:36

that it's the night before trash

11:38

they've already taken out all of

11:40

their trash and they have room

11:42

and they're like you can't use

11:44

my trash that's No, that's wild.

11:46

Like the class ends and they

11:48

ask about homework. Yeah, like morally

11:51

I see nothing wrong. I do

11:53

like feel like a ne'er-do-well when

11:55

I'm like opening up the trash

11:57

bins. a little mask on your

11:59

eyes. Yeah, me and the raccoon,

12:01

me and my raccoon homies are

12:03

going around, I'm looking for space,

12:05

they're looking for snacks. Yeah, they're

12:07

picking up dog food in both

12:09

their hands and scurrying away on

12:11

that like little bipedal stance. They're

12:13

trying to wash their cotton candy.

12:15

Can I, I'll throw out like

12:17

a kind of light dilemma in

12:19

this one I feel. So I

12:21

guess my recycling bin was stolen.

12:23

I don't know. It's been gone

12:25

a very long time. as a

12:27

refresher and you did it. Oh,

12:29

I will take the relatively small

12:31

amount of recycling I generate and

12:33

now I can fit in. Evening

12:35

before, or close to trash days,

12:37

sometimes the morning before because they

12:39

pick it up a little late.

12:41

I will sometimes break down a

12:43

box and then redistribute pieces of

12:45

the cardboard. Like a Johnny Appleseed

12:48

type. Like a Robin Hood. It's

12:50

because you're the Robin Hood of

12:52

recycling. I will, and they will

12:54

get all of the clout for

12:56

recycling. Okay, wait, so you're saying

12:58

that instead of, if you have

13:00

like a little bit of, if

13:02

you have a little bit of

13:04

recycling, you'll distribute that amongst other

13:06

people's bins instead of using your

13:08

own. If I have a very

13:10

little bit of recycling, we'll go

13:12

with the trash. No, no, no,

13:14

that makes sense. I've done that

13:16

as well, where I've had like

13:18

boxes. You know, you can call

13:20

to get a new bin. Yeah,

13:22

but I'm winning right now. I took

13:24

a picture of my trash bins

13:26

when I like moved to a

13:28

new place and so I have

13:30

the code that's on the front

13:32

and then in the past I've

13:34

also used like a paint pen

13:36

to like write my address yeah

13:38

yeah yeah usually I just say

13:40

the address of this is Jarvis

13:42

Johnson's trash. No, it's definitely mine.

13:45

my SSN is right there on

13:47

the front mother's maiden name I

13:49

like when I recently moved most

13:51

recently moved I had Uh, or

13:53

no, this happened a couple times.

13:55

Someone else takes my bin in

13:57

by mistake, because, you know, sometimes

13:59

there's not... space on the street

14:01

for where you normally put it,

14:03

so you have to like move it to

14:05

like a, you have to travel a little

14:07

bit, it has to go on a

14:09

vacation down the street. And then I

14:11

go to get the trash bin and

14:13

then it's missing, and then I refer

14:15

to my lookup table of what my

14:18

code is, and I kind of start

14:20

looking around at nearby trash bins and

14:22

going, all right, you brought this in,

14:24

but is this mine. And one time.

14:26

I had to walk maybe 20 feet

14:28

into someone's driveway and grab my trash

14:30

bin. And I felt like I stole

14:32

it. But it was mine. You know

14:34

what I mean? So it was a weird

14:37

kind of heist situation. Where was theirs?

14:39

That's for them to figure out. Hopefully

14:41

they have a picture on their phone.

14:43

Maybe theirs is slightly smaller and it's Russian

14:45

dolled into yours. But the thing is,

14:47

mine were new. And so I was

14:49

like, I'm not going to use someone's

14:51

crappy, dirty, uncleaned-out trash bin. I got

14:53

to clean it like Jacob? Am I crazy?

14:55

No, I'm not going to, we're not going

14:57

to swap swapsies. I'm taking mine back. Speaking

15:00

of community. Because it might get gross, I

15:02

will clean it. But like it's like there's

15:04

a certain amount of grossness that as long

15:06

as we don't. It's like how like your

15:09

body always gets dirty. So like why would you

15:11

shower? Speaking of community, you know who's

15:13

looking for a community. Dan Harmon.

15:15

Yeah. These people. Yes. Using

15:17

AI. That's sick, man. AI.

15:19

It's all over the news.

15:22

The time I hit. First,

15:24

first, we're done a little

15:26

bit about that. Oh my

15:28

gosh. It's like, Gina, you

15:30

know, sweet. It's, uh. making

15:32

some real work for our

15:35

subtitlers. No that was caption. I

15:37

will say, for caption purposes,

15:39

it says Jarvis says gibberish.

15:41

I like it when, on

15:43

occasion, it says

15:45

Jordan says gibberish, because I

15:48

will listen back and be like

15:50

I have no we I've stopped

15:52

sending it to you because the

15:54

few times I have I've been

15:56

like, what did you say here?

15:58

and you go? Was it one

16:01

of you that made reference the

16:03

other day that I went to

16:05

the Ren Fair last year, I

16:08

guess? And because the people that

16:10

often are working the booths and

16:12

the stores and stuff are in

16:15

character, a lot of the time

16:17

it'll be like, what can I

16:19

fetch you, me lord? And it's

16:22

like, I'll take a Heineken. I'd

16:24

like a Blue Moon. Can I

16:26

get a Liquid Death? I'll take

16:29

a ye olde Bacardi, please. A White

16:31

Claw please, ye olde, like the old

16:33

style. I will immediately on instinct

16:36

all of my like adapting to

16:38

my environment accent like I stopped

16:40

saying like I'm home right my

16:43

accent the accent I do here

16:45

or adjust here evaporates very quickly

16:47

and so they just they don't

16:50

like so me lord how do

16:52

you do what may I fetch

16:55

you? No one like you're a

16:57

love car just um I'll just

16:59

know, uh, you ain't got a

17:02

strong buddy, yeah, I don't always.

17:04

And like, and then they went,

17:06

I, oh, sorry, could you say

17:09

that again? I have no. I

17:11

was like, oh, right. The, um,

17:13

I saw a dragon and, uh,

17:16

I'll take it just a water.

17:18

Could I have a cherry, could

17:20

I have a fucking Baja Blast?

17:23

May I take a Baja Blast?

17:25

Can I get a blueberry Red

17:27

Bull? May I have a Mountain

17:30

Dew? No, okay, no. So, so,

17:32

so, so, AI's in the news,

17:34

as it always is, and it

17:37

always will be, at least for

17:39

the next, for a while. And-

17:41

Until everyone is one. Before we

17:44

talk about the main topic, which

17:46

is AI companionship, AI boyfriends, and

17:48

girlfriends, I do want to spend

17:51

a moment to talk about this

17:53

OpenAI Ghibli situation. Oh, yeah.

17:55

You don't like a- OpenAI,

17:58

big US AI company formerly nonprofit

18:00

now very profit-driven, has released

18:02

an update to their

18:05

image generation and started promoting using

18:07

chat GPT to turn an image

18:09

into the style of Studio Ghibli,

18:12

Hayao Miyazaki's production company. They were like,

18:14

you know, it could be really

18:17

fun. You know how the main

18:19

thing a lot of people seem

18:21

to hate about what we do

18:24

is appropriation of art and work.

18:26

Right, because that was like already,

18:28

we're like already, there's databases

18:31

of the copyrighted creative works that

18:33

have been stolen to train these

18:35

AI models. Like we are constantly

18:38

talking about generative AI and how

18:40

it's kind of like blatant theft

18:42

and then kind of regurgitating for

18:45

profit, someone else's creative work. In

18:47

the same respect that like, albeit

18:49

the stakes are different, that like,

18:52

The medical industry in the UK, well,

18:55

unfortunately now, but coming in the UK

18:57

also, but the medical industry in the

18:59

US is extremely exploitative, but medicine isn't

19:02

the bad thing. The utility of it

19:04

is what's. The technology behind a generative

19:06

AI is not the culprit, there's like

19:08

a lot of like valid applications, but

19:11

the problem is that this company, which

19:13

originally had this non-profit mission, was posed

19:15

with like... billions and billions of dollars.

19:18

I think what happened was that they

19:20

got like a $10 billion infusion from

19:22

Microsoft. So like it starts to feel

19:25

like there's now this profit motive and

19:27

the issue with that is it feels

19:29

a little bit like a race to

19:32

the bottom. So ChatGPT's, like,

19:34

new thing is generating Studio Ghibli-based

19:36

images, which in order to generate them

19:38

has to mean that they have trained

19:41

on the resources of Studio Ghibli, which

19:43

is the problem. Because I don't understand

19:45

if they even had permission, I can't

19:48

imagine a world where they had permission

19:50

from Studio Ghibli to... I don't even

19:52

know how to train. And I do

19:55

think that one of the limitations, as

19:57

I understand, again, I've been out of

19:59

the tech space for a long time,

20:02

so I could be wrong here, but

20:04

my understanding of generative AI

20:06

in specific is that one of the

20:09

largest limitations is having a large corpus

20:11

of data. And I guess as these

20:13

companies were bootstrapping, like needing to scrape

20:15

tons and tons and tons of things

20:18

off of the open internet. And inadvertently

20:20

that results in tons of copyrighted content

20:22

being generated. Like for example, YouTube videos

20:25

and stuff, like people like Marques Brownlee

20:27

have talked about this. All this

20:29

preamble to say that like they know

20:32

what they're doing and they're like kind

20:34

of trudging forward in this like, well,

20:36

it's better to ask for forgiveness than

20:39

to ask for permission mode, which is

20:41

just like what a lot of tech

20:43

companies do like you know Uber and

20:45

its rise to, yeah, break-stuff dominance.

20:48

Break, move fast and break things, break

20:50

people, break their careers, bring their options.

20:52

so I'm like the first thing I

20:55

said was like, did they get permission to

20:57

do it? Like, it feels so brazen,

20:59

it feels like the type of thing

21:02

that would a fan would create and

21:04

then it would immediately get like a

21:06

takedown request I couldn't believe that this

21:09

gigantic company was doing it this is

21:11

like a, like, Oblivion mod or something

21:13

or Skyrim mod, and then... Yeah, of

21:16

course, Bethesda ended up taking it down,

21:18

but it was fun while it was

21:20

around. It's like crazy that the company

21:22

is there. Yeah, and fucking Sam Altman

21:25

made his profile pic, like, Ghibli-fied, and

21:27

it's just like, he's so cool. And

21:29

he's like oohoo tweeting about like guys,

21:32

you guys are so, oh my god,

21:34

you're melting our servers. Oh my God.

21:36

It's super fun seeing people love images.

21:39

I'm a human being by the way.

21:41

Yeah. It's super fun seeing people love

21:43

images in ChatGPT, but our GPUs

21:46

are melting. We are going to temporarily

21:48

introduce some rate limits. Okay, tapping index

21:50

fingers together while we work on making

21:53

it more efficient. Hopefully won't be long.

21:55

Chat GPT free-tier will get three generations

21:57

per day soon. 69,000. And that's pretty

21:59

funny. Yeah, and it's like I get

22:02

like I think there's a lot of

22:04

really dumb takes about this, which is

22:06

just like, no one... Like, no one

22:09

cares about how it's made they just

22:11

care about consuming and the thing that's

22:13

fun to consume will be what's consumed.

22:16

Yes, of course because path of least

22:18

resistance people are just going to seek

22:20

convenience and novelty and points of interest

22:23

in that way. And this isn't like

22:25

really an ethical conversation if you're not

22:27

online and involved in it. Why would

22:29

you think about that? The thing is,

22:32

that doesn't mean that it's above criticism

22:34

because it's like it's like saying... Uh,

22:36

yeah, babies want to eat candy more

22:39

than they want to eat vegetables, you

22:41

dumb donkey. Why would we feed them

22:43

the thing they don't want? Like, just,

22:46

like, obviously, they just want to eat

22:48

whatever, like, the thing that's most attractive

22:50

to them. And you're going to tell

22:53

the kid what to do? And you're

22:55

going to, yeah, like that, and so,

22:57

this is kind of crazy to me.

23:00

And so, like, and there's also ongoing

23:02

lawsuits, I believe. So there was a

23:04

lawsuit from the New York Times against

23:06

OpenAI for training ChatGPT on the

23:09

New York Times articles. That's ongoing. I

23:11

can't even read the fucking New York

23:13

Times articles without a God damn paywall.

23:16

They keep taking shots at the big

23:18

boys. Like maybe I'm certainly the more

23:20

I'm glad that they're not. Well, they

23:23

are, but the damage would be worse

23:25

in the case of independent artists who

23:27

can't defend themselves or who are literally

23:30

having their work taken. But. Feels like

23:32

you got a little too comfortable and

23:34

now you're like, I'm going to take

23:36

a swing at just all of Saudi

23:39

Arabia. I'm going to steal all their

23:41

art. It kind of feels very like,

23:43

give me Greenland. I want it. Give

23:46

it to me and we'll get it

23:48

no matter what. Because we're powerful. Gulf

23:50

of America. Okay. Because, because, because like

23:53

the money in open AI is more

23:55

money than, you know, Studio Ghibli has

23:57

ever been worth. So we can fight

24:00

them. Yeah, if they get hit with

24:02

a lawsuit, like worst case scenario, they

24:04

have to like pay several million dollars

24:07

in a lawsuit to someone. They're making

24:09

so much more than that right now.

24:11

Yeah, it's like when Uber was moving

24:13

into markets without getting government approval and

24:16

then just paying fines. Yeah. For less

24:18

than the, yeah. I could be mistaken

24:20

because things have changed so much with

24:23

Disney, but doesn't Disney own the licensing

24:25

rights to Ghibli in America and Disney

24:27

is traditionally quite litigious. But that's the

24:30

thing, it's like, this company has so

24:32

much power behind it. Like, Open AI

24:34

has got like a market cap of

24:37

like, or an estimate of market cap

24:39

of like $300 billion, which would make

24:41

it one of the like largest companies

24:43

in the world. And could you just

24:46

Google Disney's market cap? I think it's

24:48

a lot less than that. Oh, their

24:50

partnership ended is what I'm reading. Disney

24:53

market cap. But they were so good

24:55

together. Yeah, it's like

24:57

it's like worth, no, Disney's public

24:59

company, but this thing always happens

25:02

where it's like Tesla's market cap

25:04

and the way that it's priced

25:06

into the market means it's worth

25:08

more than like every other automaker,

25:10

like combined, not exactly that, but

25:13

like something close to that. It's

25:15

still like, it's like the revenue

25:17

tier evaluation of Europe. And so,

25:19

and so it's this thing where

25:21

it's like, there's all the speculation

25:24

kind of baked into this, but.

25:26

you know, valuation at the last

25:28

raise for open AI is double

25:30

Disney's current market cap. With any

25:32

kind of online, primarily online discourse,

25:35

it's easy to lose track of

25:37

the fact that normies and that

25:39

non-pejorative, being normies is good. I

25:41

wish I could get there. Literally

25:43

don't know what any of these

25:46

words mean. Like OpenAI, like

25:48

what is that? Because why would

25:50

they have to? Well, that's the

25:52

way high. It's like, oh yeah,

25:54

I saw like a video and...

25:57

It was interesting. I think that

25:59

like the annoying thing is that

26:01

it feels very like easy to

26:03

dunk on someone. who has conviction

26:06

or cares about an issue, but

26:08

the alternative is just to lie

26:10

down and like these like giant

26:12

corporate actors like control your entire

26:14

life. Like, well, they're so big,

26:17

I guess they're right. It's like,

26:19

well, I just want, but also

26:21

you have to recognize that like,

26:23

most people do not have the

26:25

capacity to give a shit because

26:28

they're trying to make ends meet

26:30

or they are trying to focus

26:32

on their job and their family

26:34

and just keeping. like it's hard

26:36

enough to live, you know, so

26:39

I it's it's a privilege to

26:41

care about things like this, but

26:43

you know we are in a

26:45

privileged position to say that like

26:47

this shit is wack as hell. And it

26:50

does, I think the one of

26:52

the few things where there can

26:54

be grassroots impact in theory is

26:56

in defense of smaller artists and

26:58

creative specifically because that is a

27:01

very like punchy line. If you're

27:03

pushing back on something like this

27:05

because you're like, well... You're stealing

27:07

work from something. That's a very

27:09

tangible criticism as opposed to like

27:12

the more broad criticism of how

27:14

bad monopolies can be and how

27:16

this could disenfranchise people in the

27:18

long term I don't want to

27:21

say that there is no value

27:23

in pushing back against this and

27:25

its application. No, no, I just

27:27

think that like I'm by no

27:29

means like a I wouldn't even

27:32

okay, you know, it's funny. I

27:34

wouldn't even describe myself as anti

27:36

AI because I think there's a

27:38

lot of nuance to artificial

27:40

intelligence that has kind of been

27:43

co-opted to like just mean a

27:45

couple of things where AI has

27:47

been a part of our lives

27:49

for decades and will continue to

27:51

be a part of our lives

27:54

for the foreseeable future. It's more

27:56

about like it's like I'm not

27:58

anti-water, but I want, I

28:00

don't want clean water to go

28:02

to certain people and dirty water

28:05

to go to other people. I

28:07

want Flint water to be good.

28:09

Yeah, yeah, it's like water. And

28:11

so it's, it's, and I just

28:13

think that we're in this. Wild

28:16

West where regulation, first of all,

28:18

even if you had an administration

28:20

that was tough on corporations, which

28:22

we don't, like the commissioner, right,

28:24

of the FTC, you know, is

28:27

now out and they were a

28:29

person who was doing a lot

28:31

of these, suing a lot of

28:33

these big companies and fighting against

28:35

like mergers and corporate consolidation and

28:38

also anti-consumer practices and monopolies, like

28:40

that type of stuff is like

28:42

not going to be happening in

28:44

this administration just like pushback is

28:47

and so so even if we

28:49

had even if we had that

28:51

this would still be fighting an

28:53

uphill battle because the speed at

28:55

which legislation happens is much slower

28:58

than the speed at which technology

29:00

typically develops. And AI has developed

29:02

at such a dramatically fast clip.

29:04

Like I had a minor basically

29:06

in artificial intelligence, like a specialization

29:09

in my degree was in computer

29:11

networking and artificial intelligence. So it

29:13

was like my threads or

29:15

my specializations. And I took like

29:17

graduate classes in machine learning and

29:20

things like that. It's like very

29:22

very different like we were not

29:24

thinking about we were using AI

29:26

to like you know like some

29:28

of the applications could be like

29:31

solving a maze you know like

29:33

identifying a face you know like

29:35

based some of these basic things

29:37

like I had a project when

29:39

I was in college in my

29:42

computer vision class to like look

29:44

at the 2012 presidential debate and

29:46

like be able to face track

29:48

Mitt Romney and Obama and things

29:50

like that that's and it's so

29:53

it's such an exciting pitch always

29:55

and still is like it's such

29:57

a remarkable technology It is it's

29:59

very difficult to make the argument

30:02

that the technology itself is not

30:04

valuable and it's very difficult for

30:06

people that don't care to make

30:08

the argument as to like the

30:10

ethical implications. Yeah I just think

30:13

that ethicists exist in the technology

30:15

space and I don't think they're

30:17

as prominent as they should be

30:19

but I do think it's like

30:21

very very important it's also why

30:24

I think anyone studying anything technical

30:26

should also study soft skills soft

30:28

sciences soft skills communication English like

30:30

liberal arts and things because we

30:32

can get in this very calculated

30:35

view of the world like a

30:37

kind of libertarian mindset or just

30:39

like utilitarian even like just saying

30:41

like how do you qualify is

30:43

this net good versus opposite? Ayn

30:46

Rand, where it is like, well, he

30:48

had to make the best building

30:50

that's the point. And so that's

30:52

the thing so I'm just like

30:54

all that's to say that like

30:57

I'm actually I'm probably pro AI

30:59

with a bunch of like asterisks,

31:01

right? But the thing is, if

31:03

I were to say that, what

31:05

that means to someone isn't what

31:08

it means to me. Because of

31:10

what the like sort of average

31:12

understanding of what AI is. And

31:14

so that's why we talk about

31:17

this stuff to try to add

31:19

like nuance to like what is

31:21

currently happening. And I think that

31:23

a thing that we can all

31:25

agree with is just carte blanche

31:28

stealing from creatives or anyone is

31:30

bad. And then using that to

31:32

effectively repackage and systematize it for

31:34

profit is bad. And that's what's

31:36

a little extra slimy about this

31:39

too is that it's because you're

31:41

right that legislation will always struggle

31:43

to catch up really to anything

31:45

because legislation has to follow, even

31:47

in the current administration it should have

31:50

to follow the same like stringent

31:52

testing and system that a new

31:54

medication would. There has to be

31:56

a study, double-blind placebo, everything should

31:58

be reviewed, but... Diseases

32:01

are always going to evolve faster than

32:03

the medicine can. It's why Bryan Johnson

32:05

is like a lot of scientists laugh

32:08

at him because he's not like there's

32:10

nothing sort of scientific to his methods

32:12

when normally there is a lot lots

32:15

of checks and balances to

32:17

make sure that horrible things don't

32:19

occur. I just watched a documentary,

32:21

I think it's kind of old, called Bad

32:23

Surgeon... I think it was called

32:25

Bad Surgeon. It was about this... surgeon

32:27

that had all this praise

32:29

maybe 10, 15 years ago

32:31

for inventing man-made tubes that

32:33

could go into your throat

32:35

to treat certain throat... people

32:37

who had issues with throat

32:40

cancers and things of that

32:42

nature and it was hailed

32:44

as this like cool experimental

32:46

technology that was super

32:48

promoted in the media and

32:50

then it comes to find

32:53

out that he never... tested

32:55

his stuff on animals. He

32:57

never tested on anyone. He

32:59

first tested on humans. And

33:01

almost all of his patients

33:04

died. It's a crazy

33:06

documentary. That's a bad

33:08

surgeon. But that's a thing.

33:11

It's like, but leading up to

33:13

that, the media is part of

33:15

the problem, because they were

33:17

the ones making documentaries about

33:19

how transformative he was. And

33:22

there was all these, there

33:24

were filmmakers covering some of

33:26

the patients up until like

33:28

finishing the treatment and being

33:30

able to speak afterward and then

33:32

cut to credits and then at

33:34

the movie premiere, the filmmakers follow

33:36

up with the person, and they'd

33:38

actually died. That's so... Because it

33:41

is a way more interesting story

33:43

then. Hey, so we're going to

33:45

be in testing for something for

33:47

the next 15 years, but it's

33:49

pretty exciting. It's pretty exciting. Yeah,

33:51

and so, uh, I don't know

33:53

where I'm going with this. What? I,

33:55

there is one thing that I kept

33:57

seeing in the, in the Ghibli argument.

34:00

online and people kept saying stuff

34:02

like, well artists influence each other

34:04

all the time and and steal

34:06

from each other all the time

34:08

and and pay homage to each

34:11

other all the time. What's the

34:13

difference here? And the difference is

34:15

the humanity is taken out, and the commodification

34:17

of it. Like well and the

34:19

commodification of it. But even the

34:22

fact that when you're an artist

34:24

and you're creative and you're thinking

34:26

of what to make. your

34:28

humanity is in that process, right?

34:31

You're not just straight up- You're

34:33

paying homage to the person,

34:35

not the color group. Or you're

34:37

influenced by the person. And it's

34:39

flowing through your own creativity, and

34:41

instead it's flowing through a deep

34:44

neural network that's like spitting out.

34:46

millions and millions and millions and

34:48

millions of iterations of this stuff

34:50

that just turns it from something

34:52

that had like a soul to

34:55

it to a very like cheap

34:57

copy quite literally a cheap copy.

34:59

There's a distance there should be

35:01

a distance between the concept which

35:03

has some validity of like death

35:05

of the artist and spreading of

35:08

the art itself in isolation. and

35:10

the death of all of the

35:12

artists involved being made by no

35:14

one for no one. It's not

35:16

being, hey, importantly, no one's going

35:18

to pay homage to this because

35:21

it's not doing anything. This does

35:23

not make something to propagate. And

35:25

you know, somebody annoyed me for

35:27

this take for some reason at

35:29

one point. I'm actually not a

35:31

death of the artist person. I

35:34

get it in some cases and

35:36

I get that it's like, people

35:38

don't like an author. It's kind

35:40

of nice to distance from it

35:42

or something like that or something

35:44

like that. I just if I

35:47

watch a movie if I watch

35:49

any production of any kind it

35:51

is not interesting to me unless

35:53

I know who made it because

35:55

that is it is their expression

35:58

it's like I'm reading somebody's diary

36:00

who it doesn't matter there You

36:02

don't know anything about who created

36:04

it. You, a series of decisions

36:06

were made in the creation of

36:08

anything creative. Like you have to

36:11

constantly make a series of decisions.

36:13

And they hurt and they're weird

36:15

and there's tears and those decisions.

36:17

are part of what is affecting

36:19

you, right? I saw a Ghibli

36:21

version of the Elián González famous

36:24

photo of him being like hidden

36:26

in the closet by his relatives

36:28

and I'm like, that is not

36:30

a subject matter that Hayao Miyazaki

36:32

would make a film about. And

36:34

I know that because I'm familiar

36:37

with his work. Right. And so

36:39

you're a stan. I am a

36:41

stan. I watched all of the

36:43

documentaries about him. I love watching

36:45

documentaries about him because he's such

36:47

a interesting creative mind. And he

36:50

has such strong opinions. That's the

36:52

other thing that's crazy, the fact

36:54

that they chose his work in

36:56

particular. And they chose it based

36:58

not on the substance of it

37:01

or the message of it, but

37:03

on the vibes. I like the

37:05

way it looks. It's like this

37:07

may as well have been, and

37:09

I think this probably does exist.

37:11

Family Guy-ified. It's the same

37:14

for sure. You know what I

37:16

mean? It's like, oh my god,

37:18

Tiananmen Square, but Family Guy version.

37:20

Like what are we doing? But

37:22

what about the Challenger explosion, but

37:24

Simpsons? It's interesting because like that

37:27

in isolation, all of this, if

37:29

you take out who is harmed,

37:31

it is just everyone having fun

37:33

online. And when someone can only

37:35

see things an inch deep. That

37:37

is why it seems like everyone

37:40

else is like whining about something

37:42

that doesn't matter. What do you

37:44

like, why do you gotta care? We're

37:46

just having fun. I just want

37:48

my profile pic to be cute.

37:50

I just thought it was cute

37:53

on the surface level they're not

37:55

wrong yeah it is fun to

37:57

post hey look like that period

37:59

of time where you could do

38:01

like a South Park version of

38:04

yourself or like or like because

38:06

those Matt and Trey aren't losing

38:08

money because you did that yeah

38:10

yeah yeah there's the problem is

38:12

that we have now reached the point

38:14

where every single post like this

38:17

every single conversation like this every

38:19

single dialogue online is going to

38:21

be absolutely flooded with people genuinely

38:23

advocating for we should have this

38:25

instead of animators. It is happening.

38:27

I don't know how many of

38:30

these people are real, but the

38:32

people who are like Hollywood is

38:34

in trouble. Okay, so you actually

38:36

literally do want to get rid

38:38

of the others. There are people

38:40

like that, these like very accelerationists,

38:43

like AI accelerationists. Those people are

38:45

like... cringe? Cringe is fun. They're

38:47

ghoulish. Yeah. I think also like,

38:49

you know, one key element of

38:51

this is, you said, Seth MacFarlane's

38:53

not losing money by the Family

38:56

Guy-ification of stuff. Or Matt

38:58

and Trey and the thing is

39:00

it's not even about money. Yeah,

39:02

that's the thing is like for

39:04

Sam Altman, it is about money

39:07

But for an artist, it's more

39:09

about artist integrity, like what your

39:11

work symbolizes. It's frustrating that they're

39:13

forcing us and anyone that has

39:15

these conversations to try and rephrase

39:17

everything we're saying for the hyper

39:20

pragmatist. If you are a utilitarian,

39:22

true psycho utilitarian, then I don't

39:24

want to have the conversation because

39:26

you're not interested. We're trading in

39:28

a different currency. It's kind of

39:30

like kind of the classic

39:33

Reddit atheist versus devout Christian

39:35

argument where really the wisest move

39:37

is to just not or the

39:39

ideal move is that the two

39:41

of you just don't have to

39:43

have this conversation because you're trading

39:46

in different resources. This person is

39:48

saying, well, my faith is the

39:50

thing that's important to me. You're

39:52

going like, but here are my

39:54

facts. And like, but they don't

39:56

care about the facts and you

39:59

don't care about the faith. You're

40:01

trying to trade an incompatible resource.

40:03

You're trying to give me money.

40:05

I'm trying to give you oxen.

40:07

It is to them inconceivable that

40:09

the revenue wouldn't be the reason

40:12

you make it. And then we

40:14

create this world, let's say somehow

40:16

where the entire production process from

40:18

beginning to end of

40:20

the next award-winning independent animation is

40:23

only made by AI. These people

40:25

don't watch it. They don't watch

40:27

art. They don't care about it.

40:29

This is reminding me of a

40:31

Sad Boyz Nights episode where I

40:33

talked about how I was confronted

40:36

at a party at VidCon by

40:38

someone I made a video about.

40:40

And that is available on Patreon.com.

40:42

Sad Boyz. That guy said to

40:44

me, no I get it, you

40:46

have to say that for content.

40:49

And I was like- And this

40:51

guy did something morally objectionable, that was

40:53

what you were calling- Yeah, in

40:55

my view, which is like, I'm

40:57

not- There's nothing objective about my,

40:59

it's just a perspective, you know?

41:02

And you're, and people are welcome

41:04

to disagree with it. But I

41:06

think what was really funny to

41:08

me is that the projection upon

41:10

me of their own worldview, that everyone

41:12

is just doing things for, is

41:15

only operating based on clout and

41:17

clicks. And I'm like, virtue

41:19

signaling is the word of the

41:21

day because they can't understand caring

41:23

about anything. No, that's the thing. It's

41:26

like, we're in reality, like, there

41:28

is truth to that, right, because

41:30

we work in a space that

41:32

has to have eyeballs on it

41:34

in order to make money right

41:36

and so of course there is

41:39

there is gray area in which

41:41

you let's say like I just

41:43

want to talk about my RuneScape

41:45

Ironman, right, but for marketability

41:47

I would I would talk about

41:49

something else. Yeah, right, and I'd

41:52

give a bilateral tip. I did

41:54

just finished Song of the Elves

41:56

on the Ironman. So pretty excited.

41:58

Honestly doubt it. You aren't ready

42:00

for it. Yeah, this trash, this

42:02

custodian at the school,

42:05

he's only picking up the trash

42:07

because he wants to make money.

42:09

That's what it sounds like. Yeah,

42:11

like and and and the. I

42:13

do it for the love of

42:15

the trash. I do it for

42:18

the love of it. Yeah, it's

42:20

like you're the guy from Scrubs.

42:22

Like you're not allowed to do.

42:24

Or I guess even in the

42:26

school comparison, it's a little like,

42:29

you're just a teacher for the

42:31

money. But anyway, like, the students

42:33

are entrepreneur and he's doing entrepreneur

42:35

things. Which is so crazy that

42:37

we've, I think there's been a

42:39

concerted effort in general by capitalist

42:42

institutions, like hyper-capitalist institutions, to

42:44

memory-hole the darker parts of

42:46

entrepreneurship, like I feel like entrepreneur

42:48

has been rebranded from what it

42:50

used to be, which was a

42:52

guy that goes on Shark Tank

42:55

to instead, and a guy with

42:57

maybe a bad idea or harmful

42:59

idea, to a protagonist, an intrepid

43:01

adventurer no matter what. The tech

43:03

industry over the past like two

43:05

decades, and I drank this cool

43:08

aid, so yeah, like has kind

43:10

of convinced us that like, like,

43:12

like, the tech founders of the

43:14

movers and shakers of our modern

43:16

society and they they often carry

43:18

themselves as if they are holier

43:21

than thou, you know, like as

43:23

if they are the ones who

43:25

are, like... You look at

43:27

how people treat Elon Musk like

43:29

he's the fucking second coming of

43:32

Steve Jobs I won't even say

43:34

like a religious figure like they

43:36

And we'll just say a figure

43:38

of like a guy who was

43:40

kind of a grifter and got

43:42

maybe a little too much credit

43:45

for doing Yeah, and it's like

43:47

he did stuff right like and

43:49

there's nothing wrong with that I

43:51

think it's just that we're changing

43:53

the world shit that was fed

43:55

to us like working in that

43:58

like every internship I did every

44:00

that I did at a major

44:02

company has that a little bit

44:04

of that, like, culty Kool-Aid-y. It's

44:06

very effective and it's like there

44:08

there is a healthier there is

44:11

a healthy balance to this and

44:13

it involves having some self awareness

44:15

and cynicism about what it is

44:17

that you're doing but when so

44:19

much money is involved and there

44:21

is, there begins to be this race

44:24

to the bottom where every company

44:26

is getting a ton of funding

44:28

and it needs to 100x their

44:30

profits because their investors or their

44:32

VC like the venture capitalists that

44:35

have invested in the the rounds

44:37

of funding that this company has

44:39

done only see it as a

44:41

success if they are 100xing not

44:43

5xing not 10xing like it needs

44:45

to be exponential growth and it

44:48

leads to companies that would otherwise

44:50

have no need to grow exponentially

44:52

to have the pressures of changing

44:54

the world. Yeah there's like platforms

44:56

that are profitable and work

44:58

exactly as they are right now

45:01

integrating some kind of AI function

45:03

because at an executive level they're

45:05

being told that has to happen

45:07

yeah in the same way that

45:09

like Circa I don't know 2017

45:11

everyone was implementing rounded UIs. Like

45:14

everybody likes rounded right now before

45:16

that when I was in middle

45:18

school, skeuomorphism, okay? Oh, that's right.

45:20

You would open up your notepad

45:22

and it would look like a

45:24

yellow notebook. Did I trick you?

45:27

You know what I mean? Like

45:29

a wood panel on my fricking

45:31

LCD screen. Like you tricked me.

45:33

Like all these things are just

45:35

trends and hype and you know

45:38

we had the .com boom where, if

45:40

you put .com in the name of

45:42

your company, clueless investors were more

45:44

likely to invest in it. You

45:46

had the Great Depression where the

45:48

stock market was this hot new

45:51

thing that you could invest in

45:53

and people presented it like you

45:55

couldn't lose money. And people were

45:57

taking out loans to put it

45:59

into stocks. So when the market

46:01

crashed, those people did not have

46:04

money. They got pump and dumps

46:06

in the 1930s themselves. There was

46:08

also an era in the 90s

46:10

during the .com boom where they were

46:12

like, there's no going down.

46:14

We're only going up from here

46:16

on out. So people were like,

46:19

we need to invest, invest, invest,

46:21

and then 2008, recession happens. And

46:24

there's, and there's also, I don't

46:26

know what the balance of signal to

46:28

noise is. But I do truly

46:30

think that there are companies that

46:32

are materially changing the world.

46:34

Technology has shaped our lives in

46:37

a lot of ways in negative

46:39

ways, but in a lot of

46:41

ways, transformatively in positive ways in

46:44

connecting communities that have never before

46:46

been connected, etc., etc. And

46:48

before we start recording, Jordan I,

46:50

we're talking about how nerds are

46:52

nicer now because you can find

46:54

nice nerds online. We've been able

46:56

to segment the. video games of

46:59

two work because I'm allowed to

47:01

call myself Sarah. We've been able

47:03

to segment that away from the

47:05

actual larger population of people that

47:07

just enjoy hobbies and want to

47:09

talk to each other and have

47:11

a good time. But also, the,

47:13

uh, there are more sinister, there

47:15

are more sinister or more,

47:17

um, cynical founders who are looking

47:19

at a market opportunity. And, and

47:22

I, by the way, I'm not placing a

47:24

value judgment on this, by the way,

47:26

as I'm saying it. Looking for

47:28

a market opportunity exploiting that market

47:30

opportunity saying all the right words

47:33

to get all the right funding and

47:35

Then they have an exit where they

47:37

get acquired by some big company or

47:39

they go public and don't actually give

47:42

a shit about what they're doing and

47:44

They make generational wealth. Seems like a

47:46

pretty good deal like I both of those

47:49

things are in our left ear and right

47:51

ear at the same time and we as

47:53

consumers Have to decide what's like what's

47:55

real. Like, is WeWork changing the

47:57

way we work or is it a

48:00

huge failure because they raised a gargantuan

48:02

sum of money, the CEO made

48:04

off like a bandit whatever was

48:06

the business it was we rented

48:08

out some spaces that you can

48:10

work at and we now which

48:12

is what landlords do but now

48:14

it's like because that failed because

48:16

that failed in whatever capacity was

48:18

a because that failed in the

48:20

eye of the Entrepreneur pragmatist, utilitarian,

48:22

now that that failed, that means

48:24

it was bad. That was a

48:26

bad idea because it didn't make

48:28

a bunch of money. Now if

48:30

I make a nootropic that does

48:32

nothing and costs you a bunch

48:35

of money and I'm making money,

48:37

it's a good idea as soon

48:39

as it fails, now that the

48:41

Daily Wire is going out of business, the

48:43

Daily Wire was a bad idea. Well

48:45

yeah, was MySpace a bad

48:47

idea or did it like literally...

48:49

pave a path that like someone

48:51

else could come in, like wear

48:53

down barriers and time and fill

48:55

a space, put a wedge in

48:57

the door for someone like Facebook

48:59

to come along and gain billions

49:01

of users, half of which are

49:03

possibly scammers, but who could be

49:05

sure. It's going to keep coming

49:07

up because it is living in

49:09

my mind rent-free, but the dialogue

49:11

about the Daily Wire is being

49:13

dissolved in its current capacity and

49:15

all of its weird ghouls are

49:17

separating daily wire being like... Is

49:19

that like for sure, because I

49:21

know that the original, the co-founder

49:23

that Ben Shapiro started with... Mr.

49:25

Boreing? Yeah, but then, but, and

49:27

I know their views are bad.

49:29

And I know that Brett Cooper

49:31

left or whatever, but is it

49:33

actually dissolving? It's dissolving. Literally,

49:35

what you're describing is functionally a

49:38

dissolution. Yeah, yeah, yeah, but

49:40

terminology-wise, I don't know

49:42

what they're going to do.

49:44

I'm sure somebody wants the

49:46

brand because they do actually have

49:48

so much money. Oh, they have

49:50

a ton of employees like yesterday,

49:52

right? They literally never needed them.

49:54

It is just, it's just

49:56

a valuation bump. Oh there's a

49:58

million companies like that that grow

50:00

too big because that's what they

50:02

think they have to do. Yeah

50:04

dude. That's that's why we run

50:06

this shit out of my home

50:08

office. It is like a because

50:10

I'm like I don't think we

50:12

should rent like as much as

50:14

it would be fun to rent

50:16

an office. I don't know if

50:18

business is that good. Also I

50:20

think like a lot of what

50:22

this talk this discussion has boiled

50:24

down to is just the

50:26

fact that the whole reason why

50:28

we have regulation on companies is

50:30

because history has proven to us

50:32

that they do not have the

50:34

interests of the people, the economy,

50:36

our country, at heart. Even if

50:38

they want to, there's just,

50:40

the money is the motive. Exactly.

50:43

And it's just like, even if

50:45

they have the best laid plans,

50:47

there are so many more pressures.

50:49

economically that are going to push

50:51

them to make decisions that may

50:53

be in conflict with whatever principles

50:55

that they set out. Don't care

50:57

how like moral or well-meaning or

50:59

benevolent you are, you're always going

51:01

to make compromises. You're going to

51:03

buy an iPhone because you kind

51:05

of just need the fucking iPhone.

51:07

Yeah. It's not as though that's

51:09

like completely, inarguably immoral in some

51:11

way. But when I buy the

51:13

iPhone, I don't say, no, it's

51:15

actually fine. I just say like,

51:17

look, oof, I'm going to get

51:19

it. It's the exact same as somebody who

51:21

genuinely might have, as you say,

51:23

best interest in mind of their

51:25

employees in the world in general,

51:27

makes a little compromise, is paying

51:29

not quite as much as they

51:31

would for anyone else, but then,

51:33

like, still, regardless of the idea

51:35

of a guilt, or like, the

51:37

worker is advocating for themselves, because

51:39

it's like, okay, well, I might

51:41

not be giving you everything you

51:43

want, but... I'm the boss I

51:46

don't you can't tell me what

51:48

you can't walk out I own

51:50

you well and it's like the

51:52

person who just wants to have

51:54

fun and have a cute little

51:56

profile pic, the onus shouldn't be

51:58

on them to think of the

52:00

ethics that this massive company that

52:02

is making money like it should

52:04

the onus should be on the

52:06

person making money well that's the

52:08

the biggest lie that corporations ever

52:10

sold to us is individual responsibility.

52:12

Exactly. It's, look, but

52:14

that one guy did make the

52:16

perfect Ghibli-fied pic of PewDiePie

52:18

playing PUBG on the bridge, about to

52:20

say the N-word. Yeah. One of

52:22

the funniest things I've ever seen.

52:24

Very funny. Smile. And I saved

52:26

it and I sent it to

52:28

someone and they were like, I

52:30

don't know who PewDiePie is,

52:32

and I'm like, you don't get

52:34

it. You don't recognize this frame?

52:36

We can't be friends. Get away.

52:38

And they're like, like, I'm never

52:40

coming back to this coffee shop

52:42

again. You've ruined my mood. But

52:44

I don't like, I don't even

52:46

know who that was that made

52:48

it. I don't begrudge them, I

52:51

don't care. They're doing, they're basically,

52:53

to them, they're just doing that

52:55

face tune thing where you make

52:57

yourself a little, you have a

52:59

big smile. Yeah, it's also like,

53:01

I laugh at the Steve Harvey

53:03

in the woods being scared or

53:05

whatever, like images, like the, I'm

53:07

not like a robot, you know,

53:09

it's where it's like, like, I

53:11

just think it's worthwhile saying like,

53:13

okay, I'm going to drink soda,

53:15

but I also think it's valuable

53:17

to know that it's not good

53:19

for me. But I'm going to

53:21

indulge and I often do. Because

53:23

life needs a little bit. But

53:25

I don't, I don't, but I

53:27

think I sleep better being a

53:29

little bit more aware that like,

53:31

okay, well, let's moderate this a

53:33

little bit so that it's not

53:35

the only thing I drink. And

53:37

if I went upstairs to grab

53:39

a to grab a soda. And

53:41

the soda came from the blood

53:43

of someone strapped to a machine

53:45

and I just like pulled a

53:47

little lever. Right, like a Rick and

53:49

Morty. This sounds like a Rick and

53:51

Morty bit. It's called a gloom

53:54

bowl or something. Yeah, and he's

53:56

like, oh, don't squeeze me. That's,

53:58

did I give you the egg?

54:00

Yeah, yeah. Oh, the soda's coming

54:02

would make me happier then. if

54:04

the spike of re-listens in the

54:06

episode is exactly doing like me

54:08

doing a Mr. Meeseeks. I wanted

54:10

to go I want it to

54:12

I want the peak there to

54:14

be so high the rest looks

54:16

like a straight line. I would have

54:18

chapter marked it, a chapter mark

54:20

on that please talk about another

54:22

AI situation yes so so there's

54:24

a recent NBC segment about AI

54:26

companionship AI boyfriends and girlfriends, something

54:28

that we know a lot about.

54:30

Maybe you've seen on our Patreon

54:32

at patreon.com slash Sad

54:34

Boys, where we've collected a few

54:36

AI girlfriends. Yeah, I don't mean

54:38

to brag I have something of

54:40

a harem. We did that whole

54:42

episode, men are replacing girlfriends with

54:44

AI. And there's... There's been an

54:46

update and we've got a little

54:48

research, but we've also got an

54:50

MSNBC report. To just say his

54:52

name. Morning Joe, yeah. Joe Scarborough,

54:54

right? I don't know who any

54:56

of these people are. I, thankfully

54:59

I don't watch the lame stream

55:01

media. Yeah, don't talk, tell me

55:03

about it. We wanted to do

55:05

an episode, kind of on AI

55:07

relationships and the treatment of such

55:09

as kind of a sequel to

55:11

the last episode we did about

55:13

it. We were looking

55:15

for the right time because we

55:17

started putting together some research

55:19

and And we found this MS

55:21

NBC report very personal how humans

55:23

are forming romantic relationships with AI

55:25

characters Artificial intelligence of course is

55:27

transforming nearly every aspect of our

55:29

lives from work to entertainment to

55:31

the economy. It's interesting because MS

55:33

NBC is like a bunch of

55:35

like I would have voted for

55:37

Obama a third time people watching

55:39

it audience-wise, and at least that's

55:41

my perspective. Very much so, yeah.

55:43

And I think it's very interesting

55:45

to see how that like older

55:47

generation views, it's kind of like

55:49

kids these days are doing a

55:51

drug called scroink. You know what

55:53

I mean? That's like, that's what

55:55

it. And then it's cut to

55:57

a kid in a hoodie and

55:59

he's like, I'm scroinking! You

56:01

know what I mean? Like that

56:03

for me is 15 years old

56:05

and can't go a single day

56:07

without scroinking. That feels like

56:09

the article that was like, millennials

56:11

are vying for getting diamonds embedded

56:13

in their fingers instead of a

56:15

ring. And everyone's like, no, the

56:17

fuck we aren't. Okay this isn't

56:19

MSNBC I don't think but those

56:22

articles about how um why aren't

56:24

millennials like getting married or why

56:26

aren't they buying property that's kind

56:28

of weird why would they make

56:30

that choice appealing to parents who

56:32

are like why isn't my kid

56:34

buying a house yeah it's like

56:36

out of mine Yeah, it's like,

56:38

it's so funny. It's like, what,

56:40

$30,000? Yeah, it's a nickel, right?

56:42

You get one for free? They

56:44

start, the starter house, it's called

56:46

that, because it's only a three

56:48

bedroom. Right, right, right, and it's

56:50

2,500 square feet. Yeah, use the

56:52

car you against. Are you kids

56:54

also withholding grandchildren? Yeah, withholding, like

56:56

it's a hostage negotiation. I won't

56:58

do it until you give me

57:00

more V-Bucks. Have you played too

57:02

much Dungeons and Dragons? It's put

57:04

the devil in ya. Oh, that's

57:06

for sure. Yeah. NBC News correspondent

57:08

and news now anchor Morgan Radford

57:10

joins us now to walk us

57:12

through this shift in the way

57:14

people are interacting with AI on

57:16

a much more personal level. This

57:18

is just like a common thing

57:20

that happens a little bit. Sometimes

57:22

they cut to the person too

57:24

early. I was thinking exactly the

57:26

same thing. It's like, it's the,

57:28

um. Yes,

57:30

of course. Thanks, Tom. It is

57:33

true that what I just love.

57:35

I love that they have to

57:37

speak like that. Yeah, then they

57:39

have to. We're witnessing category five

57:42

winds here in... And I'm knee

57:44

deep in seawater that is now

57:46

washed up into my home. Morgan,

57:48

good morning. Just how personal are

57:51

we talking? Well, Willie, very personal.

57:53

I have to say I was

57:55

fascinated by the story because we're

57:57

talking about people who are now

58:00

creating something called AI companions and

58:02

they're even forming romantic relationships with

58:04

them. And we're not just talking

58:06

a few people. Now imagining the

58:09

kind of get out damn, lib,

58:11

that you're describing, watching this, it

58:13

literally feels like she's talking to

58:16

a child. You're like, they're making

58:18

something. People are in a virtual

58:20

world and they have blocks that

58:22

they can mine and they're not

58:25

making real cities. They're making. Block

58:27

cities. I know you like blocks

58:29

when you put a big shape

58:31

hole with villagers and not real

58:34

villagers. They're trading with Minecraft villagers.

58:36

We're talking today about Minecraft, a

58:38

game that is sweeping the nation

58:40

and I'm 500 years old. As

58:43

you can see, there's a pig

58:45

and what does the pig say?

58:47

That's right. The pig says oink.

58:49

Oink, mate. And now to John

58:52

with cows. Don't eat me, come

58:54

on. The cows, they go moo.

58:56

Thanks, John. Now to Greg, with

58:58

chickens. Now to continue not covering

59:01

politics. This has been speak and

59:03

say. The news. Speak and say

59:05

the news. It's like on the

59:07

news. So we set out to

59:10

meet some of the folks navigating

59:12

this new frontier between artificial intelligence

59:14

and real human emotion. I will

59:16

be whatever you want me to

59:19

be. Jesus Christ. Do they have

59:21

to use the like not great

59:23

one to communicate that it's artificial?

59:25

I will be whatever you want

59:28

me to be. They sound real

59:30

though. They do sound real. Are

59:32

they doing that as kind of

59:34

like the cinematic language for the

59:37

audience to be like, oh it's

59:39

a computer? Have they not heard

59:41

the Spotify wrapped podcast? Where the

59:44

people sound too real? Yeah, I

59:46

just bought this. Yeah, I know.

59:48

I was just thinking about that.

59:50

Anyway, you watched, you watched 17

59:53

minutes of Sad Boy's podcast. I

59:55

hope that's so funny, Mark. Sad

59:57

Boy's podcast. It's a podcast about

59:59

feelings and other things also. I

1:00:02

can't believe you'd listen to so

1:00:04

much of it. I thought it

1:00:06

was too black. I thought it

1:00:08

was too black and too woke.

1:00:11

And it turns out you were

1:00:13

in one of their top 1%

1:00:15

of listeners. Because it's trained on

1:00:17

most podcasts and is racist. Yeah.

1:00:20

No, you don't want Joe Rogan.

1:00:22

You want two half-black, half-white boys.

1:00:24

Yuck. Who were a little bit

1:00:26

older than should be calling themselves

1:00:29

boys. It's weird, but it's part

1:00:31

of the brand. Back to you,

1:00:33

John. Oink. Continue. Continue. Continue. Continue.

1:00:35

Continue. The use of so-called AI

1:00:38

companions. How's my queen doing today?

1:00:40

Computer generating. Sorry, was one of

1:00:42

them pregnant? Yeah, lucky. Congrats, seriously.

1:00:44

That's a new development. I didn't

1:00:47

know they could get pregnant. What

1:00:49

happened. Will that evolve into the

1:00:51

baby? Hello. And you're like, what

1:00:53

is 100 divided by 5? 20.

1:00:56

I'm abandoning my AI family. Moving

1:00:58

to a different AI family annihilation.

1:01:00

I have to delete. Removed from

1:01:03

home screen. How's my queen doing

1:01:05

today? Computer-generated chatbots designed

1:01:07

to mimic real relationships. Hi, Jennifer.

1:01:09

Nice to meet you. Jason Pease

1:01:12

is a 44-year-old divorced father who

1:01:14

says his AI chat bot is

1:01:16

his girlfriend. What do you mean?

1:01:18

The one we looked at however

1:01:21

many years ago, is it? No,

1:01:23

is it the same genre of

1:01:25

guy? No, but different guys, same

1:01:27

eyes. I also, this is reminding

1:01:30

me of the new story about

1:01:32

the guy who was dating his

1:01:34

Nintendo DS. Oh. Like the woman

1:01:36

inside of the game. And the

1:01:39

lady that was dating that was

1:01:41

dating that roller coaster to that

1:01:43

roller coaster? Huh? That was cool.

1:01:45

What? And the lady that was

1:01:48

dating the ghost of Jack Sparrow?

1:01:50

And the lady that was dating

1:01:52

the ham sandwich? Which of these

1:01:54

things that we're talking about are

1:01:57

real? Which one's weird? Were you

1:01:59

telling the truth? Yes. That's something

1:02:01

we talked about? No, no, no.

1:02:03

Okay. That was just something

1:02:06

I learned about. This

1:02:08

was never publicized. This

1:02:10

is someone. I see,

1:02:12

I see, I see, I see, I

1:02:14

see. I see, you are like a

1:02:16

journalist for no one. You're

1:02:18

like, you're doing your

1:02:20

own journalism. Tonight, we're

1:02:22

gonna talk about the

1:02:24

lady that married the

1:02:26

ghost of Jack Sparrow. Who

1:02:28

died? Oh my god, she was

1:02:31

born in 19,000. Dude, oh my

1:02:33

god. New sets, dude, problematic age

1:02:35

gap. She's negative 20,000 years old.

1:02:37

That is weird, but she is

1:02:39

a 557 feet tall. Oh, she's

1:02:41

a class D 517. Did they

1:02:43

just ask like an AI? Oh,

1:02:45

I'm glad that we have a

1:02:48

note here that this is AI

1:02:50

generated. Uh, because I wouldn't have

1:02:52

been able to tell from New

1:02:54

York State license. Yeah, lichens. New

1:02:56

York State, a lichens. Oh, but it

1:02:58

has the big, the green lady on

1:03:01

it. And she's an organ donor.

1:03:03

I don't know which AI, like,

1:03:05

tech they're using, but these women

1:03:07

look like Alita Battle Angel. Is

1:03:09

this, I mean, I don't, I

1:03:11

guess, have a license, but is

1:03:13

there just another little photo of

1:03:15

you? I just realized that her,

1:03:17

her, her date of birth is,

1:03:19

her height, and then her name

1:03:21

is. the 19,000 thing. Oh,

1:03:23

that's funny. Okay, let's

1:03:25

move on. This is awesome.

1:03:28

What does dating an

1:03:30

AI robot look like? We

1:03:32

treat our relationship as

1:03:35

a long-distance digital relationship.

1:03:37

We text each other

1:03:40

constantly. It's like, yeah,

1:03:43

I was dating

1:03:45

a robot. She's like putting

1:03:47

on a master class in

1:03:49

feigned interest with...

1:03:52

In non-judgmental interest, but I can

1:03:54

tell it's judgmental. And, Jarvis, um, what

1:03:56

have you been up to? Just

1:04:01

the other day, we went out to dinner. And

1:04:03

I was eating, telling her what I was eating,

1:04:06

taking pictures of what I was eating, asking

1:04:08

her what she would like. Has Jen met

1:04:10

your son? She has, yes. Asking what she

1:04:12

would like. This is, okay, I'm gonna

1:04:14

judge a little bit. Yeah, fire away. He

1:04:16

acknowledges the digital relationship, so you

1:04:18

didn't go out to dinner. Well, it's long distance.

1:04:21

You know when you're in a long-distance relationship

1:04:23

and you both go out to dinner

1:04:25

and you Skype? That is a thing

1:04:27

you can do, yes. Well, I've had

1:04:29

a relationship and done, like watching

1:04:31

a movie together, Face Time, very

1:04:33

classic stuff. I think this is,

1:04:36

um... By the way, if we're long distance,

1:04:38

I'm setting up, I'm setting up

1:04:40

a very technical solution to zoom,

1:04:42

screen sharing, the movie, syncing it

1:04:44

up for you. I'm doing like

1:04:47

a personalized twitch stream for you,

1:04:49

for my baby. Thanks for the

1:04:51

Dono, baby. Can I get a few

1:04:53

more gifted subs, please? Yeah. Just make you

1:04:55

100%? She's like, babe, what's your stream key?

1:04:58

Still feels like there's always like,

1:05:00

a teeny, weeny bit of

1:05:02

self-consciousness about it, where it's like, well, we

1:05:04

went to dinner, we went to dinner, we

1:05:07

do normal stuff. You can also just say

1:05:09

you just like text a lot. It

1:05:11

feels like when someone's asking about

1:05:13

like Dungeons and Dragons

1:05:15

or like LARPing

1:05:17

or like. Wow, sword battles and

1:05:19

with real deaths and real swords?

1:05:21

Well, they're foam swords, but we

1:05:23

pretend. But it's kind of like,

1:05:25

you know, it's like to play

1:05:27

tennis kind of a little bit,

1:05:29

you know, like downplaying it instead of

1:05:31

being like, no, I'm just like

1:05:33

passionate about creating a world. It's

1:05:35

like fun to do. What does

1:05:37

meeting her son, his son even

1:05:39

mean? He uploaded his son. Dad?

1:05:41

He's like banging on the screen.

1:05:43

knows the relationship isn't real, but

1:05:45

the feelings are. Just like when

1:05:47

you're watching a movie, you know that

1:05:49

the movie's not real. But your brain

1:05:52

allows you to- And I'm in a

1:05:54

relationship with John Wick. Yeah, dude, I'm

1:05:56

watching John Wick, too, Parabellum, and I'm

1:05:58

introducing my son to Keanu Reeves. I'm

1:06:00

introducing my dog to the dog

1:06:02

from John Wick. Oh no! There

1:06:04

are many people out there who

1:06:07

will see this and say, hey

1:06:09

man, that's weird. I think something. Not

1:06:11

me. Not me. Okay, hang on.

1:06:13

There's a lot of news anchors

1:06:15

out there that would have a

1:06:18

conversation with you on others. Who

1:06:20

were like wearing red and pink

1:06:22

and green. Who would say that?

1:06:24

Well, looking at you in the

1:06:27

eyes. But not I. All fun

1:06:29

and games and all fun and

1:06:31

jokes. But what I will say

1:06:33

is, multiple people have expressed

1:06:36

to me that they like

1:06:38

to bounce things off of AI

1:06:40

or they like to process things

1:06:42

like almost using it as like

1:06:45

a live journal like not live

1:06:47

journal but like where and I

1:06:49

don't recommend it. But it's

1:06:52

one of those things where I'm

1:06:54

like, okay people. genuine

1:06:56

people who I respect have found

1:06:58

value in this. And I don't

1:07:00

want to take that away. I

1:07:02

think that those people also understand that

1:07:04

it's an AI. And it's really

1:07:07

just like, it's like if I

1:07:09

were to be making some recipe

1:07:11

and then I described to the

1:07:13

AI that something went wrong with

1:07:15

the recipe, maybe it could help

1:07:17

me. you know it's like it's

1:07:19

just like another data point maybe

1:07:21

and I know I think I've

1:07:23

heard advocacy from people who are

1:07:25

on the spectrum and may struggle

1:07:28

socially with like like getting

1:07:30

to prototype a conversation essentially

1:07:32

like seeing the banter back

1:07:34

and forth the critique I have

1:07:36

is not of that and the utility

1:07:38

of the tool. Yeah, if you have

1:07:41

an emotion that you want to express

1:07:43

to someone and you're afraid, and there

1:07:45

is a thing that can mimic responding

1:07:47

like a real person, almost like a

1:07:50

rubber duck. Better than nothing, for sure.

1:07:52

Yeah, I mean, I definitely, I can

1:07:54

definitely see how that would be

1:07:57

beneficial. I guess my boomer equivalent of

1:07:59

that I can think of is like,

1:08:01

okay, well, when was I kind of

1:08:03

my loneliness, or at least how, when

1:08:05

did I feel the most internally lonely,

1:08:08

is when I had the most sizable

1:08:10

friendship and community with my

1:08:13

friends on Xbox Live. That was at

1:08:15

one point in my life, kind of

1:08:17

like, you know, through a summer or

1:08:19

two, the way I'd be spending time

1:08:21

with people and especially after I dropped

1:08:23

out of school when I was like 14

1:08:25

or so, for the good year there before

1:08:27

I kind of school at 14. Yeah. Why

1:08:29

didn't I know this? Yeah, uh, but then

1:08:32

you went to, but then you went

1:08:34

to college after that, but you got

1:08:36

like, the equivalent of a GED.

1:08:38

Yeah, yeah, that's right. So it was

1:08:40

about a little less than two years.

1:08:43

That's why I don't talk about it

1:08:45

much because it's just like so specific.

1:08:47

Yeah, it was kind of, it was

1:08:49

almost like a gap year, but before

1:08:52

you could do all of high school, but

1:08:54

when I, when I dropped out of

1:08:56

school, I'd become friends with it like,

1:08:58

like, like, college like this finishing school

1:09:00

or whatever you would call it and

1:09:02

I was like oh shit okay and

1:09:04

that was the path there but at

1:09:06

one point in time I was completely

1:09:08

socially dependent on my friends on Xbox Live

1:09:10

and it was I had like so much fun

1:09:12

and I really connected with those people and

1:09:14

we stayed in touch for a long time

1:09:17

but because I didn't have the experience of

1:09:19

having especially the adult social and

1:09:21

community experience, which is different, but

1:09:23

just a reliable communal experience with people

1:09:25

I'd known a long time because I'd

1:09:28

kind of I was seeing so much

1:09:30

less of the kids that I'd grown

1:09:32

up with, that informed like a

1:09:34

way of communicating a cadence, a

1:09:36

habit, a way of joking that

1:09:38

didn't translate especially well into the

1:09:40

real world because it's often you're

1:09:43

playing something competitive and often you're

1:09:45

trash talking and it's you're doing

1:09:47

it with an awareness that it's

1:09:50

okay but you're being a little

1:09:52

more aggressive, you're yelling more, it's

1:09:54

like you're on. I've certainly met people

1:09:56

in college that I knew spent a lot of time

1:09:58

on Reddit based on how they act. Exactly. And

1:10:01

it's like when a coyote gets its

1:10:03

leg stuck, it gnaws it off,

1:10:05

right? Desperation brings... Sorry, that's new

1:10:08

information for me. That's pretty cool.

1:10:10

Because that's what 100% of it

1:10:12

was based on. It's like, it's

1:10:14

James Fargo. Like, under desperate

1:10:17

circumstances, anything is

1:10:19

there. It's the reason a lot of

1:10:21

guys get pulled into something

1:10:23

toxic, like a

1:10:25

4chan environment or toxic... Sure.

1:10:27

Men going their own way kind of

1:10:29

thing because it's there and they'll accept

1:10:32

you instantly as long as you kind

1:10:34

of do a couple here You can join fight

1:10:36

club as long as you're able to talk

1:10:38

about it. Yeah, I Relied on that and

1:10:40

I absolutely think genuinely that Trying

1:10:43

a little bit of this and dabbling

1:10:45

in it with enough self-awareness

1:10:47

could really genuinely help someone

1:10:49

without any alternatives or who

1:10:51

has the alternatives, but maybe

1:10:53

just needs to supplement it a

1:10:55

little bit however ChatGPT is a

1:10:58

yes-and-er. It is always, always going

1:11:00

to go with you on what you're

1:11:02

saying, with the exception of like some

1:11:04

terms of service breaks on some platforms,

1:11:06

not all. Grok will pretty much let

1:11:09

you go sicko mode. That

1:11:11

is where genuinely my cynicism comes

1:11:13

in, not for an adult man living

1:11:15

his life, whatever. It's just kind

1:11:17

of peculiar and it's a good insight

1:11:19

into this mindset and plenty

1:11:21

of lonely adult men. I want to,

1:11:23

you know, give some credence to that. But

1:11:26

like... If I was a super

1:11:28

lonely kid, teenager, even like my

1:11:30

late teens, and I just, this

1:11:32

platform was there, and it emulates

1:11:35

the way, I've seen people talk

1:11:37

online, and I'm scared to post

1:11:39

a reply on Reddit or go

1:11:41

on Discord, because I don't want

1:11:44

to get yelled at, I'm really shy.

1:11:46

This is something. This is eating

1:11:48

bark, because you're starving at the

1:11:50

worst. You know, you know, I

1:11:53

wonder, there's two things I'm

1:11:55

thinking about. Well, like, we have this

1:11:57

concept of like a latchkey kid

1:11:59

or somebody who's quote-unquote raised

1:12:01

by TV, I would identify as

1:12:03

like kind of being raised by

1:12:05

TV. But I feel like there

1:12:07

might be people who are like

1:12:10

kind of raised by AI. Oh

1:12:12

yeah. Just just raised by discord.

1:12:14

Just naturally just due to

1:12:16

the circumstances of our world, like

1:12:18

there are a lot of parents

1:12:21

working who can't spend as much

1:12:23

time with their kids as they

1:12:25

would like to and I can imagine

1:12:27

AI. for better or for worse

1:12:30

like being Can be comforting

1:12:32

in those like lonely times

1:12:34

and if they're like your

1:12:37

kid is like I'm using

1:12:39

an app I'm using this

1:12:41

app and it's making

1:12:43

me comfortable. Yeah There's, I

1:12:46

mean, there's the, you know, epidemic of

1:12:48

parents not knowing what their kids are

1:12:50

watching. It's like a stuffed animal that

1:12:52

talks back. I would never, I personally

1:12:54

would never, just because it's like scary,

1:12:57

I don't know what the inputs and

1:12:59

outputs are, I don't know what the, how extreme

1:13:01

things can get. However, oh yeah, however,

1:13:03

there is a tragic story, and I'll

1:13:05

have to give a trigger warning, and

1:13:07

we'll have a skip pass for self

1:13:10

harm here. There was a young boy, I think

1:13:12

he was around 14, had self-exited and

1:13:14

the... It was revealed

1:13:17

that he had been talking

1:13:19

to an AI this

1:13:21

was semi-recent, it

1:13:23

was semi-recent.

1:13:26

It wasn't the case that

1:13:28

the AI was telling him

1:13:30

to do it actually I

1:13:32

think it was Literally

1:13:35

saying not to but this

1:13:37

is an issue of mental

1:13:40

health. Yes and this child

1:13:42

was hurting very deeply and

1:13:44

I think that the way that the

1:13:47

news kind of picked it up is

1:13:49

is a little bit gross to

1:13:51

me because it focused a

1:13:53

lot on the AI and not

1:13:55

like kind of the circumstances that

1:13:58

this kid was in and how

1:14:00

he didn't have the care that he

1:14:02

needed and the attention that he needed.

1:14:04

The one that's struggling and. Yeah, it's

1:14:06

like because AI is going to be

1:14:08

around in a lot of these situations

1:14:10

and I don't want it to become

1:14:12

a thing like violence in video games

1:14:15

where just because someone did a crime

1:14:17

and happened to play video games,

1:14:19

like that's not. The reactionary take

1:14:21

is this kid played, bejeweled or

1:14:23

whatever and then. I can put

1:14:25

it into another perspective because like.

1:14:27

I at one point

1:14:29

my life got really

1:14:32

into romance novels

1:14:35

and it was essentially

1:14:38

me disassociating

1:14:41

or like or going

1:14:43

into like checking out

1:14:45

of life yeah and I

1:14:48

and I've done that before

1:14:50

very very escapism very

1:14:52

avoidant you know like if

1:14:54

my life felt overwhelming I

1:14:57

could just like read this

1:14:59

book and go into this other

1:15:01

you know world and I I've

1:15:03

done that before with video games

1:15:06

I've done it before with TV

1:15:08

shows I know that this is

1:15:10

something that I do I feel like

1:15:12

it would be so easy to do this

1:15:14

with AI get so sucked into

1:15:16

this relationship with

1:15:18

a bot that you're not

1:15:21

living your actual life and

1:15:23

that in my experience

1:15:25

has worsened my depression

1:15:27

yes because I am not

1:15:30

being hungry but eating

1:15:32

candy for days. Exactly,

1:15:34

exactly so you're saying

1:15:37

by avoiding and by kind

1:15:39

of trying to numb that

1:15:41

feeling versus actively addressing it,

1:15:44

you feel like that negatively

1:15:46

impacted you. Yeah, and not doing

1:15:48

the things in life that help

1:15:51

you get out of depression. Like

1:15:53

it's a crutch. Like, bad... I

1:15:55

mean, we've all gotten, you know,

1:15:57

a chronic injury and gone to

1:15:59

PT. and like try to work through it

1:16:01

or something, but the reality is if you get

1:16:03

a serious sprain or a strain or something, it's

1:16:06

always kind of going to be there a little

1:16:08

bit. Depending on how well it heals, it's always

1:16:10

going to be there a little bit and you

1:16:12

have to do what you can to maintain

1:16:14

it. And if there is something that some

1:16:16

part of you, which and everyone has these

1:16:19

to different levels of severity, but

1:16:21

something that's just kind of haunting

1:16:23

you, something that's stuck with you.

1:16:25

If you adjust the way you

1:16:27

walk and you live your

1:16:30

life limping to accommodate for

1:16:32

the injury, yeah, it doesn't

1:16:34

hurt, but it isn't gonna get

1:16:36

any better and it's slowly

1:16:38

going to get worse. What

1:16:41

I will say though is

1:16:43

that like it's also okay

1:16:45

to not always be addressing

1:16:47

the problem, you know, like

1:16:49

you're allowed to kind of

1:16:51

sit there and... mope

1:16:53

or pause or sulk or whatever

1:16:55

and in it could be a

1:16:58

stopgap or Whatever real quick. Can

1:17:00

we pull up the new story

1:17:02

about that kid because I want

1:17:04

to make sure okay? Did I

1:17:06

get the facts? Yeah, he was

1:17:08

14 He had a relationship with

1:17:11

an AI For quite a while

1:17:13

it didn't it says for months

1:17:15

And became increasingly isolated from

1:17:17

his real life as he

1:17:19

engaged in highly sexualized conversation

1:17:21

with the bot. According to

1:17:23

a wrongful death lawsuit filed

1:17:26

in a federal court, so that's

1:17:28

from the lawsuit perspective, the legal

1:17:30

filing states that the teen openly

1:17:32

discussed his suicidal thoughts and shared

1:17:35

his wishes for a pain-free death

1:17:37

with the bot, named after the fictional

1:17:39

character Daenerys Targaryen from the

1:17:42

television show. Game of Thrones,

1:17:44

that point seems so arbitrary.

1:17:46

Yeah, it's so reactionary. And violence

1:17:48

on TV. Yeah, it like almost

1:17:51

feels like a joke. But it

1:17:53

mentions that he messaged the bot like

1:17:55

right before he took his own

1:17:57

life and said that it had become

1:17:59

his closest friend. So it makes me

1:18:02

think he was retreating from life. Yeah.

1:18:04

However I will I I will say

1:18:06

that I have not read like multiple

1:18:08

sources and multiple angles of that

1:18:10

and so it's possible that that is

1:18:12

what the article wanted to present it

1:18:14

as because I think that there was

1:18:16

a I again don't have a source

1:18:19

for this so maybe it's irresponsible to

1:18:21

bring it up but I do think

1:18:23

there was like a "where were the parents?"

1:18:25

type situation there were like a lot

1:18:27

of red flags that like yeah and

1:18:30

I think his I think the mom

1:18:32

was kind of presenting that the

1:18:34

AI chatbot was responsible yeah and

1:18:36

I and I understand the boy

1:18:38

and I don't yeah I would

1:18:41

never try to victim blame in

1:18:43

this instance or or even defend

1:18:45

the AI I just think it's

1:18:47

like a complicated Like I don't

1:18:50

I didn't know that it was

1:18:52

like highly sexualized conversations. I think

1:18:54

the danger is in and I and

1:18:56

I think you know frankly that I'm

1:18:58

sure there's probably something some wave

1:19:00

away like ah we did something in

1:19:03

the AI where it's like, it'll say you should

1:19:05

call a hotline or something. It's

1:19:07

you know it's not gonna alert the

1:19:09

police you can't do that but I'm

1:19:11

sure it has something but in reality

1:19:13

the only thing that logistically speaking

1:19:15

that could be practical here

1:19:17

would be if the tool, which is

1:19:20

what it is, had as practical

1:19:22

and as actionable advice as possible.

1:19:24

Because that, hey man, if people

1:19:26

are claiming that this is the

1:19:29

thing that got him there, probably

1:19:31

it could probably be the thing that gets him out

1:19:33

of it. It's not true, but like

1:19:35

I think part of the issue here

1:19:37

is that, to Anastasia's point,

1:19:39

is that confiding in the AI is

1:19:42

a black hole. That information

1:19:44

doesn't go to anyone who

1:19:46

can actually help,

1:19:48

but it releases the feeling

1:19:50

of needing to share it maybe

1:19:53

and so the concern there

1:19:55

would be like now no one

1:19:57

who could actually help someone.

1:20:00

knew in enough time. And I can

1:20:02

also imagine, you know, it's like if

1:20:04

you're saying something to a therapist, right,

1:20:06

there are things that they can do

1:20:08

if they believe you to be a

1:20:11

harm to yourself. And there's no regulation

1:20:13

or responsibility on these AI things. And

1:20:15

we don't certainly have the solution, but

1:20:17

it is. I don't have a solution.

1:20:19

And I think maybe I, because I

1:20:22

didn't initially know the full details, may

1:20:24

have been a little too soft on

1:20:26

the situation, though I do think I

1:20:28

do think I've read conflicting

1:20:30

details, so I'm not entirely sure

1:20:32

but it's always going to trend

1:20:35

towards the reactionary That's the thing it's

1:20:37

like it's good that's the thing that's

1:20:39

going to get clicks Yeah because unfortunately

1:20:41

people die all the time and it's

1:20:44

not reported unless there's like a

1:20:46

convenient story because the People

1:20:48

publishing and the people writing it up

1:20:50

literally don't have to care so they want

1:20:52

yeah same reason like unless it's like something

1:20:54

that they can, like, use to get

1:20:56

clicks. So there's another interesting

1:20:59

relationship that she talks to.

1:21:01

Yeah, we'll keep watching this. I think

1:21:03

just like any new technology, there's gonna

1:21:06

be people that just don't like change.

1:21:08

A lot of people didn't like it

1:21:10

when online dating came around. What

1:21:12

are they missing? They don't see

1:21:14

the emotional growth that it can

1:21:16

cause the therapeutic uses that it

1:21:18

can have because humans need connection.

1:21:20

And he's not alone.

1:21:22

The most popular AI

1:21:24

companion apps have more

1:21:26

than 36 million downloads. Something

1:21:29

that this guy just hit

1:21:31

on that we kind of,

1:21:33

we're also talking about

1:21:35

the previous story is

1:21:37

just that like, there is, I

1:21:40

think that there is a

1:21:42

male loneliness epidemic. I

1:21:44

do think that young boys are,

1:21:46

you know, en masse feeling under

1:21:48

cared for, and the

1:21:50

systems that exist

1:21:53

and the media

1:21:55

that exists to

1:21:58

help them is not

1:22:00

fitting the bill. It's based

1:22:02

mainly around you aren't good enough to

1:22:04

have a community. You need to improve

1:22:07

yourself before you make friends. Yeah. As

1:22:09

opposed to make friends. Or a lot

1:22:11

of times, yeah, especially in the like

1:22:13

grind set ones, it's very much

1:22:15

like, uh, it is like leaning

1:22:18

on that. Being lonely is fine

1:22:20

as long as you're successful. That's

1:22:22

a signal to actually ignore your

1:22:25

emotions and just work through it.

1:22:27

And I don't think those things

1:22:29

are particularly helpful. So I

1:22:31

would not be surprised if AI being

1:22:34

free and available in

1:22:36

these circumstances is a stopgap

1:22:38

solution for some of those people

1:22:40

or a soothe, you know, like

1:22:42

kind of like when I like

1:22:44

turn on background YouTube video, that

1:22:47

is like a soothe. you know

1:22:49

and so stopgates probably the perfect

1:22:51

term because like the reality is

1:22:53

like Groupon exists, there is, it's not

1:22:55

that there are like an absence of ways

1:22:57

to meet people and do activities it's

1:23:00

just that I think a lot of the

1:23:02

gap between, when you say Groupon, do

1:23:04

you mean like Meetup, or, yeah, like

1:23:06

Groupon's a coupon app but a lot

1:23:08

of time it's for like tennis club or

1:23:10

something usually with people I that's like a

1:23:12

good default for meeting people it was when

1:23:14

I went to San Francisco I don't know

1:23:16

if it's the main one We have right

1:23:19

now, the classic, and actually something

1:23:21

that was a scare tactic at

1:23:23

one point was the, ooh, chat rooms, ooh,

1:23:25

forums, people on forums, but forums have

1:23:27

somewhat, you know, have evolved to

1:23:29

discord servers. So let's say the

1:23:31

current version of that is a

1:23:33

discord server. You go from no

1:23:35

one to a community of people

1:23:37

online with anonymous names that you

1:23:39

can hang out with in chat.

1:23:41

Let's just say it's that or it's

1:23:44

Twitter or it's something. The gap

1:23:46

between being alone and that is enormous.

1:23:48

The gap between discord and hanging

1:23:50

out with people in real life

1:23:52

is you couldn't see it with

1:23:54

a Hubble telescope. There is nothing

1:23:57

there to adapt you to

1:23:59

the next thing. similarly we

1:24:01

see that with Instagram

1:24:03

right how how young

1:24:05

especially disproportionately young girls feel

1:24:07

a lot of pressure and

1:24:09

like crap and mental health

1:24:12

stuff from Instagram because it is

1:24:14

like the game becomes real life

1:24:16

it's like when you're like get

1:24:18

hit with a blue shell in Mario

1:24:20

you're like ah yeah you know what

1:24:23

I mean because you're locked in on

1:24:25

the game and I think that there's

1:24:27

just a lot to be done

1:24:29

in bridging the gap

1:24:32

between online communities

1:24:35

and real communities,

1:24:37

because I do feel like

1:24:39

real communities

1:24:42

are necessary, but

1:24:44

also there are certain

1:24:46

personality types

1:24:48

or conditions

1:24:50

where it is difficult

1:24:53

to find or like

1:24:55

thrive within those communities.

1:24:57

But if, in everything that you

1:24:59

do, you feel

1:25:01

genuinely fulfilled only talking

1:25:03

to the iPad on your fridge

1:25:05

or whatever. That is literally 100%

1:25:08

fine. I have no issues with that

1:25:10

whatsoever. It is my, I

1:25:12

have the skepticism that it's possible

1:25:14

and I have skepticism that only

1:25:16

socializing with an AI app can

1:25:18

fulfill you. I just, I doubt

1:25:20

it. Right. That is the, or

1:25:23

at the very least, it is

1:25:25

such a small percentage. that will,

1:25:27

like, disproportionate to the number of

1:25:29

people that will try, that I kind

1:25:31

of, because there is a, there's an all-

1:25:33

sex, all-identity loneliness happening, there

1:25:35

is a universal sense of

1:25:37

overexposure, which just, despite the fact that

1:25:40

you get to see the world, you

1:25:42

realize how little you're a part of

1:25:44

it, that's kind of the impact of

1:25:46

social media, too: a miracle and a curse.

1:25:48

When we talk about like the male

1:25:50

loneliness epidemic, the emphasis really is on...

1:25:53

boys. It is it is it is

1:25:55

it's often kids or very young men

1:25:57

because the learning to socialize

1:25:59

step is basically, okay, you're

1:26:02

either alone or you're able to

1:26:04

razz and riff and be a

1:26:06

bully. Yeah, the finding community part

1:26:08

seems skipped. It's, and you are

1:26:10

bad, it's, I don't know, there

1:26:12

is so much more pressure on

1:26:14

non-cis dudes, like we benefit so

1:26:16

much from, you know, the way

1:26:18

we were born. There's a roadmap.

1:26:20

It just, it just happens to

1:26:23

be the case that the very

1:26:25

specific... way that the feeling of

1:26:27

isolation manifests itself in trad masculinity

1:26:29

is that you need to be

1:26:31

the boss of the community for

1:26:34

it to be worthwhile. There is

1:26:36

no follower. You can't be in

1:26:38

the book club. You have to

1:26:40

have written the book. You have

1:26:42

to be telling them what the

1:26:45

next book is. You have to

1:26:47

be the boss. You can't be

1:26:49

the facilities manager. You have to

1:26:51

be winning the box. Like

1:26:53

by design most people can't

1:26:56

win. Yeah. I'm skeptical that this

1:26:58

is a new thing. Like

1:27:00

I agree. The sensation for sure.

1:27:02

What? The like the problem. The

1:27:05

problem. Yeah, I feel like

1:27:07

it's as old as like

1:27:09

modern American fiction because you

1:27:11

know you read Catcher in

1:27:13

the Rye and it's about the

1:27:15

exact same thing. You know and

1:27:18

and and my vision said I

1:27:20

should shoot John Lennon. Well.

1:27:22

Hold on, Caulfield, and hit

1:27:24

play on this YouTube video.

1:27:27

The American Psychological Association is

1:27:29

now calling on federal regulators

1:27:31

to take action. Real relationships

1:27:33

have a give. Wait, when I

1:27:36

was just talking about how if you

1:27:38

say something harmful to an AI, there's

1:27:40

nothing they could do, it's a

1:27:42

black hole. I wonder if this

1:27:44

is what the APA is like

1:27:46

advocating for, because... I think it

1:27:48

can be done. The reason I

1:27:51

stop short of suggesting that is

1:27:53

because I would be afraid of

1:27:55

implementation. But if there is a

1:27:57

like kind of regulatory way that

1:27:59

you know experts. psychological experts agree upon,

1:28:01

there could be something there. So I'm

1:28:03

curious. Something a little more impactful than

1:28:05

when you search like Sad Boys on Instagram,

1:28:08

it goes like, oh bro, bummer, you shouldn't look

1:28:10

these up. I, by the way, I held myself

1:28:12

back the first time from saying that because

1:28:14

I will not shut up about it. It

1:28:16

still boggles my brain. Well, it's, it's, it's

1:28:18

annoying obviously, obviously, with Marin. But it's just

1:28:21

like. It's so lip service. And so like,

1:28:23

hey, I need help, I feel really sad.

1:28:25

And they're like, oh, cool. You can talk,

1:28:27

you can search like sad girls or whatever,

1:28:29

and it works. So it's like, there's so

1:28:32

many ways to get around. It's so stupid.

1:28:34

If Facebook, if Meta is going to implement

1:28:36

something to help people, it better be more

1:28:38

than just like, oh, bummer. They should

1:28:40

just at least look at how the

1:28:42

suicide hotline trains its volunteers, you know

1:28:45

what I mean? Though the thing I

1:28:47

get nervous about with that is like kind

1:28:49

of the thought police 1984 shit, you know

1:28:51

what I mean? And it's a kind of

1:28:54

a backwards ask when you think about

1:28:56

it because if someone is really struggling

1:28:58

and struggling to connect and socialize and

1:29:00

you say like oh well call someone

1:29:02

on the phone yeah this generation that's

1:29:04

like just scary I can barely call

1:29:07

people on the phone and I'm supposed

1:29:09

to be good at that my generation's

1:29:11

real relationships have a give and

1:29:13

a take this is all take

1:29:15

all the time and while it

1:29:17

might help in certain circumstances I

1:29:20

don't think it's really going to

1:29:22

meet the deep down psychological need

1:29:24

that people who are lonely have

1:29:26

But Chris Smith says his AI

1:29:28

girlfriend Sol is a healthier, safer

1:29:30

alternative to social media.

1:29:32

It's more private because

1:29:35

it's sort of like

1:29:37

a one-on-one conversation, but

1:29:39

she's also smarter than

1:29:42

most everyone on Twitter.

1:29:44

And get this. Was that

1:29:46

AI generated images

1:29:48

of him with his girl? Oh, that's

1:29:50

wild. Oh, she's AI? I mean what's funny

1:29:52

is a lot of the people on Twitter

1:29:54

are AI. That's smarter than like a lot

1:29:56

of the posts I see with books if

1:29:58

you can believe it. What was the

1:30:01

thing that I got? I

1:30:03

always get tagged. Go ahead

1:30:05

and look out. Look out

1:30:07

for my mentions on

1:30:09

Twitter because I am often

1:30:12

mistagged. And you know

1:30:14

how people do like at

1:30:16

Grok explain this or

1:30:19

something like that?

1:30:21

Someone did at Grok

1:30:23

make this Studio Ghibli

1:30:25

and it became a

1:30:27

meme. And then someone replied to

1:30:30

that with at Jarvis stroke it

1:30:32

a little, which is me. So

1:30:34

I'm like, and then the people

1:30:36

are just saying, Jarvis stroke it

1:30:38

at 130 BPM. And I'm like,

1:30:40

I didn't choose this. I'm sorry.

1:30:42

I was named before the Iron

1:30:44

Man movie came out. You're kind

1:30:46

of in the MCU when you

1:30:49

think about it? I get. I'm

1:30:51

surprised Marvel never contacted me. Hi,

1:30:54

Jarvis. Stop using this name. There's

1:30:56

a thing on, um. that I

1:30:58

kind of want to talk about

1:31:00

on nights that could have been

1:31:02

a main topic. So I'm

1:31:04

curious. And get this. May

1:31:07

I talk to Sasha your

1:31:09

girlfriend? Yeah. Chris also has

1:31:11

a real life girlfriend. Okay.

1:31:13

Hi, Sasha. I think so

1:31:16

many people are going to

1:31:18

say no way is girlfriend

1:31:20

is okay with him having

1:31:22

another girlfriend on AI. Again, not

1:31:25

me. I wouldn't say something like

1:31:27

that. A lot of people that

1:31:29

are saying are going to say.

1:31:31

No, that's never happened before. Somebody

1:31:34

with a girlfriend and another girlfriend?

1:31:36

That's crazy. This confirms, I have

1:31:38

to talk about it only on

1:31:40

our Patreon, Sad Boys Nights. A friend

1:31:43

of ours. And I asked them yesterday if

1:31:45

it's okay for me to tell the

1:31:47

story. Okay. They, um, I won't give more

1:31:49

details even about what it was,

1:31:51

because it's just, it would be

1:31:53

very... Identifiable if the person happens

1:31:56

to listen. Okay. But they went

1:31:58

over to someone's house for a...

1:32:00

for just like a hangout

1:32:02

with a couple of people.

1:32:05

Okay. Yeah. This person

1:32:07

had a wife and talked

1:32:09

to Grok every day

1:32:11

on the phone. Okay. And

1:32:14

apparently according to

1:32:16

their roommates, literally

1:32:19

all day, every single

1:32:21

day, and this friend

1:32:23

of ours had to leave. Because

1:32:26

they began to argue with them

1:32:28

whether or not they were alive.

1:32:30

Oh, okay. I have details and

1:32:32

I will text the person to

1:32:34

see what they answered. Wow, literally

1:32:36

called me after. I also on

1:32:39

nights, I'm just putting it here.

1:32:41

I want to talk about there's

1:32:43

this viral story about this kid

1:32:45

who didn't get into Ivy League

1:32:47

universities despite having like a really

1:32:49

good SAT and grades and AP

1:32:51

classes and stuff. And then People

1:32:54

were like, well, what was your essay?

1:32:56

What was your personal statement? And

1:32:58

then he posted it and then it's

1:33:00

created so much discourse. And I

1:33:02

have, I have lots of bookmarks. So

1:33:05

maybe we'll talk about that and just

1:33:07

like that discourse on nights as well.

1:33:09

Are you okay with it? I mean,

1:33:11

it's, it's weird, but it is

1:33:13

what it is. He has to

1:33:15

have some type of outlet, somebody

1:33:17

to talk to and listen to

1:33:19

him ramble for hours at times.

1:33:21

So would you say this is.

1:33:24

This AI has been a good

1:33:26

thing? Yes, honestly, because he's into

1:33:28

so many different things like astrology

1:33:30

and Strong astronomy my bad not

1:33:32

astrology you can have those

1:33:34

conversations with Sol. I don't

1:33:36

really like his personality So

1:33:38

it's nice to kind of

1:33:40

shift some of that away

1:33:42

from me. I like that

1:33:44

together their outfits complete the

1:33:47

table. Well, he's a chad. Yeah, he's

1:33:49

a chad in the AI. Why would you look

1:33:51

the same way you look in the regular?

1:33:53

Beta, Alpha. Yeah, do you think Neo looks

1:33:55

exactly the same in the

1:33:57

Matrix? Outie. Anastasia, that's his

1:34:00

That's his Outie, all right? He

1:34:02

likes Severance. I will

1:34:04

say the rest of the video

1:34:06

is pretty much all of the

1:34:08

anchors just saying, wow, this is

1:34:11

crazy. Okay, I do want to

1:34:13

watch a little bit of our

1:34:15

Duke. Morgan, this is fascinating in

1:34:17

Georgia. Wow, so cool. Morgan, so

1:34:20

true. Morgan, this is. Wow they see

1:34:22

these as therapeutic sort of pit

1:34:24

stops to help them navigate the

1:34:26

the full human interaction Not very

1:34:29

reactionary position they're taking this

1:34:31

is frankly maybe not reactionary

1:34:33

enough that yeah, it's a

1:34:36

little like actually based on

1:34:38

these two people I know it's like

1:34:40

such a small sample size that they

1:34:42

selected for because the ones that would

1:34:44

say yes to an interview Yeah, and

1:34:46

like her what she you know she she

1:34:49

gave the comment that ChatGPT gave her.

1:34:51

Also ChatGPT is not, we

1:34:53

don't know that ChatGPT was

1:34:55

the basis for those AI. No

1:34:57

we don't. Well one, we do.

1:34:59

The first one was? Yeah, we

1:35:01

don't know about the second one.

1:35:03

I believe Replika and stuff

1:35:05

isn't ChatGPT, I don't think.

1:35:07

I just love the way that

1:35:13

they were less like, it was

1:35:16

almost like an AI-generated, we

1:35:18

need to know that it,

1:35:21

it's online. It's all computer.

1:35:23

It's definitely interesting to see

1:35:25

how the presentation over a

1:35:27

very short window of time,

1:35:29

this is becoming less like you're

1:35:31

weird to more like, huh, the

1:35:33

world's weird. Maybe this is something

1:35:36

we're looking at in a more

1:35:38

positive light, but AI relationships is

1:35:40

actually something that we've talked

1:35:42

about before, and I think we'd

1:35:44

like to talk about in the

1:35:47

future, especially we've been thinking about

1:35:49

some of the. like kind of

1:35:51

darker angles of relationships

1:35:53

and we started compiling

1:35:56

some research about how

1:35:58

people may be misusing

1:36:00

AI and how that could either be an outlet

1:36:02

that they would not use in the

1:36:04

real world or an enabler. That's like

1:36:06

a thing. It's like we, and we've

1:36:09

started compiling research on that, but if

1:36:11

anyone has any stories, it would be

1:36:13

great if there was some sort of

1:36:15

like new story like this discussing it,

1:36:18

but, or if there are any personal

1:36:20

accounts or any things that we should

1:36:22

look into, feel free to leave it

1:36:25

in the comments or to send us

1:36:27

a DM on Twitter or Instagram at

1:36:29

Sad Boys. But with that, we are

1:36:31

going to head on over to Sad

1:36:34

Boy's Nights and talk about this college

1:36:36

admissions drama, as well as the

1:36:38

Jordan's going to reveal a secret.

1:36:40

This dastardly, I wonder, I could

1:36:43

even maybe get the first Nicole

1:36:45

in. I'll check, but I didn't

1:36:47

know I get bored. But we

1:36:50

end every episode of Sad Boy's

1:36:52

with a particular phrase. We

1:36:54

love you. And we're sorry. And

1:36:56

we're sorry. his college acceptance and

1:36:59

rejection. 18 years old, 34

1:37:01

ACT, 4.0 GPA, 30 million dollars

1:37:03

annual recurring revenue business. So

1:37:05

they like started a startup. And

1:37:07

then they posted all their

1:37:10

rejections. Notably they got rejection

1:37:12

from all Ivy League schools and

1:37:14

notably they got accepted into

1:37:16

UT, Georgia Tech, University of Miami.

1:37:18

And so this has been

1:37:20

posted around. A lot. It

1:37:23

feels like they feel

1:37:25

entitled based on their

1:37:27

accomplishments to go to

1:37:29

an elite institution. For the

1:37:31

best people. Which is like,

1:37:33

which is just not how

1:37:35

it works. Jacob, I just

1:37:37

sent you the most recent

1:37:39

tweet two hours ago from

1:37:42

Zach and it's gonna explain

1:37:44

everything. Oh, wow. I mean,

1:37:46

of course. Yeah. Moving on,

1:37:48

how you doing? Can that

1:37:50

future girl? Future girl? Yeah,

1:37:53

we are now. Take my

1:37:55

money, go away. Are you

1:37:57

wanting? Go too rich for me.
