Part Three: How The Zizians Went Full On Death Cult

Released Tuesday, 18th March 2025

Episode Transcript

0:01

Cool Zone Media. Welcome

0:04

to Behind the Bastards, a podcast

0:06

where I, Robert Evans, am going

0:09

to war like a one man like

0:11

Rambo, like one of the later Rambo movies,

0:13

not the first one that was actually about

0:16

the cost of PTSD

0:18

and Imperial War, but like the later ones

0:20

where he's a one man army. I'm doing that, and I'm

0:22

doing it against Microsoft because I fucking

0:24

hate Copilot. With

0:27

me to talk about how much we hate Microsoft

0:29

Copilot, my producer Sophie

0:31

Lichterman, and our wonderful guest David

0:33

Gborie. David, how do you feel about

0:35

Microsoft Copilot?

0:37

Ah Rambo three, let's go okay,

0:40

okay, okay, kill a ton of brown people.

0:43

Well, no, I mean this one. We're just it's just like

0:46

Microsoft Copilots we're killing. Okay,

0:49

all of them are not really people.

0:51

It's so bad.

0:52

The outlook is terrible. Microsoft

0:55

has really gone all in on making a lot

0:57

of products that people hate to use.

1:00

Speaking of speaking of products people

1:02

hate to use, David said

1:04

before we started recording that he was

1:07

excited to hear this story. I just watched

1:09

it.

1:09

I know.

1:10

Where we're starting on this story is

1:12

page twenty one of the script, and

1:15

where we add the script is page

1:17

forty nine.

1:18

WHOA, I I

1:20

made a mistake in doing this. I'm

1:24

gonna admit that right now before

1:26

we get further, I'm gonna say I erred in

1:29

this. And it's you know, I've

1:31

made peace with the inevitability

1:33

of fucking stuff up, especially when like every

1:36

week you're doing a different chunk of history and we're veering

1:38

from like we're talking about fucking seventeenth

1:40

and eighteenth century France and then like

1:43

now we're talking about like a fucking

1:45

you know, a genocide in Darfur

1:47

or whatever. Right, like you're going to these are

1:49

all important topics, but like you simply

1:51

can't every single week cover the

1:53

breadth of stuff that we do and not you're gonna

1:55

misspeak, You're gonna make errors and stuff.

1:58

And when it comes to like I'm talking about I'm

2:00

talking about like not obviously those are important,

2:03

but you know, if I fuck up some fact

2:05

about like early nineteen hundreds

2:07

Germany, I'm not going to be like too bent

2:10

out of shape because it's like, you know, there's

2:13

there's no perfection in this. But in this case,

2:16

it's this tiny little community that

2:18

nearly all of the reporting on has been like

2:21

deeply incomplete, and I feel

2:24

like I like the

2:26

stress over, like what do I include

2:29

in here? And the other problem is that none of these people

2:31

have editors, and so all

2:33

of everybody in this story has a blog,

2:35

and every blog post is like forty thousand

2:37

words.

2:38

So it's just like... Now, just to ask, what

2:40

media are you able to get? Is it... you're getting

2:42

this all straight from the source, right?

2:45

A lot of it's, I mean, I've read

2:47

most of Ziz's blog entries, and I've at least

2:49

done like little surveys of the blogs

2:51

of everybody else involved in this. There

2:54

were also a couple of very helpful

2:57

compilations that like people, there's like one

3:00

like a former... sometimes it's like

3:02

former members of the community. Sometimes it's folks

3:04

who are like rationalists that were trying

3:06

to warn other rationalists about Zizians.

3:09

But like people in and around the community

3:11

have put together compilations where they'll like

3:13

clip mixes of news stories and

3:15

like conversations online.

3:18

And obviously, for these folks... like, nasty

3:22

work, yes, yes. And

3:24

I'm deeply grateful. Well, we'll have source links to

3:26

everything in here. I note when I'm kind

3:28

of like pulling something from something directly,

3:31

but like, I'm very grateful to the maniacs

3:34

who put together these like documents that have

3:36

helped me piece together what's happening. Because really,

3:39

if you're coming in as an outsider, if you weren't

3:41

like embedded in this community while all this

3:44

crazy shit was going on, it's

3:46

a little it's kind of impossible to like

3:49

get everything you need to get. You have to

3:51

refer to these interior sources. It's

3:54

just the only way to actually understand

3:56

stuff.

3:57

Oh yeah, I as an outsider,

4:00

I don't know what's going on.

4:02

I don't

4:04

know where it's going for sure. I

4:06

don't know where it's going.

4:07

It's going... we know where it ends. You know where

4:09

it ends. A member of Congress shows

4:12

up at the library in Vermont that

4:14

the US and Canada share because a

4:16

border patrol agent was murdered there and threatens

4:18

to take over Canada, and that's all.

4:21

Like, there's a degree to which you can kind of

4:23

tie heightened tensions between the US and Canada

4:26

to the murder of this border patrol agent, which itself

4:28

is directly tied to the fact that Eliezer Yudkowsky

4:31

wrote a piece of Harry Potter fan fiction.

4:33

I love it all goes back to that.

4:35

Yes, yes, it all comes back to bad

4:38

Harry Potter fan fiction. So

4:46

part three we spent last episode

4:48

talking about Ziz moving to the

4:50

Bay and their first interactions with

4:52

the rationalist community. That

4:54

big CFAR conference

4:56

they went to that was very reminiscent, had a

4:59

lot of exercises reminiscent of like Synanon

5:01

shit.

5:01

Right, very very a

5:04

lot of talking murder, Yes,

5:06

a lot of talk of murder.

5:07

These people love theorizing about when

5:09

it's okay to kill people. Constant

5:13

factor in all of this, which

5:15

is... can't be a step in a good direction.

5:18

Yeah, you know, you should.

5:20

You should be aware of. There's

5:22

like, if your community is talking about like

5:25

the ethics of escalating

5:27

to murder in random arguments

5:30

too much, maybe be a little worried.

5:32

If someone sits down next to you and says,

5:34

how would you murder me? Or whatever? This right?

5:37

You always got to get out of that room.

5:39

Yeah you want to, You want to, You want to leave immediately.

5:44

And even more if they're like, yeah, that's the right

5:46

way, even worse sign and

5:48

then.

5:48

If they're like, yeah, would you would you perform

5:51

necrophilia in order to, in the past,

5:53

scare people away from attacking you, like,

5:56

get out of that room. Leave. Bad.

6:00

This is not a crew you want to be a part of. Yeah,

6:02

maybe just take up pickleball. Pickleball.

6:05

People never talk about necrophilia

6:07

playing pickleball.

6:08

I don't think one time. I don't think one

6:11

time.

6:12

No, they all talk about how they're getting

6:14

knee replacements, and that's the beauty

6:16

of pickleball exactly. So,

6:22

in spite of how obviously bad this

6:24

community is, Ziz desperately

6:26

wants to be in the center of the rationalist

6:28

subculture, and that means being in

6:30

the Bay. Unfortunately, the Bay

6:33

is a nearly impossible place to survive in if

6:35

you don't have shitloads of money, and

6:37

one of the only ways to make it in the Bay

6:39

if you're not rich is to wind up

6:41

in deeply abusive and illegal rental situations.

6:45

You know this, David, I'm I'm

6:47

not spreading any news to you.

6:49

Shout out to my landlord, mister lou.

6:54

So.

6:54

Ziz winds up in a horrible sublet

6:56

with a person she describes as an abusive alcoholic.

6:59

I wasn't there. I don't know if she was

7:01

the problem. In this part like, I obviously

7:03

I've got one side of this story, but her claim is

7:05

that it ends in physical violence. Ziz

7:08

claims he was to blame, but she also describes

7:10

a situation where they're like, after a big argument,

7:12

bump into each other and he calls the

7:14

cops on her for assault. I wouldn't put

7:16

it past Ziz to be leaving some parts out of this.

7:18

But also I know a bunch of people who

7:21

wound up in horrible sublets with abusive

7:23

alcoholics who assaulted

7:26

them in the Bay Area.

7:27

And then La.

7:29

Craigslist is a crap shoot.

7:31

Craigslist is a crap shoot. Yeah, every

7:33

time I always I feel

7:35

like they need to like qualify with like this is just

7:37

Ziz's account. But also this sounds like a

7:39

lot of stories I know people have had.

7:41

Yeah, it's tough to get by there.

7:44

Yeah, so she calls

7:46

the or he calls the cops

7:48

on her, and then yeah, they do nothing,

7:51

and he attacks her in her bedroom

7:53

that night. So she decides to like he's like throwing

7:55

a chair at her and shit, So she decides, I

7:57

got to get out of this terrible fucking sublet. And

8:00

unfortunately, her next best option,

8:02

a very common thing in the rationalist community

8:05

is to have whole houses rented out

8:07

that you fill with rationalists who don't have a lot

8:09

of money.

8:11

Kind of

8:13

like artist or like content producer houses,

8:16

it never explodes. People never

8:18

have horrible times in these. This

8:22

particular rationalist house is called Liminal

8:25

because you know, gen Z loves talking about their

8:27

liminal spaces on the Internet.

8:30

One resident of the house reacts very

8:32

negatively when Ziz identifies herself

8:34

as a non transitioning trans woman and

8:36

basically asks like, when are you going to leave? So

8:38

she has, you know, she says that as

8:41

soon as she arrives, one of the other residents is

8:43

transphobic, so she can't stay there very long. Again,

8:46

all sounds like a very familiar Bay Area

8:48

housing situation story. She bounces

8:51

around some short term solutions airbnbs,

8:54

moving constantly while trying to find work.

8:56

She gets an interview with Google, but the hiring process

8:59

there is slow. There's a lot of different stages

9:01

to it, and it doesn't offer immediate relief

9:03

from her financial issues. Other potential

9:05

offers fall through as she conflicts with the fundamental

9:08

snake-oiliness of this era of Silicon

9:10

Valley development. Ziz blames this on

9:12

the fact that she couldn't feign enthusiasm

9:14

for companies she didn't believe in. Quote, I

9:16

was inexperienced with convincing

9:18

body-language-inclusive lies like this. I

9:20

did not have the right false face, but very

9:23

quick to think up words to say so, like I'm

9:25

not good enough at lying that I'm excited about

9:27

working for an app to you know,

9:29

help you do your laundry better, which

9:32

is like a third of the bay.

9:33

Yeah, right, yeah, And once again she

9:35

has like flashes of like, oh,

9:38

wow, you really you really have strong morals

9:40

and all. I mean, yeah, she's got

9:42

a strong resume, right? Didn't she

9:44

win like an award as a NASA

9:46

intern, Right, She's still yeah, she's

9:49

she really is good at a lot of this stuff.

9:51

In all of these Zizians, as silly

9:54

as their their beliefs about

9:56

philosophy and like cognitive science are,

9:58

they're all extremely accomplished in their fields.

10:01

Nearly. It's

10:03

a it's good evidence of the fact that like it's

10:05

always a mistake to think of intelligence as

10:07

like an absolute characteristic,

10:10

like I am a genius software engineer,

10:12

therefore I am smart. It's like no, no, no, you

10:14

you're you're you're you're dumb at plenty of

10:16

things, mister software engineer.

10:18

Don't sell yourself short. Sure.

10:20

Yeah,

10:22

So she does start to transition

10:25

during this period of time. She goes on finasteride,

10:27

which helps to avoid male pattern baldness,

10:30

and she starts experimenting with estrogen and anti

10:32

androgens. She'd wanted to avoid this for

10:35

I'm sure she had a variety of reasons, but

10:37

as soon as she starts taking hormones they have

10:39

such a positive effect. She describes it as a hard

10:41

to describe felt sense of cognitive benefits,

10:44

and she decides to stay on them. By

10:47

October, she'd committed to start writing

10:49

a blog about her own feelings and theories on

10:51

rationalism, and her model here was

10:53

Yudkowsky. She names this blog

10:55

Sinceriously, and it was her attempt to convince

10:58

other rationalists to adopt her belief

11:00

about like veganism and such.

11:02

Her first articles are like pretty bland. It's

11:05

scattered concepts and thought experiments,

11:07

very basic stuff like can God create

11:09

a rock so big God couldn't move it? And

11:11

then like throwing a rationalist spin on that.

11:13

So it's you know a lot of this is like, oh,

11:15

maybe in an era in which college didn't cost two hundred

11:17

grand, you could have just gotten a philosophy

11:20

degree, and right, that would have made you happy.

11:22

Like, right, you

11:24

just wanted to spend a couple of years talking through

11:27

silly ideas based on dead

11:29

Greek guys.

11:30

Well you know the Bay is the place to do that.

11:32

Yeah. Well, unfortunately, so she

11:35

starts to really show an interest early on though, and this is

11:37

where things get unsettling: enforcement

11:39

mechanisms, which are methods by which individuals

11:41

can like blackmail themselves into

11:43

accomplishing difficult tasks for personal

11:46

betterment. She writes about an app called

11:48

Beeminder, which lets you set goals

11:50

and punish yourself with a financial penalty

11:52

if you don't make regular progress. And

11:54

she's really obsessed with just the concept of

11:57

using enforcement mechanisms to make

11:59

people better, writing often

12:01

you have to break things to make them better. So

12:03

not a great path to go down here.

12:06

Is she following this herself? Like, is she

12:09

working on... She's trying to use some of these

12:11

tactics on herself to deal

12:13

with like what she sees as her flaws that're

12:15

stopping her from you know, saving the cosmos.

12:20

Great stuff. A lot of pressure to put

12:22

on yourself.

12:22

Yeah, this poor woman has

12:25

been under the highest stakes this whole time.

12:27

Well, and that's again that that comes.

12:29

That's not Ziz, that's the entire rationalist

12:32

subculture. The stakes are immediately, we

12:34

have to save the world from the evil

12:36

AI that will create hell to punish

12:38

everybody who doesn't build it, and

12:41

that actually, we'll talk about this later. That breaks a

12:43

ton of people in this She is not the only one

12:45

kind of fracturing her psyche

12:48

in this community. So right

12:50

around this time, she's bouncing around short

12:52

term rentals and like desperately trying to get

12:54

work. She meets a person named

12:56

Jasper Wynn, who at that point identified

12:58

as a trans woman. They now go by Gwynn

13:00

Danielson and use they/them pronouns.

13:03

That's how I'm gonna refer to them. But for clarity's

13:05

sake, I'm gonna call them Gwynn or Danielson, even

13:07

though they went by a different name at this time, because

13:09

that's what they're called now. Gwynn

13:11

was a fan of Ziz's blog and had some complex

13:14

rationalist theories of their own. They

13:16

came to believe that each person had multiple

13:18

personalities stored inside their brain,

13:21

a sort of like mutation of the left brain

13:23

right brain hypothesis, and each of these

13:25

sides of your brain was like a whole, like

13:28

intact person, right, Like

13:31

great, yeah, no, cool,

13:33

No, you guys are gonna be fucking with your heads

13:36

real hard. Great.

13:37

Oh

13:39

yeah.

13:40

So Ziz falls in love with Gwynn's ideas

13:42

and she starts bringing them up at rationalist events,

13:45

trying to brute force them into going mainstream among

13:47

the community. But people are like, this is a little

13:49

weird even for us, and she does

13:51

not succeed in this, and as a result,

13:53

she and Danielson and a couple of other friends

13:56

start like talking and theorizing together

13:58

separately from the bulk of the community. So

14:00

now again you've had this. They're starting to calve

14:03

off from the broader subculture, and they're

14:05

starting to like really

14:07

like dig ruts for themselves in a specific direction

14:09

that's leading away from the rest of the rationalists.

14:11

Literally, all that cult stuff.

14:13

Huh, all that cult stuff, all that

14:15

cult stuff. Now, Gwynn

14:18

and Ziz largely like bonded

14:21

over their struggle paying Bay Area rents,

14:23

and together they stumbled upon a solution beloved

14:26

by generations of punks and artists in northern

14:28

California taking to the sea

14:31

specifically. It's

14:33

great, it's great. I

14:36

mean, I've known like three separate people

14:38

who lived on boats in the Oakland Harbor

14:40

because it was like, this is the only way I can afford

14:42

to live in the Bay.

14:44

My little brother went to school

14:46

right right outside of San Francisco,

14:48

and his principal lived on a boat right

14:50

just like a mile away from

14:52

the school, and everybody loved it.

14:54

Yeah, yeah, everybody loved it. I mean, I gotta say,

14:56

everyone I know who lived on a boat lived on a

14:58

shitty boat. But I'm also not

15:00

convinced there are any boats that stay

15:02

nice for very long.

15:04

Yeah, it

15:07

feels like it would be dank, I guess is the word.

15:10

Dank is a good description of boat

15:12

life, I think.

15:13

Yeah.

15:16

So Gwynn's boat was anchored off the Encinal

15:19

Basin, and Ziz found this a pretty sweet

15:21

solution. She goes over to stay over

15:24

one night and while they're like hanging out, staying

15:26

up, probably taking drugs,

15:28

they don't like usually write about it, but

15:31

from like other community conversations, I think

15:33

we have to assume an awful lot of the time

15:35

when these people are staying up all night and talking,

15:37

there's a lot of like ketamine and stuff being

15:39

used that isn't written into

15:41

the narrative.

15:42

That also goes along with it,

15:45

also.

15:45

Goes along with the Bay Area. Pills

15:47

and powders are big, yeah. Quote,

15:50

They talked about how when they were a child, their friend

15:52

who was a cat had died and they had, to use

15:54

their own retroactive paraphrasing, sworn

15:56

an oath of vengeance against death. Fucking.

16:01

These are just people doing great, very

16:04

healthy. It's like

16:06

the opposite of what you want a kid to learn when their

16:08

pet dies. Is like, yeah, you know, death

16:11

is inevitable. It happens to everything. You know, it'll

16:13

happen to you one day, and it's sad, but just

16:15

something we have to accept. No, no, no: war on

16:17

death.

16:18

No. They were like, no, no, no, I

16:20

can fix this.

16:21

Okay, I as a parent have failed in

16:23

this situation. This was

16:25

an unsuccessful step

16:28

in my child's development. Maybe no more

16:30

pets for a while, Maybe no more pets.

16:35

Gwynn also spent way too much time

16:37

online, which is how they wound up reading hundreds

16:39

of theoretical articles about how AGI

16:42

artificial general intelligence would destroy the world.

16:44

And again, AGI is like a mainstream

16:46

term now because fucking Chat

16:49

GPT came out a couple of years ago and everyone started

16:51

talking about it. At this point, twenty sixteen,

16:53

twenty seventeen, it's only like people

16:55

who are really into the industry in

16:57

a nerdy way who are using that frame, Like

17:00

regular people on the street don't know what you fucking mean when you're

17:02

talking about this stuff, but this is a term that is in use

17:04

among them. And like Ziz, Gwyn

17:06

moved to the Bay Area to get involved in fixing the problem.

17:09

They were an otherkin. Are you familiar

17:11

with this online community? Which

17:14

one? Otherkin?

17:15

Otherkin? No, I have no, I've never heard

17:17

of that.

17:18

It's like a it's like the Mormonism

17:20

of furrydom, almost.

17:22

That that's that's the same you

17:24

say.

17:25

I don't mean like it's harmless,

17:27

right, Like these are people who there's

17:30

a mix of beliefs. Some of them like literally believe

17:32

they're like fantasy creatures. Some of them just

17:34

like yeah.

17:34

To be like yeah, like half identify

17:37

as like a non

17:40

human creature.

17:41

Right, oh, like their furry persona is their

17:43

true.

17:44

Yeah, yeah, kind of. That's close enough for

17:46

government work. And in Gwynn's case, it's

17:48

even different where I don't think they believe they are

17:50

literally a dragon, but they

17:52

believe that when there's a singularity and the robot

17:55

god creates heaven, they'll be given the body of

17:57

a dragon because the robot god will be able to

17:59

do that. If it's a good singularity,

18:01

at least. That's why this is all so important

18:03

to them, making sure it's like a nice AI so

18:06

they'll be able to get their animal friends back and get

18:08

their dragon body. 'Tis an old time

18:10

tale, as old as time. Again, a

18:13

lot of this could be avoided by just like processing

18:16

death and uh, stuff

18:19

like that a little better. But we

18:21

don't do that very well in our society anyway.

18:23

We've got a lot of people who are committed to denying

18:25

that. So I'm not surprised

18:28

like this happens at like the corners right Like this

18:30

is this is just a little downstream

18:32

from that Bryan Johnson guy tracking his erections

18:34

at night and trying to get the penis of a nineteen year

18:36

old.

18:37

Yeah, like.

18:40

Not a massive sanity gap between these

18:42

two things.

18:43

It's I think I think it's I think

18:45

we're drinking from the same well.

18:47

Yeah, yeah, so

18:49

as a result of this, Ziz

18:51

commits herself to turning Gwynn to the dark

18:54

side, which is a term she started to use. Obviously,

18:56

it's a Star Wars term and it comes

18:58

out as a result of her obsession with what's called akrasia.

19:01

Akrasia is an actual Greek term

19:03

for a lack of will power that leads someone

19:06

to act in ways that take them further from their goals

19:08

in life. It's an actual, like I

19:10

think akrasia was often like an early

19:12

term for like what we call ADHD, right, like

19:14

people who have difficulty, like focusing on tasks

19:17

that they need to complete. One of the promises

19:19

of rationalism was to arm a person with tools

19:21

to escape this state of being and act

19:23

more powerfully and effectively in the world.

19:26

Ziz adds to this some ideas cribbed from

19:28

Star Wars. She decides that the quote unquote way

19:30

of the Jedi, which is like accepting moral restrictions

19:33

you know about like not murdering people and the like, is

19:36

a prison for someone who's

19:39

like truly great and has the opportunity to accomplish

19:41

important goals. Right if you're

19:43

that kind of person. You can't afford to be limited

19:45

by moral beliefs. So in

19:48

order to achieve the kind of vegan singularity

19:50

that she thinks is critical to save the cosmos,

19:53

she and her fellow rationalists need to free

19:55

themselves from the restrictions

19:57

of the Jedi and become vegan Sith. That's

20:02

more or less where

20:05

things are going here. So here

20:08

I should note that while Gwynn and Ziz are spinning

20:10

out on their own, everything that you're

20:12

seeing from them, these feelings of grandiosity

20:14

and cosmic significance, but also paranoid

20:16

obsession, are the norm in rationalist

20:19

and effective altruist circles. There's a

20:21

great article in Bloomberg News by Ellen

20:23

Huet. It discusses how many in the EA

20:25

set would suffer paralyzing panic attacks

20:28

over things like spending money on a nice dinner

20:30

or buying ice cream, obsessing over how

20:32

many people they'd killed by not better optimizing

20:34

their expenses, and quote in

20:37

extreme pockets of the rationality community,

20:39

AI researchers believe their apocalypse related

20:41

stress was contributing to psychotic breaks.

20:44

A MIRI employee, and that's one of these organizations

20:46

created by the people around

20:49

Yudkowsky, Jessica Taylor, had a job

20:51

that sometimes involved imagining extreme

20:53

AI torture scenarios. As she

20:55

described it in a post on Less Wrong, the

20:57

worst possible suffering AI might be able to inflict on

21:00

people. At work, she says, she and a small

21:02

team of researchers believed we might make God,

21:04

but we might mess up and destroy everything.

21:07

In twenty seventeen, she was hospitalized

21:09

for three weeks with delusions that she was intrinsically

21:12

evil and had destroyed significant

21:14

parts of the world with my demonic powers,

21:16

she wrote in her post. Although she acknowledged

21:19

taking psychedelics for therapeutic reasons,

21:21

she also attributed the delusions to her job's

21:24

blurring of nightmare scenarios and real life.

21:26

In an ordinary patient, having fantasies about

21:28

being the devil is considered megalomania,

21:30

she wrote. Here, the idea naturally followed

21:32

from my day to day social environment and was central

21:34

to my psychotic breakdown. Oh

21:38

man, just taking ketamine

21:40

and convincing yourself you're the devil, normal

21:43

rationalist stuff.

21:44

Yeah, I mean, hey, we've all been there,

21:46

right.

21:46

We've been there now.

21:49

In fact, no,

21:53

this is the least relatable group of people i've

21:55

ever heard of.

21:55

No, no, exactly, because there it's this like

21:58

grandiosity, it's this absolute need

22:00

to whatever else is going on, even if you're

22:03

like the bad guy, feel like what you're

22:05

doing is like of central cosmic significance.

22:07

It's this fundamental fear that

22:09

is integral to all of these tech guys.

22:11

It's at the core of Elon Musk too, that like,

22:14

one of these days you're not going to exist and

22:17

very few of the things that you valued in your life

22:19

are going to exist, and there's still going

22:21

to be a world because that's life.

22:24

That's just yeah, that It's so crazy

22:26

how it boils down to just like yeah,

22:28

man, well I don't know what you thought was going

22:30

to happen.

22:31

Yeah, bro, sorry, Yeah, that's just

22:33

how that's just how it goes. You know, We've got

22:35

like ten thousand years of like philosophy and

22:37

like like thinking and writing

22:39

on the subject of dealing with this, But you

22:42

didn't take any humanities and your STEM classes.

22:44

So no, you know that you're just

22:46

trying to bootstrap it.

22:48

Yeah, you just watched Star

22:50

Wars again and decided you got to figure it out.

22:53

Yeah, you watch Star Wars one hundred and thirty

22:55

seven times and figured that was going to

22:57

replace reading a little bit of fucking Plato or

22:59

something. Maybe

23:01

it didn't work. Also, again, the

23:03

ketamine's not helping.

23:05

No, no, no, no, God

23:08

to be a fly on that wall.

23:10

Oh god. Yeah, the rationalist

23:12

therapists are raking it in, oh.

23:14

Man, honestly well deserved.

23:16

But yeah, some talk

23:18

about info hazards.

23:20

Jesus.

23:23

So I have to emphasize here again that

23:25

I want to keep going back to the broader rationalist community

23:27

because I felt like a risk of this is that I would

23:30

just be talking about how crazy this one lady and

23:32

her friends were, and it's like, no, no, no, Everything

23:34

they're doing, even the stuff that has

23:36

split off and different and like more extreme

23:39

than mainstream rationalism, is directly

23:41

related to shit going on in the mainstream rationalist

23:43

community, which is deeply tied into big

23:46

tech, which is deeply tied into like the Peter Thiel

23:48

circle. A lot of these folks are close to in

23:50

and around the government right now, right? So, like,

23:52

Ziz is not nearly

23:54

as much of an outlier as a lot of rationalists

23:57

want people to think. Right, Yeah,

24:00

anyway, at rationalist meetups, Ziz

24:02

began pushing this whole vegan Sith

24:04

thing hard and again meets with little success,

24:06

but she and Gwyn gradually start

24:08

to expand the circle of people around them. Meanwhile,

24:11

in her professional life, that Google interview process

24:13

moves forward. Ziz says that she passed

24:16

every stage of the process, but that it kept

24:18

getting dragged out, forcing her to ask her

24:20

parents for more help. In November, around

24:22

the time her blog started to get a following, she

24:24

says Google said she'd passed the committee

24:27

and would be hired once she got picked for a team.

24:29

Now I don't know what happens after this, she

24:32

says. Google asks for proof of address, which she

24:34

doesn't have. She's just turned

24:36

twenty six, and she's not on her parents' health insurance

24:38

either. She spends pages describing

24:41

what is a very familiar nightmare scenario to me of

24:43

like trying to get proof of address so you can get

24:45

a job and continue life,

24:47

like, you know, get on Medi-Cal and stuff.

24:49

And I do think it's probably worth acknowledging

24:52

that, Like, as her brain is starting to break

24:54

and she's she's getting further

24:56

and further into all these delusional ideas, She's

24:58

also struggling with being off of her parents'

25:00

health insurance and like trying to find stable

25:02

housing in the bay and like that

25:05

influences the situation.

25:07

And still in the process of transitioning, right.

25:09

Yes, yes, exactly, And still in the process

25:11

of transitioning. Yes, a heavy workload,

25:14

you're doing too much to your brain, right, yes,

25:19

so, And then she makes the worst possible decision,

25:21

which is to live with her friend Gwynn in

25:23

their tiny sailboat,

25:26

which is now anchored by the Berkeley Marina.

25:28

Again, this is not like a houseboat.

25:31

This is like a sailboat with one small

25:33

room.

25:34

Right, it's got a court like

25:36

yeah, there's like a bed

25:39

table in.

25:39

A sink, right, like a little bathroom

25:41

probably maybe a kitchenette. But it's not like

25:44

livable for two people.

25:46

Somebody who's like ever lived in a

25:48

small space with your roommate

25:51

knows just like, no matter

25:53

where you're at, it's horrible,

25:55

bad idea.

25:56

And imagine if

25:58

that shitty tiny apartment that you remember

26:00

from your past was a boat

26:07

just disastrous and

26:10

this is not a good situation. Ziz would later write,

26:13

I couldn't use my computer as well. I couldn't

26:15

set up my three monitors, there was no room, couldn't

26:17

have a programming flow state. For nine hours, I

26:19

had trouble sleeping. The slightest noise in

26:21

my mind kept alerting me to the possibility that someone

26:23

like my roommate from several months ago, was going to

26:25

attack me in my sleep. So this

26:28

is not a healthy situation. And both

26:30

Gwynn and Ziz have endured some specific

26:32

traumas, and both are also prone to

26:34

flights of grandiosity and delusion. And now

26:36

they are trapped all day, every

26:39

day together in a single room where

26:41

their various neuroses are clashing with each

26:43

other and their only relief is talking for

26:45

hours about how to save the world.

26:47

Oh my god, this

26:52

is a it's a real villain story.

26:54

You couldn't get any worse than this.

26:56

It couldn't. And it's like, at this point,

26:59

I don't think either of them is like intentionally

27:01

doing anything bad. You've just

27:03

you've kind of created a cult where

27:06

like you're trading off on being the cult leader

27:08

and cult member for each other, Like you've isolated

27:10

each other away from the world, and you're

27:12

spending time brainwashing each other together

27:15

in your little boats. Yeah, how often

27:17

do you think they were leaving that boat? Not

27:19

nearly often enough. And Gwynn

27:21

is on what Ziz describes as a cocktail

27:23

of stimulants, and has, quote, mapped out the cognitive

27:26

effects of each hour they were on them.

27:29

They get very angry if Ziz interrupts

27:32

their thoughts at the wrong time. And also

27:35

like Ziz isn't really sleeping, so

27:37

they're just talking for hours and getting

27:39

on each other's nerves at the same time. But also

27:42

like building these increasingly

27:44

elaborate fantasies about how they're

27:46

going to save the cosmos and it's

27:48

you know, it's not great. Through these

27:50

conversations they do develop Gwinn's multiple

27:53

personalities theory, mixing in some of

27:55

Ziz's own beliefs about good and evil.

27:57

And I want to quote another passage from that Wired article

27:59

that summarizes what they come to believe

28:02

about this. A person's core consisted

28:04

of two hemispheres, each one intrinsically

28:06

good or non good. In extremely

28:09

rare cases, they could be double good, a condition

28:11

that LaSota

28:13

identified in herself. And Ziz

28:15

is consistently going to identify herself as

28:17

intrinsically good, so both sides

28:20

of her personality are only good. But

28:22

most people are at best single

28:24

good, which means part of them is non good

28:27

or basically evil, and they're at war with

28:29

this other half of their brain. That's a whole

28:31

person that's evil, which

28:33

is why the other people can't be trusted to make decisions.

28:36

You know, like, increasingly, Ziz's attitude is

28:38

going to be like, only intrinsically good

28:40

people can be trusted to make good decisions,

28:42

only the double goods, only the double

28:44

goods. That's such like a you know,

28:46

you're making your own life or well speech. This

28:49

this is a bad sign. Yeah, so,

28:55

Ziz's Google ambitions fall apart at

28:57

this time. They don't really give us a good explanation

28:59

as to why. I kind of think they started bombarding

29:01

their contact with Google with like requests

29:04

about why the process wasn't going faster, and maybe

29:06

Google was like, ah, maybe we don't need this person.

29:10

Ziz concludes failing at Google was good

29:12

because she'd gotten ten thousand

29:14

dollars from unemployment at this point. Quote

29:16

this means I had some time. If they hired

29:19

me soon, it would deprive me of at least several months

29:21

of freedom. In which, of course, she is

29:23

continuing to work out her theories with

29:25

Gwynn on the sailboat. Also,

29:27

if that's freedom, it's really

29:29

not freedom.

29:30

I maybe maybe work. I heard

29:32

the Google campus has a lot of things

29:34

to do, and.

29:36

It's kind of the what if. I think

29:38

maybe at this point she still could have pulled out of

29:40

this tailspin if she'd gotten a job

29:42

and worked around other people

29:44

and socialized not on the sailboat,

29:47

but also a real consistent thing with Ziz

29:49

is at this point she has no willingness

29:52

to do the kind of compromise. And I'm not just talking

29:54

about the moral compromise, but like, even going to

29:56

work a job for a company, you're

29:59

going to spend a large part of your day doing a thing that

30:01

like you wouldn't be doing otherwise, right,

30:03

because that's what a job is, generally.

30:06

That's just work. And Ziz

30:08

feels like she can't handle

30:10

the idea of doing anything but reading fan

30:12

fiction and theorizing about how to give herself

30:14

superpowers. Right, that's the most important thing in

30:16

the world because the stakes are so high,

30:19

So she like like ethically can't

30:21

square herself with doing anything she needs

30:23

to succeed in this industry, where she has

30:25

the skill to succeed. And

30:28

this is this is another trait she's got in common

30:30

with the rest of the rationalist EA subculture.

30:33

That Bloomberg article

30:36

interviewed a guy named Qiaochu Yuan,

30:39

a former rationalist and PhD candidate

30:41

who dropped out of his PhD program

30:43

in order to work in AI risk. He

30:46

stopped saving for retirement and cut off

30:48

his friends so he could donate

30:50

all of his money to you know, EA causes

30:52

and because his friends were distracting him from saving

30:54

the world. And these are all this all cult stuff,

30:56

right. Cults want you to cut off from your friends, they

30:58

want you to give them all your money. He's doing it,

31:01

but he's doing it like independently,

31:04

Like there's not like a single leader.

31:06

He's not like living on a compound with them.

31:09

It's just once you kind of take these

31:11

beliefs seriously, the things

31:13

that you that you will do to yourself are

31:15

the things people in cults have

31:18

done to them.

31:18

Right.

31:19

In an interview with Business Insider, Yuan said,

31:21

you can really manipulate people, and you're doing all kinds

31:23

of crazy stuff. If you can convince them, this is how

31:26

you can prevent the end of the world. Once you

31:28

get into that frame, it really distorts your ability

31:30

to care about anything else.

31:34

Man.

31:34

That's yeah, that's

31:36

kind of a thing. It's harder to talk about

31:38

this than, like, people talk

31:40

about Ziz as like, oh, it's a cult leader and

31:43

she had her you know, Vegan trans Ai

31:45

death cult or something, and you

31:48

know, I feel like that's not

31:50

close enough to the truth to get what's like

31:52

to get how this happened, right, because what

31:54

happens with Ziz is

31:57

very cultish. But Ziz

31:59

is one of a number of different people who

32:01

have calved off of the rationalism community and

32:03

had disastrous impacts. But it

32:05

happens constantly with these people

32:07

because like it's such.

32:09

An engine for it.

32:10

Yes, it's an engine for making cults.

32:12

It's it's this is a cult factory

32:15

for sure.

32:16

Yeah, you're creating a cult factory.

32:18

Oh no, it gives you the base

32:20

ideas and then you can just kind of franchise it

32:22

how you'd like.

32:23

Yeah, And a lot of prominent

32:26

rationalists who knew Ziz at the time have since

32:28

gone out of their way to describe her as like, you know,

32:30

someone on the fringes. Anna Salamon

32:33

of CFAR described her as a young person

32:35

who was hanging around and who I suspect

32:37

wanted to be important. And

32:40

to Anna's claim: is there anyone here who doesn't

32:42

want that? Within this community? No,

32:45

that's all of them, right, that's the whole community.

32:49

And like Anna was emailing directly,

32:51

gave Ziz, like, some of the advice that Ziz

32:54

considered like key to her moving to the Bay

32:56

Area and stuff. Right, like, these

32:58

people, like, the rationalists really really

33:00

want you to think that this was just like some fringe

33:03

person. But she's very much tied in to

33:05

all of this stuff, right, So for

33:07

her part, Ziz doesn't deny that failing

33:10

to convince other rationalists was part of why

33:12

she pulled away from mainstream rationalism.

33:14

But she's also going to claim that a big reason for her

33:16

break is sexual abuse among people

33:18

leading the rationalist community. And

33:21

there's a specific case that she'll cite later that

33:23

doesn't happen until twenty eighteen. But

33:25

this is a problem people were discussing in twenty

33:27

seventeen when she's living on that boat. The

33:30

representative story is the case of Sonia

33:32

Joseph, who was the basis of that Bloomberg news

33:34

piece. I've quoted from a couple of times, and

33:37

it's a bummer of a story. Sonia

33:39

was fourteen when she first read Yudkowsky's

33:41

Harry Potter and the Methods of Rationality, which

33:44

set her on the path that led her to moving

33:46

to the Bay Area in order to get

33:48

involved in the rationalist EA set. And

33:50

she's focused on the field of AI risk. And

33:52

I'm going to read you a quote.

33:54

This week has been so long that

33:57

I completely erased the

33:59

Harry Potter part of this story from my brain.

34:01

It never drops too

34:04

far below the surface. I cannot overemphasize

34:06

how important this Harry Potter fan fiction is

34:08

to all these murders. I love when primary

34:11

texts are getting abused. Yes, yes,

34:14

it's a primary text of the movement.

34:16

Wow, I'm going to read a quote from that Bloomberg

34:19

article. Sonia was encouraged when she

34:21

was twenty two to have dinner with a fortyish

34:23

startup founder in the rationalist sphere

34:25

because he had a close connection to Peter

34:27

Thiel. At dinner, the man bragged that Yudkowsky

34:30

had modeled a core Harry

34:32

Potter professor in that fanfic

34:35

on him, Joseph says. He also

34:37

argued that it was normal for a twelve year old girl to

34:39

have sexual relationships with adult men, and

34:41

that such relationships were a noble way of transferring

34:44

knowledge to a younger generation. Then,

34:46

she says he followed her home and insisted

34:48

on staying over. She says he slept on

34:50

the floor of her living room and that she felt unsafe

34:52

until he left in the morning. Jesus,

34:55

so great. You know, bragging about your Harry

34:57

Potter, how you helped inspire the Harry Potter

34:59

fanfic, and then explaining how twelve year old girls

35:01

should have sex with adult men. Good stuff,

35:04

got rational.

35:06

I gotta say, that's a crazy brag to get

35:09

chicks. Yeah, you

35:11

know it was.

35:12

You know, one of those characters.

35:14

I'm the snake.

35:15

Yeah, I'm the Snape of this. By the way, what do you

35:17

think about twelve year olds? Awesome?

35:20

I have a close connection to Peter Teal.

35:23

Yeah.

35:26

Cool, oh

35:29

man. As that Bloomberg article makes

35:31

clear, this is not an isolated issue within rationalism.

35:34

Quote, sexual harassment and abuse

35:36

are distressingly common. According to interviews

35:38

with eight women at all levels of the community, many

35:41

young ambitious women described a similar trajectory

35:43

they were initially drawn in by the ideas, then

35:45

became immersed in the social scene. Often

35:47

that meant attending parties at EA or

35:50

rationalist group houses, or getting added to jargon

35:52

filled Facebook messenger chat groups with

35:54

hundreds of like minded people. The eight women

35:56

say casual misogyny threaded through the scene

35:59

on the low end, a rationalist

36:01

adjacent writer says a prominent rationalist

36:03

once told her condescendingly that she was

36:05

a five year old in a hot twenty year old's

36:07

body. Relationships with much older

36:10

men were common, as was polyamory. Not that any of

36:12

that was inherently harmful, but several women say those

36:14

norms became tools to help influential older

36:16

men get more partners. And this

36:18

is also this isn't just rationalism, that is

36:20

the California ideology. That

36:23

is the Bay Area tech set, right.

36:24

Yeah, very techy.

36:27

Yes a man,

36:30

and it's all super fucking gross

36:33

the whole you're a five year old in a hot twenty

36:35

year old's body thing. What the fuck? Man?

36:41

How do you say that and not hurl yourself off

36:42

the San Francisco Bay Bridge?

36:46

Vile?

36:47

That's fucked up, dude, that's

36:49

bad. Speaking

36:51

of bad to the bone, our

36:54

sponsors. Ah,

37:00

we're back. So this is

37:02

important to understand in a series about

37:04

this very strange person and the strange beliefs

37:06

that she developed that influenced

37:08

several murders. Ziz

37:10

had many of the traits of a cult leader, but

37:13

again, she's also a victim first

37:15

of the cult dynamics inherent to rationalism.

37:17

And what she's doing next is she breaks

37:19

away with a small loyal group of friends, and

37:21

she does create a physical situation that much

37:23

more resembles the kind of cults we're used to dealing

37:26

with, particularly scientology, because

37:28

next she's going to take, Oh wow, me and Gwynn

37:30

living alone on this boat. We kind of hate each

37:32

other and neither of us is sleeping, and

37:35

our emotional health is terrible. But we've

37:37

made so much progress on our ideas.

37:40

Maybe we should Maybe we should make this a

37:42

bigger thing, right, Maybe we should get a

37:44

bunch of rationalists all living together

37:46

on boats.

37:50

She needs a work life balance.

37:52

Yeah, no, no, what she thinks she needs is

37:55

she calls it the Rationalist fleet, which

37:57

is she wants to get a bunch of community members to buy

37:59

several boats and live anchored in the bay

38:01

to avoid high bay area rent so they can spend

38:03

all their time talking and plotting out ideas

38:05

for saving the cosmos. Oh

38:08

man, so

38:11

great, and.

38:11

I get it right. It's expensive

38:13

here. I want to get some boats with my friends. It

38:16

does sound cool.

38:17

We won't go insane together, obviously,

38:20

you know. She buys

38:22

a twenty four foot boat for six hundred

38:24

dollars off of Craigslist. And I

38:28

don't know much about boats, but I know you're not getting

38:30

a good one for just six hundred dollars.

38:32

No.

38:33

No, like a livable

38:35

boat, like a full

38:37

boat, like.

38:38

A foot boat.

38:40

Yes, a full boat.

38:41

Oh man, that had to be a piece of shit,

38:44

to be a shitty, shitty, colossal

38:47

piece of shit. Yeah.

38:48

She names it the Black Cygnet, and she starts

38:51

trying to convince some of these people,

38:53

who have gathered around her to get in on the project.

38:56

Eventually, she, Danielson, and a third

38:58

person put together the money to buy a boat that's

39:00

going to be like the center of their fleet, a seventy

39:02

year old Navy tug boat named the Caleb,

39:04

which was anchored in Alaska.

39:07

This is like a

39:08

ninety four foot boat. It's a sizeable

39:10

boat, and it is also very

39:13

old and in terrible shape.

39:17

That's the crown jewel of the fleet,

39:20

right right, that's our flagship. Man,

39:24

Danielson and Ziz, they

39:26

buy this thing with this third guy, Dan Powell,

39:28

who's at least a navy veteran, so like, you

39:31

know, okay, that at least counts as

39:34

boat adjacent, but he's

39:36

I get the feeling, nobody says this, but Dan

39:40

Powell says that he put tens of thousands of dollars

39:42

into buying the Caleb. And I just know

39:44

from what Danielson and Ziz

39:46

wrote about their finances, neither of them had nearly

39:49

that much money. So I think, by far he

39:51

invests the most in this project. And

39:54

I don't want to insult the guy, but he says

39:56

he did it because he quote considered buying the boat

39:58

to be a good investment, which boats

40:01

aren't. Boats are never an

40:03

investment comically,

40:06

so like known to nothing

40:09

depreciates like fucking raw

40:12

salmon depreciates slower

40:14

than a boat. I

40:20

think his attitude is I'm going to become like the

40:22

slumlord of a bunch of or at least

40:24

landlord to a bunch of boat rationalists.

40:26

But, I mean, I

40:28

don't know how you expect this to pay

40:30

off seventy year old

40:33

tug boat for a bunch of like poor

40:35

rationalist punk kids to live

40:37

in. How was that ever supposed to

40:39

work? What's

40:42

the P and L statement you put together

40:44

here?

40:47

Oh?

40:49

What was the what was the timeline on him getting

40:51

his money back, he thought?

40:53

Oh god, I have no idea. He absolutely

40:56

takes a bath on this ship, right, yeah,

40:58

he claims, and I believe him,

41:00

that Ziz lied to him about the whole

41:02

scenario to get his money. I

41:05

do think this was essentially a con from her,

41:07

he says. Quote Ziz led me to believe that

41:09

she had established contacts in the bay and that it

41:11

would be easy for us to at least get a slip,

41:13

if not one that was approved for overnight use.

41:16

And as it turns out, when we were coming through the inside

41:18

passage from Alaska, it was revealed that we did not

41:20

have a place to arrive.

41:21

Wait, oh, I didn't realize

41:24

he sailed it down from Alaska.

41:26

Yeah, they all sail it together, them and a couple

41:28

other rationalists that they pick up. They make

41:31

a post on the internet being like,

41:33

hey, any rationalists want

41:35

to sail a boat down from Alaska,

41:38

talk about our ideas while we live on a boat.

41:41

Oh man, so these

41:44

people need space, yes, just

41:46

get a warehouse.

41:47

Yes, yeah, well,

41:50

the Ghost Ship fire had happened by that point, so

41:52

I don't think warehouse space was easy to get.

41:55

Yeah, but this, I

41:57

think this would have. I think you're right. In an earlier era,

41:59

they have just wound up living in like a warehouse

42:02

and maybe all died in the horrible fire because

42:05

there were issues with that kind of life too.

42:07

But it would have been an option besides the boat

42:09

thing. Anyway, the Caleb is not in

42:11

good shape.

42:12

Again.

42:12

This boat is seventy plus years old. It

42:14

is only livable by punk standards, and

42:17

while it was large enough, it is a ninety four foot

42:19

boat, you can keep some people on there, it's

42:21

also way too big to anchor in most

42:23

municipal marinas, especially

42:25

since the boat has three thousand gallons of incredibly

42:28

toxic diesel fuel and it's not really

42:30

seaworthy, which means there's this constant

42:32

risk of poisoning the water as it sits, and

42:35

the authorities are just going to be consistently

42:37

like, guys, you can't have this here, guys,

42:39

you simply can't have this here,

42:42

so they just.

42:42

Got to operate out and inter national

42:45

waters like a cruise ship.

42:46

No, they're just kind of illegally anchoring

42:48

places and hoping that it's fine and

42:50

periodically getting boarded over it. Another

42:53

crew member on the ride down from Alaska

42:56

who's just kind of there. They're just there,

42:58

you know, for the adventure. So they they leave

43:00

and don't come back after they get to the bay. But

43:03

this person expressed an opinion that Ziz consistently

43:06

came off as creepy but not scary.

43:09

At one point, he says that she confronted him

43:11

and told him he was transgender, and when he's

43:13

like, no, I'm really not, she told him

43:15

he was.

43:16

Yes.

43:17

She does this a lot, tells people I know that

43:19

you're this. And it works, like, that's

43:21

how a number of her followers get to her. But

43:24

she also it doesn't work a lot of time. A lot of

43:26

people are like, no, I'm not you know whatever

43:29

it is you're saying. She does this to Gwynn too, so

43:31

I don't doubt his story. Like she

43:34

just kind of decides things about people

43:36

and then tries to brute force them into accepting

43:39

that about themselves, and when there are people

43:41

who are like both desperate for like approval

43:43

and affection and also who are housing

43:45

insecure and need the boat or

43:47

wherever to live with her, those people,

43:50

a lot of them, feel like a significant

43:52

pull to just kind of accept whatever

43:54

Ziz is saying about them.

43:56

Yeah. I mean, when you're desperate in that way, you

43:58

kind of definitely find yourself agreeing to things to

44:01

have a roof over your head, like.

44:03

Right, Yeah, And it's a very normal cult

44:05

thing, right, Like this is an aspect

44:07

of all of that kind of behavior. Now,

44:09

by this point, a few other people have come

44:11

to live in the Rationalist fleet. One of them is Emma

44:14

Borhanian, a former Google engineer,

44:16

and Alex Leatham, a budding mathematician.

44:19

The flotilla became a sort of marooned

44:21

aquatic salon. Wired quotes

44:23

Ziz as emailing to a friend at the time, We've

44:26

been somewhat isolated from the rationalist community

44:28

for a while, and in the course developed a significant

44:31

chunk of unique art of rationality and theories

44:33

of psychology aimed at solving our

44:35

problems. Excited

44:38

for this psychology you built on the boat,

44:40

yeah, Wired continues as LaSota

44:42

articulated, their goals had moved beyond real

44:45

estate into a more grandiose realm. We

44:47

are trying to build a cabal, she wrote.

44:49

The aim was to find abnormally intrinsically

44:51

good people and turn them all into Gervais

44:54

sociopaths, creating a fundamentally different

44:56

type of group than I have heard of existing before

44:59

Sociopathy, she wrote, would allow the

45:01

group members to operate un-pwned by

45:03

the external world.

45:05

Yeah that is because you had said that before,

45:07

right, Yeah, they had been that's sort of what

45:09

they're looking to be.

45:10

Yeah, they're obsessed with this idea of which is initially

45:12

like kind of a joke about the office,

45:14

but they're like, no, no, no, it actually is really good

45:17

to have this sociopath at the top who like

45:19

moves and manipulates these like lesser

45:21

like fools and whatnot and puts

45:23

them into positions below them. Like that's

45:26

how we need what we need to be in order to gain

45:28

control of the levers of power. We

45:31

have to make ourselves into Ricky Gervais

45:33

sociopaths. Yeah, great,

45:37

what a good ideology.

45:39

I love that they still love pop culture

45:41

though you know.

45:42

They're obsessed with it. And again this is you

45:44

can't talk about this kind of shit if you're if

45:46

you're regularly having conversations with people

45:49

outside of your bubble, like.

45:50

Exactly it's the thing. Yeah, if you have somewhere

45:52

to go, yeah, if you have anywhere to

45:54

go this campaign.

45:55

Yes, yes. If you've got a friend who's like a nurse

45:58

or a contractor you have drinks with once a week

46:00

and you just talk about your ideas once, they're gonna be

46:02

like, hey, this is bad.

46:04

You need to stop.

46:05

You're going down a bad road. Do

46:08

you need to stay with me? Are you okay?

46:10

This is clearly a cult.

46:13

Yes, someone.

46:14

This would be so upsetting

46:17

for someone to just casually talk

46:19

about it at like a paint and sip or like a Ricky

46:21

Gervais somebody.

46:30

Their break with mainstream rationalism

46:32

had gone terminal. Gwynn

46:35

criticized the rest of the central rationalist

46:37

community for quote not taking heroic responsibility

46:40

for the outcome of this world. In

46:42

addition to the definitely accurate claims

46:44

of sexual abuse within rationalism,

46:46

they alleged organizations like CFAR were

46:49

actively transphobic. I don't know how true that

46:51

is. Some of the articles I've read, there's a lot

46:53

of trans rationalists who will be like, no, there's

46:55

a very high population of trans people

46:57

within the rationalist community. So people

46:59

just disagree about this. It's not my place to come to a

47:01

conclusion. But this is one of the things that Ziz says

47:04

about the central rationalist community.

47:07

Ziz had concluded that transgender people

47:09

were the best people to build a cabal around

47:12

because they quote from Zizz's blog, had

47:14

unusually high life force. Ziz

47:17

believed that the mental powers locked within the small

47:19

community of simpatico rationalists they'd gathered

47:21

together were enough to alter the fate of the cosmos

47:24

if everyone could be jail broken into

47:26

sociopaths.

47:27

And yeah, these are all

47:29

double goods as well.

47:31

Well, no, she's the only double good. Actually,

47:33

she becomes increasingly convinced that they're all just

47:35

single good right, And this is

47:37

like her beliefs about heroism from the last episode.

47:40

If you've got the community and the hero, the

47:42

community's job is to support the hero,

47:45

right, like blind

47:47

support, right, blind support, no matter what.

47:50

And a lot of the language Ziz is using here, in

47:52

addition to being you know, rationalist

47:54

language. This is all like scientology

47:57

mixed with gaming and fantasy media.

47:59

She talks about the need to install new

48:01

mental tech on her and her friends,

48:04

which, like, tech is like a Scientology

48:06

term, right, Like that's that's like a big

48:09

thing that they say. She and her circle

48:11

start dressing differently. Ziz starts wearing

48:13

like all black robes and stuff

48:15

to make her look like a Sith or some sort of wizard.

48:18

Her community adopts the name vegan anarcho

48:20

transhumanism and starts unironically

48:23

referring to themselves as vegan Sith.

48:26

In the boat community when they move in.

48:28

Yeah, just like, what is

48:30

going on? I just wanted to, I'm just an alcoholic.

48:33

What's happening? I

48:35

just wanted to be like Quint from John's Oh

48:37

no, Yeah.

48:39

I'm just here because my wife.

48:40

Left, right. I think they might go a

48:42

different way than a great white attack. Now

48:45

looking bad?

48:47

Yike? Oh man.

48:49

So around this time, Gwynn claims she

48:51

came up with a tactic for successfully

48:53

separating and harnessing the power of different

48:55

hemispheres of someone's brain. The

48:58

tactic was unihemispheric

49:00

sleep, and this is a process by

49:02

which only one half of your

49:04

brain sleeps at a time. In

49:07

a critical write up, publishes a warning before

49:09

the killings that are to come. A rationalist

49:11

named Apala Mojave writes, normally

49:13

it is not possible for human beings to sleep with

49:16

only one hemisphere. However, a weak form

49:18

of UHS can be achieved by stimulating

49:20

one half of the body and resting the other like

49:22

hypnosis or fasting. This is a vulnerable

49:24

psychological state for a person. Entering

49:26

UHS requires the sleeper to be exhausted.

49:29

It also has disorienting effects, so they are not

49:31

quite themselves. And I

49:33

disagree with them there, like, no, they're not just

49:35

actually sleeping with only one hemisphere. And in

49:37

fact, I think they may have taken this idea

49:39

from Warhammer forty thousand

49:42

because.

49:43

Its do because

49:45

yeah, what are you talking about?

49:47

But yeah, that's not a thing. Like,

49:51

yes, if you don't let yourself sleep for

49:53

long periods of time and like kind

49:55

of let yourself zone into a meditative state,

49:57

you'll get a trippy effect. You

50:00

will become altered. You're altering

50:02

your state, and you can't. This is

50:04

why cults deprive people of sleep.

50:06

You can fuck with people's heads a lot

50:08

when they're in that space, but this isn't

50:11

what's happening.

50:13

I like to think of them on

50:15

the boat, just only using one half of their body.

50:17

Lying, right, like one eye open, watching

50:21

the office.

50:24

Furiously taking notes.

50:27

So this is how that write

50:29

up describes the process of uni hemispheric

50:31

sleep. One you need to be tired. Two

50:34

you need to be laying down or sitting up. It is important

50:36

that you stay in a comfortable position that won't require

50:38

you to move very much. In either case, you

50:40

want to close one eye and keep the other open. Distract

50:43

the open eye with some kind of engagement. Eventually

50:46

you should feel yourself begin to fall asleep on one

50:48

side. That side will also become numb. The

50:50

degree of numbness is a good way to track how deep

50:52

into sleep the side is. Once into UHS,

50:55

it is supposed to be possible to infer which aspects

50:57

of your personality are associated with which

50:59

side of the brain. And the

51:01

goal of hemispheric sleep is

51:04

to jail break the mind into psychopathy

51:06

fully right, And that's

51:08

how Ziz describes it is that's the goal. That's

51:10

the goal, that's their goal. Got

51:12

to make ourselves into psychopaths so we can save

51:15

the world. But

51:18

it also gets used. You could use it to like I have this

51:20

thing. I don't like that I react this way in this situation.

51:22

So get me into this sleep pattern and you like talk

51:25

me through and we'll figure out why I'm doing it, and

51:27

we'll They describe it as using tech to upgrade

51:29

their mental capabilities, right, so

51:33

they're just kind of brainwashing each

51:35

other. They're like fucking around with with some pretty

51:37

potentially dangerous stuff. And again, drugs

51:40

are definitely involved in a lot of aspects

51:42

of this, which which is not usually written

51:45

up, but you just have to infer.

51:49

There's

51:51

some disagreement around all this, but

51:53

it seems accurate to say that Gwen is the one who came

51:55

up with the hemispheric sleep idea,

51:58

but a lot of the language around how the tactic

52:00

was used and what it was supposed to do came

52:03

from Ziz. And again, the process

52:05

is just sleep deprivation. Right. This

52:08

is cult stuff. It's part of how cults

52:10

brainwash people. But it also wouldn't

52:12

have seemed inherently suspicious to rationalists

52:15

because being part

52:18

of that subculture and going to those events

52:20

had already normalized a slightly less radical

52:22

version of this behavior, as this piece in Bloomberg

52:24

explains, at house parties,

52:27

rationalists spent time debugging each

52:29

other, engaging in a confrontational style

52:31

of interrogation that would supposedly yield

52:33

more rational thoughts. Sometimes, to probe

52:36

further, they experimented with psychedelics and

52:38

tried jail breaking their minds to crack

52:40

open their consciousness and make them more influential

52:42

or agentic. Several people in Taylor's sphere,

52:45

and Taylor is one of the sources, had similar psychotic

52:47

episodes. One died by suicide in twenty

52:49

eighteen and another in twenty twenty one. So

52:52

in the mainstream rationalist subculture, they

52:54

are also trying to like consciously hack

52:56

their brains using a mix of like drugs

52:59

and meditation and social abuse,

53:01

and people kill themselves as a result

53:03

of like the outcomes of this. This is already

53:05

a problem in the mainstream subculture.

53:07

Yeah, let alone this extremist offshoot

53:10

right yep.

53:13

In her own writings at the time,

53:15

Ziz describes hideous fights with Gwen, in

53:17

which Gwen tries to mentally dominate

53:19

and mind control Ziz. They've both become

53:21

believers in new theories. Ziz has,

53:24

that's basically like she uses the term

53:26

mana, which she describes

53:29

as like your ability to persuade people, which

53:31

is, if you can convince someone of something, it's

53:33

evidence that you have an inherent level of like magical

53:36

power. And someone with naturally high mana

53:38

like Ziz can literally mind control

53:40

people with low mana. That's what she believes

53:43

she's doing whenever she tries to talk

53:45

someone into something about themselves, is

53:47

she's mind controlling them. And she and Gwen have

53:49

mind control battles. At one

53:51

point they start having like one

53:53

of these arguments where basically Gwen threatens

53:56

to mind control Ziz and Ziz

53:58

threatens Gwen back, and this starts a verbal

54:00

escalation. And the way Ziz describes

54:02

this escalation, which is, again these are two

54:04

sleep deprived, traumatized people

54:07

fucking with each other's heads on a boat. But

54:09

the way that Ziz describes the escalation

54:12

cycle is going to be important because this

54:14

has a lot to do with the logic of the murders that

54:16

are to come. I said that if they

54:18

were going to defend a right to be attacking me

54:21

on some level and treat fighting back as a new

54:23

aggression and cause to escalate, I would

54:25

not at any point back down. And if our conflicting

54:27

definitions of the ground state, where no further

54:29

retaliation was necessary, meant that we were consigned

54:32

to a runaway positive feedback loop

54:34

of revenge, so be it. And if that

54:36

was true, we might as well try to kill each other right

54:38

then and there, in the darkness of the Caleb's

54:40

bridge at night, where we were both sitting, lying under

54:42

things in a cramped space, I became intensely

54:45

worried they could stand up faster. Consider

54:47

the idea from World War One: mobilization

54:49

is tantamount to a declaration of war. I

54:51

stood up, still silent, waiting

54:54

See, see, first

54:57

off, there's other people there.

55:00

Well, it's not just, yes. And

55:04

just like the logic of well, obviously

55:06

if you attack me, then I'm going to counterattack you,

55:08

and then you're going to counterattack me, which means eventually

55:10

we'll kill each other. So we should just kill each other now,

55:12

like when you are taking your advice on how

55:14

to handle social conflict from the

55:16

warring European powers that got into

55:18

World War One.

55:20

Maybe not a good like positive

55:22

example, it's

55:29

just so like even

55:32

in understanding how they got there,

55:35

it still is such a stretch.

55:36

Like, even having all this

55:37

background, it's still like, yeah, really taking

55:40

some leaps, It's it's yeah, I mean just

55:42

having a fight with your friend and then opening your locket

55:44

which has like Kaiser Wilhelm and the Tsar in

55:47

it and going what would you guys do here?

55:49

Yeah, ancestors

55:51

guide me.

55:56

And again, but you know, part of what's going on here is this

55:58

timeless decision theory bullshit,

56:00

right. Ziz believes that if she makes it clear at this

56:03

point, when they start having a conflict, that

56:05

the stakes will immediately escalate

56:07

to life or death, Gwen won't risk

56:09

fucking with her, right, but by

56:11

doing this, she also immediately creates a situation

56:13

where she feels unsafe. However, in that

56:15

conflict, Gwen yields, and Ziz concludes

56:18

that the technique works right, and

56:20

so yes, yes,

56:22

what she thinks is that her mana is strong and

56:24

this is a good idea for handling all conflicts.

56:27

Right. So I'm going to increasingly teach

56:29

all these people who are listening to me that

56:32

this is the escalation loop that you

56:34

handle every conflict with, right. Great

56:37

stuff. One of the young

56:39

people who got drawn in at this time was Maya

56:41

Pasek, who blogged under the name Squirrel

56:44

in Hell. She wrote about mainstream rationalist

56:46

stuff, citing Yudkowsky and Elon Musk,

56:48

but in her blog there's like a pattern of depressive

56:51

thought. In one twenty sixteen post,

56:53

she mused about whether or not experiencing

56:55

joy and awe might be bad because

56:58

it biases your perception. So

57:00

this is a young person who I

57:02

think is probably dealing with a lot of depression,

57:04

yes, and the classic stinking

57:07

thinking, right, and maybe

57:09

the community is not super helpful to her.

57:11

She was working to create a rationalist community

57:14

in the Canary Islands. She's kind of trying to do the

57:16

same thing Ziz did, but like on an island

57:18

where it's cheaper to live. Is this something

57:20

that can exist in a lot of places?

57:24

Sure. Yeah, I mean yeah, if you've got cheap rent, you

57:26

can get a bunch of like weirdos who work online

57:28

to move into a house with you. Right, Yeah,

57:31

Like that's that's always possible. She

57:35

found Ziz's blog and she starts commenting

57:38

on it. She's particularly drawn to Ziz's

57:40

theories on mana and Ziz and

57:42

Gwen's theory about hemispheric personalities.

57:45

And in one of her most direct cult leader moments,

57:47

Ziz reaches out directly

57:50

to Maya as she's like posting on her blog

57:52

and emails her saying, I see you like some

57:54

of my blog posts. Truly a sinister

57:56

opening. Yeah,

58:00

my true companion Gwen and I are taking

58:03

a somewhat different, than MIRI, that's

58:05

the organization, one of the rationalist organizations,

58:07

they call each other, that's

58:09

what they call their true companions,

58:11

true companions, at this point. Or,

58:15

taking a somewhat different approach than the MIRI

58:17

approach to saving the world, without much specific

58:19

technical disagreements. We are running on

58:21

something pointed to by the approach. As

58:23

long as you expect the world to burn,

58:26

then change course. Right, So basically

58:28

we still expect the world to burn, so we can't keep doing

58:30

what the other rationalists are doing. And she lays out

58:32

to this, this girl she meets through a

58:35

blog post her plan to find

58:37

abnormally intrinsically good people

58:39

and jail break them into Gervais sociopaths.

58:42

She invites Maya to come out, and,

58:44

I don't think this happened, but they do start

58:47

separately journeying into debucketing,

58:49

and Maya gets really into this uni hemispheric

58:52

sleep thing, and Ziz is kind

58:54

of like coaching her through the process. She tells

58:56

Maya that one of her hemispheres is female,

58:58

because Maya's a trans woman, and Ziz

59:00

tells her one of your brain hemispheres, each

59:03

of which is a separate person, is female, but

59:05

the other is male and quote mostly

59:07

dead, and your suicidal

59:10

impulses are caused by both the pain

59:12

of being trans and also the fact that there's this dead

59:14

man living in your head that's

59:17

like taking up

59:19

half of your brain's space, and so you

59:21

really need to debucket in order to have a

59:23

chance of surviving, right, Okay,

59:26

So.

59:26

She needs to be jail broken to be

59:28

free.

59:29

To be free, and Maya will basically

59:31

replace her sleep entirely with this uni

59:33

hemispheric sleep crap. And

59:36

not sleeping exacerbates your depressive

59:38

swings, and leads to deeper and deeper troughs

59:40

of suicidal ideation. She

59:43

is believed to have died by suicide in February

59:45

of twenty eighteen. She posts

59:47

what is essentially a suicide note

59:49

that is very rationalist in its verbiage

59:51

literally titled Decision Theory and

59:54

Suicide, and

59:56

this is the first death directly

59:58

related to Ziz and Gwen's ideas.

1:00:01

But I think it's important to note, like, the role

1:00:03

mainstream rationalism plays in all

1:00:05

of this. Suicide is a common

1:00:07

topic at CFAR events, and people will

1:00:10

argue constantly about whether or not, for like a

1:00:12

low value individual, it's better

1:00:14

for them to kill themselves, right, is that like of higher

1:00:16

net value to the world. And

1:00:19

it was also used as like a threat to

1:00:21

stop women who were abused by figures in the community

1:00:23

from speaking up. And this is from that Bloomberg article.

1:00:26

One woman in the community, who asked not to be identified

1:00:28

for fear of reprisals, says she was sexually

1:00:31

abused by a prominent AI researcher.

1:00:33

After she confronted him, she says she had job

1:00:35

offers rescinded and conference speaking gigs

1:00:37

canceled, and was disinvited from AI events.

1:00:40

She said others in the community told her allegations

1:00:42

of misconduct harmed the advancement of AI

1:00:44

safety, and one person suggested an

1:00:46

agentic option would be to kill herself.

1:00:49

So there is just within rationalism this discussion

1:00:52

of like it can be agentic,

1:00:54

as in, like you are taking high agency for

1:00:56

your to kill yourself

1:00:58

if your net if

1:01:00

you're going to be a net harm to the cause of AI

1:01:02

safety, which you will be by reporting this AI

1:01:04

researcher who molested you, right, and.

1:01:07

Yeah, because you're taking their mana.

1:01:10

Yeah.

1:01:13

Shit, these people aren't.

1:01:15

Like, this whole community is playing with a lot

1:01:17

of deeply dangerous stuff, and a bunch

1:01:19

of people are going to

1:01:21

either kill themselves or

1:01:25

suffer severe trauma as a result of

1:01:28

all of this. Yeah.

1:01:29

Escaping this is even putting yourself back together

1:01:31

after living in this way seems like it would be

1:01:34

such a task.

1:01:35

And like in any cult, part of the difficulty is

1:01:37

like teaching yourself how to speak normally

1:01:40

again, how to not talk about all this

1:01:42

stuff?

1:01:42

Right, Yeah, not identify as

1:01:44

a vegan Sith.

1:01:46

Right, right, because like I gotta say,

1:01:48

people who are really in the community will note like a dozen

1:01:51

different other concepts and terms

1:01:53

in addition to like vegan Sith and Gervais

1:01:55

sociopaths and shit that I'm not talking

1:01:57

about that are important to this ideology.

1:02:00

But like you just can't. Like

1:02:02

I had to basically learn like

1:02:06

a different language to do these episodes,

1:02:08

and I'm not fluent in it, right, Like you have

1:02:10

to triage like what shit do you

1:02:12

need to know?

1:02:13

You know? Yeah, it's so deep. It's like,

1:02:15

so deep.

1:02:18

Deep and silly. Let's do an ad

1:02:20

break and then we'll be done

1:02:25

and we're back. So I'm just going to conclude

1:02:28

this little story and then we'll end the episode for

1:02:30

the day. So this

1:02:33

person, Maya, has likely killed themselves

1:02:36

at the start of twenty eighteen,

1:02:38

and Ziz reacts to the suicide

1:02:40

in her usual manner. She blogs

1:02:42

about it. She took from what had

1:02:44

happened, not that like debucketing might be dangerous

1:02:47

and uni hemispheric sleep might be dangerous,

1:02:50

but that explaining hemispheric consciousness

1:02:52

to people was an info hazard. She

1:02:55

believed that people who were single good, like

1:02:57

Maya, were at elevated risk because

1:02:59

learning that one of the whole persons inside them

1:03:01

was evil or mostly dead could

1:03:04

create an irreconcilable conflict, leading

1:03:06

to depression and suicide. And

1:03:08

she comes up with a name for this. She calls

1:03:10

this Pasek's Doom. That's what she like

1:03:13

names the info hazard that kills her

1:03:15

friend who she is like fucking with their

1:03:17

head. So

1:03:20

that's nice,

1:03:24

Yeah, as

1:03:26

nice as anything else in this story is. I

1:03:28

think you might have been the doom here.

1:03:30

Yeah. It's you with the

1:03:33

whole problem. But

1:03:34

now, but now it's an info hazard

1:03:37

to explain. Yes,

1:03:40

a person's.

1:03:42

Like to explain your theories.

1:03:44

Yeah, yeah to a person who can't

1:03:46

handle it.

1:03:47

Yeah. And she comes to the conclusion that it's

1:03:50

a particular danger to explain it, like

1:03:52

to single good trans women, who are

1:03:54

the primary group of people that she is going after

1:03:56

in terms of trying to recruit folks. So she

1:03:58

like admits her belief is that this

1:04:01

thought thing I've come up with is particularly

1:04:03

dangerous to the community I'm recruiting from.

1:04:06

But it's essential. This information

1:04:09

is absolutely essential to saving the world.

1:04:11

So you just have to roll the dice.

1:04:13

Yeah.

1:04:13

She isolates herself within her own group that she's

1:04:15

created.

1:04:16

Well, yes, And it also she

1:04:19

is then consciously taking the choice. I know this

1:04:21

is likely to kill or destroy a lot of the people

1:04:23

I reach out to, but I think it's so important

1:04:25

that it's like worth taking that risk with their lives.

1:04:31

Yep, good stuff.

1:04:33

Yeah.

1:04:34

Anyway, how are you feeling?

1:04:35

Got a plug?

1:04:37

I am, I am

1:04:39

okay.

1:04:42

You know what, I'm deeply sad

1:04:44

for these people who are

1:04:47

so lost, and I'm

1:04:49

also pretty interested because this is crazy,

1:04:51

but I'm okay.

1:04:54

I'll be great. Happy

1:04:58

to hear, happy to hear

1:05:00

that. Well, everybody, this has

1:05:02

been Behind the Bastards, a podcast

1:05:05

about things that you

1:05:08

maybe didn't think, maybe didn't

1:05:10

need to know about how the Internet breaks people's

1:05:12

brains. But also a lot

1:05:14

of people surprisingly close to this community are running

1:05:16

the government now, so maybe you do need to know about

1:05:19

it. Sorry about that info hazard.

1:05:24

Behind the Bastards is a production of cool

1:05:26

Zone Media. For more from cool Zone Media,

1:05:29

visit our website Coolzonemedia dot

1:05:31

com, or check us out on the iHeartRadio

1:05:34

app, Apple Podcasts, or wherever you get

1:05:36

your podcasts. Behind the Bastards is

1:05:38

now available on YouTube, new episodes

1:05:40

every Wednesday and Friday. Subscribe

1:05:43

to our channel YouTube dot com slash

1:05:45

at Behind the Bastards
