E327. The Rise of the Woke Right - James Lindsay

Released Thursday, 27th February 2025

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

All right, I'm with James Lindsay

0:02

everyone. Welcome back to Walk-Ins Welcome.

0:04

It's been a while. It's been

0:06

a while. It has been, when

0:08

was the last time you were

0:11

on the podcast? Do you remember?

0:13

Probably like this time of

0:15

year, 2019, maybe. Oh, holy shit.

0:17

Yeah. You and I saw each other

0:19

last in person in Aspen, right?

0:21

Yeah. That was like, what, 2019?

0:23

That was also 2019. When, was

0:25

that July, July? It was the summer,

0:28

it was the summer, right. Yeah, that

0:30

was fun. That was a pretty good

0:32

trip. That was a good trip. That was

0:34

a good trip. That was it. But

0:36

now it's like seems like this. The

0:38

like the days of like the golden

0:40

days of yore, the idyllic past when

0:43

the world hadn't gone mad yet.

0:45

COVID hadn't happened. It had gone

0:47

mad though. It had. But what was

0:49

what we didn't fully, well, maybe you

0:52

did. But what we didn't fully comprehend

0:54

was how much more mad it was

0:56

going to go. Because you guys were

0:59

in the middle, you were there with

1:01

Peter Boghossian, Helen, Helen Pluckrose, and we

1:03

were all at Pamela Paresky's, and she's

1:06

freaking great and that. And it was,

1:08

what were you doing there? You were trying

1:10

to like give a talk? Yeah, she put

1:12

together like a panel for us with,

1:14

I forgot what the organization

1:17

was, but they're in Aspen and she

1:19

had us go and we were talking,

1:21

I think about our grievance studies,

1:23

hoax papers that we had written. Yes.

1:25

Because that came out in the end

1:28

of 18. So the timing lines up,

1:30

right? And so we were talking about

1:32

kind of the crisis in academia. And

1:34

if I remember, right. We were trying to

1:36

make the point that like what's happening in

1:38

academia is not going to stay in academia,

1:41

right? It's going to leak out or it's

1:43

going to take over society, which is kind

1:45

of what motivated us to do the

1:47

project in the first place. Yeah, remember

1:50

when I asked for on Twitter examples

1:52

of self-censorship, it was right before I

1:54

met you in person, and people were

1:56

like, this stuff is everywhere, and people

1:59

were like, oh, it's just going to

2:01

be a bunch of... right wingers who

2:03

are being silenced. I'm like, no, this

2:05

is left wingers telling me that like,

2:08

they can't critique someone who's building a

2:10

bridge and it's in all of health

2:12

care and it's all, and this is

2:15

like, this was, you couldn't even talk

2:17

about this. Yeah, that's right. And you

2:19

were like, send me some of this

2:21

stuff and. Not all of it has

2:24

been vetted, but I mean, it's pretty

2:26

clear that this was the case and

2:28

these were people who were at all

2:31

of these different institutions saying, and it's

2:33

basically DEI, like all this crap that

2:35

they're trying to get out. And they're

2:37

saying like this was happening and no

2:40

one believed them and they didn't feel

2:42

like they could say anything. Yeah, I

2:44

remember going to like Los Angeles, it

2:47

had to be before COVID because that

2:49

kind of kind of changed everything. I

2:51

mean being in Hollywood I guess so

2:53

we'd go out you know go to

2:56

one of the little bars or whatever

2:58

with a group of people and we'd

3:00

be just talking and then be like

3:02

shh don't talk like that yeah don't

3:05

everybody was like keep your head down

3:07

don't say stuff like that in public

3:09

yeah talk about it when we get

3:12

back to the to the apartment or

3:14

whatever and I was like what you

3:16

don't understand we're in Hollywood you can't

3:18

talk like that and I was like

3:21

okay yeah so lots of self-censorship.

3:23

I still have friends when we'll be

3:25

out to dinner that are here who

3:28

have all kind of, I guess they're

3:30

called leftugees, like who have all

3:32

left California. They, they'll be like, I

3:34

still get nervous about talking and not

3:37

just from LA, from San Francisco as

3:39

well. Like they're like, I still get

3:41

nervous talking about this stuff in public.

3:43

I'm like, that's fucking crazy dude, that

3:46

you still have that fear to like.

3:48

talk about something and it's anything really

3:50

but it's wild we were I was

3:53

at a dinner when I was

3:55

in LA right before the fires I

3:57

went there, my family's still there,

3:59

and we went for um we went

4:02

for just like the week of new

4:04

years and I went out to dinner

4:06

with some friends, Meghan Daum actually being

4:09

one of them and she I had

4:11

my back to the, there was a

4:13

couple seated like at a deuce kind

4:15

of right behind us and another person

4:18

that I won't blow up her spot.

4:20

And the three of us were talking,

4:22

but she's in the culture wars. And

4:25

the three of us were talking and

4:27

going on and I turned around, like

4:29

I just got that spidey sense and

4:31

I turned around and they both had

4:34

their phones in that weird way. And

4:36

I was like, are they fucking recording

4:38

us right now? And I wasn't sure

4:40

if they were or weren't or if

4:43

I was just being paranoid, but even

4:45

Meghan was like I don't know they

4:47

might have been and it was just

4:50

so weird I'm like that because they

4:52

gave me that look of like you

4:54

caught me when I looked at them

4:56

because I like looked over my shoulder

4:59

and they were like and then they

5:01

were texting each other instead of talking

5:03

which is fucking weird too and I

5:06

was like I fucking hate this like

5:08

I haven't had to deal with this

5:10

at all in Texas. I mean maybe,

5:12

maybe it would happen in Austin, but

5:15

it was weird. And I was like,

5:17

the other thing that I hate about

5:19

that whole environment of self-censorship and fear

5:22

is that it does make you paranoid.

5:24

It does. You're like, are they not

5:26

returning my call because they saw something

5:28

I tweeted or is it just because

5:31

they're busy? Because generally, most people are

5:33

so self-involved that it's just because they're

5:35

busy. It has nothing to do with

5:37

you. But in this kind of climate,

5:40

it might. Yeah, I've like, you know,

5:42

everybody I think has heard of, well,

5:44

maybe not everybody, but I think a

5:47

lot of people have heard of imposter

5:49

syndrome, right? Which is like, you get

5:51

your degree and you go get your

5:53

like little cutesy job or whatever, and

5:56

you believe that you're actually an idiot.

5:58

Yeah. But, you know, you got hired

6:00

to do, you know, whatever it is,

6:03

math or something hard. And then you

6:05

live your life at work, thinking every

6:07

day you're about to be discovered, you're

6:09

about to be discovered as the imposter,

6:12

You wonder if you're going to be

6:14

discovered that you're the bad person, the

6:16

evil person, the one who, you know,

6:19

has wrong opinions or wrong thoughts. And

6:21

it's like this weird, like you said,

6:23

there's this paranoia, like, are they not

6:25

calling me back? It's like, you know,

6:28

I sent him a text. It's been

6:30

like two days, like that's kind of

6:32

weird, like are they mad at me?

6:34

What did I do? And yeah, there's

6:37

this weird vibe that comes with that.

6:39

Yeah, and it's unsettling because that paranoia

6:41

is so bad. It's something I just

6:44

had to like put completely aside. It's

6:46

like, all right, whatever the reason, it

6:48

just can't matter. You have to just

6:50

like keep moving forward and staying in

6:53

touch with people who are your... sane

6:55

touchstones and I think now you've been

6:57

kind of that was such a like

7:00

innocent time I remember going and seeing

7:02

your talk and being like you guys

7:04

need someone like me to like be

7:06

a translator to be take what you're

7:09

saying and try and because I was

7:11

sitting in the audience I'm like no

7:13

one understands what the fuck you guys

7:15

are talking about like no one knows

7:18

about cultural Marxism or like and I

7:20

also think you guys were new you

7:22

in particular at the speaking like you

7:25

have improved a lot over oh yeah

7:27

oh yeah in the past like I

7:29

guess what has it been now how

7:31

many years six years yeah it's been

7:34

a crazy decade you have a little

7:36

bit of practice now yes you've had

7:38

a lot of practice even to even

7:41

to just contextualize how insane the past

7:43

like But last time I saw you

7:45

I was single and now I am

7:47

married with a child. Yeah, that's right.

7:50

Like a lot can happen in a

7:52

short period of time. Oh, well, yeah,

7:54

it's like, well, it's almost like, I

7:57

actually literally did talk to a couple

7:59

comedians and that kind of helped me

8:01

learn how to talk. Yeah. By the

8:03

way, there's, you know, one guy was

8:06

like, you know, you're never gonna give

8:08

a really good talk until you bomb

8:10

at least once and just like let

8:12

it be you know like don't worry

8:15

about it if you go give like

8:17

a total bomb and so that's like

8:19

really relaxing but then it's like I

8:22

learned by especially watching comedians this idea

8:24

of just having a conversation with the

8:26

room yeah and so you learn to

8:28

read the room yeah kind of tell

8:31

when their eyes glaze over or when

8:33

you said too many syllables in one

8:35

you know breath or whatever That word

8:38

had a hyphen and they've checked out.

8:40

And so it's like, but it just

8:42

feels like I'm talking, like I'm talking

8:44

to you, but I'm talking to, you

8:47

know, 400 people or whatever. Yeah. And

8:49

that changed everything. And what is it

8:51

like to be somewhat, I mean, you

8:54

seem to like, how do you stay?

8:56

sane? Have you stayed sane? Have you

8:58

gone insane? There's word on the

9:00

street that you've lost your mind? There

9:03

is word on the street that I've

9:05

lost my mind, but I don't think

9:07

I have. I mean, would you know?

9:09

Yeah, that's the problem, right? Let's use

9:12

one of those big words. There's an

9:14

epistemological problem here, right? If you've lost

9:16

your mind, you're almost guaranteed not to

9:19

know it. to think you haven't, not

9:21

just not to know it, but to

9:23

think you didn't. And so I think

9:25

being still completely aware of that is

9:28

a good, you know, sanity check. Right.

9:30

No one who's lost their mind would

9:32

question if they've lost their mind. Yeah,

9:35

I question it like daily because, you

9:37

know, like, I'm like, am I getting

9:39

gaslit or am I nuts? Like, what's

9:41

going on? I would have been like,

9:44

wow, so I lose my mind in

9:46

the future. That's really unfortunate for me.

9:48

And now you're like so much winning,

9:50

I'm getting tired of winning. Yeah, there

9:53

is, it is, you have to, I

9:55

think this is a thing too that

9:57

happened with this election more than any

10:00

of them is that preference falsification that

10:02

everyone had where they were afraid to,

10:04

they were speaking in hushed tones about

10:06

these things, that it freed them to

10:09

be like, oh. No, everybody else is

10:11

on board with this. Yeah, I kind

10:13

of got the impression that that was

10:16

coming in April. Yeah. So my wife

10:18

and I and my company we all

10:20

went together on like a cruise, right?

10:22

And to the Caribbean. Well, it depends

10:25

on the boat. And it was a

10:27

big boat which means there I mean,

10:29

you got a little kid. So there

10:32

were a ton of little kids and

10:34

of loud music and a ton of

10:36

really bad cheap food that, you know,

10:38

left a little bit to be desired.

10:41

But at any rate, we were at

10:43

the, you know, the whole game when

10:45

you're on a cruise without great food,

10:47

by the way, is to just go

10:50

around to the different free food places

10:52

and try to find one that doesn't

10:54

suck. So you eat somewhere different, like

10:57

three times a day, four times a

10:59

day to try to find something. On

11:01

the boat, you're looking for something. And

11:03

they'd be like, it would be like

11:06

six old ladies and they'd be like,

11:08

we'll split one cup of chowder, please,

11:10

just so they don't have to, because

11:13

you got all that free food on

11:15

me. And they just want to be

11:17

able to say they tried the chowder

11:19

and newborns. Like, I just became such

11:22

an anti-cruiser. So we were on this

11:24

boat, we're at the taco place, because

11:26

like the only place that you didn't

11:29

have to pay for extra that was

11:31

like worth eating. And we're sitting. There

11:33

were these two visible Democrat women. Like

11:35

purple hair? Yeah, like the whole thing,

11:38

like, you know, the way they were

11:40

dressed, like the style clothing. I want

11:42

to know, what are the, like, visible Democrat,

11:44

you know, criteria. The dress was a

11:47

little bohemian, you know, the piercings, not

11:49

all tattoos indicate visible Democrat, but some

11:51

do, you know, the hair, the colors

11:54

and the hair in particular. That's a

11:56

big sign. And they're talking with each

11:58

other, and you know, like the story

12:00

in LA, right? They're the next table

12:03

over. Well, we didn't bust out our

12:05

cameras and film. But they just

12:07

keep saying, she's destroying the place. She

12:10

sucks, she sucks, she's awful. And we

12:12

kind of finally picked up that they

12:14

were talking about a governor. And there's

12:16

only really two choices. It's either Whitmer

12:19

or Hochul. So we went over and

12:21

we were like, Whitmer or Hochul? And

12:23

they were like, Hochul? We're from Manhattan,

12:26

Visible Democrats. And then we talked to

12:28

them for a few minutes and they

12:30

were like, well, we're on, we're, we're.

12:32

totally Democrats, we're total leftists, and we're

12:35

on a boat so we can say

12:37

it don't tell anybody we're voting for

12:39

Trump yeah we're done with it they

12:41

said Biden's ruined the country, Hochul's

12:44

ruined the state, and Adams has

12:46

ruined the city. Yeah, yeah. I

12:48

was like oh there is something like

12:51

the preference falsification is now reaching deep

12:53

into Democrat strongholds yeah like something's going

12:55

to happen here yeah I wasn't I

12:57

also think one of the reasons that

13:00

I I was a little bit like

13:02

I wasn't really on the fence. People

13:04

are like, was it hard? Like, no,

13:07

it wasn't actually hard at all. I

13:09

do, I think part of the reason

13:11

that I felt compelled to vote was

13:13

that idea of like too big to

13:16

rig, even though my vote in Texas

13:18

wouldn't matter. I was like, I just

13:20

want to be part of the popular

13:22

vote. I want to be counted. I

13:25

want to be part of this, like,

13:27

the people who are like, fuck this.

13:29

Let's go. It's weird to be. You're

13:32

such a canary in the coal mine,

13:34

I can see how you would go,

13:36

insane, but I've seen you kind of

13:38

battling online a lot, I mean, always.

13:41

Always. Yeah. I ran into Dave Rubin

13:43

speaking of people the other day. I

13:45

was in DC around the inauguration, lurking.

13:48

Lurkin. Lurkin. Lurkin. Lurkin. I was, I

13:50

was kind of hiding, actually. But I

13:52

ran into Dave Rubin, and he just

13:54

comes up to me and he just

13:57

whispers. Can we make it through lunch

13:59

without you fighting with somebody else on

14:01

Twitter today? And I'm like, Dave, shut

14:04

up. I'm going to go fight with

14:06

you. Yeah. But I'm fighting on Twitter.

14:08

Yeah. Or X, X, it's X. Thank

14:10

you, Elon. What do you get out

14:13

of that? I think I have a

14:15

disability. You're like, there's some someone is

14:17

wrong on the internet. Is it that

14:19

disability? there's only particular kinds of like

14:22

bullshit that I won't leave alone and

14:24

I'm actually I got pretty good at

14:26

ignoring people that are like rude to

14:29

me except I think it's funny to

14:31

like make a joke about them right

14:33

or to make an it not to

14:35

make an example of them but to

14:38

make them into an example of something

14:40

I'm talking about like your haters will

14:42

prove your point very frequently for you

14:45

right and so I do a lot

14:47

of that it looks like fighting but

14:49

I'm not actually fighting. But like

14:51

gross misrepresentation of something that I've said

14:54

is something that for whatever reason, if

14:56

I see it, I can't leave it

14:58

alone, right? And I've solved this problem

15:01

basically, mostly by turning off notifications. So

15:03

it used to be that these guys

15:05

that have the bigger accounts, the smaller

15:07

accounts don't, but the bigger accounts would

15:10

come into my push notifications. And I'd

15:12

be like, oh, you, you know, bad

15:14

word, bad word. And then it's like,

15:16

I have to go say something. And

15:19

it's like, like, don't do it. and

15:21

have this little fight with myself, and

15:23

then I go and smart off, and

15:26

then the rest of my day sucks

15:28

because of it. And I'm finally figuring

15:30

out that this is actually not a

15:32

productive use of my time and activity,

15:35

so I'm trying to discipline myself against

15:37

these people. And I also notice that

15:39

they're trying, like, there are people definitely

15:42

trying to exploit that now. I somewhat

15:44

take responsibility for your... X habits. You

15:46

can have all of the blame. I

15:48

mean, they were really trying to rein

15:51

you in when you were with the

15:53

group. When we were in Aspen, I

15:55

was like, they were like, you can't

15:57

make jokes, you need to be taken

16:00

seriously. I was like, ah, whatever, you

16:02

can be funny on there. And then

16:04

I always say, like, take the work

16:07

seriously. I quote you on that all

16:09

the time. It is something

16:11

that has been very helpful to me

16:13

but then when I see you going

16:16

off I'm like this is my fault.

16:18

Well some of it is because I

16:20

don't well another thing I learned is

16:23

I don't take myself seriously on social

16:25

media and another thing that I learned

16:27

is that if you can post something

16:29

you yourself knowing it's being made fun

16:32

of or kind of a joke or

16:34

whatever like let's say there's some picture

16:36

I don't know I don't have any

16:39

pictures of me that look bad but

16:41

let's say there was a picture of

16:43

me that looks bad. Look how

16:45

confident you are. If I posted the picture

16:48

of me that looked bad and made

16:50

fun of myself you become impervious to

16:52

other people making fun of it right

16:54

yeah or if maybe I don't know

16:57

you make a video with a giant

16:59

sword and you post that and you

17:01

make fun of yourself you become impervious

17:04

to that. So yeah, I mean I was

17:06

raised that way though. We were, I'm

17:08

Irish Catholic in our family. Did you

17:10

have like 40 brothers or something? No,

17:13

I have like 26 first cousins. That's

17:15

what I was. But yeah, we, my

17:17

whole family was, it was like, people

17:20

always ask me if I want to

17:22

do roast battle and I'm like, my

17:24

whole operating system was roast battle. Yeah, right.

17:26

I have no desire to relive any

17:29

of that. But it was very much

17:31

like, like, you have to learn how

17:33

to laugh. and in green rooms, like

17:36

that's just, and on, on, like, job

17:38

sites, and this is where I feel

17:40

like the Democrats just completely lost touch

17:42

with working class humor, is like, do

17:45

you know how people talk to each

17:47

other when they're working in a restaurant?

17:49

Do you know how people talk to

17:51

each other on job sites or like

17:54

firemen or in the military? Do you

17:56

have any friends in the military who

17:58

send you memes because they're fucked up?

18:01

Like do you know any Marines? Yeah

18:03

for real. No, so yeah it's your

18:05

it's your fault. No it is my

18:07

fault. But no I mean there's other

18:10

people we can blame too. But I

18:12

will say I'm lucky to have a

18:14

therapist husband because that is one thing

18:17

that drives me fucking insane. I'm like

18:19

I can I can deal with like

18:21

all of the pushback, the shit talk,

18:23

the calling me a grifter, all of

18:26

the things that come with being Open

18:28

Online, I'm not some victim, but when

18:30

someone like misrepresents me it makes me

18:33

insane and Jared is like Bridget, this

18:35

is also part of it. Yeah, you

18:37

just remind, he's like, this is also

18:39

part of it. If you can't let

18:42

that go, like how can you let

18:44

go all this? He's like, this is

18:46

just someone calling you a grifter too.

18:48

It's all part of being a public

18:51

person and you just have to let

18:53

it go. And it's been helpful to

18:55

have someone who's like, this is part

18:58

of it. Yeah, that's right. Just let

19:00

it go. I'm going to have to

19:02

remember that. Yeah, like go step away

19:04

and you saved me though when Black

19:07

Twitter came for me. Oh yeah, that's

19:09

right. They were like, I didn't know,

19:11

I hadn't really experienced like internet pile-ons

19:14

like that and you texted me and

19:16

you were like log out for three

19:18

days and I was like, it was

19:20

the best advice because it just moved

19:23

on. Yeah, they get bored. Yeah, nothing

19:25

to make me money. I want to

19:27

send them a W-2 because like they

19:29

work for like I don't know what

19:32

maybe I got my tax forms wrong

19:34

I don't know but their entire business

19:36

is talking about me wow it's like

19:39

that's kind of cool I guess like

19:41

flattering I don't know could you get

19:43

a hobby what do you think are

19:45

the biggest criticisms What are the criticisms

19:48

of you that you feel are unfair?

19:50

What's the deal with you and

19:52

the fucking woke right? To me, the

19:55

woke right is just the old right.

19:57

Can you like it? Well, it's the

19:59

alt right is what it was. Well,

20:01

it's, oh, okay. It's resurrected alt right.

20:04

I mean, that's like the simplest definition.

20:06

I hate that term, the woke right.

20:08

It makes me say it. There are

20:11

reasons, I'm sure. This is the thing

20:13

though. People think I like it, though, people

20:15

think I like it, though. I'm more than

20:17

happy to change if I can think

20:20

of a better one. I just can't.

20:22

But isn't it just like the right

20:24

being the right? No. So explain to

20:26

me how it's different. Well, the kids

20:29

would call them extra, right? They're not

20:31

just the right. They're like extra right.

20:33

But some of them have openly, we're

20:36

going to use big words again, but

20:38

they've openly subscribed to Marxist analysis. They've

20:40

openly subscribed to critical theory. but for

20:42

their ends, not for whatever the left

20:45

wants to do with it. They've openly

20:47

subscribed to postmodernism. There are people that

20:49

describe themselves as postmodern traditionalists. Like, imagine,

20:52

you're mad at woke right? Try that

20:54

term, right? That doesn't even make sense

20:56

to me. That's a collision of two

20:58

concepts. That's like an oxymoron. How does

21:01

that work? But they're gonna love it anyway

21:03

and they're gonna make a like a

21:05

cartoon pastiche version of it. We're gonna

21:08

be extra trad. We're gonna wear extra

21:10

like old-fashioned clothes and old spice or

21:12

whatever. I don't know what to do.

21:14

I don't hang out with these people.

21:17

They're hipsters though is what they are.

21:19

They're right wing hipsters. Oh that's interesting.

21:21

I mean it's time the right had

21:23

some hipsterishness. The thing that I do.

21:26

think that's happening in in balance is

21:28

that they're trying to have like an

21:30

intellectual side of the right yes and

21:33

they're trying to have the hipster

21:35

side of the right it does I

21:37

do think yes but have you read

21:39

Moscow you I do think these things

21:42

need to emerge in order to find

21:44

some semblance of balance in the culture

21:46

it can't just be the left way

21:49

like my I guess my bigger question

21:51

is I'm still the guy that's like

21:53

maybe we need less nerd shit. Are

21:55

you dealing with an HMO like I

21:58

am? It's such a pain in the

22:00

butt. If I want to get any

22:02

sort of testing done to ensure I'm

22:04

making my health a priority, I have

22:07

to wait weeks for an appointment to

22:09

see my doctor. Then get a referral

22:11

from her and wait weeks more for

22:14

an appointment with the specialists I needed

22:16

to see just to get some lab

22:18

work. Quest simplifies this process. I can

22:20

purchase my own lab tests online and

22:23

test for a multitude of health issues

22:25

I might be concerned about, then take

22:27

those results to my doctor better armed

22:30

with the information I need. At Questhealth.com

22:32

you can get hundreds of tests across

22:34

several different categories from allergies to heart

22:36

health, hormones, sexual health, women's health tests

22:39

for egg quantity, perimenopause, men's health tests

22:41

for testosterone, prostate, and so much more.

22:43

These are lab quality tests, the same

22:46

ones you'd be given at your doctor's

22:48

office, so prioritize your health, and visit

22:50

Questhealth.com today. Use the code walk-ins to

22:52

get 25% off and find answers to

22:55

the multitude of health questions you may

22:57

have. That's Questhealth.com and use the code

22:59

walk-ins for 25% off. Can you escape

23:01

post-modernism though? Like aren't we in... Isn't

23:04

it going to be an umbrella for

23:06

everything? That's a legitimate question. Are we

23:08

in post-modernity? Which would therefore necessitate a theory

23:11

of post-modernity that might be post-modernism. It

23:13

depends on what we, that's a very

23:15

complicated term, so it depends on what

23:17

we mean by it. And I don't

23:20

know, maybe we are in post-modernity because

23:22

of our democratization of publication, that's fancy

23:24

talk for, you can post on social

23:27

media. You can make a video, you

23:29

know, anybody can set up this kind

23:31

of studio literally in the room in

23:33

their house in a garage wherever they

23:36

want and have a TV show. Yeah.

23:38

Yeah, and get a press pass. Well,

23:40

I'm a little like, I wonder how

23:43

that's going to work out. I can't

23:45

wait. Hi, Bridget Phetasy, Dumpster Fire, Dumpster

23:47

Fire. When are you going to do

23:49

the daylight savings time thing, please? When

23:52

are you going to abolish daylight saving

23:54

winter out of existence any time soon?

23:56

Can we make America Florida yet? But

23:58

yeah, so the media environment that we're

24:01

in is very democratized. It's very fake

24:03

like people don't know how fake social

24:05

media is And I don't mean fake

24:08

like, oh, there's advertisers and there's all,

24:10

no, no, no, it's profoundly fake. It

24:12

is like the algorithms are all driven

24:14

by like trending content. The trending content

24:17

is generated by networks of people who

24:19

are working together to self- and cross-promote

24:21

so that they get higher up in

24:24

the monetization scheme on X, which by

24:26

the way is gaming that system, which

24:28

is why Elon sometimes takes away their

24:30

checks, and then they say, you're censoring

24:33

me from my opinion. No, they're not.

24:35

You're gaming his freaking payout system, which

24:37

costs him lots of money, but that

24:40

creates the trends that the algorithm then

24:42

traces, is very much driven by... fake

24:44

real activity as in real humans interacting

24:46

together like hey I get in a

24:49

text group with you you whatever things

24:51

you want blown up send her to

24:53

me and I'll send her to so

24:55

and so and you know we get

24:58

a group chat together of 30 of

25:00

us and we all you know half

25:02

a million a million followers each or

25:05

whatever we all retweet each other all

25:07

the time comment on everything even if

25:09

we don't agree just whatever to keep

25:11

the keep the thing going so much

25:14

of it's driven by that and then

25:16

we haven't talked about bots because the

25:18

bot problem is Nuts, and I don't

25:21

know if it's, I don't know if

25:23

it's fixable. You can buy bots, anybody

25:25

can buy bots. Foreign governments have legions

25:27

of bots. Yeah. Like, China has, I

25:30

mean, the Chinese bots are kind of

25:32

funny, they're not very good. Yeah. And

25:34

if you figure you have one and

25:36

you reply anything in simplified Chinese, they'll

25:39

reply back to you in simplified Chinese

25:41

and say something like, you know, we're

25:43

gonna crush America. You know, something very

25:46

crazy. like they're and they're very radical

25:48

they're very radical you can see it

25:50

is true like you're like oh Ukraine

25:52

woke up or whatever like yeah right

25:55

when you're like suddenly something's trending yeah

25:57

you're like oh it's 9 a.m. in

25:59

Moscow that's right total well yeah and

26:02

so They're much more sophisticated. They are

26:04

actively trying to use bot networks to

26:06

radicalize the right. And a lot of

26:08

people are thinking, this is a huge

26:11

social movement. There's tons of like frog

26:13

accounts on Twitter that are saying the

26:15

same thing. And it's, you know, some

26:18

room full of Russian guys with 75

26:20

phones each in front of them, putting

26:22

out thousands of accounts. And it's only

26:24

about a hundred, you could have like

26:27

roughly 10,000 accounts verified on X for

26:29

about $100,000 a year. If you do

26:31

the math, it's a little more than

26:33

that, but you know, order of magnitude.

26:36

That's not a big investment for a

26:38

government. The Chinese government spends $16 billion

26:40

a year on influenced campaigns against the

26:43

West. 16 billion. This is why I

26:45

don't buy this idea that... You know

26:47

the whole the whole China thing is

26:49

interesting because I I don't know how

26:52

like I I was kind of on

26:54

board with Trump wanting to ban TikTok

26:56

and then he went back on it

26:59

and now you hear a lot of

27:01

the people going well we need to

27:03

have like multi-powers and it's in it's

27:05

in the benefit of America and and

27:08

the world for America and China to

27:10

be friends and I'm like I don't

27:12

think China sees it that way. That

27:15

is not how China sees it. You

27:17

might think that and that's great. Yeah,

27:19

we can be buddies. But I don't

27:21

think the CCP sees it that way.

27:24

Yeah, they see it like, hey, let's

27:26

be friends, you know, fingers crossed. Like,

27:28

come on, guys. They are a communist

27:30

country. They're not your friend. They are

27:33

not even your trading partner, like in

27:35

business. I know that you make, that's

27:37

the thing is the biggest. consumer market

27:40

in the world. Yeah. And it's the

27:42

biggest manufacturing base in the world. And

27:44

that should be making Americans feel like,

27:46

oh no. But instead they're like, money.

27:49

Yes. And that was one of my

27:51

questions was, is that why all those

27:53

guys were up with Trump at the

27:56

inauguration? Like, are they hoping to enter

27:58

into the market in China, which really

28:00

for them would, is it just... like

28:02

you know you keep hearing this word

28:05

reciprocity like from Elon and is there

28:07

it's bullshit that like they have their

28:09

stuff here and we don't have our

28:11

stuff there so my question is are

28:14

they going to sell out all of

28:16

the Americans and all of our information

28:18

to China in order to get in

28:21

there look at Boeing Yeah. I mean,

28:23

so this is, you know, we

28:25

can speculate all we want about

28:27

Boeing, but this was something that

28:30

was said on Tim Poole's show,

28:32

on Tim Kast, by a former

28:34

Democratic congressman. Kucinich? No.

28:37

Yeah, Dennis Kucinich. And so

28:39

they were sitting there talking

28:41

about, like, I don't know, people

28:43

coming and asking for lobbying and all

28:45

of this crap. And he said that

28:47

it used to be, he's retired congressmen,

28:49

so it used to be that people

28:51

would come into his office, he said,

28:53

from Boeing, and they were trying to

28:55

get him to change a law that

28:57

allowed Boeing to be able to basically

28:59

show technology to the Chinese, but the

29:01

reason was because Boeing couldn't sell

29:04

airplanes in China unless they had a

29:06

deal where China made the landing gear

29:08

or whatever piece of technology it was

29:10

for them to install on the aircraft.

29:12

And so they had to be able

29:14

to like show some kind of like

29:16

trade secret to China to gain access

29:18

to the Chinese market, but that was

29:20

technically illegal because it was a sanctioned

29:22

communist country. And so they were trying to

29:25

get Congress to work around that. And now Boeing

29:27

is up shit creek, you might have noticed

29:29

they've now decommissioned the they've made their

29:31

last 777, I mean this is a little

29:33

technical, but their last 777-300

29:35

ER, they're not making anymore. It doesn't

29:37

look like they're like crashing or the news

29:39

is saying they're crashing all over the place

29:41

there's all these Boeing problems they get this

29:44

huge attack last time i was here in

29:46

Austin, I was on Rogan's show, and I

29:48

gave him this whole like rundown i was

29:50

like no what they're there what's happening is

29:52

the Chinese manufacturer COMAC is rising and they're

29:54

stealing Boeing technology and it's like they thought

29:56

there was like oh we want to make

29:58

lots of money we'll be friends with China,

30:00

you know, or whatever, let them make

30:02

the landing gear, which is already scary

30:04

enough because then they get all the

30:07

technology to build advanced landing gear. Right.

30:09

And I don't know if that also

30:11

involved, you know, that's all commercial. I

30:13

don't know if it also involved, you

30:15

know, Boeing's military contract stuff to be

30:17

able to get into the market. So

30:20

if you think that if these guys,

30:22

if these guys think there's going to

30:24

ever be reciprocity with a... totalitarian

30:26

closed-door system that firewalls everything like

30:29

they're releasing, what is it, Red

30:31

Note, which is actually literally Little

30:33

Red Book, yeah, like Xiaohongshu,

30:36

they're releasing this into the

30:38

United States, there, whatever, with TikTok

30:40

that one's like, and then their

30:43

new AI now they got their

30:45

DeepSeek and it's like screwing

30:48

up our stock market or whatever

30:50

and meanwhile you can't even like

30:52

open Twitter in China yeah Like you

30:54

have to have like a VPN, there's not

30:56

going to be reciprocity. Yeah. The thing

30:59

with, which one was it? Was it

31:01

was with RedNote? With Little Red

31:03

Book. They literally freaked out because all

31:05

these Westerners jumped on so quickly in

31:07

some weird coordinated push mostly among conservatives.

31:10

by the way, which is super weird,

31:12

all this pro-CCP propaganda coming from conservatives

31:14

all of a sudden, and they all

31:16

jump onto this thing, and then the

31:19

Chinese are like, oh crap, Chinese people

31:21

can see Western content, and they're scrambling

31:23

to shut that down. There's, this is

31:25

not, this is not how they play, right?

31:28

Some of the funniest videos were

31:30

like the Chinese people being like,

31:32

get your gay shit out of here, basically.

31:35

Take your content away from

31:37

here. We don't want it. I

31:39

mean, they've been using the word baizuo

31:41

for a long time, white left. Yeah.

31:43

And they, like, kicked Nick

31:46

Fuentes off like that. And he

31:48

was like, ugh. I want to go there

31:50

because it doesn't censor me and they banned him

31:52

before he posted anything. Why do you

31:54

attribute that? What do you think that is

31:57

going on with the kind of right wing

31:59

Chinese propaganda? or even like Russian

32:01

propaganda in some respects. There are

32:04

kind of a handful of

32:06

explanations. Because I thought they were

32:08

like America first people, so I

32:10

would think you would be more

32:12

isolationist and which I understand. I

32:15

understand the isolationist part. I don't

32:17

understand the I guess is it

32:19

just like we like dictators? Do

32:21

you understand we like money? Do

32:23

you understand capitalism always wins?

32:26

Yeah, do you understand that

32:28

we like money? I do understand

32:31

that. Apparently not enough because I'm

32:33

not rich. That is not the

32:35

only kind of explanation, but that

32:38

is a very simple explanation. Some

32:40

people might be, I have no

32:42

evidence of anybody, I'm not accusing

32:45

anybody, some people probably are

32:47

in fact being paid. The Chinese,

32:49

as the stereotype goes are tricky.

32:51

You may not have, you know, some

32:54

guy whose last name is Chan

32:56

showing up and like with a

32:58

briefcase, with a briefcase and a

33:00

checkbook would you like to write

33:02

some tweets for us? Yeah from

33:04

Costco. You're probably going to have

33:06

somebody that's not directly connected who's

33:08

you know a couple layers in

33:10

would you... Like the Tenet

33:12

Media thing, right? 10 million dollars

33:14

coming in from Russia. That was

33:16

like some random European guy that

33:18

had no real history at all.

33:20

Like even that was a little

33:22

suspicious. little suspicious. Like I would

33:24

love to know what was going

33:27

on not to call him out

33:29

but in Tim Pool's mind when it

33:31

was like we're going to give

33:33

you a hundred thousand dollars

33:35

a week and this is

33:37

totally normal. He justifies it

33:39

though. I know. He's like you know

33:41

this is these are average rates for

33:43

someone of his caliber. Oh, my God.

33:45

I mean. caliber. But yeah,

33:48

I mean, those are, if

33:50

you're getting millions of downloads,

33:52

you can command hundreds of

33:54

thousands of dollars in. I guess,

33:57

I mean, nobody ever asks me

33:59

for this. I think they know that

34:01

I would rat them out. For as much

34:03

as we are called grifters,

34:05

we really suck at it. I am like,

34:07

the worst ever grifters. I'm like, I

34:09

can't believe I can't call the

34:12

grifter hotline. Yeah, no joke,

34:14

right? It's like, they're like, who

34:16

pays you? There's some rumor,

34:18

by the way, you know,

34:20

you asked me earlier, what

34:22

I get misrepresented with, but

34:24

there's a rumor, this isn't

34:26

a misrepresentation. People think Israel's

34:28

paying me, they think the

34:30

CCP's paying me, they think

34:33

that I have all these like

34:35

weird shadowy funders, and I'm

34:37

like, I wish. Dude, what

34:39

are you talking about? No,

34:41

I have Patreon, like please

34:44

send $5, you know. Yeah, like,

34:46

do you think I'd be like,

34:48

I am one day away from

34:51

going on. What's that fucking site,

34:53

Cameo? Yeah, right. Like I'm

34:55

just a couple more

34:57

nervous breakdowns away from

35:00

cameo. Yeah, that's right.

35:02

So what do you attribute

35:04

like that? What do you

35:06

consider the woke right? And

35:09

here's my issue with what

35:11

I see sometimes from the

35:13

new. Like this, I also hate

35:16

this term, the IDW. Well,

35:18

that's a word I haven't

35:20

heard a while. I know, it's

35:22

so 2018. But that

35:25

kind of space as they, you

35:27

know, I'm not part of that

35:29

because I'm just part of

35:32

the dark web part. Let's

35:34

say that people got pushed

35:36

right. I think. even I'm not

35:39

a conservative by any means,

35:41

but I would feel I

35:43

think the criticism is that

35:45

these people are now trying

35:47

to gate keep the right in a

35:49

way that seems like it's because

35:52

of moral reasons or purity reasons

35:54

or we need to purge some

35:56

of the badness on the right

35:59

when really It's just like competing

36:01

for market share. Yeah, there's I think

36:03

a lot of puritanism as competing for

36:05

market share going on. The right seems

36:07

to have, you know, realized I think

36:09

ahead of the game that it was

36:11

going to get a taste of power.

36:13

Now it has. I don't know what

36:15

it means to say the right has

36:17

power. Trump has power. I don't know

36:19

what the hell you're talking about, but

36:21

the right has power and they've all

36:24

gone nuts. And you can definitely see

36:26

that scramble to be who's going to

36:28

be at the top of the pile,

36:30

who's going to be near the White

36:32

House, who's going to be this, who's

36:34

going to be that. I love it.

36:36

And that's typical, but the amount of

36:38

mind losing. It's like, I think there

36:40

are there are elements where it has

36:42

been led where it has been orchestrated

36:44

like I mentioned the Russian propaganda the

36:46

bots are very extreme a lot of

36:48

the Nazi content on X is coming

36:51

from Russian sources. That's designed to drag

36:53

in, unfortunately, our veterans and teenage boys

36:55

primarily into that crap. And why they

36:57

have like willing compatriots in the conservative

36:59

movement, I'm not entirely sure, but I

37:01

like money too. So I'm assuming that's

37:03

what it is. But what, you can

37:05

get dragged along in this kind of

37:07

like enthusiasm, right? You can, money

37:09

doesn't have to be involved.

37:11

Right. And then you've got to figure

37:13

out how to compete for market share

37:15

and being a little edgier and a

37:18

little harder. I'm not just rejecting the

37:20

left I'm rejecting it a little harder

37:22

everything in fact maybe that the left

37:24

said that was bad some of it's

37:26

actually good maybe in fact mustache man

37:28

was good you know and they go

37:30

on and on and on and well

37:32

the thing is it doesn't matter like

37:34

the old saying is it doesn't really

37:36

matter there's no such thing as bad

37:38

press, right? All press is good press, just

37:40

spell my name right and If they

37:42

come out and say something, like everything

37:45

we learned about World War II was

37:47

a lie, or that maybe the mustache

37:49

man had some, that's Hitler, for people

37:51

not paying attention, had some good solutions,

37:53

then they're going to go viral. They're

37:55

going to go viral because they did

37:57

the edgy thing. And people are going

37:59

to, some people, 90% of them are

38:01

going to be mad. There's a book

38:03

about this style of marketing that allegedly,

38:05

I don't know if it's true that

38:07

Milo Yiannopoulos allegedly studied in order to

38:09

do his kind of shock jock thing

38:12

10 years ago. That book was called,

38:14

Trust Me, I'm Lying. People can look

38:16

it up. It's an interesting read. But

38:18

it's like, I don't know, it's like,

38:20

if this dark tome should even be

38:22

out there. I don't know if I've ever recommended

38:24

I've recommended it. People do know about

38:26

this right but this kind of and

38:28

so you want to be able to

38:30

insulate yourself or inoculate yourself against you

38:32

know this kind of manipulation but the

38:34

fact is that edgelording is its

38:36

own energy right yeah if you're trying

38:39

to compete to have like the biggest

38:41

show that gets the most views you're

38:43

not going to get a clip that

38:45

goes like mega viral unless you say

38:47

something a little bit out there yeah

38:49

and there becomes this competition and then

38:51

there's like I'm not joking I think

38:53

there's actually like the dynamics with the

38:55

young men and women, I think there's

38:57

a sexual dynamic to it, not in

38:59

the direct sense, but sexual competition, right?

39:01

So you have young women who are

39:03

rightly horrified by some of the very

39:06

awful things they're seeing with crime from

39:08

illegal immigrants and so on, they want

39:10

to feel safe, and then you have

39:12

guys that are like, I won't just

39:14

deport them, I'll deport them all, I'll

39:16

find the legal ones, the ones

39:18

with the wrong color skin, I'll get

39:20

rid of them too for you for

39:22

you for you, do the edgier, harder

39:24

thing more to impress the girl who's

39:26

kind of, they're all kind of dipping

39:28

in this, this madness together. I think

39:30

that that's actually part of how they've

39:33

radicalized themselves. But I cannot leave out

39:35

that this is also driven. There are,

39:37

if that's like a bunch of, we'll

39:39

say sheep moving in a direction, there

39:41

are sheep dogs and there are guys

39:43

with whips. Making it go where they

39:45

want it to go behind it and

39:47

those people are foreign governments They might

39:49

they might be deep state They might

39:51

just be you know people who want

39:53

to see the world burn. I don't

39:55

know who they are Yeah, I do

39:57

know the foreign governments do this for

40:00

certain. Yeah Yeah, there's it's funny to

40:02

see how it reminds me because they,

40:04

you know, Trump has power, yes, but

40:06

the thing that I'm most fascinated with

40:08

because it is more my lane is

40:10

just like culture. And as I've mentioned

40:12

many times before at this point on

40:14

this podcast, I've never lived through a

40:16

time where the right was cool. Yeah.

40:18

That's just not, and I can't think

40:20

of a time in America when that

40:22

was even the case. So yes, they

40:24

may have had some power politically and

40:27

they may have had power in the

40:29

judicial power Supreme Court. But when was

40:31

it that like they were, you know,

40:33

people were doing like the Trump dance

40:35

and athletes and like the young men

40:37

are kind of like, yay, you know,

40:39

right wing. generally there is that old

40:41

saying of like if you're not a

40:43

leftist in your 20s you know and

40:45

if you're not and this has been

40:47

completely inverted. Yeah it's a complete inversion

40:49

it's flipped over on its head and

40:51

that's a ton of energy and some

40:54

of it's very organic and real and

40:56

some of it's going to get co-opted

40:58

because I mean look at A lot

41:00

of people don't understand it. As I

41:02

talked about, social media is very fake.

41:04

If something divisive starts to go viral,

41:06

some debate or something starts to go

41:08

viral, it doesn't matter what the issue

41:10

is. And there's going to be people

41:12

like, you know, women shouldn't be in

41:14

the kitchen, women should be in the

41:16

kitchen, what it could be in the

41:18

kitchen, what it could be, anything like

41:21

that. Repeal the 19th. There will be influence networks on

41:23

that within hours, right, and they're seeding,

41:25

right? What they'll do is get in people's

41:27

replies and you know you got one

41:29

guy with 75 phones or whatever and

41:31

what he's going to do or 75

41:33

accounts what he's going to do is

41:35

three fairly reasonable comments that agree with

41:37

your position and then one that's like

41:39

you know, gas the Jews, and and

41:41

they do this is the strategy that

41:43

but I get it I mean it's

41:45

an unexpected thing to have said in

41:47

that moment but yeah that's that's the

41:50

strategy though right is to take it

41:52

to the next thing like to just

41:54

seed the next more extreme thing So

41:56

you're getting all this affirmation in your

41:58

replies for the extreme thing that you've

42:00

already declared or for the extreme position

42:02

that they want to Like take further

42:04

and then a handful, some percentage, 5, 10,

42:06

20% of the replies are going to

42:08

get more extreme and they tend to

42:10

get more extreme over time Here's an

42:12

example not of that specific thing, but

42:14

where you can see there was over

42:17

Christmas almost everybody saw this ridiculous debate

42:19

about H1B visas. They just came out

42:21

of nowhere, right? Well, you could see

42:23

it started with some influencers and then

42:25

it started to get really ugly and

42:27

then it got really like, do we

42:29

really believe this? And then all of

42:31

a sudden, I suddenly did see the

42:33

end. Well, what I saw though was

42:35

a bunch of the influencers who were

42:37

very hard in the paint at the

42:39

beginning were like, whoa, whoa, whoa, whoa.

42:41

You could see they lost control of

42:44

it, right? I guess what

42:46

happened. I mean, I don't have

42:48

proof that I can, you know,

42:50

produce for you, but I can

42:52

guess what happened. What happened? An

42:54

influence network decided probably Katari or

42:56

Russian or... Pakistani, Pakistan hates India,

42:58

who hates India, like start, you

43:01

know, doing the math, you know,

43:03

who can disrupt it, gets involved

43:05

and starts pushing the issue and

43:07

taking it deliberately steps further than

43:09

it's supposed to go over the

43:11

course of three or four days

43:13

to where all of a sudden

43:15

the influencers who started the whole

43:17

thing are like, whoa, what's happening,

43:19

right? And this wasn't what we

43:22

were saying. You can see that

43:24

they can lose control of it.

43:26

I think the reason for that

43:28

isn't that the debate lost control

43:30

of itself, but rather that there

43:32

are entities pouring gasoline or spraying

43:34

gasoline like flamethrowers into

43:36

it to drive it in particular

43:38

directions. Yeah, that's what's interesting is

43:40

you have the influencers who are

43:43

kind of like leading the sheep,

43:45

but then they lose control of

43:47

the sheep at some point. I

43:49

mean, that's like the story of

43:51

like every mob ever, right? Yeah.

43:53

Yeah. So what is your concern?

43:55

What is your concern with the

43:57

right? Like why did you kind

43:59

of come up with this? Is there

44:01

a real Christian nationalist problem that

44:04

we should be concerned with or

44:06

is it overwrought? Well, according

44:08

to the pastors I'm talking to

44:10

around the country because I speak

44:12

at a lot of churches now

44:14

maybe that's ironic I don't know

44:16

but I speak in a ton

44:18

of churches and I talk to

44:20

a lot of pastors and this

44:23

this same issue it's like a

44:25

lot of them are telling me

44:27

this is the number one issue

44:29

now in their church that young

44:31

men under 40 are almost You

44:33

know, you almost can't talk to

44:35

them. They don't want to hear

44:37

anything from anybody that's elder in

44:39

the church. They have very radical

44:41

views. A lot of them don't

44:44

even have families or relationships and

44:46

don't seem to want one. They're

44:48

just... there and kind of intense

44:50

and that this is a leading

44:52

problem at least in certain areas

44:54

in the country northern Idaho northern

44:56

Utah yeah parts of Arizona and

44:58

Florida my first guess would have

45:00

been northern Idaho by the way

45:02

they call it literally the Moscow

45:05

mood. You got the glass, I

45:07

almost said, Moscow Mule copper mug,

45:09

but that, even the pastor who

45:11

started that shout out to Russia

45:13

by the way Yeah, there you

45:15

go. Even the pastor— Do

45:17

they have ginger beer

45:19

like natively in Russia though. I

45:21

don't think they do I don't

45:23

know I don't know But I

45:26

think I said that he would

45:28

get oranges to grow in Siberia

45:30

if given long enough to teach

45:32

them Soviet theory So maybe you

45:34

could get the lime at least

45:36

No, even the pastor who kind

45:38

of started that Moscow mood Northern

45:40

Idaho thing, Doug Wilson, is now

45:42

like, holy crap, they all hate

45:44

Jews, what do I do? And

45:47

like, I just saw a video

45:49

this morning while I was flying

45:51

in today. I'm like, he's talking

45:53

and he's like, this is wrong

45:55

guys, like, he's lost control of

45:57

it, right? It's gone beyond him.

45:59

And so I'm hearing that this

46:01

is a major problem in kind

46:03

of the church networks all over,

46:05

with a lot of like Southeast

46:08

Tennessee. Again, the same, I have

46:10

friends in both areas who are

46:12

saying this is like a major

46:14

problem in our churches. So I

46:16

think it's- Is it anti-Semitism? Some,

46:18

but it's mostly this like, we

46:20

can't vote our way out of

46:22

this, like this weird, like black

46:24

pill, everything's despair, we need very

46:27

radical solutions, you know, almost- But

46:29

also God? in a manner of

46:31

speaking, usually very strict interpretations of

46:33

what God doesn't, doesn't like. It's

46:35

almost, you know, getting into that

46:37

kind of almost puritanical flavor with

46:39

these guys. And what I'm hearing

46:41

those... Like, it sounds a little

46:43

bit like any religion, religious extremism

46:45

is just, it sounds like I'm

46:48

gonna be in a burka soon,

46:50

you know? Yeah, right. And this,

46:52

so you asked me really kind

46:54

of like, what am I worried

46:56

about... with this whole broader phenomenon.

46:58

I think the Christian nationalist issue,

47:00

however big it is, is embedded.

47:02

It's like the Narnian version, because

47:04

I call the Christian world Narnia,

47:06

which they take as flattering, which

47:09

I'm glad they do, because I

47:11

sort of mean it that way.

47:13

But it's also removed, right? The

47:15

mainstream America does not know what's

47:17

going on in the Southern Baptist

47:19

Convention and does not care. But

47:21

it's like big politics inside the

47:23

Southern Baptist or inside Narnia, right?

47:25

The Christian nationalist thing is like

47:27

the Narnian version of the bigger

47:30

woke-right phenomenon, post-liberal wanting to, you

47:32

know, think of a new order

47:34

that doesn't follow, you know, individual

47:36

liberty and property rights as the

47:38

primary drivers and so on. And

47:40

what I'm afraid of, in fact,

47:42

with a lot of the radicalism

47:44

in particular, I don't think that

47:46

this is meant to win. I

47:48

don't actually have a sincere fear

47:51

yet that, you know, we're going

47:53

to... lurch into, you know, 1933,

47:55

1934, Germany, all of a sudden.

47:57

What I'm actually more concerned about

47:59

is that in 2026, the year

48:01

of our Lord next year, we

48:03

now have an election in which

48:05

a lot of people were scared

48:07

and held their nose and said,

48:10

I'll vote for Trump. And they're

48:12

going to be like, whoops, time

48:14

to vote for Democrats again. Right,

48:16

right. Right. And all the Democrats

48:18

have to do and they're not

48:20

doing a great job so far.

48:22

But all they have to do

48:24

is remain modestly sane. They can't

48:26

though. The culture side can't, but

48:28

the politicians can. The politicians can

48:31

come in and just say straight

48:33

economic populism. They can blame everything

48:35

on Elon Musk, get the billionaires,

48:37

it's the billionaires, it's the billionaires,

48:39

and start to sound totally sane.

48:41

and hit that, you know, Bernie

48:43

Bro base and get them reactivated

48:45

that it's not all about race

48:47

and sex and sexuality. They can

48:49

even start to find ways. That

48:52

Bernie Bro base voted for Trump

48:54

though. I know. The goal I

48:56

think would be to bring them

48:58

back. I mean, this is what

49:00

I've been saying. Well, they might

49:02

be down with the anti-Semitism, I

49:04

don't know. like the Palestine stuff

49:06

is so embedded in that the

49:08

young people to be for you

49:10

know, to have rightful empathy for

49:13

the suffering of the Palestinian people

49:15

without and this was something that

49:17

I notice even amongst normies in

49:19

my own life is that they

49:21

would send me these videos from

49:23

certain huge influencers just asking questions

49:25

just asking questions you know why

49:27

well why Why is this? But

49:29

they don't know enough about the

49:31

history of anti-Semitism to recognize that

49:34

it's actually anti-Semitism. So they'll be

49:36

sending me a video. I'm like,

49:38

this is an ancient anti-Semitic trope.

49:40

And they're like, no, it's just

49:42

like, they're just asking questions. And

49:44

so this is... Imagine you're

49:46

somebody that's at peacetime in your

49:48

day-to-day life being exposed to outright

49:50

war propaganda in a war that

49:52

you don't understand with historical context

49:55

that you don't have and how

49:57

you'd react to it. But also

49:59

like on TikTok these kids don't

50:01

think Helen Keller was real. You

50:03

know like there's... That reminds me

50:05

though, do you know why Helen

50:07

Keller wasn't good at driving? Do

50:09

I want to know? Because she's

50:11

a woman. I was talking to

50:14

Cat D just yesterday for this

50:16

podcast and I love her takes

50:18

and thoughts on media and whatnot

50:20

and she thinks there's going to

50:22

be some kind of re-centralization or

50:24

legacy media is going to have

50:26

to reassert itself because it's so

50:28

fractured that no one knows what

50:30

to believe anywhere. But I

50:32

feel like that's very optimistic to

50:35

think that I'm like, I don't

50:37

think there's any going back. But

50:39

so how does one, how does

50:41

one mentally inoculate themselves in order

50:43

to, and I was talking to

50:45

this about my husband, like you'll

50:47

hear something from someone who's maybe

50:49

got some extreme views, and then

50:51

they have views that you're like,

50:53

well, they're not wrong about that,

50:56

but how do you. How do

50:58

you navigate this time as just

51:00

a normie? That's insanely hard. And

51:02

by the way, my phrase for

51:04

that is we are the fake

51:06

news now, because it's like we're

51:08

the media now, so we always

51:10

hear that from the conservatives. I know.

51:12

No, we're the fake news now,

51:14

guys. We're the fake news now,

51:17

guys. We're the fake news now.

51:19

I just love that you're like,

51:21

well, the New York Times is

51:23

bullshit, so I'm going to believe

51:25

this guy instead. How much is it going

51:27

to cost? Like life-changing money for

51:29

that guy is like $60,000. Whereas

51:31

what are you going to do

51:33

with the New York Times editor?

51:35

And for all you want to

51:38

say about these publications, you've worked

51:40

with them, I've written for the...

51:42

Atlantic, they do actual fact-checking, you

51:44

know, like they will reach out

51:46

to people in your past and

51:48

say, did Bridget go to this

51:50

high school? The fact-checking, at least

51:52

on that, is really rigorous. Yeah.

51:54

I mean, some of it's mental,

51:57

but it is rigorous. Like they

51:59

check everything. Yeah. And there's not

52:01

that on a podcast. Turns out

52:03

we're not fact checking anything. Actually,

52:05

let's give you some credit because

52:07

Rogan does. Rogan does. I said

52:09

something on his show one time

52:11

and I got like called a

52:13

few days later and like, can

52:15

you provide a source for this

52:18

thing because it's the one thing

52:20

we can't find? And I'm like,

52:22

you guys looked it all up?

52:24

And they're like, yeah, we check

52:26

every single claim. And they couldn't

52:28

find it. So I sent him

52:30

the citation, the paper, not to Rogan, I

52:32

think it was Jamie or whatever,

52:34

sent in the paper. I'm like,

52:36

oh, okay, there it is, okay.

52:39

I don't need to check everything,

52:41

but you know, if it sounds

52:43

pretty out there, they check it.

52:45

But how do you navigate this?

52:47

You've got to slow down, first

52:49

of all, the pressure to like

52:51

throw takes or gasoline on the

52:53

fire like right now to get

52:55

involved to like the Twitter expert

52:57

phenomenon, you know, like, I don't

53:00

know. there was a gas leak

53:02

somewhere so now everybody's an expert

53:04

in gas things like today like

53:06

immediately within five minutes everybody on

53:08

the internet got a PhD in

53:10

gas fittings and it's my favorite

53:12

part of the internet right it's

53:14

like people who want to understand

53:16

what's going on have got it

53:18

like there's got to be some

53:21

stepping back yeah um Jesse Kelly

53:23

talks about that a lot For

53:25

example with any huge like a

53:27

school shooting or something happens. He's

53:29

like pause pause you don't have

53:31

to talk about this for 48

53:33

hours Let some dust settle. Yeah,

53:35

you know, let's see where it

53:37

goes before you have to just

53:39

dive in so actually that's pulling

53:42

some heat out of the kind

53:44

of, you know, escalation, the fire

53:46

that's raging around these issues, but

53:48

You have to almost double check

53:50

virtually everything off multiple sources, which

53:52

is you know on the one

53:54

hand we get to do our

53:56

own research. Now on the other hand

53:58

it turns out research is hard

54:01

and it ups the level of

54:03

research that you have to do

54:05

of course But people don't really

54:07

know how to do research. This is what

54:09

my husband is saying He's like people who

54:11

do their own research like don't know actually

54:13

even how to do research And I used

54:15

to tell people you know you should

54:17

of course find some people who you

54:19

know have in the past provided receipts and

54:21

shown that they know what they're talking

54:23

about, but that's not good enough anymore.

54:25

Yeah, right because it's actually really easy

54:28

to go like string together a bunch

54:30

of crap you found on five different

54:32

Wikipedia sites and like create a convincing

54:34

story about some war that you don't

54:36

know anything about or just to like say

54:38

seven true things and then three really out there

54:40

things on the back end of it. And you

54:42

think, well, that guy was really

54:44

credible, you know? And then all of

54:46

a sudden... Or they take something that

54:49

isn't hidden. It's just not studied as

54:51

much as something else and say, oh,

54:53

this is hidden because it's a conspiracy.

54:55

It's like, well, no, we all know about

54:57

this. Well, that's a big one. People got

55:00

to watch out for, which is the 'they

55:02

don't want you to know that, therefore it's

55:04

true' approach is really bad, right?

55:06

But the fucking mainstream media did

55:08

this. They sure did. I'm furious

55:10

at them for it because they

55:13

broke people's brains over COVID. They

55:15

lied. All these journalists abdicated their

55:17

duties of just pursuing truth and

55:19

trying to go viral or

55:22

trying to be on the right

55:24

side of history or be with

55:26

the right party or go to

55:28

the right parties. And now there's

55:30

you don't really know who or

55:33

what to believe. It's very very

55:35

difficult. And it's very easy to

55:37

manipulate a population like that.

55:39

That's right. And a lot of

55:42

people who have good heads on

55:44

their shoulders don't want to deal

55:46

with it and check out. And I do,

55:48

this is where I am... am I one of

55:50

those people. The check out people?

55:53

Who's been manipulated? Oh, yeah, we

55:55

all are. Yeah. Absolutely. We all

55:57

are. And it's like, again, are

55:59

you know. If the algorithm is

56:01

just feeding us what we want,

56:03

aren't we all radicalizing ourselves? I

56:06

was like, we are. We are.

56:08

Well, to your question about, you know,

56:10

were you manipulated into voting for Trump? The

56:12

answer is maybe, but shall we watch

56:15

a few videos of Kamala Harris? Can

56:17

it bring you back to Earth? But

56:19

also, you have the lies about him. Oh,

56:21

yeah. It should be a fucking huge scandal.

56:23

Well, I mean, so you asked me about

56:26

me being misrepresented. This

56:28

is like, like, I wake up. Who am I

56:30

today according to the internet,

56:32

right? Internet.com says that I

56:34

am now, the biggest misrepresentation

56:36

to answer that question was that

56:39

I hate Christians. This is, this

56:41

drumbeat is happening everywhere. This is

56:43

happening on shows, it's happening in

56:45

DMs, it's happening in text messages,

56:48

it's happening in the White House.

56:50

Yeah, I am a secret mole

56:52

to undermine Christianity by reading the

56:55

gospel. And well, because you were

56:57

an atheist and yeah, must be

56:59

right? It's like Jesus, if you

57:01

actually, I'm not supposed to talk about

57:03

the Bible because I'm, you know,

57:05

the underhanded guy. Do you know,

57:07

maybe you don't know what is

57:10

the one thing that's warned about

57:12

most in the Bible? False idols.

57:14

False teachers. Yeah. That's it.

57:16

They're like, people will pretend

57:18

to be good guys and be

57:20

lying wolves in sheep's clothing. And

57:23

it's like. So they're like, well,

57:25

James must be that because he's not

57:27

a brother. And I'm like, do I have

57:29

to go, like, put on, like, a special

57:31

outfit to be a brother? Like, do I

57:34

have to... Do you think if Russell Brand

57:36

put on his underpants and took me into

57:38

the river, that I would count? I made

57:40

a video about this. I haven't really see

57:42

yet. I bet I wouldn't count. I bet

57:44

you if Russell Brand baptized me and gave

57:46

me one of his, you know, amulets

57:49

that I still wouldn't count. I'm like, isn't

57:51

this against Christianity? Like, even my, the woman

57:53

who does my eyebrows is very Christian and

57:55

she was like going off about this one

57:58

day because she said, she's like, no. you

58:00

don't need an amulet to protect you,

58:02

you need God. You know that it

58:04

was, it's like this is idolatry. The

58:06

armor of God is not a $275

58:09

amulet sold by Russell Brand. No, the

58:11

reason I bring him up, though, specifically

58:13

with this is because I said, I don't

58:16

think that guy's real. as in his

58:18

conversion to Christianity, I don't think he's

58:20

real, like his history, the guy's an

58:22

actor, he just got accused of a

58:24

bunch of sexual assault and all of

58:26

a sudden he found God. And then the

58:28

whole right's like, we love you! And I was,

58:30

and he's on stage praying with Tucker Carlson

58:32

like the next day and I'm like,

58:34

no, right? And people were like, James

58:36

hates Christians. And I was like, oh my gosh, it's

58:39

like. I put it much more

58:41

simply than this, why can't people

58:43

detect a scumbag in their midst,

58:45

like no matter which party? There

58:47

it seems like partisanship has...

58:49

and maybe it's because I'm

58:51

a female but I'm like

58:53

Gavin Newsom? Scumbag. Russell Brand seems

58:56

like a scumbag like why

58:58

are scumbag detectors so broken

59:00

by partisan politics that you

59:02

have allowed these literal and

59:04

sometimes predators to just move

59:06

seamlessly from one place to

59:08

another whether it's a male

59:10

to a female prison whether

59:12

it is an influencer who's

59:14

just like conveniently becoming something

59:16

so that they can pivot

59:18

out of, you know, whatever bad press they

59:20

might be getting. I don't, or an

59:22

influencer who didn't exist a year ago

59:24

and now has like a million followers.

59:26

You can see they had one weird video

59:28

that went viral for like, whatever. Yeah,

59:30

so I met. It seems like people

59:32

should be better attuned to just

59:35

recognizing scumbags. You would think so.

59:37

I went to DC, stayed at the Trump

59:39

DC back when it was a Trump DC.

59:41

So it was a long time ago. And, um.

59:43

I won't name any names, but I was

59:45

with a group of people and

59:47

they called this guy over who's

59:49

a famous person or was a

59:52

famous person sort of in conservative

59:54

circles. And the second I met

59:56

this guy. Matt Gaetz? No comment.

59:58

No, it was not Matt. So I

1:00:00

know Matt, he's all right. As soon

1:00:02

as I met this guy though, like

1:00:04

just look in his eyes, right? As

1:00:06

soon as I saw the look in this

1:00:09

guy's eyes, I was like, oh,

1:00:11

this guy's a sociopath, right? And

1:00:13

then he started talking, and

1:00:16

it's

1:00:18

like, we were talking, and he

1:00:21

wanted to talk to me about

1:00:23

the grievance studies papers,

1:00:25

and I was telling him, like... it hurt,

1:00:27

and it wasn't a funny story

1:00:30

and, like, he just started laughing

1:00:32

hysterically and I'm like holy

1:00:34

crap this guy's a psycho and everybody

1:00:37

else that I was with was like

1:00:39

oh he's really interesting he's really great

1:00:41

and I'm like how do you not

1:00:44

see it what's going on I see

1:00:46

this in tech too this happens in

1:00:48

tech I was at this event and

1:00:51

this guy who's very famous was talking

1:00:53

and I was like getting chills down

1:00:55

my spine and everyone's just like money

1:00:57

yeah I looked at this guy, sitting

1:01:00

next to me, I'm like, am I

1:01:02

the only one who's terrified by what

1:01:04

this guy is saying? And he's like,

1:01:07

no, he needs to be, he's like

1:01:09

the only other normal dude, he's

1:01:11

like, he needs to be stopped.

1:01:13

Do you know that I, with

1:01:15

this whole woke right thing? I

1:01:17

quoted you, I didn't put your

1:01:19

name on it, so you got

1:01:21

no credit, but it's one of the most important

1:01:23

things that you ever said

1:01:25

to me. Well, not

1:01:27

exactly. When I started to see

1:01:30

in September, October like the right

1:01:32

gearing up to kind of go

1:01:34

like buck wild with all this

1:01:36

like radical stuff and I've been tracking

1:01:38

all this for a while. And I

1:01:40

said one of the most important things

1:01:42

to know is when to leave a

1:01:44

party that's about to go sideways. And

1:01:46

that was something you told me and

1:01:49

I was like, no shit.

1:01:51

That's 100% true. And I

1:01:53

was like, this is. This

1:01:55

whole movement, this weird like

1:01:57

high-energy Conservative Inc.-ish movement

1:01:59

is about to get weird.

1:02:01

I'm like, I gotta get distance. Like,

1:02:03

I've got to get distance from this.

1:02:05

And I didn't deliberately create distance by

1:02:08

going provoking things, but I was like,

1:02:10

I've also gotta speak up. Do what?

1:02:12

Maybe. No, no. I wanted, like, I

1:02:15

needed psychological distance, and I needed to

1:02:17

know that I could speak up. to

1:02:19

differentiate myself from the crowd as I

1:02:21

watched it kind of like running enthusiastically,

1:02:24

not the wrong way, but a little

1:02:26

bit the wrong way. You know, you

1:02:28

know, like a little bit off course,

1:02:31

not straight, but a little sideways, right?

1:02:33

Like, here's a historical story.

1:02:35

Once upon a time, there was a Korean

1:02:38

Air flight, a 747, that took off. I think...

1:02:40

I don't remember the detail where it

1:02:42

took off from maybe it was from

1:02:44

Japan I don't think it was from

1:02:46

Japan I think it was from somewhere

1:02:48

in Seattle or something like that but

1:02:50

it was a Korea Air and it

1:02:53

was flying obviously to Seoul Incheon

1:02:55

Airport, and this is back in the 80s,

1:02:57

and so they set the heading on the

1:02:59

little dials, right, and they set

1:03:01

it off by one degree wrong. One degree.

1:03:04

So instead of flying whatever heading they

1:03:06

thought they were, they were one degree

1:03:08

sideways. Which took them over Soviet airspace.

1:03:10

So the Soviets were like, there's no

1:03:12

way they would send a passenger plane

1:03:15

over our airspace. They're not that

1:03:17

stupid. They must be disguising a spy

1:03:19

plane as a passenger jet. They scrambled and

1:03:21

shot it down, killed everybody. One

1:03:24

degree off from the true course

1:03:26

can have a really big consequence

1:03:28

especially, like, when you're, like,

1:03:30

going into space. Yeah, anytime you start to...
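A rough back-of-the-envelope check on that one-degree point, offered as a sketch rather than the actual flight's numbers: assume a leg of roughly D = 3,000 nautical miles (an illustrative round figure) and a heading held a constant \theta = 1^\circ off course, ignoring wind and autopilot details. The cross-track error is then about

e \approx D \sin\theta \approx 3000 \times \sin(1^\circ) \approx 3000 \times 0.0175 \approx 52 nautical miles,

so even one degree, held over a long over-water leg, leaves the airplane tens of miles off course, and the error keeps growing with the distance flown.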

1:03:33

And what's interesting is that

1:03:35

I've noticed: people who were

1:03:37

like, oh, James was so much

1:03:39

like a canary in the coal

1:03:41

mine, he was so ahead of

1:03:43

the curve, he saw things that

1:03:46

other people didn't see. Now they're

1:03:48

like, those same people are like,

1:03:50

this guy's full of shit. Yep, he

1:03:52

was wrong all the time. He

1:03:54

couldn't possibly be seeing something

1:03:56

on our side that might

1:03:59

be problematic. Fast forward 10 years? Yeah,

1:04:01

no, no, he's just wrong and bad

1:04:03

and evil and don't invite him

1:04:05

to things anymore. What are the

1:04:08

criticisms that you think are accurate? I

1:04:10

mean, I don't behave well on social

1:04:12

media. That's totally fair if anybody

1:04:14

ever says that. So I do

1:04:16

look crazy. So I don't really... You're

1:04:18

much more sane actually sitting with you

1:04:20

than I thought I'd get here and

1:04:22

you'd be like, how are you doing?

1:04:25

I'm a little high strung lately

1:04:27

actually, but just a little. It's

1:04:30

totally possible that I, and I

1:04:32

think this is a very legitimate criticism

1:04:35

that I get, that I am

1:04:37

reading what I'm seeing, and it's

1:04:39

what we've actually already been talking

1:04:42

about, and that I am making

1:04:44

out that in reality something is

1:04:46

happening that's mostly a social media

1:04:48

mirage, right? So that I'm

1:04:50

overblowing... the issue that I'm seeing all these,

1:04:53

you know, Nazi accounts, but they're fake. But

1:04:55

this was something I want to talk to

1:04:57

you about because it does seem even since

1:04:59

you and I first saw one another and

1:05:01

Aspen that the distance between the virtual and

1:05:03

actual is getting smaller and smaller. That's what

1:05:05

I say. I say that Twitter or X,

1:05:07

I guess, is California. Whatever happens in

1:05:10

California hits the rest of the country

1:05:12

in five to ten years, whatever happens

1:05:14

in social media is happening in the

1:05:16

movements, the political movements within a year.

1:05:18

Right. And so I'm like. This is

1:05:21

really a concerning trend and I don't

1:05:23

actually know how much of it's real,

1:05:25

but even if it's not real, it's

1:05:27

becomes real. It becomes real. Right. And

1:05:30

what actually struck me kind of,

1:05:32

this isn't what actually tipped me

1:05:34

over and said, I've got to

1:05:36

start talking about this. There was

1:05:38

something else that was very discreet,

1:05:40

like this I've got to start

1:05:42

speaking up. What actually convinced me

1:05:45

that this is a big problem in reality was

1:05:47

when I went to North Idaho and I was

1:05:49

working with a pastor there at a church and

1:05:51

we held a meeting of a whole bunch of

1:05:53

pastors like a pastor moot or something or whatever

1:05:55

they call that when 30 pastors get in a

1:05:57

room. We have, you know, some kind of deli

1:06:00

lunch and we talked and I was

1:06:02

telling them about literally I got brought

1:06:04

in to talk about queer theory and

1:06:06

how it's impacting the schools and

1:06:08

the children and what it all is

1:06:11

and I talked a little bit about

1:06:13

that and I talked a little bit

1:06:15

about the big picture with you know

1:06:17

China and ESG and all of the

1:06:19

highfalutin World Economic Forum stuff with

1:06:22

them and then I was like well

1:06:24

let's just take the rest of the

1:06:26

time and do like what's on your

1:06:28

mind, right? What are your problems? Maybe

1:06:31

I have some color I can put

1:06:33

on them and they're like one guy

1:06:35

after another is like, groypers. The groypers

1:06:37

are the number one problem in our

1:06:39

church. And I'm like, the what? Really?

1:06:42

Like, I know what the groypers are,

1:06:44

but I'm like, they're on

1:06:46

the internet. Like, they do things in real

1:06:48

life? Like, I had this recent experience,

1:06:51

but go on. So I was like,

1:06:53

oh, oh crap. You know, this isn't

1:06:55

like. A couple of guys saying this,

1:06:57

this is a whole bunch of them.

1:06:59

And that they're all talking, like this

1:07:02

is a major problem in regional, both

1:07:04

church and Republican Party politics, that they're

1:07:06

facing now. And that they've realized these

1:07:08

guys are very dishonest, they come in

1:07:11

and they act like they're normal people,

1:07:13

but they have very radical intentions. So

1:07:15

it's like intentional infiltration. And then everywhere

1:07:17

they go, there's division and all this.

1:07:19

I'm like, toxic. And

1:07:22

I'm hearing this is happening in a

1:07:24

few different places in Tennessee where I

1:07:26

live. I'm hearing, you know, reports that

1:07:28

it's just out of control in the

1:07:30

Ogden, Utah area. I'm hearing reports from

1:07:33

it happening in churches in those communities

1:07:35

in parts of Oklahoma. And I'm like

1:07:37

scratching my head thinking, maybe this is

1:07:39

like, you know, coming off the internet

1:07:42

now. Right. Like it's one thing to

1:07:44

say that the groypers are a huge,

1:07:46

like weird... issue on the internet they're

1:07:48

annoying if they decide to descend upon

1:07:50

you they'll mess up your internet for

1:07:53

three days till they get bored and

1:07:55

you know yeah they might swat you

1:07:57

They might swat you... they did. No,

1:07:59

that's a real-life thing that they did

1:08:02

they'll talk to you but like the

1:08:04

actual real life? I had to talk to

1:08:06

the police so we don't get swatted

1:08:08

And no, they said that this was

1:08:10

the biggest issue. Even with all the

1:08:13

woke left stuff happening, this is the

1:08:15

biggest issue. And that it's kind of

1:08:17

taken over the politics in Boise, they're

1:08:19

all kind of concerned about that. And

1:08:22

I mean, this is what we've talked

1:08:24

about, though, in the past is that

1:08:26

I, and I think you and I

1:08:28

have had this discussion probably on the

1:08:30

podcast, just like the, the, as concerning

1:08:33

as the extreme left is what the

1:08:35

reaction will be from the right. Yeah,

1:08:37

I mean, I definitely remember having that

1:08:39

conversation with you, you know, in like

1:08:41

2019. Yeah. We were, we were, there's

1:08:44

actually. I did Triggernometry when it was

1:08:46

like the smallest podcast in Britain or

1:08:48

whatever back in 2019. I'm sure Constantine

1:08:50

will love that. We talked about it.

1:08:53

It was at the time. He and

1:08:55

I talked. I was just on his

1:08:57

show, now that it's huge, a few months ago. Yeah. He was

1:09:15

like, yeah, but I went on a

1:09:17

show and one of the main things,

1:09:19

it was Peter and I together,

1:09:21

what we talked about with

1:09:24

him was the fear that this was

1:09:26

going to trigger a real racist reaction,

1:09:28

a real like bid for patriarchy or

1:09:30

whatever else. I don't think we had

1:09:33

anti-Semitism like on the radar yet. But

1:09:35

it was, yeah, no kidding. It was

1:09:37

extremely concerning that the reaction was going

1:09:39

to be, that was six years ago.

1:09:41

See, the reason for me that it

1:09:44

was always concerning was because it was

1:09:46

the anti-Semitism on the left that pushed

1:09:48

me out of the left. Yeah. So

1:09:50

I was like, if you

1:09:52

see this on the left and

1:09:55

you see this on the right, generally

1:09:57

historically, that's not a fucking good sign.

1:09:59

No, that's the squeeze on the Jews. I think

1:10:01

they, I mean, I've talked to some

1:10:04

of my Jewish friends and they do

1:10:06

refer to a squeeze from both sides

1:10:08

on the Jewish people and it's like

1:10:10

oh so the and it does feel

1:10:12

like some there's some fucking weird thing

1:10:15

that's encoded in human DNA that's like

1:10:17

it's the Jews. Well what it is

1:10:19

is I mean on a deep like

1:10:21

psychological and philosophical level I mean I

1:10:24

really do mean this I've been thinking

1:10:26

about this for a very long time

1:10:28

and I finally hit upon this answer

1:10:30

the other day and I'm like oh

1:10:32

my god that's the right answer. And

1:10:35

I didn't come up with it, but

1:10:37

I don't remember where I read it.

1:10:39

So somebody gets credit. And it's probably

1:10:41

you. No, I'm just kidding. It definitely

1:10:44

wasn't you. Who are you quoting now?

1:10:46

Yeah, right? Quoting now. I love this

1:10:48

marketplace of ideas where everyone just takes

1:10:50

ideas. There's too many ideas to remember

1:10:52

where they came from. No, I know.

1:10:55

But the thing is, is that the

1:10:57

idea is that on a deep level,

1:10:59

it's collectivism versus not collectivism, right? If

1:11:01

you're trying to create global oneness, which

1:11:04

is what a lot of like the

1:11:06

hippie-dippy sounding religions are trying to do,

1:11:08

but it's also what all the evil

1:11:10

cult religions try to do. communism is

1:11:12

a global oneness program, fascism is a

1:11:15

global oneness program. If you're trying to

1:11:17

create a global oneness theosophical or whatever

1:11:19

thing, and you have a group of

1:11:21

people that are like, no, we're God's

1:11:23

chosen people, we're not doing that. They

1:11:26

refuse to assimilate to whatever the big

1:11:28

prevailing thing is. And then they also

1:11:30

tend to, you know, be very successful.

1:11:32

They're like the original Edge Lords. Just

1:11:35

kidding. Well, they refuse. And so they're

1:11:37

not going to be a part of

1:11:39

a global oneness program. They're like, no,

1:11:41

we've got a covenant. and that covenant

1:11:43

is sacred and it's inviolable. And then

1:11:46

people look at that and like, well,

1:11:48

they're weirdos and we have problems and

1:11:50

where are the problems coming from? Well,

1:11:52

we got these weirdos who are different

1:11:55

from us, who won't get on the

1:11:57

program with us. And what I've seen

1:11:59

studying, you know, totalitarian ideologies now for

1:12:01

a while, you kind of got two

1:12:03

kinds of tyranny, right? You've got the

1:12:06

warlord tyranny where it's just some guy

1:12:08

who's a brute who takes everything over.

1:12:10

like Genghis Khan or whatever, maybe even

1:12:12

Alexander. Then you have these kind of

1:12:15

ideological tyrannies, and these ideological tyrannies are

1:12:17

always of the same flavor. They have

1:12:19

different actual formula, but they're the same

1:12:21

flavor, and that flavor is always, if

1:12:23

everybody did this with us, it would

1:12:26

work. So then you look at the

1:12:28

people who aren't doing it with you.

1:12:30

and it's not working because it's bullshit

1:12:32

in the first place and you're like

1:12:34

it's their fault. Well this was something

1:12:37

that really struck me when I was

1:12:39

in Prague and we had this amazing

1:12:41

tour guide. He spoke like 14 languages.

1:12:43

He was born and raised in Prague

1:12:46

and educated with the Russian history and

1:12:48

then lived through the last Russian tanks

1:12:50

leaving in the 90s. Wow. and had

1:12:52

to relearn history which would be insane

1:12:54

and was also Jewish and he was

1:12:57

he took us down to where the

1:12:59

the ghettos were and the old and

1:13:01

the walls and whatnot and he's he

1:13:03

was explaining that during the plague because

1:13:06

the Jews had been put in basically

1:13:08

their own you know ghetto and had

1:13:10

been they had better sanitation and they

1:13:12

had to get all of it out

1:13:14

one way or another and therefore were

1:13:17

not as affected by the plague and

1:13:19

then they were blamed for the plague.

1:13:21

Of course. It's fucking crazy. So they

1:13:23

were isolated. They had to deal with

1:13:26

that and come up with better practices

1:13:28

around hygiene and then because they didn't

1:13:30

get it. It was their fault. It

1:13:32

was their fault. While there's shit in

1:13:34

the streets. It's my weak-ass takes on communism,

1:13:37

apparently, that have turned young people toward,

1:13:39

like Nazi ideology. I don't think this

1:13:41

is true I think they're trolling but

1:13:43

there are people saying this now so

1:13:45

you have some like I wake up

1:13:48

every day and I'm like who am

1:13:50

I gonna find today? Yeah you can't

1:13:52

I mean I come from the Joe

1:13:54

Rogan school of internet: it's like post

1:13:57

and ghost, don't read the comments, just like

1:13:59

you've got don't argue with people online

1:14:01

all day. You don't, you need one

1:14:03

of those calendars that's like here's how

1:14:05

many days you have left in your

1:14:08

lifetime if you're lucky enough to live

1:14:10

to 80 or whatever and I think

1:14:12

it will help put things like that

1:14:14

time is valuable. Yeah. The thing is

1:14:17

I got kicked off Twitter right for

1:14:19

five months for calling trans activists groomers.

1:14:21

Oh yeah I remember that's good for

1:14:23

you though right? Well yes and no

1:14:25

I was very happy to be off

1:14:28

actually and I think at first especially.

1:14:30

But the problem was is I was

1:14:32

insanely productive. I was so productive. I

1:14:34

was like doing some of the best

1:14:37

research I think I've done this whole

1:14:39

time. My output was like killing people.

1:14:41

Like they couldn't read everything I was

1:14:43

writing and listen to everything I was

1:14:45

recording as podcasts fast enough. Like I

1:14:48

was just like I was like the

1:14:50

Concord. Like flying past everything. And then.

1:14:52

All of a sudden about three or

1:14:54

five months total, about three and

1:14:56

a half months in,

1:14:59

I started to realize that it's like

1:15:01

I'm over here and like stuff I

1:15:03

don't know what people think of what

1:15:05

I'm saying now like I've become disconnected

1:15:08

I don't have that feedback mechanism right

1:15:10

so part of the reason that I

1:15:12

Read the comments is because I don't

1:15:14

know how my stuff is being received

1:15:16

without seeing how it's being received like

1:15:19

when I was saying earlier You know

1:15:21

talking with an audience you can see

1:15:23

their faces right like if they all

1:15:25

start kind of doing this But if

1:15:28

you yourself are saying a lot

1:15:30

of it is manufactured and fake so

1:15:32

how like how can you determine how

1:15:34

you're being received and if it's yeah,

1:15:36

that's like I'm like I'm realizing that

1:15:39

that's like this huge issue now and

1:15:41

I'm trying to calibrate around it and

1:15:43

I don't know what to do because

1:15:45

that's a huge problem right if you

1:15:48

want to make a claim that something's

1:15:50

happening like that there's this woke thing

1:15:52

happening in the right wing or a

1:15:54

fascist or a Nazi or whatever you

1:15:56

need examples to show people right I

1:15:59

mean maybe that's why it bothers me

1:16:01

the woke thing is that it's not

1:16:03

it feels like, on the

1:16:05

left they labeled it cultural

1:16:08

Marxism so is it

1:16:10

Marxism or is it like Nazism?

1:16:12

well you know some of it's

1:16:14

some people are like national

1:16:16

socialists I don't think most

1:16:19

of them are and I

1:16:21

think a lot of that's

1:16:23

troll activity right there's

1:16:25

There's this weird line between

1:16:27

the philosophical school of capital C

1:16:29

conservatism, which is not what most

1:16:32

people politically that are conservative call themselves.

1:16:34

It's actually a school of thought. It

1:16:36

has like kind of outlined ideas. It's

1:16:38

rooted, you know, deeply in tradition and

1:16:40

so on and all this stuff. And

1:16:42

a lot of conservatives when they hear

1:16:44

it's a yeah, I think that way.

1:16:46

But there's a school of thought. And

1:16:48

then that's like conservatism. And then you

1:16:50

have fascism, right, which is

1:16:52


1:16:54

all invested in the state, but where they

1:16:56

kind of overlap is a lot of times

1:16:58

in the sense of a nation as a

1:17:00

people that have shared things in common,

1:17:02

not as a nation as in like

1:17:05

this kind of political entity where there's

1:17:07

a constitution and everybody who happens to

1:17:09

live within the borders and is a

1:17:11

legal citizen, blah, blah, blah. But no,

1:17:13

it's a distinct people who have kind

1:17:16

of an ethnic heritage that somehow

1:17:18

ties them together. And they both

1:17:20

kind of have this way of

1:17:23

thinking in that who you are

1:17:25

is determined in terms of like

1:17:28

your inheritance from that national

1:17:30

culture, right? You could call it

1:17:32

an inherited self versus, you know, like

1:17:34

a self-defined self that the left is

1:17:37

into. I get to be whoever I

1:17:39

want today, you know. I'm going to

1:17:41

change my pronouns every Tuesday at 12.

1:17:43

That's self-definition versus this is, well, you

1:17:46

don't really get to pick who you

1:17:48

are because your society gave you who

1:17:50

you are. And you inherited that and it

1:17:52

would be like, you know, it would be

1:17:54

an insult to throw it off and say

1:17:57

I'm going to be different than the traditions.

1:18:00

it can go that way, it can

1:18:02

get very hierarchical. But the thing

1:18:04

is, is that like the fascist

1:18:06

side is like, it's the same

1:18:08

thing where the Marxists believe like,

1:18:10

oh we can just be progressive,

1:18:12

oh it's not working, we need

1:18:14

state power to force people to

1:18:16

be progressive, right? The fascist thing

1:18:18

is like, oh we can just

1:18:20

have a big national community and

1:18:22

we can think of ourselves as

1:18:25

the national community, which by the

1:18:27

way is the phrasing that the

1:18:29

Nazis used for what they were

1:18:31

building: a national community.

1:18:33

And then: oh

1:18:35

shit, we can't actually make people

1:18:37

like respect it. So we'll use

1:18:39

the state to force them to

1:18:41

do it. And so Capital C

1:18:43

conservatism doesn't believe in the state

1:18:45

forcing people to do that, but

1:18:47

fascism does. But then they have

1:18:49

this huge amount of like blurry

1:18:52

crossover between them in that place

1:18:54

and the, in that place of

1:18:56

who we are as a people

1:18:58

is something, in order for

1:19:00

the society to work everybody has

1:19:02

to understand who we are as

1:19:04

a people and be kind of

1:19:06

on the same program and the

1:19:08

fact is, the

1:19:10

question always comes down to

1:19:12

what do you do with the

1:19:14

people who say no right well

1:19:17

like fine that's a tradition but

1:19:19

I don't want to do that

1:19:21

so I'm not going to. What

1:19:23

do you do with the people who say

1:19:25

no and eventually if your thought

1:19:27

process is that society will decohere

1:19:29

it will fall apart something terrible

1:19:31

will happen if we don't make

1:19:33

sure everybody's on the program then

1:19:35

eventually you have to start forcing

1:19:37

people to be on the program.

1:19:39

Right. It can start through coercion,

1:19:41

then it can get to like

1:19:44

kind of the puritanical cancel culture,

1:19:46

and eventually you end up with

1:19:48

a state apparatus that's like this

1:19:50

is what we're doing, and if

1:19:52

you don't participate, you're not part

1:19:54

of us, and you don't have

1:19:56

any. And then it'd be kind

1:19:58

of a get-up. Yeah, exactly. And

1:20:00

so there's this weird space there

1:20:02

where it's like some of both

1:20:04

of those things are happening,

1:20:06

but one of those things

1:20:08

is really bad.

1:20:11

The hippie-dippy, progressive, kumbaya

1:20:13

kind of people, can't we all

1:20:15

just get along and can't we

1:20:17

all just like gay people or

1:20:19

whatever, like that slope into like

1:20:21

whatever the hell we just lived

1:20:23

through is slippery, right? This slope

1:20:25

now concentrated in the concept of

1:20:27

what it means to be a

1:20:29

nation is also slippery. Right. And

1:20:31

that's where my primary concern is.

1:20:33

And it's also, it turns out

1:20:35

in both cases, I'm reading this

1:20:38

book right now called Account Rendered

1:20:40

about a girl who was 15

1:20:42

in 1933 when Hitler took over

1:20:44

and she became a Nazi almost

1:20:46

immediately. And so, in the late

1:20:48

50s or early 60s, I don't

1:20:50

know the exact timing, she had

1:20:52

finally like deprogrammed and her best

1:20:54

friend as a teenager was this

1:20:56

Jewish girl. And so she's like

1:20:58

writing this letter not of apology

1:21:00

but of explanation. How did I

1:21:02

become a monster basically to her

1:21:05

childhood friend? And I'm reading this

1:21:07

and it's so frequently reminding me that

1:21:09

it's... that young people get caught

1:21:11

up in this. Yeah. It's the

1:21:13

wanting to be a part of

1:21:15

something big. It's wanting to change

1:21:17

the world. It's, you know, we're

1:21:19

15, 16, 18 years old. Nobody

1:21:21

quite takes us seriously. But this

1:21:23

is a serious movement. This is

1:21:25

life and death. This is a

1:21:27

real thing. And real people our

1:21:29

age are really doing it. And

1:21:32

that's, I see... well, what Leonard

1:21:34

Peikoff called the ominous parallels. I

1:21:36

see ominous parallels between the situation

1:21:38

that we're in now and the

1:21:40

situation that you would have in

1:21:42

those states that went into kind

1:21:44

of the fascist pit. But how

1:21:46

is it woke where like left

1:21:48

woke cares about your race? or

1:21:50

what it means to be a

1:21:52

race, right? It doesn't care about

1:21:54

your race. That's a lie. That

1:21:56

was always a lie. Larry Elder's

1:21:59

black, but he wasn't black. The

1:22:01

LA Times called him the Black face of

1:22:03

white supremacy. They cared about whether

1:22:05

he acted politically black, right? Instead

1:22:07

of it being about race or

1:22:09

sexuality or one of these things

1:22:11

at the lefted or economic status

1:22:13

if it's old school Marxist, now

1:22:15

it's concentrated in the idea of

1:22:17

the nation. Who are we as

1:22:19

a people? and they get the

1:22:21

same thought processes about what defines

1:22:23

the people in the nation. And

1:22:26

so that's why there's so much.

1:22:28

But it's hard. Like do you,

1:22:30

sorry to interrupt, do you think

1:22:32

that, isn't there some level of

1:22:34

truth to having to, one of

1:22:36

the things on the left that

1:22:38

has been so, like, degrading has

1:22:40

been the teaching of hatred, hating

1:22:42

America. Yeah. So how do you

1:22:44

cohere as a nation if

1:22:46

your population hates itself? No, that's

1:22:48

right. And that's where it's like

1:22:50

it's reaction and overreaction, right? Because

1:22:53

this woman, Melita Maschmann, that wrote

1:22:55

the book Account Rendered, actually says

1:22:57

that too. It's like, was it

1:22:59

hate that brought me into being

1:23:01

a national socialist to Nazi? She

1:23:03

was a Nazi officer, right? By

1:23:05

the time it ended. Was it

1:23:07

hatred that brought me? No, it

1:23:09

was love for Germany. Love for

1:23:11

Germany. I wanted Germany to be

1:23:13

this great vision and I believed,

1:23:15

these are her words, and I

1:23:17

believed Hitler would deliver on those

1:23:20

promises. And so that desire, the

1:23:22

left and the communists, destroying love

1:23:24

of nation, patriotism, is extremely corrosive.

1:23:26

But that's the other word thing.

1:23:28

But it can go both ways.

1:23:30

Communism is an internationalist movement. But

1:23:32

wasn't it, like, the motherland in Russia?

1:23:34

Wasn't it about the love of

1:23:36

Russia? Well there's a huge, I

1:23:38

mean that's... This is a whole

1:23:40

topic, but there was a program

1:23:42

that was initiated. Stalin came up

1:23:44

with it in 1913. In Russian,

1:23:47

it's korenizatsiya, which means putting down

1:23:49

roots. And it's basically, we call

1:23:51

it DEI today. And I'm not

1:23:53

like, BS-ing my way around details,

1:23:55

like, it is; DEI is the

1:23:57

import of korenizatsiya. And so for

1:23:59

the entire 1920s to roughly the

1:24:01

end of the 1920s getting into

1:24:03

the early 1930s, they were doing

1:24:05

this, whether it was Latvia,

1:24:07

Ukraine, you know, whatever, they were

1:24:09

going around and they were, no,

1:24:11

you get to be your own

1:24:14

people, you get to be your

1:24:16

own nation within the broad Federation,

1:24:18

the Soviet Federation, and we're gonna

1:24:20

help you, your, you know, your

1:24:22

leaders will become leaders in Moscow,

1:24:24

and there's your inclusion program, by

1:24:26

the way. And all of this,

1:24:28

it had, the diversity component was

1:24:30

called raznoobraziye, which literally means

1:24:32

diversity, but unity in content. So

1:24:34

you look different, but you all

1:24:36

think the same. You're all Soviets.

1:24:38

And so that was a disaster.

1:24:41

It did exactly what DEI is

1:24:43

doing now in America, where everybody

1:24:45

kind of hates each other, and

1:24:47

there's all this tension across racial

1:24:49

or ethnic lines, and all these

1:24:51

issues, it fell apart in exactly

1:24:53

the way one would predict it

1:24:55

falls apart in what we've just

1:24:57

walked through. And then Stalin was

1:24:59

now in power. Lenin was dead,

1:25:01

late 1920s early 30s, and he

1:25:03

says the new economic policy, the

1:25:05

NEP has to succeed. Russification. No

1:25:08

more, you're Latvian, you're Estonian, you're

1:25:10

this, you're that, your nation, self-determination,

1:25:12

blah, blah, blah, nope, you're all

1:25:14

Russian now. And he stopped calling

1:25:16

it Soviet Union. Stalin stopped calling

1:25:18

it the Soviet Union at that

1:25:20

point, started calling it Russia. And

1:25:22

that's when the huge motherland push

1:25:24

was brought in. So it's like,

1:25:26

they DEI'd until it didn't work.

1:25:28

And then dictator Stalin with literally,

1:25:30

Stalin is a nickname by the

1:25:32

way, it means something to do

1:25:35

with steel in Russian. I forgot

1:25:37

how to say his real last

1:25:39

name, but Stalin with his steel

1:25:41

fist was like, nope, now we're

1:25:43

all going the other way. And

1:25:45

so it became the massive Russification

1:25:47

project that they initiated, which actually

1:25:49

hearkened back to before the union.

1:25:51

There was a Russification project under

1:25:53

the czar as well. Right. So

1:25:55

it's like, we're going to go.

1:25:57

Yeah, I know we're going to

1:25:59

go all the way back. And

1:26:02

so the thing is, what they

1:26:04

called Stalin for this was the

1:26:06

national socialist. That's what Lenin referred

1:26:08

to him as because of these

1:26:10

programs. And it was considered a derogatory

1:26:12

term because it was supposed to

1:26:14

be international socialism. The Soviet Union

1:26:16

was supposed to be a testbed

1:26:18

that would then be able to

1:26:20

plant seeds of communism, one nation

1:26:22

after another, whether it went in

1:26:24

and did a kind of color

1:26:26

revolution or a coup or whether

1:26:29

it went in and toppled a

1:26:31

nation. But the idea was it

1:26:33

was going to be the like

1:26:35

fortress and communism was going to

1:26:37

spread everywhere and everything that tipped

1:26:39

over would become part of the

1:26:41

Federation. Okay. Okay. I know there's

1:26:43

a lot of like, what the

1:26:45

hell. No, no, no. I mean,

1:26:47

this is again, I'm sitting here

1:26:49

listening going like, I don't know

1:26:51

anything. And also. But this is

1:26:54

one of the memes that they're

1:26:56

getting our young guys with, which

1:26:58

is that international socialism doesn't work,

1:27:00

but national socialism does. Okay. And

1:27:02

you'll see that, like, that's repeated

1:27:04

a lot. And I think it's

1:27:06

from large propaganda accounts. And then

1:27:08

people, you know, it's memeable. They

1:27:10

repeat it. Yeah. So this, because

1:27:12

you're seeing it, yeah. My counter-meme

1:27:14

of socialism sucks. Which it does.

1:27:16

Yeah. But I don't think like

1:27:18

this administration is pro-socialism. So what's

1:27:21

the capitalist version? you know, worst

1:27:23

nightmare scenario here. Oh, what we

1:27:25

have in China right now. China

1:27:27

in the 1980s had a leader

1:27:29

named Deng Xiaoping. and a lot

1:27:31

of people don't know who Deng

1:27:33

Xiaoping is, but Deng Xiaoping, it

1:27:35

wasn't the immediate successor of Mao,

1:27:37

there was like a power struggle

1:27:39

and there's another guy for like

1:27:41

a year and a half and

1:27:43

whatever. Sorry, I'm just laughing because

1:27:45

we were just talking about

1:27:48

him with Curtis Yarvin? Deng Xiaoping?

1:27:50

Yeah, I think he was, Deng

1:27:52

Xiaoping's idea was open up, right?

1:27:54

Right. That was a slogan and

1:27:56

the model was one country

1:27:58

two systems and so it was

1:28:00

going to continue to have the

1:28:02

communist political ideology at

1:28:04

its heart but it was going

1:28:06

to adopt literally the Nazi the

1:28:08

national socialist not the program just

1:28:10

the economic model right right so

1:28:12

a fascist economic model so you

1:28:15

get this communist fascist hybrid where

1:28:17

you have people can make crazy

1:28:19

money with their corporations you can

1:28:21

have whatever big corporation you want

1:28:23

in China in the CEO or

1:28:25

the owner can make crazy

1:28:27

money but the deal is the

1:28:29

state owns all the land the

1:28:31

state owns all the raw materials

1:28:33

the state owns all the heavy

1:28:35

capital, and they'll disappear you if you

1:28:37

disagree with

1:28:39

them or disagree with CCP policy

1:28:42

right which could change tomorrow right

1:28:44

at any given time

1:28:46

or if

1:28:48

you you know speak out against

1:28:50

communism or the present leader or

1:28:52

whatever, as happened with Jack Ma, right,

1:28:54

and so I don't know why

1:28:56

he decided his IPO launch was

1:28:58

a good time to go into

1:29:00

some invective about the CCP, but

1:29:02

it wasn't a great timing for

1:29:04

him And that happens. So what

1:29:06

you have is you don't have

1:29:09

this like advanced super capitalist model.

1:29:11

What you have is a Potemkin

1:29:13

village of capitalism that operates at

1:29:15

the communist state's pleasure, right? And

1:29:17

so my broad thesis is that

1:29:19

this whole, you know, sustainability ESG

1:29:21

bullshit in the West is a

1:29:23

way of doing that same thing

1:29:25

upside down. So now we're going

1:29:27

to use the corporate, we're not

1:29:29

going to use the state to

1:29:31

force the corporations, we're going to

1:29:33

use the corporations to force the

1:29:36

state. So we'll get all the

1:29:38

corporations working in lockstep, whether it's

1:29:40

through, you know, drunken deals at

1:29:42

their ski party in Davos, whether

1:29:44

it's through, you know, UN agreements,

1:29:46

whether it's through Club of Rome

1:29:48

agreements or whatever these other shadowy

1:29:50

organizations, or whatever other ways

1:29:52

they do it: they get the biggest company CEOs

1:29:54

together with the Pope, together with

1:29:56

the Rothschilds. It's like, what the

1:29:58

hell's going on there, right? And

1:30:00

it's called the Council for

1:30:03

Inclusive Capitalism. And it's supposed to

1:30:05

be like we're going to transform

1:30:07

the entire world's economic system into

1:30:09

something that's more geared around caring

1:30:11

and sharing. That was Klaus Schwab's

1:30:13

words, which sounds an awful lot,

1:30:15

like communism. So you're kind of

1:30:17

getting the communism in through the

1:30:19

corporate. So you've got communism and

1:30:21

fascism fused in the China model.

1:30:23

And now you're going to use

1:30:25

the kind of fascist lockstep corporate

1:30:27

thing guided by these huge

1:30:30

passive investment companies like BlackRock. You're

1:30:32

going to get everybody acting in the

1:30:35

same way and then they're going to

1:30:37

coordinate because you can make more money

1:30:39

if you coordinate with the heads of

1:30:41

state. Like let's say that I want

1:30:43

to introduce such and such product and

1:30:46

I can convince the heads of state

1:30:48

that this is a really great idea

1:30:50

and they can you know loosen regulation

1:30:52

that would allow me to have that

1:30:54

edge on the market ahead of time

1:30:57

like maybe they made that deal after

1:30:59

they went skiing and had some hookers

1:31:01

in Davos. This is what I think

1:31:03

is going on. So the model that

1:31:06

we're being forced into that's been screwing

1:31:08

up everything, that's why we had DEI

1:31:10

in all of our corporations. It didn't

1:31:12

just leak out of the universities. It's

1:31:14

because they have to score on their

1:31:17

corporate equality index score and all these

1:31:19

stupid metrics. The reason that's happening is

1:31:21

to build the China model in inverse

1:31:23

here. Now, think long term though, who's

1:31:25

going to win this game? the country

1:31:28

that's ruthlessly been using it for 40

1:31:30

years already and knows all the ins

1:31:32

and outs of it, or the countries

1:31:34

that are like stumbling backwards into it

1:31:36

and not really wanting to do it.

1:31:39

And did you worry about this happening

1:31:41

when you saw all these like tech

1:31:43

guys on? You mean like the ones

1:31:45

that are literally on like the board

1:31:47

of Bilderberg? Yes I did. I'm like,

1:31:50

oh no. The scariest sentence I have.

1:31:52

Or do you think that they're like,

1:31:54

kind of working against it. You know,

1:31:56

they might be. I don't know. Like,

1:31:58

did Elon Musk wake up? That's a,

1:32:01

that's a legit question, right? He said

1:32:03

unequivocally a few years ago that he's

1:32:05

a socialist. And now he seems to

1:32:07

be very pro- trump and very pro-capitalism

1:32:09

and very pro-free speech. Did he wake

1:32:12

up or is it an act? I

1:32:14

know people who know Elon, I don't

1:32:16

know Elon, I've heard that it's genuine.

1:32:18

Yeah, I think it is. But it's

1:32:21

also like, he hasn't, you know, Michael

1:32:23

Malice and I have a friendly debate

1:32:25

on how many red pills somebody's supposed

1:32:27

to take. What if

1:32:29

the bottle has 100 pills in it,

1:32:32

right? Maybe five, right, is the

1:32:34

right number. I don't think Elon's taken

1:32:36

all of the red pills he needs

1:32:38

to take yet. Like I still think,

1:32:40

like he's still like, oh yeah, brain

1:32:43

chips. You know, it's like, no, not

1:32:45

brain chips. Like, wait, you know, let's

1:32:47

at least slow down on that one.

1:32:49

And he's like, oh yeah, you know,

1:32:51

we're going to build this whole, I'm

1:32:54

going to make

1:32:56

X into the everything

1:32:58

app. Like, ooh, wait, slow down, right?

1:33:00

And then there were deals with Visa

1:33:02

to make financial processing happen. I'm like,

1:33:05

uh-oh. Like, slow down here, like, where

1:33:07

is he? I don't know. But then

1:33:09

we start looking at Bezos, we start

1:33:11

looking at Zuckerberg, Zuckerberg's playing the video

1:33:13

game of life, nothing's real, nothing matters,

1:33:16

there's no real morals as far as

1:33:18

I can tell, but Zuckerberg needs to

1:33:20

get the most points. And so like

1:33:22

Trump's in power now, so he's in

1:33:24

power now, I don't know where, and

1:33:27

when you get that level of FU

1:33:29

money, like you get weird. Yeah, oh

1:33:31

yeah, so weird. And you don't have

1:33:33

people around you really telling you no,

1:33:36

ever. Ever. Ever. I mean, you just

1:33:38

have people going yes, yes, yes, yes.

1:33:40

And that starts when you're in the

1:33:42

100 Millionaire space. That doesn't start when

1:33:44

you're a billionaire. Yeah. It's like, they

1:33:47

don't... I mean, that's why I

1:33:49

joke about this all the time. Like

1:33:51

a lot of you have never been

1:33:53

around, like, an actual billionaire,

1:33:55

and it shows. Like, they're very mercurial

1:33:58

too, because it is very much like

1:34:00

whatever their obsession is, whatever, and then

1:34:02

it can change on a dime. And

1:34:04

so I wouldn't trust any of it,

1:34:06

although I... What do you believe

1:34:09

as someone who's studied

1:34:11

all of this and seen this,

1:34:13

what do you believe is

1:34:15

the best way forward? It's

1:34:17

very clear that they actually

1:34:20

respond to stuff that we

1:34:22

say and do. If we make enough

1:34:24

noise and make something clear enough and the

1:34:26

people get mad, whether it's, you know, do

1:34:28

we end up with vaccine passports? No, we

1:34:31

did not end up with vaccine passports. If we

1:34:33

get it, and I don't even think it's

1:34:35

that many people in the population, I've

1:34:37

heard that it might be as small

1:34:39

as 12 or 13% of the population,

1:34:41

refuses to go along and makes enough

1:34:43

noise about it, you actually can

1:34:45

change their course. I do agree that

1:34:47

Elon Musk has strongly different views than

1:34:49

he had five years ago, maybe even one year

1:34:51

ago when he realized like if the Democrats win

1:34:53

they're going to put him in prison. They're

1:34:56

going to destroy everything, right?

1:34:58

But it feels like Elon has

1:35:00

strong views about the West in

1:35:02

general. Like Western, you know, the

1:35:04

Enlightenment values, first principles. Yes, yes.

1:35:07

Does he, is he some, has

1:35:09

he opened his eyes to the

1:35:11

bigger game of like this either

1:35:13

goes away because it was a

1:35:16

miracle as guys like Jonah Goldberg

1:35:18

have laid out in the history

1:35:20

of humanity, or do we somehow salvage it?

1:35:22

No, I think, and I think that's right.

1:35:25

And I think that the, so what do

1:35:27

we do is like, obviously, Elon Musk is

1:35:29

probably one of the most accessible of these

1:35:31

people. I mean, I can't guarantee that I can

1:35:33

get Elon's attention on X, but I know

1:35:36

that I have a way better shot of

1:35:38

getting Elon's attention on an issue than I

1:35:40

do of getting Jeff Bezos's attention on it

1:35:42

or Mark Zuckerberg's attention, right? That's

1:35:44

not using like, oh, I know a

1:35:47

guy. That's yeah, I can go tag

1:35:49

him on X and everyday normal people.

1:35:51

Sometimes he's like good idea. Yeah, right.

1:35:53

And so he is still in this

1:35:55

process of learning more. And it's very

1:35:57

clear that he's learning more. And I

1:35:59

think Trump is this way too. Trump is

1:36:01

too. Trump's very responsive. And they're

1:36:04

very accessible. They'll let reporters

1:36:06

in, they'll field questions on

1:36:08

the fly, they don't have binders,

1:36:10

they don't, it's like. Yeah, I mean

1:36:12

I met Trump now, so like yeah,

1:36:14

it's totally accessible. Yeah. I mean he's

1:36:16

president, he's not like. And I will

1:36:18

say even though Elon doesn't hear no,

1:36:20

he does allow an entire platform to

1:36:22

shit talk about him. All the time.

1:36:24

All day long. So he... I don't

1:36:27

think he doesn't see that. You know,

1:36:29

he's clearly aware of things that

1:36:31

are being said, whether that permeates,

1:36:34

true or false. Like, if

1:36:36

somebody's shit talking to me

1:36:38

I have very limited resources.

1:36:41

Hmm. Elon can personally shut

1:36:43

down their social media accounts

1:36:46

and build a cruise missile you know,

1:36:48

to take out some guy talking trash

1:36:50

on the internet. So if you had

1:36:52

advice for Normies, just the average person

1:36:55

who's not all up here in their

1:36:57

heads or all on X fighting the

1:36:59

battles of Western civilization or whatever, what,

1:37:01

how does somebody stay sane in this

1:37:04

kind of media ecosystem moving forward? How

1:37:06

do they check themselves? How, what are

1:37:08

the best, what is the best way

1:37:11

forward if you were advising, you know,

1:37:13

an average human on how to stay

1:37:15

in optimism and not get like

1:37:18

self-radicalized. Well, those are, you have kind

1:37:20

of two real questions there. How do

1:37:22

you stay sane and ground yourself in

1:37:24

this environment and do the best that

1:37:26

you can? We already said the magic

1:37:28

word, which is pause. It's like there's

1:37:30

gonna be a lot of enthusiasm, a

1:37:33

lot of the ways that things are manipulated.

1:37:35

This is a term that comes from

1:37:37

George Soros, but everybody uses it. The

1:37:39

CCP adopted it in the 90s explicitly.

1:37:41

I mean George took it over there,

1:37:43

Soros, like, I don't know him, but

1:37:45

Soros took it over there and taught

1:37:47

them what's called reflexivity, reflexive movements. What's

1:37:49

a reflexive movement look like? All

1:37:51

of a sudden it came out of nowhere

1:37:53

and everybody has to talk about it

1:37:55

right now. And we have to do something. So it's

1:37:57

the current thing, it's the current thing. That's right.

1:37:59

Current thing is the slang

1:38:01

term for a reflexive

1:38:03

push. Interesting. All you have

1:38:05

to do is slow your roll

1:38:08

and don't add fuel to

1:38:10

the reflexive push. In 12-step,

1:38:12

like you learn this very early in

1:38:14

sobriety, and I've talked about how it

1:38:16

saved me. There's one line in the

1:38:18

Big Book and it says, pause when

1:38:21

agitated or doubtful. Yeah. And it's like,

1:38:23

agitated can also be excited. That's right,

1:38:25

that's right. Feeling that kind of, like,

1:38:27

bipolar mania or whatever. It doesn't have

1:38:29

to just be mad, it can also

1:38:31

be yay. The goal is to create

1:38:33

the social mania. What George Soros

1:38:35

said actually is that the changes

1:38:37

in history happen when there's a

1:38:39

large gap between what is true

1:38:42

and what people believe is true.

1:38:44

Right? And so the goal of

1:38:46

reflexive push is to get people

1:38:48

to believe something. The classic

1:38:50

example is the bank run.

1:38:52

If we don't go get all our

1:38:55

money out of the bank right now,

1:38:57

it's going to be gone and we're

1:38:59

never going to have our money. So what

1:39:08

do we do? We go get our

1:39:10

money out of the bank. He's explained how

1:39:12

he does this to like crash currencies

1:39:15

and things. And to short markets, he

1:39:17

says he takes a malicious pleasure in

1:39:19

shorting institutional favorites. Like the United States

1:39:21

is an institutional favorite on the world

1:39:23

stage, right? So how do you avoid that?

1:39:26

Well, when you see something where there's this

1:39:28

kind of madness of crowds rushing in

1:39:30

a particular direction, it's pause. Take

1:39:32

a moment. You don't have to get involved.

1:39:34

And I think that actually links to, you

1:39:37

know, how do we stay sane, how do

1:39:39

we, all the other questions, is I think

1:39:41

that there's got to be, you know, we've

1:39:43

turned politics into this weird

1:39:45

hybrid of like sport and

1:39:47

religion. And entertainment. And entertainment.

1:39:50

Yeah, you're right. And so we've got,

1:39:52

like, do something more meaningful

1:39:54

in a local sense. I don't mean

1:39:56

necessarily go volunteer for your, you know,

1:39:59

school board. If you can get involved

1:40:01

in local politics, that's great. But I

1:40:03

mean like cook dinner for your family.

1:40:05

Yeah, totally. Like step back, do something,

1:40:07

like the kids call it touching grass,

1:40:10

right? Do something real with actual human

1:40:12

beings around you. Or it doesn't have to be that,

1:40:14

if you like to be a loner,

1:40:16

like go sit and like meditate or

1:40:18

whatever it is you like to do

1:40:20

or knit or whatever your thing is

1:40:22

that like just takes you back to

1:40:24

where your zone is. I think

1:40:27

it's valuable, especially if there's, like, service

1:44:29

involved. Cooking a meal for a family

1:44:31

is an act of service. Yeah. It grounds

1:44:33

you back, it reminds you of what

1:44:35

actually matters. A lot of this crap

1:44:37

isn't just, you know, remote, but it

1:44:39

also is all reflexive. A lot of

1:44:41

it's just, you know,

1:40:46

influencers making content to make clicks to

1:40:48

make payout schemes deliver money to them

1:40:50

to get their accounts to grow so

1:40:52

they can get a hundred and fifty

1:40:54

thousand dollar contract to talk about some

1:40:56

wind boondoggle nobody wants, right? That's a

1:40:59

real story but I won't name the

1:41:01

name. Name the name! Name the name!

1:41:03

What's your biggest defect of character? That I

1:41:05

can't leave it alone? It's very clear

1:41:07

that I, in many cases, can't leave

1:41:09

things alone. But it's weird, right? Because

1:41:11

how often is the story that your

1:41:13

greatest defect of character is also one of

1:41:16

your greatest strengths? It's always the same

1:41:18

thing. Damn it. That's like how I

1:41:20

accomplish almost everything, and yet it's totally

1:41:22

toxic in the long run. Yeah. So

1:41:24

that would be your biggest asset too.

1:41:26

It turns out. Yeah. I

1:41:28

knew we'd get there eventually. Is there

1:41:30

anything else you want to

1:41:33

say before we close out? I

1:41:35

don't hate Christians. I'm so frustrated

1:41:37

with this. I'm not supposed to

1:41:39

signal that they're getting into my

1:41:41

head, but it's not that they're

1:41:43

getting into your head. No, it's

1:41:45

that they're getting into other people's

1:41:47

heads. Like, this is serious. There

1:41:49

are people that I know who

1:41:51

are getting canceled from talks. You're

1:41:53

like, we can't have you because

1:41:55

you're friends with James and James

1:41:57

hates Christians. It's like secondhand cancellation,

1:41:59

being friends with me over this

1:42:02

lie. This is absolutely preposterous that

1:42:04

this is happening, you know. That's

1:42:06

also weird, cancel culture, very left,

1:42:08

like, very left. And some

1:42:10

of those are not, like, small

1:42:12

cancellations. Some were like, oh yeah,

1:42:14

you're gonna speak at this conference

1:42:16

or whatever; some of them are

1:42:18

extraordinarily significant, like as big as

1:42:20

it gets. Because, how would you

1:42:22

define woke? I define it as

1:42:24

according to what the word says

1:42:26

that you woke up to a

1:42:29

belief in the structural systems of

1:42:31

power that are contouring all of

1:42:33

our lives, right, and the need

1:42:35

to resist those for your particular

1:42:37

group. Got it. All right, where

1:42:39

can we find you? At Conceptual

1:42:41

James, everywhere but Facebook, and at

1:42:43

New Discourses, I think, everywhere. I'm

1:42:45

off. And that's your Patreon? Yeah,

1:42:47

I think it's New Discourses. Patreon

1:42:49

slash New Discourses. Yeah, Facebook, I

1:42:51

made a killer meme joke and

1:42:53

they banned me for life. Oh.

1:42:55

I did. Do you know who

1:42:58

Yakov Smirnoff was? Classic, you know,

1:43:00

Russian comedian. He used to do

1:43:02

these things where he'd flip it

1:43:04

around and say it backwards, you

1:43:06

know, and... in capitalist America, you make

1:43:08

show; in Soviet Russia, show makes

1:43:10

you, you know, that was his

1:43:12

whole shtick. So I made a

1:43:14

meme of him that said, back

1:43:16

when it was like big news,

1:43:18

what they were doing in Canada

1:43:20

with the medical assistance in dying,

1:43:22

the MAID program. The Doctor Kevorkian

1:43:24

program. I said I made a

1:43:27

meme of him and it says

1:43:29

in socialist Canada, suicide hotline calls

1:43:31

you. A lifetime ban from Facebook

1:43:33

for that. Which, I think, for

1:43:35

the lifetime ban from Facebook, they

1:43:37

said that I was encouraging self-harm

1:43:39

and banned my account for forever.

1:43:41

Come on Zuck. Yeah he hasn't

1:43:43

given it back yet, so the

1:43:45

free speech stuff is just fake news. Fake

1:43:47

news. Fake news. We are the

1:43:49

fake news now. Yeah, I mean,

1:43:51

I don't know. It's going to

1:43:53

get fucking weird. That's what Kat

1:43:56

D and I were talking about.

1:43:58

She's like, it's going to get

1:44:00

weird. This whole year is going

1:44:02

to be so insane, because the

1:44:04

left's going to throw everything it

1:44:06

can, and I mean, the global

1:44:08

left. Right, so I mean like

1:44:10

Germany doing weird crap. Oh, yeah,

1:44:12

like there's gonna be like attempts

1:44:14

at wars, weird political things. There's

1:44:16

gonna be all kinds of crap

1:44:18

to just derail Trump and then

1:44:20

It's psyops after psyops after psyops

1:44:22

after psyops after psyops. If normies

1:44:25

know nothing else, if you see

1:44:27

it on social media, probably your

1:44:29

first gut instinct should be: it's

1:44:31

a psyop. It's a psyop. Maybe

1:44:33

somebody's trying to manipulate me if

1:44:35

that's like something shocking, right? If

1:44:37

it's just like normal stuff, but

1:44:39

not even like the health world,

1:44:41

like the MAHA, you know, eat

1:44:43

better world is like. There's some

1:44:45

crazy stuff and it's like, you

1:44:47

know, how to get your diet

1:44:49

better, blah blah blah, eat this,

1:44:52

eat that, have some eggs. Next

1:44:54

thing you know, it's like, here's

1:44:56

a picture of Hitler and it's

1:44:58

like, what? I'm sorry I didn't

1:45:00

mean... that's Raw Egg Nationalist, that's

1:45:02

like his whole thing. That's a guy,

1:45:04

it's like his whole shtick. Oh,

1:45:06

really? Uh-huh. Very popular. Oh, I've

1:45:08

met him. Oh, PS. If they're

1:45:10

anonymous, you don't know them. That's

1:45:12

another thing. I'm like, oh, I've

1:45:14

met him. And P.P. Yes, you're

1:45:16

not anonymous on that, you should

1:45:18

know that. He told me that

1:45:21

I'm a thorn in his side.

1:45:23

I think that's good. Like props?

1:45:25

Because I asked them why their

1:45:27

magazine was so gay. It turned into

1:45:29

like a political philosophy. Oh, we

1:45:31

should end on that note. The

1:45:33

check-in with Bridget and Cousin Maggie

1:45:35

can now be found at fetacy.com.

1:45:37

It's been titled Another Round with

1:45:39

Bridget Fetacy, and it's now in

1:45:41

video. This has been Walk-Ins Welcome

1:45:43

with Bridget Fetacy. I'm Bridget Fetacy,

1:45:45

and you're welcome.
