The Death and Resurrection of TikTok | TMT #112

Released Sunday, 26th January 2025

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

On January 18, 2025, at 10:30pm Eastern Standard Time in the United States of America, a push notification was sent to every TikTok user across this great nation. And it said that TikTok was gone, banned in the United States. And for 14 hours we sat in darkness looking out: where, oh where will we go? Will we go to Instagram Reels? Will we go to YouTube Shorts? Where would the believers and the followers, who looked every day at the clock app and who said such words as "unalived," where would they stand hand in hand as TikTok users? And then, 14 hours later, on January 19, 12:30pm Eastern Standard Time in the United States of America, TikTok returned. That's right, in only 14 hours. It didn't need to take 72 hours like a carpenter, bitch.

1:09

Welcome to Too Many Tabs. Today's episode, as you can probably tell, is about TikTok. But it's about a little bit more than that. It's about the internet as a whole, and how we speak to each other, and how we communicate through social media apps. And on this podcast, a podcast where a husband and wife duo sit across from each other at a desk learning about a topic, ingesting that information, and then spilling it across each other's chests as they rub it in deep into their bodies like two people at three o'clock in the morning in front of a refrigerator who think that the baby can't hear them, they create a podcast. And that podcast is called Too Many Tabs.

2:02

Welcome, everyone. I hope you brought a casserole dish. I did. Because we're here. Remember to smile.

2:28

Because even though TikTok is a zombie and it is still around, it's dead. Yeah. It's dead in ways that we are here to discuss today. And that's what today is about. But before we get started into all of these different things we have planned to cover today. Sure. We want to let you guys know just a couple basic announcements about Too Many Tabs, about the Pearl Mania 500, the Pearl Maniaverse, and other things. We have our parasocial, patron-exclusive podcast, The Warm-Up, up now, and you can always get that by joining us at PearlMania500.net and becoming a team lead member, and you can get access to that podcast if you'd like.

3:14

We also just released an episode of Vibing Out with the Food Moron, also known as Vibing Out with the Food Idiot. We're in discussions right now over who has the correct title. It's easier to rhyme for the song he's writing. I know, for the new theme song, but we're gonna work on that. That has been up since Friday, and on this week's episode, which we're doing monthly on there, you explain how to cook a very basic meal to me. Yeah, a very basic food item. This week we did chicken soup. Yep, because we all need a little chicken soup for our soul. We need soup for our families. That too. That too. But that's the big announcements for us right now. As always, follow us across all the different forms of social media, all those fun things. Mrs. P, anything else you want to say? I don't think so.

4:01

I feel like we're just gonna have a good time today. We are. That's really what this is about. Today's episode is about just kind of having a good time. I don't want you to think of this as a sad funeral, no, for TikTok or for any of these forms of social media. I want you to think of it as the, hey, you know what, they had a good full life. Yeah, you know what I mean. They did it while they were here. They did it. They did it. They did it, and then everybody got a little too drunk. Yep, and Aunt Dawn is gonna get into a fight with Tommy again. Yeah, yeah. Well, it's a real shame, he's addicted to them Percs. Yeah, my cousin Becky's gonna get fingered in the bathroom. Every time. Every time, Beck. It's like there is no place sacred with her. Oh my god, dude. It's fucking, why are you doing vodka shooters at Nana's funeral? It is 10 a.m., Becky. Becky! Who's that guy? What? I've never seen this guy before. Okay, we're gonna let the bit go and we're gonna get started. And so if you are a Patreon member, we're gonna go right into it. If you are not a Patreon member, then you are going to be hearing an ad, or seeing it if you're watching us on YouTube. So remember: patreon.com/pearlmania500 or PearlMania500.net. See you soon.

5:26

And we're back. We're back here. Did you bring the Fireball? I did not bring Fireball, because they told me there wouldn't be an ice luge at this. Nothing, like, let me tell you something, you gotta put money aside for the ice. If you forget the ice... I don't know, those guys didn't party in the early 2000s. You get a big block of ice and you channel, you channel a channel. You make a little sliding board, make a little slidey board, and then you pour the booze down it. You also have to make sure you chisel out a chin place, a chin rest. Yeah, so you put your mouth in, ha, and then you watch it come down. We used to do Jäger ice luges. Yeah, but let's talk about the death and resurrection of TikTok.

6:02

So what? Okay, all right, for me? The death of TikTok: we were both sitting side by side on our couch. The TV's on, we're not watching it. No. But we're scrolling, because it felt like the last moments, like, trying to see the last of the TikToks. Because we knew: first we heard 8:30, then we heard 10, then it was midnight. And we were just like, let me just get all these TikToks. It was the most junkie behavior I have, me sitting in the corner just going, just one more TikTok, just one more. But also, like, screw me, like, is this one? Is this one? Is this it? For me, it was very much like when I quit smoking, and being like, all right, well, if I quit, I quit, I don't want that to be the last thing. If the last one had been a Charli video, I would have been so mad, because I don't ever get her in my feed, and whenever she does show up, her or some of the other huge ones, CallMeKris, the other 50-million-follower-level ones, I only ever see them when TikTok is messing around with their algorithm. And so I was so nervous that I was gonna be seeing Haley Bailey, that was the let-them-eat-cake lady, CallMeKris, Charli, Charli, Charli, I'm trying to think of some others. More of like the 2020 dancing ones, you know, who do good dancing remixes. But while we were sitting there, we got blasted by these updates that would come repeatedly. And the first one that hit. Because we were all sitting around at 8:30. We were all told, we heard it was 8:30. We got texts from people. I know people who were working on the back end. I knew people who were lobbying different members of government and in communication with TikTok themselves. So I was getting trickle-down stuff, and I was like, I don't know. People would be like, do you know anything? I was like, I don't think I know anything. But we kept hearing

8:00

8:30. And so, it was around 8:30, 9 o'clock, we all got this one push notification. Important update from TikTok: "We regret that a US law banning TikTok will take effect on January 19th and force us to make our services temporarily unavailable. We're working to restore our service in the United States as soon as possible, and we appreciate your support. Please stay tuned." Yeah. And so immediately when I got that push notification, I said, it's not going anywhere. Yeah. Because I read enough legal documents to know that once you put the word temporary in there. Yeah. You mean not forever. Mm-hmm. And so I was like, oh, well, whatever's going to happen, it'll be back. It doesn't matter. Yeah. Because they wrote temporarily. And you know their legal team had to look at everything 400 times. Yeah, because they had just lost the Supreme Court case, like, hard. So I was just like, oh. Unanimously hard. Yeah, which we talked about on The Warm-Up episode we had. Yeah.

8:56

So we were sitting there, like, that hit. And I was like, oh, that doesn't feel good. And so we were sitting there, and then you were scrolling. At one point, I had finally put it down. Yeah. Because I'm like, I can't just be sitting here. It felt like I was underage drinking again, where you're drinking because you don't know when you're going to be able to do it again. Because once I turned, I remember when I turned like 23, I think. I remember being at a bar and I had half a beer. I drank like half a beer, and then we're like, oh, we're going to go. And then we'll pound that and then we'll go. And I'm like, I don't want to. And I just sat down. I paid for the entire beer. I can do whatever I want with it. This is not a shared experience. I never left a soldier behind in my life. Well, you know what? Much like Henry Kissinger, I abandoned soldiers everywhere. So I was in that feeling of just like, you know what, I'm not going to be controlled by this. And then, oh my God, you not be controlled by the TikToks? So that lasted for all of 10 minutes. Mrs. P is on

10:01

her iPad, she's sitting on the couch next to me, and you're scrolling, and then one of them hit, and this guy said, I heard it's shutting down at 10:30. Yeah. That's what he said. I heard it's shutting down at 10:30. And I looked, and I was like, well, what time is it now? I looked, and it was like, it's 10:15. Yeah. It felt very much like, hey, Powerball's about to hit a billion. Yep. And I don't buy Powerball. I don't buy Powerball tickets. I don't buy Mega Millions. I'm not one of those people. I'm not into that stuff. Yeah, we don't do the gambling. However, when everybody at my job is buying into the pool, I'll be fucked if I'm not sneaking off to the side and buying one single on my own. Okay? Okay. Because let me tell you something. If the lady from accounts payable, if we all put in 30 bucks each to be in the fucking Mega Millions, all right, for a billion dollars, in this office where we all hate our fucking boss, and that one lady from accounts payable buys her own ticket and she gets out, Kathy gets out, but the rest of us are stuck, I'll be fucked. I'll be fucked. So I started scrolling,

11:17

I started scrolling, and I saw an influencer who I know personally, and he, so many people were on TikTok Live. Yeah, everyone was on TikTok Live. Everyone. And so I was sitting there, and all of a sudden he's talking, and I paused on him for just a second, and all of a sudden he goes, oh no. Oh no, where, where your face, I can't see your faces on your profiles anymore. Because when you're on TikTok Live it shows you, when people are commenting, you can see their profile pictures. He's like, I can't see your faces anymore. Some of you were saying I'm breaking up. Oh no, it's happening. It's happening. Is it going dark? And it just cut. It just died.

11:55

And I'm like, staring at it. And then all of a sudden I scroll up one. And then I get this push notification, and it said, quote, a law banning, sorry: "TikTok isn't available right now. A law banning TikTok has been enacted in the United States. Unfortunately, that means you can't use TikTok for now." For now. "We are fortunate that President Trump has indicated that he will work with us on a solution to reinstate TikTok once he takes office. Please stay tuned," exclamation point. Yeah, that's when he was strapping the knee pads on. Yep. So. That was when she went down to Mar-a-Lago and he was like, listen. I heard from the guy at the Fyre Festival that this is how you get water.

12:38

you get water. I

12:40

think I got, I

12:43

think I got two more

12:45

TikToks than you. Cause

12:47

yours went right before mine. And mine

12:49

was the last one was this

12:51

girl saying like, wow, we

12:53

all agree that the Heidi and Spencer, Heidi's

12:55

song was going number one everywhere because their

12:57

house had burned down so everybody was listening

12:59

to it. She was like, I understand that

13:02

that's like the final song of TikTok, but

13:04

I think that the song of TikTok was

13:06

this and she was playing it and I

13:08

don't, I don't know what it's called, but

13:10

it's like the whoo -hoo, it's like a

13:12

very like, I wish I knew what song

13:14

it was, but I was listening to it.

13:16

I was like, oh my God, that is

13:19

a song. It's like, no. No? No.

13:21

You can't even hum it? I know I can't. Okay. But I was listening to it, and my brain was just like, oh, that really is kind of like a song that I hear and immediately associate with the little happy, positive time of TikTok, like right after the pandemic, when everybody was just silly-billying it up. Yeah. And it was like only the nicest TikToks were of this song. Yeah. And I was like, oh, that's a really good... And then, as my brain was like, oh, that's really nice, I got the notification. And it locked. And it locked mine, and it made me so sad, because I was just having such a happy little... Yeah. TikTok is so nice. Yeah. Cause my algorithm wasn't like yours. My algorithm was just showing me all the awesome things that TikToks had done. So, like, I'm in the corner weeping, watching Keith Lee when he was like 20 years old. A baby. A baby. His wife had just given birth, and he's cooking her dinner, and they're talking about how much they love food, and I was like, and he's just so young, and I'm like, he changed the lives of hundreds of restaurant owners. Like, incredible what he's done on TikTok. And I'm just like, he's holding his newborn baby, and they're like 20 years old or something, and I was just like, oh my god. And so I was just in a whole different mind space than you, who were just holding your phone like, oh no, the death has come, it is upon us. Yeah.

14:40

But also, I mean, the one that was going for me was the Family Guy thing of, if I have one last thing to tell you. Oh my God. With a lot of the influencers giving up, oh, just, too many secrets. So many. To the point where I'm like, did you guys not know that this was, like, this is a kayfabe death? Like, this is the death of Vince McMahon? Like, it hasn't actually, oh my God, you guys. Also, like, I thought you all were pivoting to other platforms. Did you know that? But also, all of these people are on other platforms. Like, there's this one girl, like, Haley, all these other different ones, as TikTokers, on Instagram or on YouTube Shorts or on these other fucking places. I don't see them on TikTok. And I know they're huge on TikTok, because I'm not in their demo. But I'm like, all of you are diversified. Like, I know this is a weird confession, that I was like, what? And so then you and I were like, let's talk about the Fed allegations. Oh yeah, let's talk about that shit. But that was mine, it's like I'm flipping it.

16:00

So then, so anyway, I had that, and then the Adult Swim... What was the Adult Swim? Adult Swim was, there was this one song, and the one thing that TikTok really did a lot of, that people don't talk about as much in the edits, is people taking songs and speeding them up or slowing them down to change the meaning of them. And there's this one song that goes, running away is easy, but the leaving is hard. Oh yeah. They speed it up, and it sounds very much like a mid-2000s Adult Swim bumper. Yeah. Running away is easy, but the leaving is hard. And so people were making fake Adult Swim ads for it for a while, so it's like somebody sitting at a desk, and then they open the drawer, or they close the drawer, and it says A.S., for Adult Swim. So people were doing different versions of that that were really, really cool. That's cool. So that was the end of my algo before that wiped. And so I was sitting there, I was like, okay, cool. So we go through the 14 hours. Yeah.

17:03

Also, in that thing, once I saw that Trump's name was in it, I was like, again, coming back. Oh yeah, definitely coming back. I'd be like, we're out here. So we're getting it back. Well, okay, so, so real quick. I agree with you. I do agree with you. Yeah. But I think what people are missing about how that's necessary now is: so the move of TikTok itself at this point, and by TikTok I mean the executives, their job is to save the company. That's it. If they don't save the company, they don't have a job. Yeah. The end. So their job is to save the company. Once Joe Biden signed the TikTok ban, this is always what was going to happen. The second Joseph Robinette Biden signed the TikTok ban, the second that happened, this was always going to happen.

17:48

And I knew it. I didn't know it would take the form of signing a paper for a Trump win. I didn't know how deep it was going to go. And so the fact that a lot of people were shocked by this exact move, that actually caught me more off guard. Because I was like, listen, if you want Trump to do something, you have to pre-praise him for doing it. And then you have to pretend it was his idea. Trump is the pointy-haired boss. Like, he is an idiot. He's a moron. And he is a moron. And he is a narcissist. So you have to play into those things. And that's 100% what happened in this. The reason they went dark, beyond just this move, was that Apple, Oracle, and other data partners that they have that are US-based said that at midnight, or whatever the timing, they would have to shut off the service anyway. So there is a half play in here of, like, they really were going to shut off the service, but they also did a little bit, they added theatrics

18:55

to it. Yeah, it's a WWE work. Yes, which is like, hey, I'm gonna kick you in the face. Oh, you kicked too hard. Well, now act it up. Yeah. That's really what it is. Becky Lynch, getting her nose broken. Not planned to have her nose broken, but did she fucking lean into it? Become The Man? Yeah, she smeared that blood everywhere. Yeah. And when it came back, 14 hours later. 14 hours later. Not enough time. I was hoping to have more time. I wish they waited, like, I'm saying, I was hoping to have more time specifically in our household. Not for anybody else. Yeah. Well, yeah. Because I was in the middle of filming birds. Yeah. I was getting the birds video ready, which is live now on YouTube. A lot of you guys asked for it. A lot of y'all asked for a bird tier list. Birds on tiers. We did it. It's 17 minutes long.

19:46

We didn't shed any tears on the birds. Mrs. P actually sat down and wrote a list of birds for me to rank. And with the help of my dad. With the help of your dad. My dad and I sat there, and I was upstairs. Now let me see, what kind of birds do we want to talk about? Yeah, and I was upstairs working on stuff, and then all of a sudden people started texting me, like, it's back, I can upload, I can do this, I can do that. I was like, what? Because the whole time I was sitting there, I was, you know, doing Instagram, YouTube. No, no, I meant that I was hoping, if it was gonna be down, and I knew in my heart that it was coming back, I was hoping: I'm gonna get the garage cleaned. I'm going to come up with tasks. We're going to clean. Idle hands and no longer swiping up. Yeah, I was like, oh, we're going to move the air conditioners up to the attic. It's happening. That's never happening. So I was super excited to have 72 hours or so to delegate tasks to someone who didn't have TikTok to keep their brain busy. And then my dreams were crushed. Yeah.

20:53

And then I started getting texts from so many people. I gave you a shout-out at our podcast. So yeah, and so we were texting back and forth, and I started running around and testing stuff. And then, it wasn't on the phones immediately; I could get to it on desktop without a VPN. And then it came back on the phones, and it said: "Welcome back. Thanks for your patience and support. As a result of President Trump's efforts, TikTok is back in the United States. You can continue to create, share, and discover all the things you love on TikTok." You can continue. Now, this is what truly broke people's brains. Because, number one, it came back on January 19th at 12:30 p.m. Eastern Standard Time. For us.

21:32

Not me. I didn't get it, because it was on my iPad. Your iPad didn't update for a while. But that was almost, that is 23 and a half hours before President Trump was the president of the United States of America. Yeah. It's almost 24 hours early. Now, the reasoning that was given is that Trump put a post out on Truth Social stating that he intended to ignore the law. Yeah. And that apparently was enough for Oracle and the other ones. And TikTok had said repeatedly that they needed something in writing from the Biden administration stating that they weren't going to enforce the law. He was busy writing pardons. He had no time. He had to write them for his entire family and Fauci and Milley and the January 6th committee. Every side of January 6th has been pardoned. Everything. The investigators, the actual people who did it. Never happened. All being gaslit. I'm glad he did Fauci, though.

22:26

Yeah, no. My boy Fauci. Yeah, and Milley. I was downstairs yelling earlier because I saw that Fauci's going to be at the Free Library. Oh yeah. He has a book tour. And I was like, my boy needed that so he could go do his book tour. If it wasn't for Dr. Fauci, another 50,000 people would have died during the HIV crisis in the 80s. Yeah. He saved so many lives here. Yeah. And then I might have just started screaming about Reagan and his bitch wife for a bit. Yeah. And then the baby woke up. So, yeah. But yeah. Listen, the CEO of TikTok, Nancy Reagan: two people who have sucked presidential dick.

23:07

What, you're gonna get banned? What? You're gonna get banned? What? Anyway, we should have hit the allegedly button. We don't know. You're gonna get sued. That was a joke that was really quiet. I'm not gonna hit it again. But anyway, once TikTok came back, though, now the funniest thing happened. What's that? Which is, for the first time ever, y'all, anyone who hasn't been deep into TikTok, who has waited through it, through it.

23:32

Y'all, TikTok is a place where conspiracy theories, oh my God. Boy, can they take off, for sure. And boy, can they clap, and they can go quickly, and then they can also die very rapidly as well. But the thing is, TikTok's greatest adversary has been Mark Zuckerberg, specifically. It's been Mark Zuckerberg and Meta, who owns Instagram, Facebook, WhatsApp, and a bunch of Congress people. So that's been the biggest foil. So TikTok's off for 14 hours, it's down, it comes back. And when it turned back on, and this is the reason why, again, I believe that it was actually shut down in the United States, a lot of services weren't immediately available. Because when you have a system as massive as TikTok's, when you pull the giant lever that says off, you can't just turn everything on at the same time. You have to turn on regular feeds first, then comments, then likes, then data tracking, then these things. Eventually lives come back, eventually TikTok Shop comes back. People were freaking out, though, because they're like, it's different, I can feel it's different, all these different things. And then suddenly the conspiracy hit: during the 14 hours it was off, they didn't turn off the servers. They switched the servers. We're on Meta servers now. Yep. But that didn't happen. That didn't happen.

24:48

It just didn't. But it's one of those things where people started to see, they started to grab, all these little pieces. Hank Green made a really great video about it, because there are conspiracy influencers out there. They sure are. God damn, there are a lot of them. And what they do is they create a puzzle. And they do that by handing you random pieces of paper. They hand you a group of dates. They hand you a device. They hand you the name of a politician. And they say, don't those look like puzzle pieces? I wonder if they fit together. And then they wait for you to put them together. And whatever you come up with, he's going to agree with. He's going to say, wow, it's crazy how much you did your research on that and what conclusion you came to. I wonder what else you can find. Here, more puzzle pieces. And that's what all of these guys do.

25:37

And that's exactly... But with that cool background music. Yeah, you got the cool background music. Yeah, there's this very specific, oh, wait: dum, dum, dum, dum, dum, dum, dum, dum. That's the one they love to use. Yeah. They are part of the conspiracy. Yeah, now they are part of it. They're like, no, now it's state-owned, it's these things. And then Trump kept talking. Yeah, well, he's never gonna know. He's never gonna know. He's never gonna know. And that's what doubles and triples down into it. Because now Trump is saying that he wants half of TikTok US. No, just the lives. Does he just want TikTok Live? He just wants TikTok. He wants TikTok Shop. He wants access to those wigs. Folks, all I want, I saw, I was in the TikTok Live, folks, I saw a rooster wearing jeans, and I said, we need this. This is American. A rooster wearing Levi's jeans. I was... Sorry, dissociating. Then the conspiracy

26:41

happened, because the other parts that people

26:44

started notice is that Facebook has a

26:46

TikTok account. Yeah. It's had it for

26:48

a long time. Instagram has a verified

26:51

TikTok account. It's had it for a

26:53

long time. Yeah. People said, I got

26:55

a push notification that said, oh, you

26:58

know, add your Facebook contacts to spread.

27:00

Yeah, that's been there for a long

27:02

time. What a lot of people, what's

27:04

happening to a lot of people is

27:07

most people don't have auto-updates set on

27:09

their phone, or they don't regularly update their

27:11

apps. And so because that doesn't happen,

27:14

when this change happened, a lot of

27:16

stuff that was backlogged got pushed through.

27:18

Yeah. And now that that's happened, they're

27:21

freaking out. I'm like, I, y'all, I

27:23

had so many problems with TikTok for

27:25

the last two years. There'd be regularly,

27:27

I would uninstall and

27:30

reinstall the app just to

27:32

force it to give me the update

27:34

because I would find out about stuff

27:37

like the picture carousels. Yeah. I never

27:39

had those. Yeah. There was all these

27:41

small, big TikTok shop. Yeah, you weren't

27:44

able to do TikTok shop forever. It

27:46

was like a year. It took me

27:48

a year. It wasn't until the day

27:50

after the election, something finally got fixed. They

27:53

just made it so you couldn't do

27:55

it. You were blacklisted. I was so

27:57

mad. No. I don't know because I

28:00

was so mad because I was like

28:02

it was the day after the election

28:04

and I'd been wanting to do one

28:07

for the people who do inklings. I

28:09

want to do the Ali the Oddbird. He's

28:09

right there on the floor or over there

28:11

in the back.

28:19

Yeah, he's so far away from it. But

28:22

I wanted to do the Ali the Oddbird

28:24

video. He's like Jason Nash. He's like

28:28

Jason Nash. Oh, Jason Nash, oh,

28:30

the man. But, um, listen, do

28:33

you think that Joe Biden, never

28:35

mind, he signed the paperwork to

28:37

end our jobs. Why would he

28:40

pay us? Yeah, to end TikTok.

28:42

It's crazy. So, the other part of

28:44

it, though, which again, with these people,

28:46

it's not clicking with them. Because they're

28:48

like, no, no, no, they're having all these

28:51

secret mergers.

28:53

Besides the fact that these

28:55

companies are so big, and the sign-offs

28:57

and all these other things that

29:00

would have to happen. And yeah, do

29:02

we live in an illegal kleptocracy now

29:04

where that can happen? Yeah, totally. But

29:06

not in the first hour. It would...

29:09

like, if this were six months from now,

29:11

I would totally be with you on this.

29:13

If Meta changed Instagram specifically to try

29:15

to pull over people who fell

29:18

off from TikTok, why would YouTube

29:20

change so much to pull over

29:22

the people that were about to

29:24

lose TikTok? Why would all of

29:26

these apps pivot so hard for

29:28

us on this specific date? I

29:30

need you guys to understand that

29:32

as a content creator who works

29:34

specifically in short form video, whose primary

29:37

area has traditionally been TikTok. I have

29:39

been reached out to by every service

29:41

with the exception of Facebook. Yeah, they

29:44

know. They don't like me and I

29:46

don't like them and I'm on it,

29:48

I'm on it, but we're not friends.

29:51

But the, all of them have reached out

29:53

over the last bunch of months and

29:55

been like, well, you know, with the

29:57

ban coming, ever since the ban was

29:59

signed, all of them. YouTube changed to

30:01

three minutes.

30:03

The day before the ban, on

30:05

January 18th, Instagram rolled out three

30:07

minute Reels. Yeah. And, uh,

30:09

YouTube Shorts had three

30:11

minute shorts all the time.

30:13

Yeah, they were all trying to,

30:15

all these, everything. But the thing

30:17

is, is they said, oh, we'll

30:19

make it longer, but they didn't

30:21

make the app. You can't, uh,

30:23

what's it called, edit things well.

30:25

No. The captions suck. All those

30:27

things. The comments suck. You can't,

30:29

like, duet properly. You can't

30:31

share videos you like with your

30:33

friends properly, or curate your feed.

30:36

It's dumb. And they should have

30:38

spent the last few years making their

30:40

platform as good as TikTok. Because

30:42

it is. It is. Wait, no.

30:44

Of all the conspiracy theories, I

30:46

think that one thing that is

30:48

likely true is that it's state

30:50

media now. It will, in a

30:52

way. It's, it's, no, it's influenced

30:54

by the state. At the minimum

30:56

right now, it's influenced by the

30:58

state because if something trends that

31:00

Donald Trump doesn't like, then he

31:02

can say, well, then I'm not

31:04

extending the ban. Oh, we're not

31:06

negotiating. And right now, TikTok is

31:08

worth a trillion dollars to the

31:10

owners of TikTok. And it is

31:12

worth nothing to Trump. Yeah. And

31:14

he said that. He's like, if

31:16

I don't, if I don't approve

31:18

anything, it's not worth anything. Yeah.

31:21

He said that into the camera

31:23

in front of the press. And

31:25

he knows, and TikTok knows. TikTok

31:27

changed their algorithm. Like, people are like,

31:29

oh, they don't have, they changed

31:31

their algorithm to show more conservative

31:33

stuff. No, no, no, no. They

31:35

didn't do that, they didn't do

31:37

that this week. They changed their

31:39

algorithm the day Joe Biden signed

31:41

the ban. Yeah. It's not worth.

31:43

bucket. They took right leading content

31:45

creators, they put them in another

31:47

bucket, and they took undecideds, and

31:49

they put them in the same

31:51

bucket with Charlie Kirk. And Charlie

31:53

Kirk has gone on podcasts and

31:55

openly said this. So like, that

31:57

conspiracy... is not

31:59

a theory. It

32:01

is what happened and it just happened

32:03

a while ago. Yeah. And all of this, this has

32:05

been happening across the media and it's just people are

32:08

starting to catch up to it now. So with that,

32:10

okay, listen, we're going to take a break. Yes. And

32:12

when we come back, it's going to be more

32:14

fun than this. I feel bummed out now. You feel bummed

32:16

out now. I feel bummed out now. Well, you know what

32:18

we're going to do? We're going to remember

32:20

happier times. Yeah. And we're going to

32:22

have some eulogies for other social media apps

32:24

that also died,

32:26

but died for different reasons, but also just

32:28

to remember about, you know, what it

32:30

used to be like, but also to remind

32:32

maybe some tech people out there, what

32:34

you could just steal from the past and

32:36

rebuild. Okay, we'll be right back after

32:38

this. And

32:50

we're back. We're back. And it's time for a eulogy. And

32:52

if there's one person in this room who can give

32:54

a last minute eulogy. That has happened

32:56

to me multiple times. I know. That has happened

32:58

to me twice. Every time we go to a funeral, they're like, you know what,

33:01

can you get up and say some words? Yeah. That happened,

33:03

we were at one where we're literally up there and

33:05

someone said, can you get up and say

33:07

some words on behalf of the family? Just thank

33:09

people for coming. Say something nice about the departed

33:11

and thank the church. And I was like, yeah,

33:13

sure. And then I froze. I was like, you

33:15

mean like a eulogy? And they're like, yeah, yeah,

33:17

yeah. On the spot. I

33:19

had two minutes. You crushed. I made everybody cry. And then

33:21

I made them laugh. Cry and laugh. Killed

33:23

it. Best set of your life. But

33:26

here's the thing. Mrs. B and

33:28

I, we're millennials. Older millennials. Elder.

33:30

Older. We're here. Our bones crack

33:32

and creak. Oh, the time. You

33:34

know, sometimes we, you know, look

33:36

at booking trips to Turkey to

33:38

make ourselves feel young. We

33:41

do. I do. I'm looking at this

33:43

one spot in my hairline some days, babe.

33:45

And I'm like, I can go to Turkey.

33:47

And I could stay

33:50

viral. But, but anyway,

33:52

we have lived through a lot

33:54

of social media sites dying. Yeah. A lot

33:56

of them. I mean, those died

33:58

because they were bought, or

34:00

they stopped being cool. They weren't shut

34:02

down by the government. No, they weren't

34:05

shut down by the government. It's new, it's

34:07

a different thing, it's a new thing.

34:09

Yeah, that's part of

34:11

the new millennium world. That wouldn't have

34:13

happened to MySpace. They would not.

34:15

Actually, weirdly, it kind of did. Not the

34:17

government, but Rupert Murdoch. Yeah, there it

34:19

is. So let's go down some of

34:21

these, some of these that we were

34:23

talking about. Here is MySpace. And

34:25

we have listeners who are younger who

34:27

have never been on MySpace. Well, I

34:29

mean, MySpace was such a... Number

34:31

one, it felt elite because it was the

34:34

first of its kind. Yeah, so it felt

34:36

super cool, and you could customize it in

34:38

a way that you can't customize most social

34:40

media. You could make it very much

34:42

your own. Yeah, you could make your backgrounds

34:44

all sparkly, you could add music,

34:46

and you had to do it using HTML.

34:49

Yeah, you actually had to go in and

34:51

know a little tiny bit, or copy-paste

34:53

a little tiny bit. You just had to

34:55

make friends on AOL Messenger who knew how

34:57

to do it, and so they

34:59

could send you all the copy to put

35:01

in there. Yeah, so that way you

35:04

could... yeah, because, like, you couldn't just

35:06

be like, oh... So you would

35:08

customize your landing page, is what it

35:10

was. And so I was like,

35:12

I want mine red, black letters. Whoa.

35:14

So I had to know the Pantone

35:17

color for red. Like, I didn't know

35:19

the number. Like, I didn't know

35:21

the number. Yeah, the number, because there's

35:23

like certain codes and hashtags and

35:26

all this other different stuff you

35:28

had to know how to put in.

35:30

Jeffree Star. Yep. They all made their

35:32

name on MySpace. And we have an

35:34

episode about Jeffree Star. Yeah,

35:36

it's very early. Very, very good.

35:38

Good mics then. Yeah, I get really

35:41

mad about it. Some of our very

35:43

early episodes, because I didn't know, you

35:45

know, how well the pod would do.

35:47

And we didn't want to

35:49

invest until we knew what was

35:51

happening. Like I said, we're not in

35:54

the gambling. We're not in a gambling.

35:56

And so draft kings can

35:58

stop sending those emails. Which eventually

36:00

spread to like a top 36. Yeah, it's when

36:02

you knew we were going downhill. But people

36:04

used to fight. I remember people fighting over

36:06

the top eight Yeah, you have to keep your friends

36:08

at the top and then if you were mad at

36:10

somebody you pop them out of the top eight Put someone

36:13

else in there. Uh-huh, passive aggressive. Oh, maybe that's

36:15

why I didn't like it, because

36:17

I'm not a passive aggressive person. I didn't

36:19

have eight. Oh I didn't have

36:21

eight friends that I'd want to put in

36:23

there. There you go, which is what Dane

36:25

Cook was for. Yeah, Dane Cook was like

36:27

some of you only have six friends. Put

36:29

Dane Cook up there. Dane Cook up there.

36:31

Yeah, you know what I mean and that

36:33

like, literally, I would love to

36:35

find my own MySpace and log into it.

36:37

You can't. But yeah, I can't. Well, you can't

36:39

because, and here we go, this

36:41

is what happened: MySpace got sold to News

36:44

Corp. Mm-hmm, which is owned by Rupert Murdoch.

36:46

Yep. And he spent a lot. I think he

36:48

spent like a billion dollars for it. He spent

36:50

a shit ton of money. Tom, who was

36:52

everybody's first friend on MySpace. This is the great

36:54

thing about MySpace, for those of you guys who don't know:

36:56

when you first joined it, you had one friend,

36:58

and his name was Tom. And he was the

37:00

founder of MySpace, and there was a picture of

37:02

him looking over his shoulder. And he was just

37:04

a nice guy. And then one day, when Tom

37:07

sold to News Corp, Tom got

37:09

a billion dollars. Yeah, and he did

37:11

something that a billionaire has never

37:13

done before: he fucked off forever. Oh,

37:15

what a thought. He just fucked

37:17

off. Yo, real quick, shout out Tom

37:19

for just fucking off. He just

37:21

went away. He took the money

37:23

to go live a life. He's like,

37:25

I'll shut the fuck up on an island.

37:27

You don't need to try to run

37:29

the government. No, he's like, though, do you want to

37:31

go do a podcast, Tom from MySpace? He's like, why would I ever

37:33

want to do that? I'm busy in the woods,

37:35

or on a boat, or an island,

37:37

or fishing. Something else every day

37:39

is great. I don't need to fly

37:41

to Austin to go on Rogan. All

37:44

of my needs are met. Yeah, I'll

37:46

never have a need not met. Damn,

37:48

living in the woods like Burt's Bees

37:50

in a tiny cabin. That's

37:52

what Tom did. And so News Corp bought

37:54

it, and they didn't know what

37:56

social media was. Yeah, and they tried doing

37:58

a bunch of different pivots. And eventually it

38:00

got sold off, and I think sold

38:03

off again, and somewhere in there,

38:05

The wrong server got deleted. Yep.

38:07

And all of our MySpace photos,

38:09

pictures, videos, all of our likes, all of

38:11

our reposts. I don't think that that

38:13

hacker group Anonymous is real, right?

38:15

Yeah. I never had any faith in them.

38:18

People are always like, Robbie, you're going on

38:20

the internet. Where's Anonymous? Like, they're not coming

38:22

to save us. But I do think

38:24

if Anonymous is real, then they're the ones

38:27

that deleted the server to save us all

38:29

from the pictures of how bad our hair

38:31

was back then. See, no, I don't think that's

38:33

true. I think, because I've worked in enough companies

38:35

to know that people are just lazy and dumb.

38:38

Just somebody named Janet hit a wrong button. Yeah,

38:40

or answered a phishing email she shouldn't have.

38:42

Yeah, something happened and then someone or it wasn't

38:44

labeled. And they're like, oh yeah, you can throw that

38:46

one out. The whole stack? And the guy looked at it and

38:48

was like... It was malicious compliance. I guarantee you it

38:51

was probably malicious compliance by one of those guys, like

38:53

a janitor or something. He's like, you want me to

38:55

throw out this whole stack? And they're like, yeah, he's

38:57

like, the one marked, don't throw out. I told you

38:59

to throw it out, Steve. Yep. So Steve said, all

39:01

right. And he took it outside and he crushed it.

39:04

And he's like, just sign this paper saying you

39:06

want me to do it. Yeah, because it's not going

39:08

to be on me. Steve always covers his ass.

39:10

Of course he does. Cover your ass.

39:12

CYA, baby. That's why you CYA Steve.

39:14

So MySpace dies. MySpace dies. And

39:17

the thing is, MySpace was

39:19

replaced by what? Facebook. It was.

39:21

And Facebook,

39:23

for those of you guys who don't know,

39:25

used to be fun. So first, when

39:28

Facebook. A lot of people forget this.

39:30

When Facebook first rolled out, it was

39:32

invite only. Yep, you had to get

39:34

a link to get in. You had

39:37

to be in a college. You

39:39

had to be in a green

39:41

lit college, no less. So, because

39:43

it was, what, Harvard? Not Harvard.

39:45

I didn't think it was, yeah. I

39:47

know, I wasn't there for that. No,

39:49

I'm saying where it was

39:51

founded. Number one, wasn't in

39:54

college. And it made with

39:56

Stanford. Number two, wasn't really

39:58

on the internet. University and

40:00

it was these specific Ivy League

40:02

ones and then because those people

40:04

who go to those schools knew

40:06

other people other Ivy leagues and

40:08

so that's how it started out

40:11

it spread there and then it

40:13

spread a little more my space

40:15

was for everybody my space was

40:17

like hot topic or Spencer's gifts

40:19

or one of those things Facebook

40:21

when Facebook first rolled out it

40:23

was Abercrombie and Fitch Oh, I

40:25

was going to say, if you're,

40:27

I was like, kind of say,

40:29

like, Neiman Marcus. No, it wasn't

40:31

Neiman Marcus. It was more this,

40:33

like, well, because they were judging

40:35

everybody and how they looked, because

40:37

it was a popularity contest. It

40:39

was 100% a popularity contest. But

40:42

in the same way, Abercrombie was,

40:44

it was, in the same way

40:46

Abercrombie was. It was, in the

40:48

same way Abercromm was, They never

40:50

let him speak to a customer.

40:52

And he did not know that

40:54

he was considered ugly. He had

40:56

a really, check him in the

40:58

mail. Until the settlement hit. I'm

41:00

sorry. Hey buddy. Here's your, here's

41:02

your, here's your, here's your, here's

41:04

your, here's your, here's your, here's

41:06

your, here's your, here's your, here's

41:08

your, here's your, here's your, here's

41:10

your, here's, and I would use

41:13

it a lot for comedy and

41:15

those things when I was doing

41:17

stand-up, really early on.

41:19

But early Facebook, you could, that's

41:21

all your friends were on there,

41:23

but it was a really great

41:25

way to have group chats, and

41:27

it was a really great way

41:29

to have Facebook events, and spam

41:31

the fuck out of them. We

41:33

used to put the exact time,

41:35

place, address, and every single person

41:37

who was gonna be in the

41:39

building on to Facebook. Yeah, you

41:42

can't do that now. And it

41:44

used to be public for anyone

41:46

to join. Yep. It was, they

41:48

weren't private. Anyone could join. You'd

41:50

be like, big party, Kim's house.

41:52

Yeah, this time mom's away. It

41:54

would be Mom's Away at one

41:56

two three Main Street. Yeah, seven

41:58

p.m. till three a.m., underage

42:00

drinking allowed. That would be the

42:02

title. That would be the title. And

42:05

it would be shared. It would just

42:07

have a picture of a cooler

42:09

filled with jungle juice. Yeah. And

42:11

then everyone would get into it,

42:13

and that group, that page, would

42:15

live on as a group. Yeah,

42:17

because people would then start to

42:19

post pictures from the party. Yeah. And

42:21

then tag everybody who was there.

42:23

Y'all, we were doing crimes.

42:26

And we were creating data trails.

42:28

And we were putting it everywhere.

42:30

Everywhere. And then you know what

42:32

happened? Around 2014, you know what

42:34

happened? All of our parents joined.

42:37

All of our parents were convinced

42:39

to join Facebook. And they showed

42:41

up and they're like, what's this

42:43

thing over here? And before we knew

42:45

it. And they said, oh, I'll post

42:47

my Social Security card. Yep. Sure, why

42:50

not. One day we all logged in, and

42:52

then suddenly our entire Facebook feed was

42:54

just people saying, I do not give

42:56

Facebook the ability to share my

42:59

personal data. Yeah, and it was the

43:01

most deep-fried screenshot because it had been

43:03

copy-pasted so many times, so many times

43:05

you can't even read the letters. And

43:08

that was that. I was like, I

43:10

don't know, this place kind of sucks. But

43:12

the only reason I stayed

43:14

around on Facebook as long

43:16

as I did was because I was doing

43:18

stand-up. Yeah. And it wasn't until

43:21

about 2017 when the Cambridge

43:23

Analytica scandal broke. I'm sorry,

43:25

what was that? Cambridge Analytica

43:28

was a scandal. So Facebook

43:30

was doing psychological experiments on

43:32

their users? Oh, what? Yeah. They

43:34

had tried out a bunch of

43:37

different things as marketing stuff and

43:39

then they also had all these

43:41

crazy data points on everyone. Everyone.

43:43

And Cambridge Analytica was a political

43:46

consulting group that figured out basically

43:48

how to hack people's brains on

43:50

Facebook by using, it was like

43:53

50 million Facebook users. They got

43:55

a hold of all of their

43:57

data and then were able to,

44:00

like, specifically target groups of people

44:02

in swing states and others across

44:04

the United States and the world

44:06

to convince people that the opposite

44:08

of what they wanted was what

44:10

they should vote for. And never

44:12

heard of that happening before. We

44:15

should ban this app. We should. And

44:17

between that and a few other things, and

44:19

me finally stepping away from doing stand

44:21

up, I decided and you decided we deleted

44:23

our Facebook. I think I only made

44:25

a Facebook when you and I started dating.

44:27

I don't think I had one before.

44:29

You had to have, because that's how you

44:32

and I, you and I actually, our

44:34

first communication was in Facebook Messenger.

44:36

Yeah. Yeah. I asked you out on

44:38

a date on Facebook. That's right. In 2012.

44:40

When did I get one? Okay. So I

44:42

had it at least in 2012. I guess 2010. Here's

44:44

the thing I know. I don't think

44:46

I had it while I was still like

44:48

out in the world, partying and drinking

44:50

because even though we're talking about like, oh,

44:52

we had party crimes. I

44:54

was committing enough crimes in my day-to-day

44:56

life that I knew enough to not

44:58

put them online, for the most part.

45:01

I'm sure there's some crimes online, but

45:03

like the kind of crimes where you just get

45:05

fines, not the kind where you get arrested. And

45:07

so I feel like maybe she was stealing

45:09

parking cones. I was. I still do.

45:11

I see a parking cone. Oh, is anybody using

45:13

that? That's mine. Hey, I'm gonna need

45:15

that later. Law of the streets. I'm gonna

45:17

save a spot. So I feel like

45:19

it might have been I know that I

45:21

was late because I know that when

45:23

I signed up and finally created a Facebook

45:25

page, I went and I had to

45:27

like find all the friends and it was

45:29

like everybody was already there. They had

45:32

a backlog of information and then yeah, you

45:34

and I just were both like, this

45:36

is lame. We're out. Yeah. It took us

45:38

forever to delete it, because we had

45:40

to, like, download all our pictures. Shutting it

45:42

down took forever. But it

45:44

was leaving that, and then leaving Instagram at

45:46

the same time. And it wasn't until

45:48

2022 when we decided to really start focusing

45:50

on content creation because I got popular

45:52

on TikTok that we came back and it

45:54

forced me to make a Facebook so

45:56

I could make an Instagram account. Yeah, it

45:58

was like, if you want to have

46:00

an Instagram, you have to have this. And to many of our

46:03

listeners who have reached out to me in the last few days,

46:05

we are very aware that there is

46:07

a Facebook account with 38,000 followers that

46:09

is called Pearlmania 500, that is a

46:11

huge supporter of RFK Jr. That is

46:13

not me. We're gonna figure that out.

46:16

My account has not been hacked. We

46:18

knew of the clones who've been posting

46:20

as me and impersonating me on Facebook for

46:22

some time, but we have not really

46:24

looked at Facebook as a place we really

46:26

cared too much about. We are going

46:29

to, it's just been, it was, it

46:31

was another thing to do. Between having

46:33

to post on, having to repost on

46:35

TikTok, YouTube Shorts, Instagram Reels, Blue Sky,

46:37

Tumblr. and threads, and then cross posting

46:39

between all the different ones, then going

46:42

on to a live to promote, and

46:44

then making sure it was in the

46:46

Instagram group chat, and then coming back

46:49

over to the Patreon at Pearlmania500.net, to

46:51

making sure that you all are understood,

46:53

and then doing it. It's a lot.

46:56

It's a lot of moving pieces. Yeah.

46:58

So we picked and chose what's

47:00

important to us. And Facebook's not important.

47:03

It really isn't, but seeing that someone

47:05

is using my face to promote RFK

47:07

Jr. and colloidal silver? Crazy. Doesn't

47:10

know the lore. Doesn't know the

47:12

lore. Which is wild. Which brings

47:14

us to the last one that

47:16

we want to eulogize today. Which

47:18

was your favorite. My baby, my

47:20

Pinterest. She got murdered by the

47:22

AI. Yeah, AI Pinterest. Pinterest

47:24

used to be so fun. Pinterest

47:26

is like the most, was the

47:28

most calming. Like you could just open

47:30

it up and you just look at

47:32

pictures of nice things. And my

47:35

Pinterest is, like, so old, so

47:37

it's like, it's, I have deeply

47:40

built that algorithm for a long

47:42

time. The pretty cakes, the lovely

47:44

gardens, a focaccia that looks like

47:47

a flower. Like it's wonderful, a

47:49

cute haircut, a pair of pants I

47:51

like. That's what it used to

47:53

be. And now, it's just AI

47:55

slop. And now...

47:58

I haven't seen something that's not

48:00

an AI garden in a year.

48:02

So it's like it's so unusable

48:04

now. Yeah, it's terrible. Yeah,

48:06

it's absolutely terrible. Because

48:08

you used to show me like cool

48:11

things. Where did you find that?

48:13

Like Pinterest? And I want to

48:15

do this to like our kitchen.

48:17

Yeah. And now you'll show me

48:19

something and I'm like, right? Yeah.

48:21

Why does she have six, seven

48:23

fingers? Every single one of them.

48:25

Why does the kitchen sink have

48:27

two sinks and why is there

48:29

a candle lit in the sink?

48:31

Also, but this

48:33

issue with Pinterest is an issue

48:35

across all of social media now,

48:37

and it's going to get

48:40

worse, especially on Meta platforms now

48:42

that they have gone and completely

48:44

destroyed it. And it's also going

48:46

to be an issue for, um,

48:48

uh, YouTube, especially. Yeah. It's such

48:50

an issue for YouTube and it's

48:52

being specifically targeted at kids. Also, here's

48:54

the thing, it's like... AI is bad for

48:56

the environment. Yeah. And I think that's what

48:58

upsets me the most. It's like, it's one

49:00

thing that I'm looking at this and you're

49:02

like ruining the thing I used to look

49:05

at. But then I look at it and

49:07

it's so terrible and I'm like, you ruin

49:09

the environment for this? Yeah. This is what

49:11

you ruin the environment for? And you know

49:13

what? When it comes to AI, that

49:15

leads into a very specific theory. And that

49:17

theory, we're going to talk about in

49:19

the next segment. What's

49:22

that? Because I don't know. And

49:24

you're going to explain

49:26

to me what dead

49:28

internet theory is. You've

49:31

been screaming it for

49:33

weeks and I have not

49:35

asked you to elaborate because

49:37

I've been busy doing other

49:40

things. Yeah. And so basically

49:42

there has been a theory

49:44

that's been out for a

49:46

very long time. Yeah. That

49:49

we aren't speaking to people.

49:51

Okay. Ever. That most of the

49:53

internet, if not all of the

49:55

internet, since about 2016, 2017, comment

49:58

sections, all those different things, that

50:00

none of these are real people, that

50:02

in fact these are all bots. Okay,

50:04

all right. And that belief. I've seen

50:06

your Instagram comment section. I can believe

50:08

it. But that belief has

50:10

actually gotten worse since the rollout of

50:13

ChatGPT. Specifically, because ChatGPT is

50:15

the first one where there's a language

50:17

model. Before, the bots might be like,

50:19

hey, we're gonna create a bot that

50:21

anytime somebody posts, like in 2016, anytime

50:23

somebody posts, I'm voting for Hillary Clinton...

50:25

When you started trying to explain this

50:28

to me before, was it on

50:30

Threads or Blue Sky, and every

50:32

time somebody posted something nice, they got

50:34

responded to by hate comments? Yeah, because

50:36

like the point is to make people

50:38

hate the platform and leave. It's also

50:40

to destroy discourse. Yeah. So one

50:42

thing, you know, this is, um... We're

50:45

working on a couple different things. And

50:47

this one in particular is a, this

50:49

is kind of like a term of

50:51

service that I want all of you

50:53

guys to keep in mind. Okay. Okay.

50:55

Everybody out there listening, you know, listening

50:57

at home, in the car, anything, when

51:00

you're on the internet, and a stranger

51:02

is rude to you, block them. Yeah.

51:04

If they say something you don't like,

51:06

or they're, and especially if they're doing

51:08

it in a way that's intentional, that's

51:10

fine. But you're like, I like carrots.

51:12

They're like, oh, oh, you fucking hate

51:14

peas? Block them. Block that person. Because

51:17

more than likely, that person, especially as

51:19

we saw on Blue Sky and a

51:21

few data scientists that I've been talking

51:23

to, have been pointing this stuff out,

51:25

they are designed to destroy confidence and

51:27

any sort of possibility of unity. Yeah.

51:29

These, they're beyond trolls. Their existence is

51:32

literally to sow discord. Their existence is

51:34

literally to break our brains. Yeah. And

51:36

we've said, I think I've said this

51:38

before, I don't know if it was

51:40

on an episode of the warm-up, or if

51:42

it was on a specific episode of

51:44

the pod before, but... in the living

51:46

room? I don't know. This all comes

51:49

together like so often. But there are

51:51

different types of bots that do different

51:53

types of things. And I saw it

51:55

on Blue Sky, but I also know

51:57

that for a fact that they exist

51:59

on Twitter. Yeah. And their job is

52:01

literally to make you, the human user,

52:04

feel like your opinion is worthless. So

52:06

you'll stop talking. That you also will

52:08

believe that whatever you think is not

52:10

the majority. So therefore, something that might

52:12

not be in the majority could seem

52:14

like it has a mandate. Yeah. So

52:16

I mean, that's just like part of

52:19

the theory. But the other part of

52:21

it is that there are not only

52:23

coordinated bots, it's not only coordinated

52:25

bots that are out there, but that

52:27

the masters of the bots may have

52:29

lost control. Yeah, yeah, yeah. And so

52:31

it's bots talking to bots looping back

52:33

into themselves, which is how we end

52:36

up on Facebook. 10,000 comments saying, oh

52:38

my God, I love shrimp Jesus. Yeah,

52:40

because it's, well, I think some of

52:42

our grandparents are responding to shrimp Jesus.

52:44

Yes. I do think that it's just

52:46

bots responding to shrimp Jesus. For those

52:48

of you guys who don't know what

52:51

shrimp Jesus is. Please Google it. No,

52:53

you can. You can Google it. But

52:55

also if you go to the Wikipedia

52:57

page. There's actually too many if you

52:59

Google it. Yeah. If you go to

53:01

the Wikipedia page for Dead Internet Theory,

53:03

there is a picture of Shrimp Jesus.

53:05

Oh, there he is. Shrimp Jesus. He's

53:08

there. But basically what you need to

53:10

know is AI. The AI models that

53:12

have been running a lot of different,

53:14

especially Facebook pages and groups, that are

53:16

full of AI models that are posing

53:18

as actual accounts, what they have been

53:20

doing is they've created a feedback loop,

53:23

where they know, they'll put in a

53:25

basic thing to like, listen, we know

53:27

that we're aiming for dumb conservatives living

53:29

in the middle of the country, and

53:31

what do they love? They love Jesus.

53:33

And so over time a couple people

53:35

who were in there also might be

53:38

posting about like hey I just got

53:40

back from Louisiana I had a really

53:42

good time eating crawfish. All right, now

53:44

we have two data points. They love

53:46

Jesus and they love crawfish. And bubble

53:48

gum shrimp. And slowly, over time, these

53:50

things start to get melded together. And

53:52

so weirder and weirder stuff starts to

53:55

pour out to the point where now,

53:57

like, people have very little confidence that

53:59

anyone they're speaking to is real, unless you specifically know

54:01

them and you walk up to them

54:03

at a bar and they're like, hey,

54:05

I love your Facebook post the other

54:07

day? Yeah. No one's ever said that

54:10

in the last 20 years. No, yeah,

54:12

it's been. The last time I heard

54:14

somebody say, I loved your Facebook post,

54:16

or, you know, I love following you on

54:18

Facebook or any of that different stuff,

54:22

I want to say it was about

54:24

2015. Yeah. So yeah, about 10 years

54:27

ago. Yeah. Because I used to, that

54:29

was one thing that used to really

54:31

annoy me on Facebook, which was when

54:33

I ran into people I hadn't seen

54:35

in a while, and I had nothing to

54:37

tell them. Yes, because

54:46

they were silently following me. I mean

54:48

you and I still get in arguments

54:50

about stuff like this over sharing so

54:52

that when you sit down to dinner,

54:54

there's nothing to talk about. What are

54:56

we going to talk about? What are

54:59

we going to talk about? And that's

55:01

why I stopped, I actually, you and

55:03

I stopped texting each other about what was

55:05

happening at work throughout the day. So

55:07

we actually had something to fucking talk

55:09

about when we sat down because I'd

55:11

be like, oh yeah, you know, and

55:14

so and so was a real asshole

55:16

at work. You're like, I know, you

55:18

texted me and when you were on

55:20

your smoke break. And I'm like, oh

55:22

yeah, I forgot. And then you'd be

55:24

like, yeah, and this fucking idiot didn't

55:26

fill out the payroll. I'm like, I

55:29

know. And so

55:37

that we can have conversations. But that

55:39

same thing happened to us for Facebook,

55:41

and especially if you were somebody like

55:43

Mrs. P. and myself who we don't

55:46

want to toot our own

55:48

horn, but we live a more interesting

55:50

life than most of the people we

55:52

would run into who we hadn't seen

55:54

since high school. And so they'd be

55:56

like, oh, I saw that you were

55:58

doing stand up or I saw that

56:01

you just got back from this place.

56:03

Or oh, yeah, I saw like two

56:05

years ago. And I'm like, I was

56:07

on a

56:09

YouTube live with Philip DeFranco, with 25,000

56:11

people watching simultaneously. And that same day,

56:13

there was an article about me being

56:15

the Philadelphia face of RedNote in

56:18

the news. I completely forgot all this

56:20

stuff. And that same day, our son

56:22

took his first steps. And also, that

56:24

was around the same time I found

56:26

out that Fox News made me the

56:28

face of crying influencers. We didn't even

56:30

talk about that earlier. Dead internet. Dead

56:33

internet. Because again, all these things too.

56:35

Wait, we have to talk about the

56:37

Ashley bots. Yes, we're going to. Okay.

56:39

We're going to talk about it right

56:41

now. So there's a thing that I

56:43

know, and the reason why I don't

56:45

trust 90, there's a certain way, and

56:47

this is for the listeners, this is

56:50

how you can tell, you can spot

56:52

a bot immediately. Yeah. Number one, go

56:54

to their page. All right, listen, this

56:56

isn't, this isn't specifically for everybody. This

56:58

is a specific bot that I get

57:00

targeted by a lot

57:02

in my comments. Is the

57:09

person wearing a bikini in their profile

57:11

picture? Yeah. More than likely. That's a

57:13

bot? That's a bot. I have two

57:15

bots in particular. There's also more, but

57:17

there's two that this one botnet

57:20

really enjoys. One is named Ashley

57:22

Ivy and the other one's named Wendy

57:26

Ivy. You can type them in the

57:28

TikTok. Ashley. Or Wendy. Or Wendy. And

57:30

there's thousands of them. They all are

57:32

like, that's Ashley Ivy and then a

57:34

string of numbers. And then a string

57:37

of numbers. Yeah. They're always the same

57:39

five photos. They're all AI generated. And

57:41

then she has a link to sign

57:43

up for an OnlyFans knockoff that

57:45

is there to steal your identity. Yeah.

57:47

That's completely what this is. That's what

57:49

this is. It's called Only France. Yep.

57:52

And it says pictures of Fran Drescher.

57:54

And then I paid for it because

57:56

I was like, I love Fran Drescher.

57:58

I was working

58:00

on how to match up the dressing

58:02

queen. So I gave them your

58:04

credit card. Yeah, my credit

58:06

card number. Thank you so much. That's

58:09

fine. That's fine. Well, I typed in

58:11

your social security number. Into Facebook? Yeah.

58:13

Oh my God. Well, because they said,

58:15

they said that if I wanted to

58:17

get verified on Tiktok, I had to

58:19

type it into Facebook. Oh, that was

58:21

one of the other conspiracies that I saw

58:24

today. Someone got sent a phishing email

58:26

where the phishers, because I get them

58:28

all the time, the people who are

58:30

sending the scams. Yeah. Forget if they're

58:32

pretending to be Facebook or if they're

58:34

pretending to be Tiktok because they're just

58:36

mass blasting them. So their profile picture

58:39

on Google or whatever they're using that

58:41

pops up is the meta symbol, but

58:43

then they claim their Tiktok and they're

58:45

like, this is proof that meta bought

58:47

Tiktok. I'm like, no, that's proof that

58:49

that's a phishing scam. Oh my God,

58:51

don't click that. Does nobody do phishing

58:53

scam emails anymore? Anyway. Anyway, what

58:56

Ashley Bot does is she takes my

58:58

caption or the AI also reads my

59:00

subtitles from my very funny subtitles. Very

59:02

funny. From the entire video. And then

59:04

it replies using a ChatGPT-style

59:06

summary. And you'll also notice these on

59:08

Twitter, and you'll also notice these on

59:11

Blue Sky, and you'll also notice these

59:13

on Instagram a lot. So if I

59:15

make a video, and I write the

59:17

caption to be, I really enjoyed the

59:19

Garfield movie. However, there's a few things

59:21

I really didn't like about it. Hashtag

59:23

Chris Pratt, hashtag bad dad. If that's

59:25

what it was, Ashley Ivy will respond.

59:28

Glad you enjoyed the Garfield movie, too

59:30

bad about that bad dad that's in

59:32

it. Yeah, like that's what Ashley Ivy

59:34

will respond and don't you hear how

59:36

that already feels? Shitty. It already feels

59:38

gross This is what that bot does

59:40

and what it's doing is this one

59:43

in particular is trying to grab on

59:45

and then pull forward The one negative

59:47

bot that I found that all it

59:49

does is post hate towards whatever anybody

59:51

is saying. Yeah, if you say you

59:53

like something it immediately goes you're wrong

59:55

The one we watched was

59:57

Interior Chinatown. Interior Chinatown. Love it. On

1:00:00

Hulu. It was a TV show. Really,

1:00:02

really, really, really good. Great book. Everything.

1:00:04

We watched. We watched like the first

1:00:06

three episodes and I was like, you

1:00:08

know, Blue Sky's a little different. I

1:00:10

really enjoy it. I like the people

1:00:12

there and it's not as toxic as

1:00:15

Twitter. Yep. So I skeeted, because that's

1:00:17

what it's called. We call it Skeets.

1:00:19

So I skeeted that. And like

1:00:30

I screenshot them and I block them.

1:00:32

And I wrote a post about it.

1:00:34

And then so I replied like, oh,

1:00:36

that's a bot. I said, what? And

1:00:38

then I unblocked it and looked, and

1:00:40

all it was, its whole feed was

1:00:42

whatever you're talking about, oh, that's stupid

1:00:44

and you're dumb. Over and over and

1:00:49

over and over and over again. And

1:00:51

we're like, why did this exist? And

1:00:53

then slowly, somebody explained it. It is

1:00:55

there to make you not want to

1:00:57

talk. Yeah. Which is

1:01:06

why the only safe comment section on

1:01:08

the internet can be found at Pearl

1:01:10

Mania 500.net. Yeah, I mean, that's not

1:01:12

wrong. That's not wrong. Those. We have

1:01:14

elite commenters over at Pearl Mania 500.net

1:01:16

on the Patreon. Especially, especially among our

1:01:19

grandfathered in hey huns, our team leads, our

1:01:21

downline. Fed tier. Hey, what were we

1:01:23

talking about with the new Fed tier?

1:01:25

The other day we said, oh, was

1:01:27

somebody in the federal government

1:01:29

listening to our podcast? And I

1:01:31

was like they got to pay Fed

1:01:34

Tier. Oh no no no because what

1:01:36

we said is we lost our Chinese

1:01:38

spy and now it's going to be

1:01:40

an FBI. Yeah if you want to

1:01:42

if you really want to know what

1:01:44

we're doing on our phones you got

1:01:46

to pay Fed Tier 30 dollar Fed

1:01:48

Tier. But in general, with

1:01:51

all this different stuff, the internet at

1:01:53

this point, it's going to get even

1:01:55

worse, because the thing that people really

1:01:57

liked was we had a pretty good

1:01:59

idea that who you were looking at

1:02:01

was a person. Yeah. And even that

1:02:03

is now being walked back more and more.

1:02:06

So what we're saying is, or what

1:02:08

I'm thinking is, I can't speak for

1:02:10

anybody else but myself, I'm saying we're

1:02:12

going outside. Well, exactly, we're going outside,

1:02:14

but also more importantly, touring artists, touring

1:02:16

comedians, touring podcasts, all these different things,

1:02:18

are going to become more, more and

1:02:21

more important, because that is going to

1:02:23

be your verification. Because I don't know

1:02:25

if you guys know this, dead internet

1:02:27

theory is coming to podcast. Oh yeah,

1:02:29

I saw that where you can just

1:02:31

like make a ChatGPT podcaster. Yes.

1:02:33

They're listening to my voice right now

1:02:35

and they're like, hmm, is that gentle

1:02:38

marbles? Yeah. They are. They are. But

1:02:40

also at the same time, and it

1:02:42

was totally scary. It's an interesting idea.

1:02:44

That's the thing. Some of these things

1:02:46

were not designed in malice. Yeah. And

1:02:48

this one kind of blew my mind.

1:02:50

Because for those of you guys who don't

1:02:53

know the backstory,

1:02:55

I had a job working in data

1:02:57

entry at a bank and they're like,

1:02:59

this is a really good job, like

1:03:01

all these different things. And I was

1:03:03

like, yeah, and I was hoping that

1:03:05

when I first got the job, like,

1:03:07

oh, I could keep that for like

1:03:10

30 years. And then ChatGPT rolled out

1:03:12

and I saw how fast automation was

1:03:14

taking over. I was like, this job

1:03:16

won't exist in two years. And that

1:03:18

wasn't the reason why I pivoted towards

1:03:20

content creation, but it was one of

1:03:22

the things that made me less scared

1:03:25

to pivot towards content. Either way, I

1:03:27

don't have a job. Once I heard

1:03:29

this podcasting, I know exactly why it

1:03:31

was created. It was created because, hey,

1:03:33

let's say we had to read a

1:03:35

book like Bored Gay Werewolf, right? You

1:03:37

don't have to. You can listen to

1:03:40

our podcast episode, and I'll explain the

1:03:42

entire book to you. And it's super

1:03:44

fun. But then you should still read

1:03:46

it. Why wait for us to read

1:03:48

the entire book when now you can

1:03:50

type in, give me a summary of

1:03:52

Bored Gay Werewolf that sounds like a

1:03:54

husband and wife duo sitting across from

1:03:57

each other at a desk explaining

1:03:59

it in pithy ways. And what they

1:04:01

did is they trained an AI model

1:04:03

on a lot of two and three

1:04:05

person podcasts. And you can even hear

1:04:07

them get things a little bit wrong,

1:04:09

redouble down, dial back, all those different

1:04:12

things to the point where like, it

1:04:14

was scary. Yeah, it was gross. And

1:04:16

it was it was horrifying. And then

1:04:18

when I was listening to it, I

1:04:20

was like, oh, you morons don't realize

1:04:22

you may have just destroyed another industry.

1:04:24

They love destroying an industry. But the

1:04:26

thing is, is they think they're being

1:04:29

helpful because some people only really do

1:04:31

learn information this way now. Where they're

1:04:33

like, hey, it's easier for me to

1:04:35

listen to a podcast and go and

1:04:37

listen to people go back and forth.

1:04:39

It was like a textbook chapter of

1:04:41

homework. Yeah, they were explaining like quantum

1:04:44

physics or something. Yeah, they had two

1:04:46

people explaining the homework and I said,

1:04:48

oh my god, actually, that would have

1:04:50

helped me so much in high school.

1:04:52

Yes. Oh, that would have been so

1:04:54

useful. It's taking difficult topics and making

1:04:56

it conversational, which is kind of what

1:04:58

we do. Yeah. But also what I

1:05:01

do on TikTok. That's coming soon. Do

1:05:03

you guys know about the Stargate? Half

1:05:05

trillion dollar AI that's going to build

1:05:07

personalized mRNA vaccines to get rid of

1:05:09

the cancer? Where did they get all

1:05:11

the genetics from? Hmm. I don't know.

1:05:13

23andMe. So the internet's dead. We're going

1:05:16

outside. Yeah, but this is also one

1:05:18

of those things where, as we

1:05:20

talk through the eulogies. Yeah. And we

1:05:22

talk through TikTok and the algorithms. All

1:05:24

these things, all these little puzzle pieces.

1:05:26

Start to come together and you draw

1:05:28

your own conclusion. No, no, no, our

1:05:31

own conclusion. Do your own conclusion. Do

1:05:33

your research. Oh, yeah? Well, you don't

1:05:35

have to become a Tinfoil Hat podcast.

1:05:37

But I

1:05:41

start selling hair pills. Oh, hell yeah.

1:05:43

We're selling hair pills. We're going to

1:05:45

start selling you buckets of mac and

1:05:48

cheese. Oh, no. They're dry goods for

1:05:50

you. That's what he said.

1:05:52

There's an AI

1:05:54

data center coming near you and they're

1:05:56

gonna try to sweep up your kids

1:05:58

So anyway, we're gonna sell Pickle buckets

1:06:00

filled with powdered potatoes. Hmm. And you,

1:06:03

Mrs. P? With that, I think

1:06:05

this has been an incredible episode. I

1:06:07

what, I feel like we've been everywhere.

1:06:07

We have. This has been a little

1:06:09

bit more all over the place. Yeah,

1:06:11

but because we've been a little

1:06:13

bit more all over the place as

1:06:15

a people. Yeah, in fairness, it wasn't

1:06:17

the normal research episode, because it's been

1:06:20

the weirdest week so far. So far.

1:06:22

So for all of our listeners out

1:06:26

there we are about to do shout-outs

1:06:28

hell yeah and then we're also gonna

1:06:30

pepper in some extra lore as we

1:06:32

always do. Yeah. And then we're gonna

1:06:35

tell you all the secret code word

1:06:37

at the end of the episode that

1:06:39

we'd love to hear you say down

1:06:41

in the comments for all of the

1:06:43

algorithms. So if you are not already

1:06:45

a member of our Patreon, Pearlmania

1:06:47

500.net is where you can join us,

1:06:49

or you can join us across all of

1:06:52

the social media, and hear about the things

1:07:00

that we have planned for in the

1:07:02

near future. What are those plans? We

1:07:04

don't know. We don't even know them.

1:07:07

We write them down and then I

1:07:09

lose the post. But that's the reason

1:07:11

why because we don't trust Google Drive

1:07:13

anymore. Why? Because they're using it to

1:07:15

train their LLMs. They literally are training

1:07:17

other podcasts. They steal my podcast episodes

1:07:19

I wrote. Yeah. Also Facebook openly admits

1:07:22

on Llama 4 that they're using all

1:07:24

of everything to train. We're all being

1:07:26

replaced slowly because if you're not paying

1:07:28

for it, you're the commodity. All right

1:07:30

Okay, calm down. We'll be right back

1:07:32

with our shout-outs. It'll be way more

1:07:34

fun than whatever he just said. Yeah

1:07:36

500.net! Oh, that's where if you want

1:07:39

to join this illustrious list of the

1:07:41

greatest people in the entire world, then

1:07:43

you can join us at Pearl Mania

1:07:45

500.net or patreon.com/PearlMania500. Become a

1:07:47

team lead today. Show the Pearlmaniacs of

1:07:49

the world. So here are our shout-outs.

1:07:51

I'm ready. Okay, we have a lot

1:07:54

this week. Hell yeah. Number one we

1:07:56

have, Carrow. Hey Carrow, wait like the

1:07:58

syrup? Yeah, that's spelled like the syrup.

1:08:00

Is that how it's spelled? Yeah. After

1:08:02

that we have Angela Anzman. Hey Hun.

1:08:04

After that we have M. M. It's

1:08:06

not M and M. Which could be

1:08:08

Mother's Milk from the Boys. I don't

1:08:11

know what that is. Exactly, I don't

1:08:13

let you watch the boys or read

1:08:15

the comic. It sounds weird when you

1:08:17

say you don't let me. Yeah, well,

1:08:19

there's certain things. It's not what happened.

1:08:21

What happened is you said, do

1:09:23

you want to watch the Boys with me? I

1:09:26

said, I'm not really interested. And you

1:09:28

said, thank God. It's not that, it's that also

1:08:58

one of the other things that happens

1:09:00

often is that, because it's happened with

1:09:02

like the Avengers for a long time,

1:09:04

you and I saw the first Avengers

1:09:06

movie together. Yeah. After I'd already just

1:09:08

gone out willy-nilly and seen all the

1:09:10

other movies and all the

1:09:13

TV shows. That's what I mean, like you're

1:09:45

not allowed to watch it. It's like,

1:09:47

but the only thing I've ever told

1:09:49

you actually not allowed to watch was

1:09:51

John Wick. Yeah, because of the puppies.

1:09:53

Yeah, because of the dead dog at

1:09:55

the beginning. Yeah, it would be too

1:09:57

much for you. We would just watch

1:09:59

Traitors. That's all we ever watch. Well

1:10:02

now, yeah, because we love Alan Cumming

1:10:04

and Bob the Drag Queen and all the

1:10:06

others. After that we have Glenda Maldonado.

1:10:08

Boardner. I hope I got that right

1:10:10

for you. I feel like I did.

1:10:12

Boardner. After that we have bootleg underscore

1:10:14

bleach. Hey. Hey, Hun. Hey, that's that's

1:10:17

crazy. That's a crazy name. After that

1:10:19

we have Kai Ward. Hey Kai Ward.

1:10:21

After that we have Charlie underscore and

1:10:23

Z. Oh, maybe that's New Zealand. Might

1:10:25

be New Zealand. It might be. Hey

1:10:27

Charlie, if you are in New Zealand,

1:10:29

can we come live with you? Do

1:10:32

you want to sponsor a family? For

1:10:34

a dollar a day. For a dollar

1:10:36

a day. We'll give you a

1:10:38

dollar. And then you can wash that

1:10:40

dollar for us. I have Jeanette, I

1:10:42

have Jeanette down. Okay. I'm going to

1:10:44

go with Guilla. Okay. Hey Jeanette! Jeanette

1:10:46

Garula! Hey Hun! After that we have

1:10:49

Courtney Morse! Hey Courtney! After that we

1:10:51

have Jillian nickel! Hey Jillian! All right,

1:10:53

this next one, this is one of

1:10:55

the long ones. I'm seeing underscores. Deleting,

1:10:57

underscore streaming, underscore 2, underscore afford, underscore

1:10:59

a team, underscore lead, underscore spot, underscore

1:11:01

finally, raise the minimum wage, underscore. That's

1:11:04

the fuck I'm talking about. I don't

1:11:06

usually see these. Yeah,

1:11:08

and then this popped up, it

1:11:10

just came up across my screen and

1:11:12

I laughed yeah so hard because I

1:11:14

was like absolutely because also at the

1:11:16

beginning of the year you've been deleting

1:11:18

our streaming oh you've been like I'm

1:11:21

not paying for this I'm not paying

1:11:23

for this I'm not paying for this

1:11:25

because honestly that's the thing is it

1:11:27

keeps going down to like I started

1:11:29

to realize like Paramount Plus like I

1:11:31

don't really watch that this other thing

1:11:33

I don't really watch that I'm like

1:11:36

why am I paying for any these

1:11:38

fucking things all I ever do is

1:11:40

open them and get sad. No I

1:11:42

just opened my phone and our TV

1:11:44

like I went through on the end

1:11:46

of December when I was coming down

1:11:48

from the chicken pox yeah I was

1:11:51

starting to feel my brain again and

1:11:53

I said oh all the subscriptions are

1:11:55

going to renew what can I go

1:11:57

through and cancel. Got your money back.

1:11:59

Thank you. After that we have most

1:12:01

underscore excellent underscore dude. Most excellent dude,

1:12:03

hey Hun. The only subscription I've, I

1:12:05

will probably have to pry from my cold,

1:12:08

dead hands is my Canva subscription. Oh

1:12:10

yeah, well, graphic design is your

1:12:12

passion. It is my passion and anyone

1:12:14

watching on the YouTube has seen some

1:12:16

crazy news. Yeah listen if you don't

1:12:18

if you're not a YouTubey person Just

1:12:20

go take a gander at YouTube to

1:12:23

see the beautiful artwork that I've been

1:12:25

making. That Pearlmania 500 has been

1:12:27

making on camera. He takes a lot

1:12:29

of time to make those. And I'm

1:12:31

getting better. You are getting better. And

1:12:33

my thumbnails are getting better? The ones

1:12:35

for the bird video? Oh my god,

1:12:37

they're so good. If y'all haven't gotten

1:12:40

down with the birds yet. After that,

1:12:42

we have Cassandra. Hey, Hun. Uh-oh. Cassandra.

1:12:44

Hey. I'm sorry that no one loves

1:12:46

us. I'm sorry. Don't worry, Cassandra. I

1:12:48

know exactly what that's like. Okay.

1:12:52

I'm sorry that you can have true

1:12:55

prophecies and nobody fucking hears you. Uh,

1:12:57

after that... I've known for a

1:12:59

very long time, that was one of

1:13:01

the names. I was like, if I

1:13:03

ever have a daughter, I'm fucking ironically

1:13:05

naming her that.

1:13:07

Rob Powell. Good strong name. After that

1:13:09

we have hot Marta. Hey Marta! With

1:13:12

sparkle emojis built in. Hot Marta is

1:13:14

hot. She made sure to bring the

1:13:16

sparkles. She brought the sparkles. She brought

1:13:18

the sparkles. Like a real girl. Yeah.

1:13:20

And honestly, can I tell you real

1:13:22

quick? Because sometimes I reply to the

1:13:24

messages on my desktop. So the Patreon

1:13:27

messages. I want to tell y'all. When

1:13:29

I write back, you get a thing.

1:13:31

It says like a smiley

1:13:33

face emoji. I don't answer the messages on

1:13:37

my phone because oftentimes people will tell

1:13:39

us in messages like, hey, we don't

1:13:42

want, um, your name. You're an elder millennial

1:13:44

and not a younger millennial. Well, no,

1:13:46

because I want to make sure I'm

1:13:48

not going to write them down separately

1:13:50

and then mess it up. So I

1:13:52

want to make sure that I don't

1:13:54

read the messages until days that we're

1:13:56

going to record, because I'm always nervous,

1:13:59

because we used to only get messages

1:14:01

that were about the names. Hey, I

1:14:03

really don't want you to say my

1:14:05

government name. I really want you to

1:14:07

say this other thing. I didn't figure

1:14:09

out how to do that. So I

1:14:11

don't check the messages regularly anymore

1:14:14

because I wanted to make sure when

1:14:16

I pull this list to put into

1:14:18

an Excel spreadsheet I'm doing it at

1:14:20

the right time. I'm explaining the list

1:14:22

there's doing not again me. I know.

1:14:24

After hot Marta we have a Shales.

1:14:26

I'm gonna go with Shales. It's C-H-A-E-L-Z.

1:14:28

It might be Chales. Chales. I feel

1:14:31

like it's Shales. Okay. Okay, there it

1:14:33

is. Yep, the lore hitting the lore.

1:14:35

Can I tell you when I saw

1:14:37

this pop up? I was like, oh,

1:14:39

yeah, Fox News did call me that

1:14:41

two days ago. I already forgot. I

1:14:43

had already forgot. For me, it was

1:14:46

a Monday. We used to have memories.

1:14:48

We used to, but then The Baby

1:14:50

destroyed them, which I've talked about multiple

1:14:52

times on stage, and maybe in a

1:14:54

YouTube comedy special that might be coming

1:14:56

out soon. Oh boy. We're trying, trying

1:14:58

to get that footage. We try to

1:15:00

do things, we try. We really try.

1:15:03

But when they said, Pearl Mania 500,

1:15:05

who appeared flustered, what a pull. I'm

1:15:07

going to put that over. Like I

1:15:09

feel like I want to put a

1:15:11

magnet on the fridge itself. Well, you

1:15:13

know, a few people said that they

1:15:15

will want that. Yeah, appeared flustered. Appeared flustered.

1:15:30

And I'm like, you know what? We're

1:15:33

thinking about some merch. We are thinking

1:15:35

about merch. After appeared, underscore, flustered,

1:15:37

We have Michelle Miller. Hey, Hun. After

1:15:39

that, we have Shelby Hayes. Hey Shelby.

1:15:41

After that, we have Crystal Blomquist. Crystal

1:15:43

Blomquist. That is a good character name.

1:15:45

That's like a character name. Yeah. Yeah,

1:15:47

she's a character. She's a character. She's

1:15:50

in a soap opera. It says H-I-D-D-I.

1:15:52

I'm gonna go with H-I-D-I. I'm with you.

1:16:02

After that we have Mark Domino. Hey

1:16:05

Mark with a K. Wow. Why? This

1:16:07

is just a joke from Empire Records.

1:16:09

Oh, is it? Yeah. Mark with

1:16:11

a K. From Empire Records.

1:16:13

as, underscore retribution, underscore for, underscore

1:16:15

world affairs. Yeah, that is. With sparkles

1:16:17

around world affairs. Incredible. I appreciate it.

1:16:19

I absolutely agree. 10 out of 10,

1:16:22

no notes. Fairies booking. Fairies parking. Yeah,

1:16:24

torturing men with fairies and smut as

1:16:26

retribution for world affairs. And finally. No

1:16:28

notes. And finally. Thank

1:16:30

you, Chell, for making it to the

1:16:32

end of the episode. We did it.

1:16:34

We got to come up with a

1:16:37

word. We do need to come up

1:16:39

with a phrase. What did we talk

1:16:41

about? Death of Tiktok. Death of Tiktok,

1:16:43

Dead Internet Theory. Oh, I got it.

1:16:45

I got it. I got it. Everybody

1:16:47

get in the comments, because this is

1:16:49

the word I want you guys all

1:16:52

to push from here on now. Okay.

1:16:54

Okay. These podcast bros. Joe Rogan.

1:16:58

Right? Theo Von. All of them were at

1:17:00

the inauguration. All of them openly, openly

1:17:02

pulled for this presidency. And that man

1:17:04

is now the president. You know what

1:17:06

that means? That means that Joe Rogan

1:17:09

is a Fed. That means that Theo Von

1:17:11

is a paid shill. That means that

1:17:13

Logan Paul is a psyop. So let's

1:17:15

start at the top and work backwards

1:17:17

from there. Ladies and gentlemen, please write

1:17:19

in the comment, Joe Rogan is a

1:17:21

Fed. Thank

1:17:24

you guys so much for watching this

1:17:26

episode. Oh my god, like, subscribe, comment.

1:17:28

Give us stars across all the apps

1:17:30

during our content flood this week. This

1:17:33

is our content flood. We're trying to

1:17:35

hit you guys with a lot Get

1:17:37

in the ark, baby. There's gonna be

1:17:39

a flood to get the animals and

1:17:41

with that why did they put the

1:17:43

roaches on the boat? Why did they

1:17:46

save the roaches? Why did they save

1:17:48

them? It's a fair question. Mrs. P, why

1:17:50

did they save the roaches? Too many

1:17:52

frauds and too many scammers that we

1:17:54

wish weren't real. Too many crimes.

1:17:56

and too many spammers,

1:18:01

and we're starting

1:18:03

to feel like we've got too many tabs.
