Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
Use Ctrl + F to search
0:00
Coming up, Debbie and I here
0:02
for our Friday roundup, Bill Gates says
0:04
AI will replace doctors, teachers within
0:06
10 years and claims humans
0:08
won't be needed for most
0:10
things. We're going to discuss
0:12
the impact, the potential impact
0:15
of AI, and how it's
0:17
transforming our society. We're also
0:19
going to talk about how
0:21
NPR can be defunded, how
0:23
Disney movies lost the plot,
0:25
and how Tren de Aragua, the
0:27
left's favorite new victims, is
0:29
giving the Venezuelan people a
0:32
bad name. If you're watching
0:34
on X or Rumble or
0:36
YouTube, listening on
0:39
Apple or Spotify,
0:41
please subscribe to
0:44
my channel. Hit
0:47
the subscribe, follow,
0:50
and the notifications
0:53
button. This is
0:55
the Dinesh D'Souza
0:57
podcast. We need
0:59
a brave voice
1:01
of reason, understanding
1:03
and truth. This
1:05
is the Dinesh D'Souza
1:08
podcast. Debbie and I
1:10
here for our Friday
1:13
roundup and honey it
1:15
happens to coincide with
1:17
us now in our tenth
1:20
year of marriage. In fact
1:22
we did our... 9th anniversary
1:24
getaway. Although you reminded
1:26
me this was really
1:28
our 10th getaway because
1:30
we had our honeymoon.
1:32
So when you count starting
1:34
there... I said it was
1:36
our 10th honeymoon. And we
1:39
went to a new destination
1:41
which was in the Caribbean
1:43
Grand Cayman and we loved it.
1:45
Yeah, oh my gosh. I loved it.
1:47
I loved it. You're a tropical girl
1:50
You like Hawaii. We have
1:52
a favorite place we go to in
1:54
Mexico. But I think in some ways
1:56
Grand Cayman took the cake, as
1:59
we say. And it was, I think,
2:01
not one thing, but it was really
2:03
a complex of factors. So, number one,
2:06
you know, if you're going from Texas
2:08
to Hawaii, it's like nine hours, because
2:10
you got to go to California, you
2:12
fly another... it's probably more like
2:15
11 hours. Okay, so it's like going
2:17
to Europe. Yeah, it is. Now, our
2:19
Mexico spot is about two hours away,
2:21
but a 45-minute drive when you get there,
2:23
and it's Mexico, so it's a
2:26
little bit more complex to navigate
2:28
your way. But Grand
2:30
Cayman is a British island. I
2:32
think one of the things you liked
2:34
about it is you like that very
2:37
British formality, right? Everything is kind of
2:39
done properly. And it's funny because you
2:41
see, you know, I don't mean this
2:44
in a bad way, but I mean,
2:46
you have all these Caribbean islands, right?
2:48
I mean, you have all these Caribbean
2:50
guys home on, you've got an amazing
2:53
ton and so on. But what I'm
2:55
getting at is these British guys. They
2:57
all speak with an English accent.
3:00
So it's almost
3:02
like a transplanted part of London,
3:04
a little
3:06
bit. So it's the proximity. Yeah,
3:08
the price the price. Oh, well
3:10
the proximity from the airport to the
3:13
hotel was great. It was like 10
3:15
minutes 15 minutes Max and then just
3:17
the cleanliness of the city because I've
3:20
been to Caribbean islands and I can
3:22
tell you most of them are kind
3:24
of dirty you know, the water's not
3:27
exactly clean. Well, the islands are, you
3:29
know, they're close to each other. So
3:31
people are motorboating here to there
3:33
and they're throwing stuff in the
3:36
water. Whereas Grand Cayman is kind
3:38
of... by itself. It's got that
3:40
Seven Mile Beach, which is
3:42
fantastic. And it's also a
3:44
little unusual. Both of us,
3:46
we've got in the ocean more than
3:49
once. Well, I swore that I would
3:51
never get in the water in the
3:53
ocean ever again because of all
3:56
the shark, you know, attacks that
3:58
happened. Yeah. I was like. And then
4:00
of course Julian is like, make sure
4:02
that you don't step, that you kind
4:05
of shuffle because of the sting rays.
4:07
And I was like, oh my gosh.
4:09
But it just so happens that the
4:12
water is so clear. It's like a
4:14
swimming pool. You can see right through
4:16
to your feet. Very, very clear, very
4:19
clean, beautiful aquamarine color. Well, the
4:21
other thing I have to add, because
4:23
the two of us being foodies, is
4:26
really good food. Yes. And in fact,
4:28
we should recommend there is a famous
4:30
chef, Eric Ripert, and he's
4:32
got a Michelin-starred restaurant in
4:35
New York called Le Bernardin. It's
4:37
a seafood restaurant. But he
4:39
has a restaurant called Blue
4:41
in Grand Cayman and that
4:43
was kind of a very fitting
4:45
name because the water is blue.
4:48
That's probably where you
4:50
got the idea and that was kind
4:52
of the memorable last day that
4:54
we were there and we had
4:56
a great time. Yes and I have
4:58
to say I hadn't had sugar in
5:01
over two years. Well, when you get
5:03
a famous French chef doing dessert
5:05
it's a little hard to
5:07
say no. Yeah but the only
5:09
thing about that dessert was it
5:12
was very small. The portion was
5:14
quite small. And I couldn't say
5:16
no. I usually say no but...
5:19
I just couldn't. I was a
5:21
little surprised you didn't say, because
5:23
you have such iron will on these
5:25
things. But I was like the heck
5:28
with it. But notice, I mean,
5:30
when you go to France, you
5:32
just don't see a lot of
5:34
300 pounders. In fact, you don't
5:36
see any 200 pounders either, because
5:38
the French eat very rich food,
5:40
but they eat in small quantities,
5:42
and they walk a lot. We
5:44
walked a lot. Really before the
5:46
sun went down. Although it's so
5:48
warm there that I mean, I
5:50
think I got a couple of
5:52
shades of darker in just four
5:54
days. Yeah. And, but we've, it
5:56
definitely was a very memorable
5:58
and fun trip. Yeah, I
6:00
think so. It was one of
6:02
those trips that I just, I
6:05
want to do it again. I
6:07
want to do it again. And
6:09
I love, I love the tropics,
6:11
I love the weather. You know,
6:13
when we went to Punta Mita
6:16
in January, it was a little
6:18
chilly, even for me. It was
6:20
in the low 70s, but the
6:22
wind was very, very chilly
6:24
and, you know, a little
6:27
downright cold, I thought. Yeah, I think most
6:29
people have a kind of a sweet
6:31
spot between about 68 and 72 if
6:33
you ask people what's the ideal
6:35
weather yeah basically say San Diego
6:37
weather but I remember when you would
6:40
visit me in San Diego when we
6:42
were courting, you would say,
6:44
it's I always hear you
6:46
say it's a little chilly.
6:49
It's too cold here. I
6:51
need to get my sweater
6:54
I need to get my
6:56
jacket. I always wear
6:58
a sweater in San
7:00
Diego. I never...
7:02
I didn't go around with a
7:04
sweater. I always wear
7:06
a sweater. Yeah, so your
7:08
sweet spot is 78 to
7:10
about 95. Before 95, like
7:13
90 to 94, I'm still
7:15
okay. I'm not even sweating
7:17
at that point. And so
7:19
I actually like it. Now
7:21
the only problem is when
7:23
it gets that hot here,
7:25
where we live, is that
7:27
it usually brings on a
7:29
lot of rain. It does. It
7:32
produces a lot of rain and
7:34
it's very humid. And so it's
7:36
not so nice. But pure sunshine
7:38
at 94 and I'm fine. Yeah,
7:41
no, absolutely. Let's talk about, you
7:43
know, we often zoom into the
7:45
political topics of the day and
7:47
we'll do some of that today,
7:49
but we also wanted to talk
7:51
about a broader topic and that's
7:53
AI, artificial intelligence and and our
7:56
trigger for doing it is this
7:58
article in the New York Post.
8:00
There's a kind of a startling
8:02
headline. Bill Gates says AI
8:05
will replace doctors, teachers
8:07
within 10 years and claims
8:09
humans won't be needed quote
8:11
for most things. And I
8:13
said, oh, that's Bill Gates dream
8:15
come true. He's been wanting to get
8:17
rid of people for a long time.
8:19
So you think this is like, this
8:22
is consistent with his
8:24
anti-human eugenics? Yeah. Let's
8:26
reduce the population of
8:28
the world. I mean, this is, it
8:30
is admittedly the unifying theme
8:32
between Warren Buffett and Bill Gates.
8:35
I link these two guys, and
8:37
they're quite different in
8:39
fact they're not even from the
8:41
same generation and yet they converge
8:43
on the issue that the earth has too many
8:45
people now I think it is
8:48
interesting that against this we have
8:50
Elon Musk who is constantly saying
8:52
we need to populate. And
8:55
he's participating in a
8:57
big way. He is actively,
8:59
he is actively, he is
9:01
actively trying to repopulate
9:04
the earth, it seems.
9:06
Single-handedly. That's right. Spreading his
9:09
genes at a level not
9:11
seen since Genghis Khan. Right?
9:14
All right. So let's talk
9:16
about gates here, because even
9:18
though I agree that this is
9:20
a strange... Man, I don't think
9:22
that he's wrong here. And what I
9:25
mean by that is, is I've only
9:27
gotten a glimpse of this
9:29
new AI. And by the way,
9:32
what I like on X is
9:34
they've got an artificial intelligence called
9:36
Grok, G-R-O-K. You can download
9:39
it, and you can start using
9:41
it. And in fact, I see
9:43
a lot of people on
9:46
X making the comment, we don't
9:48
have to use Google anymore. Think
9:50
how great this is. We definitely
9:52
want to, until now we've had
9:54
no alternative to Google. Let's Google
9:56
it. So when you're searching for
9:58
information, it's Google. But Grok
10:01
is better, because not only does
10:03
it do the search that Google
10:05
does, but it is almost like
10:07
handing over the information to a
10:09
highly intelligent research assistant that will
10:11
analyze it for you. They don't
10:13
just say, hey, here's a website,
10:16
or here's ten websites all ranked in
10:18
the order that we choose, which is
10:20
Google; rather, Grok will give you
10:22
almost a little bit of a
10:24
legal summary, a position paper on what
10:26
you said. You can put a book
10:28
proposal in there and get a critique
10:31
of it. So this is AI, and I
10:33
think yeah so I think AI is
10:35
is something very big it's it's
10:37
almost like in earlier decades
10:39
we had an age of
10:41
automation and that hasn't stopped
10:44
right we still see now
10:46
the new generation of automation
10:48
are these kind of life-size
10:50
robots. And these robots will,
10:52
they will cut crops. I
10:54
saw a short video of
10:56
a robot inside a BMW
10:59
plant and it was doing
11:01
car assembly. Right?
11:03
So you've got automation
11:05
continues, but automation threatens
11:07
the people who work.
11:09
How would you feel
11:11
about automation, AI automation
11:14
on airplanes, airplane parts?
11:16
Would that make
11:18
you feel more confident
11:20
than DEI is
11:22
making us feel? Although, that
11:24
is to say, I would say
11:27
yes, yes, because think of it
11:29
this way: what are you more
11:31
confident in right now, a pilot,
11:34
and certainly with DEI,
11:36
a DEI pilot landing
11:38
a plane, or the autopilot
11:40
doing it? Well, isn't
11:43
the autopilot the same kind
11:45
of thing? The autopilot is the
11:47
same thing as... Oh, a DEI pilot.
11:49
Oh, I'm sorry. A DEI... Let's
11:51
see, what did I- What was
11:53
on the, what was on the
11:55
flight test the other day? I
11:57
didn't remember it all that
11:59
well, you know. versus yeah
12:01
a mechanized robot computer
12:03
system which lands planes all the
12:06
time by the way and as far
12:08
as I know I mean think of
12:10
it this way how many shows have
12:12
we watched where something goes wrong by
12:15
pilot error a lot a lot a
12:17
lot and have you ever seen the
12:19
only time I've seen something involving the
12:22
autopilot error is when someone accidentally disengages
12:24
the autopilot right or the plane is
12:26
given a bad instruction and then follows
12:28
that instruction to the letter but the
12:31
instruction was bad. So the human input
12:33
even there is what caused the problem.
12:35
Now, I think what will startle
12:37
many people is the fact that
12:40
in a sense what we're getting
12:42
is automation in the mental sphere.
12:44
Because all the people who are
12:46
lawyers, doctors, all thought their jobs
12:49
were safe. Because they thought, listen.
12:51
Yeah, you can have a tractor, you
12:53
know, plow a field, but nobody
12:55
can replace a doctor. I still
12:57
don't think that you can have
12:59
an AI do an operation. I
13:01
still think that that requires a
13:03
human being and it always will.
13:06
It absolutely does not. How? I'm
13:08
not saying it doesn't require
13:10
any human eyes on it, but
13:12
a lot of times the human is
13:14
going to be, I mean, think of
13:16
it. In the operating room. Opening up
13:18
a person. Opening up a person.
13:20
Going in there and putting in whatever you
13:22
know stitching up the heart or well, maybe
13:24
so I'm not saying there's no human
13:26
component But I'm saying even now
13:29
think of it I had eye
13:31
surgery a few years ago. It
13:33
was laser surgery. Yes, but you
13:35
had a doctor doing it. I
13:37
had a doctor doing it, but
13:39
wouldn't it be funny if
13:41
a robot comes out? Hello,
13:43
Dinesh! Right, it wasn't as
13:46
accurate. Right, so AI, it's
13:48
not a case where you swap
13:50
one for the other, right? Even
13:52
when you have a car that's
13:55
largely self-driving, we still sit in
13:57
the driver's seat. An Uber that showed
13:59
up with no live person in it. In
14:01
fact, you don't even have a robot. The
14:03
car is self-driving. You just get in the
14:05
back seat and take you where you want
14:08
to go. Now, that's coming. Does that mean
14:10
we won't have to tip? It's
14:12
kind of strange to tip
14:14
the robot or my robot will
14:16
be sending... would be tipping your
14:18
robot. No, because I've seen
14:21
people talking about a financial
14:23
world in which transactions are
14:25
done device to device. Oh
14:27
my goodness. So you can
14:29
actually have a device have
14:31
an account because it's sending money
14:34
digitally. So my robot can send
14:36
money to your robot. The whole
14:38
thing is kind of dizzying in
14:41
many ways. Now the question is,
14:43
is it bad? I mean, I think this is
14:45
really, it certainly gives you anxiety,
14:47
but that anxiety is, I think,
14:49
the normal reaction to something completely
14:51
new. I'm sure people had this
14:53
anxiety, in fact, I know they
14:55
did, when cars showed up on
14:58
the road. I'm sure that farmers
15:00
had this anxiety when you had
15:02
tractors. Right, or at least, right,
15:04
the kind of mechanized agriculture that
15:06
we have now. And so, and
15:08
yet, and here's where Bill Gates
15:10
is going, and this is the
15:12
part I think that's worth thinking
15:14
about. What happens if most, if
15:16
not all, not all, but most
15:18
of the jobs kind of go away?
15:21
Well, as we, you know, we discussed
15:23
this actually, and it would be a
15:25
world without having to work. And what
15:27
kind of world would that, I know,
15:30
I know there are some people that
15:32
would love that kind of world. Well,
15:34
you too. Genzies. Well, but I mean,
15:37
you joke, you say that I joke
15:39
about Genzies. I have a daughter that's
15:41
a Genzi. But you joke also about
15:43
being. of the philosophy that every day is
15:46
Friday. Yes. So in other words, women every
15:48
day is Friday point of view. Let's think
15:50
about that. But I don't act that way.
15:52
I just pretend that that is happening. You
15:54
don't act that way, but I work very
15:57
hard. Let's think it through because let's
15:59
ask this question. Isn't it a fact
16:01
that from the dawn of mankind,
16:03
most people who have worked have
16:05
done it out of necessity? In
16:07
other words, they didn't decide, work
16:10
is really fun. I'm going to
16:12
get up and I'm going to... Except
16:14
if you're Donald Trump. Except if
16:16
you're Trump, look, except if you're
16:18
Murdoch, except if you're Warren Buffett,
16:21
you say this is true of
16:23
me. Yeah, yeah. But, but there...
16:25
Look, why would a guy like
16:27
Murdoch who is like 88... It's
16:29
got to be that he enjoys
16:32
it. It's got to be that
16:34
he prefers doing it to anything
16:36
else. So there's a small number
16:38
of jobs, I admit mine
16:41
included, that offer a lot of
16:43
inherent satisfaction. But that's
16:45
not true of most jobs.
16:47
I think it's fair to
16:49
say. You walk into a
16:52
store and you have someone
16:54
behind the counter. Do they, are
16:56
they really enjoying being there? They might
16:58
enjoy, you know, chatting with people, but
17:01
they're, they're out of necessity. And the
17:03
proof of it is stop paying them
17:05
and see if they show up. They
17:08
won't show up. So it has to
17:10
be that if we can create a
17:12
system that produces a certain
17:14
degree of abundance, and I
17:17
agree that you're looking at problems of
17:19
boredom, people will have to think
17:21
about what am I going to
17:23
do with my day. But I'm
17:25
just saying I'm rejecting the idea
17:27
that this is automatically
17:29
a bad thing. I'm simply
17:31
saying that human interests, human
17:33
needs, human desires, human wants
17:36
will migrate to something else.
17:38
So do you think college will
17:40
become null and void? I mean
17:43
will people stop needing to go
17:45
to college because there will be
17:47
no career to have after college
17:50
and therefore college is no
17:52
longer needed? I think
17:54
that would be actually
17:56
in many ways a
17:58
good thing. Now, that said,
18:01
what about learning? Because,
18:03
right, those two things
18:05
should never be confused. Because
18:07
let's say, for example,
18:10
that AI
18:12
is producing the best teachers
18:14
in the world right making
18:16
knowledge widely available at no cost
18:18
to anyone who wants it from everything
18:21
from you want to learn how to
18:23
play the guitar, or you want
18:25
to learn about Shakespeare you know normally if
18:27
you want to learn about Shakespeare you enroll
18:30
in school and you get some guy who
18:32
barely knows about Shakespeare himself teaching
18:34
you, right? And that's
18:36
the best you can do and
18:38
that's the best we had. But
18:41
now imagine that you get somebody
18:43
who is world class teaching you
18:45
that not only knows the material
18:47
like nobody else, but has the
18:50
gift of making it entertaining, can
18:52
provide all kinds of visuals. It's
18:54
almost like watching a Shakespeare play.
18:56
Yes, that all sounds fine, but
18:58
what a, but obviously it's going to
19:01
cost you, right, to go to
19:03
this university. No. What I'm saying is
19:05
that in a world of AI, the
19:07
cost of knowledge goes down to
19:10
zero. It goes down to
19:12
zero because think about it.
19:14
The cost of digital information,
19:16
of adding another bit of
19:18
digital information, is almost zero.
19:20
Let's say Google has a
19:23
billion people on Google, right?
19:25
And then over the next
19:27
year it becomes two billion.
19:29
What is the added cost to Google?
19:32
Zero. Because? They've already got
19:34
Google, they've got the search engine, if
19:36
more and more people use the search
19:39
engine, what is the added cost of
19:41
adding one guy? There's no cost. So
19:43
what I'm saying is that when we
19:45
see this movement toward intelligence,
19:47
which intelligence has always been aristocratic, by
19:50
that I mean, it's always been a
19:52
few people who have it. Typically in
19:54
a class, think of it. If you
19:56
grade people, you're separating out the A's
19:58
who are just a few people, from the
20:01
B's, who are the majority, and
20:03
then of course you got the
20:05
people at the bottom end
20:07
but what if this knowledge
20:10
becomes widely available through artificial
20:12
intelligence now you know we
20:14
are getting dumber as a
20:16
society. Yes. And so
20:18
there are going to be
20:20
those people that are going to
20:22
say oh you mean to tell me
20:24
that I don't have to go to
20:27
college because I don't have to get
20:29
a degree? Are those people going to then
20:31
go, I don't really want to learn,
20:33
I don't really care to learn, I
20:36
don't really want to go to a
20:38
class, online or otherwise, and so we're
20:40
going to have even more dumb people
20:42
on the planet? I mean, I'm just
20:45
saying... That's very possible. Well,
20:47
what would happen is that,
20:49
is that, see, right now we have
20:51
a tradeoff, right? Most people spend
20:53
most of their day, in fact
20:55
most of their life, either preparing
20:58
for... or working. And they work in
21:00
order to buy leisure. This goes right back
21:02
to Adam Smith, the Wealth of Nations. Very
21:04
simple, right? We work so that we get
21:07
the weekend and then people get time to
21:09
do what they want to do. Now, do
21:11
what they want to do doesn't mean that
21:13
they do nothing because they go for a
21:16
drive on Saturday and then they do
21:18
some chores around the house and then
21:20
they go to church on Sunday. So
21:22
they're doing things, they're just not working.
21:24
Now imagine that they didn't have to
21:26
do... Imagine that the Friday really
21:29
is Monday. So in other words,
21:31
every day is Friday, right? The
21:33
question then becomes, people will now be
21:35
able to do a lot more of
21:37
what they want to do. But there
21:39
is no guarantee that they
21:41
will want to do impressive things,
21:44
right? I mean, you might have all kinds
21:46
of... Can you imagine? Oh, I'm just
21:48
going to watch TV all day.
21:50
I'm going to sit on the
21:52
couch, put my legs up, and
21:55
watch all the talk shows, all
21:57
the soap operas, everything, you know,
21:59
everything else, and just become
22:02
a slug. Threat
22:04
of wars, recession fears, stubborn inflation:
22:06
in this environment, gold has been
22:08
routinely hitting all-time highs. In volatile
22:10
markets, like right now, don't sit
22:12
on the sidelines with your head
22:15
in the sand. Take control, safeguard
22:17
your savings, and that's why so
22:19
many Americans today are turning to
22:21
birch gold group like Debbie and
22:23
I have. They've helped tens of
22:25
thousands convert an existing IRA or
22:27
401(k) into an IRA in physical
22:29
gold. Isn't it time for you
22:32
to find out if you can
22:34
hedge against inflation and economic instability
22:36
with gold to learn how to
22:38
own physical gold in a tax
22:40
sheltered account. Text Dinesh to
22:43
989898. They will send you a
22:45
free, no-obligation information kit. Again,
22:47
text my name, Dinesh, to the
22:49
number 989898. Birch Gold has an
22:51
A-plus rating with the Better Business
22:54
Bureau, countless five-star reviews. I count
22:56
on Birch Gold to help me
22:58
protect my savings with gold and
23:00
you can too. Text Dinesh to
23:03
989898 today. President Trump's 2024
23:05
victory wasn't just an election, it was
23:07
a movement. The American people stood up,
23:09
fought back, and reclaimed their country. But
23:12
the fight isn't just about politics. It's
23:14
about culture. For too long, Hollywood has
23:16
dictated what we watch, what we think,
23:18
and what we value. But now a
23:21
revolution is underway. A grassroots film studio,
23:23
powered by We The People, is taking
23:25
back our culture. From sound of freedom,
23:28
a courageous film that exposed an evil
23:30
the elites would rather keep hidden, to
23:32
movies that honor faith, family, and country
23:35
like Homestead. Angel Studios is redefining
23:37
what storytelling should be. And here's
23:39
the best part. You get to
23:41
be a part of it. By
23:43
joining the Angel Guild, you get
23:45
a say in what stories get
23:47
made. You're not just watching, you're
23:49
shaping a culture that reflects your
23:51
values. Hollywood won't fix itself. The
23:53
media won't change, but we the
23:55
people can build something better. Angel
23:57
Studios is proof of that. Go
23:59
to angel.com/Dinesh today,
24:01
join the movement, support the
24:03
films that matter, let's take
24:06
our culture back. The people who
24:08
do that now might be saying,
24:10
and this is not unreasonable, they
24:12
could say, listen, I knock myself
24:14
out, you know, six days a
24:16
week or five days a week, I'm
24:19
exhausted on the weekend, I don't want
24:21
to be bothered, I just want to
24:23
sit on the couch, eat pizza, watch
24:25
football, right? My question is... Would that
24:28
person do that if it was, if
24:30
they didn't have to do that job?
24:32
If they didn't, if they weren't driving
24:34
a truck all week and then go,
24:37
oh, I'm pooped, I need to kick
24:39
back. So they wouldn't be tired,
24:41
they wouldn't want to, so unwinding
24:43
would take a whole new meaning.
24:45
Yeah, you could actually do things
24:47
like, for example, you could, you
24:50
could travel, you wouldn't be restricted,
24:52
you could travel for weeks on end
24:55
and learn. You're starting to sound
24:57
like AOC. Well, I mean, coming
24:59
back to the, you know, Marx's dream.
25:01
Yeah, exactly. That's what I'm getting at.
25:03
Marx's dream was that you should be
25:05
able to work in the... Now, Marx didn't
25:08
think that there would be a society
25:10
where work would be abolished.
25:12
He never did. Yeah. He
25:14
thought you would work in
25:16
the morning like fish in
25:18
the afternoon, or, fine, go
25:20
out to dinner. So I
25:22
think what Marx was talking
25:24
about is taking the abundance
25:26
that he thought was stolen
25:28
by the capitalists and give
25:30
the worker a better life.
25:33
But what I'm saying is
25:35
that free market capitalism and
25:37
modern technology are delivering
25:39
one better on Marx, because even Marx
25:41
never thought we could get here. Marx always
25:43
thought there's going to be dirty work
25:45
to be done and What we're saying
25:47
is that a lot of the dirty
25:49
work because all the fun work you
25:52
can still do no one is saying
25:54
that you have to use AI,
25:56
right so for example Let's say for
25:58
instance you can learn something by just
26:00
getting an AI report. Or you
26:03
can learn something by reading a
26:05
book. You might decide, listen, I
26:07
don't just want someone to recite
26:09
to me the ingredients of Western
26:12
philosophy. I actually want to read
26:14
Plato's Republic. I want to spend
26:16
time inside that book and think
26:18
it through for myself. And then
26:20
I might get into a conversation
26:22
with AI about it. Which, think about
26:25
it: right now you read a
26:27
book, or I read a book,
26:29
and you're hard-pressed to find
26:31
somebody else who's read the same
26:33
book and can have a conversation
26:35
with you about it Whereas with AI
26:37
you now have an interlocutor
26:39
so you won't need
26:42
a book club, you'll just have an
26:44
AI club. Yeah, and you know, I
26:46
don't know I'm having a hard time
26:48
with this because I do believe that
26:50
a world with internet and social media
26:53
has become less
26:55
tolerant and more
26:57
divisive and I feel
26:59
like this will take it
27:01
to a whole new level
27:04
when you don't really have
27:06
to interact with people
27:08
anymore about anything
27:11
really. Well what you say
27:13
is true because you
27:15
know, contrast two people
27:17
who disagree
27:20
strongly. At least under
27:23
normal circumstances. They're going
27:25
to somewhat moderate the
27:28
extremism of their views.
27:31
They're going to say things
27:33
like, you may not agree with
27:35
me on this, but, or let
27:37
me be honest and say this.
27:40
Whereas if you're posting on
27:42
social media, you don't need
27:44
any of that, those niceties,
27:46
right? You don't need any
27:49
preface, you just unload. Yeah,
27:51
sometimes even with people, there's
27:53
no such thing as niceties.
27:55
Right. But yeah, I don't,
27:58
I'm just not sure that
28:00
that world, exciting though
28:02
it may be, is good for a
28:04
person, is good for the soul.
28:07
I just don't know about that
28:09
I don't think it is ideal
28:11
either but let me say this
28:14
that you know when I look
28:16
at the harshness that you
28:18
do see expressed let's just
28:21
say on the X platform
28:23
right now it's harsh but
28:25
the old X was more sanitized
28:28
but it was heavily
28:30
censored. Right? And so
28:33
the- You're talking about
28:35
the X between 2020
28:37
and 2024. I'm talking
28:39
about Elon Musk's X
28:42
versus the old regime.
28:44
Because I do remember
28:46
a time on Twitter. I
28:48
first started on Twitter in
28:51
2009. So between 2009
28:53
and I would say 2015-ish. I
28:55
was able to express my views
28:57
and say whatever I want and
28:59
in fact I told you that
29:01
I was under a little bit
29:03
of an incognito mode and I
29:05
went crazy. Yeah. Really crazy. And
29:07
I wasn't censored at all. My
29:10
tweets came out. I had lots
29:12
of views. I had lots of
29:14
likes. So you're saying there really
29:16
are three phases of Twitter. There
29:18
was the old Wild West Twitter.
29:21
Then there was the censored Trump
29:23
Twitter. When Trump came on, they
29:25
were like, ooh, Trump supporter. I
29:27
don't think so, you know, plus
29:29
Trump, right? They did censor even
29:32
Trump in that time period. And
29:34
then there was the time when
29:36
I told you I lost about
29:38
30,000 followers on X the
29:40
day after January 6th, so January
29:43
7th, 2021, I had 30,000 less,
29:45
and I have the same amount
29:47
today. Which is a good point
29:49
because we think of X
29:51
as a free speech platform, and
29:54
it has been for me. My
29:56
X has grown enormously, but yours
29:58
has been frozen. And so what
30:01
you're saying, and others as well,
30:03
that some of the censorship algorithms
30:05
still seem to be in place.
30:07
They are. I don't know if
30:09
it's the case that the new
30:11
Elon Musk regime has not uncovered
30:13
them all or what's going on,
30:15
but that is definitely the case.
30:17
Let's talk about, let's change topics
30:19
here a little bit. Well, one
30:21
final thought on the AI front
30:24
before we pass on is this. I
30:26
think with some of these things, the
30:28
power of the new technology is such
30:30
that whether you think it's good or
30:33
not, it's hard to see it
30:35
being resisted. Think of the internet.
30:37
If we could sit before that
30:39
and go, well, I don't think
30:41
the internet's good for society. Well,
30:43
whether you do or not. It
30:46
happened. It's going to
30:48
happen. Right. I don't think
30:50
it's good for people to
30:53
have these phones that they
30:55
can walk around with and
30:57
because they double as a
30:59
camera. Well, the truth of
31:01
it is like the woman
31:03
on the floaty
31:05
in the ocean. She was
31:07
on a floaty on the
31:09
phone. I thought, this is too
31:11
much. Even, you know... But it shows that
31:13
see for her, she couldn't be
31:16
without her phone, even on a
31:18
plastic floating device. Yes. Floating
31:20
on the ocean. Yeah, and remember when
31:22
we went for a walk, we saw
31:25
a couple walking and he was on
31:27
a business meeting and the wife was
31:29
rolling her eyes like really... It was
31:31
such a sight, right? Oh. And
31:34
then you noticed him, he was
31:36
basically talking into the phone and
31:38
saying, I'm just checking in. Yeah.
31:40
Is this really necessary to? And
31:42
the answer is, yeah. Now talk about a
31:44
guy for whom work... He does not unplug.
31:46
And probably enjoys his work. It's
31:48
not that he enjoys the work.
31:51
He probably enjoys the status. He
31:53
enjoys the sense of kind of
31:55
combat and competition. Yeah. He's probably
31:57
the boss. And so he thinks.
32:00
it's great to be barking
32:02
out orders to other people. All
32:04
right, let's talk about, I don't
32:06
know, you want to talk about
32:09
NPR? Yeah, let's do it.
32:11
NPR. We've got so many
32:13
comments on Brandon Gill's cross-examination
32:15
of Katherine Maher, the head
32:18
of NPR, and I think
32:20
for the first time there
32:22
is a chance that NPR
32:25
will be defunded. This idea
32:27
has been around for... And
32:29
the reason is that NPR is
32:31
very clever at maintaining good relations
32:34
with moderate Republicans. And so when
32:36
it comes time to defund them,
32:38
the moderate Republicans will say, my wife
30:40
listens to NPR every day,
32:43
so, or my kids need Sesame
32:45
Street. Oh, but they've got, they've
32:47
got enough going for them. Kind
32:49
of like Disney. It's difficult
32:52
to mobilize people who are
32:54
not political against Disney, because
32:56
they just had this idea,
32:58
well, You know, Cinderella, Snow
33:00
White, really? You know, so
33:02
Disney plays on that and
33:05
so they can push all
33:07
kinds of promotions. Well, and
33:09
you know what? Like, growing
33:11
up, when I came to
33:13
this country and I was
33:15
trying to learn English, I
33:18
watched a good amount
33:20
of PBS. They had some
33:22
children's shows that were kind of
33:24
funny, but... PBS also is a
33:27
good mechanism for indoctrination. Oh yeah.
33:29
And so, look, they can be
33:31
around. It's not that we don't
33:33
want them to be around. They
33:36
can exist, but. It's the taxpayer
33:38
money going toward something that is
33:40
indoctrinating on one side. I mean
33:43
it's very simple. Actually Bill Maher
33:45
of all people made this point.
33:47
Yeah. You know if you have
33:49
a unified society, somewhat like Finland
33:52
where the political differences are very
33:54
muted, you can have a public
33:56
radio or public television that kind
33:58
of reflects the consensus of
34:00
the society. But he goes, how
34:02
can you have a society in
34:05
which power is passing between
34:07
Republicans and Democrats? And you've
34:10
got a publicly funded institution
34:13
that is intrinsically not 60-40, but 99
34:15
to 1 or 100 to zero.
34:17
And far left. Far left. Far,
34:19
far left. Yeah, this is intolerable.
34:22
This cannot stand. And so. I think
34:24
it is imperative. Look, I'm glad
34:26
we've taken over, Trump has taken
34:28
over the Kennedy Center, but NPR
34:30
and PBS have been a thorn
34:32
in our side for decades and
34:34
just spiking those balls. And Brandon, the
34:37
questions he was asking, like, you know,
34:39
do you really, and he was putting
34:41
it back on her, you know, do
34:43
you really think this is a good,
34:45
you know, a good spend for our
34:47
tax dollars? Do you really think we
34:49
need to spend this and that on
34:52
our tax dollars? And, you know, she
34:54
was obviously her views, like, you know,
34:56
she said something about being
34:58
embarrassed that she was white.
35:00
Well, her views are reflected
35:02
on NPR. They are. Well,
35:04
I mean, that's how she became
35:07
the CEO. Right. She is in tune
35:09
with she sings from that sheet music.
35:11
That's right. Now she tried to mute
35:13
it in the conversation with Brandon. I
35:15
think the good thing that Brandon did
35:17
was, and this is different than a
35:20
lot of Republicans, typically a Republican says,
35:22
I've got five minutes a time, right?
35:24
And so I'm going to give a
35:26
thunderous speech. Right. And so I ask
35:28
a question, but just when the person
35:30
starts to answer, I interrupt, I say,
35:32
I take back my time, and I
35:35
just keep talking for five minutes. Right.
35:37
Brandon was more of a sort of surgical...
35:39
Well it was it was a
35:41
little bit like being a
35:43
prosecutor. Right and cross-examining a
35:45
witness. Cross-examination. In which very
35:47
often the most important thing I
35:49
find this to be true when I
35:51
do debates is to listen to what
35:53
the other person is saying and and
35:55
that's so rare if you're in the
35:58
media, even lawyers, they don't listen. just
36:00
go to the next question. Whereas what Brandon
36:02
does is, like she would say something,
36:04
this is my favorite, but Brandon goes,
36:06
he quotes her in favor of reparations,
36:08
and then she to throw him off
36:10
goes, well I don't support fiscal reparations.
36:13
And normally, you know, a clumsy
36:15
interlocutor would have been like, let's
36:17
talk now about something else. But Brandon's
36:19
like, well, what kind of reparations do
36:21
you support? What other kind are there?
36:24
Do you know, do specify? And then
36:26
she was not prepared. She was not prepared
36:28
for that. So then she was like, well,
36:30
I just think we should be honoring our
36:33
ancestors, which was so...
36:35
transparently duplicitous and hokey
36:37
that I don't
36:39
think she looked good. She looked
36:42
bad. He looked
36:44
good. And I hope that this is
36:46
an impetus to really get that place
36:49
shut down. My
36:51
pillow is excited to announce they
36:53
are extending the mega sale on
36:55
overstock on clearance and also on
36:57
brand new products This is your
36:59
chance to grab some incredible deals
37:02
on some of my pillow's most
37:04
popular and newly released items for
37:06
example save $40 on the new
37:08
spring My pillow bed sheets available
37:11
in any size and any color
37:13
these luxurious sheets are designed for
37:15
maximum comfort and breathability They're perfect
37:17
for a great night's sleep. Looking
37:20
for a meaningful cross? Inspired by
37:22
the one Mike Lindell has worn
37:24
every day for over 20
37:26
years. These beautifully crafted crosses
37:28
come in both men's and
37:31
women's designs and are proudly
37:33
made in the USA. Get
37:35
the six-piece bath or kitchen
37:37
towel sets, just $39.98. Initial
37:39
quantities are low so act
37:41
now and don't forget the
37:44
best-selling standard MyPillow, just
37:46
$17.98. Plus, orders over $75
37:48
ship free. Or you can
37:50
go to MyPillow.com
37:52
and don't forget to
37:55
use the promo code.
37:57
It's D-I-N-E-S-H, Dinesh. Disney
38:00
is facing perhaps the
38:03
biggest bomb in its
38:05
history. Certainly... look, Disney is
38:07
like... my Snow White
38:09
outfit! I know, I'm looking
38:11
at this, I'm like, oh, I
38:14
think I'm trying to dress like
38:16
Snow White. A Snow White outfit,
38:18
yeah. Here's the thing:
38:20
Disney has introduced
38:22
some new types of
38:24
films and they haven't
38:27
done well true but
38:29
Disney became Disney based
38:31
on a handful of
38:33
major brands, right? Cinderella.
38:36
Snow White. Snow White. There are
38:38
a few others like that.
38:40
Hansel and Gretel. Hansel and
38:42
Gretel, the, what is it,
38:45
the Little Red Riding
38:47
Hood? I mean, there are
38:49
these classic Grimms' tales and
38:51
other tales. And to bomb
38:53
on Snow White is not the
38:55
same as bombing on, you know,
38:58
some... Someone that you haven't
39:00
heard of. Yeah, or even Mulan.
39:02
And Mulan was very successful. But
39:04
my point is if you bomb
39:06
on Mulan, it's not the same
39:08
as bombing on Cinderella or Snow White, especially
39:11
since those movies made Disney. That's right.
39:13
And I made the point earlier this
39:15
week where I said that, you know,
39:17
people are blaming Rachel Zegler.
39:20
She's an extremely annoying lead.
39:22
She's Snow White, and always,
39:24
in all her interviews, she's so
39:26
self-absorbed, she says things like, you
39:29
know, I had to put on
39:31
my snow-white dress and stand there
39:33
for hours. Everybody who streams this
39:35
movie needs to be paying me.
39:37
Think about it. Who would
39:40
say that? Who would
39:42
say that, right? So she's a
39:44
very objectionable individual, yes. But I
39:47
also think it is a
39:49
little unfair to blame her
39:51
exclusively for Snow White bombing. Why? The
39:53
script. Who wrote the plot? Yeah. Who decided
39:55
to do away with the dwarfs? Who decided
39:57
not to have really a normal
39:59
love story? So what are they, so
40:01
what is the story about? I mean, what
40:04
is it about? If it's not about the
40:06
dwarfs, it's not about the love story, then
40:08
what happens? It's about the whole
40:10
community coming together. It's like a
40:13
woke-ification rewrite of... look, I
40:15
mean, I'm not going to watch
40:17
this film, right? So I'm not
40:19
going to give you the detailed
40:21
blow-by-blow. The moment I saw that
40:23
they have taken the film... See,
40:26
they've taken everything in the film
40:28
that they would, that they see
40:30
in their own, um, demented way. As
40:32
a problem. Right. So Prince
40:34
and Princess, that's a problem.
40:36
Um, Princess eats apple and
40:39
collapses until kissed by Prince.
40:41
Major problem. Major problem. Uh,
40:43
dwarfs, you're getting into sensitive
40:45
territory. Um, the dwarves having
40:47
defined characteristics, Sleepy and so
40:49
on. And Dopey. Dopey. Oh,
40:51
you know. Can't go there.
40:53
So then what you get
40:55
is basically, you know, a
40:57
whole bunch of writers coming
41:00
in, you know, plugging in
41:02
their butt plugs and giving
41:04
us the woke... Shut up! I'm just
41:06
talking about the composition process
41:08
of giving us the woke version.
41:10
Oh, and I'm saying that's
41:12
what destroyed the movie. Rachel
41:14
Zegler, she's like Katherine Maher
41:16
at NPR in that they brought her
41:19
in because she was as woke as
41:21
the rest of them. Right, but now
41:23
they're acting like they are not
41:25
woke and she came in and destroyed
41:28
the movie I guess that's what I'm
41:30
objecting to There's there's a little
41:32
bit of blame to go around
41:34
right because first of all She's
41:36
woke we know she's woke the
41:38
studios to blame Yeah, brought her
41:41
in yes, and but she accepted
41:43
knowing that this was the case She
41:45
was hired like a you know, she was
41:47
hired to be woke I don't
41:49
know if she sings. That's really
41:51
not the issue. They're acting now
41:54
like because she's a Palestinian activist,
41:56
which no surprise she is. That's
41:58
part of the woke ideologies. So in
42:00
a way, Rachel Zegler could say to
42:02
that, what did you expect? You weren't
42:04
looking for somebody like Katharine
42:06
Hepburn. You wanted somebody like
42:08
me. You got it. And now you're
42:11
blaming me for destroying the movie.
42:13
Look, I'm not defending her. I'm just
42:15
saying there's a lot more blame to
42:17
go around. I hope that this...
42:20
this Disney model, I think,
42:22
is not going to work. Yeah well
42:24
now that we are you know have
42:26
grandchildren. I mean see for a while
42:29
like I know I didn't even know
42:31
what was out there with Disney because
42:33
our kids were grown. They weren't watching
42:36
Disney anymore. We didn't care. We couldn't
42:38
care less. Now it's starting to come full
42:40
circle and our kids our grandkids are
42:43
now going to start wanting to watch
42:45
Disney and wanting to you know
42:47
read those tales and all of those things
42:49
so so that's going to become important to
42:52
us again. So, I mean, I have to say,
42:54
full circle is the right way to
42:56
put it because when I came to
42:58
America at the age of 17 I
43:00
was an exchange student for a year
43:02
in Arizona at the end of that
43:04
year we had a kind of exchange
43:06
student mini sightseeing tour. It
43:08
began in Colorado and Arizona, but
43:11
it also went to Disneyland
43:13
in California. And I love
43:15
Disney because I had never
43:17
seen anything like that. In fact,
43:19
I'd never seen an amusement park.
43:22
And so the experience of Disney,
43:24
circa, like, 1979... I was very
43:26
wowed by it, even though I was a
43:28
little too old. I mean, I was
43:30
17. But nevertheless, I just found
43:32
the whole thing just so charming. It was
43:35
just Americana, you know. Yeah. And
43:37
I even liked... I liked
43:39
Space Mountain, I liked Bear Country, and
43:41
'It's a Small World After All'
43:44
was hilarious. I mean, so
43:46
the whole thing just made you
43:48
chuckle from start to finish and
43:50
at the end of the day you
43:52
were just like wow yeah that is
43:54
just so funny and entertaining and wholesome
43:57
and good music and of course
43:59
very clean Yes. And of course
44:01
it's a fun day out. Yeah.
44:03
And then to see Disney take
44:06
this kind of twisted, nasty
44:08
turn. Well, you know,
44:10
like you, I went
44:12
to Disneyland when I
44:14
was like four. You
44:17
know, we came to
44:19
visit my grandparents from
44:21
Venezuela. We went to
44:23
California and I went
44:25
to Disneyland. I went to
44:27
Disney World in Orlando and
44:29
loved that. But then when
44:31
my kids were little, we
44:33
went as a family and
44:35
I told you that my
44:37
poor children were so embarrassed
44:39
because we went to this,
44:41
it was like the Hall
44:43
of Presidents, right? And they
44:45
had the animatronics, all of
44:47
the presidents. It was incredible.
44:49
But then they had Barack
44:51
Obama. Was he the last one?
44:53
Yeah, and I booed him. And
44:55
my children were so embarrassed. They
44:58
were like, oh no, mom, you
45:00
didn't just do that. And so,
45:02
you know, I think that's when
45:04
they started getting a little bit
45:06
on the, you know, we're gonna
45:08
kind of, not that, not that
45:10
Obama wasn't the president, but I'm
45:12
just saying that they, I think
45:15
they... They did that kind of
45:17
on purpose, like, they did.
45:19
That was the creeping in. Yeah,
45:21
of the political. The other
45:23
thing is, when Danielle
45:26
was very little, we were at
45:28
Disney one time, and I
45:30
remember it's kind of toward
45:32
the evening. So it might
45:34
have been winding down a
45:36
little bit and you know
45:39
when you step away from
45:41
some of the rides I happen
45:43
to see, you know, Goofy, uh,
45:45
walking ahead of me and
45:47
then leaning up against one
45:49
of the buildings. And what
45:52
does he do? He reaches
45:54
up and pulls off
45:56
the top part of his
45:58
Goofy mask, and then he
46:01
lights a cigarette, right? Oh yeah.
46:03
Yeah. And so, and he had
46:05
this very cynical, perverted look. You
46:07
know, he looked like, he essentially,
46:09
goofy was transformed before my eyes,
46:11
right? And it's almost like I saw
46:14
the old Disney and the new Disney
46:16
in one snapshot, right? I saw the
46:18
old Disney, which is goofy. And
46:20
then I see this perverted man.
46:22
Right. Just with a cynical look.
46:24
And his cynical look told
46:26
me everything. It was kind of
46:29
like. I'm supposed to play this
46:31
dumb role from morning to night.
46:33
Yeah. Here, give me a break.
46:35
Let me have a cigarette and
46:37
like return to the real world.
46:39
Yeah. Before I have to put
46:42
the stupid goofy costume back on.
46:44
And I, and at the time,
46:46
of course, I didn't think of
46:48
this ideologically in any way. I
46:50
was just like, oh wow, there's...
46:52
there's Goofy... oops, oh what? It's like taking
46:55
a very fine restaurant and going
46:57
into the kitchen where you basically
46:59
see everything's a mess, guys leaning on these
47:02
big ovens and smoking. Or how
47:04
about this one when you find
47:06
out Santa's not, you know, who we
47:08
thought he was? Okay, let's
47:11
turn to... we want to
47:13
talk about Venezuela, because
47:16
you were telling me that,
47:18
because you follow, you are
47:21
part of a
47:23
group of Venezuelans who are
47:25
on Facebook, and talk about
47:28
the latest, the latest kind
47:30
of vibe. Well, this was
47:33
actually a post that you
47:35
did on Instagram. And a
47:37
lot of these Venezuelans that
47:40
chimed in, I did not
47:42
know. They were not part
47:44
of my group. But they
47:46
were like, Debbie, please talk
47:49
about the fact that all
47:51
Venezuelans are not bad people.
47:53
Apparently, because of this
47:56
Tren de Aragua, these
47:58
gangsters, you know, they're
48:00
giving Venezuela a bad name. Please
48:03
do something about it. Please, you
48:05
know, talk about the fact that
48:07
most Venezuelans, the majority of us
48:09
are Trump supporters, capitalists, hard workers,
48:11
you know. I mean, really, it's,
48:13
and so, and I said, listen,
48:15
I know, I'm, I'm one of
48:17
them. I, I definitely don't want
48:19
to be compared to Tren de Aragua.
48:22
I mean, that, that's ridiculous. Any
48:24
more than, and I told you,
48:26
it would be like an American
48:28
being compared to an Antifa person.
48:30
You know, like, oh, you, oh,
48:32
you must be like them because
48:34
you're American and all of the
48:36
Americans are like these guys. No,
48:39
Antifa, that group of people is
48:41
a very small portion of the
48:43
population and they're very bad people.
48:45
They shouldn't be compared with the
48:47
rest of us. The great defenders
48:49
right now of Tren de Aragua are
48:51
not the Venezuelans. I never see
48:53
a Venezuelan going, Tren de Aragua is amazing.
48:55
No, it's the left. Tren de Aragua
48:58
is embarrassing. Right, so the Venezuelans
49:00
know that Tren de Aragua is horrible. Yeah.
49:02
Just like the Mexicans know that
49:04
MS-13 is horrible. No, that's Salvadorian.
49:06
That is a Salvadorian gang. Now
49:08
there are elements of Mexico. And
49:10
there is the Mexican mafia, of
49:12
course. Yes. And Tren de Aragua... I
49:14
mean, not Tren de Aragua, MS-13, is mostly made
49:17
up of Salvadorans, from El
49:19
Salvador, but there are Mexicans and
49:21
Guatemalan, and you know from other
49:23
Central American countries. So it's become
49:25
a kind of transnational, yeah. It
49:27
is a transnational, but the majority
49:29
are from El Salvador, and that's
49:31
why when they go home to
49:34
Bukele, it's, they're going home, because
50:36
that's their home. So, but... Bukele
49:38
made a telling remark when he
49:40
said, look. He was talking about
49:42
Mexico. And he said, everyone tells
49:44
me that you've got these criminal
49:46
gangs and that they're so big
49:48
and they're so powerful. He goes,
49:50
in no country in the world
49:53
is there any gang in existence
49:55
that is more powerful than that
49:57
country? And he goes for the
49:59
simple reason that that country has
50:01
the military, right? So he goes,
50:03
and so it doesn't make any
50:05
sense for any country to say,
50:07
we have been overtaken by gangs.
50:09
He goes, there is really only
50:12
one way for a country to
50:14
be overtaken by gangs. And that
50:16
is for the politicians and leaders
50:18
of that country to be part
50:20
of the gang. That's right. And
50:22
that's what's happened in Mexico. That's
50:24
right. The regime doesn't get rid
50:26
of the cartels because the cartels
50:29
are in the government, which is
50:31
to say they own part of
50:33
the government. The government is getting
50:35
benefit and law enforcement has been
50:37
penetrated, corrupted. So Bukele's point is
50:39
that if a country wants to
50:41
get rid of gangs, there is
50:43
a way to do it and
50:45
he's proven that he knows how
50:48
to do it. Yeah, clean up
50:50
your government. Yeah, clean up your
50:52
government. But anyway, back to the
50:54
Venezuelans and Tren de Aragua. So the
50:56
video that you posted on Instagram
50:58
was was a video of these
51:00
Venezuelans on a plane going back
51:02
to Venezuela and they were not
51:04
Tren de Aragua members. They were just thugs.
51:07
And I said, these are the
51:09
worst of the worst of society
51:11
that society has to offer. You
51:13
were one of the first people
51:15
to warn that the Venezuelans who
51:17
are being sent over and were
51:19
coming over illegally were not fleeing
51:21
socialism. No, they were... They were
51:24
the Chavistas themselves. They were actually
51:26
fleeing what socialism promised them and
51:28
didn't get. So they were coming
51:30
to America because word on the
51:32
street was that they were going
51:34
to get all those goodies that
51:36
they didn't get in Venezuela. So
51:38
they were like, oh, Biden's going
51:40
to give me a car. He's
51:43
going to give me a house.
51:45
What's not to like? Let's go.
51:47
Let's go through the area and
51:49
the Darién jungle. Who cares what
51:51
happens there? As long as we
51:53
get to America, we're good. You
51:55
know, so these people were really
51:57
the bottom of the barrel of
51:59
Venezuela. They were not the the
52:02
hardworking people that were destroyed, where
52:04
their livelihood was destroyed. That's another
52:06
show. And obviously my family is
52:08
part of that group of people.
52:10
I mean, that group of people,
52:12
it looks like, broke into two.
52:14
In other words, there were business
52:16
people, there were Jewish merchants, there
52:19
were some people who had the
52:21
means and the ability to get
52:23
out of Venezuela and they got
52:25
out early. Yes. And they fled.
52:27
And they've set up in Miami,
52:29
you see them in Florida, they're
52:31
very successful Venezuelan community, they have
52:33
helped deliver Florida to Trump. That's
52:35
right. And then you've got a
52:38
second group of people who were
52:40
fooled by Chavez, and perhaps even
52:42
by Maduro, or they are not,
52:44
like your aunt, they're not in
52:46
a position to get up and
52:48
leave. They're too old to leave,
52:50
where would they go, where would
52:52
they live, what would they do?
52:54
My grandparents were, they were
52:57
the holdouts, they were like, oh,
52:59
things will get better, things will
53:01
get better, and then they passed
53:03
away. You know, unfortunately... But they
53:05
saw that they didn't get better.
53:07
They saw that they did not
53:09
get better and they were horrified
53:11
by what happened. And then I
53:14
have like a cousin that, you
53:16
know, I'm really trying very hard
53:18
to help, but he may fall
53:20
prey to the fact that, you
53:22
know what, the Venezuelans
53:24
that have come illegally have
53:26
kind of given all the other
53:28
Venezuelans a bad name. Right. And
53:30
so we're having to deal with
53:33
that issue, you know, which is
53:35
unfortunate. You know, with the, with
53:37
Indians, not to the same degree,
53:39
because of course you're not talking
53:41
about Tren de Aragua, but what's happened
53:43
is you have abuses in like
53:45
the H-1B program. And what happens
53:47
is that the abuses are not
53:49
coming exclusively on the Indian side.
53:52
It's true. You have these Indian
53:54
suppliers. And what they do is
53:56
that they create these firms that will supply
53:58
the labor that you need, and
54:00
then you've got these tech companies
54:02
and software companies, and sometimes not
54:04
even tech companies. And what they
54:06
do is they act like there
54:09
are no willing and employable Americans.
54:11
Why? Because they'd rather get an
54:13
Indian guy that they can pay
54:15
much less, not to mention the
54:17
fact that that guy then has
54:19
to work for this company. Because
54:21
you've come on that visa. You
54:23
can't go anywhere else. You're almost
54:25
like beholden to that company. So
54:28
you've got this. unhealthy alliance. But
54:30
my point is, for Americans watching
54:32
it from the outside, you go,
54:34
these horrible Indians, they're just, they're
54:36
gaming the system, and so it
54:38
gives Indians a bad name. Yeah,
54:40
no, and I think that that
54:42
is... That's unfortunate really because there
54:44
are a lot of really great
54:47
Indian Americans that have contributed a
54:49
great deal to this country and
54:51
and there are a lot of
54:53
great Venezuelan Americans. Hello. No. But
54:55
you know I don't get it
54:57
because I've been obviously been here
54:59
since I'm 17. Yeah. Most people
55:01
who follow me you know know
55:04
know exactly my story and so
55:06
I'm not going to get lumped
55:08
in. But I noticed even Vivek.
55:10
Vivek's a good guy, right? And
55:12
Vivek was born here. So Vivek
55:14
is more American than I am
55:16
in that sense. And yet, I
55:18
see people that go home, deport
55:20
Vivek. You know, why is Vivek
55:23
talking about the H1B? Vivek is
55:25
somebody who I think has
55:27
contributed to the debate in this
55:29
country in a very good way.
55:31
So I wish Vivek well, and
55:33
I hope he makes it by
55:35
the way in his race for
55:37
the Senate. Yeah, for the Senate,
55:39
you mean for governor? Oh, for
55:42
governor, sorry. Trump has endorsed
55:44
him in Ohio. Ohio.
55:46
And of course
55:48
the election is
55:50
next year.
55:52
Well, I hope
55:54
that people, you
55:56
know, don't be,
55:58
I guess, don't be
56:01
so
56:03
cruel and judgmental
56:05
about different people
56:07
that are
56:09
here for different
56:11
reasons. Well, make valid distinctions.
56:13
I mean, the key
56:15
is that, yeah,
56:18
Tren de
56:20
Aragua is a
56:22
Venezuelan gang, but it's
56:24
not exactly representative of
56:26
the people of Venezuela.
56:28
Duh. Duh. Subscribe to
56:30
the Dinesh D'Souza podcast on
56:32
Apple, Google and
56:35
Spotify, and watch
56:37
on Rumble and YouTube, and salemnow.com.