Episode Transcript
0:00
Good evening, good evening.
0:02
Welcome to your Friday Night
0:04
Live. Thank you to
0:06
everyone who dropped by
0:09
earlier today for a
0:11
spontaneous live stream. We
0:13
had a very good chat
0:16
and conversation. And hello,
0:18
hello. Good evening, welcome
0:20
to your Friday Night Live.
0:22
Fourth of April, my
0:24
God, April already. 2025.
0:26
Boy, time really is
0:28
flying. And... I
0:31
have topics, of course,
0:33
I'm sure you're aware, that
0:35
we do have topics
0:38
as a whole. I have lots
0:40
of space in my brain,
0:42
heart and mind for your
0:45
questions and comments
0:47
and preferences and
0:50
what is nice for you.
0:52
And we could start with
0:54
some fraud. So
1:03
yeah, we could start with some
1:05
fraud. I mentioned this many years
1:07
ago, but I came across it
1:09
recently, and I thought it was worth talking
1:11
about a little bit. So Russell T.
1:13
Warne, that's Russ Warne, W-A-R-N-E, on
1:16
X, wrote: I finished reading Thibault
1:18
Le Texier's book, Investigating
1:21
the Stanford Prison Experiment, History
1:23
of a Lie. This is the
1:25
most thorough treatment of the
1:28
real history behind the Stanford
1:30
Prison Experiment. So
1:35
this is his perspective on it.
1:37
Obviously I haven't verified these claims,
1:39
but I'm putting them out there
1:41
because they're interesting. So the Stanford
1:43
Prison experiment was this idea that
1:45
you take a bunch of random
1:48
college students, you divide them into
1:50
guards, and you divide them into
1:52
prisoners, and you just give them
1:54
these roles to play, and the
1:56
guards become progressively more domineering and
1:59
the prisoners more desperate and so on, stifling
2:01
rebellion and all of
2:04
that, and that they
2:06
had to stop the
2:08
experiment very quickly because
2:10
the guards were just
2:12
becoming so crazy and sadistic.
2:15
And so the general
2:17
argument is that there's
2:20
this incipient sadism
2:22
that is in the hearts
2:24
and minds of people as
2:26
a whole. And
2:29
if you just, if
2:32
people get power, they
2:34
go crazy,
2:36
they get
2:39
crazy aggressive,
2:41
crazy hostile,
2:43
their personality
2:45
has all this
2:47
latent sadism
2:49
and so on. And
2:52
it's a very sort
2:54
of powerful experiment.
2:56
So, Russell writes, after reading the book,
2:59
it's hard to deny that Zimbardo,
3:01
the original psychologist, lied about almost
3:03
every aspect of the study at some
3:05
point in the 53 years he lived
3:08
after conducting it. Some of the most
3:10
inexcusable lies include, saying that five, quote,
3:12
prisoners left the experiment early for mental
3:15
health reasons. In reality, only two to
3:17
three did. In fact, one left because
3:19
the dry air and denial of access
3:21
to his medication were causing problems with
3:24
his eczema. Zimbardo's
3:26
then-girlfriend, later wife, was not the
3:28
cause of the study ending. In
3:30
Zimbardo's telling, she visits on day six and
3:32
is horrified about what's happened and convinces
3:35
him to stop the study. In reality,
3:37
she had visited earlier, participated in a
3:39
fake parole board, and was aware of
3:41
what was happening in the study before it
3:43
ended. No, the quote guards did not
3:46
all turn sadistic. In fact, most were
3:48
reluctant about embracing their role, and the
3:50
day shift guards were actually pretty lenient
3:52
about rules. Next, the
3:54
experiment did not get progressively
3:56
or increasingly intense with each
3:58
passing day. Also, the
4:01
guards' behavior was not spontaneous. They
4:03
were coached multiple times about how
4:05
to behave. They were given suggestions
4:08
for punishments, and they did not
4:10
invent the prison rules. There are
4:12
also lies of omission, as Zimbardo
4:14
did not come up with the
4:17
experiment himself. Some of his undergraduate
4:19
students did a smaller version of
4:21
it a few months earlier as
4:24
a class project; he almost
4:26
never credited them. The
4:29
guards were misled into believing that
4:31
they were part of the experimental
4:33
team. They thought the study was only
4:35
about prisoner behaviour. As a result, the
4:37
guards did not lose themselves in a
4:40
role by being placed in a fake
4:42
prison. They never thought of themselves
4:44
as real guards. The participants were
4:46
not all good or normal young
4:48
men with no history of misconduct.
4:50
Some had a history of petty
4:52
crime, drug use, social dysfunction, etc. Nor did participants
4:54
treat the experiment as if it
4:57
were real: both prisoners and guards
4:59
were constantly aware that they were
5:01
in an experiment, and that they
5:03
were not really prisoners and guards.
5:05
No one consistently lost himself in
5:07
his role. Variability was the rule
5:09
in the Stanford Prison experiment, not
5:11
the exception. For decades Zimbardo portrayed
5:14
all the prisoners as becoming
5:16
rebellious and then broken as
5:19
the guards became authoritarian and
5:21
cruel. Some prisoners had good
5:24
relationships with some guards. The
5:26
day shift was business-like and
5:28
some prisoners or guards saw
5:31
the situation
5:33
as a weird
5:35
temporary job, whereas
5:38
others desperately
5:40
wanted out. Let's
5:43
see. He says, Russell
5:45
says, the Stanford
5:48
Prison Experiment was
5:50
simply bad science. There are so
5:52
many flaws that it cannot reveal
5:54
anything about human behaviour. In the
5:56
past I called it performance
5:58
art, he says. Reading Le Texier's book
6:01
reinforced that view. The protocols were
6:03
erratic, changed often and haphazardly.
6:05
Almost nothing in the Stanford
6:07
Prison Experiment was systematic. Data
6:10
collection was erratic and irregular, resulting
6:12
in sloppy data. In the
6:14
months and years after the
6:16
experiment, Zimbardo's assistants and students
6:19
warned him that the data were
6:21
hard to interpret; he ignored them all.
6:23
Zimbardo started the study with a
6:25
predetermined goal in mind. He published a
6:28
press release on the second day of
6:30
the study touting its results. He testified
6:32
to Congress and gave dozens of interviews
6:35
before he had even analyzed his data.
6:37
The demand characteristics must have
6:39
been overwhelming, especially for the guards, who
6:41
were coached in their behavior. Everybody knew
6:43
or had a pretty good idea of
6:46
the purpose of the study and what
6:48
Zimbardo wanted to see. There was almost
6:50
constant supervision from Zimbardo and
6:52
his assistants. Conditions only superficially
6:54
resembled a real prison. This has
6:56
two consequences. One, running the experiment
6:59
was sometimes cruel and definitely unethical,
7:01
even by the standards of his
7:03
time. The Stanford Prison experiment does
7:05
not tell us anything about the effects
7:07
of real imprisonment. Among the conditions that
7:09
were worse than those of a real American
7:11
prison were prisoner uniforms, or
7:14
gowns, worn without underwear, which
7:16
sometimes exposed prisoners' genitals. Conditions
7:18
were unsanitary, bathroom access was
7:20
severely limited, at night the
7:22
prisoners had to urinate and
7:24
defecate in a bucket. Sometimes prisoners
7:26
even had to clean out the buckets
7:28
with their bare hands, the prisoners
7:30
were worried about disease. The prisoners
7:32
could not shower and were only allowed
7:35
to shave or have a sponge bath if
7:37
outside visitors were expected. The prisoners
7:39
had no access to fresh air
7:41
or exercise. Access
7:43
to recreation was almost zero. Books were
7:45
taken away, and prisoners were not allowed
7:48
to have any personal effects or mementos.
7:50
The quote, parole board was a total sham
7:52
that had no power to release prisoners early.
7:55
Prisoners wore chains almost constantly,
7:57
which caused discomfort and injury.
8:01
For Zimbardo, the lesson of
8:03
the Stanford Prison Experiment was
8:05
that the potential for cruelty and
8:08
evil lurks inside everyone, and
8:10
the right or wrong situation could
8:12
let that inner monster out. I
8:14
think Zimbardo thought this message
8:16
resonated, says Russell, because he
8:19
actually did do cruel things
8:21
to other people. The conclusion
8:23
that everyone has evil inside
8:25
them probably greatly assuaged
8:27
Zimbardo's guilt. Very
8:31
sad. And I think
8:34
it's probably a bit
8:36
of an atheist thing too,
8:38
right? In that, well, no
8:41
God, virtue, or philosophy
8:43
tames the beast
8:45
within and all
8:47
of that sort of stuff.
8:50
All right, let me get...
8:52
We had an interview with
8:55
Russell Warne back in
8:57
the day, didn't we really? Thank
8:59
you. If you give me the
9:02
link, I'd appreciate that. Give me
9:04
the number. All right, so let's
9:06
get to your... So, the general
9:09
idea is that whenever there's
9:11
like a, what seems to be a
9:13
very sort of... This is Emil Kirkegaard,
9:16
who wrote this: Remember kids,
9:18
whenever you hear one
9:21
important story or study
9:23
from social science with
9:25
some morality tale attached,
9:27
it's probably fake in
9:29
some way or was seriously misrepresented. You know,
9:31
this idea that this woman gets stabbed
9:33
to death and nobody does anything even
9:36
though dozens of people can hear her.
9:38
It's not really what happened, but you
9:40
can look that up. The murder of
9:42
Kitty Genovese, G-E-N-O-V-E-S-E, you can look that
9:44
up, debunked. It's very interesting. There's
9:47
all this stuff that was gospel when I
9:49
was growing up. Oh, okay, yes, sorry. Wrong about
9:51
IQ. So the show is
9:53
four-five-nine-four-one. We
10:00
did that show with one of the
10:02
intelligence experts. Okay. Thank you
10:04
for the tip. Flort Network, I'd
10:07
appreciate that. Country scholar
10:09
writes, what would you do if
10:11
your wife and two daughters hated
10:13
the mods you made to your SUV?
10:16
Would you add more off-road gadgetry
10:18
or give in and get something
10:20
more refined? My kids are forced
10:23
to ride with me for several
10:25
more years. Hmm. Well, I
10:27
don't know, it's interesting. I
10:30
guess you made some manly
10:32
stuff that the women don't
10:34
appreciate. I would get
10:36
it changed, personally. I would get
10:39
it changed. I mean, it's not
10:41
like, I'm not saying you
10:43
should get bullied or pushed
10:45
around with every decision or
10:47
choice that you make, but
10:49
what I would say is that
10:52
you want your family to know
10:54
and understand that you care
10:56
more for their happiness than some
11:00
tricked-up thing for your SUV.
11:02
I mean if they hate it, just get rid
11:04
of it. You know, if I were to
11:06
buy, I don't know if I would buy
11:08
a shirt or a hat that my wife
11:10
and daughter hated, I'd just get rid of
11:12
it. I'd just take it back. If you
11:14
hate it, it's no big deal. So I would,
11:16
I mean, you should preview these things
11:19
with your family as a whole. Make
11:21
sure you get people's buy-in. But, uh, I
11:23
would, you know... it's
11:25
not worth it. It's not worth it.
11:27
Not a hill
11:29
to die on, so to
11:31
speak, right? James says,
11:33
I'm reminded of an
11:36
episode of Star Trek,
11:38
the next generation, where
11:41
the big moral was
11:43
there's an evil monster
11:46
that lives inside
11:48
all of us. Yeah. Yeah.
11:50
This is... It's not true.
11:52
I've never had an evil monster
11:54
that lives inside. I think we
11:57
all have the capacity to be
11:59
cruel, sure. But that's just
12:01
a larger part of self-protection
12:03
of the whole, right? All
12:06
right. Somebody writes, Hi
12:08
Stef, why do I feel burdened
12:10
by someone else's destructive
12:12
decisions? Because, my friend,
12:14
in the past, we
12:18
couldn't escape them. You grew
12:20
up in a small town, a small
12:23
village, a small tribe, and
12:25
somebody has some really destructive
12:30
opinion, some destructive
12:33
perspective, you really
12:35
can't escape it. So you
12:38
are burdened by it, you know.
12:40
The idea that you
12:43
can escape the dysfunctions
12:45
of your childhood
12:48
environment is really
12:50
new. Or even the
12:52
possibility that you can really
12:54
do that is really new.
12:59
And because of that, you
13:01
had to stay focused and
13:03
attentive to what was going on.
13:05
Now you do have a choice, right?
13:08
Now you have a choice. In the
13:10
past you didn't. All right. Yeah,
13:12
I don't like the idea that,
13:14
you know, we're all just
13:16
different degrees of evil. And,
13:19
you know, we're all just, you know,
13:21
one day of power away
13:23
from becoming a sadist. I
13:25
don't believe that's true. All
13:29
right. Well, what do you
13:32
make of insomnia
13:34
so far as
13:36
philosophy goes? I would
13:38
say it's more of
13:40
a self-knowledge thing.
13:42
I think it's more
13:45
of a self-knowledge
13:47
thing. I
13:59
would say that insomnia is
14:01
when there's, and I'm just
14:03
talking about, you know, self-knowledge
14:05
stuff, not any sort of
14:07
physical thing, but if you
14:09
have a big sort of
14:12
rank contradiction in your life,
14:14
I did, and I had
14:16
some insomnia when I was
14:18
in my late 20s, early
14:20
30s, if you have a
14:23
big rank contradiction in your
14:25
life, and mine was, I
14:27
was... reading about philosophy and
14:29
theorizing about philosophy, but not
14:31
practically living in a sort
14:33
of very material way philosophy.
14:36
So if you have a
14:38
big contradiction in your life,
14:40
it needs to be resolved.
14:42
Your contradictions will twist your
14:44
brain into some supernatural London
14:47
subway pretzel. It's really not
14:49
good for your mind as
14:51
a whole to have sort
14:53
of... big contradictions floating around
14:55
in your brain. So if
14:58
you've had yourself physically checked
15:00
out and you're fine sort
15:02
of physically, the place that
15:04
I would look for resolution
15:06
of insomnia is to look
15:08
for a big contradiction. I'll
15:11
give you some of the
15:13
typical ones, I mean, based
15:15
upon 20 years of calling
15:17
shows. So some of the
15:19
typical contradictions are, I say
15:22
I love someone, but I
15:24
don't. I claim
15:26
to love someone, but I don't
15:28
love that person. I claim to
15:30
be attached to someone who is
15:32
not producing any particular virtues that
15:35
are inspiring me to love, worship,
15:37
and adore. So if there's someone
15:39
in your life you claim to
15:41
love, maybe you have some historical
15:43
or residual affection for whatever, but
15:45
someone in your life that you
15:47
claim to love, but you don't
15:50
love them, that's a big one.
15:54
If your values are drifting
15:56
from the production of virtue?
15:58
In other words, if you're
16:01
pursuing sex for the sake
16:03
of sex alone, if you
16:05
are pursuing money for the
16:07
sake of money alone, if
16:10
you're pursuing status, or beauty,
16:12
or you know, like if
16:14
you're one of these sort
16:16
of Greek statue narcissists who
16:18
spends three hours a day
16:21
in the gym, if your
16:23
values and your decisions are
16:25
said by you to be
16:27
valuable, but they do not
16:30
directly contribute to the spread
16:32
of virtue in the world.
16:34
That's kind of a contradiction.
16:36
At least it will be
16:39
from a philosophical standpoint because
16:41
philosophically speaking we get the
16:43
most happiness from the production
16:45
of virtue in the world.
16:47
So that's another one. If
16:50
you are suppressing your true
16:52
self, but calling it... being
16:54
nice. That's a big, that's
16:56
a big problem. If you
16:59
are redefining negative traits, vices,
17:01
as positive, as positive traits,
17:03
you're going to have a
17:05
problem. You're going to have
17:08
a problem with your heart,
17:10
you're going to have a
17:12
problem with your peace of
17:14
mind. It's not so much
17:16
that we do negative things.
17:19
I mean... We do it.
17:21
It happens from time to
17:23
time. We do negative things.
17:25
But if we redefine those
17:28
negative things as a positive
17:30
thing, you know, if you,
17:32
let's say that you have
17:34
some dissolute friend who keeps
17:37
wanting to go out and
17:39
get drunk and you're like,
17:41
hey man, I deserve a
17:43
break, I deserve some relaxation,
17:45
I deserve to have fun,
17:48
all that that kind of
17:50
stuff. Then
18:00
you're gonna have
18:02
problems because it's
18:05
the dishonesty with
18:10
the self, I
18:13
think. That's
18:15
the big problem.
18:18
All right, let's
18:20
see here. Oh,
18:23
so what are
18:26
the other ones? Um.
18:29
Severely undershooting your own potential?
18:31
That's a problem. And if
18:34
you are severely undershooting your
18:36
own potential, and you're not
18:39
honest with yourself about it, that's
18:41
a big problem. That's going
18:44
to cause you, I think,
18:46
some significant unhappiness and problems.
18:49
What else? What do you
18:51
guys think? You ever had
18:54
this kind of insomnia that
18:56
comes from not having a
18:59
very solid sense of, oh,
19:01
there's some sort of contradictions
19:04
going on in your brain.
19:06
Saying that you're happy with
19:09
your circumstances when you're simply
19:11
addicted to the familiar, I
19:13
think that's another one that
19:16
can be a big problem
19:18
for people. That's another one
19:21
that can be a big
19:23
problem for people. That's
19:27
right, let me check the various places
19:29
here. All right, so somebody says, thanks
19:32
Stef, appreciate you, I have recently, long
19:34
overdue, increased my donations, thank you
19:36
very much. I appreciate that. C2 Spark,
19:39
nice to see you again, says: one
19:41
of the most successful shows, Breaking Bad's,
19:43
premise was we're one cancer diagnosis away
19:46
from becoming a violent drug lord. Yeah,
19:48
yeah I couldn't stand that show I
19:50
couldn't do it. It was too gross,
19:53
too violent. What do you think of
19:55
a man saying, I'm so lucky to have
19:57
her? Yeah, that is signaling a kind
20:00
of, I think that comes from having
20:02
a distant mother that you constantly had
20:04
to race around and try and please
20:07
and have her enjoy your company. I
20:09
think that just comes from that. It's
20:11
a very cuck behavior. I mean, I'm
20:14
lucky to have my wife. She's lucky
20:16
to have me. We both work hard.
20:18
We have a great relationship. We both
20:21
work hard at virtue and honesty. And
20:23
so we have a very sort of
20:25
direct and fun relationship. We're lucky to
20:28
have found each other. I think we
20:30
both earn and deserve each other as
20:32
a whole. I know that. So I
20:35
think it is a signal to say
20:37
to the woman, I don't know how
20:39
she's put up with me for so
20:42
long, blah blah, right? Of course a
20:44
woman would not want that. A woman
20:46
with self-esteem would not want a guy
20:49
who she has to kind of put
20:51
up with, deal with it, and so
20:53
on, right? Oh,
21:04
he says, yes, I've had some
21:06
of that very in line with
21:08
what you're talking about. Yeah. You
21:10
know, it's hard, if you didn't
21:12
have parents who are really devoted
21:14
to you, it's hard to feel
21:16
that people can really devote themselves
21:18
to you in the future, if
21:20
that makes sense. You always feel
21:22
like a little bit like you're
21:24
hanging on by a thread that
21:26
people are putting up with you,
21:28
that you're kind of tolerated, if
21:30
that makes sense. And that's not
21:32
fun. That's
21:36
no fun. Sorry,
21:38
just to go
21:40
back to the
21:42
Breaking Bad and
21:44
Stanford Prison experiment
21:46
and James, the
21:48
Star Trek show
21:50
that you talked
21:52
about, it's a
21:54
way of eroding
21:56
social trust. Right,
22:00
so this sort of mythology
22:02
of the, every single one
22:04
of us, the devil inside,
22:06
it's a way of having
22:08
you look around the world
22:10
and you don't see, you
22:12
know, normal people, you don't
22:14
see, you know, people with
22:16
their struggles and, you know,
22:18
average people, you see these
22:20
like caged demons in middle
22:22
class skin suits of vague
22:24
respectability, and they can turn
22:26
on you like that, right?
22:29
They can turn on
22:31
you like that. And
22:34
that really does erode
22:36
social trust. There's a
22:38
movie, I remember a
22:40
friend showing it to
22:42
me. Oh gosh, Dennis
22:44
Hopper, I mentioned it
22:48
a couple of months
22:50
ago, the show's called
22:52
Blue Velvet. Isabella Rossellini?
22:54
And it starts with
22:56
a guy having a
22:58
heart attack and the
23:00
camera kind of zooms
23:02
past the kids just
23:04
standing there the dog
23:06
is uninterested, and
23:08
the camera kind of
23:10
zooms down into
23:12
the grass and in
23:14
the bottom of the
23:16
grass underneath the grass
23:18
is all of that
23:20
disgusting sounds of the
23:22
insects all eating each
23:24
other and fighting with
23:26
each other and so
23:28
on right and it
23:30
is this idea that
23:32
that there's this sort
23:34
of respectable life that
23:36
people have, but then
23:39
underneath that, and very,
23:41
you know, very quickly
23:43
underneath that, is this
23:45
really terrible predatory, violent,
23:47
ugly, nasty world. And
23:49
I mean, Dennis Harper
23:51
was a complete psycho
23:53
and really lobbied hard
23:55
to get this role
23:57
of Frank, whatever his
23:59
name was, because he
24:01
said, like, that guy's
24:03
me. So the idea
24:05
that everyone is, can
24:07
be easily possessed and
24:09
will turn on you
24:11
on a dime. And
24:13
it's a nasty, there's
24:15
a nasty perception to
24:17
have. And you... can't
24:19
really relax, you can't
24:21
trust, you can't get
24:23
a sort of productive
24:25
or healthy or happy
24:27
tribe around you, you
24:29
can't do any of
24:31
that. You just wait
24:33
for people to turn
24:35
on you. Now, the
24:37
counter-argument to that is
24:39
COVID, where people did
24:42
kind of turn on
24:44
each other. I very
24:46
much appreciate your support.
24:48
All right, let me
24:50
get back to... There's
24:52
great questions and comments,
24:54
thank you. Charles
25:03
Murray, did he write something?
25:05
The three laws of social
25:08
programs. I did see... World
25:10
of Engineering is always a
25:12
good thing to follow. The
25:14
human brain can store an
25:16
almost infinite amount of information
25:19
equivalent to about 2.5 million
25:21
gigabytes. Amazing. I don't know
25:23
if this is true as
25:25
a whole, but Brian Roemmele
25:28
wrote: The AI-generated OnlyFans
25:30
workers are now scheduled
25:32
to make more than their
25:34
human counterparts by 2026, displacing
25:37
them by sheer numbers of
25:39
a thousand to one. I
25:41
don't know if that's true.
25:43
I'm not sure how you
25:45
would figure that out. I
25:48
suppose, I mean, isn't the
25:50
way that only fans works
25:52
that you can... Oh, somebody
25:54
says, your post is a
25:57
fabrication; OnlyFans has strict
25:59
rules against AI-generated content.
26:01
He said that it's absolutely
26:03
not the case, it is
26:06
rather easy to bypass the
26:08
rules. So that's interesting. It
26:10
seems unbelievable, but as far
26:12
as I understand it, Only
26:15
Fans works to some degree.
26:17
Because what you can do
26:19
is you can text the
26:21
OnlyFans model and ask
26:23
her to do stuff, right?
26:26
And if you do that,
26:28
then she'll, you know, take
26:30
off your top, you know,
26:32
hold your hands up in
26:35
the air and shake 'em,
26:37
like that old song. So
26:39
you would instruct the only
26:41
fans model on what it
26:44
is you want the only
26:46
fans model to do, and
26:48
then she'll do it. But
26:50
I suppose it's just an
26:53
AI prompt then at some
26:55
point, right? But
27:01
it will be very
27:03
interesting. It would be
27:05
very interesting to see
27:08
what happens if AI
27:10
can spontaneously generate pornographic
27:13
content based upon user
27:15
input in real time.
27:17
Interesting. And if AI
27:20
does that, then that
27:22
would... liberate more women
27:24
from this fairly vile
27:27
line of work I
27:29
suppose. So there could
27:31
be real positives around
27:34
that. People in Washington
27:36
D.C. and the people
27:38
on Twitter will be
27:41
really upset by it.
27:43
But I'm not sure
27:45
the American people will
27:48
ever understand why foreigners
27:50
can charge higher tariffs
27:52
on us than we
27:55
charge on them. If
27:57
you tell an American
28:00
that Trump is simply
28:02
matching the tariffs they
28:04
impose on us, most
28:07
Americans are going to
28:09
be happy about it.
28:11
And what is significant
28:14
is absolutely no one
28:16
in this whole frenzied
28:18
debate has made a
28:21
case about why foreign
28:23
countries should have higher
28:25
tariffs on us than
28:28
we have on them.
28:30
I've not seen one
28:32
person make that argument.
28:35
America should just suck
28:37
it up and take
28:39
it. So, it's an
28:42
interesting question, but of
28:44
course all of the
28:47
people who were instant
28:49
experts on ivermectin,
28:51
are now instant
28:54
experts on tariffs. It's
28:56
pretty wild. There's
29:00
a guy. Dudes are posting
29:02
their wins, their W's. This guy,
29:04
he says, is
29:07
staying in a hundred billion
29:09
dollar development project and he
29:11
basically has the entire city
29:13
to himself because barely anybody
29:15
lives there. Isn't that
29:17
wild? It's across from Singapore
29:19
and there's, you know, a
29:21
few thousand people in this
29:24
massive... area. I remember seeing
29:26
this, the sort of
29:28
ghost cities that were
29:30
going on in China, right?
29:32
They had these malls
29:34
with like two stores out
29:36
of like 200 that were
29:38
actually running. Isn't that wild?
29:40
I mean, it is pretty
29:43
horrible. The amount of wasted
29:45
resources, but it's, I don't
29:47
know, I'd be curious visiting
29:49
a place like that. The
29:51
Redhead Libertarian posted this now:
29:53
a marriage vow from 1,450
29:55
years ago. I take you
29:57
to be my wife.
30:00
I will clothe you
30:02
as far as my wealth
30:04
allows. I won't invite the
30:06
friends for a drinking party
30:08
if you are opposed to
30:10
it, or I'm liable for
30:12
the penalty fee of 18
30:14
solidi. Solidi. That's, oh, sorry.
30:16
For some reason I thought
30:19
that was Italian. But it's
30:21
from Egypt. It's now at
30:23
the British library. Isn't that
30:25
funny? It
30:32
was funny, I'm seeing, or
30:34
interesting, was it yesterday? I
30:36
saw a bunch of tweets
30:38
and I got to see
30:40
my old alma mater. I
30:42
graduated with an undergraduate degree
30:45
in history from McGill. And
30:47
it's interesting seeing all the
30:49
streets I used to walk
30:51
and them taking over the
30:53
buildings and so on. It's
30:55
pretty wild man. Now did
30:58
you know this? Hit me
31:00
with a Y. If
31:03
you are into anime, or know much
31:05
about anime, I don't. Not because there's
31:07
anything cool about it, I just, it
31:09
was not my demographic, not my generation,
31:11
really. But hit me with a Y,
31:14
if you know much or anything about
31:16
the exciting world of anime and manga.
31:18
Anime is the cartoons and manga
31:20
the comics. Is
31:33
that right, something like
31:35
that? I've not heard
31:38
of Brian Tracy, I'm
31:40
afraid. I'm not
31:42
really half-way, yeah. 50-50.
31:44
So this is wild.
31:46
It's sort of like
31:48
when you look into
31:50
how much money video
31:52
games make compared to
31:54
movies, it's just insane,
31:56
right? So this guy
31:58
wrote: Whether you like
32:00
it or not, it
32:02
is undoubtedly a huge
32:04
success story, what anime
32:06
has become. In 2024,
32:08
it produced a whopping
32:10
estimated $35 billion. For
32:12
comparison in box office
32:14
sales, the entire American
32:16
movie industry generated $8.7
32:19
billion in 2024. It's
32:21
almost four times. At
32:23
the core of most
32:25
Japanese storytelling reflected in
32:27
their anime, is honor,
32:29
courage, friendship, and hard
32:31
work. Is it a
32:33
surprise it is poised
32:35
to surpass the American
32:37
movie slash TV industry
32:39
which produces soulless, corporate,
32:41
agenda-driven, dispassionate, contrived, and
32:43
predictable content? The Japanese
32:45
storytelling motto never give
32:47
up, American storytelling motto,
32:49
life is meaningless. Isn't that
32:51
wild? There was of
32:53
course about a billion
32:55
memes about this. The
32:58
spectator index wrote, just
33:00
in, the US has
33:02
imposed tariffs on the
33:04
Australian territory of herd
33:06
and McDonald Islands, which
33:08
is uninhabited by humans,
33:10
but has colonies of
33:12
seals and penguins. There's
33:14
always a joke, the
33:16
penguins are saying, you
33:18
think you can deport
33:20
us? We've been dealing
33:22
with ICE for centuries.
33:24
Yeah. Of
33:28
course, the reason why you have
33:30
to include these things is that
33:33
if you don't include them, then
33:35
people would just set up little
33:37
shops or corporations there and then
33:39
be exempt from the tariffs. So
33:41
that's kind of how it works.
33:43
But yeah, of course, people do
33:45
find it funny and I get
33:47
that it's funny, but it's mid-wit
33:49
humor to put it mildly. Somebody
33:53
wrote, if you would
33:55
write out every number,
33:58
one, two, three, etc.
34:00
You wouldn't use the
34:02
letter B until you
34:04
have reached one billion.
34:07
One billion. That's
34:09
true. I guess that's
34:11
true, right? Now I'm
34:13
sure you remember me
34:16
talking about how sort
34:18
of pretty fake the
34:20
economy is as a
34:22
whole in America. Well,
34:25
most Western countries. So
34:27
this guy wrote, almost
34:29
every year since 2008
34:31
the economy didn't really grow if you take out
34:34
increases in government spending.
34:36
And he wrote, that's
34:38
what radicalized me. Very
34:40
true. And the gut has
34:45
about 500 million neurons,
34:47
more than the spinal
34:49
cord. It actually produces
34:51
90% of the body's
34:54
serotonin, which affects mood.
34:54
This is why the
34:56
gut feelings are a
34:58
literal thing. That's very
35:00
true. Gut instincts, gut
35:03
feelings and so on.
35:05
Very interesting. I'll
35:07
close with a great
35:09
quote from Stephen Moore, who
35:12
was saying, he said,
35:14
our tariffs, this is
35:16
the US, our tariffs
35:18
are about three to
35:20
five percent, and many
35:23
other countries are above
35:25
20 percent. The
35:28
US does have the lowest tariffs
35:30
virtually in the world and as
35:33
Trump has been saying these other
35:35
countries are ripping us off. In
35:37
2023, the United States had the
35:40
lowest trade barriers among G20 nations
35:42
and imposed lower tariffs than most
35:44
of them. Trump's right, we don't
35:47
have free trade and America is
35:49
getting ripped off. All
36:00
right, we'll get back to
36:02
your questions and comments. Can
36:04
we do a vote on
36:06
whether we want Stef to
36:08
watch and review an anime?
36:10
I'd do that. I'll tell
36:13
you what, just hit me
36:15
with some animes. I used
36:17
to watch Dragon Ball Z.
36:19
I've never seen an anime.
36:21
Anime still tells stories; Western
36:23
movies etc. are about the message
36:25
and subversion. Yeah, true. There's
36:27
nothing organic in Western art
36:29
anymore. Hasn't been for decades.
36:32
Um, of Robo says, I
36:34
think that the anime motto
36:36
is actually, this series never
36:38
ends. They go on forever.
36:40
Good evening, James. Nice to
36:42
have you. You all right? There
36:44
are strange elements in anime
36:46
series, but honor, loyalty...
36:48
I remember really enjoying seeing
36:50
it portrayed. Okay, let me
36:53
just make some notes of
36:55
these. We got Attack on
36:57
Titan. And what's the other
36:59
one here? Death note. Death
37:01
note. Is that a movie?
37:03
Attack on Titan is very
37:05
violent. Grave of the Fireflies. I
37:07
used to when I was
37:09
a kid, when we moved
37:12
to Canada. There used to
37:14
be show on in the
37:16
morning from 8:30 until 9.
37:18
I never got to watch
37:20
the end of it because
37:22
I had to get to
37:24
school. Oh! Star Blazers! Star
37:26
Blazers! Star Blazers! We're off
37:28
to outer space to save
37:30
the human race! Anyway, it
37:33
was, uh... It was fun.
37:35
And it was half... I
37:37
guess it was half anime
37:39
and half not anime, because
37:41
some of the characters were
37:43
more realistic and some were
37:45
less. Yeah,
37:51
I don't want any
37:53
like sex tentacle stuff.
37:55
Thank you very much
37:58
Yeah, I don't want
38:00
anything too violent. I
38:02
don't want anything with
38:04
this grotesque, you know,
38:06
squealing Japanese girls being
38:08
penetrated by weird squid
38:11
beasts, I don't want
38:13
any of that nonsense.
38:15
You know, if you're
38:17
going to give me,
38:19
you know, good honor-based
38:21
and all of that,
38:24
then give me that.
38:26
If you could. Thank
38:28
you. And
38:33
I watched it on
38:35
the recommendation of an
38:37
FDR listener, I don't
38:39
know, 15 years ago,
38:41
found it a little
38:43
too weird. I watched
38:45
it again, didn't find
38:47
it that weird. Maybe
38:49
I've just become less
38:51
sensitive to weird. But
38:53
yeah, but it's not
38:55
anime? I guess it's
38:57
anime, right? Well, okay,
38:59
what is the definition
39:01
of anime? What is
39:03
anime? It's not all
39:05
Japanese comics, is it?
39:07
I've heard a lot
39:09
of references to Tengen
39:11
Toppa Gurren Lagann. I
39:14
think I'm having a
39:16
stroke. I'll make a
39:18
note of that one
39:20
too. I might have
39:22
to ask my daughter.
39:24
Not that she's into
39:26
anime, but she might
39:28
know people who are.
39:34
Yuki, I'll make
39:36
a note of
39:39
that. Thanks. Cowboy
39:42
Bebop? I've
39:44
heard of that.
39:47
It's a space
39:50
western following outlaws;
39:53
might be worth
39:55
watching a couple
39:58
of episodes to
40:01
see what you
40:04
think. Naruto
40:06
is another one?
40:09
Okay. Thank you.
40:12
Are they hard
40:15
to get these?
40:29
Yeah, but attack on
40:31
Titan is the violent
40:33
one, right? Cowboy Bebop
40:35
is similar to Firefly.
40:37
Yeah, I was still
40:40
going to say that,
40:42
right? Cowboy Bebop. Okay,
40:44
I will check that
40:46
out. Thank you. Somebody
41:07
says, my brother was
41:09
big into Naruto, follows
41:11
a group of children
41:13
through moral and otherwise
41:15
trials and tribulations. Thank
41:17
you. I appreciate that.
41:20
You just sign up
41:22
for Crunchyroll to
41:24
get most anime. Crunchyroll.
41:26
All right. Gets
41:28
most anime. I'll
41:30
check that out. Thank
41:32
you. I assume they
41:34
have a trial. Thank
41:40
you for the link to
41:42
Yu-Gi-Oh, Yu-Gi-Oh, Yu-Gi-Oh. I'm
41:45
going to have to look
41:47
up how to pronounce these
41:50
things, I assume. Thank you.
41:52
Oh, that's on Netflix. Okay,
41:55
I'll check it out. All
41:57
right. On the topic of
42:00
products of Japan, have you
42:02
seen people complaining because
42:05
the Nintendo Switch 2 is
42:07
delayed because of the tariffs.
42:10
Ah, Grave of the Fireflies. Yeah,
42:12
my mother would have been
42:15
one of those. Well, not
42:17
in Japan, but in Germany.
42:20
Same, same kind of thing.
42:22
Thank you. somebody
42:34
says you can strike your child
42:36
but not your dog. Yet you
42:38
can leave your dog in the
42:40
car but not your child, what
42:42
a confused species humanity is. Yeah.
42:44
If we took the love that
42:46
people had for their pets and
42:48
put it into our children, we
42:51
would have a transformed planet I
42:53
think. A truly transformed planet.
42:55
And if you're listening to this
42:57
later then you want to suggest
42:59
an anime. You can just email
43:01
me: host, H-O-S-T, at free
43:03
domain. Akira
43:06
is a much loved
43:09
anime movie. Thank you.
43:12
All right, appreciate that.
43:14
Thank you. All right,
43:17
so I'll move on.
43:19
I've got more than
43:22
enough, but I appreciate
43:25
that. I appreciate that.
43:39
All right, any other questions,
43:41
thoughts, issues, challenges, problems? Very
43:43
much appreciate you guys dropping
43:46
by tonight. Ghost in the
43:48
Shell is also a granddaddy
43:50
anime. Hide in your
43:53
shell! Thank you, I appreciate
43:55
that, I'll grab that. Sorry,
44:14
this is a... Let's see here.
44:16
Do you know of Alain de
44:18
Botton? Alain de Botton
44:20
of School of Life? He recently
44:22
appeared on Chris Williamson Show and
44:25
had a marvelous conversation. Why does
44:27
that seem familiar? How's the ear buzz
44:29
thing doing? It was better earlier
44:31
today. I did a show on
44:34
tariffs for donors earlier today and
44:36
then I put these clamp-on
44:38
headphones, so I think it kind
44:40
of irritated the ear buzz. So... It's
44:42
not as good this evening, but
44:45
it was certainly fine earlier today.
44:47
So it's, you know, like sometimes
44:49
things, physical things, they
44:51
improve in a kind of zigzag, better,
44:53
worse, better, worse, but generally, you
44:56
know, two steps forward, one step
44:58
back. So it's certainly better. All
45:00
right. And somebody writes,
45:02
the one thing good people could learn
45:05
from bad people is to just do
45:07
what they want instead of limiting themselves.
45:09
Like not doing something that isn't harmful
45:12
just because they need permissions for example.
45:14
They are children and need to grow
45:16
up. Hmm. I'm not sure what that
45:19
means. When you did sales in
45:21
your software company, what do you think
45:23
is a reasonable sales rate? One to
45:26
five percent of companies who reached out
45:28
purchased? Yeah, yeah, yeah, I would
45:30
say one to five percent is about
45:33
right. Especially it was less in the
45:35
very beginning because we were just starting
45:37
out and the software that I wrote
45:40
was new to the field. There was
45:42
nobody else doing it when we started.
45:44
So we had to really educate. You
45:47
know, if you have software as a
45:49
service, it's kind of people understand it
45:51
and you're just one of many, right?
45:54
A match-three game, everybody kind of
45:56
understands that as a whole, they know what
45:58
they're talking about. But when you have
46:01
an entirely new software offering, that's tough.
46:03
That's tough. Because you have to educate
46:05
people that it even exists. It's
46:07
not a market. And then you spend
46:10
a little bit of time building that
46:12
up and then what happens is larger
46:14
companies say, oh, there's a market
46:17
for us and they come in with
46:19
all of their experienced sales people and
46:21
existing customer contracts and all of that.
46:24
So yeah, it's quite a lot. But
46:26
yeah, as I say, I didn't
46:28
really do much cold calling. That was
46:31
generally the job of the sales people.
46:33
I was director of marketing, so I
46:35
created a knowledge of the software space
46:38
as a whole. But I did not
46:40
in general do the cold calling. I
46:42
did, oh, I don't know, this is
46:45
a while ago, so what I did
46:47
was there are databases where you can
46:49
find out a lot about public companies,
46:52
right? All the public companies publish all
46:54
of their inner profits and losses and
46:56
income and expenditures and all of that.
46:59
And so I wrote code to get
47:01
that from a database and then create
47:03
mailouts that would go out to clients
47:06
that would say, you know, based upon
47:08
your expenditures of this and this, based
47:10
upon the cost savings that we can
47:13
provide because of this, this and this.
47:15
you know we can make a very
47:17
strong business case that our software can
47:20
pay for itself within 12 to 18
47:22
months and after that it's pure profit
47:24
and I had graphs and charts and
47:27
everything was just beautifully formatted reports that
47:29
went out to like a thousand different
47:31
companies, and that was huge, that was
47:34
huge for us, because
47:36
I had coded it all, so it really
47:38
looked like we had individually prepared a
47:41
detailed presentation for each company. That's
47:43
what I did to raise awareness. And of course
47:50
I would go to conferences and I
47:52
would chat with people and we'd have
47:54
a little jar, you put your business
47:57
card in and then you win an
47:59
iPod. At the end of the
48:01
day, this is back when they were hard,
48:03
hard, hard drive IPots. So, I did a
48:05
lot of travel, I presented a lot of
48:08
places, and I would generally go down and
48:10
do the presentation of the software when the
48:12
sales people had it and then let them
48:14
hammer out the sort of business details, which
48:16
was not always wise. But, yeah, you just
48:18
have to get used to it. There's really a
48:20
lot of rejection, right? You just have to,
48:22
you know, think of all of the women
48:24
that you find unattractive,
48:26
and then you'll understand
48:28
that there are women who
48:30
find you unattractive, and it's not
48:32
terrible or bad, it's just not a particular
48:34
type of thing. So think of all
48:36
the people who would love to sell you
48:39
stuff, think of all the emails that come
48:41
in, all the text messages, or whatever you've
48:43
got, right? Think of all the
48:45
people who would like
48:47
to sell you stuff, and you say no.
48:49
So you just have to recognize that you
48:51
just try to find the right fit for
48:53
people to really get value out of what
48:55
it is that you're doing. Yeah,
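The automated mailout workflow he just described, pulling public financial figures, computing a cost-savings payback case, and generating a personalized letter per company, could be sketched roughly as below. Every name, rate, and dollar figure here is invented for illustration; the original system and its numbers are not public.

```python
# Hypothetical sketch of the data-driven mailout idea: public financial
# figures go in, a personalized payback estimate and letter come out.
# The cost, savings rate, and companies below are all invented.

SOFTWARE_COST = 250_000        # assumed one-time cost of the software
SAVINGS_RATE = 0.002           # assume 0.2% of expenditures saved per year

def payback_months(annual_expenditure: float) -> float:
    """Months until the estimated savings cover the software cost."""
    monthly_savings = annual_expenditure * SAVINGS_RATE / 12
    return SOFTWARE_COST / monthly_savings

def render_letter(company: str, annual_expenditure: float) -> str:
    """Mail-merge one letter from a company's reported figures."""
    months = payback_months(annual_expenditure)
    return (
        f"Dear {company},\n"
        f"Based on your reported expenditures of ${annual_expenditure:,.0f},\n"
        f"our software could pay for itself in roughly {months:.1f} months."
    )

# One record per company, as might come out of a public-filings database.
for company, spend in [("Acme Corp", 120_000_000), ("Globex", 250_000_000)]:
    print(render_letter(company, spend))
    print()
```

Run over a thousand records, each letter looks individually prepared even though it is one template plus one database query.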
49:02
so in Canada, this is back in
49:04
the day, there's these things called phase
49:06
one environmental site assessments. So if you're
49:09
buying some land, you want to make
49:11
sure there wasn't like a battery plant
49:13
or a gas station or something with
49:16
a lot of pollution because then you
49:18
buy the land and then you dig
49:20
down, you find a bunch of crap,
49:23
you've got to have it all remediated
49:25
and cleaned up and cleaned out. So
49:27
you do this sort of site assessment
49:30
and our software automated that process to
49:32
a large degree. and then you could
49:34
figure out whether you wanted the site
49:36
or not, whether it had, whether they
49:39
were underground storage tanks there before that
49:41
you need to remove or even above
49:43
ground storage tanks sometimes. So whether there
49:46
were any kind of VOCs, volatile organic
49:48
compounds that were used on the site,
49:50
that kind of stuff, right? So you
49:53
could have a real portfolio of all
49:55
of your environmental liabilities and issues, which
49:57
was very important for legal reasons. And
50:00
yeah, Superfund, yeah, like a Superfund
50:02
site, right. Now of course the
50:04
Superfund sites were 80% of the money,
50:07
just went to lawyers, didn't even go
50:09
to clean up. Love Canal? You got
50:11
to look into Love Canal. It's not
50:13
a, I mean it was a very
50:16
striking name, but it's not, probably not
50:18
exactly what you remember. Love Canal was
50:20
a little overhyped, overpumped, so
50:23
to speak, if you're pumping on the
50:25
Love Canal. Hey, we're back to AI.
50:27
AI OnlyFans. Now,
50:30
I remember the very first
50:32
software that I sold, I
50:35
remember that, it sold for
50:37
$5,000. Seems like all the
50:40
money in the world. So,
50:42
I did $5,000 and then
50:45
I think one of the
50:47
later ones was 1.25 million
50:50
US. So, we had some
50:52
growth. We had some growth.
50:55
And
51:05
I did a lot of work.
51:07
I built this whole thing called
51:09
the database builder, where you would
51:11
go down with the client and
51:13
you'd say, okay, so what do
51:16
you want the database to look
51:18
like? What matches
51:20
your data? What do we have
51:22
to integrate with? And I wrote
51:24
a whole program where the client
51:26
would fill out spreadsheets or maybe
51:28
do it online. And then when
51:31
they had put together all of
51:33
the changes they wanted to the
51:35
system, then my code would go
51:37
and... change the system for them.
51:39
Change the data fields, the tables,
51:41
the queries, the forms, the reports,
51:43
the query forms, the whole thing.
51:46
It was wild. And it would
51:48
change also everything on the web
51:50
interface because it was all metadata.
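The metadata-driven "database builder" he describes can be sketched in miniature: capture the client's requested fields as plain data, then generate the schema from that data, so the tables, forms, reports, and web interface all stay in sync with one definition. This is a hedged sketch using SQLite; the field names and the original product's actual stack are invented here for illustration.

```python
# Hypothetical sketch of a metadata-driven schema builder: the client's
# requested fields live as plain data, and code generates the schema.
import sqlite3

FIELD_METADATA = [
    # (field name, SQL type, required?)
    ("site_name",  "TEXT",    True),
    ("inspected",  "DATE",    False),
    ("tank_count", "INTEGER", False),
]

def build_create_table(table: str, fields) -> str:
    """Turn field metadata into a CREATE TABLE statement."""
    columns = ", ".join(
        f"{name} {sql_type}" + (" NOT NULL" if required else "")
        for name, sql_type, required in fields
    )
    return f"CREATE TABLE {table} ({columns})"

# Apply the generated schema; in the real system the forms, reports, and
# web pages would be generated from the same metadata, so one edit to
# FIELD_METADATA updates everything consistently.
conn = sqlite3.connect(":memory:")
conn.execute(build_create_table("sites", FIELD_METADATA))
print(build_create_table("sites", FIELD_METADATA))
```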
51:52
It was a really great, great
51:54
code. Let's see here. If you
51:56
were 24 and had $10,000, would
51:58
you put it towards therapy or
52:01
towards... a business? Well, I would
52:03
say that if I didn't have
52:05
a business partner, I would put
52:07
it towards therapy, unless I'd had
52:09
a pretty good childhood, in
52:11
which case, I would put it
52:14
towards the business. So there's a
52:16
lot of variables involved in that.
52:18
Therapy was really some of the
52:20
best money I'd ever invested or
52:22
spent in my life. I would
52:24
not have been able to get
52:26
married to my wife if I
52:29
had not gone through therapy, at
52:31
least I don't think so. So
52:33
if you had a bad childhood,
52:35
I think putting some money into
52:37
therapy is a good idea. Starting
52:39
a business at 24 without a
52:41
partner is pretty risky because there's
52:44
a lot to learn. I didn't
52:46
start a business. I co-founded a
52:48
business. I didn't start a business
52:50
entirely on my own. Oh, I
52:52
guess this one. That's... I was
52:54
older. That really depends. Sorry, I
52:56
hate to say that kind of
52:59
depends, but it kind of does.
53:01
If your childhood was okay, I'd
53:03
put it into the business, but
53:05
I would wait. I would wait
53:07
until I had a partner, somebody
53:09
who had more experience or at least
53:12
some knowledge of how to get
53:14
a business going. And if I
53:16
had a bad childhood, I would
53:18
put it into therapy. If I
53:20
were in your shoes. All right.
53:22
Any last questions? Going
53:25
once, going twice. I'm all
53:28
ears. One buzzy, one good.
53:51
All right, we've just got two
53:54
other questions. Thank you so much.
53:56
Yeah, thank you guys for dropping
53:58
by tonight. I'm sorry we're still
54:01
doing shows of less than two or
54:03
three hours, but things are... What
54:05
was the best thing you learned
54:07
in therapy? The best thing that
54:10
I learned in therapy was to
54:12
take my instincts with great seriousness.
54:14
You know, to peel you apart
54:17
from your instincts is the fundamental
54:19
goal of the propagandist and Lord
54:21
knows we're surrounded by a lot of
54:23
propaganda these days. So what I
54:26
learned, the most important thing that
54:28
I learned in therapy was to
54:30
take my instincts with great seriousness,
54:33
to take my dreams, my, you
54:35
know, that, the sense that I
54:37
was talking about earlier. Take yourself
54:39
very seriously. Take your thoughts, your
54:42
instincts, your suspicions. It's very easy.
54:44
You know, all of this language
54:46
is thrown at us to separate
54:49
us from our instincts. And, you
54:51
know, I'm not alert, I'm paranoid,
54:53
right? I'm not legitimately angry, I'm
54:55
unreasonable, I'm aggressive, or whatever, right?
54:58
You can sort of... go through
55:00
all of this language. But the
55:02
language that is hacked into
55:05
us to separate us, it's almost
55:07
like taking a sneaker and sawing
55:09
the sole off, right? But what
55:11
I learned through therapy was to
55:14
take my instincts, my gut sense,
55:16
my dreams, my intuitions. You know,
55:18
trust but verify, right? Trust but
55:21
verify. So yeah, I would trust
55:23
my instincts. I wouldn't just act
55:25
on them because we're a combination
55:27
of things, right? Mind, body, heart,
55:30
soul, that kind of stuff, right?
55:32
Somebody says I've always felt that
55:34
when I don't follow my instincts
55:37
things go wrong. Yeah, are you
55:39
ever going to come back to
55:41
X? Not sure. Somebody says, what
55:43
do you think of the avenues
55:46
by which to communicate with your
55:48
unconscious? Dream analysis, sentence-completion exercises,
55:50
that sort of thing. What are
55:53
your favorite such avenues? Dream analysis
55:55
is very tough to do on
55:57
your own. So if you've got
55:59
someone you can talk about it
56:02
with who's good at that sort
56:04
of stuff, I think that's really,
56:06
really helpful. Sentence-completion exercises, they
56:09
can be good as well. Meditation,
56:11
I think, is very good; just
56:13
physically relaxing and letting your
56:15
instincts bubble up can be very
56:18
helpful. There's
56:22
a lot of workbooks. I've talked about
56:25
these before. Nathaniel Branden has them, John
56:27
Gray has them, other people have them,
56:29
sort of workbooks to try and figure
56:31
out what's going on in your unconscious.
56:33
Very helpful. Somebody says, how can you
56:35
tell if a therapist is any good?
56:37
Had one who was encouraging me to
56:39
stay in touch with a crazy family
56:41
of origin because, quote, family. Now, I
56:43
always forget this number. I don't know
56:46
why I have this bizarre block about it,
56:48
but let me find it. Let me
56:50
find it. Let me find it for
56:52
you. All right.
56:54
I think it's
56:57
1927. Yes, FDR
56:59
1927. I'll put
57:01
the MP3 link
57:03
here in the
57:06
chat. That's my
57:08
best thoughts. My
57:10
God, how old
57:13
is that show
57:15
now? June 8th,
57:17
2011. Wow. Almost
57:19
14 years ago.
57:23
That's a lot of
57:25
dog years. But yeah,
57:27
so I have a
57:29
thought on that. Yeah,
57:32
FDR 1927, how to
57:34
find a great therapist?
57:36
I'm trying to think
57:38
of a great question
57:41
to keep you live,
57:43
ha ha ha, thanks
57:45
again, donation in a
57:47
minute. All right. I
57:54
mean, yeah, it's funny because
57:56
therapists generally are not moralists.
57:58
So, there's that challenge. I
58:00
think that they'd be better
58:02
off if they were moralists,
58:05
but that can cause some
58:07
significant problems. But they're not
58:09
moralists. So I think what
58:11
you want is a therapist
58:13
who's really focused on what's
58:15
best for you, not trying
58:17
to impose an agenda. I
58:20
think that's really good. Hey,
58:22
Steph, do you read Hoe
58:24
Math's tweets? I do. Yeah,
58:26
he's a doodler. And very
58:28
incisive and bitter, I get
58:30
it, I get it. Which
58:37
I understand is not quite
58:39
as easy for younger men
58:41
these days. Well, that wasn't
58:43
easy for me either. But
58:45
yes, I have read Hoe Math's
58:47
tweets and he's a very,
58:49
definitely a smart guy. Definitely
58:52
a smart guy. He has
58:54
a great intuitive grasp of
58:56
both female and male hypocrisy.
59:05
People are typing. I can
59:07
see it. All right. Oh,
59:10
I missed a couple here.
59:12
Have you spoken about love
59:14
bombing and red-pilling combined in
59:16
a new relationship? Thoughts if
59:19
no. Love bombing and red-pilling.
59:21
I'm not sure what you
59:23
mean, but I mean, I
59:26
know what the two terms
59:28
mean individually, but I'm not
59:30
sure how that. Yeah,
59:34
hi Steph, why do narcissists, colloquially, engage
59:36
in smear campaigns? Because they want to
59:38
protect their territory, they want to protect
59:41
their turf, and they want to protect
59:43
those they're preying upon, so they need
59:45
to keep incisive people who can expose
59:47
their methods far away from a social
59:49
group. So they will project all of
59:51
their negative characteristics onto, usually the innocent
59:54
and accuse them of things that they
59:56
themselves are doing. So it's a form
59:58
of territoriality, right? The
1:00:00
narcissists usually have
1:00:02
a kind of
1:00:04
stable of people
1:00:06
that they're exploiting
1:00:09
and they need
1:00:11
to keep those
1:00:13
with perceptiveness and
1:00:15
moral courage away
1:00:17
from their victims
1:00:19
so they can
1:00:21
continue to exploit
1:00:23
them. So the
1:00:25
smear campaigns are
1:00:27
this person is
1:00:29
bad and to
1:00:31
even question that
1:00:33
is wrong and
1:00:36
trust me bro,
1:00:38
all that kind
1:00:40
of stuff. So
1:00:42
it's just a
1:00:44
form of maintaining
1:00:46
control over the
1:00:48
people that are
1:00:50
exploiting, if that
1:00:52
makes sense. Thank
1:01:40
you very much for the donation.
1:01:42
I really really do appreciate that
1:01:45
Did you read the one about
1:01:47
video games and anime many angry
1:01:49
replies that remind me of weed
1:01:49
addicts when you point out that
1:01:53
they are addictive? Oh, thank you.
1:01:55
I will have a look at
1:01:58
that. Let's see Let's
1:02:09
see here. Okay, I will
1:02:11
read that. That's quite long
1:02:14
too. Go on this as
1:02:16
a whole. So, I mean,
1:02:18
there's a couple of things
1:02:21
I really don't like about
1:02:23
anime, which is the combination
1:02:25
of baby faces with adult
1:02:28
female bodies. I find that
1:02:30
a little creepy. And again,
1:02:32
I'm not saying that's
1:02:35
everyone, but
1:02:37
that's a lot. How
1:02:41
would you decide if
1:02:43
it's appropriate to defend
1:02:45
yourself against the campaign?
1:02:47
Well, you know, if
1:02:49
people, usually with the
1:02:51
narcissist, you don't notice
1:02:53
anything directly. You simply
1:02:56
notice a slight diminishment
1:02:58
or maybe a not
1:03:00
so slight diminishment of
1:03:02
social invitations and positive
1:03:04
feedback and curiosity and
1:03:06
contact and so on.
1:03:08
As a victim of
1:03:10
a smear campaign, then
1:03:12
things just kind of
1:03:14
seem to fall away.
1:03:16
Things just kind of
1:03:18
diminish. You don't get
1:03:20
the invites, you don't
1:03:22
like people kind of,
1:03:24
a little bit of
1:03:26
sort of closing the
1:03:28
walls and maybe somebody
1:03:30
will tell you. Well,
1:03:32
so-and-so said such and
1:03:34
such about something, right?
1:03:36
If you decide to
1:03:38
fight, it's going to
1:03:40
be extraordinarily volatile. I
1:03:42
just... say that straight
1:03:44
up. I'll just say
1:03:47
that straight up because
1:03:49
the narcissist
1:03:51
in general views
1:03:53
any challenge to control
1:03:55
and authority as a
1:03:57
battle to the death.
1:03:59
So I think you
1:04:01
want to say the
1:04:03
truth. And those people
1:04:05
who care about you
1:04:07
will listen to the
1:04:09
truth and they'll make
1:04:11
the case and so
1:04:13
on and I would
1:04:15
definitely say that it
1:04:17
is very volatile and
1:04:19
most people most people
1:04:21
in the world like
1:04:23
90% of people they
1:04:25
really hate being caught between
1:04:27
opposing moral forces.
1:04:29
They hate it. I
1:04:31
mean, it's funny because
1:04:33
everybody wants to watch
1:04:35
movies about this and
1:04:38
read stories about this
1:04:40
and the heroic and
1:04:42
fighting and good and
1:04:44
evil and so on,
1:04:46
but the reality
1:04:48
is that most people
1:04:50
are desperate to avoid
1:04:52
any kind of moral
1:04:54
danger. And in general,
1:04:57
in my
1:04:59
opinion, it's probably not
1:05:01
worth fighting for a
1:05:03
community where the
1:05:05
smearer, the rumor-spreader,
1:05:07
where that person has
1:05:10
authority and that person
1:05:12
has control It's probably
1:05:14
not worth fighting because
1:05:16
you're probably just gonna
1:05:18
lose So
1:05:24
I think it's always worth trying
1:05:27
to get the truth out. It's
1:05:29
always worth getting the truth out,
1:05:31
but in general, people, they do
1:05:33
like their gossip. And I mean,
1:05:36
it's a minor weakness of mine,
1:05:38
if that's of any consolation, but
1:05:40
people do like their gossip. And
1:05:43
one of the ways that the
1:05:45
cruel people slowly pull others into
1:05:47
doing bad things is that they
1:05:49
get them to repeat salacious gossip,
1:05:52
which is why... you know, just
1:05:54
almost every conceivable moral system in
1:05:56
the universe tells you to avoid
1:05:58
gossip like the plague. And the
1:06:01
reason for that is that when
1:06:03
you get involved in gossip and
1:06:05
you repeat things, that turn out
1:06:07
to not be true. That's really
1:06:10
what gossip is. If it's true,
1:06:12
it's not really so much gossip.
1:06:14
It's exaggerated or distorted or something
1:06:16
like that. But you can't take
1:06:19
it back. Right? Once you have
1:06:21
used your words to create the
1:06:23
impression, a negative impression of something
1:06:25
or someone in someone else's mind,
1:06:28
you can't take it back. At
1:06:30
least not without a lot of...
1:06:32
a lot of work and apology
1:06:34
and all that kind of stuff.
1:06:37
So, yeah, try to avoid those
1:06:39
kind of, I've always been very
1:06:41
careful to try and make sure
1:06:43
that I don't repeat anything that
1:06:46
doesn't seem to be pretty true.
1:06:48
I mean, Lord knows. It's happened
1:06:50
to me once or twice over
1:06:53
the years. But
1:06:55
you can defend yourself, it's worth getting
1:06:57
the truth out, but for the most
1:06:59
part, most people will simply, most people
1:07:01
will simply bow to whoever has the
1:07:04
most power. Most people will simply bow
1:07:06
to whoever has the most power, and
1:07:08
is the most willing to use it.
1:07:10
This is why, you know, bad people
1:07:12
kind of run the world and good
1:07:14
people hide like mammals at the feet
1:07:17
of dinosaurs. Because most people will... Simply
1:07:19
say, oh, well, you're a nice person,
1:07:21
you're a reasonable person, so you're not
1:07:23
going to attack me if I disagree
1:07:25
with you, this person is a very
1:07:27
crazy, aggressive person, so they will attack
1:07:30
me if I disagree with them, so
1:07:32
I'm afraid I'm going to have to
1:07:34
side with them against you, and it's
1:07:36
just the way that it is. And
1:07:38
until childhood is generally improved as a
1:07:40
whole, it's probably going to maintain itself
1:07:43
as the standard. All
1:07:56
right. I
1:07:58
think... Somebody
1:08:02
says, I used to think freedom
1:08:04
was the most important thing in life.
1:08:07
Now I realize it is the
1:08:09
truth. Otherwise you end up with
1:08:11
fake freedom, fake love, fake health.
1:08:13
Very true. Yeah. Yeah, Rachel.
1:08:16
All right, well, thank you everyone
1:08:18
so much for a lovely evening of
1:08:20
philosophy. Have a beautiful, beautiful
1:08:22
night. We'll talk to you on Sunday
1:08:24
morning. And thank you to all of the
1:08:26
people who are showing interest in
1:08:28
my new book. I am working
1:08:30
hard on it and I appreciate
1:08:32
everybody's thoughts about it. I
1:08:34
love writing fiction. It is such a, I mean,
1:08:36
it's a challenge
1:08:38
for sure, but I really do.
1:08:40
I really do love it.
1:08:42
And everyone who listened is really enjoying
1:08:44
the new book. I'll give you guys some
1:08:46
more when I'm ready to roll.
1:08:48
And if you'd like to help
1:08:50
out the show: freedomain.com.
1:08:52
Take care. Have a great
1:08:54
night. Bye.