Episode Transcript
0:00
Banking with Capital One helps you
0:03
keep more money in your wallet
0:05
with no fees or minimums on
0:07
checking accounts and no overdraft fees.
0:09
Just ask the Capital One Bank
0:11
Guy. It's pretty much all he
0:13
talks about. In a good way.
0:15
He'd also tell you that this
0:17
podcast is his favorite podcast too.
0:19
Ah, really? Thanks Capital One Bank
0:21
Guy. What's in your wallet? Terms
0:23
apply. See capitalone.com/bank. Capital One,
0:25
N.A., Member FDIC. In
0:32
this episode, you will be listening
0:34
to Mental Mastery vs Mental Slavery: Why
0:36
99% Stay Trapped, with Yuval Noah
0:38
Harari. Get
0:41
access to the Resilient Mind Journal by clicking the
0:43
link in the show notes. Well,
0:47
I think this is maybe the
0:49
most important thing to know about living
0:51
right now in the 21st century
0:53
that we are now hackable animals.
0:55
We have the technology to decipher what
0:59
you think, what you want
1:01
to predict human choices, to manipulate
1:04
human desires in ways which
1:06
were never possible before. Basically,
1:09
to hack a human being, you need
1:11
two things. You need a
1:13
lot of data, especially
1:15
biometric data, not just about where you
1:17
go and what you buy, but
1:19
what is happening inside your body and
1:21
inside your brain. And secondly,
1:23
you need a lot of computing power to
1:26
make sense of all that
1:28
data. Now previously in
1:30
history, this was never
1:32
possible. Nobody had enough
1:35
data and enough computing power
1:37
to hack human beings.
1:39
Even if the KGB or
1:41
the Gestapo followed you
1:43
around 24 hours a day,
1:45
eavesdropping on every conversation
1:47
you had, watching everybody you
1:49
meet, still they did
1:51
not have the biological knowledge
1:54
to really understand what's happening
1:56
inside you and they
1:58
certainly didn't have the computing
2:00
power necessary to make sense even
2:02
of the data they were
2:05
able to collect. So
2:07
the KGB could not really
2:09
understand you, could not really
2:11
predict all your choices or
2:13
manipulate all your desires and
2:15
so forth. But now it's
2:17
changing. What the KGB
2:20
couldn't do, corporations and
2:22
governments today are beginning to
2:24
be able to do. And
2:26
this is because of the
2:28
merger of the revolution in
2:31
biotech. We are getting better
2:33
in understanding what's happening inside
2:35
us, in the body, in
2:37
the brain. And at
2:39
the same time, the revolution
2:41
in infotech, which gives us
2:43
the computing power necessary. When
2:45
you put the two together,
2:47
when infotech merges with biotech,
2:50
what you get is the
2:52
ability to create algorithms that
2:54
understand me better than I
2:56
understand myself. And then
2:58
these algorithms cannot just predict my
3:00
choices, but also manipulate my desires
3:03
and basically sell me anything, whether
3:05
it's a product or a politician.
3:07
Yeah, this is one of the things you can
3:10
do. Then you can predict, you can
3:12
manipulate, you can
3:14
eventually also re-engineer
3:16
or replace. If you
3:18
really hack a system, you really
3:20
understand how it functions, then usually
3:22
you can also re-engineer it, or
3:24
you can completely replace it. And
3:27
again, one of the dangers that
3:29
we are facing today in the
3:31
21st century is that computers and
3:33
AI would be able to replace
3:35
humans in more and more tasks
3:37
and maybe push millions of humans
3:39
out of the job market as
3:41
a result. The thing
3:43
about this ability to hack
3:45
humans is that it has
3:47
also potentially tremendous positive consequences
3:49
and this is why it's
3:51
so tempting. If it
3:53
was only bad then it would
3:55
have been like an easy
3:57
deal to say, okay, we don't
4:00
want that and let's stop
4:02
researching or going in that direction.
4:04
But it is extremely tempting
4:06
because it can provide us, for
4:08
example, with the best healthcare
4:10
in history, something which goes far
4:12
beyond anything we've seen so
4:14
far. This can mean that maybe
4:17
in 30 years, the poorest
4:19
person on the planet can get
4:21
a better healthcare from her
4:23
or his smartphone than the richest
4:25
person today gets from the
4:27
best hospitals and the best doctors.
4:30
The kind of things you can
4:32
just know about what's happening
4:34
in your body is nothing like
4:36
we've seen so far. The
4:38
Stanford algorithm, actually there are a lot
4:41
of problems with that research and let's put
4:43
it aside, but the first key
4:45
message from that is
4:47
how little people actually know
4:49
about themselves. And
4:51
one of the most important
4:53
things in my life and also
4:56
I think in my scientific
4:58
career was the realization of how
5:00
little I know about myself
5:02
and humans in general. There are
5:04
so many important ideas and
5:07
important facts we don't realize about
5:09
ourselves. I was 21 when
5:11
I finally realized that I was
5:13
gay, which is, you know, when
5:15
you think about it, it's absolutely amazing. I
5:17
mean, it should have been obvious at age,
5:19
you know, 16, 15. And
5:21
an algorithm would have realized it
5:24
very quickly. And you can build
5:26
algorithms like that today or in
5:28
a few years. You just
5:30
need to follow your eye
5:32
movements, like you go on
5:34
the beach, or you look at the
5:36
computer screen and you see an
5:38
attractive guy and an attractive girl and
5:41
just follow the focus of the
5:43
eyes. Where do the eyes go and
5:45
whom do they focus on? It
5:47
should be very easy. And
5:49
such an algorithm could have told
5:51
when I was 15 that I
5:53
was gay. And
5:55
the implications are really mind-boggling
5:58
when an algorithm knows such
6:00
an important thing about you
6:02
before you know it about
6:04
yourself. Now, it can go
6:06
in all kinds of directions. It really depends
6:08
on where you live and what you
6:10
do with it. In some countries,
6:13
you can be in trouble now with
6:15
the police and the government. You
6:17
might be sent to
6:19
some re-education facility. In
6:22
some countries, like with
6:24
surveillance capitalism, so maybe
6:26
I don't know about myself that
6:28
I'm gay, but Coca-Cola knows that
6:30
I'm gay because they have these algorithms
6:32
and they want to know that
6:34
because they need to know which commercials
6:36
to show me. Let's say Coca-Cola
6:38
knows that I'm gay and I
6:40
even know it about myself that they
6:42
know it and Pepsi doesn't.
6:45
Coca-Cola will show me a commercial
6:47
with a shirtless guy drinking Coca-Cola,
6:49
but Pepsi will make the mistake
6:51
of showing a girl in the
6:53
bikini. And next day, without my
6:55
realizing why, when I go to
6:57
the supermarket, when I go to
6:59
the restaurant, I will order
7:01
Coca-Cola, not Pepsi. I don't know
7:03
why, but they know. So
7:05
they might not even share this kind of
7:07
information with me. Now, if
7:09
the algorithm does share the
7:11
information with me, again, it all
7:13
depends on context. One scenario
7:15
is that you're 15 years old,
7:17
you go to a birthday
7:19
party of somebody from your class,
7:21
and somebody just heard that
7:23
there is this cool new algorithm
7:25
which tells you your sexual
7:27
orientation. And everybody
7:29
agrees it will be a
7:32
lot of fun to just
7:34
have this game that everybody
7:36
takes turns with the algorithm
7:38
and everybody else looking and
7:40
seeing the results. Would you
7:42
like to discover this about yourself
7:44
in such a scenario? This
7:46
can be quite a shocking
7:48
experience. But even if it's
7:50
done in complete privacy, it's
7:52
a very deep
7:54
philosophical question. What
7:57
does it mean to
7:59
discover something like that about
8:01
yourself from an algorithm? What
8:04
does it mean about human life,
8:06
about human identity? We
8:08
have very little experience
8:10
with these kinds of things.
8:13
You know, from very ancient times, all
8:16
the philosophers and saints and sages tell
8:18
people to get to know yourself better.
8:20
It's one of the, maybe the most
8:22
important thing in life, is to get
8:24
to know yourself better. But
8:26
for all of history, this
8:28
was a process of
8:31
self-exploration, which you did through
8:33
things like meditation and maybe
8:35
sports and maybe art and contemplation
8:37
and all these things. What
8:39
does it mean when
8:41
the process of
8:43
self-exploration is being outsourced
8:45
to a big data
8:47
algorithm? And the philosophical
8:49
implications are quite mind-boggling.
8:52
Something as simple as choosing music,
8:54
so you were just dumped
8:56
by your boyfriend or girlfriend, and
8:59
the algorithm that controls
9:01
the music that you listen
9:03
to chooses the songs
9:05
that are the best fit
9:08
for your current mental
9:10
state. And of course,
9:12
this brings up the question of what
9:14
is the metric? What do you actually want
9:16
from the music? Do you want
9:18
the music to uplift you? Or
9:20
do you want the music to kind
9:22
of connect you to the deepest
9:24
level of sadness and depression? And
9:27
ultimately, we can say
9:29
that the algorithm can follow
9:31
different kinds of instructions. If
9:34
you know what kind of
9:36
emotional state you want to be
9:38
in, you can just tell
9:40
the algorithm what you want and
9:42
it will do it. If
9:44
you are not sure, you can
9:47
tell the algorithm, follow the
9:49
recommendation of the best psychologist today.
9:52
So let's say you have the five stages
9:54
of grief. So, okay,
9:56
walk me with music
9:58
through these five stages of
10:00
grief. And the
10:02
algorithm can do that better
10:04
than any human DJ. And
10:07
what we really need to
10:09
understand in this regard is that
10:11
what music and most of
10:13
art plays on in the
10:15
end is the human biochemical system,
10:18
at least according to the
10:20
dominant view of art in
10:22
the modern Western world. We
10:24
had different views in different cultures,
10:26
but in the modern Western
10:28
world, the idea of art
10:30
is that art is above
10:32
all about inspiring human emotions. It
10:35
doesn't necessarily have to be joy; great
10:38
art can also inspire
10:40
sadness, can inspire
10:42
anger, can inspire fear.
10:44
It can be a whole
10:46
palette of emotional states,
10:48
but art is about inspiring
10:50
human emotions. So
10:52
the instrument artists play
10:54
on, whether it's
10:56
musicians or poets or
10:59
movie makers, they're actually
11:01
playing on the Homo
11:03
sapiens biochemical system. And
11:05
we might reach a
11:07
point quite soon when an
11:09
algorithm knows this instrument
11:11
better than any human artist.
11:14
A movie or a poem
11:16
or a song that will
11:18
not move you, that will
11:20
not inspire you, might inspire
11:22
me. And something that will
11:24
inspire me in one situation
11:26
might not inspire me in
11:28
another situation. And as time
11:30
goes on and the algorithm gathers
11:32
more and more data about me,
11:34
it will become more and more
11:37
accurate in reading my biochemical
11:39
system and knowing how to play
11:41
on it as if it were
11:43
a piano: okay, you want
11:45
joy? I press this button and
11:47
out comes the perfect song, the
11:49
only song in the world that
11:51
can actually make me joyful right
11:53
now. If there is like something
11:56
seriously wrong in my body that
11:58
I don't know about, like, I
12:00
don't know, cancer or something, I
12:02
would like the algorithm to find
12:04
that out. I don't want to
12:06
wait until, I mean, the usual
12:08
process is that it has to
12:10
go through your own mind. You
12:12
can't outsource it. I mean, today,
12:15
when you need to diagnose cancer,
12:17
there are exceptions. But in most
12:19
cases, there is a crucial moment
12:21
when you feel something is wrong
12:23
in my body, and you go
12:25
to this doctor and that doctor
12:27
and you do this test and
12:29
that test until they finally realize,
12:31
okay, we just discovered you have
12:34
cancer in your liver or whatever.
12:37
But because it relies
12:39
on your own feelings, in
12:42
this case, feelings of pain, very
12:44
often it's quite late
12:46
in the process. By
12:48
the time you start feeling
12:50
pain, usually the cancer has
12:52
spread. And maybe it's not
12:54
too late, but it's going
12:56
to be expensive and painful
12:58
and problematic to treat it.
13:01
But if we can outsource
13:03
this, not go through the
13:05
mind, through my feelings. I
13:08
want an algorithm that with
13:10
biometric sensors is monitoring my
13:12
health 24 hours a day
13:14
without my being aware of
13:16
it. It can
13:18
potentially discover this liver cancer
13:20
when it's still just
13:23
tiny, when just a few cells
13:25
are beginning to split
13:27
and to spread. And
13:29
it's so easy and cheap and
13:31
painless to take care of it
13:33
now instead of two years later
13:35
when it's already spread and it's
13:37
a big problem. So this is
13:39
something that I think almost everybody
13:42
would sign on to. And this
13:44
is the big temptation because it
13:46
comes with this whole
13:48
long tail of dangers. I
13:50
mean, this algorithm, the
13:52
healthcare system knows almost everything
13:55
about you. So
13:57
one of the biggest battles
13:59
in the 21st century is
14:01
likely to be between privacy
14:03
and health. And
14:05
I guess that health is going to win. Most
14:09
people will be willing to give
14:11
up a very significant amount of
14:13
privacy in exchange for far better
14:15
healthcare. Now, we do need to
14:17
try and enjoy both worlds to
14:19
create a system that gives us
14:21
a very good healthcare but without
14:24
compromising our privacy, keeping it to: yes,
14:26
you can use the data to
14:28
tell me that there is a
14:30
problem and we should do this
14:32
or that to solve it, but
14:34
I don't want this data to
14:36
be used for other purposes without
14:39
my knowing it. Whether we can
14:41
reach such a balance, and
14:43
like, you know, have your cake and
14:45
eat it too, that's a big political
14:47
question. So our identity
14:49
is really just a story
14:51
which we constantly construct and
14:53
embellish. I mean, you can
14:56
say that the entire human
14:58
mind is a machine that
15:00
constantly produces stories and especially
15:02
one very important story which
15:04
is my story. And
15:07
different people have specialized
15:09
in different genres. Some people
15:11
build their stories as a tragedy. Some
15:13
people build their stories as a comedy or as
15:15
a drama. But in
15:17
the end, the self
15:20
is a story and
15:22
not a real thing. And
15:25
on the one hand, with all the
15:27
new technologies, you get
15:29
better and better
15:31
abilities to construct yourself.
15:34
But already today, a lot
15:36
of the work which previously
15:38
was done in the brain and
15:40
in the mind of constructing
15:43
my identity, my story, has been
15:45
outsourced to things like Facebook. That
15:48
you build your Facebook account
15:50
and this is actually outsourcing it
15:52
from the brain and you
15:54
are busy maybe for hours every
15:56
day just building a story
15:58
and becoming extremely attached to it
16:01
and publicizing it to everybody.
16:03
And you tend to make this
16:05
fundamental mistake. You think this
16:07
is really me. First
16:09
of all, if you take something
16:11
like the profile that people
16:13
create about themselves in Facebook or
16:15
in Instagram, it should be
16:18
obvious. It doesn't really reflect your
16:20
actual existence, your actual reality,
16:22
both inner reality and outer reality,
16:24
like the percentage of time
16:26
you smile in your Instagram account
16:28
is much bigger than the
16:30
percentage of time you smile in
16:32
real life. And you
16:34
go on some vacation and
16:36
you post the images from
16:38
the vacation. So usually you're
16:41
smiling in your swimming suit
16:43
on the beach with your
16:45
girlfriend or boyfriend holding this
16:47
cocktail and everything looks perfect
16:49
and everybody is so envious.
16:52
But actually you just had a
16:54
nasty fight with your boyfriend
16:56
five minutes ago. And then this
16:58
is the image that everybody
17:00
else is seeing and thinking,
17:03
oh, they must have such a wonderful
17:05
time. And afterwards, like a
17:07
year later or two years later,
17:09
you look back and this
17:11
is what you see, and
17:13
you forget what the actual
17:16
experience was like. We constantly edit
17:18
the story. Just
17:20
like the news on TV is edited
17:22
and just like, you know, it's a
17:24
bit like making a movie. Like you
17:26
watch the movie in the cinema and
17:29
everything is so seamless. Like,
17:31
yeah, this is the story, it flows. And
17:33
then when you actually see how
17:35
a movie is produced, this is insane.
17:38
Like, you have this tiny
17:40
bit of a scene. You
17:42
repeat it 50 times and
17:44
sometimes, you know, you shoot
17:46
this scene, this scene two
17:48
comes after scene one, but
17:50
actually it was filmed long
17:52
before that. So sometimes you
17:55
film the breakup of the
17:57
lovers before you film the
17:59
first meeting for all kinds
18:01
of scheduling reasons and locations. So
18:03
the end result
18:06
is completely seamless and
18:08
perfect. But it is
18:10
actually made up from all these
18:12
tiny disconnected bits that have been,
18:14
you know, this is from here
18:16
and this is from there and
18:19
we somehow glue it together and
18:21
it looks good. And it's
18:23
the same with the story of our
18:25
life. It's all kinds of
18:27
bits and pieces and only when
18:29
you tell it to yourself or
18:31
to somebody else, it kind of
18:33
makes sense. The cost
18:36
of trying to stick
18:38
with the reality as it
18:40
is is very, very
18:42
high. It's very difficult,
18:44
it demands a lot of
18:47
effort and it's often very
18:49
painful because you have to
18:51
acknowledge many things about yourself
18:53
that you don't want to
18:55
acknowledge. People have this
18:58
fantasy of going to some
19:00
retreat and just taking out
19:02
a week or two from
19:04
life to really observe
19:07
inside, to really explore
19:09
who am I, what is
19:11
my authentic self. And they have
19:13
this fantastic notion that I
19:15
will be able to finally connect
19:17
to my inner child and
19:19
I will discover my true vocation
19:21
in life and I will
19:23
discover all these wonderful things about
19:26
me. And when you actually
19:28
do it, the first
19:30
thing you usually encounter is all
19:32
the things you don't want
19:34
to know about yourself. There is
19:36
a reason that you don't
19:38
want to know them. I think
19:40
it's worth the effort, but
19:42
it's a very, very hard task.
20:04
It's pretty much all he talks about, in
20:06
a good way. He'd also tell you that
20:08
this podcast is his favorite podcast, too. Ah,
20:11
really? Thanks, Capital One Bank
20:13
Guy. What's in your wallet?
20:15
Terms apply. See capitalone.com/bank,
20:17
Capital One NA member FDIC.
20:21
What if I told you that right
20:24
now, millions of people are living with
20:26
a debilitating condition that's so misunderstood, many
20:28
of them don't even know that they
20:30
have it? That condition is obsessive compulsive
20:32
disorder, or OCD. I'm Dr. Patrick McGrath,
20:34
the Chief Clinical Officer of NoCD, and
20:36
in the 25 years I've been treating
20:38
OCD, I've met so many people who
20:41
are suffering from the condition in silence,
20:43
unaware of just what it was. OCD
20:45
can create overwhelming anxiety and fear around
20:47
what you value most, make you question
20:49
your identity, beliefs, and morals, and drive
20:51
you to perform mentally and physically
20:53
draining compulsions or rituals. Over my
20:55
career, I've seen just how devastating
20:58
OCD can be when it's left
21:00
untreated. But help is available. That's
21:02
where NoCD comes in. NoCD is
21:04
the world's largest virtual therapy provider
21:06
for obsessive compulsive disorder. Our licensed
21:08
therapists are trained in exposure and
21:10
response prevention therapy, a specialized treatment
21:13
proven to be incredibly effective
21:15
for OCD. So visit nocd.com
21:17
to schedule a free 15-minute
21:19
call with our team. That's
21:21
N-O-C-D.com.