Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
Use Ctrl + F to search
0:00
We are in a cyber war. We are in
0:02
World War 3. You're in a completely
0:04
different mindset when you're
0:06
in war. You're thinking
0:08
differently, you're acting differently,
0:10
you're paranoid, you're concerned.
0:12
The problem we have in the
0:15
United States, we have peacetime mentality.
0:17
We're acting and behaving like we
0:20
trust people. When links come in
0:22
and people click on it, that's
0:24
peacetime mentality. But the reality is
0:27
if we knew we were at
0:29
war and we had wartime
0:32
mentality, and a link came in,
0:34
we wouldn't even think of
0:36
clicking it. Mm-mm. Who
0:38
says tech can't be
0:40
human? What's
0:59
going on, Hacker Valley fam? Welcome
1:01
back to the show. With me, I
1:03
have someone on this episode
1:06
that has paved the way
1:08
for people like myself, this
1:10
quote-unquote area and genre of
1:13
cybersecurity influencers. But just
1:15
like myself, this person, my guest for
1:17
this episode, isn't just someone that talks
1:20
about cybersecurity. They have the chops, they
1:22
have the background, and I would consider
1:24
them a triple OG in the game.
1:27
My special guest for this episode is
1:29
Dr. Eric Cole. Dr. Eric Cole is
1:31
the CEO and founder of Secure Anchor
1:33
Consulting. Eric, when you and your team
1:36
reached out to us on Instagram, I
1:38
was like, Eric. But I'm glad that we
1:40
made this happen. I'm glad that I
1:42
finally get a chance to speak to
1:44
you, because I've been following you for
1:46
a long time. Most importantly, welcome to
1:48
the show. Pleasure to be here, my
1:50
friend, and I'm looking forward to some
1:53
fun conversation. Yes, sir. So let's start
1:55
way back, way, way, way back. When
1:57
I first got on the cybersecurity, you
1:59
were already a figure in
2:01
the industry. I think that you
2:03
were at SANS at the time,
2:05
but when I was doing my
2:08
research for this episode, I saw
2:10
that, you know, not only were
2:12
you at SANS for quite a
2:14
while, but your career transcends just
2:16
what I knew. Like you
2:19
worked at the CIA, DOD contractors.
2:21
So I want to start back
2:23
in the very beginning. What was
2:25
cybersecurity like when you first
2:27
learned about it? Was it hacking?
2:30
Was it phone phreaking? How did
2:32
you get into this crazy world?
2:34
Yeah, so I actually started at
2:36
the CIA in 1990. It was
2:38
January, mid-January 1990, and just sort
2:41
of put things in perspective, because
2:43
we forget how far we've come.
2:45
The World Wide Web wasn't actually invented
2:47
until '92. Like, you didn't have
2:49
a website, so the CIA did
2:52
not have a website in 1990.
2:54
And remember, invented in 1992 means
2:56
the web didn't really become mainstream until
2:58
'94-'95. And Google? It wasn't incorporated
3:00
until 1998. I mean that's crazy
3:02
like eight years later. So when
3:05
I got involved, the internet was
3:07
basically a NeXT computer. If you
3:09
remember, NeXT was the spin-off company
3:11
that Steve Jobs started when Apple
3:13
basically kicked him out. So the
3:16
internet was basically one NeXT
3:18
computer. And I remember at CIA headquarters, we
3:20
had a special area, unclass,
3:22
and everything else, and it was
3:24
basically all command line. It was
3:27
just command line, news groups, and
3:29
it was all about research and
3:31
sharing information. It was basically universities
3:33
and companies were on there. I
3:35
mean, there was no business, there
3:38
was no e-commerce, there was none
3:40
of that. And how I got
3:42
involved is, I was working and
3:44
this is all stuff I can
3:46
talk about now, but I worked
3:49
in an area in the CIA
3:51
called COVCOM, or covert communications, and
3:53
in 1990 it was all RF.
3:55
It was all radio frequency devices,
3:57
closed circuit, RF. So when you
4:00
were going in and had to exchange
4:02
information with somebody in a covert
4:04
manner, you had to be within
4:06
50, 60, 70 feet, which as
4:08
you can imagine is high risk
4:11
because if you and me are
4:13
not supposed to be associated with
4:15
each other, we're not supposed to
4:17
be, meaning any connection between me
4:19
and you could hurt you, hurt
4:22
me, put us at risk. And
4:24
we had to get within 50
4:26
feet of each other, and we
4:28
had to do that weekly. From
4:30
a pattern-matching, observing perspective, that's
4:33
huge risk, because they're gonna go
4:35
in and say, okay, how come
4:37
every Friday Eric and Ron are
4:39
in the same coffee shop? I
4:41
mean, that's how it was. This was
4:44
the Cold War. Right, 1990 Cold War:
4:46
Russia, US did not get along
4:48
at all. And we're talking about
4:50
using the internet, right? Could we
4:52
use the internet? Because the internet
4:55
allows two people to cross channels,
4:57
but they could be anywhere in
4:59
the world. So I could be
5:01
in a news group. A professor
5:03
from Russia could just happen to
5:06
be in the same news group.
5:08
And that would not look normal.
5:10
Nobody's monitoring that. Nobody was tracking
5:12
it in 1990. So like,
5:14
hmm. What if we sort of
5:17
used the internet for doing this
5:19
and I'm sitting in a meeting
5:21
and I asked a question that
5:23
basically changed the trajectory of my
5:25
life because in 1990 when I
5:27
started at the CIA I was
5:30
doing AI programming. So I wasn't
5:32
in cybersecurity. I was actually programming
5:34
and I actually built a system
5:36
that actually would help track and
5:38
monitor terrorists and actually using AI
5:41
predictive modeling, neural networks, rule-based
5:43
systems, and got a lot
5:45
of awards from the DCI and
5:47
others for doing that. So I
5:49
was not a cyber guy in
5:52
1990, but I was doing the
5:54
communication area with AI programming and
5:56
I asked a question in a
5:58
meeting that changed my life. And
6:00
this is where I come up
6:03
with my phrase that I live
6:05
by, which is smart people know
6:07
the right answer. Brilliant people ask
6:09
the right question. And I asked,
6:11
I didn't realize it at the
6:14
time, but I asked a brilliant
6:16
question, which was, okay, we're moving
6:18
systems to the internet. How do
6:20
we know they're secure? How do
6:22
we know they're safe? How do
6:25
we know that there's not somebody
6:27
watching or monitoring us? And in
6:29
the government, and most
6:31
businesses are the same, if you
6:33
ask a question that
6:36
nobody knows the answer to, you're
6:38
volunteering to solve it. So they
6:40
basically said, okay, you're now a
6:42
team of one, you're gonna be
6:44
on a four-month special project, we're
6:47
gonna give you some money, and
6:49
you're gonna go in and show
6:51
us how to create secure systems
6:53
on the internet. And back then
6:55
you had software development life cycles,
6:58
like you had ways
7:00
to develop software: you had the
7:02
waterfall model, you had the spiral
7:04
model, RAD was just coming on
7:06
the scene in 1990. So I
7:09
figured I was basically going to
7:11
create a software development life cycle
7:13
for security, of basically how can
7:15
you build secure systems: guaranteed testing,
7:17
verification, create the tests. What I
7:20
quickly realized, which we all realize
7:22
today, is you
7:24
can't prove a system is secure.
7:26
It's impossible. So that's when I
7:28
embarked on, okay, the only way
7:31
to prove a system is secure
7:33
is to try to break into
7:35
it, is to try to exploit
7:37
it. And that's when I began
7:39
a career of being a professional
7:41
hacker, where basically I was given
7:44
a lab and, I mean,
7:46
still today, you
7:48
give me hexadecimal packet decodes and
7:50
I can read it like it's
7:52
English because that's just what I've
7:55
done for like 30 years. People
7:57
think I'm crazy, which I probably
7:59
am, but like packet decodes, protocols,
8:01
ports, services, exploitation. I developed my model,
8:03
where it's basically: you need visible
8:06
IPs, open ports, weaknesses in servers,
8:08
exploitation, create backdoors, covert operations.
8:10
That's where I developed basically my offensive
8:12
mindset that really sort of
8:14
set the stage for the rest
8:17
of my career. I love that
8:19
story. I love it because that's
8:21
you know I feel like in
8:23
the best ways people often have
8:25
the opportunity thrown in their lap.
8:28
It's almost like the hand of
8:30
God, ex machina: when you want
8:32
something, like I want to know
8:34
how this thing is, how these
8:36
computer systems are not going to
8:39
be hacked or how they're going
8:41
to be secure, someone says, well,
8:43
here you go. Here's the answer,
8:45
but the answer is within you.
8:47
I think that's powerful. And what
8:50
I also loved about the story
8:52
is the focus on the fundamentals.
8:54
I talk about it all the
8:56
time with my family. I
8:58
have a mentoring session tomorrow with
9:01
a young 22-year-old, and I was
9:03
talking to him about all of
9:05
the fundamentals. If you can understand
9:07
how your web browser speaks to
9:09
Google.com and all of the things
9:12
that happen in between, just by
9:14
definition, you now know a lot
9:16
of information. And just knowing that
9:18
makes you eligible, not only for
9:20
jobs, but for opportunity. Exactly. And
9:23
that's my whole thing where my
9:25
career, my life, how I do
9:27
everything is really, I drive my
9:29
kids crazy because I'm like back
9:31
to the basics and keep things
9:34
simple. Like when my life starts
9:36
getting crazy, and I've been there,
9:38
where I mean I had so
9:40
many apps, so many texts, so
9:42
many computers, so many cars and
9:45
like my life was out of
9:47
control and I'm just like simplify.
9:49
What is the simple basic things
9:51
I need to live my life?
9:53
Let me do that for a
9:56
month. So, stay humble, get back
9:58
to basics, very simple. Same thing
10:00
with eating. Like, I don't believe
10:02
in complex diets. Just go back
10:04
to the basics. If it has
10:06
one ingredient, you eat it. Beef
10:09
has one ingredient. Guess what? Processed
10:11
cheese doesn't have one ingredient. Broccoli,
10:13
one ingredient. So you just go
10:15
back to simple and it's funny,
10:17
your body, your life and your
10:20
mind can reset and readjust because
10:22
we're really simple creatures. We're not
10:24
meant for, like, everything on
10:26
the internet and social media and
10:28
this fast-paced world. Our minds are
10:31
like melting down because we were
10:33
not created for complexity. And at
10:35
some point we always end up
10:37
back to the basics, back to
10:39
the root of either a question
10:42
or a solution or an answer.
10:44
And I would imagine in some
10:46
ways you might feel that way
10:48
because you mentioned that you were
10:50
focused on AI development. And now
10:53
2025, everybody can't get enough of
10:55
AI and you were looking at
10:57
neural networks well before they became
10:59
mainstream. What is your
11:01
reflection on where we were at
11:04
then and where we're at now?
11:06
So the big reflection is, which
11:08
I think people are missing to
11:10
our detriment, AI, artificial intelligence, whatever
11:12
discipline, whether you're talking about neural
11:15
networks, machine learning, rule-based systems, artificial
11:17
intelligence is about one simple thing: data
11:19
sets. The better data you can
11:21
give it, the better the artificial intelligence
11:23
will be. It's not about the
11:26
algorithms, it's not about open AI
11:28
versus Grok 3, it's not about NVIDIA
11:30
with the GPUs and 100 million
11:32
dollars, it's basically the data. If
11:34
you give AI enough data, it
11:37
will become smarter and smarter and
11:39
can simulate artificial intelligence. And to
11:41
me, we're in this very dangerous
11:43
spot because we are sharing our
11:45
entire lives online. Look at what
11:48
people post on social media, public
11:50
media, pictures of their family, pictures
11:52
of their kids. I mean, it's
11:54
insane. Why would you give that
11:56
away publicly? And the more data...
11:59
AI has, the more we're giving
12:01
it, the smarter it can be.
12:03
And we're at a stage where
12:05
it's scary because you watch movies
12:07
from the 80s that you think
12:10
are totally science fiction. And we're
12:12
basically creating digital twins.
12:14
And if we're not careful, these digital
12:17
twins will be as smart and as
12:19
brilliant as us and essentially
12:21
can make us obsolete. Maybe, yeah,
12:23
and like we were saying, might be even
12:26
better. Sometimes when I am speaking to someone,
12:28
or especially, like, you know, on the line
12:30
with the customer support rep, I do think
12:32
about all the other things I could be
12:35
doing with my time, I wish it would
12:37
hurry up, like, come on, get going, and
12:39
we're seeing that now, especially with things like
12:41
chat bots. Chat bots are becoming better and
12:44
better, right now, there's still a little bit
12:46
of a friction to work with them, but
12:48
I could see a world in 10, 15
12:50
years, maybe even sooner, where
12:52
chat bots are performing at
12:55
the same level as a human
12:57
rep. I mean, just pick
12:59
anyone, any political figure. They
13:02
have digital twins that are
13:04
creating alternative content.
13:06
Like, I won't get political,
13:08
I'll just sort of talk
13:11
factual, but the president of
13:13
the country gave a speech
13:15
on Tuesday. There are clips of
13:17
him giving the speech, and
13:20
it looks like him, I mean
13:22
there's no glitches, there's
13:24
nothing, except it's
13:26
saying alternative information,
13:28
they've already done a
13:30
deepfake of it. So now I have
13:32
a situation where I go in
13:35
to my browser and there are
13:37
three versions of his speech.
13:39
One is real, it actually
13:42
happened, the other two are
13:44
alternative manipulations, and here is
13:46
the problem. I'm a smart
13:49
person. I work in this field. I
13:51
know how to do analytics. I
13:53
can't tell the difference. And
13:55
now that's scary. So now the
13:57
question is, who do we believe?
14:00
And who is really leading when
14:02
people watch this stuff. So we're
14:04
in a really scary place where
14:06
our laws haven't kept up, and
14:08
this is the problem. If I
14:10
go in in the real world
14:12
and I say false things about
14:14
you, I go in and I
14:17
say, Ron is a criminal, or
14:19
Ron is this or Ron is
14:21
that, and it is not true,
14:23
there's defamation laws. You can actually
14:25
file civil or criminal action against
14:27
me, and I could potentially either
14:29
get financial fines or, depending on
14:32
the severity of it, jail time
14:34
for it. So I'm prohibited from
14:36
doing it. Problem is, when they
14:38
wrote the Constitution, I'm pretty sure,
14:40
Ron, you probably agree with me
14:42
on this, I don't think George
14:44
Washington had an iPad, like I
14:47
wasn't there, I could be wrong,
14:49
but I'm pretty sure he didn't
14:51
have an iPad, and he wasn't
14:53
thinking this, and the problem is
14:55
our current laws don't cover it.
14:57
So if I go in and
14:59
I create a video of you
15:02
that puts you in an incriminating
15:04
situation that embarrasses you, hurts your
15:06
reputation, causes issues with your family.
15:08
It's not illegal. You can't do
15:10
anything against it. And that's the
15:12
problem today is we're in a
15:14
lawless society on the internet and
15:17
it is scary my friend. The
15:19
internet's always been one of those
15:21
places. It's been a little lawless.
15:23
I was telling my wife a
15:25
few months ago about... the lawless
15:27
internet that I grew up in.
15:29
Like there was very little
15:31
censorship, very little tracking, so you
15:34
could really be anonymous to a
15:36
great degree, in a great type of
15:38
way. But now it is a
15:40
little bit more surveilled, but you're
15:42
right. It's still lawless because in
15:44
the 2024 election, there was someone
15:46
that used a deep fake of
15:49
Biden. And he got fined, I
15:51
think like five million dollars. And
15:53
you know what he said? I'd
15:55
do it again, it was worth
15:57
it. Even though he's facing this
15:59
punishment and... you know, could be
16:01
in trouble. For some people, even
16:04
if you slap them with a
16:06
monetary challenge in front of them,
16:08
they might say, I don't care.
16:10
My mission was accomplished. I did
16:12
what I felt as though was
16:14
the right thing to do, even
16:16
though you're violating someone's trust and
16:19
their reputation. Yeah. And the thing
16:21
with that though, that I just
16:23
want to highlight is, he actually
16:25
was threatened with a fine, it
16:27
wasn't actually levied on him. Exactly.
16:29
So it now is in
16:31
the courts, and the judge has
16:34
to decide whether it's valid, and
16:36
depending on what judge you get
16:38
in front of, I mean, the
16:40
judge might say he didn't break
16:42
a law. Because that's really
16:44
the issue: you can go
16:46
in and file a civil suit
16:48
against him for five million dollars,
16:51
and you can say it was
16:53
damaging, you could say it put you
16:55
in a false light, you could say
16:57
all that stuff, but for it
16:59
to see the light of day,
17:01
You have to point to a
17:03
law that was broken. And I
17:06
don't know any laws that cover
17:08
that. So yes, he was slapped
17:10
with a fine, but I can't
17:12
see this one going anywhere. And then
17:14
even if the judge allows it,
17:16
you have to convince a jury
17:18
of your peers. Now, if you
17:21
were sitting on that jury and
17:23
you saw that this individual did
17:25
that. I mean, I could see
17:27
people go either way. Like, it
17:29
depends on who you get on
17:31
the jury. Some people might say
17:33
that was perfectly okay. Others might
17:36
not. So, I mean, there's just,
17:38
right now, yeah, he was threatened
17:40
with a fine, but whether it actually
17:42
comes to fruition and he pays
17:44
it, it's gonna take three to
17:46
four years, and it's really rolling
17:48
dice in Vegas on which way
17:51
it's gonna go. I need to
17:53
jump in for a second and
17:55
share some details about a special
17:57
group that we've created for you,
17:59
our listeners. We understand the importance
18:01
of creativity in cybersecurity, and Hacker
18:03
Valley has created a special cyber
18:05
creators mastermind. Each month we meet
18:08
and discuss how to break down
18:10
technical concepts through stories, the intricacies
18:12
and opportunities of using audio and
18:14
video to highlight what's important in
18:16
cybersecurity, and we share how you
18:18
can use content to highlight the
18:20
hard work that you put into
18:23
your craft. You can check out
18:25
the mastermind by visiting hackervally.com/mastermind. We'd
18:27
love to help unlock your creative
18:29
potential and would love to see
18:31
you there. One of the things
18:33
I wanted to speak with you
18:35
about that I don't know too
18:38
much about myself, and this is
18:40
getting a little political but that's
18:42
all right because I feel like
18:44
we're not necessarily giving opinion we're
18:46
more so talking about the information
18:48
that's in front of us:
18:50
cyber warfare. I feel like this
18:53
is one of those hot topics
18:55
we hear about it from a
18:57
business perspective but I think it's
18:59
a lot deeper than that especially
19:01
having worked at the NSA as
19:03
a contractor for about four years
19:05
myself I know that some of
19:08
these capabilities that are out there
19:10
were already significant from a cyber
19:12
perspective, but introducing AI and having
19:14
companies like Google remove the restrictions
19:16
on developing weapons with AI is
19:18
interesting. So I'll pose the question,
19:20
what is the state of cyber
19:23
warfare? So the reality is, and
19:25
I don't know why people don't
19:27
like talking about the truth, and
19:29
they don't like facing it or
19:31
addressing it. The reality is, and
19:33
we could go for hours, but
19:35
I'll just give you facts, we
19:37
are in a cyber war. We
19:40
are in World War Three. Whether
19:42
we want to admit it, whether
19:44
we're denying it, what is a
19:46
war? Well, let's just get simple.
19:48
A war is when countries or
19:50
individuals target and attack each other
19:52
trying to cause harm. You look
19:55
it up in the dictionary. What
19:57
is a world war? A world
19:59
war is when you have two
20:01
or more countries that blatantly disagree
20:03
and get to the point where
20:05
they're going to actively take action
20:07
to hurt individuals, hurt the country,
20:10
or prove their point until the
20:12
other side surrenders and gives in
20:14
to what they want. If you
20:16
look at the internet... and you
20:18
look at what's happening with China,
20:20
Russia, North Korea, it's a war.
20:22
I mean, they are actively
20:25
targeting us. But most people don't
20:27
realize: North Korea, we spend all
20:29
this time and energy sanctioning them
20:31
so they can't have nuclear weapons
20:33
because that would be bad like
20:35
if North Korea had ballistic nuclear
20:37
weapons and we saw them trying
20:40
to test weapons and them
20:42
exploding. That was all a smoke
20:44
screen. Because guess what? They have
20:46
cyber nuclear weapons, but most people
20:48
don't realize this. The North Korean
20:50
economy is based off of hacking
20:52
US companies and individuals. It's estimated last
20:54
year that North Korea made 2.3
20:57
billion dollars on hacking US citizens.
20:59
Hacking U.S. companies. Most people don't realize:
21:01
a lot of the ransomware attacks
21:03
hitting companies? North Korea. A lot
21:05
of the cyber crime and draining
21:07
people's accounts slowly is all North
21:09
Korea. 2.3 billion, that is the
21:12
revenue of some small countries. I
21:14
mean, that could support North Korea
21:16
for a long time. That is
21:18
warfare, right? We're calling it ransomware,
21:20
we're calling it attacks. No, it
21:22
is warfare that they are launching
21:24
against the United States, causing harm
21:27
and damage to us. And we
21:29
just are not waking up. I
21:31
mean, the fact that, and I
21:33
hate that I have to do
21:35
this, when I work with most
21:37
companies that are hit with ransomware,
21:39
from a business standpoint, I have
21:42
to put my business head on,
21:44
because these companies need to stay
21:46
in business and make money, from
21:48
a business standpoint, 95% of the
21:50
time, the best business decision, and
21:52
I hate to say it, the
21:54
best business decision, is to pay
21:57
their ransom. Because these are commercialized
21:59
companies. They're very, I hate to
22:01
say this, they're very reliable. I
22:03
mean, Russian Business Network, which is
22:05
one of the other large ransomware
22:07
providers, they have a help desk
22:09
and they have a money back
22:11
guarantee. If you pay the ransom
22:14
and you don't get your data
22:16
back, they refund your money because
22:18
they want to be reputable, and
22:20
they want to be known that,
22:22
okay, if you pay, you get
22:24
your data, so it's a good
22:26
investment. And most companies, if they
22:29
lose their data, a hospital, unfortunately,
22:31
if they lose their data, they
22:33
have two options. They can pay
22:35
three million, or they can be
22:37
down for two to three weeks,
22:39
not be able to treat patients
22:41
and lose 20 million dollars. So
22:44
from a business standpoint... Which one
22:46
unfortunately makes sense? I'm one that
22:48
I do prefer to work with
22:50
our clients, so they have proper
22:52
infrastructure proactively, so you can recover
22:54
without paying ransom. But if you
22:56
don't involve security professionals until after
22:59
the fact, the unfortunate reality which
23:01
sucks, it's awful, is paying the
23:03
ransom is less on your balance
23:05
sheet than actually fighting it, rebuilding,
23:07
and going from scratch. That's warfare.
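The pay-versus-rebuild calculus Eric describes comes down to simple arithmetic. Here is a minimal sketch using only the illustrative figures from this conversation (the three-million-dollar ransom and the roughly twenty-million-dollar, two-to-three-week outage for a hospital); these are hypothetical numbers for the example, not real incident data.

```python
# Back-of-the-envelope comparison of the two options described above for a
# hospital hit by ransomware. All figures are the illustrative ones from the
# conversation, not real incident data.

RANSOM = 3_000_000              # option 1: pay the ransom and restore
WEEKS_DOWN = 3                  # option 2: rebuild, down "two to three weeks"
LOSS_PER_WEEK = 20_000_000 / 3  # ~$20M of lost revenue spread over the outage


def rebuild_cost(weeks_down: float, loss_per_week: float) -> float:
    """Cost of refusing to pay: revenue lost while systems are rebuilt."""
    return weeks_down * loss_per_week


cost_rebuild = rebuild_cost(WEEKS_DOWN, LOSS_PER_WEEK)

# The "best business decision" lamented above: whichever number is smaller.
cheaper = "pay the ransom" if RANSOM < cost_rebuild else "rebuild"
print(f"Pay ransom: ${RANSOM:>12,.0f}")
print(f"Rebuild:    ${cost_rebuild:>12,.0f}")
print(f"Cheaper on the balance sheet: {cheaper}")
```

This is exactly why proactive recovery infrastructure matters, as the conversation goes on to note: it shrinks the rebuild number before an incident ever happens, flipping which option wins.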
23:09
I mean, we are in war,
23:11
and here's the problem. You're in
23:14
a completely different mindset when you're
23:16
in war. You're thinking differently, you're
23:18
acting differently, you're paranoid, you're concerned.
23:20
The problem we have in the
23:22
United States, because nobody will face
23:24
the fact that we're at war,
23:26
companies and individuals and even the
23:28
government, we have peacetime mentality. We're
23:31
acting and behaving like we trust
23:33
people. When links come in and
23:35
people click on it,
23:37
that's peacetime mentality. But the reality
23:39
is, if we knew we were
23:41
at war and we had
23:43
wartime mentality and a link came
23:46
in, we wouldn't even think of
23:48
clicking it. I mean, there's no
23:50
way we'd click it. Somebody going
23:52
in, in peacetime mentality, and
23:54
saying, let's take a critical server
23:56
that controls the oil and gas
23:58
pipeline for the East Coast and
24:01
let's connect it to the internet
24:03
because it's easier and simpler. Peacetime
24:05
mentality, you do that. Wartime
24:07
mentality, you would never in
24:09
a million years do that. I
24:11
mean, from 2023 to 2024, the
24:13
amount of cyber crime or cyber
24:16
warfare hurting America doubled, and they're
24:18
predicting it's gonna quadruple in 2025.
24:20
The problem is getting worse, not
24:22
better, but we're still approaching it
24:24
from a peacetime mentality. Hmm. You
24:26
know, I'll be honest, I don't
24:28
have the wartime mentality either. There's
24:31
times where I'm definitely kicking back
24:33
and taking it easy, not reading
24:35
things too closely. I think it
24:37
brings a little bit of sanity
24:39
to our lives. I think
24:41
back in 2001, during the
24:43
September 11th attacks, everyone was on
24:45
high alert. Like, they would show
24:48
pictures of people who were allegedly
24:50
terrorists, and I remember my own
24:52
mother going around calling in the
24:54
hotline numbers because she
24:56
saw someone that looked seemingly like
24:58
this person. But she wasn't sure,
25:00
she wasn't sure but she was
25:03
scared and she didn't want the
25:05
same thing to happen again. And
25:07
you're totally right: when someone is
25:09
hit by ransomware, they don't feel
25:11
that to their core like September
25:13
11th, 2001 brought us. It's a
25:15
different feeling. It's more of like,
25:18
oh, who's the dummy? All right,
25:20
you made the company lose 35
25:22
million out of our billions of
25:24
dollars of revenue. 9-11 impacted everyone.
25:26
Everyone in this country felt like,
25:28
okay, that could have been me
25:30
on an airplane. Exactly. It touched
25:33
every one of us and we
25:35
felt the real threat that it
25:37
could happen. The problem now is
25:39
we're not socializing this enough. Like
25:41
the fact that people are losing
25:43
money out of their bank accounts,
25:45
ransomware, we're not socializing it, so
25:48
people don't feel that it could
25:50
impact them like it did 9-11.
25:52
We're not getting that same impact,
25:54
but the reality is, and I'm
25:56
not trying to downplay in any
25:58
way, shape or form, but if
26:00
you look at the cyber crime,
26:03
the cyber war, the amount of
26:05
loss of lives, and the amount
26:07
of monetary damage was greater than
26:09
what we lost in 9-11. And
26:11
I'm not trying to downplay in
26:13
any way, shape, or form, but
26:15
the impact is larger, but it's
26:17
slower and it's spread out.
26:20
9-11 happened on a single day,
26:22
within two hours, high impact. Imagine
26:24
taking the loss of life and
26:26
the damage of 9-11 but spreading
26:28
it out over 12 months: you're
26:30
numb to it, you don't feel
26:32
it as much, you don't feel
26:35
that impact as great when it's
26:37
done over a longer period of
26:39
time. But the same thing is
26:41
happening again, we're just not waking
26:43
up and realizing it. And like
26:45
the simple example I give to
26:47
friends and family members is why
26:50
in the world are you posting
26:52
pictures of your children on public
26:54
social media where any creep or criminal
26:56
can see it, target them, and
26:58
go after them. We say we
27:00
care about our children and we
27:02
love our children. Why in the
27:05
world are our social media sites
27:07
not private? Why aren't we only
27:09
sharing that with a few? I
27:11
mean, posting pictures of your child's
27:13
life for 12 years? Think of
27:15
the impact that has, but we're
27:17
just not thinking wartime mentality. So
27:20
we're thinking it's perfectly okay to
27:22
make our children and our family's
27:24
life totally public, and it's insane
27:26
the long-term damage it could have.
27:28
Oh, you know, I have mixed
27:30
feelings about this because I totally
27:32
understand and appreciate the perspective of
27:34
you gotta be careful, especially you
27:37
never really understand what you're consenting
27:39
to. I was just speaking to
27:41
someone recently about all of the
27:43
facial recognition at the airport. I
27:45
was like, how does this work?
27:47
And they were saying it's a
27:49
zero-knowledge proof of information. You send
27:52
information to a centralized server, it
27:54
gives you back information, saying that,
27:56
yes, this is Ron, or maybe
27:58
you want to ask this person,
28:00
take their glasses off, and take
28:02
a closer look at them. And
28:04
I was like, oh, wow, OK.
28:07
I was like, I never agreed
28:09
to that. And they're like, yeah,
28:11
you did. All of the airline
28:13
terms of service,
28:15
you agreed to that. And I
28:17
think it's the same thing with
28:19
our kids. Like we want to
28:22
make sure our family is able
28:24
to see them because maybe the
28:26
extended family, they don't really have
28:28
their numbers. They might live in
28:30
a different city, state, or country.
28:32
And it's like, I want my
28:34
family to see my family
28:37
as well. But when you boil
28:39
it down, you never know how
28:41
someone's using that information,
28:43
whether you agreed
28:45
to it or not. I
28:47
don't know about you, but I
28:49
have about 15 people in my
28:51
family, and I maybe have about
28:54
40 friends. Now, let's say I'm
28:56
a little weird, so call it
28:58
100. You have a big family,
29:00
100 people. Why don't you set
29:02
up a private account and share
29:04
it with 100 people? Why are
29:06
you letting millions upon millions of
29:09
total strangers, anyone in the
29:11
world, see that? It's like, so
29:13
I'm not telling you, this is
29:15
where I'm different than most folks,
29:17
I'm not saying go Amish, trust
29:19
me. There's days, my friend Ron,
29:21
you were asking me like how
29:24
my day was going. There are
29:26
some days where I sit in
29:28
my car and I seriously contemplate
29:30
driving to Pennsylvania, buying a farm,
29:32
and being Amish, because guess what?
29:34
I mean, think about how beautiful
29:36
it would be at night: to
29:39
sit with a candle and a
29:41
book, and not have your cell
29:43
phone, not have texting, not have
29:45
social media. Maybe I'm crazy, but
29:47
I'm like, that sounds like an
29:47
amazing life. Now, I'm sure after
29:49
a while I'd get bored because
29:51
I'd miss helping people, so I
29:54
wouldn't do that. But the point
29:56
is, I'm not saying
30:00
go Amish, but I'm saying there's
30:02
a middle ground. There's a balance
30:04
that we're missing here. We don't
30:06
have to go to the extreme
30:08
where we post every detail of
30:11
our life publicly for everybody to
30:13
see. There's a balance. Limit what
30:15
you post, limit who sees it,
30:17
limit what goes out there, and
30:19
we've just lost that balance and
30:21
we're just so one-sided. You know,
30:23
I... I agree that you should
30:26
definitely limit, especially when it comes
30:28
to your kids, you definitely want
30:30
to limit their exposure to being
30:32
online, whether it's their face being
30:34
online or them being online, because
30:36
the advertisement is ridiculous. I'm on
30:38
Instagram. I'll probably go on Instagram
30:41
later and it will show me
30:43
your book. But hey, you too
30:45
can now buy Dr. Eric Cole's
30:47
book and also become a customer
30:49
of his company. And it's interesting
30:51
to see all of the advertisement.
30:53
One of the reasons why the United States government
30:56
is so on edge about TikTok
30:58
is because we don't have control
31:00
over it. They could insert advertisement,
31:02
messaging, and ultimately program the minds
31:04
of the adults, but especially the
31:06
youth as well. So can Facebook,
31:08
so can Twitter, so okay, and
31:11
I'm just throwing it out there.
31:13
Yeah. So we're concerned about the
31:15
Chinese influencing us. But
31:17
we're okay with Elon Musk
31:19
or Mark Zuckerberg influencing us
31:21
and biasing us. And once
31:23
again, this is factual. I'm
31:26
not, I mean, I always
31:28
try to be careful: we need
31:30
to be able to talk
31:32
about politics in a non-political
31:35
way. But this is factual:
31:37
in October of 2024, so
31:39
a month before the election,
31:41
if you went into Facebook
31:43
and you put up a post
31:46
that said, Donald Trump is
31:48
the best president we've ever
31:50
had and he will make
31:52
this country great again. You
31:54
got flagged on Facebook. That
31:57
post was not allowed. Zuckerberg
31:59
admitted to it. I did
32:01
the test, but he admits
32:03
to it now. But if you change
32:06
two words, two words, you change
32:08
Donald Trump to Kamala Harris,
32:10
and you said Kamala Harris
32:12
is the best president, she
32:14
will make this country great again.
32:17
It was allowed. So these platforms
32:19
in the US are biasing
32:21
us. Oh yeah. Is that okay? No, but
32:23
so it's not the Chinese. I mean, to
32:25
me, the solution is not banning
32:27
a country, but regulating. Where's the
32:30
regulation around TikTok, Facebook, and all
32:32
these others? Because they're influencing us,
32:34
they're influencing our kids. But it's
32:37
so addicting to be online. And
32:39
I think that one of the
32:41
addictions that I currently suffer from
32:44
is artificial intelligence using LLM after
32:46
LLM, trying to solve problems that an
32:48
LLM probably doesn't necessarily need to solve,
32:50
but it'd be really cool if it
32:52
solved it without knowing all the steps
32:54
of the solution. And it kind of
32:57
was able to figure out where to
32:59
go from here, you know, how are
33:01
you looking at AI today, like, especially
33:03
considering your background, and are you
33:05
seeing it apply in any special
33:08
or detrimental ways for you in
33:10
your life? So I basically spend
33:13
an hour or two a night brainstorming
33:15
with my digital twin. So I
33:17
loaded in and created a digital twin
33:20
that thinks and acts like me. And
33:22
now I go in and I ask
33:24
a question. I go, listen, I know
33:26
in the past I've hired some bad
33:28
marketing firms and I need
33:31
a marketing firm now that does
33:33
this, this, and this, but I don't
33:35
want to make that same mistake.
33:37
Can you tell me what were the
33:39
three reasons why I hired bad firms
33:41
in the past and what I can
33:43
do to avoid that from happening
33:45
in the future? And it comes
33:47
back and it tells me. And then
33:49
it says, would you like us to
33:51
create a 30-day plan to interview,
33:54
hire, and buy from a marketing firm that
33:56
will actually help you? And I go, yes, and
33:58
it basically produces an action plan,
34:00
a massive action plan, and all I
34:02
do is follow the steps now. So
34:05
it's like, it's crazy what we're doing
34:07
now. I'm keeping that to myself,
34:09
and it is controlled by Eric. So
34:11
it's an augmentation to me. It's not
34:14
a replacement. Right. If we use it
34:16
as an augmentation tool, and it maintains
34:18
our identity, I think AI could be
34:20
super powerful. But if it gets to
34:23
the point now where you're losing the
34:25
identity of people, and it's now just
34:27
an engine, and now it's not
34:29
Eric Cole that I'm asking for help,
34:32
it's OpenAI. It's not
34:34
Elon Musk I'm asking for help, I'm
34:36
asking Grok 3. It's basically making
34:38
humans obsolete, which scares and terrifies me.
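The digital-twin workflow Eric describes, an LLM conditioned on a persona profile plus a memory of past decisions, could be sketched roughly like this. This is an illustrative sketch only: the `DigitalTwin` class, its fields, and the prompt wording are assumptions, not Eric's actual implementation, and the resulting prompt would be sent to whatever chat-completion API you use.

```python
# Hypothetical sketch of a persona-conditioned "digital twin" prompt builder.
# None of these names come from the episode; they are assumptions for illustration.
from dataclasses import dataclass, field


@dataclass
class DigitalTwin:
    name: str
    values: list[str]                                 # how the persona thinks and decides
    memory: list[str] = field(default_factory=list)   # past decisions and mistakes

    def remember(self, note: str) -> None:
        # Record a past decision so future answers are grounded in it.
        self.memory.append(note)

    def build_prompt(self, question: str) -> str:
        # Build a system-style prompt that asks the model to answer *as*
        # the persona, using its recorded history rather than generic advice.
        profile = "; ".join(self.values)
        history = "\n".join(f"- {m}" for m in self.memory) or "- (none yet)"
        return (
            f"You are a digital twin of {self.name}. "
            f"Decision style: {profile}.\n"
            f"Relevant history:\n{history}\n\n"
            f"Question: {question}\n"
            "Answer in the first person, then give a step-by-step action plan."
        )


twin = DigitalTwin(
    name="Eric",
    values=["direct", "security-first", "long-term relationships over quick sales"],
)
twin.remember("Hired a marketing firm that overpromised and underdelivered.")
prompt = twin.build_prompt("What were the reasons I hired bad marketing firms?")
print(prompt)  # this string would be the system/user input to an LLM call
```

The point of keeping the profile and memory in your own data structure, rather than in the model, is the control Eric emphasizes: the twin augments you, and you decide what it knows.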
34:41
RIP Stack Overflow. I mean,
34:43
there's so many talented developers out there
34:45
that offer so much help, guidance, and
34:47
wisdom. I used to, you
34:50
know, like and upvote their stuff;
34:52
they would get that adrenaline hit or
34:54
that dopamine rush. But now, it's all
34:57
sent to OpenAI. I use tools
34:59
like Cursor and GitHub Copilot to
35:01
now do all of my programming for
35:03
me, but it's really on the heels
35:06
and on the backs of someone's blood,
35:08
sweat, and tears. So big shout out
35:10
to them, and thank you to every
35:12
Stack Overflow contributor. But Eric, I
35:15
did want to ask you a very
35:17
important question. You know, you have such
35:19
an eclectic range of skills. There's a
35:21
lot of people that watch
35:24
and listen to this show, and they
35:26
are always curious about, like, how do
35:28
I become a little bit more like
35:30
this person? How could I, you know,
35:33
write books, get paid for speaking engagements,
35:35
have so much information I
35:37
need a marketing agency to help dispense
35:39
it in an effective way? You know,
35:42
if you were to start your career
35:44
over from day one, what would be
35:46
your first move? My
35:48
first move would be to invest a
35:51
lot more in personal development, emotional intelligence,
35:53
communication, because what hurt me in
35:55
my career is I was the brilliant
35:57
geek. I mean, that's what I was
36:00
labeled as. I mean, that's what I
36:02
was known as in school. You know,
36:04
I mean, the geeky guy. I mean,
36:06
if you want to solve any problem,
36:09
go to Eric, math and science was
36:11
my thing. You give me a computer
36:13
and I'm in my happy place. I'd
36:15
rather, you know, I mean, program and
36:18
write code than go out to parties.
36:20
Like, people go, what do you
36:22
do for fun? I build new software
36:24
companies, or I program. They're like, no,
36:27
Eric, what do you do for fun?
36:29
I'm like, that's fun. So I didn't
36:31
really have a lot of social skills.
36:33
So I would come into meetings and
36:36
I didn't realize people don't think like
36:38
me. People don't act like me and
36:40
I would go in and I would
36:42
be like, well, of course this is
36:45
obvious. Like, what, are you stupid? You
36:47
don't understand these things? And like, I
36:49
hurt my career because I was known
36:51
as the brilliant geek that couldn't talk in
36:54
social settings. So I would step back
36:56
and spend a lot more time, which
36:58
I've done in my last 10 years,
37:00
and it sort of has changed my life,
37:03
is focus a lot more on how
37:05
can I be a better human? How
37:07
can I have better friends? How can
37:10
I communicate better? How can I sort
37:12
of win friends and influence people and
37:14
really communicate and socialize? Because here's the
37:16
reality. If you can get along with
37:19
people, if you can read other people,
37:21
and you can think and switch. Instead
37:25
of me going into a conversation going,
37:25
what do I want? And how am
37:28
I thinking? What if I went into
37:30
the conversation going, how is Ron thinking?
37:32
Like, what is Ron doing? What is
37:34
Ron feeling right now? And to me,
37:37
one of the things like when I'm
37:39
in any business situation or any conversations,
37:41
and we sort of did this a
37:43
little in the call, you did it
37:46
very well.
37:48
Eric, how are you feeling today? Like,
37:50
where are you at? What's sort of
37:52
your tone and message? And if we
37:55
just did that where we went into
37:57
meetings, and I just said, hey, I'm
37:59
having a tough day, you know, I
38:01
got some not-so-favorable news,
38:04
I'm a little angry, I'm a little
38:06
frustrated, you would probably have a little
38:08
more compassion and approach it differently than
38:10
if I'm like, oh, I'm in a
38:13
super happy mood, I'm in a pumped
38:15
mood, I'm ready to go crazy and
38:17
stuff, or even the same thing. If
38:19
me and you are business partners and
38:22
you want to give me a new
38:24
business idea to grow and expand the
38:26
company, and you come to me on
38:28
a day where I'm like, we lost
38:31
too much money, I'm in scarcity, I'm
38:33
afraid the company's going out of business, I'm
38:35
going to shut you down so quick,
38:37
and you're going to be like, what's
38:40
wrong with him? But then three days
38:42
later, we just won three big contracts,
38:44
we just got an award, our customers
38:46
love us, and I'm ready to run
38:49
through walls and go crazy, and you
38:51
now come to me with that idea,
38:53
I'm a big fan of everything is
38:55
meant to happen. You just have to
38:58
time it correctly. Understanding
39:00
and connecting with humans, and having the compassion
39:02
and emotional intelligence to understand where they're
39:04
at, would solve so many problems.
39:07
If I went in and I only
39:09
focused on money and I forced the
39:11
deal, they
39:13
would be very unhappy. They would be
39:16
like, okay, secure anchor does okay services,
39:18
but they're not that good and they
39:20
wouldn't call us again. But if I
39:23
go in and I wait for the
39:25
right timing, they weren't ready to buy
39:27
today. So I'm not forcing it. They
39:29
needed consultation, so I gave it to
39:32
them for free. And they walk out
39:34
of that meeting going, Eric changed my
39:36
life. Eric gave me information that's probably
39:38
worth hundreds of thousands of dollars, and
39:41
he didn't charge me a penny. Imagine
39:43
what he would give me if I
39:45
paid him $100K. And now guess what?
39:47
It always happens. A week, a month,
39:50
a quarter later. They always call me
39:52
for the business and now their customers
39:54
for life. So I'm not rushing the
39:56
sale. I know the sale will happen,
39:59
but instead of forcing a bad deal...
40:01
and getting a frustrated customer, I give
40:03
them what they need, I give them
40:05
what they want, and I close the
40:08
deal in three or four months, and
40:10
now I have their business for the
40:12
next 10 years. Powerful. I love that.
40:14
There are two quotes that come to mind
40:17
through all of this. One is, you know, the
40:19
quality of your questions, and the other
40:21
one is: diagnosis without assessment is malpractice.
40:23
You gotta understand where someone's at. Whenever
40:26
I call someone, I always try to
40:28
ask, is this a good time to
40:30
chat? Just giving them that space is
40:32
going to allow for a much more
40:35
productive conversation. I love that. The best
40:37
way to get what you want is
40:39
to give other people what they want.
40:41
Damn. That's what I'm talking about. Yes.
40:44
If there was anything that you learn
40:46
from this entire conversation, it is that.
40:48
I really had a good time chatting,
40:50
Eric. I'm wishing you
40:53
the best on this journey of understanding
40:55
how we can better protect ourselves and
40:57
also companies out there, the American people
40:59
and people at large, and also on
41:02
the growth and self-development journey that you're
41:04
on. I'm definitely curious about this AI
41:06
bot. For anyone that wants to stay
41:08
up to date with Dr. Eric Cole,
41:11
be sure to check out the show
41:13
notes or description wherever you're listening or
41:15
watching. Eric has a host of books
41:17
that I'll drop the links in
41:20
the show notes for everyone to check
41:22
out and hopefully purchase a few copies
41:24
and I want to say thank you again,
41:26
Eric. And with that, we will see
41:29
everyone next time.