Episode Transcript
0:54
How's it going , David ? It's great to finally
0:56
get you on the podcast . I think that we've been planning
0:59
this thing for quite a while , you
1:01
know back in 2024 . And now you're
1:03
the . You're the first episode of 2025
1:06
. Because I burned myself
1:08
out and had to take like six weeks off
1:10
.
1:11
I am honored .
1:38
feelings about that , actually and
1:40
so there's so many people who
1:42
bill themselves as cybersecurity experts
1:44
you're probably having some on the show
1:47
and some of them . When you drill down
1:49
on what they , why they think they
1:51
are , it's because they took a bunch of Microsoft
1:54
certification classes , right . So
1:56
that's virtually useless
1:59
in any real world scenario , because
2:01
the really bad stuff is stuff
2:03
that people have never seen before , and
2:06
no amount of Microsoft licensing
2:08
or certification is gonna prepare
2:10
you for some wacky denial
2:12
of service attack that nobody's ever seen . That's
2:15
just basically having a brain and
2:18
being able to think through it . So the best
2:20
cybersecurity people are people who
2:22
are not formally trained to be cybersecurity
2:25
people .
2:26
Yeah , yeah , no , that's , that's
2:28
very true . You know , it's always interesting
2:31
when , when
2:33
I bring people on and you know I try
2:35
and find fantastic guests , you
2:38
know like overly qualified , very
2:40
experienced people like yourself and whatnot and
2:43
every once in a while this happened
2:45
. This happened , you know , maybe
2:48
in the middle of last year , right , where someone
2:50
came on and we started . I started to get
2:52
a little bit technical , because I'm technical , you
2:54
know like I'm . I'm in the weeds
2:56
, you know . I , I , uh , I
2:59
wake up and I get you know into log
3:01
files and I'm figuring out what's going
3:03
on . I'm reverse engineering systems
3:05
and whatnot , like all day long and I look up
3:07
and it's 6 PM , you know that
3:09
sort of thing , and we started getting a little bit technical
3:12
and I immediately reach
3:14
their technical quote-unquote
3:17
expertise limit and
3:20
then I push a little bit
3:22
farther and come to find out , oh
3:24
, you're not technical at all , you
3:26
kind of stumbled your way into
3:28
this thing and someone promoted you early
3:30
and that's what happened
3:32
, right , which is it's frustrating
3:34
for myself .
3:35
I deal with this all the time
3:37
, given the kind of things I do and where I live . I've
3:40
been asked by some VC firms and
3:43
other firms to do
3:45
vetting of people , and about
3:47
80% of the people who call themselves cybersecurity
3:50
are actually lawyers
3:52
and they're no
3:54
, I'm serious . They're people at a law
3:56
firm who were involved in
3:58
one case involving
4:01
some kind of aspect of cybersecurity
4:03
and now they're an expert . It's like
4:06
you know , I go into CVS and buy a
4:08
bottle of aspirin and now I'm a doctor . Yeah
4:10
, it doesn't help
4:13
our profession because it
4:15
downplays . It makes it look like it's
4:17
easy to be this .
4:19
Yeah , yeah , I mean , you
4:21
know , when someone is trying to
4:23
get into cybersecurity
4:25
, right , and they're reaching out to me , they're asking for
4:27
advice and whatnot really
4:29
the very first thing . And some people , some
4:32
people that I've had on and I said this too they like
4:34
took offense to it and like were appalled by how I
4:36
approach it . But I try to convince people to not get
4:41
into security . Right , because if
4:43
I can convince you just through words
4:45
to not get into cybersecurity
4:47
, you're not going to
4:49
be successful in this field . Right , like
4:51
, because you have to have a curiosity that
4:54
cannot be itched . Right , and you
4:56
need to be the expert like , not
4:58
just an expert in security . Like
5:00
. You need to know networking pretty well
5:03
. You need to know system design pretty well
5:05
. You need to know , you know the different
5:07
processes and services that are talking
5:09
behind the scenes on your windows device
5:11
, right , what they're linked up to and
5:13
where they're actually configured and all that sort
5:15
of stuff . So
5:17
I've been doing this literally for half
5:19
a century in some form
5:22
or fashion with computers . The
5:24
thing when I look at a problem , I
5:26
think of it at multiple levels and
5:29
sometimes it's literally at the bit
5:31
level and I'm thinking , okay
5:34
, what's on the heap , what's in the stack
5:36
, what are
5:38
the bits ? And without
5:40
even articulating it , this
5:42
helps me do things like I
5:44
repaired a remote control fireplace
5:47
the other night because I just knew
5:49
what was wrong with it without having ever
5:51
touched the thing . And this is
5:53
a skill set that
5:55
I'm sure you have . I have . Many
5:57
people don't . You cannot
5:59
teach this and it's some kind
6:02
of weird survival trait that
6:04
I don't think people recognize
6:06
for what it is . But you can throw anything
6:08
at me that's a computer-related thing
6:10
and I can figure out what's
6:13
wrong with it in a couple of minutes .
6:14
Yeah .
6:18
Well , David , you know , we kind of just like dove into
6:20
the deep end here without you know you talking about your background
6:23
, so why don't we backpedal a little bit
6:25
and talk about ? You know
6:27
how you got into IT . What was that
6:29
start like what ? What
6:31
intrigued you right about the
6:33
field that kind of propelled
6:36
you into this world that you're in now ?
6:39
Okay , well , I mean , we have to go back a ways for
6:41
this . So I mean , I went
6:43
to high school in Pennsylvania in
6:45
the mid-70s early 70s
6:47
even and we actually had a computer
6:50
and it was an IBM 1130 or something
6:52
like that , and it had punch cards
6:54
, if you've ever used those , and you
6:56
had to mark-sense the cards and then you
6:58
run them through . And if
7:00
you did assembly programming , which is what you normally
7:03
had to do , there were 16 switches
7:05
on the CPU and you would configure
7:07
the switches for a binary number
7:09
and you'd hit the button and that was one
7:12
machine instruction and then you would do that
7:14
for the entire program you just wrote . So
7:16
that's like days to do that . So
7:18
I was intrigued by it and
7:21
then I sort of let it go
7:23
for a while . I got a degree in philosophy
7:25
, taught some symbolic logic
7:27
and other things , and then I
7:29
, through a bizarre set of circumstances , I
7:32
ended up being an intelligence agent for
7:34
a number of years . I had to go in
7:36
the military . I did
7:38
, and they , because of my test
7:40
scores , they trained me to be
7:42
a cryptographer . They sent me to
7:44
Russian school for a couple of years and I
7:47
ended up on submarines , and
7:49
so I did that during the height of the Cold War
7:52
and it was fun . I actually had a
7:54
really good time . It's not like anybody got
7:56
killed , you know . You didn't have to worry about
7:58
bombs blowing up a Jeep or something
8:00
, it was just the Cold War was a very different
8:02
kind of thing . But computers
8:04
first time I've seen computers play
8:07
a part in the real world because
8:09
submarines at the time were heavily
8:11
computerized . I mean , given what
8:13
was there at the time .
7:20
So the computers were called UYK-20s U-Y-K and I think
8:22
it was a DEC computer and this I always thought was hilarious
8:24
. They used to send us out with a repair kit and
8:26
it was a big brown plastic case
8:28
and if you opened it the only thing it
8:30
had in it was a rubber mallet . And they
8:32
said you'll never fix anything
8:34
. Just start whacking the crap
8:37
out of the boards and
8:39
something will go back in place . And
8:41
you know what it did . I had to do it like three
8:43
times . So I mean
8:45
, that's kind of the early days
8:47
. I did some other work . I went to NSA
8:50
, I worked at Cosmonaut , the Cosmonaut
8:52
program , and then I got some more
8:54
degrees from UMBC in computer
8:57
science with a math concentration , did
8:59
grad work at Hopkins and then I was
9:02
at a crossroads because I
9:04
either stayed as a professional intelligence
9:06
agent or I went into this
9:08
fledgling world of computers
9:10
in the early 80s and it was a no-brainer
9:12
for me and I mean I knew where
9:14
things were going and I got out and
9:16
started programming
9:18
a number of different languages
9:20
and within a few years I
9:23
was designing systems and
9:25
then I ended up . I was , I
9:27
ended up running research at Booz Allen
9:29
and Hamilton , the consulting firm . IBM
9:33
hired me to be the chief scientist
9:35
for the Internet Information Group , which
9:37
is all their Internet-related software
9:39
, basically . Not networking , but
9:41
anything above that . And that was
9:44
pretty cool . Actually at the time I'd
9:47
never played in the big leagues like that before
9:49
. I got a lot of job
9:51
offers this was like 95
9:53
, 96 . And I decided
9:56
. The one I wanted was this little little
9:59
company in Herndon , Virginia . It was an 8A
10:01
firm and it was called Network Solutions
10:03
and the only thing they had going for them
10:05
is they had a locked contract
10:08
with the National Science Foundation to
10:11
basically run the internet . So
10:13
it was called a cooperative agreement
10:16
, so they ran
10:19
the whole domain name system
10:21
, all the root servers
10:23
, TCP/IP allocation
10:25
for North America and the CDPD
10:27
, the cellular data network . So
10:29
I came in as CTO and
10:31
then I ended up running all that
10:34
. So that was pretty cool . And I
10:36
got to deal with crisis after crisis
10:38
because from 96
10:40
, 97 on that's basically the
10:42
dot-com bubble . So all
10:45
of a sudden people actually gave
10:47
a damn what was going on in the internet
10:49
, and up until that they didn't . It
10:51
was a curiosity . In the early
10:53
90s it was like labs
10:55
, and by the end of the 90s
10:58
it was billions
11:00
, tens of billions . So
11:02
I went through that . My company went public
11:04
, I did an IPO , a couple secondaries
11:06
and I was running all this stuff during
11:09
Y2K and I was on President
11:11
Clinton's task force representing the internet
11:13
during Y2K and
11:15
that's a whole story right there . I didn't like
11:17
where some things were headed . I left
11:20
and started writing books on privacy
11:22
and wrote for a couple of magazines . Nobody
11:25
cared really that much about it at
11:27
the time . It was some kind
11:29
of like weird conservative thing and
11:31
the liberals didn't want to have anything to do with it because
11:34
privacy seemed to run smack into
11:36
First Amendment issues , and
11:39
so my natural constituency
11:41
were people I didn't actually want to deal with
11:43
. So then I got into some
11:45
other things . Story's almost over
11:47
here . Sorry the story's taking long . No worries . I
11:50
did politics . I was a CTO
11:53
for Senator Bayh when he ran for president
11:55
for two years . That was actually
11:57
a paid gig in Arkansas . And
11:59
then I was the head of security for
12:01
General Wesley Clark when
12:04
he ran for president , and
12:06
so I got some other
12:09
exposure and at
12:11
this point I was pretty cynical
12:13
about almost everything . And
12:15
the thing I was cynical about was the
12:18
people who should understand what was going on did
12:21
not understand what was going on , and
12:23
this was a huge . I mean I knew where
12:25
things were going . I mean what we're seeing today
12:28
with cybersecurity , for instance , you
12:30
know , and data breaches . I mean the writing
12:32
was on the wall for that 20 years ago and
12:36
it's now . Anyway , we can talk about that . So
12:39
I started doing . I started traveling
12:41
the world . I hit 85 countries in a couple
12:43
of years and then I came back and
12:45
I started working with very
12:47
early blockchain companies all
12:49
in Europe , because none
12:51
of them wanted to work in the United States
12:53
because they were terrified of the
12:56
Security and Exchange Commission , especially
12:58
when they're doing ICOs for tokens . I
13:01
mean it's still not clear
13:03
how US tax law
13:05
treats that stuff . So I worked
13:07
with a number of those companies and
13:10
I'm still working with a couple , and then
13:12
I got into post-quantum encryption . So
13:14
now I'm doing sort of Web3
13:16
non-centric security
13:18
with post-quantum encryption . So that's
13:21
kind of a long story . Wow
13:23
that is .
13:26
I mean , that's really fascinating . You
13:28
know , it's just
13:31
where this field has taken you . I
13:35
mean , did you ever think that you would
13:37
, you know , be on President Clinton's cabinet ? You
13:41
know , like no , no , Like starting all those years ago . You
13:44
know , did you ever have that in mind as
13:46
that even being a possibility
13:49
?
13:49
So the truth is I was
13:51
a single parent , I was raising five
13:54
children , I couldn't even afford daycare , and
13:56
I'm sitting on all this stock in
13:58
a company that might go public , so
14:01
that was very good motivation
14:03
for me . And finally , when
14:06
the stock did do well , I mean I didn't
14:08
get rich , but my kids all
14:10
went to college and I bought a Porsche
14:12
. So what
14:14
?
14:14
kind of .
14:15
Porsche . I got a 911 . Okay
14:17
, which I'm now feeling really embarrassed
14:20
about because I sold it and bought a Tesla
14:22
and I really love . Yeah , I know
14:24
, I love , love my Tesla and
14:27
I just feel , I feel like such
14:29
you
14:34
for the Tesla .
14:35
We bought two Teslas in 2024
14:38
and I love them . I absolutely
14:40
love the car . Recently
14:44
bought myself a Model X and , uh , I've wanted that car since it was
14:46
announced , right Like . I just love
14:48
everything about it , but
14:51
I couldn't imagine selling
14:53
a Porsche , even for a Tesla . I would
14:55
just like have both .
14:57
Well , you know , I live in the city .
14:59
Parking spaces are at
15:02
a premium . I actually
15:04
have two spaces , which is like two
15:06
more than most people have . So
15:08
when my wife and I bought this house , that was
15:10
one of the reasons we bought it . But
15:12
we have an SUV too , and
15:14
the Porsche was just sucking up money
15:17
and
15:20
every time something happened it was a thousand dollars . Oh
15:22
yeah , I mean everything . You can't you know ? Cigarette
15:25
lighter , three thousand dollars , yep . So
15:28
I got tired of paying it . The dealer
15:30
here sucks , and they
15:32
couldn't get parts , especially
15:35
during the pandemic . So anyway
15:38
, that's , that's why I did it but I got
15:40
to drive it I got . I had
15:42
to drive it for 20 some years . It's
15:45
uh , it's an incredible machine .
15:46
When you're dating , oh yeah yeah
15:49
, I , um , I I just sold
15:51
my Audi S5 and it was my
15:54
. It was my first sports car and
15:56
that's a good car . What a , what
15:58
a fun vehicle . But
16:01
when it breaks , man , when something
16:03
goes wrong on that car , it's I
16:06
mean , like you said it , I just got
16:08
to the point where I assumed you
16:10
know I'm going in for an oil change and
16:13
I assume they're going to find
16:15
three thousand dollars worth of stuff that's
16:17
broken that I don't even know about
16:19
. Oh , I'm sure .
16:21
So you know , going off that , going back
16:23
to the other thing I said when
16:25
I was a kid , growing up , this is , like
16:28
you know , pre-psychedelic
16:31
era , going through the Beatles and all that
16:33
. So my friends who are
16:35
good with mechanical stuff were
16:37
highly in demand . Women
16:40
liked them , guys liked them . They
16:43
could change your spark plugs . They
16:45
didn't have to go into the gas station , they would
16:47
go , yeah , it's your timing
16:50
, and they would get in there and they would fix
16:52
it . They could fix TV
16:54
sets , they could fix washing machines
16:56
. Guess what ? You can't fix a goddamn
16:58
thing today . So now
17:01
it's the person with the skills that
17:03
I was just talking about . It's
17:05
it's the person who I used
17:07
to go to dinner parties with people and a
17:09
lot younger than me and I would
17:12
have an iPod and a CD and
17:14
I would say , hey , I'll give 20 bucks to anybody who
17:17
can take the songs off this CD and
17:19
put it in this iPod . Nobody ever
17:21
knew how to do that
17:23
and to this day , when I deal
17:25
with politicians and multi-hundred
17:28
millionaire VIPs , they
17:31
don't know how to do anything either . And
17:33
they all have like nephews and
17:35
like eight-year-old , nine-year-old
17:37
nephews and they do the work for
17:39
them Like printers , like configuring
17:42
a printer is still way too hard
17:44
. Way too hard , yeah
17:47
, and it should be easy , and if
17:49
you're lucky it will be , and
17:52
if it doesn't configure in the first two minutes
17:54
, you're in for a bumpy ride .
17:56
I hate printers . I really do . I really do . I really
17:59
hate having them . I only have it
18:01
because my wife is a she's an early
18:03
childhood teacher and so
18:05
she has to print a whole lot . So we have like
18:07
a very robust printer and
18:09
it's just , you know , it doesn't
18:11
work very , like ,
18:13
fluently with a Windows
18:16
PC and MacBook laptops
18:19
and you have to reinstall the driver all
18:21
the time and it's so
18:23
, so dumb . But so
18:25
go ahead . Yeah
18:27
, I was going to ask you what
18:29
, what your time was like at
18:32
the NSA . You know , I've had other people
18:34
on from various agencies CIA
18:36
, NSA , DIA and
18:39
they all tell me roughly
18:41
the same thing . And
18:45
I have a good friend who's in the military
18:47
and he said that if I ever do make it
18:49
into the NSA , that
18:52
first month when you're being
18:54
read into 90%
18:57
or 80% of what you need to be read into
18:59
and whatnot , it's
19:01
gonna like the capability side of it
19:03
. It's kind of just gonna blow your
19:05
mind right , like you wouldn't even
19:07
realize . Oh , you can use that for for
19:10
this thing over here , right ? Well
19:12
, I'm I'm wondering did
19:15
you have that same kind of experience back
19:18
then ? Because you were
19:20
really , I mean , at the
19:22
beginning of this digital
19:24
era , right ? I mean it didn't
19:26
really even start . You were at the very foundation
19:29
of it . Was that experience
19:32
true for you as well , or what was that like
19:34
?
19:35
Well , when I got to NSA it
19:37
was early 80s and
19:39
they had a couple of supercomputers
19:41
like really expensive ones Crays
19:44
, Cray-2s is what they were and we
19:46
didn't have access . Nobody had access to them , so
19:48
they had like PCs . They
19:50
were like 83 , 86 machines or
19:52
something . So if I wanted software
19:55
I had to write it . So people used to
19:57
come to me and I would write a Turbo
19:59
Pascal program to do
20:02
some intelligence thing , because you couldn't
20:04
bring stuff in from the outside , and
20:06
so that was kind of fun . And when
20:08
I was in the submarines I had some
20:11
of the deepest security clearances
20:13
you can get . I mean things that are only like
20:15
that are still classified and
20:17
only like 30 people in the world
20:19
could read the material . It
20:22
was like there is stuff like that . But
20:24
in the end in CIA
20:27
what that means typically is
20:29
it means they have an asset , a human
20:31
asset , like Putin's
20:34
hairdresser . So Putin's
20:36
hair I'm just making this up I hope if
20:38
he gets killed tomorrow I'm going to feel really bad
20:40
. But so let's say Putin's
20:42
hairdresser gets turned you know
20:44
, happens all the time . So that
20:47
would be very , very carefully
20:49
protected because they're
20:51
going to shoot him in the head if they find out . For
20:54
NSA , it's
20:56
almost identical to
20:59
what hacking is . In fact , now it is hacking
21:01
. It's like you
21:03
know . It's like basically it's zero days
21:06
. Before there was even a term zero
21:08
day . NSA was looking for zero
21:10
days . They were looking for defects
21:14
, bugs , some kind of malfunction
21:16
in any mechanical or electronic
21:19
device that they could turn into
21:21
an acquisition thing . So that's
21:23
why and this stuff's not classified anymore
21:25
, I think but that's why they were doing things
21:27
like bouncing laser beams off windows
21:29
. So you could I mean you could
21:32
hear what was being said in the room , and
21:34
there's crazier things than that .
21:38
And so that's the secret . The
21:43
secrets in NSA were mostly that kind of stuff . And then a bunch of boring stuff
21:45
to most people , like what frequency
21:47
a satellite downlinks on . You
21:50
know what I mean ?
21:53
Most people could really give a damn and
21:55
wouldn't even understand if you
21:57
told them . But if the Russians
21:59
got it it would be a big deal , right
22:02
?
22:05
Yeah , that is , that's really fascinating . You know you talk about having that
22:07
clearance and you know only 30
22:09
people in the world are
22:11
even allowed to read that document . I always
22:14
wonder how , like
22:17
, the level up from
22:19
that even works
22:21
right , because someone
22:24
I'm just trying to think of you know
22:27
least-privilege permissions , right ? From my
22:29
perspective
22:31
, if I want to give someone else access to
22:33
a system or whatever it might
22:35
be right , it doesn't matter the sensitivity
22:37
of that system , I have to have access to that
22:39
system right , in some way , shape
22:42
or form , I have to have that access
22:44
. And so it's just interesting
22:47
to me for how agencies
22:49
deal with that , because obviously you
22:52
don't want everyone knowing you know nuclear
22:54
secrets or you know whatever
22:56
that might be , and you have to really tightly
22:58
control that information
23:00
. It's just fascinating for
23:02
me to you know , think about
23:04
it , how you would do it , even with people
23:06
like a physical , you know person
23:09
right , like how do you control that
23:11
, how do you monitor what they're doing , and that
23:13
sort of thing .
23:28
Ten years from now , nobody's going to be doing that .
23:29
Maybe six or seven years . And the reason is because
23:32
both defense and offense
23:34
is going to shift
23:36
over to AI AI-driven systems
23:38
because they move much faster
23:41
than human beings . So if
23:43
an AI is running some kind
23:45
of denial of service attack or some
23:48
kind of penetration hit on your network
23:50
, it can make like a million
23:52
hits on every single address
23:54
in your network just like that . So
23:57
no human being will even see
23:59
it coming , let alone stop it
24:01
. So you need to have some kind
24:03
of AI-driven defensive system
24:05
. On the other end , and that's one of the reasons
24:07
I'm working with a company or two that's
24:10
doing Web3 decentralized
24:12
stuff , because I think the
24:14
biggest damage that's been done in security
24:16
in the last 20 years is
24:18
deferring things to centralized companies
24:21
and that's where all the breaches happen
24:23
. They're service providers . I
24:25
mean you know Equifax and SolarWinds
24:28
. You look at any of those , it's never
24:30
the company with the name on it
24:33
that's responsible
24:35
. It's some idiot third party that
24:37
they hired to do credit card processing
24:39
or something and they
24:41
got hacked . And then it happened
24:43
with AWS too . So
24:45
I mean that's the hole . So
24:47
in the future , when it moves into an
24:50
AI driven system , that hole
24:52
, those holes , will go away . Hmm
24:54
.
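The speed mismatch David describes — automated probes hitting every address far faster than a person can watch — is the usual argument for automating the defense as well. As a toy illustration only (nothing from the episode; the function name, addresses, and threshold are all made up), flagging a scanning source in flow records is trivial for a program and hopeless to do by eye at millions of events:

```python
from collections import Counter

def flag_scanners(events, threshold=100):
    # events: iterable of (source_ip, destination_addr) pairs,
    # e.g. parsed from network flow logs.
    # Count requests per source and flag anything unusually chatty.
    hits = Counter(src for src, _ in events)
    return {src for src, count in hits.items() if count > threshold}

# Toy traffic: one source probing 250 addresses, one normal client.
traffic = [("10.0.0.99", f"192.168.1.{i}") for i in range(250)]
traffic += [("10.0.0.5", "192.168.1.10")] * 3
print(flag_scanners(traffic))  # {'10.0.0.99'}
```

A real system would work on streaming data with far smarter heuristics; the point is only that the counting happens at machine speed on both sides.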
24:56
Yeah , I , you know , I always
24:58
talk about planning
25:00
for the future on the podcast
25:02
and and you kind of seem like someone
25:04
that that thinks into
25:07
the future . Right , then you start . You
25:09
start working towards it immediately because it's
25:11
like , hey , if we're going into a post-quantum
25:13
world like we are , I
25:16
need to be experienced with
25:18
it , I need to have some level of expertise
25:21
with it , otherwise in 10 years I'm
25:23
going to be obsolete and I won't
25:25
be able to do anything . Right
25:27
, how , how do you determine
25:30
? You know where things are
25:32
going , where to spend your time
25:34
, what to really focus on ? Because you
25:36
know for myself , right 10 years
25:38
ago , I knew I wanted to get into cloud security
25:41
, right , and now I've been in
25:43
cloud security for a while and now
25:45
I'm shifting gears , getting a PhD
25:47
in how to
25:49
secure satellite communications in a post-quantum
25:52
world . That's a good one . Using
25:55
zero trust principles right .
25:57
Good yeah .
25:59
So I'm also someone that looks towards
26:02
the future and then acts on it and says
26:04
, well , what's going to challenge me , right
26:06
, what's going to make me grow ? And those
26:08
are typically the most rewarding , probably
26:11
most arduous tasks , right , how do you
26:13
approach it ?
26:14
Well , I have some old friends
26:16
who are very , very
26:18
senior tech people and
26:21
sometimes we talk . I
26:23
just had a long call with an old friend of mine
26:25
yesterday who used to be the chief scientist
26:27
at Amazon in the early days In fact
26:29
I had a fellowship there at the time and
26:32
we had this futuristic talk
26:34
because we both were kind
26:37
of laughing about it , because we both see very
26:39
similar things coming two
26:42
years , five years , 10 years . I
26:44
mean there's nothing we can do about
26:46
it . And I found a long
26:48
time ago that if
26:50
you invest in the future you
26:53
will go broke so fast , because
26:56
I tried this , because I always saw what was coming
26:58
and I was almost always right . But
27:00
you can't just because you know something . Like 3D
27:02
printing . I saw that coming years
27:05
before it happened . So when the 3D printing
27:07
companies came up , I said , oh , I'm going to buy stock
27:09
in this stuff . Well , guess what ? I was
27:11
right about the industry , wrong about the companies
27:13
, and I mean that's you
27:15
know that's the kind of stuff that happens
27:17
. But I think futurism
27:21
that's another
27:23
word I mean I sometimes call myself that
27:25
, but many people who
27:28
call themselves futurists are
27:30
frauds . I mean just flat
27:32
They're like televangelists
27:34
, like that level of
27:36
fraud . And when you talk to
27:38
many of these people they
27:40
have a marketing background . They're
27:43
not people like you and I
27:45
. That could you know in a pinch . You
27:47
know , dig into a router and try and
27:50
figure out what's going on . I haven't done that
27:52
stuff in years but I could still do it . They're
27:55
not like that and
27:57
you know that goes back again to
27:59
the kind of the theme that I didn't know I
28:01
had here , but that these skills
28:03
are changing and
28:05
they're going to be less useful . Like
28:07
I tell you , you know you were saying about cybersecurity
28:10
and you give people like a
28:12
test question to see if they're
28:14
serious . I try to talk people out of
28:16
going into computer science and
28:18
I've been doing that for seven or eight years and
28:21
I often give talks to grad schools and
28:24
they get angry , usually because they're , like
28:26
you know , one year away from getting
28:28
their doctorate in computer science . The
28:31
argument I have is computer
28:33
science today and tomorrow will
28:37
be mostly algorithm development
28:39
, and there is only so
28:41
many algorithms that
28:43
you need people to develop , and it's
28:45
a very small subset of
28:48
the number of people running around today with
28:50
graduate degrees in computer science . So
28:52
most people who call themselves computer
28:55
people or technologists they're
28:58
kind of , you know not to be offensive but they're
29:00
kind of webmasters . You know
29:02
they put up a website , they know
29:04
how to do some Java , JavaScript
29:06
. I mean they know what JSON is , maybe
29:08
. I mean they know stuff , but it's
29:11
very , very narrow . It's
29:13
not the way things used to be , where
29:15
you had to know all of this stuff . It's
29:17
like the mechanic guy who can do your spark
29:19
plugs . It's not just General Motors
29:22
. He had to work with Fords and Chryslers
29:25
and you know who else , because
29:27
they were principles .
29:29
So yeah
29:33
, that is a really good point . You
29:36
know , I don't even know what , like
29:38
what they would get a PhD in
29:40
computer science and like what does
29:42
that even look like ? Because in your bachelor's
29:44
you're learning , you're
29:46
learning , you know the bits and
29:48
you know hexadecimal , you're learning
29:50
C++ and all
29:53
that sort of stuff and I I didn't get my bachelor's
29:55
in that area . I actually got my bachelor's
29:57
in criminal justice and you know
29:59
, wanted to , wanted to go the federal agency
30:01
route and I kind of stumbled into
30:03
it and found it to
30:05
be a lot more interesting in some
30:07
ways . But what
30:10
does that even
30:12
look like for a PhD in computer
30:14
science ?
30:15
Yeah , I think your point's a good one . I never thought about
30:17
that . Basically , everything
30:19
you need to know about computer science you can get
30:21
as an undergraduate , right
30:24
. That's kind of what you're saying and that's absolutely
30:26
true . The stuff that paid
30:29
off for me in the long run was
30:31
stuff like knowing how to build a compiler
30:33
. So I took a couple of grad level
30:35
classes in that and I
30:38
did build compilers , but they
30:40
were like natural language compilers , so
30:42
you can apply that technology
30:44
to many other things if you understand
30:47
what that technology is and that
30:49
kind of thing . Like , I was a
30:51
Lisp programmer for a while if you know
30:53
anything about Lisp . So
30:56
Lisp was the language for AI for many
30:59
years . But it's a
31:01
crazy programming style . It's
31:03
all recursion , so you have to be I
31:05
mean all of it , that's what it does . So
31:07
you have to understand
31:09
recursion or you cannot possibly
31:12
program . So those programmers
31:15
are pretty much gone now , but that
31:17
was a skill I had to learn from school
31:19
.
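The all-recursion style David attributes to Lisp is easy to show outside Lisp too. A minimal Python sketch of the pattern (function names are my own illustration, not anything from the episode): instead of a loop, each function handles a base case and then calls itself on the rest of the list, which is essentially how Lisp-family list processing works.

```python
def total(xs):
    # Base case: the empty list sums to 0.
    if not xs:
        return 0
    # Recursive case: first element plus the sum of the rest.
    return xs[0] + total(xs[1:])

def squares(xs):
    # Same pattern for building a new list, the way Lisp's map works.
    if not xs:
        return []
    return [xs[0] ** 2] + squares(xs[1:])

print(total([1, 2, 3, 4]))  # 10
print(squares([1, 2, 3]))   # [1, 4, 9]
```

Once this shape is second nature, as he says, it transfers to anything built on self-similar structure — parsers, tree walks, compilers.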
31:22
Huh , I guess it makes sense
31:24
. For , yeah
31:27
, I guess , just thinking about
31:29
from an education perspective , right , it
31:32
makes sense to get that undergrad degree
31:34
in computer science , if you're going to go down
31:36
that path and whatnot , and then it
31:38
probably makes more sense to get you know
31:40
these onesie , twosie classes of
31:43
developing core technology
31:46
types right , rather than
31:48
even going down the path of getting a master's
31:50
like a full master's . Just get those
31:52
courses , get that skill and
31:54
then build from there . You know
31:56
, because those are skills that
31:58
really you know you can build off
32:01
of right and it'll transform into
32:03
something else where you're using it with
32:05
AI and building a model .
32:07
Well , every once in a while , because of the kind
32:09
of stuff I do , I run into hybrid
32:11
people now . I mean younger
32:13
, typically , people
32:16
that have an undergraduate degree in computer science
32:18
and then they get a law degree . I've
32:21
run into half a dozen doctors
32:23
who started off as IT
32:25
people and then they went to medical school
32:28
. And these people are they're killers
32:30
because they can do
32:32
stuff none of their colleagues can do . So
32:35
when they get out into that world , the
32:37
legal world , the medical world , everybody
32:40
relies on them for anything that looks
32:42
like a computer , and I'm
32:44
talking like litigation . Is the hospital
32:47
going to buy a new $150 million
32:49
automated surgical robot arm
32:51
? Well , let's ask Joe , because
32:53
he's got the computer science degree , although
32:55
you said you didn't , but even so . So
32:59
I mean , I think that's very
33:01
powerful . I don't see the
33:03
specialization requirements
33:05
anymore .
33:12
Yeah , that's actually very true . You know , I think this is kind of how I approach
33:14
it . You know , when I was getting started
33:16
, I wanted to learn as much as I
33:18
possibly could about everything
33:21
. There wasn't a specific technology
33:23
that I wanted to focus on or a specific
33:26
domain or anything like that , and so I got
33:28
experience , you know , with
33:30
WAFs , right , and then regular
33:32
firewalls and EDR systems . I
33:34
have experience with all of the big
33:37
EDR systems . You know , when
33:39
a lot of people , a lot of people , will
33:41
say I only know CrowdStrike or
33:43
I only know X EDR
33:45
, right , I have a full spectrum
33:48
of experience across almost
33:50
every single domain in security . And
33:53
then I went through and
33:55
I decided , okay , I'm going to specialize
33:57
in cloud security . And now I'm kind of taking
33:59
a step back and I'm upskilling right
34:02
on the PhD side with post-quantum
34:04
encryption on satellites , two
34:07
things that have
34:09
so many different facets to them that I've never
34:12
touched before , right , while
34:14
I'm also going back in my career
34:16
and getting more broad , getting more
34:18
generalized and specializing
34:21
in a few niche areas , but
34:23
still building , you know , a
34:25
stronger , I'd just say , building
34:28
a stronger overall experience , right , because
34:30
you know something I'll learn
34:32
in network security or
34:35
with a WAF or whatever it might be , will
34:37
benefit me in vulnerability management
34:39
and it'll benefit me in other areas
34:42
.
34:43
Knowing the concepts will
34:45
pay off throughout your entire lifetime ,
34:51
far more than memorizing
34:53
tables or something like that . Understanding the concepts is really
34:55
, really important , and I think that gives people
34:58
survivability in the marketplace
35:00
. So you know . Something else to consider
35:02
. When I went to college and
35:05
my first degree was in the 70s
35:07
, late 70s , there
35:09
was no computer science degree . You couldn't
35:11
get one . You had to get a
35:13
math degree . Carnegie Mellon , I've got a couple
35:15
of friends who went there . They got math degrees
35:18
and then they ended up being computer programmers , but
35:20
that's all they could do . So we
35:23
changed job titles , especially
35:25
in this country , every seven
35:27
or eight or 10 years . Look
35:29
at what a lot of people do now in
35:32
LA and New York and Chicago
35:34
, and if you go to a room of millennials
35:37
, you know at a bar or something , and say , what do
35:39
you do ? I guarantee you
35:41
at least a third of them have professions that
35:43
I may not even know what they are and they did
35:46
not exist 10 years ago . I'm
35:48
an SEO specialist , okay , well
35:51
, what is that ? I mean
35:53
I know what it is . I'm exaggerating , but
35:55
most people my age wouldn't
35:57
, and it's because the professions
36:00
have changed . So put your futurist hat on
36:03
for a while
36:05
and think five , six , seven , 10
36:07
years . What kind of professions are
36:09
we going to look at ? Well , I bet a lot of them
36:11
are going to have the word AI in them , and
36:14
they're not going to be building AIs
36:16
, they're going to be training
36:19
AIs or they're going to be servants
36:21
to AIs . So when the AI
36:23
needs like a cup of coffee or something ,
36:25
metaphorically , that's what
36:27
you'll do , because they don't need
36:29
us to do anything like this . They
36:32
need us to feed them data , but they've
36:34
already eaten all the data . OpenAI
36:36
announced , I think a week or two ago , that
36:39
they've now looked at every single piece of data
36:41
that they could possibly look
36:43
at and they're now building systems
36:45
that generate false data
36:47
that they can use for training the rest
36:49
of the systems . Sounds goofy
36:52
, but what
36:54
that means is those are machines that are now
36:56
training themselves . I mean
36:58
, look at programming . I mean OpenAI
37:00
, like the ChatGPT stuff . I'm
37:02
sure you've tried to write programs with it
37:04
. Everybody has . Oh , yeah , they're not
37:07
bad . Yeah , I mean , they're not
37:09
the most clever thing I've ever seen , but
37:11
they work , they compile and
37:13
they do the thing they're supposed to do . So
37:15
you know . And then , not
37:18
to get too spiritual
37:21
here or anything , but you take that idea
37:23
of technology and then you put
37:26
drones and robots
37:28
and Tesla . My
37:31
Tesla has a summon feature that
37:33
I am terrified to use . I
37:35
tried it once in the middle of DC and
37:38
we were like two blocks away and I hit
37:40
the button and then it comes barreling down
37:42
Connecticut Avenue with nobody
37:44
at the wheel . And this
37:47
isn't Waymo , this is like a car driving
37:49
itself with no real particular direction
37:51
in mind . So when you start
37:53
looking at that , I mean what
37:56
are we looking at ? We're kind of looking at Skynet
37:58
.
37:59
Yeah , yeah
38:01
, that's a really good point . Where
38:05
do you think AI
38:07
security fits into the
38:09
development of AI ? I
38:12
know that we talked about that offensive
38:14
and defensive component , but when
38:16
we're talking about models , it's
38:19
a little bit different , right , because
38:21
you almost have to , you know . It's
38:23
like you have to monitor what
38:25
the model is consuming . And
38:30
you know this is the thing , I
38:32
don't know how else to explain it , right . You
38:35
wouldn't want the model right
38:38
to look at Nazi Germany and
38:40
say that is good
38:42
, that's something that we want
38:44
to propagate , but you don't
38:46
want to keep that information from
38:48
the model Right , and
38:53
so you get into a weird dilemma
38:55
.
38:55
Do you remember there was an AI that Microsoft
38:57
did about 10 years ago that
39:00
they had to pull off the market because it
39:02
became a racist ?
39:04
Yeah , you know what , I'm just looking it up because
39:06
I'm here . Yeah , Tay , T-A-Y
39:09
.
39:09
So it did exactly what you're
39:11
saying . So they fed it a bunch of
39:13
stuff and they didn't really constrain
39:15
what it ate on the websites
39:18
and it hit a bunch of white supremacist
39:20
sites and then basically it
39:22
was saying Sieg Heil and it
39:25
was saying like a whole bunch of anti-Semitic
39:27
stuff and it used
39:29
the N-word with people in casual
39:32
conversations . So Microsoft
39:34
shot it in the head and they never revived it again
39:36
. Wow , that's
39:40
not programming . That's not programming
39:42
. That is its interpretation
39:44
of the data that was input to it
39:46
.
39:47
Right , right , so
39:49
like that . That's kind
39:52
of where I think AI security comes
39:54
into play , right
39:56
, where it's kind of more about monitoring
39:58
what the model is consuming and
40:01
trying to figure out . See , I
40:03
always view it and people at NVIDIA
40:05
argue with me on this as
40:07
like an AI model hierarchical
40:09
system where you have
40:12
an overarching ai model that you want
40:14
people to consume and interact with
40:16
and whatnot , and then that AI
40:18
model is fed off of other models
40:20
that are looking at specific
40:23
topics . So it's almost like that model gets
40:25
specialized into a
40:27
certain area , like maybe world history
40:30
or European history or
40:32
, you know , sports , right
40:34
, the finance industry , and when
40:37
that model reaches a certain level
40:39
of maturity , it starts feeding that upper
40:41
level model that information
40:43
to query , to interact
40:46
with , for users like us
40:48
to start querying it and building
40:50
different things from . I think that that
40:52
might be the only way to do it . But again
40:54
, you know NVIDIA , those geniuses over
40:56
there they argue with me that that's not a great
40:58
way .
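The hierarchy described here — a top-level model that people interact with, fed by domain-specialized models underneath — can be sketched roughly like this. All names and the keyword routing are illustrative assumptions, not NVIDIA's or anyone's real architecture:

```python
# Toy sketch of a hierarchical model system: a top-level model delegates
# queries to "matured" domain specialists; otherwise it answers itself.
SPECIALISTS = {
    "history": lambda q: f"[history model] {q}",
    "finance": lambda q: f"[finance model] {q}",
    "sports":  lambda q: f"[sports model] {q}",
}

def top_level_model(query: str) -> str:
    for domain, model in SPECIALISTS.items():
        if domain in query.lower():
            return model(query)       # delegate to the specialist
    return f"[general model] {query}"  # no specialist matched

print(top_level_model("Summarize today's finance news"))
```

In practice the routing would itself be learned rather than keyword-based, which is roughly where the disagreement described above comes in.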
41:00
Yeah , I would argue that too . Actually , I mean
41:02
part of the problem here . Here's another
41:04
weird comment that I
41:07
don't think a lot of people make . We have hit , for
41:09
the first time in
41:11
the lifetime of computers ,
41:13
the point where you can no longer backchain
41:16
why a computer had an
41:18
answer . We could always
41:20
do that before . It might take a while
41:22
, but if somebody said , god damn it
41:24
, why did the computer do that , why
41:26
did it shoot down that airplane , you know
41:28
, a week later somebody's going
41:30
to tell you why . That
41:32
day is gone . With generative
41:35
AI systems , it is completely
41:37
impossible to backchain
41:39
those guys and to get like a
41:41
stack dump and find out exactly
41:44
why they did what they did . So
41:46
as that is , I mean , that's
41:48
pervasive across this industry
41:50
. So as that continues
41:52
, you know . Well , that was like
41:54
the racist bot at Microsoft
41:56
. They knew what it was because
41:58
they went back and looked at the websites that
42:00
it was looking at . But in
42:03
the future there's already so many
42:05
of them . How would you know ? And it's not
42:07
like they have to look at whitesupremacist.com
42:11
to get that pulled in . They can just go to Twitter
42:14
or X or any of a number of other ones , and
42:19
they can find all of that crap in
42:21
free speech forums . So
42:24
, from a cybersecurity viewpoint
42:26
, you go back to your question . I
42:28
don't think you can look at the input , I think
42:30
you have to look at the output . So
42:33
I think cybersecurity for AIs
42:35
is going to be like
42:37
it's like you have an attack dog
42:40
and it's been trained and you're
42:42
walking around with it on a leash to make sure it doesn't
42:44
bite anybody . And I think that
42:46
is what AI is going to be like , because
42:48
you won't know why it's doing it and
42:51
there will not be a human
42:53
understandable correlation
42:55
of causality between reading
42:57
this post on X and
43:00
deciding to use the N word
43:02
in a forum . You just won't know . So
43:05
you just have to wait until it screws
43:07
up and then you have to roll up a newspaper
43:09
and hit it in the nose . Huh
43:11
.
43:13
That's . It's fascinating . I feel like
43:15
we could go for another hour just talking
43:17
about any of the 10 topics we
43:20
just dove into . You know , unfortunately
43:22
we're almost at the end of our time , but I
43:24
really want to dive into the
43:27
stuff that you're working on
43:29
now , right . So you were discussing
43:31
building or
43:34
working on , you know , Web3 and
43:36
post-quantum . So talk
43:38
to me a little bit about that , because I don't want to butcher it
43:40
and this can get pretty complex .
43:42
No , actually , I wanted to do that . So
43:44
I'm working with a company called
43:46
Naoris . It started in Portugal
43:48
, but it's a global company and
43:50
it's a Web3 company , and the
43:53
founder developed some really
43:55
cool security approaches to
43:57
where you have these little lightweight processes that
44:00
can be very quickly ported
44:02
to any device you know routers
44:04
, computers , whatever , and even IoT
44:07
devices and then , when
44:09
there is a possible attack
44:12
or there's some suspicious
44:14
looking , you know , packets start coming
44:16
in . Instead of going
44:18
out to like Microsoft
44:20
or CrowdStrike or something and saying , is
44:22
this okay ? What it does is
44:24
it has a blockchain
44:27
attached to it and it has a vote
44:29
, but not just its computers
44:31
, but other networks that are
44:33
in , like the big meta network . So
44:36
some computer you know in
44:38
Berlin will vote on this based on
44:40
their profiles , you know , like
44:42
virus profiles , right . So there'll
44:44
be that kind of thing and and it works really
44:47
, really well . And the demo
44:49
we've been using is we
44:51
have a robot arm and we hit it with
44:53
an attack , put
44:55
a virus on it , and then our
44:57
system is able to . When
45:00
we do it again with our system running , it
45:02
deflects the virus and it won't accept it
45:04
as input , and so now we've added post-quantum
45:06
onto that . So the attractive
45:09
part about this system , to me at
45:11
least , is it's decentralized
45:13
. So if you're a company
45:15
and you buy a system like this and
45:18
you run it and something goes wrong
45:20
, it's your IT guy's fault , it's
45:22
not Microsoft , and I think
45:24
that's very empowering . That's
45:27
interesting .
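The voting step described above — peers across the wider network weighing in before a suspicious traffic profile is accepted — might look conceptually like this toy majority vote. This is a sketch of the idea only, not Naoris's actual blockchain protocol, and the node names are made up:

```python
from collections import Counter

def network_verdict(votes):
    """Majority vote across peer nodes on whether a traffic profile
    is malicious -- no central vendor in the decision loop."""
    winner, _count = Counter(votes).most_common(1)[0]
    return winner

peer_votes = {
    "node-berlin": "malicious",   # each node compares against its own profiles
    "node-lisbon": "malicious",
    "node-nyc":    "benign",
}
print(network_verdict(peer_votes.values()))  # malicious
```

The decentralization point follows directly: the verdict is computed from many independent nodes' profiles, so no single company's feed has to be trusted (or blamed).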
45:28
We spent the whole time talking
45:30
about how that control
45:32
or that empowerment is going away
45:35
from us and more towards
45:37
technology or these thousand-pound
45:39
gorillas in the industry and
45:42
that's interesting how it's bringing it
45:44
back , how it's bringing that
45:46
ownership back to us almost in
45:48
some ways .
45:56
Well , I think I mean I'm kind of an anarchist at heart really . You would never know from my background
45:58
, but whenever I see things getting too institutionalized
46:00
it gets my hackles up . And
46:02
the government never bothered me
46:05
because I'd been in the government and the
46:07
government's fundamentally incompetent , no
46:09
matter who's president , and they can't really
46:11
do things . They say they're going to do things
46:13
, but it takes them like a decade
46:16
to do almost anything . The
46:18
thing to worry about is guys like Zuckerberg
46:20
, you know , and those people , the
46:22
billionaires , that are not stupid
46:25
, that have lots of
46:27
assets . Elon Musk is probably
46:29
a better example , because he'll
46:31
do almost anything , potentially , if
46:33
it suits his interest . I
46:35
worry about those . So the more
46:37
we take our technology
46:40
out of these people's hands , the
46:42
better off we are .
46:43
Yeah , well , it also enables us
46:45
to maintain our own privacy right
46:48
, which has been something that you
46:51
know doesn't really exist .
46:54
I wrote a couple of books in this . I wrote a book called
46:56
Privacy Lost , still on Amazon
46:58
. It's been there for 14 years . I predicted
47:01
a lot of this stuff and
47:03
I think that the trick to privacy is
47:05
you have to accept the idea that
47:07
the old definition
47:09
of privacy is irrelevant . Privacy
47:12
is not binary , it's not . And
47:15
baby boomers talk about
47:17
it , and Gen X people . They go oh , I lost
47:19
my privacy , oh , I got my privacy back
47:22
. It's not virginity , it's
47:25
not like that . It's not binary . Privacy
47:28
is like uptime on a network . You
47:30
know 99.9999
47:32
. It's like four nines or three nines or two
47:34
nines . That's what privacy is , and
47:37
you have to expend a certain amount of energy
47:39
and time and money to achieve
47:42
each granularity level on privacy
47:44
. But people don't want to
47:46
spend that money because they think they're entitled
47:48
to it anyway . So
47:50
that's going to be a problem too .
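The nines analogy above borrows from availability arithmetic. As a rough illustration, here is the uptime version of the scale — what each extra nine means in minutes of downtime per year, with the cost of each level rising the same way privacy effort does:

```python
# Downtime per year implied by each "nines" level of availability --
# the scale being applied to privacy in the conversation above.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

for nines in (2, 3, 4):
    availability = 1 - 10 ** -nines           # 2 nines = 99% , 3 = 99.9% , ...
    downtime_min = MINUTES_PER_YEAR * 10 ** -nines
    print(f"{nines} nines: {availability:.3%} up, ~{downtime_min:,.0f} min down/yr")
```

Two nines of uptime still allows over 5,000 minutes of downtime a year; four nines allows under an hour — and each step costs disproportionately more, which is the point about privacy.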
47:54
Yeah , yeah , that is definitely going to be a problem
47:57
. I feel like to some extent
47:59
that
48:01
whole debate , that whole talk , has
48:03
almost kind of been pushed to the back burner
48:06
in some ways . You know
48:08
, I I always remember the
48:10
first time I went to Germany
48:12
for a study abroad in college .
48:15
When I was on the plane , it
48:17
had just broken from the Snowden
48:19
leaks that we were spying
48:21
on Germany , right . So when
48:23
I get off this plane , I have a connecting flight
48:25
to make it to Berlin . I'm in Germany
48:28
, I'm in Düsseldorf , right , and I'm
48:30
going through customs and this guy doesn't want to stamp
48:32
my passport and I'm sitting here like
48:35
hey , man , I have a flight in 20 minutes
48:37
. I have to run across an airport in
48:39
Germany . I don't know where I'm going to
48:42
catch this flight , and hopefully I get the
48:44
right one , right . And so
48:47
I started to argue with him , right , and
48:49
it eventually because there were no TVs
48:51
around me or anything like that he
48:54
eventually just stamped it . His boss came over
48:56
and stamped it and by the time I get to my
48:58
gate , I sit down for five minutes and
49:00
I see America spying on Germany
49:03
since whatever year , and
49:05
I was like that's not good for
49:07
me , because now I just came here and
49:09
I yelled at that guy and they're
49:11
probably looking at me a different way now , but I
49:13
mean of course we were .
49:16
Everybody spies on everybody , and
49:19
you know it's like this TikTok thing
49:22
which is absolutely ludicrous . I mean , it's not ludicrous
49:26
to think that TikTok is gathering personal
49:28
information . It's ludicrous to think
49:30
they aren't , and in fact
49:32
, I would be shocked if they weren't doing that
49:34
. And guess what ? I bet
49:36
Meta does it , and Instagram and
49:39
Facebook , and I bet Elon
49:41
Musk does it with X , and
49:45
I bet , you know , every one of these social media platforms does
49:47
it . Microsoft does it . Surely you've noticed
49:50
this . But if you buy stuff
49:52
, software you used to buy , like Microsoft
49:54
Office or Adobe Photoshop they
49:56
have switched to these serialized
49:59
license models which require
50:01
a lot more information from you
50:03
, and so not only
50:06
do they want the money , they also want the
50:08
information .
50:09
Yeah Well , these products
50:11
, you know they can be free to
50:13
some extent because we're
50:15
the product . You know they're taking
50:17
our data and they're selling it to
50:20
whatever broker and you
50:22
know it's a mess . And I don't know
50:24
how we come back
50:26
from this perspective without
50:29
having something like Web3 , you
50:31
know , widely deployed , widely accepted
50:34
and , you know , building from there
50:36
.
50:37
That's why I'm interested in Web3
50:39
. I mean , my
50:42
basic meta thought on this is
50:44
I think individuals need
50:46
to be armed with cyber weapons
50:48
and , like when I was
50:50
at Network Solutions , I was running a
50:52
thing called the internet , which is the DNS
50:55
system and other stuff , and I had
50:57
to defend against the first , as far as I know , institutional
51:00
denial of service attack . A big
51:03
one . No one had ever seen one before and
51:05
they were really stupid and anybody
51:08
today could have stopped it . They were just like
51:10
smurfing on
51:12
some broadcast address
51:14
. But we had to decide what to do because
51:17
there was no precedent and no policy
51:19
. And I made the decision let's
51:21
find out the IP address and let's
51:24
, like , blow them up out of the water
51:26
. And we did , and that
51:28
was my approach . If somebody did that
51:31
to us , I would find out what network
51:33
they were at and I would blow their
51:35
network out of the water . And then I didn't
51:37
have to worry about Smurf attacks . So
51:39
I don't even think you can do that now
51:41
, but yeah .
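For context, the smurf attacks described above amplified ICMP echo requests by addressing them to a network's directed-broadcast address, so a border filter only has to spot that destination. A minimal sketch of the check (illustrative only, assuming the defender knows the protected CIDR; not the actual Network Solutions response):

```python
import ipaddress

def is_smurf_candidate(dst_ip: str, protected_cidr: str) -> bool:
    """Flag packets addressed to the network's directed-broadcast
    address -- the amplification vector in a smurf attack."""
    net = ipaddress.ip_network(protected_cidr, strict=False)
    return ipaddress.ip_address(dst_ip) == net.broadcast_address

print(is_smurf_candidate("192.168.1.255", "192.168.1.0/24"))  # True
print(is_smurf_candidate("192.168.1.10", "192.168.1.0/24"))   # False
```

This is why the attack is "really stupid" by today's standards: routers have long since disabled forwarding to directed-broadcast addresses by default, which kills the amplification.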
51:44
It's like Battleship . Yeah , it's like playing
51:46
Battleship .
51:47
It absolutely is , and
51:49
that's how things are with the Chinese in the US
51:51
right now too , right .
51:54
Well , David , you know we're
51:56
at the top of our time . I try to be very conscious
51:58
of my guest's time . You know , when I say it's an
52:00
hour , it's an hour . But
52:02
I mean this conversation has been very
52:05
fascinating , very engaging , and
52:07
I definitely want to have you back on .
52:15
This is a fantastic time . Yeah , well , thank you . I've enjoyed
52:17
it too , and I looked at
52:19
some of your other episodes with guests , and I mean the stuff you're doing
52:21
is really relevant right now .
52:23
Yeah , I try to be . You know I don't want to put out
52:26
dated information . I want the podcast
52:28
to actually , you know , have value and
52:30
show value to my listeners
52:32
. Well , you know , David
52:34
, before I let you go , how about you tell my audience
52:36
you know where they could find you , where they could find
52:38
your company , that you're doing this great
52:40
work with ? Uh , if they wanted to learn more , the
52:52
company is called Naoris .
52:53
It's a Portuguese word . It's naoris.com , N-A-O-R-I-S
52:56
. You can find me at DavidHoltzman.com
52:58
or GlobalPOV.com is another website I use
53:00
and my email address is on there
53:02
and if anybody wants to reach out , I'm
53:05
I'm pretty accessible .
53:07
Awesome , awesome . Well , thanks David
53:09
for coming on . I'm definitely going to have to have
53:12
you back on , and you
53:14
know . Thanks everyone for listening . I hope
53:16
you really enjoyed this episode and more
53:18
to come in 2025 . Thanks .