Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:53
How's it going , Colby ? It's great to
0:55
get you on the podcast . I think that we've been
0:57
planning this for quite a while at this point
0:59
, and we've had to delay it , of course
1:01
, a couple times , but I'm glad to have you on .
1:04
Yeah , Joe , appreciate it . Sorry
1:06
for the delays too . It's a busy time of
1:08
year with travel and everything going
1:10
on . Crazy .
1:19
I , like , burned myself out three times this year and I'm just recovering from
1:22
my last one of the year , hopefully the
1:22
last one of the year . It's , I don't know , it's an interesting
1:24
thing , right
1:29
, trying to develop
1:31
like outside
1:33
, outside things from
1:35
the nine to five . Right , trying to
1:37
develop , you know , a brand
1:40
for yourself and trying to like
1:42
I'm starting to dive more into consulting
1:44
. Right , and , you know , provide
1:46
companies with cloud security . You know
1:49
consulting services and whatnot . And when
1:52
you start adding on those things
1:54
, right , like you have to use your
1:56
time so efficiently now you know
1:58
, especially with a little one , right
2:00
, like I have a 20 month old at home
2:02
.
2:03
Okay .
2:04
Yeah , thanks . You know like I make a
2:06
very concerted effort to
2:09
always be available for
2:11
her , right ? So when she's up I'm not working ,
2:13
right ? Like I'm spending time with
2:15
her . That , you know that takes out
2:17
, you know , six hours a day . It's
2:20
like okay . Well , let's like use
2:22
the time that I have as efficiently as
2:24
possible yeah , absolutely
2:26
.
2:26
It's like burning the candle at like four ends
2:28
I have a four month old at home so
2:31
I'm right there , first one . So
2:33
I'm learning that whole side of it ,
2:35
which is , you know , a blessing
2:37
, it's fantastic . But
2:40
then you know trying to build a startup and
2:42
you know running around customer
2:45
acquisition and my wife is in
2:47
the wine business . So there's , you know
2:49
, another startup kind of going on at
2:51
the same time and it's
2:53
, yeah , there's like candles burning all
2:55
over the place and trying to time , manage
2:58
and be efficient
3:00
while you're . You know growing
3:02
a brand , building a business . Right , you
3:05
know a lot of people choose to do
3:07
just one of those things at any given
3:09
time in life , and you know as
3:11
well as like throw in a new
3:13
family member at the same time
3:15
.
3:16
Yeah . So it's definitely
3:18
interesting
3:21
, you know , like I feel like
3:23
, you know , when I , when I was growing up , right
3:25
, when I was graduating
3:27
high school , starting to get into college , right
3:30
, the recession hit Right , and so that
3:32
impacted my family very significantly
3:34
. You know , my dad lost his job ,
3:37
right . He couldn't find work for like
3:41
two years . That was a very stressful time , right , and going through
3:43
that , you know , puts me into
3:45
a situation , or a mentality at
3:47
least , where it's like , okay , well , my kid's never going to go
3:49
through that , you know . Yeah , and
3:52
now you know the market is in a weird place
3:55
. It's kind of in limbo . It's been in limbo for
3:57
a couple of years now where it's like
3:59
, well , surely it can't keep on going
4:01
up . And then it goes up more .
4:03
That's what everybody said about Bitcoin .
4:05
Yeah , well , like you know
4:08
, the risk side of me is like , ok
4:10
, sure , like it looks great , but
4:13
what's actually going on here ? You know , because
4:15
it can't go up forever and just
4:17
by odds alone , you
4:20
know , the timing is not in our favor
4:22
, right Like the recession , there will be a
4:24
correction , right . And so trying
4:27
to develop other methods of
4:30
, you know , bringing in income and , you
4:32
know , building something that I actually love
4:34
and enjoy and whatnot is , it's a challenge
4:37
for sure , and I , like I always tell
4:39
my wife I'm
4:41
like , hey , you're the stable income . Yeah , I know I have
4:43
a salary and everything , but you're
4:46
the stable one , you're a teacher , they're
4:48
never going to fire you . You work for CPS
4:50
, you are good , you
4:52
need to be there .
4:53
That's right . That's right . Yeah
4:56
, the market's interesting right now . I mean , we're seeing
4:59
just as we work
5:01
with customers . You know there's some
5:03
spending starting to open back up , but
5:06
it's mostly critical projects
5:08
and most of the projects
5:10
we're seeing are around how to reduce costs
5:12
. You know , in
5:14
in areas and I think
5:16
there's a lot of noise in
5:19
the market right now with AI and how
5:21
AI is going to reduce all these costs everywhere
5:23
and I don't think people are really seeing
5:25
that value happening . Maybe it's a
5:27
hallucination , yeah
5:30
.
5:31
I feel like the AI trend
5:34
right , or the AI evolution , is
5:36
almost like it's still in
5:39
its infancy . I feel I talk to people from
5:41
NVIDIA . They kind of argue that
5:43
it's in the middle . It's going towards the
5:45
middle a little bit . Well , from
5:47
the end user perspective . You
5:49
know I've been buying tools
5:51
, working with tools with , you know , an alleged
5:54
AI behind the scenes for
5:56
10 years , right , right , and
6:00
I've seen my costs only go up , you know . So like that's something
6:02
there , that's not something
6:03
I can just ignore .
6:05
I like how you said alleged AI , right , and I'm sure you've
6:07
seen the Scooby-Doo meme
6:10
where they pull the mask off of AI
6:12
and it's if-then-else .
6:14
Yeah , yeah , exactly
6:16
, I mean it's , I
6:20
don't know , it's a tough pill to swallow
6:22
, right , because that was like a huge selling point
6:24
for so many years of like . Oh yeah , we
6:26
have a next gen
6:28
, we have an AI thing , you know
6:30
, and I
6:34
feel like the benefits of AI
6:36
where they may be , you
6:38
know , true and valid and whatnot , like cost
6:40
savings and whatnot I feel like we
6:42
won't even see those real benefits
6:44
until , you know , probably five
6:46
to 10 years from now , and
6:48
I know there's a lot of AI people out there that
6:50
are probably going to , like , you know , laugh
6:53
at me or whatever for saying that
6:55
, but , like , look at me
6:57
saying five to 10 years . I mean , they've
6:59
been saying it for 20 years , right
7:01
. Right , they've been saying AI is going
7:03
to eliminate everyone's job for 20 years
7:05
.
7:06
Right , right , since , like Terminator
7:08
days , right , but I
7:10
don't know . I mean , I use it tactically
7:13
for certain things here and there , but I
7:15
would certainly you know . Let's
7:17
say I was writing an email to my board
7:19
about how we're doing
7:21
as a company , or what have you . Or I was writing an email
7:23
to a CISO , a customer that
7:26
we're working with . I would never
7:28
let AI do that for me , right
7:31
? I just you know what , if it hallucinates
7:33
and , you know , says something that is
7:36
just not true , or whatever , at the
7:38
end of the day , you're the one who's accountable
7:40
, right ? so yeah , I .
7:42
I don't think that most people
7:44
are really at risk of AI taking
7:46
over their jobs anytime soon , right ?
7:49
, yeah , if it can't , if it's not
7:51
even responding to emails for
7:53
me or proactively right
7:55
looking at things and responding
7:58
and stuff like that Like it's more
8:00
of an assistant you know everyone now has
8:02
like a personal assistant that you can bounce
8:04
ideas off of and get information off of .
8:06
I view it as kind of like
8:08
the next iteration of a search
8:11
engine , almost you know , that's how
8:13
I use it for sure . Exactly
8:15
.
8:15
All right , I think it is definitely the next iteration
8:17
of a search engine . It saves you from
8:20
having to collate all the results yourself and
8:22
it kind of formulates an opinion . The
8:25
question is then how much do you trust
8:27
that opinion and how much additional
8:30
due diligence do you do ? And I guess it depends
8:32
on the importance of the decision , right , Right
8:35
, you know if I use it , we've
8:37
been , obviously , with a newborn . We've been checking
8:39
a lot on the internet . Well , every time
8:41
something happens with the baby , you Google it . Right
8:43
. You're like , what is this ? You know
8:45
how it goes right , and some
8:47
of the responses come back and you're like , yeah , okay
8:49
, I get it . But if it came back and said
8:51
like , oh , you should do this remedy
8:53
or something , I'm certainly going to check with
8:56
a doctor before I , you know , just
8:58
base that decision off of that .
9:00
Yeah , yeah . It's
9:03
, you know , we're going into
9:05
an interesting place , right
9:07
, because I , you know , we just
9:10
got , we're on the other side of the election , right
9:12
, and it has
9:14
been an interesting year because
9:16
I feel like a lot more people are , I
9:19
guess , more aware , right , of
9:21
the media that we're consuming
9:23
and what it's actually doing to us . I
9:25
hope so . I would certainly
9:28
hope so too , you know , and you
9:30
know you always , you always hear like oh
9:32
yeah , you know you're being targeted by these kinds
9:34
of ads and whatnot , right , and so
9:37
I live in
9:39
Illinois , right
9:45
very much . I mean , it's a blue state , and the county that
9:48
I live in , it just happens that like 80% of the people
9:50
in the state live in the county , right , and I get
9:52
I don't want to say targeted , right , but
9:54
my algorithms are heavily
9:57
based on where I live and where
9:59
I search them from and everything , right
10:01
, and I search things that are typically
10:04
, I view it as like right in the middle , right
10:06
on the political spectrum and I'm not trying to get
10:08
political on this podcast or anything like that , right
10:11
, but it's fascinating for
10:13
us from a security perspective to see
10:15
what's going on kind of behind the scenes . And
10:18
so over the summer , my family
10:20
and I went for a vacation over in Tennessee
10:22
, right , went there for like a week just hanging out
10:24
, right . Tennessee's a red state , right
10:26
? For anyone that doesn't know . Which , I
10:30
guess is probably a stupid thing to even say , because I talked
10:32
to people over in , like Russia , and
10:35
you know Europe , and they like
10:37
know our political system a little bit better
10:39
than us , almost , yeah , probably . And
10:42
so I go to Tennessee and
10:44
my entire feed is stuff
10:46
that , like , I have never watched
10:49
. I don't subscribe to the channels
10:51
, like none of it was
10:53
for me to click on , right , so , you
10:56
know , I didn't pay any attention
10:58
to anything that was in my youtube feed
11:01
, my Google News feed , none of it , right , because
11:03
it didn't even appeal to me
11:05
, right , so I didn't think anything of it . And
11:08
then , you know , it happened the next day and the next
11:10
day after that I'm like man , what the hell is going on
11:12
here ? Like this is literally nothing that
11:14
I even watched . Like I don't want to watch any
11:16
of this . What is going on ? Yeah , and
11:26
you know , sure enough , right , like you're being targeted based on your region , which is a dicey
11:28
thing , right , because it's like , well , how much of my opinion is being shaped
11:31
by where I live
11:33
, and where I live determines
11:35
what I'm being targeted with , right ? And
11:40
you know it's a weird situation . You know , and to quickly
11:42
you know , go through this one point
11:44
right with AI , how we're
11:46
using it as a search engine . You
11:53
know , I saw someone on social media they're from Canada
11:55
, right , and they put into like ChatGPT , you know , when was the
11:58
first Trump assassination attempt ? I
12:01
mean , this is a factual
12:03
thing that happened . It took place
12:05
at a date time
12:07
at a certain place , all that sort of stuff
12:10
. Any search engine should be able
12:12
to give you those exact specifics
12:14
. And he
12:16
said that essentially , ChatGPT , you
12:18
know , tried to just go around the question
12:21
, didn't even answer it . You know , said
12:23
that it never occurred or anything like
12:25
that . And he had to like really prod for it
12:27
. And so I
12:30
thought to myself well , surely if
12:32
this LLM is learning from itself
12:34
, it knows hey , I made a mistake
12:36
there . Let me go readjust and pull
12:38
in other feeds and , you know , recalibrate
12:41
right . So I mean , a
12:43
couple days after , I went ahead
12:45
and just put in the same question it was like
12:47
the exact same question . You know when was the first
12:49
Trump assassination attempt ? And
12:52
it literally said there was no assassination attempt
12:55
. And I had to go and say
12:59
no , there was one . And
13:01
it pulled up some 2017
13:05
event where someone threw a shoe at him or whatever
13:07
, and I said no , it happened in 2024
13:10
. And
13:15
I had to literally feed it . I mean several steps down , because even after saying
13:17
2024 , it still
13:19
said that there was nothing in 2024 . And
13:21
I had to then Google what the
13:23
exact date was and I said no , it happened on this
13:25
date . And it said no , it didn't
13:28
happen . And I was like it happened in this
13:30
state , in this town . You're arguing with the
13:32
machine . Yes , I had to feed
13:34
it all of that information . You know
13:36
, after doing this for a bit , it
13:38
was like I made a mistake , or I
13:40
don't even think it said I made a mistake
13:43
. It just posted , you know , like a
13:45
CNN news article that
13:47
was on it and , like you know , we're
13:50
going into a place where there's
13:53
a huge amount of the population that would
13:55
never double check
13:57
that , right ? Like if MSNBC
14:00
didn't report on it or CNN
14:03
didn't report on it or Fox News
14:05
didn't report on it , right
14:07
? They're going to think , hey , this
14:11
never happened , right ? Because they're not saying it
14:13
happened and same thing
14:15
with the LLMs , you know . And so
14:17
we're going into a weird place and I apologize
14:19
, I didn't mean to like take over , no , no
14:22
, it's . I mean it's interesting , right ?
14:23
I mean , it's something I worry about a lot
14:26
is , as these LLMs get
14:28
more embedded into everything
14:30
and more embedded into decisions
14:32
, the fact that they either
14:35
were not trained to know the answer
14:37
[transcript unavailable]
15:02
, say we never landed on the moon , the earth is flat
15:04
and we're going to
15:06
, uh , be in a lot of trouble in society
15:08
as we move forward based on facts . Right
15:11
, so it's
15:13
a brave new world out there . Yeah , it's
15:15
going to be interesting
15:17
for the next generations .
15:19
How do you try to keep
15:23
yourself informed of , I
15:26
guess , the right information without
15:29
being kind
15:32
of influenced
15:34
by the information ? I feel like there's a very
15:36
fine line between
15:39
being influenced and informed .
15:42
Yeah , you know , we saw that a lot this year
15:44
yeah for sure , and
15:46
it's tough because sometimes you see you
15:49
know bits or whatever , and you're you
15:51
do get influenced by them , right ? Oh yeah , well
15:53
, that point makes sense . But
15:55
then you have to go back and like , was that actually true
15:57
? Right , and
16:01
that's the thing that I think we all ask ourselves a lot : is
16:04
the information I'm seeing accurate
16:06
? You know , because you hear so many crazy
16:08
things out there , you know this
16:11
company's doing fantastic because they
16:13
posted something on LinkedIn that says they've tripled
16:15
their sales , like
16:17
, but did they , or is
16:19
that just some marketing hype that they're trying to
16:22
? You know , maybe they're going out to raise a round or
16:24
something like that and they're trying to make the company look good
16:26
, you know . So I think it's
16:28
almost living in
16:30
a state of constant
16:32
paranoia , right , and I
16:35
hate to say that , but I think there is good , healthy
16:37
paranoia . Obviously
16:40
, you don't want to be sitting there at your window
16:42
all day long staring out the window , but
16:44
it's good to be cautious and
16:46
it's , I think , good to be a little bit paranoid
16:48
. And I mean , I
16:56
guess I kind of run in that state , maybe from being in cyber for 25 years . We were all a little bit paranoid .
16:58
What's the old expression ? Just because you're paranoid doesn't mean they're not after you . So
17:01
I think we all kind of operate
17:03
in that kind of a mode and you
17:06
know , so I think , got to keep asking questions
17:08
and got to inspect the answers . And
17:10
you know otherwise keep
17:12
reading , keep researching
17:14
. I think that's the only way .
17:16
Yeah . Yeah , that's a really good point
17:19
. You know it's interesting
17:21
. Recently , you know , I lead
17:24
all of cloud security for my
17:27
current employer right , and a
17:30
part of one of my initiatives
17:32
for the year was to deploy .
17:33
And you must be paranoid , because it says undisclosed , undisclosed
17:35
, undisclosed .
17:38
Well , so I do that very purposefully
17:40
because I don't want you
17:42
know , I'll give like career
17:44
stories , right , things that I encountered
17:46
and stuff like that , and I don't ever want someone
17:49
to say , oh , that sounds
17:51
like X place right . Or that
17:53
sounds like this one right , or
17:55
the manager for there is like , I
17:58
know that occurred . I'm
18:00
still here , like we're going to come after you . You know , that's
18:03
really what I want to avoid at all
18:05
costs and
18:07
you know , and I guess
18:09
maybe it limits the amount of opportunities
18:12
that I get hit up for or whatnot , but I
18:14
feel like if it's a real opportunity , they'll see through
18:16
that and you know still talk
18:18
to me right now , right , but you
18:21
know , since I lead all of cloud
18:23
security for my organization , I'm working
18:25
with about 150 developers , right
18:28
, and these developers because I'm rolling out
18:30
this AWS WAF , right
18:33
. So these developers , they decided
18:35
amongst themselves hey , we don't like
18:37
the WAF , we're going to try and get this bypass
18:40
rule through Joe and
18:42
you know , if he approves it , it basically bypasses
18:45
the whole WAF . We don't have to worry about it . There's going
18:47
to be no issues , no troubleshooting , none of that
18:49
. And I
18:51
get on this call and they immediately
18:53
start badgering me
18:56
with , you know issues
18:58
and you know they tried to make it sound like
19:00
it was 15 different issues . But
19:03
through all of my you know questioning , right
19:05
, like in security , we're so paranoid . Like
19:07
I ask questions until I know exactly what
19:09
is going on , right , because I'm not getting fired
19:12
for something that I did and I didn't know I did , and
19:17
you know , through the questioning
19:19
I was able to whittle it down to one
19:21
core issue that they were trying
19:24
to mask from me . And then
19:26
I spent , you know , probably the next 30
19:28
minutes literally going
19:30
through their issues
19:32
and everything , trying
19:34
to see what they were actually trying to get at , because
19:37
they didn't want to make
19:39
it sound like I was going to bypass
19:41
the entire WAF . They wanted to make it sound
19:43
like hey , it's just this rule , you
19:46
know , it's just this rule in the stack .
19:48
Right .
19:48
But their effective .
19:49
It's the one that says allow star dot star .
19:52
Yeah , their effective rule
19:54
was allow star dot star . Without
19:56
the allow star dot star , it bypassed
19:59
everything else . And so I
20:01
like pulled in my network guy , I pulled in
20:03
my infrastructure guy . I don't
20:05
think that they thought that I would do that . So
20:08
I pulled them in and I said , explain to my
20:10
network guy what you want to do
20:12
. And they explained it . And
20:14
I said , I have one question : does
20:17
this bypass the WAF ? And he
20:19
said yeah , it bypasses the whole thing . I was like we're
20:21
not doing it and like everyone
20:23
was so mad at me , right . But you
20:26
know , I got that skill , though , of being
20:28
able to do that from years
20:30
of being in security and , just
20:33
to put it bluntly , being lied
20:35
to where it's like OK , I
20:37
need to make sure that I fully
20:39
understand what's going on here before
20:41
I actually make a decision that impacts
20:43
the security posture of our organization
20:45
.
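For readers picturing what's being described here: the "allow star dot star" rule is an unconditional allow that short-circuits every other rule in the web ACL. A minimal sketch of how you might flag such a rule, using dicts shaped loosely after AWS WAFv2's JSON rule format — the rule names, and the use of a regex match as the catch-all condition, are illustrative assumptions, not details from the episode:

```python
# Sketch: flag "allow-everything" rules in a WAF rule list.
# Rule dicts loosely follow AWS WAFv2's JSON shape; the specific
# rules below are hypothetical examples, not from the episode.

def is_blanket_allow(rule: dict) -> bool:
    """True if the rule allows traffic and its match is effectively 'everything'."""
    if "Allow" not in rule.get("Action", {}):
        return False
    stmt = rule.get("Statement", {})
    # A regex like '.*' (the 'allow star dot star' from the story) or an
    # empty/absent match condition both amount to matching every request.
    regex = stmt.get("RegexMatchStatement", {}).get("RegexString", "")
    return regex in (".*", "*.*") or not stmt

def audit(rules: list[dict]) -> list[str]:
    """Return the names of rules that would short-circuit the WAF."""
    return [r.get("Name", "<unnamed>") for r in rules if is_blanket_allow(r)]

rules = [
    {"Name": "block-sqli", "Action": {"Block": {}},
     "Statement": {"SqliMatchStatement": {}}},
    {"Name": "dev-bypass", "Action": {"Allow": {}},
     "Statement": {"RegexMatchStatement": {"RegexString": ".*"}}},
]
print(audit(rules))  # → ['dev-bypass']
```

The point of the anecdote survives the simplification: an allow rule with a match-everything condition neutralizes the whole rule stack, which is why the one question worth asking is "does this bypass the WAF?"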
20:46
Yeah , absolutely , absolutely . And
20:48
you know , I hate to say it , but
20:50
a lot of times it's ,
20:53
maybe , some extra work to make something
20:56
work through the security control . And so
20:58
the easy question , the easy path
21:00
is like just , you know , just
21:02
whitelist it or whatever for now , and then
21:04
we'll , you know , we'll
21:07
get to it later , and then
21:09
later never happens , and you
21:11
know how that goes .
21:14
Yes , yeah . We have .
21:15
I've seen that so many times and that was a part of their
21:17
argument . Right Once I figured out what they were doing
21:19
, they were like oh well , can you just whitelist it
21:21
? You know , we'll readdress it
21:23
. You know , in January I
21:26
don't work like that . You know , I know that
21:28
there's other security people that have been in this role
21:30
before and they were , you know , basically
21:32
pushovers for you . Like I do
21:34
not play that game , you know .
21:37
No , you can't , you can't . Not when I have people
21:39
on from , you know , startup companies , founders and CEOs
21:43
.
21:44
You know the people that are starting these
21:46
companies . They're
22:05
all typically , like , pretty young , and I'm not
22:07
trying to , you know , age you or anything like that , right ? But you said
22:09
that you have a four-month-old , so that that tells me that you're
22:11
in a different place of your life . You
22:14
could be in your 20s , right , but
22:16
I'm saying you're in a different place in terms of
22:23
, like , the risk that you're willing to accept
22:25
right , because now you have a four-month-old
22:27
, you have another little person that's
22:30
depending on you and
22:32
for a lot of people that's life-changing . I'm
22:34
sure it was probably life-changing for you . It
22:41
changed my entire life , my entire perspective of what life is and love and everything else . But I
22:43
say that because when you're in your 20s , you typically
22:45
have no responsibilities . Well , you got a
22:48
car payment , you got rent , you
22:50
got small little , minuscule
22:52
things . You typically don't have kids . I
22:54
mean , you could absolutely have kids , but
22:56
if you're in that situation
22:59
, you're probably not starting a company . So
23:01
what is that like ? How do you manage
23:04
the risk and the stress
23:06
of having a young family and
23:08
doing a startup ? Because I couldn't imagine
23:10
, you know .
23:12
Yeah , it's a lot . I think it's just
23:14
one of those things
23:16
where my wife and
23:18
I had been working on building our
23:21
family for a long time and you
23:23
know . So that was just kind of , if it's
23:25
going to happen , like it's a blessing and we're
23:27
going to take it whenever , but
23:29
at the same time I wasn't going to put my
23:32
goals and passion on the sideline
23:35
and kind of wait . So I
23:37
figured , well , I'm just going to have to figure out how
23:39
to do it all at once , which people
23:41
can do . I mean , I'm in my mid-forties , I'm 46
23:44
. So I guess I'm pushing towards my late forties
23:46
. But I've always been in
23:48
startups , right . So this is
23:50
startup number five . You
23:53
know , I started at ArcSight back
23:55
in 2000 .
23:56
Wow , okay .
23:57
I think I was employee like 30 there , something
23:59
like that . So , pre-product , you know
24:01
, there was basically a batch file
24:03
that started a JPEG of the console and
24:06
I spent 12 years there and
24:08
ArcSight grew , went public , acquired
24:10
by HP , and then I
24:12
went off to another startup called Silver Tail
24:15
Systems and basically spent about
24:17
two years there and we got acquired by RSA and
24:20
I decided to leave shortly after that
24:22
acquisition and go start a
24:24
company for the first time . With
24:26
my co-founder , Greg Martin , we started ThreatStream
24:29
, which grew into Anomali . Oh
24:32
wow . And so you know
24:34
that business is still operational . They're doing fantastic
24:37
. So we're over here rooting for them on the sidelines
24:39
. But I decided after about
24:41
eight years of building that company that I
24:44
was ready to go try something else and
24:46
I joined a company called Verodin , which was
24:48
in the breach and attack simulation space , where
24:51
I had invested in that company early on
24:54
in the seed round and the A round and
25:01
I think that you know the writing was on the wall that I was eventually going to be there
25:03
and you know I ended up joining as their CTO and
25:05
about a year after I joined
25:07
, we got into talks about
25:10
getting acquired by Mandiant FireEye
25:12
Mandiant at the time and
25:14
so about midway through 2019
25:17
, we got acquired by FireEye
25:19
Mandiant and that was
25:21
interesting , right . So I ended up spending
25:23
three years at
25:26
Mandiant through the
25:28
divestiture of the FireEye stack
25:30
and ultimately
25:33
through the acquisition by
25:35
Google , and about
25:38
four months after the Google acquisition , I
25:40
left Google and started Abstract
25:42
, and
25:45
it was something that I'd been wanting to do
25:47
for a long time , and
25:49
you know , really kind of companies
25:51
at this stage are , like you
25:54
know , really kind of the most fun thing for
25:56
me , right ? Not for everybody , for
25:58
a lot of people don't like companies at this stage
26:00
. They're hard , yeah
26:03
yeah .
26:05
So it
26:09
sounds like you kind of , you know , went
26:11
through that initial stress
26:13
or grew into it early
26:15
on and then it became the norm
26:18
, whereas everyone typically
26:20
starts with the stress of a 9 to 5 , and that
26:22
becomes the norm and you kind of
26:25
stay within that mix . You
26:34
know , when I was starting out in my career , maybe , you know , 10 years ago , right , I
26:36
reached out to Alissa Knight and I was talking
26:38
to her , I was trying to like unravel
26:41
this , you know startup thing
26:43
and how do you get started
26:46
, like what's the right you know thing that
26:48
you should be doing for it and everything
26:50
. And the one piece
26:53
of advice that really stuck
26:55
with me was that
26:57
you only
26:59
, you know , leave your day job
27:01
when your startup or your side hustle
27:04
is matching the income of
27:06
your day job . Right , because it
27:09
gives you that financial security . You
27:11
understand , okay , I
27:17
have something here and then you can lean in a little bit more
27:19
and see how it grows and everything else like that . And I think if I didn't
27:22
have that framework right or that idea
27:24
you know , kind of planted
27:26
in , I feel like I
27:28
would have either gone one of two
27:30
ways right . I would have gone full-on into
27:33
the nine to five and just been like if
27:35
this is where I'm at , I'm stuck here
27:37
forever . Or I would have gone full-on startup
27:40
yeah , risk
27:42
, you know , losing everything basically
27:44
yeah , yeah .
27:45
Well , you know , the good news is you don't really
27:48
lose everything . You may not
27:50
, it may not be successful , right , but at
27:52
the end of the day , the experience and the
27:54
lessons learned are invaluable , right
27:57
. So I don't know . For me
27:59
. Like I said , I worked at FireEye , at Mandiant , for
28:01
three years and you know we had
28:03
a good time and I mean it was a hard time
28:05
. It was obviously during the pandemic , so
28:11
things were different than ever before , but we accomplished a lot while we were there , some
28:13
things that I was really proud of . I mean , we kind of took
28:16
a legacy software stack and
28:18
converted it to a modern SaaS application
28:20
. Inside
28:22
of it , we were almost operating like a startup within
28:24
a big business because we were the
28:26
acquired company . So we kind of had a
28:28
team . All the stuff we did coming
28:30
into that was SaaS-based , and so we're kind
28:33
of taking this legacy sort
28:35
of you know network appliance
28:37
sort of company and building
28:39
a modern SaaS application
28:42
on top of that , you know
28:44
, and our areas were really around
28:46
threat intelligence and the breach
28:49
and attack simulation areas , which
28:51
is what we're focused on , kind of that migration . So
28:55
it was interesting . But you know , I
28:57
think the company was 3000 people give
29:00
or take , if I'm remembering
29:02
that correctly , but give or take around 3000 people
29:05
, which to me is just like a
29:07
huge , huge company . I
29:09
mean the last , I think , ArcSight
29:11
, when we got acquired by HP , we were
29:13
maybe like 600 people or something like that
29:15
, and so that
29:18
was kind of my experience . My big company experience
29:20
was that and you
29:23
know , Silver Tail
29:26
was maybe 100 or so people
29:28
and Anomali we grew
29:30
to about maybe 250 , 300 , something
29:32
like that . Um , so
29:35
those are the kinds of companies like I really love
29:37
that you know zero
29:40
to a hundred million ARR type phase
29:42
. You know , the hundred to 250
29:44
ARR type phase , um
29:47
, and then as it
29:49
gets into a 3000
29:51
, 4,000 person company , I mean , it's
29:53
a different beast , right , yeah
29:56
.
29:56
Yeah , you start to , like ,
29:59
have to have things like a whole HR department
30:01
and finance department , right
30:03
, you know ? You get a board
30:06
in place , all that sort of stuff . It's
30:12
different . Different challenges that
30:14
you have to learn and grow through and whatnot . And you
30:16
know , I think , like , I'm a big kind
30:18
of I don't want to say I'm a big stats guy , but I'm a numbers
30:21
guy , you know . So when
30:24
I do something or when I venture into
30:26
something , right , it's kind of like I
30:28
look at what the odds are of success , right
30:33
. And you know , you , you look at just the
30:35
companies that go to RSA every year , right
30:37
. Something like 86 or 89
30:39
percent of them fail within that year . They
30:42
don't show up again the following year
30:44
. And then
30:46
you look at the ultra
30:49
wealthy . I
30:51
look at people like Elon Musk or Mark
30:53
Cuban , Jeff Bezos
30:55
, and when you do your research , all
30:58
of them went through
31:00
several bankruptcies . All of
31:02
them started with relatively
31:05
small amounts of money compared
31:07
to what they have
31:09
today , what they grew into today
31:11
, right . And so that does actually
31:13
tell you something like , hey , you should expect
31:15
a certain degree of failure to
31:18
come with your success , absolutely
31:20
. And you shouldn't allow that failure to hold
31:22
you back . You know you have to use it and grow
31:24
through it because I'm sure you
31:26
know , if one of those billionaires goes
31:29
and declare bankruptcy , you know
31:31
this year for the ninth time or the tenth time
31:33
, right For
31:36
them mentally , that's not even on their
31:38
radar of stress in terms of
31:40
, like you know what bankruptcy means
31:43
and everything else like that . Because it's like I did it 10
31:45
other times . Right , like I
31:47
did it 10 other times . I'm going to make it
31:49
through this one . We'll be fine . You
31:51
know , for me , if I were
31:53
to go through that today , I'd be , I'd be terrified
31:56
, yeah , me too , me too .
31:58
So yeah , yeah , I'm looking to
32:00
not go that route .
32:01
I would never want to no , but fail fast
32:03
.
32:03
I mean , you know , I think that is an important
32:06
lesson there . Like you know , we
32:08
try different
32:10
hypotheses all the time as
32:12
we're building product and whatnot and it's
32:14
like , hey , let's try this , we're
32:17
going to put some effort in . Is it going to work
32:19
? It's not guaranteed to , so
32:21
let's try it , see what works , and if it doesn't
32:23
get the lessons learned , figure out a different
32:25
approach . But do it quickly . It's better to
32:33
, you know , I don't know ,
32:34
try and fail than to never try at all , I guess , yeah . So yeah , that's very
32:37
, that's very valid . There's an old adage
32:39
yeah , not exactly , but it's very valid
32:41
. And you can really only do that in a small
32:43
startup like environment . Right
32:46
, like you're not doing that at Intel
32:48
or IBM , right , where
32:51
you're failing fast and making adjustments
32:53
on the fly , trying different things , failing
32:55
again , yeah .
32:57
That's the definition of getting fired . Well , and that's
32:59
why , that's why the projects take , you know
33:01
, so much longer to get anything done right
33:04
. I mean that's that's what I love about startups
33:06
is we iterate fast , we build features
33:08
quickly , we know we're right
33:10
there with the customer right . So we're
33:13
like building as the customer's asking for
33:15
something . And you know , at
33:17
big companies you know it just doesn't happen
33:19
that way because there's so
33:22
many customers feeding
33:24
in requirements that
33:26
there's no way you can be that responsive
33:29
. But at our stage
33:31
and I mean I think as you stay nimble
33:34
, even as you grow being
33:36
able to have that level
33:38
of customer support , customer success
33:41
is like critical right and
33:44
I always
33:46
tell people this : that customers
33:48
will tell you what
33:51
they need . You just have to listen
33:53
and
33:55
that's something that I think too many startup
33:57
founders don't do
34:00
. Well , because they come from
34:02
a place where they think they know
34:04
better than the customer and maybe
34:08
it's their education or their amount
34:10
of experience with a certain technology or
34:12
a certain technology stack that they
34:14
think the customer doesn't know what
34:17
they need . And they're here to tell
34:19
the customer . I've always taken the approach
34:21
of customer does this job every single
34:23
day . This is what they do for a living and
34:26
they're telling me they need this feature . Most
34:28
likely it's because they do .
34:33
Yeah , that's a very valid point
34:35
. You're
34:41
listening to understand rather than listening to reply , right ? That's
34:44
right , that's right . Yeah , you know it's
34:46
weird because all of school
34:48
right , and I was talking to my PhD
34:51
chair on this right , because I'm working on my PhD
34:54
and it is the most difficult
34:56
thing that I've ever done from an educational
34:58
perspective right , and it's hard
35:00
in ways that you do not expect . Everyone
35:02
says that it's really difficult and whatnot , that a lot
35:04
of people that start do not finish
35:07
. I can completely understand
35:09
why , right , it's because you
35:11
literally just spent 20 years
35:13
in school and they're telling
35:15
you , hey , what's on the next
35:17
test ? They're telling you what
35:20
they want you to write , right , all
35:22
this sort of stuff . And then you go into
35:24
your PhD and they're like no
35:26
, you have to find a topic , oh
35:28
, okay , well , you
35:30
have to write this literature review that's , you
35:32
know , it could be 10 pages long , it could be 150
35:35
pages long . You have to do
35:37
it . Well , what's a literature review ? Right
35:39
, it is a complete blank
35:41
slate . Like , a literature
35:44
review is a core paper
35:46
in this process , right , and there's
35:49
no set , like defined
35:51
, even outline of what a literature review
35:54
is , right , like , you can Google
35:56
it and you're going to get 15 examples
35:58
and they all look different , they all feel
36:00
different , they all read different , right
36:02
, and so you spend
36:04
literally 20 years in school , you
36:07
know , learning how to reply to
36:10
something that is being told to you right
36:12
, or how to deliver a result based on
36:14
something you know you're
36:16
being told to do right . And
36:18
when you get into kind
36:21
of this startup phase or
36:23
the PhD , right Like now
36:26
, I understand why people that get their PhD
36:28
actually make , you know , the money that they
36:30
tend to if they go into the right
36:32
area . It's because you
36:35
literally do not have to tell them anything . You
36:37
tell them what you're thinking about
36:39
and they go and figure out
36:43
everything , Because
36:46
it's a different thought process . So
36:48
talk to me about abstract security
36:50
. You know what's the niche area
36:52
or what's the problem that you're
36:55
designed to fix , that you're working on
36:57
fixing right now .
36:58
Yeah , so basically , you know , our
37:00
mission is building a
37:02
complete platform for data
37:04
security , right ? So
37:06
basically a
37:09
data platform that is
37:11
focused on collection
37:13
and aggregation
37:15
and operationalizing
37:17
security data . So we
37:20
want to make the data collection side of
37:22
things simple . So
37:27
we say we simplify
37:29
data and we amplify insights . So
37:31
the idea is we're providing customers better cloud visibility , we're
37:33
giving them a handle on their log
37:35
management infrastructure . We're
37:38
helping a lot of customers with SIEM migration
37:40
. So people are kind of migrating
37:43
from Splunk
37:45
to Google or from QRadar
37:48
to Microsoft Sentinel or wherever the
37:51
case may be . We're helping them on
37:53
that journey by being that data collection
37:55
layer for them . And
37:58
you know , we also have a lot
38:00
of capabilities in kind of the analytics space
38:02
. So as we're collecting the data
38:05
and routing it , optimizing
38:07
it , we can also do analytics on that data
38:09
and provide those results
38:12
to their , you know , SIEM of choice or
38:14
their next-gen SIEM of choice , however
38:17
the case may be these days . Hmm , that's
38:20
interesting .
38:22
So it's almost like a SIEM collector
38:24
or like a log collector
38:26
, and then you're able to run some analytics
38:29
and analyze the actual data .
38:31
That's right .
38:31
Yeah , on the data stream itself . So we
38:34
collect the data , we stream it . As
38:36
it's streaming , we can operate on the data . So
38:39
you know , for example , like , well
38:41
, you're in cloud , you're in cloud security , right
38:43
, and it sounds like you were talking about
38:45
, you know , deploying this WAF
38:47
, right ? Well , the WAF's going to generate
38:49
a lot of logs . Most
38:52
of them might not be useful or
38:54
there might be a subset that's actually useful for
38:57
security detections , and
38:59
so what we would do is we would collect
39:01
those WAF logs out of , let's say , an S3
39:03
bucket or wherever they're being written to , and
39:07
we would then say , okay , out of this
39:09
set of data , what is the data
39:12
that's relevant for ? Either your
39:14
compliance needs , your regulatory
39:16
right . So there may be a requirement
39:19
that you're under that says , hey , we have to keep all data that
39:21
is between system X and system
39:23
Y because they're regulated systems
39:25
. But there could be a bunch of internal traffic that maybe you don't need , although maybe not through
39:33
a WAF , but if you're looking at , like , VPC flow logs or some of these other sources , you
39:36
know you have a lot of internal communications that
39:38
. Do you really need that data ? Maybe not , and
39:41
so you can filter out data , you
39:44
can change , you know values
39:46
or you can enrich data . So let's say that
39:48
, for example , you know
39:50
GitHub's a great example . We have a lot of customers who
39:52
collect GitHub logs and
39:55
GitHub is basically a social network
39:57
so you can go in there and create whatever username
39:59
you want . Well , when the log gets written
40:01
, it's going to be tagged with your
40:04
username , right , and so what
40:06
we want to do is actually enrich that
40:08
so that it gets tagged with the actual
40:10
identity of the user , so
40:13
we're able to kind of do that data enrichment
40:15
type stuff on the fly . We enrich data with threat
40:18
intelligence so you
40:20
can know basically like which threat actors potentially
40:22
are associated with an alert , and
40:25
then we forward that off to multiple
40:29
destinations . So you could take , let's
40:31
say you have I don't know , say , an AWS
40:33
data lake and you want some of the data
40:36
to be stored in your AWS data lake in
40:38
maybe OCSF format . And
40:41
then you want some subset of the data
40:43
going to your SIEM where you're paying extremely
40:45
high storage costs , so you don't
40:47
want to send everything there
40:49
. So
40:55
you can kind of slice and dice , route , and really figure out
40:57
, you know , a data strategy that is going
40:59
to allow you to get the most value out of your
41:01
tech stack .
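The collect , filter , enrich , and route flow described above can be sketched in a few lines . This is an illustrative toy only — the field names , IDENTITY_MAP , REGULATED , and the destination labels are invented for the example and are not Abstract Security's actual product API :

```python
# Toy sketch of a filter -> enrich -> route streaming step.
# All names here are hypothetical stand-ins for the ideas in the interview.

IDENTITY_MAP = {"c0ldpr0xy": "jsmith@example.com"}  # GitHub handle -> corporate identity
REGULATED = {"system-x", "system-y"}                # systems under a retention mandate


def process(event: dict) -> list[tuple[str, dict]]:
    """Return (destination, event) pairs for one raw log event."""
    # Enrich: map a self-chosen GitHub username to a real identity.
    user = event.get("username")
    if user in IDENTITY_MAP:
        event = {**event, "identity": IDENTITY_MAP[user]}
    # Filter: drop purely internal chatter unless a regulated system is involved.
    regulated = {event.get("src"), event.get("dst")} & REGULATED
    if event.get("internal") and not regulated:
        return []
    # Route: everything kept goes to the cheap data lake; only regulated or
    # high-severity events also go to the expensive SIEM.
    routes = [("data_lake", event)]
    if regulated or event.get("severity", 0) >= 7:
        routes.append(("siem", event))
    return routes
```

The point of the sketch is the split : one cheap destination gets everything you keep , while the costly SIEM only sees the slice that matters for detection or compliance .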
41:03
So I mean , it sounds like you're
41:05
able to use the data from
41:08
wherever it kind of resides , right
41:10
? I'm thinking in terms
41:12
of , you know , in
41:14
the cloud . Right now , there's a huge
41:17
battle between legacy tech stacks
41:19
and cloud tech stacks
41:21
, especially with logging . Like
41:24
, as you probably know
41:26
, right , I've
41:28
been engaging with a logging
41:30
conversation around this WAF for
41:33
six , eight months
41:35
now at this point right , and we
41:37
don't really have a good solution . We have sort
41:40
of a solution and hopefully we
41:42
never have to query it or
41:45
anything else . You know , right , yeah
41:47
because it's it's so expensive
41:49
, it's so extremely expensive
41:52
to go and send that data to Splunk
41:54
right , because we already have Splunk on prem
41:56
, it's already sized right and everything else
41:58
like that , yeah it
42:01
is so expensive , yeah , especially
42:03
with , like the WAF or just network flow logs
42:05
, right man ? Yeah
42:07
, I mean we might as well just try
42:10
and buy Splunk from IBM at that
42:12
point , like , or
42:14
whoever just bought them you know , Cisco , yeah
42:16
, yeah , Cisco .
42:18
Well , that was the going joke , right , that Cisco was
42:20
either going to pay the renewal or they were going to buy
42:22
the company , yeah , so probably
42:24
not too far off , but it's so accurate
42:27
they probably only had to spend a little bit
42:29
more . Probably only a little more
42:31
, but you know we could probably , you
42:33
know , look at helping you out if you're interested not
42:35
to turn this into an Abstract conversation
42:38
on you know , but might
42:40
be something there yeah , yeah , I
42:42
mean , you know this is something
42:45
that I've definitely been , you know
42:47
, mulling over , right , for
42:49
a while .
42:50
You know , caveat
42:52
to everyone , right ? Like , I don't bring people
42:54
on the podcast for them to sell
42:56
me a product or anything like that . I want to
42:58
talk about interesting stuff because I'm
43:01
actually in this field , right Like I'm in this
43:03
field , I'm dealing with these problems every single day
43:05
, and so it's really beneficial for me
43:07
to see what's out there , what's
43:10
growing , what's coming out , you know , because
43:12
there's so many different people that are going to think
43:14
of these problems in different ways and
43:16
solve them in different ways . You
43:19
know , like , my environment is interesting
43:22
, right , because I don't , from a security perspective
43:25
, I don't have full visibility into
43:27
my environment , right ? So I'm a cloud security guy
43:29
and I don't have full visibility due
43:31
to different restrictions and it
43:33
creates a lot of different challenges
43:36
. So you know , in security engineering
43:38
you're going to be faced with a whole lot
43:40
of unique challenges and
43:42
you have to figure out how to solve them . You know
43:44
, like that's the whole point of the engineer's job .
43:47
Yep , absolutely , and you're always kind
43:49
of operating with like one hand tied behind your back
43:52
right .
43:54
I'm lucky if I only got one hand tied behind my
43:56
back .
43:57
Maybe hamstrung with a hand behind your back
43:59
.
43:59
yeah , yeah , I'm over here like using
44:01
my head as a weapon at this point , you know
44:03
.
44:04
Yeah , yeah , I
44:06
was going to make a funny joke about the Tyson fight
44:08
man .
44:10
44:11
That was an interesting weekend . I will say this Netflix
44:14
better get some more servers going before football
44:16
hits on Christmas
44:18
Day . Because people are going to be not
44:21
happy .
44:22
You know , as a cloud
44:24
security person , I just don't
44:27
understand how
44:29
they could have an
44:31
issue like that , right
44:33
, because I'm thinking like how do you , how do you have your load
44:36
balancers configured and
44:38
how do you not have auto scaling configured
44:40
on a streaming service , probably
44:42
one of the biggest streaming services
44:44
on the planet for like a decade , right ?
44:47
and you pride yourselves
44:50
, you talk it up at these , you know
44:52
tech conferences that you
44:54
know all of Netflix is built on containers
44:56
. It's all serverless . It's
44:58
you know this , it's that . So if that's
45:00
true , it's literally
45:03
a checkbox for you
45:05
to . You know , go into your load balancer
45:07
and say auto scale , put it into an
45:09
auto scaling group and give
45:11
it the template right .
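For context on that "checkbox" : in AWS terms it roughly means putting the service behind a load balancer in an auto scaling group with a target-tracking policy , which keeps per-instance load near a target by resizing the fleet proportionally . A toy sketch of that math — an illustration only , not AWS's actual algorithm , with invented parameter names :

```python
import math


def desired_capacity(current: int, metric: float, target: float,
                     min_size: int = 2, max_size: int = 200) -> int:
    """Proportional rule in the spirit of target tracking:
    if per-instance load is 1.5x the target, grow the fleet ~1.5x,
    clamped to the group's configured bounds."""
    raw = math.ceil(current * metric / target)
    return max(min_size, min(max_size, raw))
```

For example , 10 instances running at 90% against a 60% target scale out to 15 , and the clamp is what a cost ceiling like the one joked about above would cap .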
45:14
Well , it's a checkbox , but it's also a check
45:16
that they have to write . So maybe
45:18
they came up with a budget on cloud
45:21
spend for this event and they're like we can't
45:23
go over X , no more
45:25
load balancers , I guess .
45:27
I mean , you know ,
45:29
this is the thing . Like , I feel like maybe
45:31
someone in finance maybe came up with
45:34
that arbitrary budget
45:36
.
45:36
Yeah , right , yeah .
45:37
Instead of customer experience , it was
45:39
the
45:51
cost for the three or four hours that the event was going
45:53
on . Right , you eat that cost for four hours . Okay , it scales right
45:56
back down afterwards . And now you get , you know , a CNN
45:58
article saying how Netflix
46:00
, you know , was able to stream to , I
46:02
don't know , 50 million people all at the same time
46:04
.
46:04
Right , like 100 , 130
46:07
million , I'm sorry , 120
46:09
million , which is 10
46:11
million less than watched the Super Bowl .
46:14
Imagine just imagine
46:16
if that was the article in the
46:18
news . Right , right , exactly
46:21
, hey , they streamed to 110
46:23
million people flawlessly
46:25
, without issues , right , and now
46:27
we're dealing with the after effects
46:30
of you not doing auto scaling groups
46:32
properly in your cloud . You
46:34
know , wink , wink , there may be someone
46:37
on this podcast that knows how to do it . Like
46:39
it's like common sense
46:41
. I mean they , they're the company that came
46:43
up with Chaos Monkey and Chaos Gorilla
46:46
, and if you don't know what those are , it
46:48
is ensuring high
46:50
availability and extreme
46:52
redundancy in your data
46:55
center , in your environment . Like these things
46:57
take down servers randomly , they take
46:59
out data centers randomly , you
47:02
know , and and if you're up
47:04
and you're running that in your environment
47:06
, that's better than probably
47:08
most of the cloud providers at
47:11
that point . You know , I
47:14
worked for a company , a financial
47:16
services institution , right
47:19
A couple of years ago and we bought
47:21
a company in California and this company
47:23
viewed disaster recovery totally
47:26
differently from how we
47:28
even viewed it . Right , like they
47:30
really raised the bar for what
47:32
we consider disaster recovery to
47:35
the point where every two weeks , they
47:37
wouldn't just like sever
47:39
network connections in a data center , they
47:42
would go into the data center and
47:44
shut down the power . I mean literally
47:47
shut down the power on that data center
47:49
. And if something failed , then
47:51
they're like okay , we know we have an issue over here and
47:53
there was no turning it back on for two weeks
47:56
, you know . So it's like , hey , you got to fix this
47:58
thing on the fly , which just
48:00
like took it to a whole other level
48:02
, right ? Like we kind of re-augmented
48:05
or redid everything we did
48:07
from a disaster recovery perspective globally
48:09
. Once we bought them
48:11
and we saw that technology , we're like we need to
48:13
be doing this everywhere , like right now
48:15
.
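The chaos-engineering idea described above — randomly kill things so redundancy gets built in , not bolted on — reduces to a very small loop . A toy sketch with hypothetical names ; a real tool like Chaos Monkey would call the cloud provider's terminate API rather than just returning the victim :

```python
import random


def chaos_run(fleet: list[str], rng: random.Random) -> tuple[str, list[str]]:
    """Pick one random instance to 'terminate' and return it along with
    the surviving fleet. If the service can't ride out the loss of any
    one member, the redundancy gap is exposed on purpose."""
    victim = rng.choice(fleet)
    survivors = [i for i in fleet if i != victim]
    return victim, survivors
```

Run on a schedule against production , this forces every service to tolerate losing any single node — the same principle the power-off drills in the acquisition story apply at data-center scale .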
48:16
Yeah , I like the idea of Chaos Monkey .
48:18
That's , uh , pretty sweet . Well
48:21
every
48:23
time I go to a new company
48:25
or whatever . I mean , it's one of the first things I ask
48:27
. Do you guys want to run Chaos Monkey ? And
48:29
every single
48:31
time they're like , nope , we don't want to touch it , like
48:34
don't even bring that in here , I'm like
48:36
, all right , fine all right
48:38
sure yeah , yeah
48:40
, absolutely . It's , you know ,
48:42
the problem that Abstract
48:45
Security is solving
48:48
for is a problem that I'm finding
48:50
at a lot of places . Honestly
48:52
, I mean not just , not just my own place
48:55
, right , but you
48:57
know every place that I've been to right , the
48:59
biggest issue is okay , we're heavily
49:01
into the cloud and now we have
49:03
all these logs , we can't even
49:05
query them for something . God forbid , an
49:07
incident happened because
49:09
we don't know how to get that data .
49:11
And if we
49:11
send it to our Splunk , where we already have
49:13
everything . It's an
49:15
insane amount of money . It doubles or
49:18
triples our spend with
49:20
that vendor .
49:20
That's right . That's
49:23
right . And so much of
49:25
that data is just not relevant for cyber
49:27
. Amazing
49:31
, I mean . We did some analysis on
49:33
like CloudTrail logs and found , like you
49:35
know , 70%
49:37
reduction capable . Yeah
49:40
, I mean you're talking a data
49:42
source that generates terabytes of data every
49:44
day . So if you can reduce
49:46
that by 70% , I mean
49:48
you're saving a significant amount of money
49:51
.
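A common way to get a CloudTrail reduction in that ballpark is dropping routine read-only management events while keeping writes and failed calls . A generic sketch of that idea — not Abstract Security's actual ruleset ; CloudTrail records do carry `eventName` , `readOnly` , and `errorCode` fields :

```python
# Fallback prefixes for records where the readOnly flag is missing.
READ_ONLY_PREFIXES = ("Describe", "List", "Get", "Head")


def keep_event(event: dict) -> bool:
    """Keep write and failed events; drop routine read-only noise."""
    if event.get("errorCode"):                      # failed calls can signal probing
        return True
    if event.get("readOnly") is True:               # CloudTrail marks read-only calls
        return False
    name = event.get("eventName", "")
    return not name.startswith(READ_ONLY_PREFIXES)  # fallback on the API name
```

On a source that is dominated by Describe/List/Get chatter , a filter like this is where figures like the 70% mentioned above come from .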
49:51
Yeah , yeah , especially
49:54
from a security perspective . I mean , you need
49:56
to know about the transaction ,
49:58
you know ; you don't need to know about
50:00
the flow logs
50:03
and everything else like that , right , like it
50:05
just so happens that the information that you need is
50:07
within those logs
50:09
.
50:10
That's right .
50:10
That has all this other mess with it , and
50:12
you have to be skilled enough to sift through
50:14
it and figure out . You know what's actually
50:16
going on . So it's definitely
50:18
an area that
50:20
we're struggling with right now , you
50:22
know , in cloud security . Yeah , you
50:24
know , Colby , I really enjoyed
50:27
our conversation . We're at the top of our
50:29
time here and you know I try
50:31
to stay very cognizant of everyone's time
50:33
. But before I
50:35
let you go , how about you tell my audience you
50:37
know where they can find you if they wanted to reach out
50:39
and connect and where they can find your company if
50:41
they wanted to learn more ?
50:43
Yeah , absolutely . Well , find me on LinkedIn
50:46
, Colby DeRodeff . Or
50:48
find Abstract Security on LinkedIn
50:54
. We're around . Or the old , traditional way , our
50:56
website , abstractsecurity , though maybe that's not exactly traditional
51:00
, but it is on the worldwide
51:02
webs .
51:04
Awesome . Well , thanks , Colby
51:06
, I really appreciate you coming on , absolutely
51:09
.
51:09
Joe , it was a pleasure . Look forward to keeping
51:11
in touch .
51:12
Yeah , yeah , absolutely Well , thanks
51:14
everyone . I hope you enjoyed this episode
51:16
.