Episode Transcript
0:53
How's it going , Matt ? It's great to get you
0:56
on the podcast . I'm really excited for our
0:58
conversation here today .
1:00
Yeah , hey Joe , how you doing . Glad to be here . Thanks for
1:02
having me .
1:03
Yeah , absolutely . It's an interesting
1:05
time . I feel like everyone's recovering from
1:07
the DEFCON Black Hat scene
1:09
in Vegas last week . Did
1:12
you end up making it out there ?
1:14
I was not there this year but
1:17
I know a number of people that were and I
1:19
agree .
1:19
I think people are still recovering oh
1:22
, yeah , yeah , I um
1:24
, I skipped this year very intentionally
1:27
. I needed a break from last year because
1:29
last year was non-stop meetings
1:32
all day long . I couldn't even enjoy the conference
1:34
, and then you know afterwards , of
1:37
course , is after all the you know after
1:39
parties and all that sort of stuff . So
1:42
it was a much needed , much needed
1:44
break .
1:46
Yeah , 100% . It's probably controversial
1:48
to say this , but from the
1:50
folks that I know and talk to , it
1:53
feels like you know , Black Hat is a
1:55
little bit closer to RSA than it used to
1:57
be , and that trend is it
1:59
just gets bigger and bigger and interesting
2:02
times indeed .
2:04
Yeah , yeah . I don't like
2:06
Black Hat honestly , because it
2:09
is just like RSA and for
2:11
me , you know , I don't want
2:13
anything to do with that world , right , I
2:16
kind of I talk to vendors , you know , on a
2:18
daily basis already . I'm already looking
2:20
at new tech all the time , you
2:22
know . So , from from my perspective
2:24
, it's like when I go to , when I go
2:26
to DEF CON , like I don't want anything to
2:28
do with the Black Hat scene . I even try
2:30
to stay away from RSA . I went once
2:32
and I was like , yeah , this , this is enough
2:35
for me . That's funny . Yeah
2:39
, so , Matt , how did
2:41
you get into IT ? And I start everyone off
2:43
there . I'll tell you why . It's because there's
2:45
a portion of my audience that could
2:48
be trying to get into IT for the very first
2:50
time . Or
2:53
they're trying to get into security for the very first time and they're not sure if it's possible
2:55
for them
2:57
. And I remember when I was going through that phase and hearing someone
2:59
with a similar background to me
3:01
getting in and being successful
3:03
on that route was really
3:06
all I needed to hear , right ? Because it's like
3:08
, okay , this is difficult , but if this person
3:10
can do it , we come from similar backgrounds
3:12
. Maybe I can do it too . So
3:14
what's your background with that ?
3:17
Well , I'm officially a dinosaur
3:19
in this industry these days . I
3:23
kind of made my first
3:25
real step into what you would consider to
3:27
be , I think , modern IT all
3:30
the way back , you know , kind of in 1999
3:34
, 2000 , at a company called US Internetworking
3:36
which was the world's first application
3:39
service provider . You know , the idea was
3:41
you would build this thing called a data center and you
3:43
would run these things called applications
3:45
inside of the data center and you would deliver
3:48
those applications back to customers in
3:50
a service model . So it was the precursor
3:52
of SaaS . And
3:55
you know , you think it seems like
3:57
a really good idea to kind of go
3:59
to market in a model like that , and
4:03
it is . I mean , certainly there was a tremendous amount
4:05
of learning and value created as a result of
4:07
that . But at the same time there's a lot of challenges
4:09
that
4:11
surface along the way , not the least of which is security
4:14
. So when you're touching other people's
4:16
software and you're hosting other people's applications
4:18
and you're doing it in your data center , you better be really
4:20
good at your job
4:22
lest there be any kind of risks
4:25
that you undertake
4:27
on their behalf . But that
4:29
was my first foray into
4:32
what I would consider sort of the IT
4:35
world . I was always a
4:37
little bit more interested in kind of higher
4:40
up the stack applications
4:42
as opposed to sort of lower in the stack infrastructure
4:44
. But understanding
4:46
the full stack , I think is important
4:49
for anyone looking at the world
4:51
today . You know it's important
4:53
to have broad perspective on all
4:57
phases of the stack . But
4:59
as an applications person , you know , I eventually
5:01
kind of made my way into a
5:03
couple of my own companies and then , you
5:06
know , became intrigued with open source and how
5:08
open source , you know software
5:10
was increasingly being used to build applications
5:12
and software developers were increasingly
5:15
assembling third party components , components from these
5:17
third-party open source ecosystems . And then
5:20
you get into this whole interesting question about like , well
5:22
, how good is the software that an engineer is actually
5:24
building ? Most of the code that they're actually
5:26
putting into the application is borrowed
5:28
from third-party open source ecosystems . Like
5:30
, what's that all about ? Um
5:33
, and you know so , along the way you
5:35
get exposed to not just security
5:37
as it relates to
5:56
kind of the infrastructure that you're responsible
5:59
for managing . You get exposed to really interesting
6:01
and different questions with
6:04
regards to , you know , creating
6:06
really granular , really
6:09
small policy
6:11
, access control and
6:14
security governance
6:16
that is designed to protect
6:18
the data itself , I
6:20
mean , for a long time now , and I think we've
6:22
all seen this . In this world that we're
6:24
living in , whether it's an RSA
6:26
or Black Hat or pick your favorite
6:28
conference that you might go to , the
6:31
number of identity and access providers
6:33
there is mind-boggling . You know , you
6:35
got your big players like Okta and the rest . You've
6:37
got all these endpoint players protecting
6:39
these endpoint devices . CrowdStrike
6:41
, you've got . You know , the
6:43
network security guys . Pick your favorite
6:45
flavor of network security , whether it's
6:47
Zscaler or um
6:49
, you know . Pick your favorite one of those guys
6:52
. You know the micro segmentation guys
6:54
. Then you get the application security guys
6:56
and there doesn't seem to be , in
6:58
my view , at
7:01
least historically , enough attention being put
7:03
on to who's out there doing
7:05
really innovative work with open
7:07
standards that's designed to protect
7:09
the data . Forget
7:11
about the endpoint , forget about the network
7:13
, forget about the identity . Let's just assume
7:16
that we've already been breached , because that's
7:18
unfortunately a reality that many of us are
7:20
contending with , and if that's the reality
7:22
that the bad guys are already in our house , then
7:24
what's left to protect it's the data
7:27
. And how do you do that ? That's
7:29
ultimately kind of where I've kind of arrived
7:31
in my journey .
7:34
Yeah , that's a really good point that
7:37
you bring up . You know , I
7:39
always start with protecting
7:41
the data , right ? It's kind of , if
7:44
we had to boil security down
7:47
to like one or two things , it would be IAM
7:49
and data security . 100%
7:51
Right , because nothing else
7:54
matters if they're
7:56
already in , you know , like , especially
7:59
if you're in the cloud . You know , one of the one
8:01
of the key tenets is IAM . If you're not
8:03
doing IAM right , then they're able to log
8:05
right in and they'll have access to everything
8:07
. And if they're able to access everything
8:09
, then what's left to protect your
8:11
crown jewels ? Your crown jewels are probably your data
8:14
, right , how are you encrypting it
8:16
? Are you encrypting it
8:19
? That's a huge question
8:21
, right there ? Right , to even ask
8:23
you know taking
8:25
it from like a third-party consultant
8:28
, you know sort of thing , right , that's
8:30
a huge thing . To even ask a company
8:32
is well , how are you protecting your
8:34
data that's stored in that
8:37
AWS-managed RDS
8:39
database ? How are you ? How
8:41
are you actually securing it ? And if they say
8:43
, oh , it's with you know default
8:45
AWS encryption or whatnot , so you
8:47
can assume that they'd probably already
8:50
be breached , then right , because
8:52
the default encryption is going to store the key in
8:54
KMS , and if they're already logged
8:56
in , they can just get that key and decrypt
8:58
it right . So it's a very
9:01
complex problem and we're only making things
9:03
more complex as
9:05
we kind of delineate away
9:07
from legacy infrastructure .
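
A quick illustration of the point above (not something discussed on the show): the sketch below, which assumes boto3 is installed and AWS credentials are configured, flags RDS instances whose storage is encrypted with the account's default aws/rds KMS key rather than a customer-managed key with a tighter key policy. It is a minimal audit sketch only and ignores pagination and multi-region setups.

```python
# Minimal sketch: flag RDS instances that rely on the default aws/rds KMS key,
# where broad IAM access in the account effectively implies access to the data.
# Assumes boto3 and configured credentials; pagination is ignored for brevity.
import boto3

rds = boto3.client("rds")
kms = boto3.client("kms")

# Find the key ID behind the account's default alias/aws/rds alias.
default_rds_key_id = None
for alias in kms.list_aliases()["Aliases"]:
    if alias["AliasName"] == "alias/aws/rds":
        default_rds_key_id = alias.get("TargetKeyId")

for db in rds.describe_db_instances()["DBInstances"]:
    name = db["DBInstanceIdentifier"]
    if not db.get("StorageEncrypted"):
        print(f"{name}: storage is not encrypted at all")
    elif default_rds_key_id and default_rds_key_id in db.get("KmsKeyId", ""):
        print(f"{name}: encrypted with the default aws/rds key; consider a customer-managed key")
    else:
        print(f"{name}: encrypted with a customer-managed key ({db.get('KmsKeyId')})")
```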
9:10
Yeah , and listen . I mean I think that
9:12
point about moving away from
9:14
legacy on-prem and moving everything to the
9:16
cloud is important . I
9:19
think to your earlier comment I
9:21
couldn't agree more . I mean , when you stop
9:23
and think about what
9:25
is security like , well , first things
9:27
first . Like what is the job to be done ? The
9:29
job to be done is to protect the data
9:32
. Like nothing else matters . Okay , well
9:34
, if that's the job to be done , you
9:36
got to protect the data . I would
9:38
argue that
9:40
it's really important for people to reflect on the following
9:43
there is
9:45
one massive data estate and
9:48
there are two parts of the data estate . There's
9:50
the part of the data estate that you possess
9:53
which is sensitive information that
9:55
you have inside of your business which
9:58
you want to protect from
10:00
bad actors and threat actors
10:03
externally being able to
10:05
get it , steal it , exfiltrate
10:07
it or whatever
10:10
, and you don't want employees of your company
10:12
doing silly things that would result
10:14
in a misconfiguration and cause it
10:16
to be leaked or exposed to a bad actor . So
10:19
part of the big challenge
10:21
is , especially with regards to the movement to
10:23
the cloud , is how do I protect the sensitive
10:25
data that I have in my possession and
10:28
how do I prevent it from being accidentally
10:30
or unintentionally lost to
10:33
these third-party risk actors . That's why I
10:35
got to keep control of what I control . Very
10:38
important . That's a risk management defensive
10:40
kind of motivation . The
10:43
other side of the data estate , which I
10:45
would argue doesn't get enough attention but is increasingly
10:47
getting more attention , is okay . I
10:50
have a business to run . My business
10:52
requires me to do what ? I
10:54
have to share sensitive data with third
10:56
parties every single day in
10:58
massive quantities . I
11:01
have to share data with third parties who
11:03
I may or may not entirely trust . So
11:05
the idea is okay . I want to actually
11:08
protect both the data that I possess from
11:10
being lost or stolen and I also
11:12
want to have good governance and control
11:14
with respect to the sensitive data that
11:16
I need to share with third parties . I
11:19
should protect both sides of the data estate
11:21
, not just one .
11:25
Yeah , it's . You know , I
11:27
recently I guess fairly recently
11:29
, right in the last 12 months I encountered
11:31
a situation where , you
11:33
know , I worked for a very large company and I won't
11:35
name them , uh , for my day
11:37
job , right , and I
11:40
worked for the financial services
11:42
part of this company and
11:44
some of our data was with our
11:46
parent company , you know over
11:48
, still within the country and whatnot
11:51
, right , but it was just residing in
11:53
their Salesforce , right
11:55
, it was going from our Salesforce to
11:57
their Salesforce and to
11:59
us , they're not a financial institution
12:02
, we're the financial arm of this large
12:04
company . To us
12:06
, they're a third party . And that was a totally
12:08
different way of me thinking about
12:10
it , right , because my architect brought this problem
12:13
to my attention and I said , well
12:15
, what's the problem ? Right , like they're
12:18
, they're a part of us . Like we're
12:20
, we're more part of them than anything
12:22
else . You know , what does it matter ? Right
12:24
? And he said , no , we have to treat
12:26
them as a completely separate entity because
12:28
they don't deal with this data . They don't , they're not
12:31
they're not regulated .
12:32
Probably . They may or may not be regulated , right
12:34
?
12:34
yeah , they're not regulated for any of this data
12:36
, and so we had to go through
12:38
a very arduous process of not
12:41
just saying you know how are you
12:43
encrypting this data or how are you storing
12:45
it , how are you protecting it . Show
12:48
us evidence of how you're doing
12:50
it , show us evidence that you're logging
12:52
, show us evidence that
12:54
, hey , there's alerts that pop up
12:57
and we have a whole process around it . It
13:00
was a new situation for me , even
13:03
being in the finance industry for
13:06
probably the past 10 years , almost at this point
13:08
right , where it's always been
13:10
in-house to me , it's never
13:12
been that sort of situation of
13:14
it's a parent company and we're
13:17
sharing data with them and I have to think
13:19
of them as that third party
13:22
, right .
13:23
Yeah , that's the verb . I mean , I think that
13:25
that's the point . The verb is sharing
13:27
versus protecting . Right , like you
13:29
and your mindset , like a lot of security
13:32
professionals in traditional IT are
13:34
absolutely thinking first and foremost
13:36
about I have data that I need to keep possession
13:39
of and I can't let anyone get it . And
13:41
then there's the other sort of verb , which is I
13:43
have data that I have to share . They're
13:46
not a regulated financial institution
13:48
, but they're part of your larger holding company and
13:50
you need to share data with them . Because , let's
13:52
be honest , data , even sensitive
13:55
data , has to move . It moves by
13:57
definition , and when it does
13:59
move and it inevitably will leave
14:01
your possession , the question is what
14:03
can you do to share that data
14:06
but not sacrifice
14:08
ownership , control , privacy
14:11
or security ? How can
14:13
you share that data with that third
14:15
party and potentially do something
14:17
like expiry , like hey , you can have
14:19
it for 30 days but not 31 days , or
14:22
you can have it today , but you know what ? I
14:24
might change my mind tomorrow and I want to revoke
14:26
it . Like , how can you take security
14:29
architecture when you traditionally think about zero
14:31
trust and you have identity and endpoint
14:33
and network and application and then you have data
14:36
? Can you imagine shrinking
14:38
the security architecture
14:40
all the way down to the granular object
14:43
level , which is the data itself . And
14:45
in many respects I tell people all the time
14:48
when we talk about the open standard that
14:50
we're building upon here , it's called
14:52
trusted data format . I like
14:54
to remind people that it's pretty similar
14:56
to Kubernetes and
14:59
containers , like , if you think about
15:01
like software application architectures
15:04
, like 10 years ago they
15:06
were all three-tier monolithic software applications
15:09
and over a 10-year period of time , engineering
15:12
and software development teams began to
15:14
componentize those applications and
15:16
this thing called microservices
15:18
and this thing called cloud became real and
15:20
everybody realized it was like a good idea
15:23
to build applications with
15:25
microservices as core architecture
15:27
, where everything was smaller , everything
15:29
resided within a container and
15:31
the container itself was this
15:33
granular object of software which made like
15:36
production maintenance better , bug
15:43
fixing better , vulnerabilities better , like you could do so much more efficiently from an ops
15:45
perspective if the application architecture itself was shrunken
15:47
down into the container . Well
15:49
, if you think about security architecture , it's
15:51
the same thing . If you shrink security architecture
15:54
down into a container or , you
15:56
know , we like to think of it as a
15:58
, it is in many respects the same thing
16:00
as an application container , except it's a
16:02
data container , but the architecture
16:05
itself , the access control , the
16:07
policy , the entitlements
16:10
associated
16:14
with who can access this information , are all
16:16
defined in that granular level . That's
16:19
where you get to this world where policy
16:21
is defined to your earlier
16:23
point , at the intersection of data
16:25
that's been classified as this is sensitive , and
16:28
there's an identity and identities
16:31
over here that are authenticated
16:33
or entitled in some form or fashion , and
16:35
who gets access to the sensitive data . It
16:38
all depends on what data we're talking
16:40
about , whether it's been classified as sensitive
16:42
or not , and who the identity is
16:44
that's trying to access it , whether they have need-to-know
16:46
privilege or not . And , if nothing
16:49
else , just do that and
16:52
you're all of a sudden sort of thinking about the world
16:54
architecturally in a different way
16:56
than I think has traditionally
16:58
been the case .
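
As an illustration (not from the conversation itself), the intersection described here, a data object classified as sensitive on one side and an authenticated, entitled identity on the other, with policy evaluated where the two meet, can be sketched in a few lines of Python. The attribute names, the expiry rule and the revocation flag below are all illustrative, not any particular product's model:

```python
# Toy sketch of data-centric, attribute-based access control: policy travels
# with the object and the decision is made at the intersection of the object's
# attributes and the identity's entitlements. All names here are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class DataObject:
    classification: str                          # e.g. "sensitive"
    dissem: set = field(default_factory=set)     # identities the owner shared it with
    expires: Optional[datetime] = None           # "you can have it for 30 days, not 31"
    revoked: bool = False                        # owner can change their mind later

@dataclass
class Identity:
    subject: str
    entitlements: set = field(default_factory=set)   # e.g. {"need-to-know:finance"}

def can_access(obj: DataObject, who: Identity, needed: str) -> bool:
    now = datetime.now(timezone.utc)
    if obj.revoked:
        return False                             # access pulled back after sharing
    if obj.expires and now > obj.expires:
        return False                             # policy expired with the data
    if obj.classification == "sensitive" and needed not in who.entitlements:
        return False                             # no need-to-know privilege
    return who.subject in obj.dissem             # must be explicitly shared with them

report = DataObject("sensitive", dissem={"analyst@partner.example"},
                    expires=datetime(2030, 1, 1, tzinfo=timezone.utc))
alice = Identity("analyst@partner.example", {"need-to-know:finance"})
print(can_access(report, alice, "need-to-know:finance"))   # True until expiry or revocation
```

Flipping `revoked` after the object has already been shared is the "I might change my mind tomorrow" case mentioned above: the decision is re-evaluated every time the data is opened, not once at send time.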
17:02
That's really fascinating what you said
17:04
with you know , protecting
17:06
the data beyond your boundaries
17:09
and kind of expanding out
17:11
that security architecture .
17:16
Right , that is
17:18
something pretty novel
17:20
that I certainly haven't encountered
17:22
um . That's a totally
17:24
different way of thinking about it , even . It's
17:27
happening , though , and , like , just think about
17:29
this . I mean , like , let's just pretend for a second
17:31
and use a use case , that
17:33
everybody today , unfortunately , is very
17:35
familiar with this concept of NATO and this
17:37
unfortunate
17:40
thing called war right , where all of a sudden
17:42
you're assembling a force
17:44
of allies
17:46
, third parties , other
17:49
countries that are federating together
17:51
in near real time to do a job , and
17:54
you're across different domains and
17:56
the job today is here and the job tomorrow
17:58
is there . So the actual environment
18:01
in which you're executing is temporal , it's ephemeral
18:03
. Tomorrow is there . So the actual environment which you're executing
18:05
is temporal , it's ephemeral . There is no it infrastructure , because it's just incredibly
18:08
hard to build networks and and perimeters
18:10
and and and identity and access
18:13
control and all those traditional sort of it infrastructure
18:16
kind of things at the pace
18:18
at which the mission demands , because the mission
18:20
demands , you know , speed
18:23
and it has to like work here today
18:26
, now , and as
18:28
a result of the mission being very temporal
18:30
and very dynamic and cross domain
18:32
and and collaborative with different mission
18:35
partners , it's not just the US , it's
18:37
the UK , it's France
18:39
, it's Germany and it's even now
18:41
new NATO members like
18:43
Finland and Sweden . And all
18:45
of a sudden you're like , okay , how do I share information
18:48
with my trusted allies and my partners
18:50
across domains in that context
18:52
where I don't have time to build a secure
18:54
network ? How do you
18:56
do that architecturally ? The
18:59
answer is you probably have to get more
19:01
granular . The answer is you probably
19:03
have to examine the possibility
19:06
of a container-like
19:08
capability and hopefully
19:11
you could imagine it
19:13
in an
19:15
open standard . That's what
19:18
Trusted Data Format is and it's
19:20
something that you know . And
19:23
look , I'm not saying that the architectural
19:25
concept of granular is
19:29
the only thing that's necessary for
19:31
modern cybersecurity practices
19:33
to kind of reach their potential . I'm
19:44
saying that it is a component of the architecture . Yes , you're going to continue to
19:46
have to do traditional identity and endpoint and network and application security , of
19:48
course , but I'm also certain that the nature of the business that we all
19:50
have to contend with is increasingly
19:52
going to mean that the benefit
19:54
of having granular security architecture
19:57
will become obvious to folks as the world
19:59
continues to kind of unfold
20:06
.
20:06
Yeah , you put it in an interesting way . You say the world
20:08
unfolds . It
20:12
certainly feels like it . It's an interesting
20:14
time . I
20:17
feel like we've never gone through something like we're going through
20:19
or about to go through before . What's
20:21
the company that you're a part of
20:23
that's coming up with this kind
20:25
of open framework and whatnot ?
20:28
Well , first of all , I mean to emphasize
20:30
again the open standard is called Trusted
20:32
Data Format and anyone can go and look at
20:34
it . It is , in fact , hosted today by
20:36
ODNI , which is the Office of the Director
20:38
of National Intelligence , so it comes out of the NSA
20:41
. We
20:43
my company , is called Virtru , and
20:46
we have innovated on top of
20:48
this open standard and we've developed
20:50
a variety of integrations
20:52
to different workflows that are all
20:54
about the verb sharing . So
20:56
if I have to share sensitive data
20:58
in a workflow called email , or if
21:00
I have to share sensitive data in a workflow
21:03
called files , or if I have
21:05
to share sensitive data back to your example
21:07
between two different Salesforce instances across
21:12
two different domains that happen to be part of the
21:14
same company all of those scenarios
21:16
, sensitive data that has to be shared
21:19
as part of some value
21:21
stream the question is how
21:23
do you ultimately provide granular
21:25
policy , access control and enforcement ,
21:28
encryption optionally , on that information
21:30
? And you know
21:32
not to . I don't
21:34
want to diminish , you know , the importance
21:36
of Virtru as a
21:38
company , because what we're doing with
21:40
the open standard is really pretty
21:42
innovative , but I'm a big believer
21:45
in the power of open standards and I just think
21:47
that it's , uh , very compelling
21:49
to step back and sort of again
21:51
look at like , wow , man , Kubernetes
21:54
over a 10-year period of time
21:57
became the standard
21:59
for microservices application architecture
22:01
and there were lots of reasons for it . You
22:09
know , architecturally the world of how software is built and delivered in production and maintained
22:11
in production today is fundamentally different because of
22:14
an open standard , and
22:17
I believe the same will happen
22:19
with security architecture . At least
22:21
granular security architecture will
22:23
be supported by an open standard . I'm
22:25
not saying TDF is the only open standard
22:28
that might be benefited as a result of that , but
22:30
it's certainly well positioned to help with
22:32
that sort of trend , that shift
22:35
in architectural thinking . And as
22:37
that plays out , my company , Virtru
22:39
, intends to be a leader in
22:41
that regard and we're already doing a bunch of great
22:43
work today by providing granular
22:45
policy , access control and enforcement of
22:47
those policies on that sensitive
22:49
data that's shared through email , file
22:51
and application workflows .
22:55
You bring up Kubernetes
22:57
? Do you think containerized
22:59
experience or knowledge is
23:01
going to be critical
23:04
to have for any security professional
23:06
, you know , going into the future , Because
23:09
that is something that I actually haven't thought
23:12
a whole lot about , but it seems
23:14
like more of the cloud is going
23:17
towards this containerized slash
23:19
serverless infrastructure .
23:23
Smaller is better . They
23:26
call it microservices application
23:28
architecture for a reason . Microservices
23:31
in the application realm
23:34
is to microsecurity
23:36
in the cyber realm . So
23:39
, whether you're talking about cloud ops
23:41
or you're talking about security , I think
23:43
there is a shift where granularity
23:48
matters , and the shift
23:50
towards microservices and more
23:52
granular application architectures
23:54
has
23:57
gone full circle . It's a thing , it's
23:59
happened , it's done , it's there . The
24:02
shift towards micro security
24:04
architectures , with something like TDF
24:06
, is underway . You
24:09
know if it's a baseball game , we might be in the second
24:11
or third inning , but I do
24:13
believe it's going to continue
24:15
and it will
24:17
take time . Like any large scale
24:19
tectonic architectural shift in
24:22
IT takes time a decade , but
24:24
it's underway .
24:28
Yeah , it's really fascinating to try
24:31
and guess
24:33
where the market is going , where it's
24:35
all heading right , Because I always try
24:37
to approach it from the perspective of
24:40
giving people advice of what skills to get
24:42
right , Because there's so many out there , there's so
24:44
many you know , different domains
24:47
that you can specialize in and whatnot . If
24:49
there was a key , maybe one to
24:51
three skill sets that you would recommend
24:54
for someone to start mastering now , what
24:56
would those be ?
24:58
I mean number one , without a doubt
25:00
. Two things I would say
25:02
is history with
25:25
Windows to thin client computing , with the browser to on-prem server data center computing , to the
25:27
eventual migration to cloud everything and the eventual migration from
25:29
three-tier monolithic application architectures
25:31
to microservices
25:34
. If you step back
25:36
and you give yourself as a person
25:38
who's really seeking to understand
25:41
, I think , if you give yourself the benefit
25:43
of a 10,000-foot view and
25:45
you take the time to understand the big
25:48
picture architecturally , then
25:51
that's a really sound basis
25:53
from which to dive deep into
25:55
any particular area , to kind
25:57
of develop a sharper expertise . I
26:00
think it's very important to understand
26:02
the history of where we come from expertise . I think
26:04
it's very important to understand the history of where we come from
26:06
, the
26:10
reality of where we are and the potential for where we're going . If you can ground yourself in
26:12
that big picture then it's a lot easier to make decisions as
26:14
to where you want to go deep .
26:18
Hmm , yeah , that
26:20
is really fascinating . You start
26:22
with the history of it . I never
26:24
thought about it like that . To be quite honest , I've
26:27
been doing this for several years now and I've
26:29
never thought about trying to go back
26:31
and see where things were
26:34
and trying to guess , use
26:36
that to judge where everything is going now
26:38
. It's a really interesting method
26:41
.
26:42
It's history . Here's the good news
26:44
, and it's kind of fun . Um
26:49
, you know , you look at something like Lotus
26:51
Notes and Ray Ozzie . You know who
26:53
invented Notes , you know
26:55
, and you look at the massive
26:58
implications that that had , as it related
27:01
to what we now know to be modern
27:03
computing . I mean , it was truly , truly
27:06
formative . And you
27:08
think about , I mean cryptography
27:11
. I mean , Notes was the first product
27:13
in the history of the world to distribute cryptography
27:16
at a time when the federal
27:18
government and the NSA in particular , wasn't particularly
27:20
keen on anybody distributing cryptography
27:23
if you weren't employed by the NSA , like , like , like
27:25
. There's a lot of history there . I do think if
27:32
you spend a little bit of time in the history
27:34
of the , of the industry and the evolution
27:37
of those , um , great company
27:39
stories , great product stories
27:41
, great innovation
27:44
successes , they all have an
27:46
opportunity . They all tend to teach you
27:48
a ton , not
27:50
just about what happened in the past , but they
27:52
all tend to give you some really interesting perspective
27:54
with regards to where things are right
27:56
now and why .
27:58
Hmm , yeah , it's
28:01
fascinating , you know , bringing up
28:03
encryption and cryptography and the
28:05
fight that the NSA went through , right
28:07
of trying to maybe , you know
28:10
, keep it behind closed doors , but
28:25
then I also feel like there's a whole lot more
28:27
reasons that it should be out there , right
28:29
? Obviously , you know , people have to be
28:32
able to protect their own data . They have to be able to
28:34
own their own data and ensure the integrity
28:36
of it , right ? Without the encryption
28:38
capability , you're not really able to do that
28:40
.
28:40
There's a terrific book , if you're interested . It's called Crypto and
28:45
it's by Steve Levy and it's really , really
28:47
awesome and I would encourage
28:50
anyone that might be listening to check it out . But
28:53
it's , it's , it's everything that
28:55
you're talking about . And it
28:57
goes from the very earliest beginnings
28:59
. You know with like you
29:01
know with Diffie and Hellman . You
29:04
know , public key infrastructure
29:07
. It goes through the whole NSA
29:09
. You know hand-wringing about we can't
29:11
let encryption into the hands
29:13
of anyone , because that
29:16
would be bad for us to where
29:18
we find ourselves today , where
29:20
encryption is a necessary component
29:23
of good modern , uh , IT
29:26
engineering and cyber hygiene , because
29:28
, um , you're up against
29:30
really formidable opponents who
29:32
have really top-notch skills and
29:34
you better be able to protect your information
29:37
with cryptographic skills . If not
29:39
, you're gonna lose . I
29:42
mean , it's been a long time coming , but
29:44
that world has come full circle too .
29:47
Yeah , so how
29:49
does Virtru let's
29:52
talk about how Virtru , you know , solves
29:54
this problem or helps working
29:56
towards solving this problem .
30:00
Yeah , I mean , listen , at the end of the day , sometimes
30:03
people look at Virtru from the outside and they go it's
30:05
for example . Somebody might look at it and go , oh
30:07
, it's an email encryption company . I'm
30:09
like no , it's not . Yes , we
30:11
provide a 50-year-old with no IT
30:13
education who's
30:24
a nurse at some healthcare practice in the middle
30:26
of the country , doesn't know the first thing about
30:29
encryption , can compose an email , attach
30:31
a file and click a button and apply
30:33
granular policy and access
30:36
control and encryption to the object for
30:39
the purposes of protecting HIPAA data
30:41
. It is that easy , and
30:48
all of the magic that happens under the covers is made possible
30:51
by the Virtru data security platform
30:53
and the services
30:55
that we make available in
30:58
that platform . Things like encryption
31:00
, key management , policy
31:02
definition , enforcement , access control
31:05
all those things are
31:08
exposed in an application
31:10
that gets integrated into this thing
31:12
called Gmail or this thing called Outlook
31:15
. Alternatively , we can
31:17
integrate into different file sharing services
31:19
like Google Drive . Alternatively , we can provide policy and access control
31:28
between two different SaaS applications that might be sharing data back and forth , but it
31:30
is ultimately for us . It's
31:32
one of the reasons I really wanted to emphasize earlier
31:35
. We're very clear about who we
31:37
are and who we aren't . At Virtru
31:39
, we are not in the business of helping
31:41
you protect sensitive data
31:43
that you possess inside of your business from being
31:45
lost or stolen due to bad guys . That's
31:47
not my business . My business
31:49
is the other side of that coin . My business
31:51
is the other side of that estate . My
31:53
business is helping you get
31:55
to a place where you can confidently
31:57
share sensitive data with
32:00
third parties in the name of driving
32:02
your business forward , because that's what's
32:05
required . You have to share data to do
32:07
business . I want to give you the
32:09
confidence and the simplicity and the ease
32:11
and the elegance to do that in
32:14
a way where you can share the data but
32:16
you are not going to sacrifice control
32:18
, privacy or security
32:20
, and you can do things like expiration
32:23
and revocation , because the data
32:25
belongs to you and you alone
32:27
. Hmm .
32:30
And does this work with
32:33
other SaaS applications like Salesforce
32:35
and all the other myriad
32:37
of apps out there ?
32:39
So we have natively integrated this
32:43
data-centric security granular
32:45
control into SaaS
32:47
applications like Zendesk ticketing for
32:50
help desk . We have
32:52
, I would
32:54
call it , arm's length integration into
32:56
Salesforce vis-a-vis
32:59
what's called an application
33:01
gateway . So as long as your Salesforce
33:04
instance is communicating sensitive
33:07
information out of your Salesforce instance to a third
33:09
party and you're using SMTP to
33:11
do it , we can just very elegantly
33:14
apply policy and give you all the benefits of
33:16
those granular controls that we just talked about
33:18
. We have also developed a
33:20
platform
33:23
that's now increasingly being deployed
33:25
on the high side in support of DOD
33:27
customers and IC customers , which
33:29
gives them the ability to take
33:32
advantage of those low-level system
33:34
services that I described at the platform
33:36
substrate and to incorporate
33:39
them into , you
33:41
might think of them as legacy mission applications
33:43
, like older applications that
33:45
would benefit from granular policy
33:47
and access control on unstructured data that's
33:49
being shared out of the application . That's
33:53
not something that's off the shelf , that's a bit
33:55
more custom and bespoke
33:57
for some of those customers . But
33:59
yeah , that's what we're doing
34:02
and it's all about sensitive data
34:04
and I want to emphasize the verb sharing
34:06
of sensitive data , because if you're going to
34:08
share the sensitive data , you
34:10
got to think about security architecture
34:13
differently than if you're only
34:15
focused on protecting
34:17
it from being lost or stolen , and you already
34:19
possess it . Like , the intentionality
34:22
. You have agency over data
34:24
, um , and you
34:26
have agency over the data that you possess
34:29
, because you don't want someone
34:31
else to get at it . I get , get that , that is
34:33
not my business . But
34:37
you also have to have agency over the data
34:39
that you intend to share with others , and
34:41
that's what we do .
34:44
So you were talking earlier about
34:46
having almost like a container around
34:49
that data . What
34:52
does that look like ? I mean , is it really
34:54
a container and you're assigning permissions
34:57
to that data type
34:59
? How is the container defined ? I guess
35:01
is a better question .
35:03
That is exactly the open spec
35:05
in the Trusted Data Format , which is
35:07
the open standard . It is think of
35:09
it as an XML wrapper or
35:11
a bit . I mean . Sometimes people will call it an XML
35:14
wrapper , sometimes people will call it an XML
35:16
wrapper , sometimes people
35:18
will call it a container . People historically
35:20
have called it a wrapper or an envelope . I
35:23
have become fond of calling it a container
35:25
and the reason I like calling it a container is because it's
35:27
essentially an XML
35:30
standard which
35:32
basically allows you to define policy
35:35
and to assert
35:38
policy on the object . Like ,
35:40
this thing is allowed to be shared with
35:43
this person . This person can access
35:45
it . It's going to require encryption
35:48
and the way that it's going to be decrypted
35:50
is this way . And so defining the policy
35:53
and giving you the ability to enforce
35:55
the policy at that
35:57
intersection between this object , which
36:00
we've determined is sensitive , and that
36:02
entity , which we'll call it a human
36:04
or a machine , once
36:06
it's been authenticated and entitled , is
36:09
how we ultimately
36:12
bring to life the application value
36:14
that Virtru delivers . But we do it
36:16
all on top of an open
36:18
standard . And to your question
36:20
, for anybody who's interested , I would
36:22
encourage them to look at it . It's
36:25
very easy . You know , the TDF
36:27
spec is available for anyone to see . Just
36:29
simply Google it . There's also the OpenTDF
36:32
project . You can see the
36:34
full spec there . You can see sample code
36:36
, you can see use
36:38
cases . There's a really good , rich , robust
36:41
set of information that's available
36:44
for anyone to kind of dig into and get their
36:46
head wrapped around it . It's pretty robust
36:50
.
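
For readers who want to see the shape of the idea without reading the spec, here is a conceptual sketch in Python (using the third-party cryptography package) of a data container: the payload is encrypted with a per-object key, the key is wrapped by a stand-in key-access service, and a small manifest carries the policy alongside the ciphertext. The field names and the wrapping scheme are invented for clarity; they are not the actual TDF wire format, which is defined in the published spec and the OpenTDF project mentioned above.

```python
# Conceptual sketch of a self-protecting "data container": encrypt the payload,
# wrap the per-object data key, and carry the policy (dissemination list,
# attributes, expiry) in a manifest that travels with the ciphertext.
# NOT the actual TDF wire format; field names here are invented for clarity.
import base64, json, os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def b64(raw: bytes) -> str:
    return base64.b64encode(raw).decode()

def build_container(payload: bytes, policy: dict, kas_key: bytes) -> dict:
    dek = AESGCM.generate_key(bit_length=256)          # per-object data key
    iv = os.urandom(12)
    ciphertext = AESGCM(dek).encrypt(iv, payload, None)

    # Stand-in for a key-access service: wrap the DEK and bind the policy to it
    # as associated data, so the policy can't be swapped without breaking the wrap.
    wrap_iv = os.urandom(12)
    wrapped_dek = AESGCM(kas_key).encrypt(wrap_iv, dek, json.dumps(policy).encode())

    return {
        "policy": policy,                                         # who, what, until when
        "keyAccess": {"wrappedKey": b64(wrapped_dek), "iv": b64(wrap_iv)},
        "method": {"algorithm": "AES-256-GCM", "iv": b64(iv)},
        "payload": b64(ciphertext),
    }

kas_key = AESGCM.generate_key(bit_length=256)
container = build_container(
    b"quarterly numbers",
    {"dissem": ["analyst@partner.example"], "attributes": ["sensitive"], "expires": "2030-01-01"},
    kas_key,
)
print(json.dumps(container, indent=2))
```

Binding the policy to the wrapped key as associated data is one simple way to show the design intent: whoever holds the unwrapping key is the natural place to check the policy before releasing the data key, which is the role a key-access service plays in the architecture described here.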
36:51
Yeah , it's really fascinating and you describing
36:53
it as a container makes it more
36:56
, I would say , easily consumable
36:58
and understandable to a lot
37:00
of people . Right , Because if I would have seen
37:03
let's just say , for
37:05
example , right , if I would have seen like XML
37:07
wrapper in a description
37:10
or something like that , I'm going to think of it differently
37:12
. But now that you've related those two terms
37:14
to something that I
37:16
understand like container , it
37:18
makes it a lot easier
37:21
to understand , I guess .
37:23
Well , I'm curious . So I'm
37:25
glad to hear you say that . But I'm curious
37:27
, why ? Is that because you
37:29
have an IT background and you already
37:31
kind of conceptually understand what containers
37:34
are in the application sense
37:36
, and to you a container is nothing more than
37:38
a small microservices
37:40
unit of software which sits inside
37:43
of this container , which allows me to
37:45
manage it in kind of a molecular
37:47
nature . So container means something to
37:49
you in a software sense and it's
37:51
easy to relate that to a security sense
37:53
. Is that ?
37:55
is that true ? Yeah , so I
37:57
would say I have more experience with
37:59
containers than I do XML wrappers
38:01
or TCP wrappers . Right
38:04
, because that's not my background . My background is
38:06
more infrastructure , turning
38:08
into IAM and data security
38:11
and network a little bit , right
38:13
. So when I hear container , I understand
38:15
what a container is . I understand
38:17
, you know the different security principles
38:20
of it and everything else isn't
38:35
an easy concept for a lot of people to grasp .
38:36
And back to our earlier conversation about how do you determine whether you're new to the
38:39
industry and just getting started , or whether you're
38:41
a longtime veteran of the industry and you know
38:43
what you already know . I
38:45
think it's oftentimes easiest
38:48
to kind of convey
38:50
you know . Again
38:52
, be really clear about who you
38:54
are and who you aren't . And in
38:56
order to be clear about who you are , I
38:58
think it's oftentimes easier
39:01
to do that when you can communicate
39:03
in the context of something that everyone
39:05
else already understands
39:08
. So , like the world
39:10
gets containers today because
39:13
of Kubernetes . The world
39:15
gets containers today because
39:17
of the cloud . The world gets
39:20
containers today because they remember
39:22
the old days , 10 years ago , when
39:25
you had an application in production and you
39:27
had to take it down for
39:29
48 hours just to patch a zero-day
39:31
vulnerability . What ? Your
39:35
software is going to be down for two days to patch a vulnerability ? That's insane
39:37
. Now they're like , no , the
39:40
application doesn't come down . We're going to patch
39:42
the vulnerability here in
39:44
this container . We're
39:46
not going to do open heart surgery , we're going to
39:48
do laser surgery . It's
39:51
like that's the power of granularity . And
39:53
then it's like , okay , so I get containers
39:55
and how they're valuable to the application
39:57
architecture . And then you're all
39:59
of a sudden having a conversation . You're like , well , now let's talk
40:02
about containers and how they can be powerful to
40:04
security architecture . What
40:06
do you mean ? I mean like a little
40:08
container , you put sensitive data in
40:10
it and you share it . Why
40:13
? So you can define policy , enforce
40:15
policy and access control and
40:17
you can protect the thing , the
40:19
object , at a granular level like
40:21
never before .
40:22
It's just , yeah
40:26
, that that is . That's very interesting
40:28
. And you know , you , you brought up
40:30
, uh , you know , Europe , right
40:32
, and my mind immediately
40:34
went to GDPR and how useful
40:37
this would be in that environment , right , I
40:40
wonder if this will be used to
40:42
kind of push along even more
40:44
, you know , even more
40:46
, I guess , recommendations or
40:48
policies within the
40:51
United States itself , right , like
40:56
kind of having that mentality shift and then creating the
40:58
policy to follow it . Does that
41:00
make sense ?
41:01
Well , 100% . So let's talk about that for a second
41:03
, because with the email
41:05
product that I mentioned the integration
41:08
of Virtru into an email workflow like
41:10
Gmail . Let's use Gmail as an example . You're
41:13
the business and you're a smart IT
41:15
person . You understand encryption and keys
41:18
. You get the basics of what I'm talking about here
41:20
. Um , what
41:22
happens is Google has your content
41:24
, they have your email , they have your , your , your
41:27
keystrokes , like they've got your content in their
41:29
cloud . And when Virtru
41:31
integrates into Gmail
41:34
, you
41:36
then click a button and you apply a policy
41:39
and access control and encryption and
41:42
the encryption key . They
41:44
have your content , but we have your key , and
41:47
so there's separation of trust , and
41:49
that's a good thing . As you kind of go
41:52
back to your GDPR analogy , it's like
41:54
you know what it's my
41:56
data . It's not Google's . I
41:58
understand Google Workspace is
42:00
a remarkably powerful cloud collaboration
42:02
platform . I love it and
42:04
at the same time , it's my data , not theirs . I want to take advantage
42:15
of everything that Google Workspace has to offer me in a way where
42:17
I'm in control of my data , not Google . And you know this concept of like a blind subpoena . I don't
42:19
know if you're familiar with it , but , like God forbid
42:22
, you know this is popularized
42:24
now , just recently with the assassination
42:27
attempt on Donald Trump . You know
42:29
, the young man who did
42:31
this , you know , apparently had his iPhone locked
42:33
and there's some
42:35
debate now going on about how law
42:37
enforcement is working
42:40
to get to the device
42:42
, and apparently they were able to do so
42:44
with some assistance from a third party
42:46
who has expertise in that . But
42:49
, as I understand it last and I'm
42:51
not fully read up on this , but I understand that there
42:54
was some subsequent information that was encrypted
42:56
in , I think , WhatsApp , which was the application
42:58
on the guy's device . But
43:00
it goes down to this and this , of
43:03
course , goes full circle back to the NSA
43:05
and the law enforcement concerns . At
43:07
what point does society get so
43:10
good at encryption
43:12
and privacy that it makes it difficult for
43:14
law enforcement to do their job
43:18
? I'm not in the business of drawing those lines
43:20
I mean , that's way bigger
43:22
than me but I do absolutely
43:25
agree with you that in
43:29
the world as we know it today
43:31
, more and more human , just normal
43:33
people are beginning to understand
43:36
the value of their data . And
43:38
when they understand the value of their data , they're going
43:40
to ask for capabilities , they're
43:42
going to ask for the ability to control their
43:44
data . They just don't want to simply
43:46
give it up to Google or they don't want to simply
43:49
give it up to their bank or whoever because
43:51
it belongs to them . And
43:54
in that world where everybody understands
43:56
that it's their data , that
44:04
world will begin to demand more and more capabilities from their cloud providers
44:06
, from their application providers , from their IT
44:09
providers , and
44:11
that's again back to that . It's
44:13
going to take 10 years , but that's where the
44:15
world's going , I believe . Yeah
44:18
, I certainly hope so .
44:20
I feel like you know , just seeing how our
44:22
own data is being used against
44:24
us to like , form different opinions and
44:27
direct our thinking and our buying
44:30
habits and everything else like that , right , like
44:32
it's frustrating . It's
44:34
frustrating , it's very frustrating and
44:36
it's also very eye opening because it's like
44:38
oh , you guys are , you guys are monetizing
44:40
everything about me when I use
44:42
your platform , whether I , whether
44:45
I know it or not , and that's really
44:48
frustrating because now you're making money off
44:50
me and you're a multi billion dollar company you know probably trillion dollar
44:52
market value , right , and you're making that money off
44:59
of me . You know that's . That's
45:02
a very uh , dicey topic
45:04
, right ?
45:05
well , listen , I mean , you know , I don't know
45:07
how big the market for DuckDuckGo is . I've
45:10
heard it's like less than 10 , maybe
45:12
less than five . It's small , but
45:14
there's a percentage of people out there
45:16
that are all in on DuckDuckGo and
45:18
believe the power of that browser
45:20
and the privacy enhancement capabilities
45:23
that it delivers is well worth the investment
45:25
, for exactly the reasons you just articulated
45:27
. But it's not nine , it
45:30
is a small percentage . And then
45:32
there's , you know , I
45:34
know a guy , a friend of mine , John Doyle
45:36
, who's the CEO of a company called Cape Wireless
45:39
here in Washington DC , who is
45:41
getting ready to launch a really
45:44
innovative , interesting national
45:46
cellular carrier , wireless carrier
45:48
network where you can basically
45:51
go to Cape Wireless and
45:53
get a new mobile phone
45:55
number and a phone and they don't
45:58
ask you for any information about you because they
46:00
don't need it . They're not creating
46:02
an account profile with your name
46:04
and your social and your address and your email
46:06
and all that stuff , because they don't
46:08
want that information . Like
46:11
this is privacy-first
46:13
mobile carrier network
46:16
called Cape Wireless , and so again
46:19
, it's a journey we're all on it and
46:21
privacy and this whole thing is
46:23
complicated . I
46:25
don't . I
46:28
, you know . I separate that a little
46:30
bit from just the IT infrastructure
46:32
side of it . Before you can kind of get to that future vision , there's the practical
46:36
reality today that says I'm just a healthcare
46:39
company trying to share sensitive patient data
46:41
with a client or a patient outside
46:43
of the organization . How do I do that in
46:45
a way where I'm compliant with HIPAA ? Or
46:47
I'm a bank and I have to share really sensitive
46:49
information with a client
46:51
who's on a yacht in the Caribbean and I
46:54
want to encrypt it , but I don't want the person
46:56
on the yacht to struggle mightily
46:58
with the decryption experience . It's got to be
47:00
simple , elegant , seamless for everybody
47:02
the person in the bank sending the information
47:05
and encrypting it and the person on the other
47:07
end receiving it and decrypting it . You
47:09
know these are simple , practical
47:11
things that businesses are
47:13
doing today with Virtru products
47:15
powered by that OpenTDF standard
47:17
and you know
47:20
we're excited to play a role in
47:22
I'll call it data-centric
47:24
security . But the founders
47:26
actually both have
47:28
a deeply held belief that they're
47:30
doing the right thing as it relates to privacy
47:33
.
47:35
Yeah , that's awesome . Well , Matt
47:37
, you know we're at the end of our
47:39
time here , unfortunately , but I really enjoyed
47:42
our conversation . I think it was a fantastic
47:44
conversation .
47:45
Yeah , I appreciate you having me , and thanks
47:48
for the opportunity to connect
47:50
and compare notes and we'll catch up with
47:52
you soon .
47:53
Yeah , absolutely . Well , Matt , before I
47:55
let you go , how about you tell my audience where they can find
47:57
you if they want to reach out and where they can find your
47:59
company ?
48:00
Yeah , virtru.com , that's V-I-
48:02
R-T-R-U dot com ,
48:05
and
48:07
I am available
48:09
right there on the company management page , and you
48:11
can also find me on LinkedIn .