Episode Transcript
0:00
you know he's more or less an IT
0:02
guy as well? No, but I can't
0:04
imagine after your story about stealing things in
0:06
Beeswift. You said that he was part
0:08
of Pirate Bay, maybe? I thought you would
0:10
say IT guy, yeah, because he's stealing. Soon
0:14
we'll be discussing data and AI.
0:17
But first, let's talk a bit about
0:19
music and even fashion. Hey, Robert, I
0:21
have a confession to make. As you know,
0:23
we do record these things from home,
0:25
and I'm currently at home. It's still early
0:27
in the morning as we speak when
0:29
we're recording this, and I am wearing my
0:31
Japanese streetwear, as I often wear inside
0:33
the home. And these trousers, they're
0:35
very wide at the waist, you know, the upper
0:37
half, and then narrow at the ankles.
0:39
And that's Japanese streetwear, apparently. I wasn't aware
0:41
of that, but that's what they call it.
0:43
Some people call it harem pants. And
0:45
when I thought about that, I had
0:47
to think about MC Hammer, because that
0:49
guy was wearing these pants before they
0:51
were fashionable. And there's, of
0:53
course, his famous 1990 hit, 'U
0:56
Can't Touch This.' And this is a very
0:58
interesting song. And I'll come back to that.
1:00
But it's also famous for its dance. Have
1:02
you seen it, Robert? I've seen it
1:05
run, definitely. And I must say, I'm more
1:07
in the punk and rockabilly scene. Of
1:09
course. I was not really impressed. Same
1:12
here. But the music I
1:14
listen to, I have no problem with 'U Can't
1:16
Touch This' and that kind of song. But
1:18
this video, yeah, no, it's a bit far from
1:20
me. It's a famous video,
1:22
of course. You had this
1:24
musician, Weird Al Yankovic, and he
1:26
made a parody of it, as
1:28
he often did. His song was called
1:30
'I Can't Watch This.' When you see MC Hammer
1:32
for the first time making that
1:34
dance in his harem pants, in his
1:36
Japanese streetwear, that is life-changing. There's
1:38
more to this song because, Robert, it
1:40
uses a sample. And that sample is
1:42
actually a famous bass riff from Rick
1:44
James, from his famous song 'Super Freak.'
1:46
They sampled it and at that time,
1:48
1990, it was still a new thing
1:51
to do. A similar thing happened
1:53
with the Sugarhill Gang. They were one
1:55
of these very first groups that actually
1:57
made rap, I think, popular and they
1:59
had this song called Rapper's Delight, maybe
2:01
you still know it. And this was
2:03
fully based on the Good Times
2:05
bass riff, magnificently played by the late
2:07
Bernard Edwards. So in both
2:09
cases, because it was so new, they
2:11
simply stole the bass riffs, sampled
2:13
them, and used them in their own
2:15
songs, and they didn't even pay
2:17
attention to who the
2:19
original artist and composer was. So they
2:21
were sued in both cases and had
2:24
to pay literally millions
2:26
in royalties. But let's get back a
2:28
little bit to MC Hammer, Robert.
2:30
He used to be literally
2:32
an internet emperor in those days.
2:34
He did many TechCrunch conferences, for
2:36
example. Around 2007, he was chief
2:38
strategy officer of DanceJam.com. This was
2:40
a social media community site exclusively
2:42
dedicated to dance video competitions,
2:45
techniques and styles, and MC Hammer would
2:47
frequently be the judge of these things. So
2:49
he was actually into social media
2:51
at an early stage, very much also
2:53
involved, for example, with YouTube and Twitter,
2:56
very much also involved in devices: not
2:58
only the iPad, for example, for which he was
3:00
very much an ambassador at a very early stage,
3:02
but also the so-called
3:04
ZAGGmate. You know what that is? No,
3:06
I do not. That's an iPad keyboard.
3:08
And then in 2011, already some years
3:11
ago, but you see how long he's
3:13
already doing this, he announced a new
3:15
internet venture called WireDoo, a deep search
3:17
engine. And at the time he
3:19
wanted to rival Google and Bing. WireDoo
3:21
never got out of beta mode, but
3:23
nice try again. So you see, there's
3:25
a lot of technology there. That's
3:29
quite an interesting guy. And all
3:29
of this started with my pants
3:31
that I'm wearing this morning. My
3:34
Japanese streetwear. Anyway, we happen to
3:36
be talking today with Ian Wachters
3:38
from Roseman Labs, and this is
3:40
a company focusing on collaborating on and
3:42
sharing data through encryption, so that
3:44
the data itself is never visible
3:46
or affected. And that actually makes
3:48
perfect sense of why we
3:50
have named this podcast episode 'You
3:53
Can Touch This Data.' Welcome
4:01
to the data -powered innovation
4:03
jam. A podcast series about
4:05
AI, analytics, intelligence and all
4:07
that data jazz. Data
4:09
brings value, inspiration and innovation to your
4:11
business. And that is what
4:14
we explore in every episode. Bringing
4:16
you the latest trends, discussing the
4:18
best ideas and sharing experiences. We
4:20
as hosts, well, at least some of
4:22
us, happen to have a background as
4:24
avant-garde musicians. So every now and
4:26
then we can't help but navigate the
4:29
edges of jazz, rock and pop. Because,
4:31
after all, they're just as groovy as
4:33
data and AI. And one
4:35
thing is for sure, we'll always be jamming.
4:37
I'm Ron Tolido. I'm Weiwei
4:39
Feng. And I'm Robert Engels. So
4:42
a warm welcome today
4:44
to Ian Wachters from Roseman
4:46
Labs. Wachters, which only the
4:48
Dutch can pronounce. Yeah, well, you
4:50
try that. Ian Wachters. And
4:53
here we're talking about sharing data, encrypting
4:55
it, making it invisible, which so much
4:57
fits into our discussion about MC Hammer as
4:59
well. It sounds very interesting Ian, so
5:01
welcome to the podcast. We're making this
5:03
a little bit of a Dutch thing
5:05
today because both Robert and myself originally
5:07
are Dutch and now we have a
5:09
third guy into the podcast as well.
5:11
So welcome Ian, very nice to have
5:14
you in the podcast. For those people
5:16
that don't know Roseman Labs at all
5:18
and its technology, I think it would
5:20
be worthwhile to start a little bit
5:22
with an introduction from your side. What
5:24
is Roseman Labs exactly and what are
5:26
they doing? Roseman
5:28
Labs is a company that builds a
5:30
data platform through which parties can collaborate
5:32
on data. They can bring data sets
5:34
to the platform. They can combine those
5:36
data sets, analyze them, build AI on
5:38
top of it. But all of that
5:40
with a very specific feature, which is
5:42
while the data sits on the platform, you
5:45
cannot see the data. So you
5:47
can run analytics or AI on
5:49
data that you cannot see. You can't
5:49
touch it either; you cannot touch
5:51
it. Only the algorithm, the Python
5:53
script, the AI, whatever you want
5:55
to run on the data, can
5:57
touch the data. But you can't
5:59
touch it and you can't see
6:01
it. And that helps organizations to
6:05
collaborate in those situations where they
6:07
have data or they would like
6:09
to combine data or access data
6:11
from each other. But they can't
6:13
share it, either because of privacy,
6:15
confidentiality, or IP reasons. They don't
6:17
want to exchange the data with each other,
6:19
but they do want to collaborate.
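To make that concrete: one standard technique behind this kind of platform is secure multi-party computation via additive secret sharing, where every input is split into random-looking shares and only aggregated results are ever reconstructed. A minimal sketch of the idea, assuming simple additive sharing over a prime field; the names and numbers are illustrative, not Roseman Labs' actual API:

```python
import secrets

P = 2**61 - 1  # a prime modulus; all arithmetic is done mod P

def share(value, n_parties=3):
    """Split `value` into n additive shares that sum to value mod P.
    Any subset of fewer than n shares looks uniformly random."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Two hospitals contribute a patient count without revealing it.
a_shares = share(1200)  # hospital A's secret input
b_shares = share(3400)  # hospital B's secret input

# Each compute party adds the shares it holds, locally.
sum_shares = [(a + b) % P for a, b in zip(a_shares, b_shares)]

# Only the final aggregate is reconstructed, never the inputs.
assert reconstruct(sum_shares) == 4600
```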
6:22
Nowadays, you're a senior commercial advisor to Roseman Labs;
6:24
you used to be their chief commercial officer
6:26
as well, and before that, an
6:28
advisor. So how did you get in touch
6:30
with Roseman Labs in the first place?
6:32
Because I think you have a wide interest
6:34
in, let's say, the startup community. Right.
6:36
Yeah, that's correct. Yeah, I used to be
6:38
a management consultant with BCG. That was
6:40
the sort of the core of my career.
6:42
Decided to leave about five years ago. Submerged
6:44
myself in the startup scene, first via
6:46
YES!Delft, the incubator of the technical
6:48
university in Delft, my former university. And
6:50
that is where, working with many
6:53
startups, I came across Roseman Labs and
6:55
I had the honor to be their
6:57
mentor in the startup program. The
6:59
company wasn't even founded yet. It was early
7:01
2020. After the program, the founders decided
7:03
to start the company. So we just
7:05
had our fifth anniversary last week. And
7:07
then I became a member of the
7:09
advisory board. So we stayed in touch.
7:11
I was helping the guys a bit. We were
7:14
having discussions from time to time. And
7:16
at some stage, I decided to quit
7:18
the startup that I was working for
7:20
in an operational role. And they asked
7:22
me to join and help set up
7:24
the commercial side of the company. So
7:27
that's what I did. But I
7:29
also said, well, I should make sure
7:31
that after a couple of years,
7:33
I can leave again. I can leave
7:36
you alone. So we started hiring
7:38
a team and last year I decided
7:40
to hand over the CCO role
7:42
to others. So I'm back to more of
7:44
a pure operational business development role.
7:46
That's also what I really like: being
7:48
outside, talking to potential clients, explaining
7:50
our technology and what we do. I'm
7:52
no longer in the management team.
7:54
I hear you, that actually makes a
7:57
lot of sense, to be honest. But
7:59
before we dive a little bit deeper
8:01
into the actual technology and its
8:03
use cases: why did this
8:05
particular startup, Roseman Labs, attract
8:07
you so much? Why
8:09
did it draw your attention? Two reasons. I
8:12
like startups on the one hand
8:14
that do something that is really special.
8:17
They really have a great vision
8:19
and something really exciting in terms
8:21
of technology. But on the
8:23
other hand, I've also learned over time that
8:25
especially when you want to get involved
8:27
yourself, the team is super important. And I
8:29
mean, one of the reasons I always
8:31
say, why did I stay with BCG for
8:33
20 years? It was because of the
8:36
people. And so I got
8:38
to know Roseman Labs over time.
8:40
I knew the guys, the founders,
8:42
and I knew that if I would
8:44
join them, that would be a good
8:46
fit on the people side. Yeah, very
8:48
important. That is always important. I was
8:50
just wondering, because this is quite a
8:52
story, going into a startup environment. I
8:55
did some mentoring of startups myself as
8:57
well. I know that it's always very,
8:59
it gets you going, it gets your
9:01
thoughts flowing and it's nice to be
9:03
in this vibrant environment. And then all
9:05
of a sudden you have a startup
9:07
technology under your nails and you have
9:09
to do the business development, like you
9:11
said. How do you
9:13
explain secure multi-party computation
9:16
and so on to non-technical
9:18
executives? Do you have an analogy that
9:20
you use then? No, not
9:22
so much. Initially, we very often talked
9:24
about the technology. And more and more
9:26
over time, we realized that talking about
9:28
the technology is actually not so relevant. You
9:31
should talk about the value that it
9:33
brings. And the easiest way to do that
9:35
is to talk to people about the
9:38
challenges that they have. So if you talk
9:40
to a researcher in an academic hospital,
9:42
what are the challenges that these people have
9:44
when it comes to data? Well, it
9:46
very often is getting access to data. Patient
9:48
data is sensitive. They may have access
9:50
to the data from their own hospital, but
9:52
then to combine it with data from
9:54
the GP or the pharmacies or commercial companies,
9:57
all of that is very difficult. And
9:59
then you start to explain, okay, but
10:01
we have a way to solve that
10:03
problem. We can combine your data with
10:05
other data sets and then you can
10:07
run the analytics across the broader data
10:09
set. And that is what is appealing
10:11
to people. In healthcare, that is understandable.
10:13
The privacy is very important. Are there
10:15
any other sectors that you're working with?
10:18
Yes, there are. Healthcare is the area
10:20
where we started to really push. But
10:22
our first client, interestingly enough, was not
10:24
in healthcare; it is the National Cyber Security
10:27
Center in the Netherlands. And
10:29
they use our platform
10:31
to collect cyber threat intel
10:33
from organizations across the
10:35
Netherlands. By now, something like
10:37
110 organizations, partly vital
10:39
infrastructure, but also some
10:41
large companies that experience these threats and
10:43
attacks from ransomware groups, from
10:45
state actors, you name it. And
10:47
that information that they collect
10:50
is super sensitive. They don't
10:52
want to disclose that to anybody, not even to
10:54
the National Cyber Security Center. So
10:56
the NCSC uses our platform to collect
10:58
this information, but they can't see the
11:00
data, but they can analyze it. And
11:02
they can see the trends. They
11:05
can see the modus operandi. They can
11:07
connect certain cases under encryption. They can
11:09
connect the cases and say, hey, these
11:11
cases actually have the same attackers. They
11:13
have the same modus operandi. And
11:16
in that way, they can
11:18
help the national security companies
11:20
in the Netherlands to protect
11:22
themselves against threats. That would
11:24
also be a way, in
11:26
an infrastructure, to mitigate risks
11:28
of state actors attacking a country,
11:30
for example: while you
11:32
monitor a whole infrastructure of
11:34
an ecosystem, you can
11:36
still protect all the different endpoints
11:38
of the ecosystem against direct attacks
11:40
through your infrastructure that you build
11:43
over it, because you secure all
11:45
this data. And I really like
11:47
that. That is a very good
11:49
one. Yeah. I would say in
11:51
the security and the public domain,
11:53
from law enforcement and security,
11:55
it's easy to make the step
11:57
to defense, especially nowadays with hybrid
11:59
warfare: connecting intel from organizations to
12:01
understand what is happening and what
12:03
sort of threats are there. To
12:05
mention another one is the banking
12:07
sector. Financial institutions have the
12:09
obligation to detect financial crime, to
12:11
stop scams. Doing that on their
12:13
own is very difficult and to
12:15
enable them to collaborate on that
12:17
topic, to exchange information, to
12:19
bring information about certain clients together
12:21
and to analyze that data to
12:23
understand whether there's a criminal activity
12:25
happening. That is a very powerful
12:27
thing. And so far that has
12:30
been very difficult because banks don't
12:32
want to disclose their commercial data
12:34
to each other. Indeed. And also,
12:36
of course, there's a privacy issue. And
12:38
another area is basically anything
12:40
in manufacturing where you have
12:42
multiple parties collaborating to produce
12:45
a product. Think about automotive.
12:47
It's a multi-tier supplier approach where one
12:49
company makes something for the next company. Well,
12:51
if you detect a problem somewhere with
12:53
the quality, you want to understand where the
12:56
problem is coming from and what is
12:58
driving it. In order to do that in
13:00
an efficient way, you basically need to
13:02
bring the data together from multiple parties. Yeah,
13:05
that is a challenge because there's commercial
13:07
interests, there's supply chain interests. Talking
13:09
about supply chains and talking about manufacturing,
13:11
this is the year in which all
13:14
of that is completely shaken up, I
13:16
guess, for all sorts of different reasons,
13:18
as we all know them. I can
13:20
imagine, particularly the world of manufacturing and
13:22
the whole supply chain around it, that
13:24
the need for collaboration is not just
13:26
nice to have, but actually more important
13:28
than ever, given the fact that it's
13:30
such a volatile, rapidly changing environment currently,
13:32
talking about manufacturing. I can
13:34
imagine that's very much an evolving use
13:36
case area as well. I
13:39
have already talked with my colleague
13:41
who is working in telecom.
13:43
He said, this is interesting, I
13:45
want to know more, because telecoms
13:48
also have a lot of sensitive
13:50
private data and they want to
13:52
find patterns as well. And
13:54
after I talked with
13:56
you, I read that with this technology
13:58
training a machine learning model is
14:00
fine, but deep learning has a
14:02
challenge. I want to know: in
14:05
comparison, machine learning is still
14:07
mathematical calculation, so where is the challenge
14:09
for the deep learning model? Now,
14:11
as you can imagine working with
14:13
data under encryption means there's an
14:15
additional computational step or multiple additional
14:17
computational steps to be made, so
14:20
the computational overhead is larger than
14:22
doing things in the clear. We
14:24
have been able to take a
14:26
lot of that challenge away and
14:28
therefore today we can do
14:30
the processing of hundreds of millions
14:32
of records, and we can do machine
14:34
learning. The current frontier that we're
14:37
working on with our researchers is
14:39
on deep learning. It's not impossible.
14:42
Mathematically, it's possible. It's
14:44
just doing it in a very efficient
14:46
way, such that you get
14:48
computational run times that are
14:50
acceptable for practical usage. Okay.
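For a feel of where that overhead shows up in deep learning: additions and multiplications map directly onto share arithmetic, but nonlinear activations such as sigmoid or ReLU do not, so MPC frameworks typically replace them with comparisons or low-degree polynomial approximations, paying extra communication rounds and some accuracy. A hedged sketch of the polynomial-approximation trick in plain NumPy, with no actual encryption involved:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_poly(x, degree=3, lo=-5.0, hi=5.0):
    """Replace sigmoid by a low-degree polynomial that share
    arithmetic could evaluate; coefficients come from a
    least-squares fit and are purely illustrative."""
    xs = np.linspace(lo, hi, 1001)
    coeffs = np.polyfit(xs, sigmoid(xs), degree)
    return np.polyval(coeffs, x)

x = np.linspace(-5, 5, 9)
# The approximation is close but not exact: that accuracy/cost
# trade-off is part of why deep learning under MPC is the frontier.
print(np.max(np.abs(sigmoid(x) - sigmoid_poly(x))))
```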
14:54
Yeah, I was thinking, is it
14:56
computation or the storage? Because storage,
14:58
many times the memory for
15:00
training a deep learning model, can
15:02
also cause problems. I understand that
15:04
you really have more of a challenge if
15:07
you have a large language model.
15:09
My colleague asked an interesting question. He
15:11
said, okay, we can train a
15:13
deep learning model, but how about
15:15
fine-tuning? That would
15:17
be difficult, wouldn't it? That means
15:19
the weights are different. Can
15:22
you translate the weights
15:24
somehow? You know, with a
15:26
model, if you train a model normally,
15:28
then you keep the weights; that
15:30
is where the knowledge is actually stored in
15:32
the deep learning model. Can we
15:35
use this technology for fine-tuning? That means
15:37
you need to keep that information
15:39
in the weights, but as far as
15:41
I understand, to fine-tune you change this deep
15:43
learning model, you need to modify it.
15:45
So my question is: can we do fine-
15:47
tuning with this technology? Yeah, I'm not
15:50
an expert on training deep learning
15:52
models and fine-tuning them. But there's
15:54
in principle no limit to what sort
15:56
of model you can train or
15:58
fine -tune. It's all about doing it
16:00
in an efficient way and being smart
16:03
about it. So I would say
16:05
the answer is yes, but I'm probably
16:07
not the right person to ask
16:09
this detailed question. We had this
16:11
coming, right? We would of course come
16:13
with AI sooner or later, and I was
16:17
sort of happy with the fact: oh, look,
16:17
we have data collaboration today. It's not AI.
16:19
It's not generative AI even. It's not
16:21
agentic. I'm so happy. And then Weiwei comes
16:23
in and, of course, immediately has to
16:25
bring in the AI. But I think it's
16:28
very relevant. And you want to share
16:30
data, of course, because you want to train
16:32
your models. And in the end, of
16:34
course, we influence these models. And it's obvious
16:36
there's a lot of AI there involved
16:38
as well. And next to, I guess, other
16:40
insights and analytics you put on top
16:42
of it. So thanks for spoiling that, Weiwei.
16:44
Very well done there. Listen,
16:46
Ian, in one of our previous
16:49
episodes, we had Alberto Palomo, and
16:51
he is a Chief Strategy Officer,
16:53
just like MC Hammer, by the
16:55
way. He's a Chief Strategy Officer
16:57
at Gaia-X. Of course, you know
16:59
Gaia-X. And he was our guest,
17:01
and we talked about data spaces.
17:03
And I sort of feel that
17:05
there's clearly a connection here in
17:07
terms of the topic of data
17:09
spaces with the ideas that Roseman
17:11
Labs have. You see some alignment
17:13
there, some clear connection between that
17:15
topic? Yes, absolutely. I
17:18
mean, data spaces in the end are all about
17:20
data sharing and collaboration, and we
17:22
focus on the same value,
17:25
but we do it in a different way.
17:27
We very often refer to our solution
17:29
as an encrypted data space, rather than a
17:31
normal data space. And the
17:33
encrypted data space concept is quite
17:35
different, because in a normal data
17:37
space, ultimately what you do is
17:39
you do share data. Meaning that
17:41
you do lose control over the
17:43
data in some sense; you do
17:45
it very specifically, very controlled, but
17:47
you do share data. Whereas
17:50
in an encrypted data space, with
17:52
our solution, you only share
17:54
the insights from the data; you
17:56
never share the raw data
17:58
with your counterparties. And that enables you
18:00
to set up a very
18:02
different type of collaboration, and also
18:05
to bring to the party
18:07
very sensitive data that in a
18:09
normal data space, you would
18:11
never bring to the party. An
18:14
example of that is, let's
18:16
say, for instance, Catena-X.
18:18
Yeah, we know it,
18:20
automotive. There you have one
18:22
use case, which is the demand
18:24
capacity management, matching capacity in the supply
18:26
chain with demand. They do that
18:28
in a fairly sophisticated, but also cumbersome
18:30
way, because you have the principle
18:33
one level up, one level down, meaning
18:35
I cannot know who the suppliers
18:37
of my suppliers are. They
18:39
don't share bills of materials because those
18:41
are considered very sensitive. They
18:43
don't share actual capacity data. They
18:45
only share allocated capacity data. But
18:47
that makes the whole problem very cumbersome,
18:49
but also very understandable because they can't share
18:51
that data because it's too sensitive. Now,
18:54
in an encrypted data space, you
18:56
can share that data and you can
18:58
do the calculation much faster, much
19:00
quicker and much easier and you can
19:02
run the iterations very quickly, because
19:04
you can leverage that data, but you
19:06
don't need to disclose it to
19:08
each other. That makes it more interesting
19:11
to organizations to actually get involved
19:13
in collaborating on data. Absolutely. And
19:15
the current concept of data spaces may
19:19
have kept them from it for now. I
19:19
think it's all the more relevant these days,
19:21
Ian, because we still want to collaborate and
19:23
share data also between countries, for example, regions,
19:25
different parts of the world. And then nowadays,
19:28
of course, the whole notion of sovereignty and
19:30
being able to keep your own data to
19:32
yourself. And on the other hand, still, of
19:34
course, seeing the need to collaborate, I
19:36
think it's more relevant than ever. So even
19:38
geopolitics right now might point to the need
19:40
of such a platform, right? It's just the
19:42
move towards sovereign data. No,
19:44
absolutely. And you see that
19:46
in the security and defense space, where
19:48
different organizations want to work together
19:51
across borders, but they don't want to
19:53
share their data. But you see
19:55
the same thing, for instance, in banking,
19:57
even within one organization, if you
19:59
have large banks that operate internationally, because
20:01
they operate in different legislative areas,
20:03
jurisdictions, I mean, a bank cannot
20:05
share data from Switzerland to the EU
20:07
or from the EU to the US. But
20:09
if you operate on a global level
20:12
and you have clients on a global level
20:14
and you want to detect financial crime
20:16
on a global level, you would need to
20:18
exchange that information or at least be
20:20
able to connect data from one jurisdiction to
20:22
another. That's where our solution comes in. Exactly.
20:27
Ian, so your company is five
20:29
years now, you said. During these
20:31
five years, the regulatory landscape in
20:33
the world has changed a lot.
20:35
You are offering this data privacy
20:37
technology. How do you see
20:39
the adoption of that in the world
20:41
after these regulations changed and came
20:43
into place? And also against
20:45
the competition, because Roseman Labs is not
20:48
big enough to cover the whole world.
20:50
So how is the adoption of this
20:52
kind of technology in general during the
20:54
course of your existence? The
20:56
regulation that is coming into place helps
20:58
us as a company. We can
21:00
offer a solution to meet some
21:02
of those requirements. I mean, GDPR
21:04
has been around for quite a while, but
21:06
that is very obvious. GDPR is
21:08
asking things like control, like data minimization,
21:10
purpose binding. Well, all of those things
21:13
would traditionally be done with a contract
21:15
and you would need to trust each
21:17
other. Now, with our technology, we can
21:19
enforce that in a technical way. So
21:21
even if you don't trust each other,
21:23
but you do trust the software, you
21:26
can set up a collaboration. That is
21:28
how you can use the
21:30
solution to comply with those new
21:32
regulatory requirements. Yeah, but
21:34
this is the part that you can
21:36
cover as Roseman Labs. We need
21:38
this on a worldwide basis, I would
21:40
say. So how do you see
21:42
the whole market changing here? Are
21:44
there many competitors coming up? Are
21:46
there many competing frameworks? How do you
21:48
integrate with that? When you
21:50
think of the world of encrypted computing,
21:52
I would say we as a company
21:54
are really in the forefront. I got
21:57
that feeling that you are actually. We
21:59
absolutely are. I mean,
22:01
if you also look at our
22:03
team, I mean, we have eight
22:05
PhD level cryptographers. Probably with that,
22:07
we are the largest private company
22:10
employing cryptographers in Northwestern Europe.
22:12
So we really are in the
22:14
forefront of it. There are competing
22:16
technologies like fully homomorphic encryption versus
22:18
MPC, which we use. Fully homomorphic
22:20
encryption is not performant at all
22:23
compared to MPC. And so doing
22:25
advanced calculations with FHE
22:27
is still very, very difficult and
22:29
very hardware intensive. And
22:31
then there are other solutions such
22:33
as confidential computing, which is a
22:35
more hardware-based approach, which doesn't
22:37
offer the same security levels, also
22:40
because you simply cannot audit that
22:42
hardware.
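To illustrate the "computing on ciphertexts" idea behind homomorphic encryption with something tiny: textbook RSA happens to be homomorphic for multiplication, so multiplying two ciphertexts yields an encryption of the product. A toy, deliberately insecure sketch (unpadded RSA with tiny primes, supporting only one operation, nowhere near FHE):

```python
# Textbook RSA with toy parameters; NEVER use unpadded RSA in practice.
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

a, b = 7, 12
c = (enc(a) * enc(b)) % n  # multiply the ciphertexts only
assert dec(c) == a * b     # decrypts to the product, 84
```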
22:44
And if you go out in the market with that,
22:46
I buy all that. But what is
22:48
the biggest misconception you actually see
22:51
if you talk to people about this
22:53
privacy enhancing technology? I would say
22:55
the biggest challenge is not so
22:57
much a misconception. The biggest challenge
22:59
is the awareness. Nine out
23:01
of 10 people, or probably even more, have
23:03
never heard of this. That's why I
23:03
say it's completely alien to them. So when you
23:05
explain to them what is possible, their
23:07
eyes open and they say, well, this is really
23:10
what we need. We hadn't heard about
23:14
that before. Just two weeks ago,
23:16
I was speaking to someone who was
23:18
very active in the data spaces area in
23:20
Germany, knew everything about data
23:22
spaces, but had never heard of encrypted
23:25
computing. And the discussion was
23:27
very eye-opening. He said, well, we should bring
23:29
this to some of my clients and contacts,
23:31
because this is really a revolution, if what you're
23:33
telling me is really true. It
23:35
starts with awareness. And then the second
23:37
thing, there are some people that have maybe
23:39
heard of these types of technologies. Then
23:42
the misconception is this is still
23:44
in an innovation state. This is
23:46
still like TRL level 3, 4,
23:48
5. It's academic, et cetera. And
23:50
then I tell them, well,
23:52
The National Cyber Security Center in the
23:54
Netherlands has been in production with our software
23:56
for three years. So this
23:58
is not TRL 3, 4, 5. This
24:01
is mature software that you can deploy
24:03
out of the box and you can
24:05
have it up and running in no
24:07
time. Did you ever
24:09
get a question of ROI on your
24:11
technology? And if so, how would you go
24:13
about calculating that? This is really related
24:15
to the same discussion with AI. You're going
24:17
to be more efficient and blah, blah,
24:19
blah. How do you measure these kinds of
24:21
things? The same holds for privacy. Is
24:23
that a question at all or do people
24:25
have such a need that they don't
24:27
care? It is a very relevant
24:30
question, but it's highly dependent on
24:32
the use case. For instance, if we
24:34
think about law enforcement, you
24:36
think about ROI in a very
24:38
different way than in healthcare. If
24:40
you can come up with better
24:42
treatments, more efficient treatments, then you
24:44
can actually ultimately come up with
24:46
some sort of an ROI. In
24:49
law enforcement, that is maybe much more difficult.
24:52
I mean, there are no revenues
24:54
from catching more criminals. So
24:56
I would say it is very
24:58
relevant because people rightfully so
25:00
ask themselves, how much does it
25:02
cost and how much does
25:04
it give me? But it's highly
25:06
use case dependent. I
25:08
recommend this definitely; think about
25:10
it. Eight PhDs
25:12
completely on cryptography. I
25:15
can only imagine what lunches must be like with
25:17
these guys. That's a nerd fest. It
25:19
is. It is. Absolutely. I'm
25:21
a physicist by training, so I've
25:23
been in commercial roles and as
25:25
a partner with BCG for many
25:27
years, but I do like the
25:29
content and being able to explain
25:31
also to clients and to prospects, how
25:34
can you actually do a calculation under
25:36
encryption? And I learn all
25:38
the time from these guys as well.
25:40
That's for sure. Eight of them. That's
25:42
very impressive. I can imagine, by the
25:44
way, that it all depends on your
25:47
audience, getting back a little bit to
25:49
the question of how you sell this thing.
25:51
I think for everybody from a data
25:53
perspective, and Robert already touched on it
25:55
as well, there's an ROI question, but
25:57
in general, to convince people that this
25:59
is important. I mean, nowadays with AI
26:01
and generative AI and agents, we sort
26:04
of noticed they're almost pulling it out
26:06
of your hands. So much expectation, maybe
26:08
a little bit inflated in terms of
26:10
expectations, but then collaborating on data is
26:12
something you want to convince people. And
26:14
I think a lot of data experts
26:16
and data consultants would recognize it, if
26:19
only within their own organization: how to
26:21
convince people that you should collaborate on
26:23
data. Because sometimes a company itself spans
26:25
different regions in the world and there
26:27
are problems in working together, let alone,
26:29
of course, that you would convince multiple
26:31
organizations to share data. On
26:33
one hand, you might be talking to
26:35
experts about Python and APIs and everything
26:37
that comes with it there, and also
26:39
how encryption is different from other ways.
26:41
You also have your business
26:43
context, I think, and I can imagine it's a
26:45
very different story. Can we learn there?
26:48
Because you had quite some successful implementations
26:50
already. Was it the business people that
26:52
you convinced, or was it the technology
26:54
people, or a combination of
26:56
it? I mean, there's typically
26:58
three sorts of stakeholders. I mean, there's
27:00
the business person with a business problem
27:02
and the business person might be a
27:04
doctor or a researcher, but it could
27:06
also be someone active in anti -money
27:08
laundering in a bank, but they have
27:10
a business problem. Then there's the technical
27:12
people: the CISO, the architects, the
27:15
data scientists, et cetera. And then
27:17
you, of course, very often have the privacy
27:19
people, the data privacy officer,
27:21
the compliance officer. They
27:23
also want to understand it, because ultimately they
27:26
need to approve such a collaboration. So
27:28
it's always these three stakeholders and
27:30
very often from multiple organizations. So
27:32
you can imagine that this is
27:34
not an easy thing to do.
27:36
Exactly. Hence my question. Maybe you
27:38
have some best practices there. We
27:40
do. And ultimately the key person
27:42
is the business person because they
27:44
need to see the value and
27:46
they ultimately need to be the champion
27:48
and need to be, together with
27:50
us in convincing the others. But
27:52
creating the awareness in the broader
27:54
community, let's say the privacy community
27:57
or in the data security community
27:59
helps us a lot because if
28:01
these people are already aware, then
28:03
solving these problems of the business
28:05
people become a lot easier. And
28:07
we see that in organizations where we have
28:09
implemented, say, one solution for one specific problem,
28:11
then it becomes a lot easier to do
28:13
the next one and the next one. A
28:16
nice example of that is a
28:18
large municipality in the Netherlands that
28:20
we started off with and the
28:23
privacy officer was very skeptical initially.
28:25
Basically he said no until he started
28:27
talking to us and we started to
28:29
explain and now he's one of our
28:31
biggest champions. So if we
28:33
would want to do another project, and
28:35
we're talking about multiple other projects
28:37
within that same municipality in different areas,
28:40
it's a lot easier because he's a
28:42
champion. He suddenly became your sponsor. Exactly.
28:45
And that is brilliant. A chief privacy
28:47
officer, whatever you want to call it.
28:49
By default, I would say very critical
28:51
person. Yeah. That probably sees a lot
28:53
of bears on the road, a lot
28:55
of issues that might come with it.
28:57
So if you happen to turn around
28:59
such a person, let's say attitude towards
29:01
business and technology, I think you're getting
29:03
somewhere and then even they become your
29:05
main sponsor. That's a very nice practice.
29:07
If you win over even the most critical
29:09
persons, like chief security officers, if
29:11
you manage to get them convinced,
29:13
then suddenly they become your sponsor,
29:15
and, as you're saying, the next cases are
29:17
much easier to actually create. I
29:20
would say on the privacy side and
29:22
on the security side, it's of course
29:24
very helpful that the National Cyber Security
29:26
Center is one of our clients. And
29:28
they're also really, really active in promoting
29:30
the technology and promoting us, to be
29:32
honest. If I speak
29:34
to a CISO in a large organization
29:36
and I can tell them that, I
29:38
mean, the National Cyber Security Center is
29:40
a client, that they've audited our
29:42
software, and that based on that audit the intelligence
29:44
agency, the AIVD, has certified
29:46
our software. For a CISO in
29:48
a bank, that helps, huh? That
29:50
helps quite a bit. Even
29:52
the CCO of Capgemini, your
29:55
global CCO. Also a critical person, I
29:57
can tell you. Also a critical
29:59
person, but we have proved that we have
30:01
our things well organized. You got
30:03
the stamp. That's absolutely very helpful. You
30:06
know, you could say if it works for them, probably works
30:08
for you as well. You know, it's funny
30:10
how time flies. Time for a different topic
30:12
now, one which I love a lot. We
30:14
are approaching the end, but then again,
30:16
an episode of the
30:19
Data Powered Innovation Jam is
30:21
never complete without us trying to
30:23
debunk a myth. For
30:25
that, we have our myth -busting officer,
30:27
also known as Weiwei Feng. And I'm
30:29
pretty sure, Weiwei, you might have
30:31
some sort of myth you want to
30:33
address. A little bit of
30:36
a myth, or a little bit of thinking.
30:38
If you can't see my data and you just run
30:40
the model, it feels very
30:42
safe. But the safety does not
30:44
only come from the data; the safety
30:47
also comes from the model. People
30:49
can hide a back door, people
30:51
can hide malware in the
30:53
model as well. So that's something
30:55
people using this technology still need
30:57
to be aware of. Absolutely. And
30:59
this is something that we haven't touched
31:01
upon. But of course, if I encrypt
31:04
all my data, you cannot see it,
31:06
but you can run a model on
31:08
it. Now, I cannot just run any
31:10
model on the data because a model
31:12
that just opens all the tables is
31:14
also a model and obviously we wouldn't
31:16
want to do that. Part of our
31:19
software is what we call an approval
31:21
flow, where the owner of the data,
31:23
or in better words the controller of
31:25
the data, needs to approve the model
31:27
that can be run on the data.
31:29
And you can literally inspect the model,
31:32
and the model is always a Python
31:34
script. You can see a sample output
31:36
of the Python script in the description
31:38
of it. So you review that
31:40
and then you approve it, and only
31:42
then can that specific model be run
31:44
on the data, and nothing else. If
31:47
I would change one line in that
31:49
model, you can no longer run that
31:51
model on the data.
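A plausible way to enforce the "change one line and it no longer runs" behavior he describes is to pin each approval to a cryptographic hash of the exact script text. This sketch is an assumption about the mechanism, with hypothetical names, not Roseman Labs' actual API:

```python
import hashlib

approved = set()  # hashes of scripts the data controller has reviewed

def approve(script_text: str) -> None:
    """The controller inspects the Python script, then pins its hash."""
    approved.add(hashlib.sha256(script_text.encode()).hexdigest())

def run_on_platform(script_text: str) -> None:
    digest = hashlib.sha256(script_text.encode()).hexdigest()
    if digest not in approved:
        raise PermissionError("script differs from the approved version")
    print("running approved script on the encrypted data...")

script = "result = mean(df['blood_pressure'])"
approve(script)
run_on_platform(script)  # runs: hash matches the approval
try:
    run_on_platform(script + "  # edited")  # one change, new hash
except PermissionError as err:
    print("rejected:", err)
```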
31:53
So if I would build a backdoor in
31:55
my model, then it's up to the
31:57
approver to spot that backdoor. So that
31:59
does mean that your approver needs to
32:01
be pretty fluent in Python. That's what
32:03
they're selected on, of course. And we
32:05
can also help with that. Maybe a
32:08
side topic is the topic of statistical
32:10
disclosure. If I run the same model
32:12
very often on a slightly different data set,
32:14
I might disclose something about the data. But
32:16
that is a topic that we also
32:18
help our clients on. We also have a
32:20
specialist in statistical disclosure management.
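The statistical-disclosure risk he mentions is easy to see with a differencing attack: two individually harmless aggregates, computed over sets that differ by one record, reveal that record when subtracted. A small illustration with made-up numbers:

```python
salaries = {"alice": 52000, "bob": 61000, "carol": 58000}  # made-up data

def approved_total(names):
    """Each query looks harmless: it only ever returns an aggregate."""
    return sum(salaries[n] for n in names)

with_bob = approved_total(["alice", "bob", "carol"])
without_bob = approved_total(["alice", "carol"])

# The difference of the two aggregates discloses Bob's exact salary.
print(with_bob - without_bob)  # 61000
```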
32:22
Yes. And sometimes they can hide this in a weight,
32:24
in a really deep learning model, because
32:27
some weights are insignificant. So
32:29
that is also something that adds additional complexity
32:31
to the security, because that's why it's
32:33
difficult to detect. You're absolutely right there.
32:35
So that will be an additional challenge
32:37
once we get into the deep learning
32:39
models. And that is maybe a
32:41
topic to pick up later, because there
32:43
is so much more to discuss. And by
32:45
the way, Ian, we're happy also, as our
32:47
company Capgemini, to work together
32:49
with you guys on actual projects, which
32:51
we didn't dive into today. But if people
32:53
want to know more, they know how
32:55
to reach out to you or to us,
32:57
obviously. This sort of concludes our session
32:59
for today. We heard some very interesting things.
33:01
I was particularly struck by this notion
33:03
of eight cryptography PhDs. You could almost make
33:05
a joke out of that: eight cryptography PhDs
33:07
walk into a bar. I think the notion
33:09
of collaboration and need for it in
33:11
a safe and private way is maybe nowadays
33:13
in '25 more necessary and crucial than
33:15
ever. So I think this was a very
33:17
relevant discussion. We touched on use
33:19
cases as well in different sectors. And
33:21
I think that was a surprise to
33:23
us. I must say I was quite
33:25
happy not to have discussed a lot
33:27
about generative AI or agents. The only
33:29
agents really being in the cybersecurity center
33:31
again. That's sort of like police. That's
33:33
different. They are real agents. So that
33:35
was very good. And I must say,
33:37
so interesting. I'm pretty sure if MC
33:39
Hammer still would be a chief strategy
33:41
officer, I think he would probably
33:43
say, you can touch that. Thank you
33:45
very much Ian for being with us
33:47
today. Thank you, it was a pleasure.
34:00
Engels and Ron Tolido. Please let us know if
34:02
you have any comments or ideas for
34:04
the show and of course, if you haven't
34:06
already done so, rate and subscribe to
34:08
our podcast. See you at
34:10
another episode soon and remember, whatever
34:12
you do, always be jamming.