Episode Transcript
0:01
You're listening to Gradient Dissent, a
0:04
show about making machine learning work
0:06
in the real world, and I'm
0:08
your host Lucas Biewald. Christopher
0:10
Ahlberg is an old friend and
0:13
the CEO and founder of
0:15
Recorded Future, an AI cybersecurity
0:17
startup that was founded in
0:19
2009 and sold recently to
0:21
Mastercard for $2.65 billion. Previously
0:23
he was the founder of
0:26
Spotfire, an information visualization
0:28
startup. And we talk about
0:30
AI applied to cyber threats,
0:32
war in Ukraine, and compare
0:35
notes on being two-time founders.
0:37
I hope you enjoy this one.
0:39
All right, so Christopher, welcome
0:41
to Gradient Dissent. Thank you, Lucas.
0:43
Thank you for having me. So you
0:45
are like my favorite kind of guest,
0:48
which is someone who's taking AI and
0:50
applying it to an important domain that
0:52
I don't understand super well. So I
0:54
think for our audience, you're going to
0:56
have to tell us about your company
0:58
Recorded Future and the problems that you
1:01
solve. So maybe we could start with
1:03
that. Yeah, I know for sure, you know, you
1:05
could sort of, I wouldn't use this
1:07
description widely, but in this context, maybe
1:10
you could say they were doing AI
1:12
for intelligence or intelligence for AI, if
1:14
you want. We're an intelligence company, threat
1:16
intelligence, we try to get our hands
1:19
on everything bad that's going on in
1:21
the world, but maybe more importantly in
1:23
the sort of the internet world that
1:26
I like to say that there's sort
1:28
of this opportunity that came along to
1:30
think about the internet as a... an
1:33
incredible intelligence sensor that happened at the
1:35
same time as the world has sort
1:37
of slowly migrated onto the internet and
1:40
as that migration is happening, it's
1:42
sort of going to the point where
1:44
the world is actually becoming a
1:46
reflection of the internet, crazily enough, and
1:49
so in that we've done hard work
1:51
over the last 10-15 years to try
1:53
to organize this data we collected. You can
1:56
imagine all kinds of crazy data we've
1:58
had to use a lot of you know,
2:00
call it machine learning, big data, analytics,
2:03
now AI, pick your favorite words, to
2:05
sort of make sense of a lot
2:07
of data, a lot of that stuff
2:10
got started in natural language processing,
2:12
these sort of things, but then all
2:14
kinds of other data. And in the
2:17
end game, it's all about producing great
2:19
insights. These could be insights about geopolitics,
2:21
they could be insights about cyber threats,
2:24
that's sort of where our main stuff
2:26
is, it could be insights that support
2:28
a warfighter, it could be insights of
2:31
all kinds, but all in this sort
2:33
of... bad news domain and the threat
2:35
Intel domain and we built a good
2:38
business there. Sometimes I summarize it by
2:40
saying it's sort of the Bloomberg for
2:42
intelligence. And so I'm sure that, like
2:45
a lot of our guests, maybe the
2:47
most interesting things that you've done, you
2:49
can't say. But is there any particular
2:52
insight that you pulled out that you're
2:54
proud of that you can talk about?
2:56
Yeah, you know, it was through the
2:58
years, there's been wild things, you know.
3:01
going back to 2016 when the
3:03
election stuff sort of was the first
3:05
time around sort of thing we found
3:08
some Russian dude who was selling access
3:10
to the Election Assistance Commission or
3:12
yeah, whatever it's called. You know, incredible
3:15
stuff that was probably the first sort
3:17
of that moment where you're just like
3:19
wow yeah what's going on how did
3:22
you talk about how did you figure
3:24
that out that's amazing like what what
3:26
how does that work in that case
3:29
it's this sort of wonderful dark web
3:31
world where that back then was in
3:33
these forums that still exist. A lot
3:36
of that has now moved on to
3:38
Telegram, if you want. That's sort of
3:40
where a lot of the bad news
3:43
of the world now happens. But some
3:45
guy there had broken into this thing
3:47
called the Election Assistance Commission, EAC, and he
3:50
had hacked it, in good ways, using
3:52
a SQL injection, extracted a bunch of
3:54
information.
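As an aside for readers who don't know the technique he just named: a SQL injection abuses string-built queries. Below is a minimal, generic sketch in Python of this class of bug and the standard parameterized-query fix; it is purely illustrative, with a hypothetical table and values, and is not the actual EAC exploit.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (username TEXT, password TEXT)")
    conn.execute("INSERT INTO users VALUES ('admin', 's3cret')")

    # Vulnerable pattern: user input concatenated straight into the SQL string.
    # Supplying  ' OR '1'='1  makes the WHERE clause always true, so the
    # query returns every row instead of only a matching user.
    user_input = "' OR '1'='1"
    vulnerable = f"SELECT * FROM users WHERE username = '{user_input}'"
    print(conn.execute(vulnerable).fetchall())  # leaks the whole table

    # Fix: a parameterized query treats the input as data, never as SQL.
    safe = "SELECT * FROM users WHERE username = ?"
    print(conn.execute(safe, (user_input,)).fetchall())  # returns []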
3:57
But maybe more importantly, he went back out and said, I'm selling access
3:59
to this. Who wants to get the
4:01
access? And we found it before somebody
4:04
else, and actually bought the access, crazy
4:06
enough. Really? Yeah. Wow. And took custody
4:08
of it and then went back to
4:10
the government and said, you may want
4:13
to make this thing go away. And
4:15
yeah, it was a little bit of
4:17
a wild thing. I remember when this
4:20
guy, a fantastic colleague, Andre, calls
4:22
me in the middle of the night
4:24
and just says, you know, something not
4:27
so great is going on. What should
4:29
we do? And a long time ago,
4:31
but good story. Okay, you're like instantly
4:34
pulling me off my script, which I've
4:36
been trying to stay a little more
4:38
on script. But I can't help myself,
4:41
like, what is the dark web?
4:43
Like, is this forums
4:45
that people log into? Is this
4:48
on Tor? What actually is that?
4:50
It is, you know, like the...
4:52
And it's a portion, just to be clear,
4:55
and it's changed recently, we
4:57
can come back to this and talk
4:59
about Telegram, but no, this stuff is
5:02
still there. There are these forums
5:04
and it turns out if you're in
5:06
the business of buying and selling information,
5:09
you need a place to buy and
5:11
sell it. You can't just sit on
5:13
it, then you're not going to make
5:16
much money. And these guys, the
5:18
only interest really is in making money,
5:20
different from spies and government spies. But
5:22
these guys, they want to make money,
5:25
so they need places to buy and
5:27
sell. And some of them still exist
5:29
20 years later, 25 years later, crazy
5:32
enough. So they are, most of them
5:34
are on Tor, some of them are
5:36
not, and they're sort of essentially marketplaces.
5:39
and we keep a close eye on
5:41
them if we put it that way.
5:43
So today, right now, somebody as of
5:46
an hour ago or a couple hours
5:48
ago was selling access to a particular,
5:50
I'll just call it big software company
5:53
servers, and it's creating crazy havoc right
5:55
now, right this moment. And we got
5:57
our hands on all this data. We're
6:00
trying to upload it into our system,
6:02
generating alerts to the clients, and this stuff
6:04
is never ending. So yeah. So like
6:07
so literally right now someone's selling access
6:09
to a company's servers. Many of them
6:11
yeah. No, it's software that
6:14
is installed because something is going on
6:16
in real time I'll leave it sort
6:18
of a little bit fuzzy but it's
6:21
a big prominent software company and they're
6:23
selling access to this yeah. And who
6:25
would typically buy it like do you
6:28
pretend to be like someone that wants
6:30
to buy it to learn, but
6:32
like, who even would want that access?
6:34
It can be, you know, so typically
6:37
these guys will sell access. That's one
6:39
thing. And, you know, if you think
6:41
about the guys who want to buy
6:44
access, there could obviously be somebody who
6:46
wants to get after something very special.
6:48
But what happens in this world is
6:51
that it's very specialized. You have people
6:53
who will... Then, for example, sell ransomware
6:55
software, or rent access to ransomware software.
6:58
You have people who help you get
7:00
the ransom, execute the ransomware. You have
7:02
people who then... handle the actual Bitcoin
7:05
flows all the way to money mules, a
7:07
highly specialized world of criminals there because
7:09
they are criminals. A large portion of
7:12
them are in Russia, not all of
7:14
them. And in this case, it's probably
7:16
somebody who has realized that he can
7:19
sell access to all these different places
7:21
that opens up all kinds of opportunities
7:23
for ends somewhere. I'm sort of guessing
7:26
in this case since it's happening real
7:28
time, but it's a pretty interesting world.
7:30
It's an interesting world, I guess, but
7:33
I'm totally not aware of this. Who would
7:35
want to buy this data? Is that other
7:37
criminals, or is that like...
7:40
Other criminals. Probably the ransomware actors. So
7:42
there is a whole slew of these
7:44
ransomware actors. So there's guys who write
7:46
and operate ransomware software. They then franchise
7:49
out the use of this ransomware software
7:51
to other actors and they'll use it
7:53
and execute it, in return for holding on
7:56
to say 80% of whatever money they
7:58
get on the ransomware action, the operator
8:00
gets the other 20%, and
8:03
it becomes one of the best business
8:05
models ever operated on planet Earth. And
8:07
presumably if someone's selling something illegal, they
8:10
want to make sure they're selling it
8:12
to a criminal and not you. And
8:14
presumably they're worried about selling it to
8:17
you, so how do you convince them
8:19
that you're not going to turn them
8:21
in with something like that? So, you
8:24
know, that's our job. Okay,
8:27
love it. All right. Is there any
8:30
AI involved in that or is that,
8:32
that seems like a totally different. There's
8:34
AI in, because it turns out that
8:37
these, it's, I wouldn't call it big
8:39
data, big forums, big, but they're tricky.
8:41
You had, they might be locked down
8:43
in all kinds of different ways. So
8:46
you need a lot of trade craft
8:48
to get in and it's a combination
8:50
of human social engineering and smart scraping
8:52
if you want. there could be six
8:55
layers of access to some of these
8:57
places. So that's pretty complicated sort of
8:59
thing. So it's a combination. I wouldn't
9:01
necessarily call it AI. But then at
9:04
the same time, these scrapers need to
9:06
operate in human-like behaviors. So I like
9:08
to joke that they sort of might
9:10
actually pass a very limited Turing test,
9:13
because they need to sort of operate
9:15
as if I'm a cyber criminal. I
9:17
don't know. Is that a Turing test?
9:20
I guess it's a one kind of
9:22
Turing test, by whose definition you go
9:24
with. But yeah, I guess
9:26
it is. Yeah. Yeah, I guess there's
9:29
a human looking at it. It's for
9:31
sure. It's a Turing test. Yeah. No, no.
9:33
So artificial criminal behavior. So I guess
9:35
that's, you know, that's a Turing
9:38
test of just a special kind. Okay,
9:40
well, can I get us back on
9:42
script? Though I'm sure we're going to
9:44
come back to this. Yeah, one
9:47
of the things that we have in
9:49
common is that, you know, we're both
9:51
repeat entrepreneurs and you actually, you know,
9:53
you founded Spotfire back in, I
9:56
think, 1996 and sold it 11 years
9:58
later. And then you founded Recorded
10:00
Future a little bit after that. And
10:03
I was wondering if you could go
10:05
back in time to when you started
10:07
Recorded Future and what you were thinking
10:09
at that moment and what the insight
10:12
was that led you to start Recorded
10:14
Future. Yeah, that's a great question. So
10:16
Spotfire was data visualization. We started that,
10:18
sort of that, I'm aging myself here,
10:21
but in the early 90s, working for
10:23
my PhD on how to visualize large
10:25
data sets, and kind of a predecessor
10:27
to Palantir and a whole bunch of
10:30
other sort of things, not to make
10:32
any claims that we were the
10:34
first by any means, because we're not,
10:36
but we did a nice job with
10:39
certain kinds of data and so on.
10:41
So then we sold that to a
10:43
company in Palo Alto, TIBCO, worked
10:46
out great, and we were very excited
10:48
about that. But then sort of the
10:50
idea struck me that it was sort of
10:52
all about visualizing what was in an
10:55
Oracle database or an Excel spreadsheet or
10:57
what have you. That sort of data.
10:59
And so we were, it sort of
11:01
struck me, I was on the treadmill
11:04
running and literally while we had signed
11:06
the deal to sell Spotfire, but we
11:08
hadn't closed yet. So it was like
11:10
in this weird in-between timing. And it
11:13
struck me that, oh, I wonder, if we,
11:15
instead of thinking about analyzing what's in
11:17
an Excel spreadsheet, what if I could
11:19
hook up an analytical engine straight to
11:22
the internet? So, you know, use that
11:24
sort of approach to things. So, and
11:26
now the internet, at the surface
11:29
level at least, is mostly human-produced text.
11:31
So now you had to deal with,
11:33
look for entities and events and
11:35
these sort of things out of text
11:38
and try to make sense out of
11:40
that and organize it in a way
11:42
that you could do analysis. And yeah,
11:44
that was sort of the inspiration. That's
11:47
how we got into it. Interesting,
11:49
and the name Recorded Future is super
11:52
evocative. Did that come to you from
11:54
the start? No, maybe not immediately. It
11:56
was one of my co-founders here, Eric.
11:58
very clever guy. And the idea from
12:01
the beginning sort of was this idea
12:03
that we would find these future time
12:05
points in text, you know, like Xi
12:08
Jinping is traveling to Moscow on
12:10
Friday. And if you could actually keep
12:12
tabs of everything that's known in the
12:15
world from all the sources, we should
12:17
get a good picture of where Xi
12:19
Jinping is, to use an extreme
12:21
example. And from the beginning, Eric suggested
12:24
we should call it, I guess, Recorded
12:26
Time, from one of
12:28
Shakespeare's, Macbeth, to the last syllable of
12:31
recorded time, was sort of that thing.
12:33
Then of course, the domain name, recorded
12:35
time, was taken, so it became Recorded Future,
12:37
which was probably a better name to
12:40
begin with. But so, yeah. And it
12:42
fits well with the idea that, you
12:44
know, most secrets are known. It's just
12:47
got to get to where they are
12:49
Let's get to all
12:51
of them. It's interesting, you know, it
12:54
seems like a little bit of an
12:56
entrepreneurial anti-pattern here of, like, it's not,
12:58
it doesn't sound like you're really starting
13:00
with like a specific customer pain point
13:03
and working backwards. It sounds like you kind
13:05
of started with your sort of
13:07
interest. Like, was it obvious that intelligence
13:10
was going to be like a big
13:12
customer? Because it does seem like,
13:14
as broadly as you define it,
13:17
like almost any organization would benefit from
13:19
this kind of analysis. Totally, totally. And
13:21
that's probably my weakness, but maybe it's
13:23
also a decent strength. But you know,
13:26
so Spotfire started as, we could
13:28
visualize any data set. In that case,
13:30
we stumbled on to visualizing and analyzing
13:33
pharmaceutical discovery data, very super high-end super
13:35
valuable application, and it worked out great.
13:37
Then in that, we actually ended up
13:39
doing a lot of counterterrorism work in
13:42
the sort of CT space in intelligence.
13:44
I've always loved that world and we've
13:46
done a lot of good work in
13:49
that. So when we started Recorded
13:51
Future, to your point, we knew
13:53
that there was an application in this
13:56
sort of strategic foresight or whatever. Turns
13:58
out that we ended up in the
14:00
cyber world with it, but no, it's
14:02
certainly more of... inventing a big ass
14:05
hammer and trying to figure out where
14:07
you could apply it rather than the
14:09
other way around but you know isn't
14:12
that how a lot of good stuff
14:14
comes up? I don't know what
14:16
you would say, I think a lot
14:19
of good ideas come that way. Well,
14:21
look, I mean, you've been incredibly successful.
14:23
So I think anything, you know, people's
14:25
strengths and weaknesses are often connected. I'm
14:28
just curious about your process. Like, you
14:30
know, when you started this company, did
14:32
you have like a big list of
14:35
possible use cases and you kind of
14:37
ran down the list talking to people?
14:39
Or how did you, how long did
14:42
it take you to kind of get
14:44
to these like specific use cases and
14:46
how did you approach that? We also
14:48
thought about commercial intelligence and sales intelligence
14:51
and lots of other sort of things,
14:53
you know, for sure. And there's people
14:55
who've built similar type companies in lots
14:58
of different spaces. We had a long
15:00
list. I think I still have those,
15:02
whatever you want to call them, presentations,
15:04
you know, you can imagine the first
15:07
venture presentations. Yeah, yeah. We quickly went
15:09
to In-Q-Tel, and Google Ventures barely existed
15:11
at the time, but ended up taking
15:14
money from In-Q-Tel and Google Ventures back
15:16
in 2008-9 or something like that, and
15:18
sort of honed in on this Intel
15:21
thing pretty early, but could have gone
15:23
many other places too, for sure. So
15:25
one question that I always get asked,
15:27
that I feel like I never have
15:30
a good answer to, but I think
15:32
I'm going to turn it around to you
15:34
and ask it to you, which is: you've
15:37
now done these two companies and you've
15:39
done both for quite a long period
15:41
of time. What did you do differently
15:44
in your second company? Like what did
15:46
you take away as a second time
15:48
entrepreneur? Oh. That's the sort of Peter
15:50
Thiel question that sounds like you know
15:53
like that. I only bring that one
15:55
up because literally every podcast I do
15:57
I always get asked that question and
16:00
I never know what to say so
16:02
I think maybe you'll have a better
16:04
response than me. The sort of the,
16:06
if I started with the bad side
16:09
you know so with the first company,
16:11
it took us a little time and
16:13
we stumbled on, we started very generally
16:16
and then we focused on pharmaceutical discovery
16:18
and killed it in that domain and
16:20
then worked from there. That domain ended
16:23
up being fairly limited and probably limited
16:25
the outcome of the company at some
16:27
level. We sold that for $195 million.
16:29
Nothing to sneeze at, but you know.
16:32
Congratulations, man. Oh, no, but these days,
16:34
people are, yeah, whatever. People are like,
16:36
it's... To put it in terms like
16:39
now, that would be like a $5
16:41
billion exit in 2025, just for the
16:43
younger listeners. Man, man. Yeah, so then
16:46
here we, I think we, like the
16:48
failure point, we thought we could do
16:50
multiple application areas. We for multiple years
16:52
ran an intelligence track and a quant
16:55
trading track to create quant trading signals.
16:57
And we had a
16:59
Series C investor come aboard, and he, you
17:02
know, you shouldn't listen too much to
17:04
your board. But this one guy,
17:06
he just said, kill that. And I'm
17:09
like, you're actually right, we should
17:11
just kill it. And we killed it,
17:13
and that was great, it sort of
17:15
freed us and we could just
17:18
run. And it had sort of
17:20
already started withering a little bit.
17:22
So that was certainly where we
17:25
got, what do you call it, when
17:27
you think too big, or you know,
17:29
like we thought too highly of ourselves.
17:31
In terms of what we did differently,
17:34
that was better. We probably picked a
17:36
bigger problem. That was great. This cyber
17:38
intelligence turned out that we ended up
17:41
sort of riding a massive wave. This
17:43
investment in cybersecurity that has sort
17:45
of gone through the last 10, 15
17:48
years has been incredible. And even though
17:50
we certainly just serve a subset of it,
17:52
you know, by riding a big wave,
17:54
you can make a lot of mistakes.
17:57
You know, you've seen that in AI
17:59
also, like the wave is big enough.
18:01
It allows for a lot of mistakes.
18:04
And maybe also... No, that would probably
18:06
be the answer to that. I'm sure
18:08
there's more, but. No, actually, great segue
18:11
to my next question, right, which is
18:13
that, you know, you've been in this
18:15
massive wave of, not just, just like
18:17
cyber security, but the application of ML
18:20
and AI throughout your kind of 15-year
18:22
arc at Recorded Future. I'm really curious
18:24
how the availability of data and the
18:27
new applications of ML and AI changed
18:29
Recorded Future's business over the time that
18:31
you operated it. It was good. So
18:33
there is sort of two aspects. One
18:36
is the data and then there's like
18:38
what you now can do with data
18:40
here. So we started off with, you
18:43
know, the stuff that we called AI
18:45
in, oh, '08 or '09, '10
18:47
sort of thing. We wrote our first
18:50
entity extractors and event extractors with like
18:52
lots of if-then-else, just stacks
18:54
of if-then-else statements, sort of thing.
18:56
We also wrote a lot of good
18:59
test cases to test whether it was
19:01
right or wrong, like this text
19:03
leads to this, that sort of thing.
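As a rough illustration of that first-generation approach, here is a minimal sketch in Python of a rule-based, if-then-else style entity extractor with a "this text leads to this" test case; the patterns and labels are hypothetical, not Recorded Future's actual rules.

    import re

    def extract_entities(text):
        """Toy rule-based extractor: a stack of if/elif pattern checks."""
        entities = []
        for token in re.findall(r"[A-Za-z0-9.\-]+", text):
            if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", token):
                entities.append((token, "IP_ADDRESS"))
            elif token.lower().endswith((".com", ".net", ".org")):
                entities.append((token, "DOMAIN"))
            elif token.istitle():
                entities.append((token, "PROPER_NOUN"))
        return entities

    # The kind of "this text leads to this" test case described above.
    found = extract_entities("Beacon seen at 10.0.0.1 via evil.com")
    assert ("10.0.0.1", "IP_ADDRESS") in found
    assert ("evil.com", "DOMAIN") in found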
19:06
Though obviously ridiculous from a point of
19:08
view of like what people are doing
19:10
now and we've probably gone through three
19:13
generations even for those entity extractors and
19:15
we used some commercial stuff; now we
19:17
sort of own all that internally and
19:19
it's been a great journey with that
19:22
and now that's all based on these
19:24
sort of models as you could expect.
19:26
The availability of data for sure has
19:29
sort of helped through that. We used
19:31
to have entity extractors for each language.
19:33
Now that is one coherent model
19:35
that sort of spans, I don't know,
19:38
15, 30 languages, and it's amazing how it
19:40
cross learns between languages that are not
19:42
just within Indo-European languages, but
19:45
across languages. It's sort of bananas
19:47
that this stuff even works. It's mimicking
19:49
human brains somewhere. Somewhere there is something
19:52
wild going on in that. And so
19:54
that's sort of super interesting. The other
19:56
big piece, probably, sort of could spend
19:58
a lot of time on this, but
20:01
is how that was one thing to
20:03
do feature extraction if you want, or
20:05
like, yeah, feature extraction, data extraction, out
20:08
of content. Now with all this generative
20:10
stuff, you could obviously sleep. So in
20:12
intelligence, there are sort of two aspects
20:15
to it. You think about if you
20:17
run a fancy intelligence agency, you've got the
20:19
collectors, the James Bonds running around in
20:21
the world, and you've got the analysts
20:24
who sort of putting stuff together. The
20:26
collection part is sort of what I
20:28
first described. The second part is writing,
20:31
you know, everything from the simple,
20:33
summarize what happened in Somalia last week,
20:35
to what are the second order implications
20:38
of what happened in Somalia last week,
20:40
you know, like so from basic questions
20:42
to more advanced, or summarize the second
20:44
order implications of what happened in Somalia
20:47
and get a report written in Arabic
20:49
that I can share with my partner
20:51
in Egypt about that. Those are like
20:54
pretty juicy, juicy, heavy questions. And by
20:56
the way, do that every week for
20:58
me and deliver it at 8 a.m.
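As a toy sketch of the recurring generative task he describes, assuming an OpenAI-style chat API and a hypothetical collect_somalia_reports() source of documents; this is illustrative, not Recorded Future's pipeline.

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def weekly_report(source_docs: list[str]) -> str:
        """Summarize second-order implications and render the result in Arabic."""
        prompt = (
            "Summarize the second-order implications of what happened in "
            "Somalia last week, based on the reporting below, and write the "
            "summary in Arabic for a partner agency in Egypt.\n\n"
            + "\n---\n".join(source_docs)
        )
        resp = client.chat.completions.create(
            model="gpt-4o",  # hypothetical model choice
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content

    # In a real system this would be scheduled, e.g. by cron, for 8 a.m. weekly:
    # report = weekly_report(collect_somalia_reports())  # hypothetical source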
21:00
You know, that would take a fair
21:03
amount of work, all stacked up there.
21:05
Now, and we do that. And it's
21:07
just, it just happens. It's sort of
21:10
mind-boggling, isn't it? So, if somebody
21:12
would have told me that five years
21:14
ago, I would not have believed it.
21:17
Maybe, maybe three years ago, but it's
21:19
it's pretty mind-boggling that this works, and
21:21
it works with a mix of text
21:23
and images, and actually we collect a
21:26
lot of other weird data, sort of
21:28
more netflow data, malware data, very
21:30
technical data, that we spend a lot
21:33
of time on how to wrap that
21:35
in human language so the LLMs can
21:37
handle that sort of data too. And
21:40
even there it works.
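A minimal sketch of what "wrapping technical data in human language" could look like for a single netflow record, so an LLM can reason over it; the field names here are hypothetical, not Recorded Future's schema.

    def netflow_to_sentence(rec: dict) -> str:
        """Render one netflow record as prose an LLM can consume."""
        return (
            f"Between {rec['start']} and {rec['end']}, host {rec['src_ip']} "
            f"sent {rec['bytes']} bytes to {rec['dst_ip']} on port "
            f"{rec['dst_port']} over {rec['proto']}."
        )

    rec = {"start": "02:14", "end": "02:31", "src_ip": "10.1.2.3",
           "dst_ip": "203.0.113.7", "dst_port": 443, "proto": "TCP",
           "bytes": 48_000_000}
    print(netflow_to_sentence(rec))
    # -> Between 02:14 and 02:31, host 10.1.2.3 sent 48000000 bytes to
    #    203.0.113.7 on port 443 over TCP.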
21:42
So, you know, I'm falling off the chair
21:44
here and, you know, like, maybe that's
21:46
because I'm dumb, too dumb to understand
21:49
it, but it's pretty mind-boggling. I totally
21:51
agree for what it's worth. Actually, I
21:53
wanted to ask you. When ChatGPT came
21:56
out, I think it was two or
21:58
three years ago, was that like a
22:00
the same like watershed moment in the
22:02
Intel community that it was here at
22:05
Silicon Valley or did it take longer
22:07
for people to realize the implications? I
22:09
think, you know, so first even inside
22:12
recorded future, it took me just like,
22:14
I remember I was sort of debating
22:16
with, you know, my co-founder Staffan Truvé,
22:19
sort of like, you know, are we...
22:21
coming to sort of yet another
22:23
AI winter? Or is it about to
22:25
take off? And I'm like, ah, it
22:28
feels like it's done. Those were Christopher's
22:30
comments and stuff. And he was like,
22:32
it's going to take off, buddy. And
22:35
I'm like, he was, of course, very
22:37
right. And I was very wrong. So
22:39
to begin with. But then I think
22:42
there were people in DC who ran
22:44
with this very cool, in a very
22:46
cool ways. I don't want to talk
22:48
about them, put names to things and
22:51
so on. But there are areas in
22:53
there where people have been extremely forward
22:55
leaning and built incredible stuff. And we've
22:58
had the good chance to spend time
23:00
with those people if they listen now,
23:02
they'll know who I'm talking about. And
23:05
where we do comparison of notes on
23:07
what we're doing, what they're doing, and
23:09
you know, sometimes their benefit is if
23:11
they have access to very special data,
23:14
we have access to the internet in
23:16
a way that we think. maybe other
23:18
people don't and we try to collaborate
23:21
on some of that but no I
23:23
would say in general the government is
23:25
never the best at seizing advantage of
23:27
new tech. Well, not never, that's not
23:30
true because sometimes they put crazy satellites
23:32
in the sky and stuff but in
23:34
general not always the best adopter of
23:37
tech but in this case some parts
23:39
of the government were pretty, pretty amazing.
23:41
So, okay, another big moment that I
23:44
think happened around the same time as
23:46
GPT was the invasion of Ukraine. And
23:48
I know that for Recorded Future, this was
23:50
an important moment. Could
23:53
you kind of take me back
23:55
to that moment inside of Recorded Future
23:57
and what you were doing and how
24:00
you responded? Yeah. So I would say
24:02
a couple of different things. We have,
24:04
they were a customer from before in some
24:07
areas, and you know, we have
24:09
probably 47 different countries around the world
24:11
that use us in in some sort
24:13
of national capability and all sort of
24:16
in the West plus-plus, or the
24:18
extended West, it's sort of a weird
24:20
way of describing the world, especially for
24:23
those who live in the East, but
24:25
you know, you've sort of got
24:27
to explain it in one way or
24:29
the other. So in that world 47
24:32
countries and Ukraine was one of them.
24:34
But so when the invasion happened, we
24:36
did not approve of that. So we
24:39
said, let's help out. And we provided
24:41
our technology and it's been deployed nicely
24:43
in a number of places over there.
24:46
It has been very good for them,
24:48
I think. You can find a lot
24:50
of good. I'll leave it to you,
24:52
to find the... the good proof points
24:55
that they've been willing to talk about,
24:57
very, very specific ones. Well, wait,
24:59
why don't you tell us about the
25:02
proof points? I would love to hear
25:04
about them. It's a lot to brag
25:06
on the scene. You have to be
25:09
careful with this and that. But there
25:11
are some very specific examples where we've
25:13
been able to. So because part of
25:15
this is that when something is like
25:18
sort of, think about when a spy
25:20
tries to. Again, we talked about the
25:22
criminals who can be pretty brutal we're
25:25
not brutal of course they're brutal but
25:27
pretty non-subtle. They're like, they just want
25:29
to go make money. In many cases
25:32
they don't really care about whether they
25:34
burn something whether they create havoc as
25:36
long as they make money and and
25:38
you know they don't necessarily think the
25:41
the longest term. A spy, it
25:43
doesn't matter if it's a Russian
25:45
spy or Chinese spy or frankly our
25:48
own spies if they want to go
25:50
get access to information, they are willing
25:52
to take, you know, if you have,
25:54
if the policy is that we should
25:57
know what's in Putin's diary or in
25:59
Xi Jinping's travel records or the other
26:01
way around, one is willing to spend
26:04
any number of years to get to
26:06
that information, and you have to be
26:08
very subtle about it. So if you
26:11
assume that Russian spies or Chinese spies
26:13
or anybody who's trying to come at
26:15
Ukraine are trying to do that, they'll
26:17
be very careful, very subtle, and so
26:20
on. So what it turns out then,
26:22
or for that sake, when somebody wants
26:24
to come and destroy something in the
26:27
world of computers, When they do that,
26:29
they sort of become a gigantic honeypot
26:31
if you think about it. It's like
26:34
just everything shows up there. So when
26:36
we deploy the recorder future across a
26:38
whole bunch of places there, we ended
26:40
up also learning a ton. So you
26:43
sort of get in the way of
26:45
the absolutely most modern malware of various
26:47
sorts and all the other stuff that
26:50
comes around that. And so... any number
26:52
of times, we've been able to sort
26:54
of detect incredible things and then be
26:56
able to have that data flow into
26:59
all our other customers in a way
27:01
so that they can be defended against
27:03
the same. So it ends up being
27:06
sort of a, I don't know, probably
27:08
not a popular analogy, but a virtual Iron
27:10
Dome type of thing that this turns
27:13
into, and so pretty incredible sort of
27:15
outcome out of that, and yeah, continues
27:17
well to this day. Interesting,
27:19
one question that comes to mind for
27:22
me there is, you know, is it
27:24
ever tricky to decide what governments to
27:26
work with and not work with? I
27:28
mean, running weights and biases, I've been,
27:30
you know, kind of surprised by how
27:33
many different, you know, customers we have
27:35
where, you know, employees might object and
27:37
then, you know, I think, like, it's
27:39
really not like... my expertise to figure
27:41
out like who the good guys and
27:44
the bad guys are but you know
27:46
we're not exactly at the same level
27:48
that that you are I mean it
27:50
kind of is your job maybe to
27:52
figure out who the good guys and
27:55
bad guys are and it's actually like
27:57
probably pretty complicated like what do you
27:59
really start digging in like how do
28:01
you approach that like if a new
28:03
government wanted to work with you do
28:06
you have some kind of like flowchart
28:08
that decides if they're on our team
28:10
or not? Yeah. And how does that
28:12
work? You know, as you can imagine,
28:15
it's not something we necessarily publish, but
28:17
I just read some general principles. But
28:19
you're right, first of all, it is
28:21
our job to be able to make
28:23
those sort of judgments. And you have
28:26
to make choices and no such choices
28:28
will be perfect. And it's not like
28:30
we can claim to have the most,
28:32
we're... software dudes, we're not necessarily, or
28:34
software people, we're not necessarily moral philosophers,
28:37
just to not overstate our
28:39
own abilities. But, you know, we chose
28:41
to not have customers in China and
28:43
Russia and Iran and North Korea, those,
28:45
that was sort of an easy choice.
28:48
There are countries that US law
28:50
and European law, the same countries
28:52
to some degree, but also places like
28:54
Cuba and Sudan, and you know, like
28:56
there's six, seven countries, Syria, where
28:59
you actually go straight to prison if
29:01
you sell them stuff so that makes
29:03
it easy that's easy those are the
29:05
easy ones yeah then you have maybe
29:07
other places that are more tricky and
29:10
I am not going to start naming,
29:12
but we made our choices where there
29:14
is a number of countries we just
29:16
stay away from and then as well
29:18
as companies there is a number of
29:21
countries sorry companies that for example you
29:23
know engage in illicit hacking behavior that
29:25
we stay away from so there we
29:27
call it an audit list, call
29:30
it whatever, we don't publish that. But
29:32
then there are the companies
29:34
that help out some of these things
29:36
and so on. So there's a, we
29:38
have a process, maybe, and
29:41
a flowchart is not a bad
29:43
way of thinking about it. And then
29:45
we have sort of a committee at
29:47
the very core of this when there's
29:49
choices to be made. And sometimes you
29:52
actually have to make choices and we
29:54
make those choices. So I make no
29:56
claims about that being easy, but it's
29:58
never been that hard either. It's sort
30:00
of. Sometimes it would be hard to
30:03
write down. But we have sort of
30:05
a committee and in the end game,
30:07
that committee is who decides and I
30:09
think we've been able to do that
30:11
in a way that we can go
30:14
to bed nicely and so on. Now
30:16
our stuff is for cyber defense also.
30:18
For anything else, it's pretty worthless. It's not like you're
30:20
going to use our software to
30:22
decide where you're going to drop a
30:25
bomb or to do, you cannot even,
30:27
you cannot physically use us to intrude
30:29
into a phone or do any of
30:31
the malicious stuff; it's cyber defense. So even
30:33
if a bad guy got their hands
30:36
on it, it wouldn't be great, but
30:38
it's not like it's going to like
30:40
create complete havoc in the world. So
30:42
in the end game, it's hosted software.
30:45
If the wrong guy got their hands
30:47
on it, we can cut it any
30:49
time. So, you know, there's a number
30:51
of controls to it.
30:53
So it's good.
30:56
I guess one of the things that's
30:58
important to me is that, you know,
31:00
our customers know that we're not going
31:02
to cut them off if they get
31:04
sort of politically unpopular, which is why,
31:07
you know, I want to be like
31:09
super clear about our criteria. Do you
31:11
worry about that at all for Recorded
31:13
Future? Just to be kind of clear what
31:15
you guys are willing to do and
31:18
not. So first of all, our stuff
31:20
is fairly expensive, so we only have
31:22
2,000 customers. You know, whereas somebody else,
31:24
you might have many, many more. So
31:26
that then, so and we know who
31:29
they are. Every customer of
31:31
these 2,000 goes through sort of a
31:33
KYC process, know your customer, sort of
31:35
thing. So it's, if somebody shows up
31:37
and they're called a company name that
31:40
we have no idea of, they have
31:42
no website, nothing.
31:44
They're just not going to spend $100,000.
31:46
Our stuff ranges from that up
31:49
to millions and millions, you know, that
31:51
itself selects a lot of different things.
31:53
It's like, whereas if you have a
31:55
little dev shop with four guys who
31:57
might use some AI tools, I
32:00
think it might be more difficult for
32:02
you to make those sort of choices.
32:04
So I don't know, tell me if
32:06
I'm wrong, but I think we sort
32:08
of, there's a whole bunch of variables.
32:11
that have made it so that this
32:13
has been less of an issue, actually.
32:15
That's great. Actually, shifting gears, I think,
32:17
you know, we found a saying of
32:19
yours that I think you repeated earlier
32:22
here that I want to ask you
32:24
about, which is saying, you know, I
32:26
think it's like in the last 25
32:28
years, the internet reflected the world and
32:30
the next 25, the world will reflect
32:33
the internet. Can you expand on what
32:35
you mean by that? The trivial one
32:37
is sort of like everything that's happened
32:39
in the physical world sort of whether
32:41
it's everything from sort of transactions to
32:44
interactions to like how we do business
32:46
how we sort of the simple stuff
32:48
is sort of moved on to the
32:50
internet, and that the internet
32:52
then through that becomes a reflection of
32:55
what's going on in the world and
32:57
I think we're all very happy to
32:59
see that and there's not a lot
33:01
of drama that comes out of that
33:04
then But then sort of saying that
33:06
as soon as those systems are here, it's
33:08
sort of internet-first, that's sort
33:10
of what's starting to happen, and whether
33:12
that's sort of the trivial social, when
33:15
you know, you see kids where, where,
33:17
what's going on on TikTok or Facebook
33:19
or whatever is sort of that, maybe
33:21
not kids and Facebook, that's not true,
33:23
but, you know, whatever the
33:26
social network of preference is, it's
33:28
sort of, it starts in the internet
33:30
world. When democracy is sort of first.
33:32
internet and then on to the world
33:34
when business is first internet and and
33:37
now we end up in a world
33:39
where what happens on the internet is
33:41
actually what defines the world and that's
33:43
sort of what I think becomes pretty
33:45
interesting because now if you run a
33:48
country, it becomes tricky. You know, who's
33:50
in power? Is it the country
33:52
or is it Mark Zuckerberg with Facebook
33:54
and I think it's been pretty interesting
33:56
even observing Facebook and just seeing how
33:59
Facebook switched head of government affairs at
34:01
the same time as the new US
34:03
president, and that was probably not a
34:05
coincidence. So I think we're going to
34:07
see a lot of this and if
34:10
you sort of fast forward where that
34:12
will be 25 years from now, I
34:14
wouldn't, I'm not keen to be a
34:16
politician at large, but I certainly wouldn't
34:19
want to try to be a politician
34:21
in 25 years because it's probably going
34:23
to be pretty tricky. And I guess,
34:25
you know, you talked about elections earlier
34:27
and we've had a lot of people
34:30
on the show actually talking about different
34:32
kinds of election interference and mitigation efforts.
34:34
When you kind of roll forward, the
34:36
trends that you're seeing, do you think
34:38
that, you know, the US society has
34:41
to operate a different way to be
34:43
successful in this more vulnerable world? Big
34:47
question. I'm a huge democracy fan. I
34:49
sort of like, whether you take it
34:51
from a positive way: how positive
34:53
isn't it that we can live in
34:55
a world where, sort of, humanity's been
34:57
around for tens of thousands of years,
34:59
and it's sort of, most of
35:02
the time it has been the strongman
35:04
that rules and you sort of follow,
35:06
and even when
35:08
there has been democracy, it's only, for
35:10
some, it's only been for the super
35:12
privileged or whatever you have, sort of
35:14
thing. It's pretty damn positive that we can
35:17
have a vote. So that's sort of
35:19
pretty amazing that we get to live
35:21
in that time. Now, if you then
35:23
fast forward and start thinking about what's
35:25
going to happen there. First of all,
35:27
I was going to say that the
35:29
other non-positive ways is to say there's
35:32
a lot of other ways and maybe
35:34
democracy isn't the best. But it's the
35:36
best of the worst. There's sort of
35:38
like none of that stuff is amazing.
35:40
It all has problems and so on.
35:42
But it's... I certainly haven't seen anything
35:45
remotely as good as democracy. Now, so
35:47
if you go forward and you just
35:49
say, what are the ways the democracy
35:51
could develop? Again, there is this sort
35:53
of thing, what, because we think very
35:55
much in democracy in terms of countries,
35:57
and I think that's important and
36:00
that's probably going to remain true. But
36:02
again, if the sort of the groups
36:04
of people are not necessarily following national
36:06
borders, it might make it very difficult.
36:08
Now, maybe we are sort of in a
36:10
new era right now. I can't really
36:12
make that judgment. There's a new
36:15
set of nationalism going on in a
36:17
bunch of different places. For better or worse,
36:19
we can only observe it and that
36:21
seems to be the case, but maybe
36:23
that's a reaction to what's going on
36:25
on the internet. And I
36:27
think there's many other of these sort
36:30
of things. I think maybe the negative
36:32
view would be that there's the world
36:34
that's been manipulated in many different ways
36:36
and there's amazing opportunities in front of
36:38
us to manipulate if I put on
36:40
my... I want to be evil brain.
36:43
There's many ways to be evil. But
36:45
maybe on the other hand, people are
36:47
also, let's be positive and say, maybe
36:49
people are getting smart and, you know,
36:51
a lot of manipulation is pretty primitive,
36:53
so hopefully, you know. Like, I sort
36:55
of always say look people always come
36:58
up with wonderful tech solutions to deal
37:00
with this how about if we do
37:02
what they do in Finland and just
37:04
get people great education and and we
37:06
can sort out a lot of these problems,
37:08
let's make sure that people can read
37:10
really well and other sort of things
37:13
and it can work out so I
37:15
am rambling a little bit but I
37:17
guess I'm generally an optimist. Democracy
37:19
is better than anything and and yeah
37:22
That's great to hear. I mean, what
37:24
do you like? Do you think that
37:26
when you think about sort of like
37:28
AIs like offensive techniques and defensive techniques?
37:31
Do you have a sense on like
37:33
what wins in terms of like, I
37:35
don't know, election interference or kind of
37:37
any other conflict that we come into
37:40
here? Like, um, Yeah, like you can
37:42
use AI to look for security vulnerabilities
37:44
and fix them or look for security
37:47
vulnerabilities and exploit them. I mean, if
37:49
you think about like sort of, if
37:51
cyber attacks at large have happened
37:53
faster and faster, and so if I
37:56
start there just because that's sort of
37:58
closest to home, and clearly, so for
38:00
now the AI stuff you're really seeing
38:02
is really only these sort of better
38:05
phishing emails and so on. But even
38:07
there, it's like, you know, like, there
38:09
was a reason most cyber attacks happened
38:11
in English speaking countries for a long
38:14
time, because writing the phishing emails
38:16
was easiest to do in English, sort
38:18
of thing. But now you can do
38:20
that in all kinds of different things.
38:23
You can do more targeted stuff, blah,
38:25
blah, blah, blah. But people are going
38:27
to start writing software that once it
38:29
gets in, it traverses through systems automatically.
38:32
That, for sure, people are doing. Slow, slow-
38:34
moving things that try to not be detected,
38:36
or it's the sort of stuff where
38:38
they don't really care and just,
38:41
vroom, run through the system. That's gonna
38:43
happen. That's gonna set the bar
38:45
for the defenders to try to build
38:47
things that you're not gonna actually be
38:50
able to counter with just
38:52
humans. You're gonna have to use AI
38:54
to sort of, or some version
38:56
of automated analysis, but we can call
38:59
that AI to fight against that for
39:01
sure. And who wins there?
39:03
Yeah, I like there's a guy Rob
39:05
Joyce who's a terrific man who used
39:08
to run cyber defense for the US
39:10
government at NSA. Before that, he ran
39:12
the hacking team there one of the
39:14
groups and famously called TAO at the
39:17
time and he has a great saying
39:19
where he said you have to know
39:21
your network better than the bad guy
39:23
coming at you. And if you're a
39:26
big company. It's freaking hard to know
39:28
their network. And a, you know, a
39:30
crawler plus some AI is going to
39:32
outsmart most IT guys even if they
39:35
work there for 25 years. So, so
39:37
there you have that. And I don't
39:39
think it's certain who wins that. We,
39:42
for sure, in disinformation to your point,
39:44
I'd like to think that we should
39:46
be able to, right? You know, so
39:48
in Recorded Future, when you get
39:51
text and images and all this stuff
39:53
now we try to classify and say
39:55
this information you're looking at here
39:57
is very likely machine-generated, and it
40:00
was machine-generated with these models, that
40:02
sort of stuff, to help the researcher.
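One classic heuristic for flagging machine-generated text, not necessarily what Recorded Future uses, is to score it with a small language model: text the model finds unusually predictable (low perplexity) is more likely machine-written. A minimal sketch with GPT-2 via Hugging Face transformers:

    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tok = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

    def perplexity(text: str) -> float:
        """Perplexity of text under GPT-2; lower suggests more machine-like text."""
        ids = tok(text, return_tensors="pt").input_ids
        with torch.no_grad():
            loss = model(ids, labels=ids).loss  # mean token cross-entropy
        return float(torch.exp(loss))

    # A crude flag: compare against a threshold tuned on known-human text.
    print(perplexity("The quick brown fox jumps over the lazy dog."))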
40:04
Now, what if that can be in
40:06
an operational system, so whether I'm on
40:09
TikTok or my email or whatever, to
40:11
say that, but maybe on the other
40:13
hand, maybe 75% of what I'm going
40:15
to deal with in the future is
40:18
going to be machine-generated. So maybe
40:20
that's not enough. It's like the world,
40:22
the future is maybe machine-generated. So I
40:24
think that's sort of interesting. And then
40:27
you have the actual warfare stuff, where
40:29
what's going to happen is that I'm
40:31
going to paint a square and say,
40:33
in this 10 by 10 kilometer area,
40:36
There's going to be drones hanging over
40:38
that stuff with guns and grenades, whatever
40:40
you pick. And anything that moves
40:42
in there is dead. You know, like
40:45
that sort of stuff. And people are
40:47
working on that right now. That's going
40:49
to change the nature of warfare in
40:51
a brutal way. So, no, there's going to be
40:54
all kinds of nasty stuff in front
40:56
of us. So sorry, just I'm doom
40:58
and gloom here, but... In that vein,
41:00
I guess, and maybe it's like even
41:03
a, you know, practical question, like what
41:05
does it mean that the dark
41:07
web is moving from kind of forums
41:09
to Telegram. I mean, I'm sort of
41:12
vaguely aware of Telegram as an alternative
41:14
to Signal. I've watched many, many, maybe
41:16
most of my CEO friends gradually move
41:18
to Signal in the last, you know,
41:21
year or two to the point where
41:23
I feel like I'm using it quite
41:25
a bit more than I expected to.
41:28
Is Telegram a Signal-like equivalent, or
41:30
why shouldn't a CEO be using
41:32
the same, you know, network that the
41:34
criminals are using? Would that be the
41:37
most? So first of all, use Signal,
41:39
good choice. Make good choices, look, because
41:41
use Signal, just so you know. Signal
41:43
is great. Signal publishes their source code.
41:46
You could argue, this is actually, I
41:48
have still not really figured it out,
41:50
but there's no real server to
41:52
attack in Signal, there's lots of really
41:55
good things and all the fancy spies
41:57
of the world in all the good
41:59
countries, use Signal, okay? The fact that
42:01
they've gotten uncomfortable now, that might
42:04
mean that they have a big cabal
42:06
where they all share you know or
42:08
you look at your mind emails but
42:10
no I've seen plenty of very smart
42:13
people use Signal. I'm a huge
42:15
Signal fan. Telegram, on
42:17
the other hand, you know, so the
42:19
guy, I'm not going to remember his
42:22
name now, but you know, very cool
42:24
entrepreneur in many ways. He built the
42:26
VK first, sort of the Russian Facebook.
42:28
He was forced to sell that because
42:31
the Russian government did not appreciate that
42:33
he had built this. They forced him
42:35
to sell it to an oligarch in
42:37
Russia. He was sort of really... Imagine
42:40
your arm being up here and you
42:42
sort of have to say yes, and
42:44
he had to say yes, and he
42:46
sold it. He then started telegram and
42:49
moved to Dubai to be able to
42:51
sort of run it, Abu Dhabi, or
42:53
Dubai, I think it's Dubai, and built
42:55
a messaging platform that, unlike Signal,
42:58
if I understand correctly, does have sort
43:00
of a centralised place to be, it
43:02
has encryption that is not... the
43:04
same kind of end-to-end encryption that
43:07
Signal and others do, in any event.
43:09
Like some of this is way above
43:11
my pay grade, but there's a whole
43:14
set of reasons why Telegram is not
43:16
as good. It is interesting from this
43:18
sort of, and it's dangerous in their
43:20
world to sort of say good and
43:23
evil, but the set of people that
43:25
we don't necessarily love that are, or
43:27
many of them, are on Telegram. Signal,
43:29
in a different way, but it ends
43:32
up being a good place to go
43:34
look for, and especially the criminals are
43:36
on there, whether it's sort of cyber
43:38
criminals or people in trafficking and all
43:41
kinds of interesting stuff, are there, and
43:43
it's turned into a very good information
43:45
source. Is there something about Telegram's feature
43:47
set that makes it better for criminals? Like,
43:50
if any criminals are listening to
43:52
this podcast, could they be convinced to
43:54
switch over from Telegram to Signal? I
43:56
mean, surely they wouldn't watch it. I
43:59
think, no, I think it's
44:01
social. This is sort of like, and
44:03
it also turns out that, in Ukraine,
44:05
it does, obviously. In the Russian
44:08
sort of world, it is very popular, so
44:10
it's popular in Ukraine as well. So
44:12
it's not, I'm not by any means
44:14
saying that all Telegram users are
44:17
bad by any means. It just sort
44:19
of happens to be a certain class
44:21
of criminals that ends up being concentrated
44:23
on Telegram. Okay, interesting stuff. All
44:26
right, well, like, rolling forward into the...
44:28
There are some people who'd be very
44:30
mad at me for saying this stuff,
44:32
but... Wait, why? For saying what part
44:35
of it would they be mad at
44:37
it? Because once you sort of say
44:39
that one platform is more criminal than
44:41
the other, you know, like I'm... You
44:44
know, this is... Well, we can edit
44:46
it out if you want, but... No,
44:48
no, no. It's all good. So good.
44:50
Yeah, actually another question maybe that comes
44:53
to me is like, yeah, I feel
44:55
like, you know, in my very kind
44:57
of like low level of celebrity, you
44:59
know, like every like, you know, month
45:02
or two, I get someone reaching out
45:04
on LinkedIn, like, saying, I'm going
45:06
to come kill you or something, but
45:09
you must get these kind of threats
45:11
like all the time. Like, do you
45:13
worry about you and your family's like
45:15
safety being so kind of involved in
45:18
this world? Do you like, do you
45:20
walk around with the bodyguard? You must
45:22
be constantly concerned about getting hacked. Like
45:24
how do you think about that? Do
45:27
I look afraid? You don't look afraid.
45:29
Is that for cause or for hubris,
45:31
I guess? No, it would be funny
45:33
here. No, I'm not afraid. That's sort
45:36
of, you know, and you choose when
45:38
you get involved in this that that
45:40
you sort of and no bodyguards, unless
45:42
it's needed. It's sort of one way
45:45
of thinking about it. One should be
45:47
cautious. Russia sort of deemed us
45:49
a, whatever they call it, undesirable enemy
45:51
of the state. So just before Christmas,
45:54
there was a verdict, if you want,
45:56
from the national prosecutor or the... The
45:58
general prosecutor of the Russian Federation put
46:00
out a statement around why Recorded
46:03
Future is an enemy of the state,
46:05
is an undesirable, very specific language, not
46:07
like random stuff. It was very specific.
46:09
So that was less great if you
46:12
want. That sounds bad, man. People that
46:14
are undesirable, according to Russia, like get
46:16
thrown out of windows and stuff. I'm
46:18
not super up on this, but... I
46:21
can assure you I'm not traveling to
46:23
Russia any time soon. Sure. That seems
46:25
to be after the... No, look, I,
46:27
you know, the... I think we're very,
46:30
very careful with information security here. We
46:32
run a good physical security program here
46:34
as well. We try to be very
46:36
careful and thoughtful about how we travel
46:39
what we do and be thoughtful about
46:41
things. Then at the same time, one
46:45
should not overestimate where you are on
46:48
the list of problems that the bad
46:50
guys have, you know, whether it's the
46:52
Chinese government or the Russian government, they
46:55
have a lot of shit to deal
46:57
with and I don't think we're sort
46:59
of at the top of that list.
47:01
So that's sort of, I don't want
47:04
to simplify it, but information security, we've
47:06
got to be on our A game.
47:08
All right, well, let's wrap
47:10
this up and come back to the
47:13
here and now. So a couple months
47:15
ago you announced that you were
47:17
getting acquired by Mastercard for $2.65
47:19
billion. It's got to be one of
47:22
the biggest, you know, acquisitions of
47:24
the last year. Can you talk
47:26
about what the thesis was and how
47:28
that came about and what your plans
47:31
are going forward? Yeah, no, so we
47:33
had a long discussion with them, you
47:35
know, multi-year sort of discussions that have
47:37
been going on. We, the sort of
47:40
the simple thesis is that it's sort
47:42
of maybe not as known, but they
47:44
run a great payments business, of course.
47:46
They have a services business that actually
47:49
includes pretty nice assets on the cyber
47:51
and intelligence side. They wanted to expand
47:53
on this and we could be a
47:55
nice fit into that. We also, the
47:58
financial intelligence side is super interesting, that's
48:00
especially as we talked about cyber criminals
48:02
and so on, this sort of financial
48:04
end point to things. If we get
48:07
our hands on a stolen credit card,
48:09
it would be awfully interesting to know
48:11
whether a card has been used or
48:13
not. And the cards that have been
48:16
used or not actually tells you a
48:18
whole lot about the whole supply chain
48:20
of where those cards came from, just
48:22
to have one random sort of example.
48:25
So there is a lot of those
48:27
sort of things. So we're going to
48:29
be building out our business. We're continuing to
48:32
operate on our own standalone. There's a lot
48:34
of cool synergies that we can do together.
48:36
And it was also one of those
48:38
where it was just felt like a
48:40
really good home for the company as
48:43
great people and sort of long-term value
48:45
in like mindset of things, which was
48:47
important to me, because kind of to
48:49
your point for both you and I
48:51
have built companies for a long time,
48:54
you want to make sure they end
48:56
up in places that are not just
48:58
going to be like fly-by-night
49:00
type stuff, but people who are serious
49:02
and want to do things over long
49:05
term. any number of reasons that sort
49:07
of stacked up where this was a
49:09
great place to take it. Any
49:11
advice for me for success post
49:13
acquisition? I think, you know, make
49:15
sure to keep doing execution very well.
49:18
It's sort of like you have to
49:20
keep up your A game. We're early
49:22
into it, but, you know, build good
49:25
friendships, good relationships with people because this
49:27
stuff is not easy. When you do
49:29
it, as you know, it's like, you
49:31
know, it's easy to come up with
49:34
a lot of clever thoughts on paper,
49:36
but when you're going to actually do
49:38
it, it's with people. So
49:40
make sure you have good relationships
49:42
with those people. Yeah, I think
49:44
that's it. And don't just think
49:46
it happens sort of automatically. It's
49:48
sort of like it's, it's, unfortunately,
49:51
I guess it continues to be
49:53
hard work. It's sort of unavoidable.
49:55
Awesome.