Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
Use Ctrl + F to search
0:00
week, we have an interview with the CEO
0:02
of Sandfly Security. It's an incredible interview. You're
0:04
not going to want to miss it. We
0:06
talked about a lot of cool stuff with
0:08
cybersecurity, AI, and so much more. And,
0:11
you know, there's so much to talk about.
0:13
You're just going to have to just
0:15
sit back and relax and enjoy the
0:17
massive amount of content we have. I've
0:19
learned so much during this episode. This
0:21
is an awesome thing. This is fantastic.
0:24
No, we could probably make a video
0:26
out of each every single question because
0:28
all of his answers were just so
0:30
fascinating. By the way, whoever wrote those
0:32
questions was a freaking genius. I
0:34
mean, I would not say that
0:36
necessarily because it was Brian, but I
0:38
agree. I just won't say it. Good.
0:42
As long as you agree, but
0:44
don't say it. Yeah. Exactly. Welcome to
0:46
Destination Linux, where we discuss the
0:48
latest news, hot topics, gaming, mobile, and
0:50
all things open source. And Linux
0:52
my name is Michael and with me
0:54
are the dark magic mages casting
0:56
firewalls to protect their data Ryan and
0:58
Jill. Fireball fireball fireball.
1:01
Magic missile. This
1:04
episode is going to have
1:06
plenty of cyber security magic
1:08
to discuss. Look I
1:10
am so excited this week
1:12
is someone studying the dark
1:14
arts of cyber security. This
1:17
episode is epic and anybody interested
1:19
in security at all wanting to
1:21
know what's going on, how to
1:23
get into a career in security,
1:26
epic hacks that have happened,
1:28
everything is covered in this
1:30
episode. Truly one of my favorites. You're
1:32
going to love it. We
1:34
also have our community feedback to cover this
1:36
week and so much more, so check it
1:38
out. This new episode
1:40
of Destination Linux with
1:42
Ryan. As you can tell, he's
1:44
back. So
1:47
we can get
1:49
this show
1:51
on the road
1:53
toward destination
1:55
Linux So Michael I
1:57
wasn't here last week, but
1:59
I cannot wait to listen to
2:01
last week's episode Because Jill
2:03
talks about all the work she did
2:05
in Hollywood with Linux So I am
2:07
going to absolutely sit back and enjoy
2:09
that episode So if you missed it
2:11
with me you want to check that
2:14
one out. But also, I heard TWIL
2:16
is at its 300th episode. That's
2:18
right. This week
2:20
in Linux. This week in
2:22
Linux, TWIL. So the live stream, when
2:24
this comes out, if you're a patron,
2:26
you might be able to watch it.
2:28
But if this, this will come out
2:30
after the live stream. So
2:32
you can watch the latest
2:34
episode that's going to have the
2:36
celebration stuff in it. And it'll
2:39
be already available right now. So
2:41
check it out: thisweekinlinux.com/300. I
2:43
don't know what we're doing because I
2:45
haven't decided yet and we're recording
2:48
this before that. Of course you haven't
2:50
decided yet because we do things
2:52
last minute. He will decide one minute
2:54
before he hits record and it
2:56
will probably just be a graphic with
2:58
a little balloon on it or
3:00
something. That is ridiculous. It'll be seven
3:02
minutes and it'll be two graphics. Two
3:05
balloons. Um,
3:07
no, uh, first of all, I want
3:09
to tell people, um, because I saw
3:11
somebody in the comment section this week
3:13
in discord go, Hey, did you know
3:15
that the other hosts have their own
3:18
little shows on YouTube and stuff they
3:20
do? So, uh, yes, Michael has, uh,
3:22
done an incredible show and does also
3:24
tons of other videos, uh,
3:26
called this week in Linux where you
3:28
just get the news. So it's a
3:30
fantastic complement to our other shows on
3:32
the network because if you're wanting to
3:34
know. What's going on in the world
3:36
of Linux he covers all the news
3:38
topics that matter that week every single
3:40
week and the one thing i want
3:42
to tell people about this is you
3:45
don't know how much work goes into
3:47
that show. Michael and I have spent so
3:49
many hours over the years
3:52
talking about the content of
3:54
that show, changing formats for that
3:56
show, making that show as
3:58
engaging as possible.
4:00
Michael puts a ton of work writing
4:02
each of those new stories because it's
4:04
not like you just go on Google
4:07
News and type Linux and read stuff
4:09
off. You've got to find stuff that's
4:11
actually interesting to people. Then you have
4:13
to talk about it in a way
4:15
that's interesting. Then you have to edit
4:17
this stuff and put graphics in and
4:19
all of that. And we didn't have
4:21
an editor forever. So that was all
4:23
manually done. And Michael did all that
4:25
work himself and all of these things.
4:27
So to do 300 episodes is
4:31
insane. And I
4:33
don't say it often because that
4:36
would just break character for me, but
4:38
congratulations on the incredible too much
4:41
for my yeah but really is a
4:43
huge accomplishment and we're super proud
4:45
of you i'm super proud of you.
4:47
And that's the last time you'll
4:49
hear that until episode 1,000. So
4:51
what about 400 or 420? No, no,
4:53
now you got after 300, it's
4:56
a thousand is the next mile marker.
4:58
That's a big jump. What about
5:00
500 at least? Maybe. Yeah,
5:02
that's special. Maybe. That's good. You might
5:04
get a compliment again, 200 more
5:06
episodes from now. All right. So I
5:08
can look forward to that. I'm so happy. Yeah.
5:11
Our feedback this week comes from
5:13
Stahl, S-T-A-L. They
5:15
have this to say, hi, Jill, Michael
5:17
and Ryan. Alphabetically,
5:19
well done. I
5:21
don't play favorites, and I love
5:23
you all equally. First
5:25
of all, that's never been said
5:27
before. Yes, that's the
5:29
first. I don't know
5:32
if Stahl's being honest with us
5:34
because everyone loves Jill the most
5:36
out of all. Jill is first on
5:38
the list, just technically alphabetical, I
5:40
guess. Yeah, but no, doesn't
5:42
play favorites. Love you all equally. Thank
5:44
you for that. DL would not be
5:47
the same if any of you were
5:49
missing. Gosh, did
5:51
we write this to ourselves or something? Because
5:53
this is like it. Yeah, it feels like
5:55
it. Thank you for saying that. I've been
5:57
listening to the show for a couple years
5:59
now. And what I appreciated throughout that
6:01
time is the chemistry between you
6:03
guys, despite how different you
6:05
are. Michael doesn't care about being up
6:07
to date as long as he
6:09
can enjoy the best KDE Plasma
6:12
experience. Fact. That's pretty much true.
6:14
I have been updating much more
6:16
recently. stop, Mike. Stop. I have
6:18
been. However, I also
6:20
have two other installs that are not. So
6:22
it's still true. Still true. Ryan
6:25
seems to care so little about the
6:27
desktop environment to the point he just
6:29
uses whatever default is placed in front
6:31
of him as long as it's up
6:33
to date. See? That is 100 %
6:35
Ryan. The only part that's not true
6:38
is I do change the wallpaper. So
6:40
I leave everything default, but the wallpaper
6:42
has to be a Batman wallpaper. So
6:44
if you're ever breaking into my machine
6:46
and you're wondering, did I get in
6:48
the DasGeek machine? Your
6:50
clues will be default desktop with a
6:52
Batman wallpaper. Then you're probably in
6:55
my machine. not much customization. Yeah.
6:59
There are times where he set up a
7:01
new Distro right before the show and
7:03
I was like, did you get time for
7:05
the Batman wallpaper? Only
7:08
important part you know Jill probably
7:10
uses everything under the sun and
7:12
knows how to shine a positive
7:14
light on any use case and
7:16
that's awesome This person has definitely
7:18
been paying attention because this is
7:20
not Jill after show after show
7:22
Jill yells at us a lot
7:24
She says no, I don't Ever
7:27
tell a stupid joke like that
7:29
again you better be more professional
7:31
and all that you don't see
7:33
the after show Jill it's it's
7:35
harsh we call we call Jill
7:37
we own the show we refer
7:39
to her as Jill on the
7:41
in the after show we just
7:43
refer to her as savage. We're
7:48
teasing Jill. We're lying. We're sorry.
7:50
Now I feel bad. We
7:52
got a community feedback from
7:54
somebody saying, y'all should be
7:57
nicer to Jill. What are you
7:59
talking about? We love Jill.
8:01
We're constantly glazing on Jill all
8:03
the time. We have
8:05
to give her a hard time sometimes. No,
8:07
she really is this nice outside of the
8:09
show and in the show. I don't know.
8:11
I think she's from another planet. I'm very
8:13
convinced that she's not from earth because no
8:15
human is this kind and nice and happy
8:17
all the time. So yeah, if you
8:19
think that it's not possible, which we thought
8:22
as well, it is and Jill is proof. Yeah,
8:24
absolutely. So it
8:27
is proof aliens invaded though also.
8:29
So also on the fence about
8:31
that. She also did like the
8:33
E.T. Atari game. So yeah,
8:35
they're really nice aliens. They're
8:37
not the evil ones, you know, happy, joyful.
8:39
That's like the best kind of invasion
8:41
ever. Yeah. We're glad you
8:43
invaded our planet, Jill. And you're
8:46
all just having a good time
8:48
with linux and open source without
8:50
making any or at least no
8:52
serious judgments on each other we
8:54
just ruined that dang it. The
8:56
one episode where we make judgments
8:58
on each other and then. Got
9:01
the comment to be fair, we're making
9:03
judgments on each other that are not
9:05
real and we don't mean them whatsoever.
9:07
Yeah. Yeah. You also just
9:09
complimented me before this. So, you know,
9:11
that didn't happen. Wendy, edit that.
9:13
I have recording. I'm going to I'm going
9:15
to make like an animated frame. So I
9:17
put like a digital thing. Goes
9:21
on to say thank you for keeping it.
9:23
up all these years with and without sponsors. I
9:25
don't like driving, but when I notice a
9:27
new episode of my podcast app, I look forward
9:29
to my commute home. That means a lot
9:31
to us. I'll sit behind the wheel with a
9:33
smile and sometimes have to laugh out loud
9:35
in the middle of traffic. I
9:37
love that. I love that because I've
9:39
actually listened to our own show and
9:41
it sounds, I have to do QA
9:43
to make sure that the edits are
9:45
right and everything. So I also listen
9:47
to every episode and the amount of
9:49
times that I've found myself laughing at
9:51
our bantering and whatever we're doing
9:53
is countless. So you know what's
9:55
embarrassing, Michael? I listen to the show
9:57
for QA purposes only, of course, and
9:59
uh sometimes you don't even do the
10:01
QA because you don't, you don't
10:03
tell me what you're gonna be laughing
10:05
you know at things that we say
10:07
in the show because i forget you
10:09
know after the show and thing uh
10:12
and my wife will walk in on
10:14
me and she's like what are you
10:16
laughing at and the embarrassment of having
10:18
to show her my own videos. You
10:20
know of like i'm laughing at my
10:22
own stuff and she the look she
10:24
gives me of like really you're that
10:26
arrogant into i'm like no it's like
10:28
what Michael said or what Jill said, like
10:31
yeah sure you're just listening to yourself
10:33
sitting there laughing the best part is
10:35
that. So i have this
10:37
is pretty much the same experience my girlfriend
10:39
and i were sometimes sitting on the couch
10:41
and i have to do the q a
10:43
and there's been times where i've literally laughed
10:45
at what i said. Hey,
10:48
check this out. This was
10:50
so funny. You should watch this, too. I
10:56
do run the shows for my hubby. He's
10:58
really good, and he'll go and watch them. But
11:00
if he hasn't, I go, honey, honey,
11:02
you got to see this part. We
11:07
enjoy this show, and I'm
11:09
glad you enjoy listening to the
11:11
show, and we're thankful for
11:13
everybody who's listened for so long
11:15
from 190 countries. 190
11:17
countries. I can't name but 20, but
11:19
190 countries out there listen to this
11:21
show regularly. And it's, it's. I have
11:23
looked at the data, the data
11:25
related to all these different countries. I
11:28
also name them all. Yeah.
11:30
It's, it's incredible. awesome. The
11:32
fact that we have so much, so many people from
11:34
so, so many places, like, enjoying
11:36
the content. And I was actually going through a
11:38
list of like all the feedback we've had
11:40
because I was making a category of like, you
11:42
know, organizing stuff today. And the amount of
11:44
emails we've gotten over the years of people saying
11:47
how they loved our show, it's their favorite
11:49
and stuff. It actually kind of got me a
11:51
little teary up a little bit to see
11:53
like all that stuff. Oh, you don't even have
11:55
a heart and you got teared up. That's
11:57
incredible. I mean, I have
11:59
tear ducts. I don't need a heart to have
12:01
tear ducts. So
12:03
they go on to say, if you're
12:05
ever starved for topics and want
12:07
to cover something not A.I. related, I
12:10
think that's a hint there. It
12:12
might be. They might be suggesting that
12:14
we cover something A.I. related.
12:16
A.I. is just so exciting. Other than that,
12:18
okay, I may have a suggestion
12:20
for an episode subject: OpenMandriva. Full
12:23
disclosure, I don't run it myself
12:25
yet. So why? It has roots... I'm
12:27
not going to read everything here
12:29
because they went into full detail and
12:31
time but it has its roots
12:33
in Mandrake, which is one of the
12:35
first desktop-focused Linux distros. It
12:37
features KDE Plasma as the flagship variant,
12:39
so there you go michael. And
12:42
some background information is they never
12:44
hear anyone mention Mandrake's derivatives, which also
12:46
include Mageia. I just came across
12:48
OpenMandriva, as I was curious to see
12:50
if there were other desktop-focused,
12:52
easy-to-use rolling distributions similar to
12:54
openSUSE. Personally, I only seriously
12:57
used Ubuntu-based derivatives. I've
12:59
only played with Fedora, Manjaro, and
13:01
openSUSE, but Mageia and OpenMandriva
13:03
never crossed my radar. Sadly, I
13:05
don't have time to just distro hop
13:07
and play around anymore. But
13:09
it will also be on the list of
13:11
things to try when I do find the time,
13:14
Kind regards, Stahl. So, incredible
13:16
email. We may use this as
13:18
a full episode in the future because
13:20
we are always looking for unique
13:22
takes. It's been a while since I've used
13:24
either one of those. Yeah,
13:26
same thing. But Jill and Michael, I've
13:28
never used any of those. So,
13:31
I mean, I use openSUSE, that is that.
13:33
But any of those... I'm going to rely on
13:35
you here to kind of give your feedback
13:37
on what you think of this. I know Jill
13:39
knows a lot more about this topic than
13:41
I do. So I'm just going to give you
13:44
like a little bit of stuff. Oh,
13:46
you know, Michael. We'll
13:48
see. But
13:51
so Mandrake was my first distribution
13:53
was technically Red Hat Linux. And
13:55
then I found Mandrake because Mandrake
13:57
was kind of like the beginner
13:59
distro in the 90s and early
14:02
2000s. And it was like They
14:04
run as it wasn't beginner. It
14:06
wasn't beginner friendly, but it was
14:08
more beginner friendly than anything else.
14:11
And it made desktop usage
14:13
easier because, for example, back
14:16
in the day, I like to use
14:18
this example because of how ridiculous it is
14:20
to think about, because now you just
14:22
install the system and you're good to go.
14:24
Back in the day, you used to
14:26
have to know what kind of monitor you
14:28
had, otherwise you would set it on
14:30
fire. uh,
14:32
that's, uh, that's not even a joke. It
14:34
literally could set it on fire because it
14:36
would, like, short circuit and stuff. And
14:39
uh, there's, yeah,
14:41
exactly. So mandrake made
14:43
it possible for it to automatically detect. I
14:45
think it might have been the first. I
14:47
don't know if it was officially the first,
14:49
but it was one Yeah, one of
14:51
the first. Yeah, where it automatically detected the right
14:53
monitor and everything and you didn't have to do
14:55
it. And I was like, this is amazing. And
14:58
like just something that simple these
15:00
days is like, You know of
15:02
course whatever but mandrake was like
15:04
the go to thing for me
15:06
and it also it was a
15:08
shame because it went away and
15:10
then later Mandriva came back
15:12
and there was like some legal
15:14
stuff involved in that and then
15:16
OpenMandriva came back when Mandriva
15:18
went away and then well i
15:20
think OpenMandriva and Mageia happened at
15:22
the same time. And
15:24
I actually have no idea if Mageia
15:26
is the correct way of saying
15:28
it and it's been this long. i
15:31
still have no idea which one
15:33
it is, but it's, uh, Mageia, Magia,
15:35
something like that anyway so these
15:37
distributions are very interesting. And the KDE
15:39
thing yeah you know you know me
15:41
I do like KDE, but I
15:43
haven't... I've used Mageia and OpenMandriva
15:45
many years ago but it's been
15:47
a very long time and it's probably
15:49
time to check it out again.
15:51
There you go, Michael. Yeah. what your
15:53
next challenge is going to be when
15:55
I get you to somehow, when
15:57
I get you in a weak moment
15:59
and get you to agree
16:01
to a challenge, that's what it
16:03
will be. Oh, so is it a challenge to
16:05
not, it's not a punishment challenge is a challenge
16:07
challenge. This is a challenge challenge. Yeah. Okay. So,
16:09
okay, I'll just see if it was the rat
16:12
poison type of thing or if it was. That's
16:15
when you misbehaved. Oh, okay.
16:17
Yeah. And I'm gonna elaborate on
16:19
what Michael said, it had
16:21
one of the best, easiest-to-use
16:24
installers of the time,
16:26
and it would actually, you know,
16:28
detect your printer and even
16:30
your sound card as long as
16:32
you had a Sound Blaster
16:34
or compatible. And that was
16:36
huge for that time. It was
16:38
the very first distro I installed that
16:40
I didn't have to go through
16:42
XFree86 to do
16:45
all that configuration in a terminal,
16:48
it would just set it up
16:50
with a nice user interface.
16:52
Exactly. It would even show you
16:54
the stuff in a GUI.
16:56
Yeah, in a GUI. People are
16:58
so spoiled now. Ryan is
17:00
so spoiled now. That was very
17:02
spoiled, really. Yeah. Back
17:05
in the early days, it was
17:07
called Linux Mandrake. Then later, it
17:09
was renamed to Mandrake Linux. Interesting.
17:15
Yeah, so me and That seems very important
17:17
to do. Yeah, yes. I don't remember this
17:19
ever happening with the different names. I always
17:21
just called it Mandrake and freaking ignored everything else. So
17:25
me and Michael actually both
17:27
talked about the importance of Mandrake
17:29
in the early years and
17:31
shared our stories with Till Kamppeter
17:33
and Michael Sweet of Open
17:35
Printing on Destination Linux episode number
17:38
351. Till actually worked
17:40
on Linux Mandrake. So
17:42
those stories are really
17:44
fascinating. And I actually
17:46
even showed off one of my
17:48
original boxes. Actually it was my
17:50
original box of Linux Mandrake that
17:53
I bought at the store. Which
17:55
is awesome that Jill still has it,
17:57
yes. Yes. For those
17:59
who are curious, yeah, of course she does. And
18:03
yeah, we really do need
18:05
to talk about OpenMandriva and Mageia
18:07
on a future DL because
18:09
I want to revisit those distros
18:11
definitely, especially since they have,
18:13
you know, a rolling edition that
18:15
Ryan would appreciate. I
18:18
didn't know that. I
18:20
don't want to be left out
18:22
of this conversation because you also have
18:24
all this information about these older
18:27
distros and things. So I'll tell you
18:29
what I know is Mandrake is
18:31
the root of a plant historically
18:33
derived either from plants of the genus
18:35
Mandragora found in the Mediterranean region
18:37
or from other species such as the
18:39
Bryonia alba or the American mandrake,
18:42
which also have similar properties. So in
18:44
case you were curious. I
18:46
appreciate that information, Ryan, Peter. Very
18:48
good. That was off the top of
18:50
my head. Someone wants to fact check me.
18:54
There's also for extra information, the
18:56
mandrake plant is poisonous and
18:58
you should not consume it. And
19:01
also the next thing is
19:03
that Mandrake plants were also a
19:05
critical thing in Harry Potter.
19:07
Harry Potter. I was just going
19:09
to say that. Yeah. Yeah.
19:13
Very. And they were emotional, too. Very
19:16
emotional. You don't want to hear a Mandrake
19:18
cry. You don't want to hear it squeak. Oh,
19:20
my goodness. Wow. bunch of
19:22
Harry Potter fans here. You just
19:25
squeaked, Jill. Are you a
19:27
Mandrake you come from Planet Mandrake? No,
19:29
that's why our ears are still working because
19:31
she's not a Mandrake. I
19:33
can go much higher than that, but I don't
19:35
want to hurt our viewers' ears. Yeah,
19:37
just use your imagination,
19:39
folks. All right. So,
19:42
uh, stall, thank you so much for,
19:44
uh, sending an incredible email. You got us
19:46
all talking and got us laughing. Uh,
19:48
you got me to really think about what
19:50
a mandrake is and give a definition
19:52
off the top of my head, which is
19:54
amazing. Very impressive. Yeah. Very impressive. You
19:56
were able to pull that, you know, so
19:58
quickly. All right. You were able to
20:00
pull that mandrake out of the hat. Oh
20:02
my God. Or the pot. Joe,
20:04
please, please. No
20:06
more. Michael's enough. All
20:09
right, let's get right enough dad
20:11
jokes right now Do it Well,
20:13
I mean we should also probably
20:15
tell people how they can send
20:17
feedback if they want to oh
20:19
If you want to send it
20:21
really you're gonna make me look
20:23
embarrassing like that by making by
20:26
keeping that oh my gosh After
20:28
I complimented you now. I'm gonna
20:30
look like a goofball clapping at
20:32
the screen because you Go
20:34
to destinationlinux.net/comments to send
20:36
in your comments or
20:38
destinationlinux.net/forum. And if you have
20:40
a specific comment about how Michael mistreats
20:43
me, it will be read on
20:45
the show. So. Okay, to
20:47
be fair, I didn't do it on
20:49
purpose, but also this is why
20:51
people love our show. Yeah.
20:54
Is your mean to me? No, I was
20:56
saying good stuff earlier about, you know,
20:58
like he does make good videos and stuff
21:00
like that, too. So like you should
21:02
check it out. He's got good cybersecurity
21:04
videos, got hardware information, all sorts of
21:06
stuff on his channel, DasGeek. So check
21:08
it out. I will have a link
21:10
in the show notes for all of that.
21:13
Now. Cut
21:15
to the interview. This episode is sponsored by
21:17
Sandfly. But instead of just telling you about
21:19
them, how about we bring on the CEO
21:21
of the company? And today we're going to
21:23
be doing just that. So we'd like to
21:25
welcome the CEO of Sandfly, Craig Rowland. Welcome
21:28
to the show, Craig. Great. I
21:30
appreciate being here and listening to your podcast. I really
21:32
enjoy it. Thank you
21:34
so much. You
21:36
know, Michael, I'm going to steal the first
21:38
question here because I'm geeking out over
21:40
having a cybersecurity. All right, fair enough,
21:42
but don't take too long. what I want to do for
21:44
a living. Yes. I
21:46
get the first question. So Craig, I looked through
21:48
your career when we were going through this. We
21:50
knew we were going to have you on the
21:52
show and I was blown away by the way,
21:54
just fascinating career. But
21:57
I wanted to ask first
21:59
about your career breaking into
22:01
computer networks professionally in the
22:03
U.S. Air Force Information Warfare
22:05
Center. What is the wildest
22:07
lesson that you learned from your
22:09
experience there? And how has that
22:11
stuck with you through to today? Yeah,
22:14
so I'll probably back up and clarify
22:16
a bit then. So I got hired
22:18
to work with the group of guys
22:20
out of San Antonio, Texas, who are
22:22
the founding members of the Air Force
22:24
Information Warfare Squadron. So they were spinning
22:26
up a company at the time called
22:29
Wheel Group, and I got hired initially
22:31
to do network pen testing. I
22:33
was doing it before, and eventually they
22:36
found out I could code. So I
22:38
became the lead exploit developer for the
22:40
vulnerability scanner. So I wrote all the
22:42
network attack libraries. But I think one
22:44
of the most interesting things I learned
22:46
from doing this type of work early
22:49
on, even before I started at Wheel
22:51
Group was just how the smallest error
22:53
could lead to a mass compromise, right?
22:55
So one of the earliest things I
22:57
did was we're going up against the
22:59
network to do an audit. and they're
23:01
pretty well secured. But I
23:04
found one night that they had an
23:06
FTP server. And of course, I'm going
23:08
through the FTP server, and I find
23:10
one directory buried deep, deep down in
23:12
it that was writable. And
23:14
in this directory, they had
23:16
some scripts. And I determined
23:18
by reading these scripts that they were
23:20
writing to this directory and that something
23:22
inside the network was grabbing these scripts
23:24
or running them each night to do
23:27
maintenance. So at that point,
23:29
I was able to modify the script so that
23:31
when they ran, they would throw me back
23:33
an xterm. So basically, I modified
23:35
the scripts. I went to bed. I
23:37
waited for the cron jobs to kick off at 2, 3 in
23:39
the AM. In the morning, I woke up in the
23:41
morning, and a whole bunch of terminals were on my
23:43
screen.
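As a minimal sketch of the kind of check that catches this class of mistake, the following Go snippet walks the standard cron directories and flags anything world-writable, since a world-writable script that something executes automatically is exactly the small error in this story. The directory list and the standalone tool are assumptions for illustration only, not how that audit was actually performed.

package main

import (
    "fmt"
    "io/fs"
    "path/filepath"
)

func main() {
    // Common cron locations on many Linux systems (an assumption for this sketch).
    dirs := []string{
        "/etc/cron.d", "/etc/cron.daily", "/etc/cron.hourly",
        "/etc/cron.weekly", "/etc/cron.monthly",
    }
    for _, dir := range dirs {
        // Walk each directory; missing or unreadable paths are simply skipped.
        _ = filepath.WalkDir(dir, func(path string, d fs.DirEntry, walkErr error) error {
            if walkErr != nil {
                return nil
            }
            info, err := d.Info()
            if err != nil {
                return nil
            }
            // A world-writable file that gets run automatically is a red flag.
            if !info.IsDir() && info.Mode().Perm()&0o002 != 0 {
                fmt.Printf("world-writable cron script: %s (%s)\n", path, info.Mode())
            }
            return nil
        })
    }
}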
23:45
So the lesson here really is it
23:47
could be something so
23:49
small. It's just so easy
23:51
to mess it up. And it could
23:53
cause the whole thing to fall like
23:56
a bunch of dominoes. So when we
23:58
talk about things like network segmentation, separating
24:00
your network, separation of privileges, things like
24:02
that, there's a reason and lessons behind
24:04
why this is all done. So
24:06
I think that's something we even still
24:08
see today. A lot of these lessons really
24:10
just happen over and over and over
24:13
again. And your adversaries will study you and
24:15
they will. As one of our advisors,
24:17
Rob Joyce, he's head of the NSA's Tailored
24:19
Access Operations group. He said your
24:21
enemies will know your network better than you
24:23
do. Right. And I think that's
24:25
really true. So that that's really
24:27
one of the lessons I learned is that as
24:29
a red team, you really have to do
24:31
reconnaissance well. You learn a
24:33
lot about how networks work and you
24:35
just become really good at finding the cracks
24:37
that were left open often by accident.
24:39
And that's it. So that's one of the
24:42
lessons there is just understand your adversaries
24:44
may know your network better than you and
24:46
they're looking for the smallest edge to
24:48
get into that system. You know what's so
24:50
interesting is we're talking about this from
24:52
an enterprise standpoint, but even personal security, right?
24:54
It's the very same kind of methods:
24:56
your enemies are going to know you better
24:58
with all the data breaches and everything
25:00
going on and sometimes our own partners know
25:02
us. And it's the little things that
25:04
we're not thinking about that they spend all
25:06
day looking for a loophole to kind of
25:08
get in, and we're just going about our
25:10
day. So I think that's a
25:12
valuable lesson. Also, a lot of people
25:14
think about the whole 'they know
25:16
so much' thing. It's not just that they have the
25:18
information; it's because they also analyze it.
25:20
They're analyzing you in a way you're not doing
25:22
it to yourself, and your friends and
25:24
family are not going to be analyzing you,
25:26
like, at least not to that degree. So
25:29
it's the little details too. Like you
25:31
got me thinking, Michael, about we were talking
25:33
earlier about posting things on social media
25:35
before the show started and not posting photos
25:37
and things. But if you notice that.
25:39
Sometimes you hear about these breaches and things
25:42
because they were able to read an
25:44
address on an envelope on a picture that
25:46
somebody posted on social media and somebody
25:48
Oh, yeah. Little things you don't think about
25:50
that can lead to, you know, something
25:52
bigger even, you know, in a personal perspective.
25:55
Share less is my advice.
25:57
Unfortunately, social media people, they
26:00
want to share a lot and, you know,
26:02
the dopamine hit, they want to get a like,
26:04
they want to get a like, but you
26:06
might be, you might be feeding someone who really
26:08
doesn't like you. Information that they really shouldn't
26:10
have and unfortunately can't control what your family does,
26:13
but. Yeah, you just have to
26:15
be exceedingly careful. Certainly if you're in
26:17
cyber security, I urge extreme caution for people
26:19
in the industry not to be posting
26:21
a bunch of personal stuff online. Oh
26:23
yeah, I think that's true for
26:25
everybody, but especially for those people. Yes,
26:27
definitely. Yeah, and I don't
26:29
post anything unless I'm traveling, then I'll
26:31
post it after I've already left the
26:33
place I was at. So if I'm
26:35
gonna be posting something. Yeah. So
26:38
you don't get doxxed or swatted and
26:40
all these other things that people have come
26:42
up with. And I never do it
26:44
in the area I actually live in. It's
26:46
only if I'm traveling and only after
26:48
the fact. Yeah. Yeah, definitely. So
26:51
you founded some startups that caught
26:53
the eye of giants like Cisco and
26:55
3Com. And I'm just, I'm very
26:57
curious about this. Like looking back, what
26:59
do you think gave your
27:01
early ventures that edge? And
27:03
how does that experience shape
27:05
the way you look at things
27:07
like, you know, leading Sandfly
27:09
Security now? Yeah. So I've been pretty
27:11
fortunate, I think, because I was
27:13
early in the industry in cybersecurity. I
27:16
remember telling my father, he goes, oh, what do
27:18
you want to do when you get out of college?
27:20
This has been the late 80s, you know, so
27:22
I'm in my 50s now. And I said, I'm going
27:24
to do internet security. And he's like, what's the
27:26
internet, right? Back then it was called ARPANET, right? No
27:28
one knew what this thing was called. So.
27:32
So I think I've always had kind of
27:34
had this inclination to go in the
27:36
industry and I'm lucky that I've been able
27:38
to just kind of select good groups
27:40
of people to work with. I think that's
27:42
part of it is being able to
27:44
recognize that you have to have an A
27:46
team. Like a venture capitalist early on
27:48
told me that they will fund an A
27:50
team with a B idea, but they'll
27:52
never fund an A idea with a B
27:54
team. Right, you always want the A
27:56
team. The A team can adapt,
27:58
right? The B team never adapts,
28:00
and the B team will hire C players
28:02
and D players. The A team
28:04
tends to keep it very high. So
28:06
I've been very lucky. So I did
28:08
that work with Wheel Group. We did the
28:10
Cisco Secure Scanner, Cisco Secure IDS. We
28:12
got bought by Cisco. I then left to
28:14
start my own company that did an automated intrusion
28:17
detection and response system that would
28:19
eliminate 95% of all
28:21
false alarms. That got bought by
28:23
Cisco. And I was also sharing offices
28:25
with a company at the time called Tipping
28:27
Point Technologies. They did a multi
28:29
-gigabit line speed intrusion prevention system. And
28:31
so in exchange for helping them with
28:34
some early product design and helping them hire
28:36
some key personnel, they were giving my startup
28:38
free office space. So I've always just
28:40
managed to kind of find good people
28:42
to work with. I think that's really
28:44
part of my advice. And what was
28:46
funny was that With the company I
28:48
founded that got bought eventually at the
28:50
time I was thinking of trying to
28:52
apply to another job and this would
28:54
have been 1999 or so when I
28:56
was looking and there's this brand new
28:58
search engine that started I was like
29:00
I like these guys. They're doing a
29:02
really good job. And so I sent
29:04
my resume in and they actually responded,
29:06
and then I never followed up. So
29:08
that was my chance to get into
29:10
Google, right? So, but again, it's for
29:12
some reason. I think there's a bit
29:14
of A bit of a sense that
29:16
you got to find something that looks
29:18
good, simple products, simple ideas, sharp people
29:20
that all kind of mixes together to
29:22
make it a good company. Complexity,
29:25
I avoid like the plague. You
29:27
were talking about your history. I
29:30
used ARPANET as well. And
29:32
I came from kind of the
29:34
world of the bulletin board systems
29:37
and the bulletin board systems. What's
29:39
your experience with that? Yeah,
29:41
pretty much the same deal. In
29:44
high school, the idea of
29:46
getting on a Unix system was just
29:48
unheard of. You had Apple II. And
29:51
if you wanted to learn on other computer
29:53
systems, the only way was to seek it through
29:56
extracurricular activity, as we say. I
30:02
would sometimes give complimentary security
30:04
audits, as I would say
30:06
that too. Nothing
30:10
bad. It's just basically teenagers
30:12
mess it around. Yeah,
30:14
the bulletin board scene, I met a lot
30:16
of guys locally. We used to do dumpster diving
30:18
together. I still know some of these people
30:20
in the industry today. Never
30:23
dumpster dive out back of Apple
30:25
Computer. They drink so much coffee, you
30:27
just get covered in coffee grinds, right? And
30:29
you just get a lot of funny
30:31
stories from that whole scene. So I had
30:33
a very similar background. In fact, I
30:35
learned to drive stick shift after we were
30:37
dumpster diving out back of TRW in
30:39
Reston, Virginia, one night. And, you know,
30:41
going through someone's dumpster is not illegal,
30:44
but it was three in the morning and
30:46
we were the only car around, except
30:48
for one other car, which is a police
30:50
officer. And he pulled us over because
30:52
he obviously wants to know what you're doing.
30:54
driving around this place and so we
30:56
pull over and i'm in the back seat
30:58
and the guy driving had an expired license,
31:00
and the other guy only had a
31:02
German passport, so he couldn't drive. And
31:04
you know he's asking you know what what
31:06
are you guys doing and he's like
31:08
oh nothing just driving around and he goes
31:10
are you guys into computers. And
31:12
the guy driving is like, no, no, no, we're not.
31:15
And of course, he shines a flashlight at me in
31:17
the back seat. I'm absolutely covered with nine-track
31:19
tapes and electronic circuit boards we just pulled out of this
31:21
dumpster. Yeah.
31:25
And at that point, I was the only one
31:27
with a valid license, a learner's permit, and
31:29
the guy's had expired. I had to
31:31
drive this car at three in the morning. He
31:33
didn't do anything to us. He goes, he can't,
31:35
you know, again, we're just pulling out stuff. But
31:37
I had to learn to drive stick shift at
31:40
three in the morning grinding gears while with a
31:42
couple of my local BBS friends going back. So
31:44
yeah, I had definitely some good times back back
31:46
in the golden era, I guess they would say
31:48
the golden age, you know, that nowadays you could
31:50
probably get in trouble doing stuff like that. But
31:52
back then it's like, you know, whatever. Hey,
31:55
phone phreaking. Yeah, all
31:58
sorts of stuff. You had a different form
32:00
of war driving. Yeah, yeah, yeah
32:02
was yeah, my mom would pick up the
32:04
phone and my I'd have a war dialer
32:06
going on all night She'd be interrupting it.
32:08
I'd get so mad at her all sorts
32:10
of stuff like that. Yeah, so good to
32:12
good times I I miss it in a
32:15
way the sound of the modem was is
32:17
almost like classical music It's something when you
32:19
hear it it instantly brings you back to
32:21
that time Yeah, yeah, yeah, exactly a second
32:23
you enjoy the nostalgia and then you imagine
32:25
yourself having to deal with it all the
32:27
time and you're like, no, it's okay Well,
32:30
our team here, we joke sometimes on
32:32
how much better the internet would
32:34
be if we only had modems and
32:36
gopher. Oh, yeah. Like
32:39
we did before you have to have
32:41
an internet license to get on the internet.
32:43
And it involves answering some questions about
32:45
the TCP/IP protocol and stuff. We'd just weed
32:47
out a ton of trouble. Yeah, there
32:49
you go. That's true. Oh, cool. So
32:52
Craig, you've actually, you've
32:54
spoke at conferences like DEF
32:56
CON and USENIX, which
32:58
contain some of the best
33:00
security minds in the
33:02
industry. What's the most unexpected
33:04
question or challenge an
33:06
audience member ever threw at
33:08
you? I thought this would be
33:10
interesting. Yeah, so, USENIX,
33:13
you know, they're fairly
33:15
peaceable actually, you know, if you're not
33:17
technically incompetent, right, they kind of
33:19
leave you alone. I was lucky. I
33:22
think I spoke at DEF CON
33:24
six. So I was
33:26
pretty early, pretty early on back then. I
33:29
think the first one I went to is
33:31
like Defcon three or four. I don't recall,
33:33
but back then everyone fit in a single room.
33:35
Like now, I went, I went to
33:37
DEF CON last year for the first time in, like, over
33:39
20 years. I couldn't imagine. It's like 30,000.
33:41
It's absolutely crazy. I know. So
33:44
I went to the first four
33:46
in that little motel off
33:48
the strip. Yeah. I forgot the Alexis
33:50
Park. Was that it? Yeah. Yeah. All right. You
33:52
know, I might have run into you and just
33:54
didn't know at the time. It's hilarious. Two
33:56
ships passing in the dark, right?
33:58
Yes. Yeah. Yeah.
34:02
The last one, I think the first one I went to is at the
34:04
old Aladdin. There's
34:06
a lot of mischief. I was not
34:08
involved. I actually was working at the
34:10
time and it was security clearances and
34:12
stuff. I couldn't be involved with trouble,
34:15
but yeah, it was interesting times, that's for
34:17
sure. I hear
34:19
the stories from Def Con
34:21
about how it drives the hotel
34:23
employees and things nutty because
34:25
they have to do special security for
34:27
their elevators, put
34:30
all of these special things in
34:32
place for their Wi -Fi because
34:34
you're dealing with the best minds
34:36
in hacking. And 30 ,000 of them.
34:38
So, you know, it's like nothing
34:40
is secure. Talk about the details,
34:42
Craig. When you're at DefCon and you run one
34:44
of those businesses where all the hackers are
34:47
coming in, and most of them probably just want
34:49
to see, can they? Because that's
34:51
what they do. They're curious. That's right.
34:53
I'm big into like, man, I've
34:55
always believed like keeping a low profile. Right.
34:57
You get away with a lot more stuff.
34:59
So, but, you know, if you go to
35:01
DEF CON and you've got antennas
35:03
hanging out of your backpack, you look like
35:05
a porcupine. Like they're going to ask you
35:07
some questions. The Vegas security now, I mean,
35:09
they're not going to mess around. So like
35:11
I would recommend people if you go, keep
35:13
a low profile. A
35:15
lot better, a lot better time
35:18
usually. Yeah. Make
35:20
sense. So Sandfly
35:22
security, you know, you focus on
35:24
agentless intrusion detection. We talk about it when
35:26
we're discussing Sandfly every week,
35:28
and that's a very unique niche for
35:30
focusing specifically on Linux as well.
35:32
And I'm just curious, like what sparked
35:34
your idea? We've learned a little
35:36
bit about your career and history for
35:38
Sandfly. And why did you
35:40
decide Linux needed some special attention there?
35:43
Yeah, so I'm a longtime Linux
35:45
user. Like, probably back since, I
35:47
mean, back when it used to come
35:49
on the front of a magazine as
35:51
floppy disks, kind of like
35:53
Slackware, kind of, right. Yeah
35:55
exactly. Yeah. Yeah. So
35:57
I've used it for a long
35:59
time. Linux is interesting because
36:01
everybody thinks the internet runs on
36:03
Windows, but that's, like, not... I
36:05
know. Absolutely. That's absolutely the
36:08
wrong case, right? So it
36:10
runs, like, 90, 95%
36:12
of all cloud workloads. It's in
36:14
industrial control systems, critical infrastructure. I
36:17
currently live in New Zealand. When I got
36:19
on a flight, the in -flight entertainment system on
36:21
my headrest is Linux. I've seen it reboot, right?
36:23
So just like, it's all over the place. And
36:26
I saw as a pen
36:28
tester, I had an
36:30
early experience when we were auditing a
36:32
hospital that really burned into my mind
36:34
what's going on. And what happened is
36:36
I broke into a system there. It
36:39
was an AIX system, but it could have been Linux. It
36:41
doesn't matter. The main thing about it
36:43
though is had an uptime of four years, meaning
36:45
that the system had been up for four years
36:47
without any reboots, without any patches. And I knew at
36:50
that point we were home free. We were not
36:52
going to be caught because nobody was looking at that
36:54
system. I call them internet crack
36:56
houses. Right. These systems that are unwatched. Nobody
36:58
knows, the criminals move in, you know, pretty
37:00
soon graffiti shows up, some cars
37:02
are being broken into. You know, this is,
37:04
it's the same deal, right? An abandoned house
37:06
in a neighborhood. So a lot of Linux ended
37:08
up like that. A lot of Linux is
37:11
not being really watched. Linux
37:13
is an umbrella term. So it means everything
37:15
from a big GPU cluster up in the cloud
37:17
down to these Raspberry Pis that I'm surrounded
37:19
with here at home, right? So, so when someone
37:21
says, oh, we're going to load an agent
37:23
on Linux, my first question is, what, you know,
37:25
what distribution? you know, what hardware,
37:28
you know, what patches, can I update my system?
37:30
If I update the agent, is it going
37:32
to break the system, right? It's critical infrastructure. So
37:34
you find out the marketing says we're going
37:36
to do this, but the actuality is very, very
37:38
narrow in terms of what they can actually
37:40
do. So with all these problems
37:42
we find with stability and performance impacts in
37:45
the agent, I said, what if we got
37:47
rid of the agent? All right, this is
37:49
the biggest problem. So what if we just
37:51
got rid of it? Well, if we got
37:53
rid of it, then at that point, we
37:55
could run on way more Linux systems. We
37:57
could run on systems that are over a
37:59
decade old, all the way back to Linux
38:01
kernel 2 .6 .32, all the way up to
38:03
a modern system. We could run on Intel,
38:05
AMD, ARM, MIPS, IBM Power, and IBM S390
38:07
CPUs. So that means we cover everything, yeah,
38:09
everything from cloud systems to embedded devices, you
38:12
know, Ubiquiti routers, Synology systems,
38:14
appliances. Raspberry Pi,
38:16
we run on IP cameras.
38:19
So you might not think, but there
38:21
are people who sell access to
38:24
cameras. So you say, I want to
38:26
see all the cameras in Boston. There are
38:28
Israeli companies that are like, right, we can get you
38:30
access to those cameras. And it's like on-the-
38:32
ground recon, right? So you don't want people in
38:34
your cameras. But the idea here is, let's
38:36
watch everything and do it without an agent so
38:38
we get this massive visibility. And that's really
38:40
what Sandfly is. That's kind of how it came about.
38:43
I just got tired of seeing unwatched Linux.
38:45
It just it really bugs me because I know
38:47
it's a really big problem. A really big
38:49
threat. Yeah, that's a good point. A lot of
38:51
like especially embedded devices and stuff like where
38:53
they don't even bother to do updates or anything.
38:56
And so like that plus all the
38:58
companies who just assume that everything
39:00
is good to go and don't even
39:02
bother to do any updates is. Yeah,
39:05
absolutely. There's IoT devices that run Linux,
39:07
but like you said, they don't get
39:09
patched. And I don't even know the
39:11
company's names. Like you purchase these
39:13
things for cameras where I've never even heard of
39:15
them before. Medical devices
39:17
are another big problem. You know, I
39:20
tell people like one of my nightmares
39:22
is I wake up in the hospital
39:24
after an auto wreck and I see
39:26
the morphine drip has a Wi-Fi
39:28
sticker on it. Yeah, just like it's
39:30
like the it's a bad, you know,
39:32
we recently last week, you know, some
39:34
heart monitors out of China, they were
39:36
found to have a backdoor communicating back
39:38
to IP addresses in China with patient
39:41
doctor information that was built into the
39:43
firmware. So it's just like, yep, you
39:45
know, it's all over and
39:47
it's not it's not closely watched.
39:50
Yeah, that's true. And I think people
39:52
have, like, a false sense of
39:54
security in a way too,
39:56
because it's known as being
39:58
so good in terms of security.
40:01
They just kind of assume they don't
40:03
have to do anything. But we were
40:05
talking about the agentless stuff, and
40:07
the agentless stuff basically sounds kind
40:09
of, like, almost magical. So could you
40:11
peel back the curtain a bit for
40:13
that? And, you know, without giving away
40:16
the secret sauce, unless that's, that's your
40:18
discretion. Yeah. And explain how Sandfly
40:20
pulls that off in a way that's
40:22
different from the traditional approach. Sure.
40:24
So the basic function of the
40:26
product is we work a lot like
40:28
Ansible in that we come in
40:30
over SSH, because SSH is a ubiquitous protocol
40:32
on Linux. We
40:34
have a specialized binary written
40:36
in Golang that is
40:39
purpose built just to do
40:41
Linux forensics investigation, meaning
40:43
that it's fully self -contained.
40:45
It's statically built. We don't
40:47
trust the remote system. We assume the
40:49
remote system is compromised and that the
40:51
libraries are all compromised. That's trying to
40:53
trick us. And then it's specially designed
40:55
to investigate for processes, files, directories, users,
40:57
systemd services, scheduled tasks, and also mechanisms
40:59
to de-cloak people who are trying to
41:01
hide, like loadable kernel module rootkits, things
41:03
that are trying to hide from us.
41:05
We have mechanisms built for that. So
41:07
essentially it goes over, it will investigate
41:09
the areas we instructed to, and then
41:11
that forensic data is removed from the
41:13
system and then taken back to our
41:15
server where we could further analyze it.
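For a rough picture of what agentless-over-SSH can look like in practice, here is a minimal Go sketch using the golang.org/x/crypto/ssh package: log in with a key, run one read-only command, and bring the output back for analysis, with nothing installed on the target. The host, user, key path, and command are placeholder assumptions, and this illustrates only the general approach, not Sandfly's actual binary or workflow.

package main

import (
    "fmt"
    "log"
    "os"

    "golang.org/x/crypto/ssh"
)

func main() {
    // Placeholder values; a real tool would take these from configuration.
    host := "198.51.100.10:22"
    user := "audit"

    key, err := os.ReadFile("/home/audit/.ssh/id_ed25519")
    if err != nil {
        log.Fatal(err)
    }
    signer, err := ssh.ParsePrivateKey(key)
    if err != nil {
        log.Fatal(err)
    }

    config := &ssh.ClientConfig{
        User: user,
        Auth: []ssh.AuthMethod{ssh.PublicKeys(signer)},
        // A real deployment would pin the host key instead of ignoring it.
        HostKeyCallback: ssh.InsecureIgnoreHostKey(),
    }

    client, err := ssh.Dial("tcp", host, config)
    if err != nil {
        log.Fatal(err)
    }
    defer client.Close()

    session, err := client.NewSession()
    if err != nil {
        log.Fatal(err)
    }
    defer session.Close()

    // Run one read-only command on the remote host and pull the result back
    // to this side for analysis; nothing stays resident on the target.
    out, err := session.Output("cat /etc/passwd")
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("collected %d bytes of forensic data from %s\n", len(out), host)
}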
41:17
But not just for signs of intrusion,
41:19
we also look for things such as
41:21
SSH keys. We track SSH keys.
41:23
We want to know who's using those keys, where
41:25
they are, first time we saw them, what they
41:27
have access to. We could do drift
41:30
detection, meaning we could tell you if a new
41:32
process has started that we've never seen before. New
41:34
users have been added, things like that. And
41:36
we could also do things like a password
41:38
auditor. We could send over a list of
41:40
the worst passwords for Linux, and we could
41:42
check your user accounts directly on the device
41:44
to say, hey, this camera has some accounts
41:46
that are going to get you compromised.
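A rough sketch of the drift-detection idea mentioned here: compare the command names currently visible under /proc against a previously saved baseline and report anything new. The baseline file name is a hypothetical placeholder, and a real product would track far more than command names.

package main

import (
    "bufio"
    "fmt"
    "log"
    "os"
    "path/filepath"
    "strings"
)

func main() {
    // Load the previously recorded baseline: one command name per line.
    baseline := map[string]bool{}
    if f, err := os.Open("process-baseline.txt"); err == nil {
        s := bufio.NewScanner(f)
        for s.Scan() {
            baseline[strings.TrimSpace(s.Text())] = true
        }
        f.Close()
    }

    // Every numeric directory under /proc is a running process;
    // /proc/<pid>/comm holds its command name.
    paths, err := filepath.Glob("/proc/[0-9]*/comm")
    if err != nil {
        log.Fatal(err)
    }
    for _, p := range paths {
        data, err := os.ReadFile(p)
        if err != nil {
            continue // process may have exited between listing and reading
        }
        name := strings.TrimSpace(string(data))
        if !baseline[name] {
            fmt.Printf("new process not in baseline: %s (%s)\n", name, p)
        }
    }
}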
41:49
But the idea here is to cover a wide range of
41:51
these Linux, all these
41:53
Linux attack vectors. Because as
41:55
I discussed earlier, Linux is not
41:57
Windows. So Windows attackers tend to
41:59
use like a pre -compiled binary. Windows
42:02
is great in that binaries are hyper
42:04
compatible across all the systems. It's not
42:06
necessarily true on Linux. But
42:08
Linux attackers tend to work differently
42:10
too because they use a lot of
42:12
living off the land tactics, meaning
42:15
that built-in commands. And I tell
42:17
people, Linux is not an operating system.
42:19
Linux is a programming language. You
42:21
can just string together all the Linux
42:23
commands that are benign and I can
42:25
make them very malicious. That's
42:27
just the power of Linux. That's why we love the
42:30
thing. That's how the
42:32
attackers know this. A lot of attacks might
42:34
not load any malware at all. They might
42:36
simply use the built -in utilities of Linux to
42:38
execute their attack. Think how many systems have
42:40
Netcat built in. You know, or even
42:42
Nmap. Why do you have a port scanner
42:44
loaded by default? I don't know. Why not?
42:46
It's Linux, right? You know, or
42:49
just basic bash shell could run
42:51
a reverse shell if you know what
42:53
commands to run. So for us,
42:55
the secret sauce, as it were, is
42:57
just that we know how these
42:59
tactics work. We're specially designed just for
43:01
this task. We're not doing configuration
43:03
management. We're not tracking vulnerabilities. We just
43:05
don't care. We're really looking for
43:07
compromised Linux systems and for the
43:09
things that are going to get you compromised such
43:11
as for instance again, a private key hanging
43:13
around that's unencrypted or keys that might be moving
43:15
things like that. So that's kind of the
43:18
whole, that's it at
43:20
a high level, what we're
43:22
doing. Well, you know, we
43:24
were just talking about on
43:26
the show how Linux powers
43:28
everything from servers to IoT
43:30
devices and routers, but it's
43:32
often seen as secure by
43:34
default. How is the landscape
43:36
changing where we can no
43:38
longer just assume Linux doesn't
43:40
require additional security? You've
43:42
touched on this a little bit, but can you
43:44
expand? Linux is
43:46
secure by default if you don't do anything
43:48
with it other than set it up and plug
43:50
it in, right? So like, you know, if
43:52
I put up an Ubuntu server, Red Hat,
43:54
CentOS, and it's got SSH on port 22, the odds of getting
43:57
hacked are very, very low,
43:59
but nobody uses a system in
44:01
that configuration. They're loading up other
44:03
applications, third-party software, they're making
44:05
changes, they're making
44:07
temporary fixes, things like
44:09
that. So that's how Linux gets hacked,
44:11
or it's a user gets an
44:13
SSH key stolen or they pick a
44:16
bad password or something. That's how
44:18
it's getting hacked. So while it is
44:20
secure out of the box, for
44:22
the most part, is what people do
44:24
to it once it's put in
44:26
an operational setting, that's where the problems
44:28
begin. So you look at something
44:30
like Log4j that hit systems a couple
44:32
of years ago. It was buried
44:34
deep down inside a complicated Java library,
44:36
and there's no way anybody
44:38
is going to find it, right? Not, not... you
44:40
know, I mean, obviously somebody found
44:42
out how to exploit it. But just from
44:44
a Linux security perspective, there isn't, there's no
44:46
change you could have made to that default
44:48
Linux box that would have really helped you.
44:50
Maybe if you had SELinux enabled, it would
44:52
have prevented the attack. But you
44:54
know, there's, there's other things like that
44:56
as well, just a badly configured server, again,
44:58
bad passwords, things like that. So I
45:01
would never assume Linux is secure if
45:03
people are using it in operation. You got
45:05
to keep an eye on it. I
45:07
think would be the short answer there.
45:10
I think that's really important lesson for
45:12
people because we, I think, take for
45:14
granted how protected we've been in the
45:16
Linux world. You know, you compare it
45:18
to giving your family members a Windows
45:20
PC. They're going to have malware and
45:22
everything on it maybe within a few
45:25
hours if they don't know the internet
45:27
very well. Whereas Linux... Unfortunately, they
45:29
can't get their toolbars anymore. Yeah, yeah.
45:31
With Linux, you know, you could
45:33
give your family a secure system and
45:35
you could go a very long time without
45:38
having to worry about, oh, my desktop's
45:40
taken over and ads are popping up
45:42
every time, or malware, or all
45:44
that type of stuff. But the landscape
45:46
is changing quite a bit. Um, and
45:48
Linux is still very secure by default,
45:50
but what I mean by changing is
45:53
there's so many things that utilize extensions
45:55
these days, like Python, you know,
45:57
there's these extensions that community members are
45:59
writing. We just assume that
46:01
it's secure when we're installing these things because
46:03
we read this article that says, oh, you
46:05
want to get this module working, then install
46:07
this pip, and then you get this extra
46:09
code and it works and you're great. And
46:11
then we're assuming they're doing due diligence, but
46:13
wait, they're not. Yeah, it is a problem.
46:15
Yeah, you know, the supply chain attacks have
46:17
been a problem for
46:20
many years. You know, in our product, for
46:22
instance, generally speaking, like, if
46:24
we use a third party library, it's usually
46:26
coming from Google or something we versioned into
46:28
the product and we hand audited the whole
46:30
thing, right? But you know, even today I
46:32
read about people using VS code and there
46:34
was a theme out there that was malicious. You
46:37
know, we even instruct our employees, like, any
46:40
browser plugin, you have to get approval
46:42
from us before loading it. You know, same thing
46:44
for these things. I mean, I would just
46:46
be... I urge people to be exceedingly careful
46:48
if you're using these plugins and these themes
46:50
and stuff if they're not coming from the
46:52
vendor or from a vendor you trust, you
46:54
know, like, you know, Google or something
46:56
like that, right? I'd be exceedingly careful. I
46:58
mean, if you're pulling random code
47:00
from ninja 69 off GitHub... you know, ninja
47:02
69 might be a cool dude, but you know,
47:04
ninja 70 who takes the project over in
47:07
a year and a half He might not
47:09
be so nice and you don't know right
47:11
the maintainers change and you're like I
47:13
don't know, right? You know, things
47:15
can happen. So I want to ask a
47:17
follow-up to this. You know, we're
47:19
talking about kind of security in general,
47:21
but there's always a question in the Linux
47:23
community, we've tried to cover it multiple
47:25
times on the show: are antivirus
47:27
or anti-malware solutions needed on Linux?
47:29
Do you feel like the answer is
47:31
changing, and where do you stand on
47:33
this so far? Yeah, we get asked
47:35
that certainly about ransomware. They're like, why
47:37
is ransomware taking off more on Linux
47:39
and stuff? And it has, it's in
47:41
like ESXi systems and stuff, which are
47:43
not really Linux, but that's a separate
47:45
topic. I think Linux is funny because
47:47
it's not, each Linux system
47:49
is almost like a bespoke custom thing
47:51
when an attacker gets onto it. You
47:53
think about, you know, you could take
47:55
a basic Linux and then the power
47:57
of it is you can customize it.
47:59
You have the sources, you can compile,
48:02
you can do all these things. The
48:04
malware has a harder time, I think,
48:06
really spreading because of that. But with
48:08
that said, a lot of
48:10
desktop distros tend to be fairly uniform,
48:12
and we are seeing some malware starting
48:14
to go after desktop users. Do
48:16
you need to run anti-malware or antivirus? Yeah,
48:19
I don't know that that's probably up
48:21
for the end user to decide I
48:23
would just say that a lot of
48:25
antivirus scanning is kind of a Windows
48:27
idea, this idea that I'm gonna scan
48:29
all these files and they're not gonna
48:31
change, you know, the binaries tend to
48:33
stay the same. Linux malware tends to
48:35
be very very fluid a lot of
48:37
its open source It's easily recompiled. It's
48:39
easily modified. It's easily obfuscated
48:41
So the anti-malware stuff on
48:44
Linux from my experience is like,
48:46
yeah, it can maybe find really
48:48
low -grade stuff, but really the more
48:50
sophisticated or even moderately sophisticated, it's
48:52
gonna have a tough time. That's
48:55
my experience with it in general.
48:57
So do you need to run it?
48:59
Your system like Sandfly would catch
49:01
it at the point where it's starting
49:03
to change files or somebody's getting
49:05
into the system utilizing something like that,
49:07
right? Yeah, we tend to be
49:09
what's called a tactics hunter. So
49:11
we're, if you run our product, we're not
49:13
looking for the specific piece of malware. Like it
49:16
could be like the Fubar A worm. We
49:18
don't care because tomorrow is Fubar
49:20
B, Fubar C, Fubar D, but the
49:22
tactics of how they maintain presence
49:24
on these systems is really what we're,
49:27
you know, how did they hide? How did
49:29
they get in? So I use an example
49:31
of a log file cleaner. So on
49:33
Linux, you have some log files like wtmp
49:35
and utmp. These just show who logged in.
49:38
So there's three main ways to modify
49:40
log file. One, you can nuke it
49:42
from orbit. I just delete the thing.
49:44
But that's fairly obvious and crypto miners
49:46
do it because they just don't care.
49:48
They're just trying to steal some CPU. The
49:51
second way in is I could go to the
49:53
line that showed me logged in and I could
49:55
overwrite it with nulls, overwrite it with zeros. Because
49:58
the Linux auditing tools, when they see an invalid entry,
50:00
they don't say this is invalid. They just skip
50:02
it, don't tell you anything. The
50:04
third way is I could delete the entry
50:06
and then splice the log back together again, kind
50:08
of like you're editing a film. And
50:10
at that point, you just won't see the entry. So
50:12
these are the three ways the logs can be edited. Now,
50:14
there are dozens of log cleaners that do this. The
50:17
traditional Windows malware approach would be like,
50:19
we're going to scan the binaries for these
50:21
signs of these log cleaners, where our
50:23
approach is like, we're just going to look
50:25
at the log. Because if we
50:27
could tell you nuked it, or if we could tell
50:29
that lines were overwritten with zeros, which is
50:31
very unusual on Linux, or we could see a splice
50:33
has happened, I don't care what
50:35
tool did it. It's completely irrelevant. We
50:38
made the tactic itself no longer work.
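To make the tactic angle concrete, here is a minimal sketch of the kind of check being described, assuming the common glibc wtmp layout on 64-bit Linux (384-byte records, ut_type at offset 0, ut_user at offset 44, ut_tv.tv_sec at offset 340); adjust the sizes if your platform differs. It is an illustration of the idea, not Sandfly's implementation, and legitimate empty records or clock changes can trip it, so treat hits as leads rather than verdicts.

```python
import struct
from datetime import datetime, timezone

RECORD_SIZE = 384          # assumed glibc utmp record size on x86_64
WTMP_PATH = "/var/log/wtmp"

def scan_wtmp(path: str = WTMP_PATH) -> None:
    last_seen = 0
    with open(path, "rb") as f:
        index = 0
        while True:
            record = f.read(RECORD_SIZE)
            if len(record) < RECORD_SIZE:
                break
            if record == b"\x00" * RECORD_SIZE:
                # Tactic 2: an entry overwritten with nulls. Stock tools skip it silently.
                print(f"record {index}: entirely zeroed out (possible nulled login entry)")
            else:
                (ut_type,) = struct.unpack_from("<h", record, 0)
                user = record[44:76].split(b"\x00", 1)[0].decode(errors="replace")
                (tv_sec,) = struct.unpack_from("<i", record, 340)
                if tv_sec and tv_sec < last_seen:
                    # Tactic 3: timestamps running backwards can indicate a spliced log.
                    when = datetime.fromtimestamp(tv_sec, tz=timezone.utc)
                    print(f"record {index}: timestamp {when} precedes the previous entry "
                          f"(type={ut_type}, user={user!r}), possible splice")
                last_seen = max(last_seen, tv_sec)
            index += 1

if __name__ == "__main__":
    scan_wtmp()
```

Tactic 1, the log simply being deleted or truncated, is even easier to spot: the file is missing or suspiciously tiny for a machine with any uptime.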
50:40
That's how we get into it, even with some of the more
50:43
advanced, novel malware. Malware they say is
50:45
undetectable has been detected. We can often
50:47
find it because the tactics are
50:49
Revealing, right? You know Linux is the
50:51
premier open source operating system in the
50:53
world and the word is open, right?
50:55
So the minute we see something
50:58
trying to hide You've got our attention,
51:00
right? And I
51:02
actually like it when you
51:04
try to hide you make my
51:06
job easier I have a video on
51:08
YouTube called ninja on rooftops, which
51:10
explains this concept Okay, the idea
51:12
with ninja on rooftops is You want to
51:15
break into a bunch of houses? So you
51:17
go on YouTube and the guy's like yeah
51:19
put on your ninja suit, start running around, no
51:21
one sees you So you run around you
51:23
run across the roofs, you're just in black,
51:25
and you're getting away with Murder, but that
51:27
works great until the first time someone sees
51:29
you and at that moment the fact that
51:31
you're wearing a ninja outfit in the middle
51:33
of the night has everyone's attention, and they're all
51:36
looking for you right and that's the same
51:38
Michael Yeah, that's the same
51:40
with a lot of the stealth malware
51:42
and hiding right the minute you hide
51:44
in my plans Yeah, exactly. Yeah, the
51:46
minute you hide if we see how
51:48
you hide at that point We know
51:50
it's absolutely malicious intent and we could
51:52
see you everywhere. Oh, that's
51:55
interesting. So there you go. That's why
51:57
we focus on tactics hunting versus specific
51:59
malware signatures. Nice, you know,
52:01
there was a major breach
52:03
that took place with telecom not
52:05
so long ago and What
52:07
was interesting is... Was it ninjas?
52:09
It was, of course, on rooftops,
52:11
naturally michael. But what was interesting
52:13
about it is some of the
52:15
companies did catch it before it
52:17
got through. which wasn't as reported.
52:19
But of the companies that got
52:21
caught, you were talking earlier about
52:23
the fact that there are systems
52:25
that don't get looked at. And
52:28
as I understand how this hack
52:30
worked is essentially telecom companies that created
52:32
backdoors for government operations when they
52:34
had certain subpoenas and different things to
52:36
be able to go in and
52:38
look at those records. And of course,
52:40
Since it's just supposed to be
52:42
for the government only the government used
52:44
it and nobody monitored the fact
52:46
that people were going on to these
52:48
systems and using them for nefarious
52:50
purposes because it's just for the government
52:52
and so that's how they kind
52:54
of got got and it's it's those
52:56
like you said those little things
52:58
number one because it's just this little
53:00
section of an overall telecom business
53:02
this little back door that you put
53:05
in it's just for government officials
53:07
that get got. But also they didn't
53:09
try to really hide because nobody
53:11
was watching So they were just going
53:13
through and getting into the systems
53:15
and started pulling all of the records
53:17
that the government was interested in
53:19
the people they had subpoenaed and other
53:21
stuff. So it was kind of
53:23
an interesting But it's only for the
53:25
government how dare they yeah, I
53:27
know right I always tell people Michael
53:29
Michael it was a government. It
53:31
just wasn't the US government. Oh, right
53:37
Exactly. It's probably accurate. And also,
53:39
just by the way, I had
53:41
to look it up. GitHub,
53:43
Ninja69 and Ninja70 do exist,
53:45
but no one sees them. Oh,
53:48
there you go. I
53:50
don't want to give you any ideas. So
53:55
Craig, if I see
53:57
a Ninja with Sandfly Security,
54:00
there is an update, and it's there
54:02
in a minute, because there's a
54:04
Ninja. Exactly.
54:07
That's how you know ninjas are real because you never see
54:09
them and that proves they're real. That's
54:11
right. You never know what you're going
54:13
to run across one. Love
54:15
it. A lot of our listeners
54:17
work in the field during the
54:19
enterprise and then there's a lot
54:21
of listeners who are wanting to
54:23
break into the field or they
54:25
are just working enthusiasts who manage
54:27
home labs or servers at home.
54:29
And maybe one day, hoping that
54:31
they'll have a chance to break
54:33
into the industry in one way,
54:35
their career being focused around Linux
54:37
and cybersecurity and those things. So
54:39
what's some career advice that you
54:41
would give for people who are
54:43
looking to maybe become an employee
54:45
at Sandfly one day, let's say?
54:47
That's an interesting question. When I
54:49
got in the industry, there were
54:51
no certifications. There was no cybersecurity
54:53
degree. Just none of this
54:56
stuff exists. I think it was just
54:58
the people who did it, did it
55:00
out of pure burning interest and passion,
55:02
right? I've always liked the topic. I
55:04
think in general, we still
55:06
kind of adopted that same philosophy. We
55:08
just want to get people who are
55:10
really good. I don't even necessarily look
55:12
at your degree. One of the best
55:14
developers I've ever worked with was a
55:17
high school graduate only, right? So I
55:19
think more and more especially, it's
55:21
more of show me the code. I
55:24
think you've been seeing Elon and these guys kind
55:26
of the same way. The way they're hired is show
55:28
me your GitHub repo. And, you know,
55:30
so I'm not, you know, that might be
55:32
one thing to look for is like, you know,
55:34
I would, if I was going to cybersecurity, there
55:37
are a lot of people who say you don't need to learn how to code. My
55:39
advice is I would ignore that advice.
55:41
It helps you immensely. I'm
55:44
not asking, look, you don't need
55:46
to be the second coming of, you
55:48
know, the greatest coder ever. But
55:50
I think to have some familiarity with
55:52
a language like Python, which is
55:54
common, or C, of course, which is all
55:56
over. Maybe, you know, one of the
55:59
new memory-safe languages like Go or
56:01
Rust might be useful as well. But
56:03
having basic, fundamental knowledge of programming
56:05
would be helpful. I would also encourage
56:07
people, um, you know, to have some basic
56:09
network experience as well. And you know,
56:11
a lot of people who are really good
56:14
at cybersecurity started off on the admin
56:16
side. They knew how to run and maintain,
56:18
Linux networks. And
56:20
they ran into problems, because a lot
56:22
of times security is really just problem solving
56:24
on a different level, right? So being able
56:26
to debug network and Linux system issues and
56:28
things like that. That's how you get
56:30
into the more deeper forensics and understanding of
56:33
intrusions. And of course, just a natural interest
56:35
as well. You know, there might be some
56:37
Interesting courses, you know, you could take out
56:39
there for basic hacking and red team. I
56:42
wouldn't get obsessed with the certificates.
56:44
Like I said, we don't really look at
56:46
them, but if it helps you to become
56:48
better at your skill, you might want to
56:50
consider doing that. So I would try to
56:52
focus on practical knowledge. The other
56:54
thing I would say about cybersecurity is when I started,
56:56
it was, you could be very
56:58
broad, but now you got
57:00
to, I think eventually, yes, you want
57:03
some breadth in your knowledge, you
57:05
need to have some specialization. It's just
57:07
such a deep field that, you
57:09
know, I would encourage you to have
57:11
broad understanding, sure. Do you need
57:13
to understand basics of cryptography? Sure. Do
57:15
you need to be a crypto
57:17
analyst? No, probably not. But you might
57:19
want to have a specialty where
57:21
you're like, I focus on network attacks,
57:24
detection, or I focus on, like in my case,
57:26
I tend to focus just on Linux forensics. I
57:28
don't really know much about Windows anymore. I had to
57:30
specialize. I think that's what you're starting to find
57:32
more and more is you need to pick something you
57:34
like doing. It might be Windows. I'm not knocking
57:36
if you like Windows, but it needs to be,
57:38
I think you need to have some type of
57:40
specialty would be really helpful and some technical knowledge
57:42
to back it up. I think that's
57:44
fantastic advice and i know you know i
57:47
grew up my father had a small computer
57:49
business and i remember we would have people
57:51
come in for interviews and the best people
57:53
we hired were the ones who were. In
57:55
the basement building computers all the time and
57:57
working on them and fixing them and the
57:59
ones that came in with the degrees. Like
58:02
they had the book knowledge but then we
58:04
one of our tricks we would do is
58:06
a test for interviews we would switch the
58:08
power supply in the back from
58:10
110 to 220, and so we'd
58:12
be like you broke fix it and the
58:14
ones who were you know the basement dwellers
58:16
working on their computers all the time. Maybe
58:18
five minutes and they would see that switch
58:20
and flick it back the ones that just
58:22
went to school and had all the certificates
58:24
they never would solve it just there just
58:26
perplexed, tearing everything out of the machine, putting
58:28
it back in like. They had the book
58:30
knowledge, but they didn't have that experience and
58:33
that love and that curiosity, I
58:35
think would probably be something that would
58:37
be really important for there. You know,
58:39
we talked about, or you
58:41
talked about for a moment, kind of desktop
58:43
distros, you talked about being surrounded by
58:45
Raspberry Pis and things. So what Linux distro
58:47
do you run out of curiosity? Keeping
58:51
with my off -sec, I won't
58:53
tell you. I love
58:55
it. I won't tell you.
58:58
I'm sorry. Um, so because
59:00
like I said, like my name
59:02
is confirmed. No, I'll
59:04
tell you this: I don't run Kali.
59:06
I don't run Kali Linux. But I
59:08
don't mean to be hyper paranoid, but
59:10
it's like, uh, as I think I
59:13
told you guys early on, my, my
59:15
name has actually shown up inside
59:17
Linux malware as a variable name, right?
59:19
So there are people out there
59:21
who will target me specifically so I
59:23
we understand we don't even if you
59:25
see my videos if you think you
59:27
know what distro I'm running. I might
59:29
just change it. You just don't really
59:31
Yeah, the next time I'm not doing
59:33
Ryan what it is. Yeah I've tried
59:35
almost all the desktop distros. They're so
59:38
good today I know people joke about
59:40
Linux on the desktop, but I switched
59:42
from Mac Several years ago just the
59:44
full -time Linux on desktop. I don't
59:46
miss a single thing about the Mac
59:48
So most all the desktop distros
59:50
today are really, really good.
59:52
It's just amazing how much better
59:54
they've gotten in just the years i've
59:56
been in linux it's just been incredible
59:58
change. But speaking of desktop users, Sandfly
1:00:01
introduced something really cool this week,
1:00:03
and i wanna make sure we have
1:00:05
a chance to talk about that so
1:00:07
what were you
1:00:09
all up to this week you cookin
1:00:11
something cool yeah so over the years
1:00:13
we've generally been viewed as kind of
1:00:15
an enterprise security software product.
1:00:17
now we always offered a free version of
1:00:19
the license because you know we we contribute
1:00:21
to open source projects, we donate money and
1:00:23
stuff so we kind of like to have
1:00:25
something free for the linux users but the
1:00:28
free version had restrictions in it such as
1:00:30
how many you could only see five results
1:00:32
at a time and other things are shut
1:00:34
off and so you could use it but
1:00:36
it was kind of a hassle so over
1:00:38
time people have said hey look i'm willing
1:00:40
to pay for a license But
1:00:43
can you sell me a license? But we
1:00:45
just didn't really have it set up to
1:00:47
do that. So this week, we introduced some
1:00:49
new licensing tiers. So we have a home
1:00:51
user edition, a pro license, which is more
1:00:53
commercial user, and then our traditional air gap
1:00:56
license, because our product works on air-
1:00:58
gapped networks. So the
1:01:00
home user license is designed really for
1:01:02
a home lab. So it's basically
1:01:04
for $99 a year, you can watch
1:01:06
10 hosts without pretty much any
1:01:08
restrictions. The only restrictions we really have are
1:01:10
like there's no SSO. There's no,
1:01:12
you can't export to Splunk. So like
1:01:14
these enterprise features are not there,
1:01:17
but everything else in the product is
1:01:19
basically unlocked. So at
1:01:21
that point, you know, $99 for 10
1:01:23
hosts, we think it's a pretty
1:01:25
good deal. We also for, because
1:01:27
we're destination Linux fans, we also
1:01:29
are offering a coupon code. If you
1:01:31
go to the website, you put
1:01:33
a destination 50 and the coupon code
1:01:35
will give you 50 % off. So
1:01:38
for about 50 bucks a year
1:01:40
for 10 hosts, we think it's a
1:01:42
pretty good deal. All the features
1:01:44
are basically unlocked and you could use
1:01:46
the product at home. This
1:01:49
isn't an academic thing. A
1:01:51
lot of nation -state hackers
1:01:53
have been going after SOHO
1:01:55
and home networks and using
1:01:57
them as relay points to
1:01:59
attack nation-state critical infrastructure. So
1:02:01
we actually want you to check your Linux systems
1:02:03
at home. We actually feel it's an important thing
1:02:05
to do for various reasons. But yeah, get the
1:02:07
product to go. We feel
1:02:09
it's a pretty reasonable price for a
1:02:11
full -featured, agentless Linux EDR. Very
1:02:14
generous too. 10 hosts is extremely
1:02:16
generous, I think more than what most
1:02:18
people would need. Jill needs about
1:02:20
100,000 of those licenses because she
1:02:23
runs a museum, but I've heard about
1:02:25
the museum. Maybe
1:02:28
think about it, it's like it's less than a
1:02:30
dollar a host, really. Yeah. Yeah. And
1:02:32
that's with the regular one. If you have
1:02:34
the coupon code destination 50, it's even
1:02:36
less. It's 50 cents a host. So,
1:02:38
I mean, yeah. I think the education
1:02:40
piece of this is fascinating, though, too. I
1:02:42
love what you said that, hey, this
1:02:45
isn't just about education, but hey, all those
1:02:47
people, I mean, we get Questions all
1:02:49
the time people wanting to break in the
1:02:51
industry and things you gotta use these
1:02:53
tools you gotta understand how they work you
1:02:55
wanna be able to show and demonstrate
1:02:57
if you're going into an interview that you've
1:02:59
you know utilized and played with enterprise
1:03:01
grade software before and integrated with it and
1:03:03
so i think it's very important thing
1:03:05
from an education standpoint. People check
1:03:07
this out as well. It's an incredible
1:03:09
offer. And if you haven't checked out Sandfly's
1:03:11
interface, by the way: who writes
1:03:14
your guys' GUI? Because it's incredible, man.
1:03:16
Yeah One of our team members down
1:03:18
here in Christchurch, and I'm sure he'll listen
1:03:20
to this part of the show and
1:03:22
he'll be quite happy to hear it Yeah,
1:03:24
we we have a Definitely a lot
1:03:26
of compliments. We put a lot of work
1:03:28
into the user interface just because if
1:03:30
the product is easy to use and easy
1:03:33
to understand, it helps a lot.
1:03:35
So, you know, Linux in particular, I'd
1:03:37
say for your people who want to get into cybersecurity, I
1:03:39
tell people I would focus
1:03:42
on Linux because it is our
1:03:44
experience that in the InfoSec
1:03:46
teams and big companies even, they'll
1:03:48
have a lot of people doing Windows
1:03:50
and a relatively small number of people looking
1:03:52
after a lot of Linux boxes. They're
1:03:55
horribly outnumbered. So if you were to start
1:03:57
putting a focus, you're like, I want
1:03:59
to get into cybersecurity. I
1:04:01
like Linux. Yes. Those are good
1:04:03
things to marry together. You
1:04:05
could walk into a very high -paying job.
1:04:07
If you understand Linux intrusion detection, incident
1:04:10
response and forensics, you become extremely
1:04:12
valuable, much more than someone focusing
1:04:14
on Windows. That's my advice. Love
1:04:16
it. It's kind of like
1:04:18
the reverse of saturation because there's
1:04:20
so much abundant opportunity at this point.
1:04:23
You were talking about the UI
1:04:25
and Ryan was complimenting it. You
1:04:28
said something that was really Interesting
1:04:30
to me because it's about making
1:04:32
it easy to use, but also
1:04:34
the back end is still being
1:04:36
is very powerful to be able
1:04:38
to do all the detections and
1:04:40
everything. And as a UX designer
1:04:42
myself and a marketer myself, this
1:04:44
was something that it basically like, I don't
1:04:46
know if you could tell, but as soon
1:04:48
as you said it, I was just smiling
1:04:51
because it's like that's exactly thank you because
1:04:53
a lot of people don't look at the
1:04:55
fact that the front end is just as
1:04:57
important as the back end. The back end
1:04:59
needs to do everything but if people are
1:05:01
not gonna enjoy using it or not gonna
1:05:03
do it very efficiently. It's
1:05:05
not gonna let the back end
1:05:07
work as well. So I really
1:05:09
like the fact that Sandfly takes
1:05:11
that into consideration. Yeah, great,
1:05:13
i appreciate that i'll pass it on
1:05:15
And also, I think it's
1:05:17
time in the show, it's the time in the interview where
1:05:19
we're gonna talk about AI. Everybody
1:05:21
are Michael every show you got to
1:05:23
bring this up. I'm not
1:05:25
really here by the way. I I'm
1:05:27
actually in the Bahamas. I'm having my
1:05:29
A .R. So we've
1:05:32
been covering AI
1:05:34
a lot
1:05:36
on this show
1:05:38
and sorry. We're
1:05:41
sorry and you're welcome. And it
1:05:43
continues to emerge as like the
1:05:45
next big technology, even though it's
1:05:47
already been a technological breakthrough for
1:05:49
a long time it's still continuing
1:05:51
to grow. So what kind of
1:05:53
threats does AI now add into
1:05:55
the landscape, and how is Sandfly
1:05:57
planning to incorporate AI and
1:05:59
overcome these threats. I
1:06:02
think we've been seeing AI being
1:06:04
used to make a lot of phishing
1:06:06
campaigns a lot more professional. Right
1:06:08
like i said it's no more
1:06:10
broken english from the nigerian prince it
1:06:12
is you know pretty sophisticated you're
1:06:14
not gonna find typos. We've
1:06:17
had also at our company we've had
1:06:19
email show up claiming to be for
1:06:21
me right to our employees saying hey
1:06:23
contact me it's an emergency on this
1:06:25
phone number and stuff like that so
1:06:28
you start to get more sophisticated obviously
1:06:30
the voice cloning is a really really
1:06:32
big problem. You know i
1:06:34
was just called my bank the other day
1:06:36
like hey i'm gonna send you over to
1:06:38
your voice verification prompt and i was like
1:06:40
i would wonder if the voice cloning would
1:06:42
bypass this. You know what I mean? So
1:06:44
you're starting to think about that or just
1:06:46
scams in general. We're getting a lot of
1:06:48
that. So on the hacking side of things,
1:06:50
I've seen some buzz around some exploits. I
1:06:52
don't know if it's quite there yet. I
1:06:55
think that the AI tools
1:06:57
have been useful in some
1:06:59
regards for some malware analysis. The
1:07:02
thing with the LLMs that I've
1:07:04
seen so far is you have
1:07:06
to know what question to ask
1:07:08
and they could get very confident
1:07:10
wrong answers. Right, so if you
1:07:12
know what questions to ask you kind of
1:07:14
know what the answer is they could be
1:07:16
helpful. So for instance we've used some of
1:07:18
our Linux forensics data to pass in some
1:07:21
AI LLM some of them are pretty good
1:07:23
almost, kind of, like a junior analyst. But
1:07:25
you really need to know the questions to ask, so
1:07:27
i still think they need a bit of assistance
1:07:29
in terms of the prompting is to say
1:07:31
well look at this and have you consider this
1:07:33
and they could sometimes glean something that you
1:07:35
might have missed like for instance like well this
1:07:37
date indicates something that happened 13 days ago
1:07:39
but this other date here says it happened now
1:07:41
why is this discrepancy there and that's i'm
1:07:43
like oh yeah you're right the malware messed up
1:07:45
the timestamp,
1:07:48
the timestamp tampering they tried to do was
1:07:50
wrong. So there are little things like that
1:07:52
that I think they're really useful. So I think
1:07:54
they can make, if you're good at your
1:07:56
job, I think they can make you better if
1:07:58
you know what to ask for. But I
1:08:00
don't think they're going to turn a junior person
1:08:02
into an expert malware developer, for instance. But
1:08:04
definitely, the AI is definitely
1:08:06
being used by attackers to improve a lot
1:08:09
of mundane type phishing activities and things like
1:08:11
that. So I think we're going to see, I
1:08:13
mean, at least that that's a low
1:08:15
bar right now. But they might get more
1:08:18
sophisticated over time. So we'll have to
1:08:20
see where it all goes. I watched this.
1:08:22
This is obviously just more of a
1:08:24
scammy thing than an intrusion thing. But I
1:08:26
watched this one that blew me away.
1:08:28
It was a YouTube video. It was live
1:08:30
stream. It was live chat going. It
1:08:33
was on a well -known channel
1:08:35
that had been hijacked. And
1:08:38
it was Elon Musk
1:08:40
live talking to fans.
1:08:43
And he was telling them that,
1:08:45
hey, I really want people to
1:08:47
get into cryptocurrency. Cryptocurrency is the
1:08:49
next big thing. So if
1:08:51
you send me, you know, I
1:08:53
don't know, one Ethereum, I'm going to send you
1:08:55
20 back. It was that
1:08:57
type of scam. Yeah. And he's
1:08:59
talking. And normally AI has
1:09:02
a lot of inconsistencies in the
1:09:04
facial features or the voices or things,
1:09:06
but Elon's been on so many shows. that
1:09:08
on so many interviews so much video
1:09:10
that this AI was perfect. It looked
1:09:12
just like him. There was nothing that
1:09:14
told me this was AI-generated except
1:09:16
for the fact it was too good
1:09:19
to be true that's how i knew it
1:09:21
was a scam was it was too
1:09:23
good to be true but otherwise they had
1:09:25
the full fake chat going where all
1:09:27
i got mine i got mine and all
1:09:29
of the stuff building the hype and
1:09:31
i started thinking you know the sophistication of
1:09:33
these hacks now utilizing AI is
1:09:35
incredible. We talked about the voice cloning: just
1:09:37
taking your voice clips and being able
1:09:39
to pretend to be your kids or a
1:09:41
family member. Hey i'm hurt or i'm being
1:09:43
arrested i need five hundred bucks and you're
1:09:46
on the phone live with your kid and
1:09:48
it sounds like your kid and you're freaking
1:09:50
out of course because it's your child and
1:09:52
they're saying they need this money right away.
1:09:54
There's all of these things of course usually
1:09:56
that sense of urgency and stuff is the
1:09:58
first clue that it's just going to make
1:10:00
these attacks so much more sophisticated. In
1:10:03
the industry i work in my career
1:10:05
started to go towards AI, and
1:10:07
i mess with it a lot these
1:10:09
days but i quickly realized i need
1:10:11
to go into cybersecurity, because this
1:10:14
is gonna be a disaster. What's
1:10:16
coming down the pipeline is
1:10:18
going to be just things we've
1:10:20
never seen before because when
1:10:23
you combine in my mind. The
1:10:25
amount of data that's been taken from
1:10:27
us the data breaches the amount of information
1:10:29
that's been collected, with AI, you've
1:10:31
got the perfect storm for people being no
1:10:34
longer like you said able to easily
1:10:36
detect all the persons grammars bad or whatever
1:10:38
it is now going to feel as
1:10:40
real as if your family member is literally on
1:10:42
the phone with you talking and they're
1:10:44
gonna have details about you that only family
1:10:46
members should know and that's where it
1:10:49
starts getting really scary. Yeah we talked before
1:10:51
about how I would strongly urge you
1:10:53
to have a password set up with your
1:10:55
family. that you know if you're getting
1:10:57
a phone call or voice call it sounds
1:10:59
odd you should be asking what this
1:11:01
password is and they should be telling it
1:11:03
to you and i was struck to
1:11:06
my family members about it already it just
1:11:08
is it's something that's going to happen
1:11:10
you know the other thing you could do
1:11:12
too is to go into some of
1:11:14
these systems and just ask what it knows
1:11:16
about you and you know you might
1:11:18
be interested to see what it knows about
1:11:21
your family and this and we talked
1:11:23
about this before the social media just i
1:11:25
just encourage people to share less. Right?
1:11:27
You know, Mark Zuckerberg doesn't need to
1:11:29
know anything more about you, right? It's
1:11:32
hard to avoid all this stuff. It's
1:11:34
just I remember years ago when they
1:11:36
were Facebook introduced the tagging that many
1:11:38
years ago now and the photographic tagging
1:11:40
people thought how awesome it is. But
1:11:42
really, they're like, I would tag my
1:11:44
own face. I don't use Facebook anymore.
1:11:46
I've been on Facebook for over a
1:11:48
decade now, but I would tag it
1:11:50
as random, different random people. I didn't
1:11:52
want to train their algorithms. But the
1:11:54
thing is, everyone's training all these faces,
1:11:56
but They're like, well, it's only
1:11:58
for your photos. Well, he didn't say that. It
1:12:00
could be if you're in a restaurant, someone takes
1:12:02
a photo and you're in the background, even though
1:12:04
they took it for their Facebook feed. Yeah,
1:12:07
they're going to show who they know, but
1:12:09
that doesn't mean Zuckerberg hasn't seen you in the
1:12:11
background and tagged you as being in that location.
1:12:13
Yeah. And they were training on that. Yeah. So,
1:12:15
yeah, I don't know how to avoid it
1:12:17
now, other than wear a mask around and look like
1:12:19
a weirdo. Yeah,
1:12:24
exactly AI systems have everything from your
1:12:26
your biometrics to all your personal details.
1:12:28
And yeah, it is a scary environment.
1:12:30
We'll see we'll see how it goes.
1:12:33
Well, now it's kind of normal to wear a
1:12:35
mask for like sicknesses and stuff. So you
1:12:37
can just do that all the time. That's true.
1:12:39
They've got AI that can pass that they
1:12:41
can look at your eyes and be able to
1:12:43
determine who you are, you know, just your
1:12:46
irises itself. So well, okay, that's better. You have
1:12:48
to wear a full face mask, Michael, a
1:12:50
full-on face mask. Oh, I get to
1:12:52
wear a balaclava. Awesome. Yeah, sure. Whatever
1:12:54
you want. Don't be like a ninja. I'll
1:12:57
remind you, don't forget that your ear is
1:12:59
actually as unique as your fingerprint. So
1:13:01
even your ear showing through a mask could be
1:13:03
a help. They might be able to see. Come
1:13:05
on. Yeah. Sad to say. Sad
1:13:08
to say. You're cooked
1:13:10
no matter what you do. So
1:13:12
our beloved Linux controls the
1:13:14
world. The penguins are marching. We're
1:13:17
in. All the things
1:13:19
are on the cloud containers
1:13:21
embedded systems on the
1:13:23
internet. So how does sand
1:13:25
fly adapt to such
1:13:27
a diverse ecosystem? And what's
1:13:29
what's the trickiest environment
1:13:31
you've had to tackle so
1:13:33
far? The main thing
1:13:35
we do is because we're not
1:13:37
tying into the kernel, it saves
1:13:39
a ton of grief and people
1:13:41
like you see VPF does not
1:13:43
solve the problem. It's like. It
1:13:45
shifts it around a bit. It's still the
1:13:47
same. You still are vulnerable to
1:13:49
causing system performance and crashing things. So the
1:13:52
biggest thing, why we don't have compatibility
1:13:54
issues is, first of all, we're not tied
1:13:56
into the kernel. We're not sweeping the memory. We're not doing
1:13:58
anything that could cause the system to tip over. We're
1:14:00
just known as a company that we don't tip
1:14:02
systems over. This is what we're known for. The second
1:14:05
thing is we use a memory
1:14:07
safe programming language, Golang, which
1:14:10
is, it was designed by Google
1:14:12
and it is just phenomenal.
1:14:14
We never get any type of issues with it.
1:14:16
It's garbage collected. We don't have any memory issues.
1:14:18
We're not going to have to worry about overflows,
1:14:20
all this other stuff that you typically see. The
1:14:23
other thing we do is we just try
1:14:25
to maintain, when we're on a system, we
1:14:28
don't do anything tricky. We try to maintain
1:14:30
read -only operations. There's no direct way to
1:14:32
have our code run something on the remote
1:14:34
system. We're there to read and we don't
1:14:36
write anything. We're not trying to change the
1:14:38
system around. I think all these
1:14:40
things add together to something that, you
1:14:43
know, allows us to work on,
1:14:45
like I said, really
1:14:47
crazy stuff. We could work on everything from
1:14:50
software-defined
1:14:52
radios to cameras, and the
1:14:54
same product watches all these things. You're
1:14:56
talking about that hack earlier with the
1:14:58
telcos that you know, they were getting
1:15:00
into edge devices a lot of which
1:15:02
run CIS around Linux and you know,
1:15:04
we could get on a lot of
1:15:06
these edge devices So it just comes
1:15:08
down to having that safety. It allows
1:15:11
us to get into all these places without... I
1:15:13
call it no-drama Linux security. That's like
1:15:15
one of my life mottos: no drama. Like
1:15:17
I don't want you in my life causing
1:15:19
drama, right? So So the same
1:15:21
thing. Yeah, it's the same thing here, man
1:15:23
We're just, you know, no-
1:15:25
drama Linux security. I think we're able to
1:15:27
do that because of how we work.
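For a feel of what agentless and read-only can look like in practice, here is a minimal sketch (not Sandfly's implementation): it runs a single read-only command on a remote host through the stock OpenSSH client and inspects the output locally, so nothing gets installed or written on the target. The hostname is a placeholder and key-based authentication is assumed.

```python
import subprocess

def remote_read(host: str, command: str) -> str:
    """Run a read-only command over ssh; BatchMode avoids interactive prompts."""
    result = subprocess.run(
        ["ssh", "-o", "BatchMode=yes", host, command],
        capture_output=True, text=True, check=True, timeout=30,
    )
    return result.stdout

def check_uid_zero_accounts(host: str) -> None:
    # Reading /etc/passwd changes nothing on the remote system; any UID 0
    # account other than root is a classic thing to go look at.
    for line in remote_read(host, "cat /etc/passwd").splitlines():
        fields = line.split(":")
        if len(fields) > 3 and fields[2] == "0" and fields[0] != "root":
            print(f"{host}: extra UID 0 account found: {fields[0]}")

if __name__ == "__main__":
    check_uid_zero_accounts("server1.example.com")  # placeholder hostname
```

The design point is the same one being made here: because the collection side is plain SSH and plain reads, the worst case is a failed login or a timeout, not a wedged kernel.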
1:15:29
Yeah, I just don't know how do I
1:15:31
know if it's working if I don't
1:15:33
have this thing popping up every five seconds
1:15:35
telling me that it's collecting malware and
1:15:37
it's protecting me and resubscribe and get another
1:15:39
subscription and your time's almost up and
1:15:41
add this feature and have you ever used
1:15:43
these recent ones? Oh my gosh, they're
1:15:45
so bad. Don't forget the big scan that
1:15:47
happens at 10am in the middle of
1:15:49
your most important call of the day. Don't
1:15:51
forget that one. Like some
1:15:53
of the software that is supposed to protect
1:15:55
you is worse than a virus. Yeah,
1:15:57
yeah, it definitely is. So,
1:15:59
you know, being in this field,
1:16:01
I know you've seen some things and
1:16:03
I just want to try to
1:16:05
get you to give us one of
1:16:07
the most sneakiest or creative linux
1:16:10
specific attacks that you've either experienced and
1:16:12
worked on yourself or that you've
1:16:14
just seen in the wild so far.
1:16:16
And preferably something that's already been
1:16:18
patched. Yeah. You
1:16:20
know that the thing is, it's like
1:16:22
we get asked all the time about linux
1:16:24
stealth root kits. I actually
1:16:26
think they're all junk. I just just junk,
1:16:28
but it's marketing and people want to
1:16:30
think about it. The best things I see
1:16:32
are guys who are using living off
1:16:35
the land commands on the Linux system to
1:16:37
hide. And I'm sorry, I don't have
1:16:39
it in front of here, but there's a
1:16:41
root kit that was 100 % living off
1:16:43
the land. It hid under
1:16:45
the PROC file system. So you know,
1:16:47
on Linux, /proc lists all
1:16:49
the processes. So that's a virtual file
1:16:51
system. But what this guy did is
1:16:53
he would unmount that file system, copy
1:16:56
over his binaries onto the filesystem
1:16:58
underneath and remount proc on top.
1:17:01
So now when he went to go
1:17:03
look for the binaries, it was just
1:17:05
a proc file system. You couldn't see
1:17:07
him. Now they were still running, right,
1:17:09
on the system. And so he had,
1:17:11
using living off the land, he had
1:17:13
a backdoor that was watching your log
1:17:16
files. And if you hit certain keys,
1:17:18
it would activate through the SSH, like
1:17:20
it would see certain attempts to log
1:17:22
in, it would activate. It had log
1:17:24
cleaners on it. It had a hidden
1:17:26
process that was running. It could hide
1:17:28
files that were in a directory.
1:17:30
So I think it's pretty good. I
1:17:32
like guys Uh and malware that are
1:17:35
using simple tactics in a very
1:17:37
clever way. I don't like complicated malware.
1:17:39
It's all junk. They have
1:17:41
bugs they get noticed they mess things
1:17:43
up. The more features you add to
1:17:45
them, the worse they get. So I
1:17:47
just like really simple purpose built malware
1:17:49
or purpose built like living off the
1:17:51
land stuff. So I say
1:17:54
the sneakiest stuff is not what most
1:17:56
people think is sneaky the sneakiest stuff
1:17:58
are people who are Just using built -in
1:18:00
stuff in a very clever way, hiding in plain
1:18:02
sight. Yeah, or like I said, you
1:18:04
know very simple like we wrote a
1:18:06
big report up on BPFDoor,
1:18:08
which is a suspected Chinese
1:18:11
backdoor. It was great because it had
1:18:13
a simple magic packet to activate. It
1:18:15
hid in a very kind of non
1:18:17
-slot, you know way that was good
1:18:19
enough. It just did one thing
1:18:21
and then that's it. And it was small,
1:18:23
you know, so stuff like that, that's simple
1:18:25
is good. Complicated is
1:18:28
go away. So what's
1:18:30
sneakiest? The sneakiest is simple.
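Staying with the idea that hiding is what gives an attacker away, here is a small sketch of one classic tactic-level check (an illustration, not anyone's product): brute-force PIDs with signal 0 and compare against what /proc lists. A process the kernel acknowledges but /proc does not show deserves a hard look. It needs root to probe other users' processes, and short-lived processes create noise, so it snapshots /proc twice and only flags PIDs absent from both snapshots.

```python
import os

def pids_from_proc() -> set[int]:
    """PIDs visible to anything that reads /proc, which is what most tools trust."""
    return {int(name) for name in os.listdir("/proc") if name.isdigit()}

def pids_from_probe(max_pid: int) -> set[int]:
    """Ask the kernel directly: signal 0 checks existence without sending anything."""
    alive = set()
    for pid in range(2, max_pid + 1):
        try:
            os.kill(pid, 0)
            alive.add(pid)
        except ProcessLookupError:
            continue                # no such process
        except PermissionError:
            alive.add(pid)          # exists, owned by another user
    return alive

if __name__ == "__main__":
    with open("/proc/sys/kernel/pid_max") as f:
        max_pid = int(f.read().strip())
    before = pids_from_proc()
    probed = pids_from_probe(max_pid)
    after = pids_from_proc()
    for pid in sorted(probed - (before | after)):
        print(f"PID {pid} answers signals but never appeared in /proc")
```

It will not catch everything, and it says nothing about binaries stashed under a remounted /proc, but it is exactly the kind of simple cross-check that forces simple hiding tricks out into the open.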
1:18:33
Interesting. I love it. It's also interesting
1:18:35
to see like the cybersecurity aspects of
1:18:37
like people who are into cybersecurity can,
1:18:39
you know, like respect the effort involved
1:18:42
in doing something horrible. You have to.
1:18:44
Yeah, because it's like. Like there's so
1:18:46
many like it's it's like the XZ
1:18:48
thing. It was so
1:18:50
deep in there that, like, the way
1:18:52
they got access was not that clever
1:18:54
but the way they were doing it
1:18:57
was clever and how they buried it
1:18:59
inside of stuff like. You know that
1:19:01
stuff is interesting. Yeah, no, I mean,
1:19:03
that wasn't it. Look, it was an
1:19:05
awesome hack. Look, I read hacks today
1:19:07
and I'd still laugh out loud if
1:19:10
someone does something really good, you know. So,
1:19:12
you know, I'm like, hey, man, props
1:19:14
to them. Man, I wish I'd done that.
1:19:17
Right. So, yeah, it's it's the same
1:19:19
like the XZ hack was clever because those
1:19:21
guys are just like, we're just
1:19:23
going to shove a backdoor in the middle
1:19:25
of one of the most popular compression algorithms and
1:19:27
not let anyone see it. I'm like, awesome. Yeah,
1:19:30
they almost got away with it, right? Um,
1:19:32
you know, but also at the same time,
1:19:34
it reflects to how badly a lot of
1:19:36
these open source maintainers are treated. Because
1:19:39
I used to maintain open source software. I
1:19:41
have software that's still like, you know, I wrote
1:19:43
the original versions of logcheck and log-
1:19:45
tail, which are now in every Debian distro
1:19:47
and stuff like that, right? But the thing
1:19:49
is, it's like. These guys, it's like that cartoon
1:19:51
where they show the whole internet, this one
1:19:53
guy in Minnesota maintaining that critical piece of software,
1:19:56
you know, the one I'm talking about. Yeah.
1:19:58
Yeah. Like the maintainer XZ got such grief. I'm
1:20:00
like, how much money did you guys ever
1:20:02
give this guy? Did you ever support him? Did
1:20:04
you ever offer any help? Yeah. You know,
1:20:06
so, you know, these open source authors are under
1:20:08
tremendous pressure. A lot of times they are
1:20:10
doing something well past the point that they like
1:20:12
doing it anymore. You
1:20:14
know, and it's a risk. It was a
1:20:17
thankless job a lot of the times. Yeah.
1:20:19
Of course. And there's also the fact that,
1:20:21
you know, you get people who are using
1:20:23
stuff for free and they complain. A lot
1:20:25
of these are very big companies using stuff
1:20:27
for free like a major project. They won't
1:20:29
even throw these guys 50 bucks, right? It's
1:20:31
kind of pathetic. You know, we do as
1:20:33
a company, we do donate to some of
1:20:35
these projects just because we've been on the
1:20:37
receiving end. And it's just like, I would
1:20:39
encourage companies that are listening to this, you
1:20:41
know, to support these guys because, you know, it's
1:20:44
a thankless job. It's a
1:20:46
thankless job. Yeah. That's also
1:20:48
what you're doing that. And
1:20:51
also, Sandfly's mission is all about
1:20:53
catching intruders and responding to the incidents.
1:20:56
I'm curious if you have one you
1:20:58
can share with us. What's the most
1:21:00
dramatic success story that you can share
1:21:03
without like naming names or anything, like
1:21:05
where Sandfly saved the day for a
1:21:07
Linux deployment? Probably, I think
1:21:09
when the Log4j stuff
1:21:11
happened, we were called out
1:21:13
in incident response
1:21:15
situations for customers. And, you
1:21:17
know, that happened over the Christmas break. I
1:21:19
think they were quite happy to have
1:21:21
us able to like you know come in
1:21:24
on the holiday or have our product
1:21:26
help them through the holidays. That
1:21:28
affected a bunch of different people and
1:21:30
again, that wasn't Linux-specific software,
1:21:32
but it was executing things there, right?
1:21:34
So we were
1:21:36
able to help with some of that
1:21:38
other customers have just been using our ssh
1:21:40
key tracking because they
1:21:42
initially thought they want us for edr but
1:21:45
then we found their ssh enterprise is a
1:21:47
mess and so they're using us to help
1:21:49
clean that up and they're like we had
1:21:51
no idea we had all these old keys
1:21:53
we had employees that still would have had
1:21:55
access maybe still didn't have access they didn't
1:21:57
know so we kind of find these things
1:21:59
going on and then we've also had proof
1:22:01
of concepts were in the middle of the
1:22:04
proof of concept i'm looking in. We're looking
1:22:06
at a result. I'm like, yeah, man, that's
1:22:08
BPF door on that system right now. So
1:22:10
we find active malware, active compromises on systems
1:22:12
sometimes even during a POC. So
1:22:14
these are all the kind of things where
1:22:16
I think the customers have really appreciated it.
1:22:18
Again, because I get back to a very
1:22:20
limited number of Linux people having to watch
1:22:23
a very large number of systems. And
1:22:25
even if you got someone like
1:22:27
me on your team, was like,
1:22:29
I can't scale. The worst I
1:22:31
heard once was three people had
1:22:34
to watch 100 ,000 Linux VMs.
1:22:36
It's a completely unscalable problem. It's
1:22:38
impossible. You got to automate it.
1:22:40
These situations were viewed pretty favorably
1:22:42
because we could just come in
1:22:44
and rip these systems very quickly
1:22:46
and get a line on what's
1:22:48
going on. Did it
1:22:50
save the day? Well, it probably
1:22:52
saved a lot of grief and aggravation.
1:22:54
Yeah, certainly. Got people
1:22:56
back to Christmas. This is very
1:22:58
important. Got them back to their families. Yeah.
1:23:03
Speaking of family, have
1:23:05
you ever used drift
1:23:07
detection to help catch the
1:23:09
family with Dominic Toretto
1:23:11
and the other Furious? Why
1:23:15
are you bringing a stupid Furious
1:23:17
5 movie reference? Michael, you're
1:23:19
banned. I know
1:23:21
this is me going to be totally
1:23:23
clueless because like I'm totally disconnected from
1:23:25
pop culture now. So I'll
1:23:27
act like, oh wait, that was
1:23:29
a great movie. Stay there.
1:23:32
Am I supposed to say that? No,
1:23:34
it's a terrible movie that's
1:23:36
fun, I guess is the way
1:23:38
to say it. Well,
1:23:42
Craig, you've made it to the
1:23:44
end of the main part of
1:23:47
the interview. Congratulations. Thank
1:23:49
you for having me. No,
1:23:51
it's it's you're fantastic. I
1:23:53
could talk with you for
1:23:55
hours. Oh, great. I
1:23:57
appreciate that. Yeah, same thing. Absolutely.
1:24:00
You've been amazing. But
1:24:02
unfortunately. You just
1:24:04
made it through the gauntlet. Now you've got
1:24:06
the lightning round, which is generally another
1:24:08
45 minutes of questions. Uh, no,
1:24:11
I'm kidding. Uh, we're going to do
1:24:13
a quick lightning round with you. And really,
1:24:15
uh, this is meant for fun. So
1:24:17
we're going to ask you a question. First
1:24:19
thing that comes to mind, shoot it
1:24:21
out. And, uh, you can't change your answer
1:24:23
though. So if you shoot, I'm ready
1:24:25
to take back. All right. Here
1:24:28
we go. Lightning round begins now. And,
1:24:30
uh, I'm going to kick us off your
1:24:32
favorite candy bar. Snickers.
1:24:35
Nice. Good one. Nice. The best
1:24:37
hacker movie. WarGames. Nice.
1:24:42
Favorite Linux distro. No
1:24:46
comment. No comment. Best
1:24:49
music genre or
1:24:51
song to ethically hack
1:24:53
to? Oh,
1:24:56
geez. Probably. I'll
1:24:58
go back to 90s Prodigy.
1:25:03
90s Prodigy. Yeah. Nice. So
1:25:05
it's probably Rachmaninoff.
1:25:07
It depends on my mood.
1:25:09
Rachmaninoff if I'm deep
1:25:11
thinking; Prodigy is an interesting one.
1:25:13
If you're just coming in
1:25:15
to wreck the place, Prodigy. I
1:25:20
can hear the song in my head
1:25:22
right now. If you're just if you're just
1:25:24
coming in to burn the village, you
1:25:26
would prodigy. A lot of
1:25:28
smacking I were to do there. Yeah.
1:25:30
Yeah. And so this
1:25:32
is a very important one. If
1:25:34
you could only choose one, you
1:25:36
can't, it doesn't matter what flavor of
1:25:38
either, but you only choose one
1:25:40
cupcakes or muffins. Cupcakes. Unfortunately,
1:25:45
that's the wrong answer. You
1:25:47
know, that's all cybersecurity enthusiasts and
1:25:49
geniuses say, Michael, that is
1:25:51
the correct answer. Obviously. They're
1:25:53
basically the same thing. Just one
1:25:55
is icing on top, right? Yeah.
1:25:57
Well, it's just an ugly cupcake,
1:25:59
you know. Yeah. I
1:26:01
feel like I'm being attacked now. What
1:26:06
company creates a dedicated
1:26:08
and reliable Linux security solution
1:26:10
that works across all
1:26:12
systems without endpoint agents or
1:26:15
drama? Sandfly.
1:26:20
I love it. I
1:26:22
love it. Look,
1:26:24
I'm so excited about not only this
1:26:26
interview and our audience hearing it, but
1:26:29
our partnership with Sandfly because there's so
1:26:31
much security stuff that goes on in
1:26:33
the news. And now we have someone
1:26:35
we can go to when these news
1:26:37
segments happen too and be like, hey, what's
1:26:40
really happening here? What's going on behind
1:26:42
the scenes here? In fact,
1:26:44
we talked about early on, we want to
1:26:46
have you all on. And hopefully it
1:26:48
never happens but if there is another epic
1:26:50
hack which does happen you know these
1:26:52
couple times a year usually i have you
1:26:54
all on the kind of discuss it
1:26:56
and hopefully it never happens but at least
1:26:58
a couple times yeah sure yeah at
1:27:00
least arm people with the information of how
1:27:02
they can avoid it and things and
1:27:04
i also want to thank you for giving
1:27:06
our audience. We didn't even
1:27:09
know about that until today that
1:27:11
you guys were going to give
1:27:13
a coupon code to our audience
1:27:15
specifically: destination50. And that
1:27:17
gets you 10 hosts for people
1:27:19
when you use the home user
1:27:21
edition. Of course, we have a
1:27:23
lot of professionals in listening in
1:27:25
our audience as well. So check
1:27:27
out Sandfly for your enterprise. But
1:27:30
most importantly, you did an amazing interview,
1:27:32
Craig. Thank you so much for coming on
1:27:34
the show. And thank you for all
1:27:36
you do for Linux and everything you've done
1:27:38
for Destination Linux, too. We appreciate Sandfly
1:27:40
so much. I appreciate it, guys. I love
1:27:43
your show. So I'm happy to be
1:27:45
on here. Thank you. Thank you. We appreciate
1:27:47
it. And also, the amount of sound
1:27:49
bites you've provided for content is so good.
1:27:52
We can have him all over social
1:27:54
media to his joy. Yeah,
1:27:56
well, I could probably make a video for
1:27:58
each question, you know. Yeah,
1:28:00
well, let me know. Sometimes I say stuff
1:28:02
I regret later, but hopefully this one
1:28:04
went all right. I think it went
1:28:07
good. I didn't tell you what distro I use. I
1:28:09
think I kept my mouth shut about that. You did
1:28:11
good. You even kept it shut on the lightning round,
1:28:13
which is awesome. We tried to sneak it in there
1:28:15
to see what happened. to get them on another angle
1:28:17
of just tell us your favorite. And
1:28:19
I might change. Even if I told
1:28:21
you, it's just you guys know how it
1:28:23
goes, right? Oh, yeah. We're all professional. That's
1:28:26
it. Everything's set up perfectly. Time
1:28:28
to change. So there you go. Exactly. Yeah,
1:28:30
I've been on it for 35 minutes and
1:28:32
that's today. I booted up my
1:28:34
machine. I was like, I think I'm going
1:28:36
to change distros again. So yeah. So
1:28:39
if you want to come hang
1:28:41
out with Jill this year, you
1:28:43
need to go check out scale
1:28:45
22 X the 22nd annual Southern
1:28:47
California Linux Expo will take place
1:28:49
on March 6th through the 9th
1:28:52
2025 this year. That's this year
1:28:54
at the Pasadena Convention Center in
1:28:56
Pasadena, California. You
1:28:58
can meet Jill, come spin the
1:29:00
LinuxChix LA wheel of swag
1:29:03
for a surprise at the Linux
1:29:05
Chix LA booth, number 218,
1:29:07
and you can use the code
1:29:09
CHIX, C-H-I-X,
1:29:11
for 50% off your SCaLE registration
1:29:13
there. We got all kinds of
1:29:15
coupons on this episode. This is
1:29:17
a coupon episode. sure. All
1:29:19
kinds of discounts. We're like a
1:29:21
deal site for this. Yeah, like slip
1:29:23
deals over here for Linux. I'm telling
1:29:26
you what, it's incredible. I couldn't even
1:29:28
think of one. I
1:29:30
was like a deal site. If
1:29:33
you want to find Jill in the crowd,
1:29:35
just look for the penguin hat. And
1:29:37
she actually wears that the whole time.
1:29:39
Yes. So you'll be able to find
1:29:41
her there. And also
1:29:43
Jill may be doing some content for us
1:29:46
there at the show. We haven't fully
1:29:48
figured it all out, but we may have.
1:29:50
We haven't like, we haven't trapped her
1:29:52
into agreeing yet, but we will. Don't worry.
1:29:54
I kind of just did now. Like
1:29:56
she kind of has to now because I
1:29:58
mentioned it. Exactly. That's the
1:30:00
tactic of recording. They'll
1:30:02
come from scale and Michael and I
1:30:04
will be if you want to meet
1:30:07
us going to that. We're not going
1:30:09
to that one, but we're going to
1:30:11
be at the Red Hat Summit. Yeah.
1:30:13
Right. Red Hat Summit in May, which
1:30:15
is in Boston. Boston.
1:30:17
You got to say Boston. We're
1:30:19
going to get in the car and go to Boston. Get
1:30:22
in the car, get some coffee and go Boston. That's
1:30:25
what we're going to do. So you're going to come
1:30:27
hang out with us in Boston. So. gonna
1:30:29
have to go, if you want to
1:30:31
see all of us, you're gonna have to
1:30:33
go separate places in the United States.
1:30:35
It's worth it. They're complete opposite sides of
1:30:38
the country too. You're welcome. You can
1:30:40
do a whole little tour on your way.
1:30:42
This is for you to have a
1:30:44
good traveling experience. We did
1:30:46
this for you. A big thank you
1:30:48
to each and every one of you
1:30:50
for supporting us by watching or listening
1:30:52
to Destination Linux. However you do it,
1:30:54
we love your faces. And this episode
1:30:56
has so much to discuss and talk
1:30:58
about. So many things that we can
1:31:00
pontificate upon as a community and that
1:31:02
word came out of nowhere. I did
1:31:04
that was interesting. Yeah. Yeah.
1:31:07
In our discord, go
1:31:09
to tux digital.com slash discord.
1:31:11
Join the discussion. And
1:31:13
Michael, there's some other things people can do too
1:31:15
because they love this show so much. They're going to
1:31:17
want to support it. If you love
1:31:19
this show so much that you want
1:31:21
to support it, you can go to tux
1:31:24
digital.com slash membership and get access to
1:31:26
a bunch of cool stuff you can you
1:31:28
be become a patron and you get
1:31:30
the access to watch the show live get
1:31:32
access to unedited episodes access to the
1:31:34
just the patron section only for the discord
1:31:36
server so you can join us in
1:31:38
the live stream and also join us in
1:31:41
the patron-only post-show that we have
1:31:43
every week after the show. and so much
1:31:45
more we have tons of stuff and
1:31:47
if you become a patron of tux digital
1:31:49
is not just destination linux you also
1:31:51
get stuff for this week in linux and
1:31:53
more so tux digital dot com slash
1:31:55
membership and something else you can get is
1:31:58
merch discounts that's right we have merch
1:32:00
at tux digital dot com slash store so
1:32:02
you can get hats mugs t shirts
1:32:04
hoodies all sorts of stuff tux digital dot
1:32:06
com slash store and if you become
1:32:08
a patron you get discounts at the store.
1:32:10
So so much stuff to get there.
1:32:12
And there's so many things that I can't
1:32:15
even think of. And we actually have
1:32:17
some more coming that people have requested specifically.
1:32:20
So including mouse pads. Yes. For
1:32:22
the people who sent the
1:32:24
message in. Yes. Mouse pads. Not
1:32:26
a Bucky's travel mug, but
1:32:28
a destination Linux one. Definitely
1:32:31
not what Ryan holds up ever. And
1:32:36
make sure to check out all
1:32:38
the awesome shows here on TuxDigital.
1:32:40
That's right. We have an entire
1:32:42
network of shows to fill your whole
1:32:44
week with geeky goodness. Head to
1:32:47
tuxdigital.com to keep those Linux penguins marching.
1:32:49
And if you want to marinate
1:32:51
on some more content and pontificate on
1:32:53
some more things, be sure to
1:32:55
join us in the Discord server. And
1:32:57
I just wanted to say marinate
1:32:59
for no reason. Then pontificate. Then pontificate,
1:33:01
exactly. Everybody have a
1:33:03
great week and remember that the
1:33:05
journey itself is just as important
1:33:07
as the destination Pontificate on that
1:33:09
people like we say it every
1:33:11
week, but really pontificate on what
1:33:13
that means let that marinate for
1:33:15
a second Yeah, think outside the
1:33:17
box. We love all our viewers
1:33:19
most of them contemplate what that
1:33:21
means You know ugly it is
1:33:23
that we say that you know,
1:33:26
I can't believe you came up
1:33:28
with that Michael that most of
1:33:30
them that was not me You
1:33:34
have like an instinct to come
1:33:36
up with things like Which which I
1:33:38
respect in fact because it is
1:33:40
it is 100 % not Everyone it
1:33:42
is most of you. There are some
1:33:44
people send some nasty emails and
1:33:46
you know who you are Yeah, we
1:33:48
do not a tiny white easy
1:33:50
anyway, know We know who
1:33:52
you are. We don't like you. We
1:33:54
know who you are We don't care
1:33:56
about you. So that's why we make
1:33:58
sure it's not everybody cheeto stains
1:34:00
on your tighty-whities, the emails
1:34:03
you send talking moment
1:34:05
you hate us. Thanks for
1:34:07
watching this show everybody.