Starlink Bugs, Bank Regulator Breach, and the LastPass Fallout

Released Thursday, 17th April 2025

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

cybersecurity industry is very vocal. They like to

0:02

open their fucking chops. Pardon my language.

0:04

But your boy Hector's upset. Hector

0:07

Monsegur was responsible for some of the

0:09

most notorious hacks ever committed. FBI

0:11

Special Agent Chris Tarbell. Hacker and

0:13

FBI informant participated in some of

0:15

the world's most infamous hacks. That

0:17

caused up to $50 million in

0:19

damages. A life in the shadows.

0:22

Cyber attacks on the rise. Welcome

0:35

to Hacker and the Fed. I'm Chris

0:37

Tarbell, former FBI special agent, working my

0:39

entire career in cybersecurity, and joined, as

0:41

always, by Hector Monsegur, my friend and

0:43

podcast co-host. Hector is a former black

0:45

hat hacker who once faced 125 years

0:47

in prison for as many years of

0:49

hacking under the codename Sabu.

0:52

Our stories collided in June of 2011

0:54

when I arrested Hector and then convinced

0:56

him to work with me at the

0:58

FBI. Hector is now a red teamer,

1:00

researcher, cybersecurity expert, and great friend. Hector,

1:02

how are you, brother? Oh, I'm

1:05

doing great. Oh, man, that

1:07

was a hell of an introduction. It's

1:09

all true. Oh, that's a beautiful

1:11

thing, man. Well, I'll tell you,

1:13

you know, it's great to be here

1:15

with you, as always. It's always a pleasure.

1:17

I love talking to you. I'm

1:19

glad your travels were safe and you made

1:21

it back. Oh, yeah, yeah. I'm

1:23

back in New York City, baby. You know, I had

1:26

a great weekend, you know. Good.

1:28

It's quite educational on

1:30

the academic side. I

1:32

just came back from Miami. Big

1:35

shout out to all my peoples

1:37

out there in Miami, especially the

1:39

Miami-Dade team, Miami-Dade and FIU folks

1:41

and everybody else in between. We

1:45

did a CTF and workforce development

1:47

kind of like a session. CTF?

1:49

What's a CTF for the people? I'm so

1:51

sorry. See, I'm the one complaining about jargon in

1:53

our industry. I'm the one throwing the jargon

1:55

out. CTF is capture the

1:58

flag. And how does that

2:00

work? Oh, that's great. Well,

2:02

give me some context. So two

2:04

of my teammates, fantastic, wonderful

2:06

young guys, they put together, you

2:08

know, 50 or so vulnerable

2:10

virtual machines and environment scenarios where

2:12

you as a student or

2:14

an adversary, you would try to

2:16

engage, do some reconnaissance and

2:18

discovery, identify vulnerabilities, try to exploit

2:20

those vulnerabilities, and eventually get

2:23

flags and so on. As you

2:25

gain more flags, the more

2:27

points you gather. Then, of course,

2:29

by the end of it,

2:31

you get a prize in most

2:33

cases. And so the way the

2:35

CTF works is exactly how I laid it out. And

2:38

in most cases, it can be quite

2:40

boring. I can assure you that. It's

2:44

not like the movies and hacking? No, man.

2:46

I wish. You know, I remember when I

2:48

watched the movie Hackers and I was like,

2:50

wow, this is so fascinating. It's so like

2:52

many colors until I started hacking myself. I'm

2:54

like, wow. I got bamboozled.

2:57

I got hoodwinked. But

2:59

yes, CTFs are usually somewhat eh.

3:01

But this time, no. We had a

3:03

great event. We had a lot

3:05

of students show up. Again, big shout

3:07

out to Risk Labs out of

3:09

FIU and the teams out of Miami

3:12

-Dade College. You know, did

3:14

a great job. They gained some really

3:16

cool points. We gave out a ton

3:18

of prizes. We catered breakfast. We catered

3:20

lunch. We gave out rubber duckies and

3:22

Flipper Zeros. Yeah, you

3:24

know, not the one of those, you

3:26

know, quack, quack. It's like a really cool

3:28

tool that physical pen testers use during

3:30

engagements. Yeah. But it was

3:32

great, man. It was such a blast, Chris,

3:34

that I could give back. You know, my

3:36

team could give back. And, yeah, we all

3:38

came back fulfilled. Absolutely. Good, good. I'm glad

3:41

you were able to get on and do

3:43

that. So it sounds great. Oh, yeah. So

3:45

let me ask you this. It sort

3:47

of ties in, the way I see, with

3:49

Capture the Flag and all that. Talk to

3:51

me. I saw this week that Elon

3:53

Musk opened up Starlink for Bug Bounty. That's

3:56

great. I guess you've been doing it for

3:58

a little while, but to open up a live

4:00

network with millions of subscribers to Bug Bounty,

4:02

do you think that's right? Absolutely.

4:04

Well, you have to, and this is actually

4:06

a really good question. I'm glad you asked.

4:09

So I've had customers, clients say,

4:11

hey, Hector, we're thinking about joining HackerOne,

4:13

which is kind of like a

4:15

public bug bounty platform. Or, hey,

4:17

we're thinking about joining BugCrowd, right? You

4:20

know, what are your thoughts on

4:23

these platforms? Do you feel biased because,

4:25

you know, you run a pen

4:27

testing service company and maybe a bug

4:29

bounty platform might be competitive? I

4:31

say, no, absolutely not. The more

4:33

eyes, the better. They

4:35

work really well. But,

4:38

and here's the big, I would

4:40

say the nuance that you have

4:42

to really pay attention to, which

4:44

is a lot of companies are

4:46

usually not ready for that, right?

4:48

Your organization has to reach a

4:50

certain level of security maturity. Your

4:53

program has to be mature enough to

4:55

be able to deal with that. Because if

4:57

you open up a bug bounty program,

4:59

meaning you're opening up your systems to the

5:02

world for testing and discovery, reconnaissance, et

5:04

cetera. You have to

5:06

be able to scale and manage

5:08

incoming vulnerability reports. And you need

5:10

to be able to identify and

5:12

classify which are legitimate vulnerabilities and

5:14

which you need to prioritize, prioritize

5:16

rather, and which are false positives.

5:18

Because you have a lot of

5:21

young folks getting into cybersecurity. They

5:23

think something's a vulnerability and sometimes

5:25

it's not. And so, yeah, it's

5:27

a big deal if you get to that point. And

5:29

shout out to Elon for setting that up. I

5:31

think it's a great opportunity for security

5:33

researchers out there, including potential bad actors

5:35

that want to change from that lifestyle,

5:37

to get involved and look at the

5:39

systems. Yeah, I just... doesn't that

5:42

put any of the user data

5:44

at risk, you know, allowing people

5:46

to bang against it? What if

5:48

they find an opening? They can, you

5:50

know, start monitoring clients, or monitoring, you

5:52

know, people that use Starlink, and grabbing

5:54

some of their traffic. And this is

5:56

why I put the emphasis on your

5:58

organization reaching a certain level of maturity

6:01

in its security program before you get into

6:03

it. Because you just brought up something

6:05

that happened many years ago, which is

6:07

the big Uber hack. You remember that?

6:09

Sure. It was an open bug bounty

6:11

program. Uber thought it was mature enough

6:13

for it. Somebody was able to get

6:15

a ton of subscriber data, right? They

6:17

got a bunch of information. And

6:19

they were like, hey, would

6:22

you mind paying me X amount of dollars?

6:24

And I think the guy got paid. And

6:26

he still leaked the information. So it ended

6:28

up in a lawsuit. People got fired. And

6:30

this was prior to the SEC rules where you

6:32

have to, like, disclose a breach within a certain

6:34

time. So, yeah, that was

6:36

a big problem. And, again, if you

6:38

are an organization, you're listening to me

6:40

right now, you're a CISO, a CSO, whatever,

6:43

what happens, CIO, if you're thinking about

6:45

a bug bounty program, you have to make

6:47

sure that your organization has reached a

6:49

certain level of maturity prior to that.

6:51

Because, yes, if an adversary gets in using

6:53

the bug bounty program as kind of

6:55

like noise on the line, they

6:57

will get in and they might abuse

6:59

that. I think if Hacker

7:01

and the Fed goes on for another 10,

7:03

15 years, I think that might change

7:05

the show title to Security Maturity. Two

7:08

old guys talking about cybersecurity. Oh,

7:11

wow. That's so funny. That's great. You know what?

7:13

Let's keep that on the back burner. Yeah, yeah,

7:15

yeah. That'll be the second show. But speaking of

7:17

shows, we just did our first Raw Patreon episode.

7:19

You know, let's see how it goes. Maybe if

7:21

Phineas loves it, then he'll put it out there.

7:23

Shout out to Phineas and Will who go through

7:25

it and see if they want to put it

7:27

out there. If not, you know, we talked about some

7:29

crazy shit. But guys, you want to hear our

7:32

behind-the-scenes shit that we talked about before the

7:34

show? Join the Patreon. Oh,

7:36

yeah. And I tell you, if

7:38

you're sensitive to language, make sure

7:40

you wear some earphones or muffled

7:42

headphones. Yeah. Don't share this with

7:44

children. No, definitely not. Imagine

7:47

two pirates having a conversation about

7:49

cybersecurity and everything in between. That's

7:51

what that was. Wait.

7:54

Between pirates and ninjas, you were a pirate? Ah,

7:57

that's a great question. I would

7:59

be kind of like a hybrid, right?

8:01

I would be like the bastard

8:03

child of a female ninja and a

8:05

really ridiculous pirate. Can you explain

8:07

to people about the old school pirate

8:10

ninja battle? No.

8:12

Can you? No. Anyways,

8:14

back in the day when you were a hacker, you were either

8:16

a pirate or a ninja. And you kind of had to pick

8:18

your team. Yeah, no, that's true.

8:20

There was a whole mindset where, you

8:22

know. If you're like a ninja -style hacker,

8:24

you would be stealth and whatever. If

8:26

you were a pirate, you were very

8:28

noisy and causing a bunch of ruckus.

8:30

And there was a lot of us.

8:32

And all about that booty. It was

8:34

about the booty. Well, in the 90s,

8:36

it was about the defacements, right? The

8:38

defacements were the big thing in those

8:41

days. And that's where a lot of

8:43

the pirate stuff comes in. Because

8:45

you're like, yeah, I'm here, causing

8:47

a ruckus, destroying stuff. And ninjas,

8:49

which is what I mostly did, by the way. I know

8:51

I talk about, you know, me being a hybrid. I

8:54

did the defacements for a while on the political

8:56

side. But as a ninja, you know,

8:58

I broke into systems and I would sit there

9:00

for years. There's one university in Japan, shout

9:02

out to my boys in Japan, that

9:04

I sat on their server for like nine

9:07

years, not even exaggerating. You

9:09

know, and I took care of the server. It was

9:11

an old Sun workstation. I took care of it. I learned

9:13

how to update it. It gave me an opportunity to

9:15

learn how to become a systems administrator. So, yeah, that's

9:17

what that is. So

9:23

hackers spied on 100 U.S.

9:25

bank regulators' emails for over

9:27

a year. So a LinkedIn user

9:29

posted that there's a significant data

9:31

breach involving U.S. bank regulators

9:33

and that the hackers accessed the

9:35

emails of approximately 100 bank

9:38

regulators at the Office of the

9:40

Comptroller of the Currency. I

9:42

don't even know if that's real.

9:44

Maybe they might get DOGE'd.

9:46

Who knows? But they compromised

9:49

150,000 emails. And the breach began

9:51

in June of 2023 and

9:53

continued undetected for over a year

9:55

until the hackers were discovered

9:58

and removed in early 2025. So

10:01

these were ninjas. Yeah,

10:04

very similar. It's like the OPM hack. You know, you

10:06

get in, you sit there, you collect information, right?

10:09

Now, when we look at like the OPM

10:11

hack, the attribution for that was like Chinese

10:13

actors, right? Because it's intelligence, right?

10:16

I wouldn't be surprised if this

10:18

is kind of the same

10:20

thing here. It makes sense. Yeah,

10:22

supposedly they were looking for

10:24

emails involving internal communications and deliberations

10:26

about financial conditions and federally

10:29

regulated financial institutions. So

10:31

it sounds like they were using it

10:33

to try to make some money. So

10:35

that's the one thing that it's kind

10:37

of tough. So trying to make money

10:39

off these cyber attacks, the SEC can

10:41

sniff out some of this stuff. If

10:43

you're outside the normal means, you start

10:45

getting investigated pretty easily. Yeah,

10:47

you know, I've thought about that

10:50

because I've seen

10:52

stories where bad guys or people

10:54

that hired bad guys have

10:56

been caught by the SEC for

10:58

exactly this. A good example

11:00

is, do you remember? Yeah,

11:02

this is this goes along. This goes

11:04

back like at least 10 years, maybe

11:06

less. But there was a whole thing

11:08

with St. Jude. Remember St. Jude? There

11:10

was an incident where some adversaries got

11:13

information that some of

11:15

the pacemakers out of St. Jude had

11:17

defects. Oh, no, I don't

11:19

remember this. Yeah, and they went

11:21

public with, like, kind of like

11:23

a... You ever seen those, like,

11:25

Hindenburg exposés of a company? They

11:27

did it to Nikola. They did

11:29

it to Tesla. They do

11:31

these big exposures. Anyways, a

11:33

very similar company did an exposé like that

11:35

with insider information about pacemakers that could

11:37

potentially be defective. And, like, St. Jude

11:40

lost a bunch of money. There was,

11:42

like, a whole bunch of, like... trading

11:44

with the companies associated with

11:46

St. Jude that manufactured the devices.

11:50

And you know what? The SEC caught on to

11:52

that, I think. So

11:54

it's difficult. It's not easy

11:56

in these situations. It's

11:58

always crazy to me about these persistent

12:00

attacks. These guys have access for so

12:02

long. You know, there was a guy

12:04

on my squad that was working on

12:06

a case of a major international bank.

12:09

And they were hackers that were in

12:11

their system for over three years. No

12:13

one knew about it. They had

12:15

3,000 cybersecurity experts at this bank watching

12:17

things. And they never saw these guys

12:19

in there. And because these guys

12:21

weren't doing anything crazy, all they were

12:23

really going after was poaching contact

12:25

information for the elite clients. Interesting.

12:29

Yeah. They just wanted to be able to reach

12:31

out to the elite clients for, you know,

12:33

try to pull them out of the bank or

12:35

try to get them to, you know, do

12:37

stuff for them. But they finally got caught. Well,

12:39

think about it like this, right? You're an

12:41

adversary. You're inside this massive bank. And you're like,

12:43

okay, what can we do? Can we transfer

12:45

money out? Possibly. But,

12:47

you know, there's probably going to be a

12:50

process that may require physical access,

12:52

right? Similar to like the

12:54

Bank of Bangladesh hack from many years ago. Can

12:56

we get physical access enough to be

12:58

able to transfer $20 million to a Hong

13:00

Kong bank account? Likely

13:03

not. Okay, cool. So what else do

13:05

we have? Information. Can we leverage information? Can

13:07

we do insider trading? Or can we

13:09

just sell the information to maybe a competitor

13:11

bank, maybe a bank in Macau, whatever,

13:13

right? Yeah, I

13:15

think about this stuff all the time.

13:17

And when you see a story like this,

13:19

you know what comes to my mind,

13:21

Chris? What's that? Tip of the iceberg, brother.

13:24

Oh, yeah. This is just the tip of the iceberg.

13:27

Imagine all the other agencies and organizations

13:29

that have been breached with long-term persistent campaigns.

13:31

No, I agree with you. Yeah, I mean,

13:34

you know, the FBI director, well, back in

13:36

my day, three FBI

13:38

directors ago, said there are those companies that have been

13:40

hacked and those that don't know they've been

13:42

hacked. So, you know,

13:44

that's a scary situation to think about it

13:46

that way. Well, you know what

13:48

the old joke is, right? So before

13:50

cloud computing became a real big thing, right,

13:53

the old joke was like, Well, how do

13:55

you know that you've been compromised? Well,

13:58

how are your systems running? Oh, my

14:00

systems are running fine. Yeah, you've probably

14:02

been hacked. Right? Because the

14:04

adversary gets in, they have to make sure to

14:06

maintain persistence. What does that mean? It means everything

14:08

under the sun. That means they're going to take

14:10

care of your network to make sure you're online. Yeah,

14:13

they're going to clean out all your

14:15

bad shit. Yeah. Stuff's going good. Hey,

14:17

I speak from experience. I used to

14:19

get on there and update the systems

14:21

and fix SQL errors and PHP errors.

14:24

I want to maintain access. So yeah, I get

14:26

it. How would you do it and not

14:28

get caught? Well,

14:30

it depends, right? So now it's

14:33

different because now there's a lot of

14:35

tools, depending on your target, obviously,

14:37

but there's a lot of tools looking

14:39

for you, right? If you're an

14:41

organization, you've invested in like a next

14:43

generation firewall, they're looking for SSH

14:45

tunneling, DNS tunneling, depending on your configuration.

14:47

They're looking for weird activity at

14:49

three in the morning. If you have

14:51

any zero-trust products, you have to

14:53

deal with that as the adversary. But

14:56

back then, brother, it was the Wild West.

14:58

Once you got in, you got in. Yeah, it

15:00

was just a hard outer wall. It was

15:02

literally like the Great Wall of China, a big

15:05

wall around things. Once you're inside, you do

15:07

whatever you want. Yeah, all they had back then

15:09

was Snort, right? And Snort was a signature -based

15:11

system. So if you did not match any

15:13

signatures that the Snort user was using, then you're

15:15

good. You would fly right by. Yeah.

15:17

It's interesting. So...

15:19

does that mean you change

15:21

your approach now when

15:24

you're red teaming? Oh, yeah. Now,

15:26

as far as emulating adversary attacks? Absolutely. Um,

15:28

you know, can you do good

15:30

red teams, like, you know, if

15:32

there's good endpoint... you know, the

15:34

AI-based endpoint stuff, and, you

15:36

know, watching you do it in

15:38

the middle of the night? Well,

15:40

the thing is, it's like... So

15:42

now in 2025 with all the

15:44

great tools we have, we have

15:46

SentinelOne and CrowdStrike and all

15:48

the EDRs and NDRs, all that

15:50

good stuff. It becomes much more

15:52

difficult to engage a red team,

15:54

okay? So as someone is emulating

15:56

an adversary, what you're looking for

15:58

is the slightest misconfigurations and oversights

16:01

because that's what you're going to

16:03

leverage to circumvent these systems, right?

16:05

You might have an organization that

16:07

just bought an EDR and they

16:09

set it and forget it. No

16:11

customization. No extra

16:13

sysconfigs, no extra anything. And

16:15

they're expecting that their EDR is going to

16:18

stop everything, every single attacker. That's not reality.

16:20

It's the furthest from the truth. This is

16:22

why we talk about it all the time. If

16:24

you buy something, you're going to invest in

16:26

the product. Talk to

16:28

the security team over there. Talk

16:30

to your sales engineer. Get

16:33

in contact with support, whatever. Learn

16:35

about the product. Read the documentation. Customize

16:37

it. Because these products are not, you know, one

16:39

size fits all. That's not, that's not

16:42

the truth at all. So, uh, but yes,

16:44

going back to your question, if

16:46

it's a red team, um, we're

16:48

trying to minimize noise, that's one. Two,

16:50

we're trying to live off the land as

16:52

best as we can. And then three,

16:54

we're looking for misconfigurations. If it's just, like,

16:56

a pen test, we're noisy as hell,

16:59

because that's the point of a pen test.

17:01

The point of a pen test is to

17:03

identify gaps, right? Including, are you guys actually

17:05

catching us while we're making noise? If

17:08

they're not catching you and you're doing a

17:10

pen test, then there's something really wrong. Unfortunately,

17:12

that happens quite often, my

17:14

friend. Really? Yeah. On just a

17:16

box that was dangling off

17:19

on an IP they didn't know

17:21

about? No. Listen,

17:24

sometimes an organization will invest in

17:26

an EDR, but EDRs are

17:28

very expensive. So what they'll do

17:30

is they'll deploy the EDR

17:32

on 50% of their workstations. They'll

17:36

deploy an EDR on all of

17:38

their employee workstations except the servers.

17:41

Or they'll do the opposite. They'll put the

17:43

EDR on all the important servers, and

17:45

everybody else is, don't click on the link,

17:47

right? I don't understand that. It's like

17:49

wearing half a condom. Yeah, it's exactly it.

17:51

It's like wearing little finger condoms, right?

17:53

Yeah. Those things get lost after a while,

17:55

you know I mean? Here's

17:57

the reality. Budget. Budget, budget,

17:59

budget. Those

18:01

EDRs, listen, if you're a company with

18:03

5,000 employees, and that's not even

18:05

a lot, right? That's considered SMB,

18:07

right? Small or medium-sized business. 5,000

18:09

employees, that means 5,000 workstations. You

18:11

know what that bill is going

18:13

to be every year, Chris? Yeah. That's

18:15

like a million dollars plus. Yeah.

18:17

And plus, when they do it, they

18:19

just set it and forget it. They plug

18:22

and play and don't configure it properly.

18:24

Oh, yeah. And now you come back, you

18:26

can retort. You could be an audience

18:28

member in return and be like, well, Hector,

18:30

if they have 5 ,000 employees, they could

18:32

easily afford a million dollars for, you

18:34

know, whatever. Well, the truth of the matter

18:36

is not a lot of companies are

18:38

built that way. A lot of companies have

18:40

loss leaders. A lot of companies have

18:42

to think about the future. A lot of

18:44

companies' budgets are in the future. So

18:46

with that being said, yeah, I've seen a

18:48

lot of organizations that they'll just buy

18:50

EDRs for all the servers and the workstations

18:52

are as is, or they'll try to

18:54

mix it up and, you know. I don't

18:56

know. I've seen some weird setups, bro. Yeah.

19:01

And that never surprised me. You get

19:03

a new client and you're like, how

19:05

the hell have you existed this long? How

19:08

have you not been hacked yet? That's

19:10

the question. All right. Shuckworm

19:13

targets foreign military missions

19:15

based in Ukraine. So Shuckworm is

19:17

a Russian-linked espionage group targeting

19:19

Western countries' military missions in

19:21

Ukraine. It started in February of '25

19:23

and continued into March of

19:26

'25, and it looks likely they're

19:28

infecting removable drives, indicated by a

19:30

Windows Registry value under the

19:32

UserAssist key linked to a

19:34

.LNK file named

19:36

D:\files.lnk.

19:39

So did you read about this one, Hector?

19:41

Of course. This is, uh,

19:43

this is fun. You know what

19:45

this reminded me of? It

19:48

reminded me of what was that

19:50

really popular one that happened in Iran?

19:52

I forgot the name of it.

19:54

You know what I'm talking about. With

19:57

the nuclear facility? Stuxnet. Stuxnet? Yeah.

20:00

It reminds me of Stuxnet a little bit, right?

20:02

Because there was the propagation through removable drives. This

20:04

is exactly what we're seeing here. Now,

20:06

it wasn't the actual initial entry,

20:08

but the initial entry here, there

20:10

was an attack chain. There was

20:13

that initial like payload, the payload

20:15

executed PowerShell. Then it used HTA.

20:18

It basically lived off the land for the

20:20

majority of it. And then it propagated through

20:22

removable drives. So, yeah,

20:24

it's pretty bugged out to see that this still

20:26

works in 2025. This should not be a

20:28

thing. Now, if you had, and

20:30

this is a great example. We were just

20:32

talking about a moment ago, Chris. If

20:35

you have an environment and you've

20:37

customized, if you've customized your endpoint

20:39

security. Aside from

20:41

buying and deploying an

20:43

EDR, we're talking about you setting

20:45

up sysconfigs, you're following STIGs, you're following all

20:47

these different rules. Something like this

20:50

probably would not be a thing, but

20:52

unfortunately. It looks like

20:54

they were using part of a

20:56

Tor network proxy to obscure the

20:58

original IP of where stuff's going

21:00

back. So shouldn't your EDR be

21:02

blocking Tor IPs? Well, that would

21:05

be the NDR, right? Your NDR

21:07

slash, yeah. Your NDR is your

21:09

network detection and response and or

21:11

your next generation firewall could have

21:13

caught some of that. Because,

21:16

you know, again, again, Chris,

21:18

you buy a product, you do

21:20

not customize it, right? You

21:22

take a next generation firewall, it could block

21:24

Tor traffic right off the jump, okay? Depending

21:27

on the product, obviously, and the

21:29

rules. But they

21:31

could have caught this if they've

21:33

made the proper investments and

21:35

configurations. Yeah,

21:37

but it looks like Symantec has

21:40

put out some indicators of compromise,

21:42

IOCs in our world, if we

21:44

want to keep going on with

21:46

that stuff. Put out some file

21:48

hashes, and Symantec endpoint has blocked

21:50

some of it. So something to

21:52

look into, guys. If you guys

21:55

are running your networks out there,

21:57

something you're going to want to

21:59

think about blocking some of these

22:01

files and blocking these connections. So

22:03

LastPass hack keeps giving us, Hector.

22:05

Gives us, gives us, gives over

22:07

and over and over again. It

22:09

just keeps on going. Keeps on

22:12

coming. The latest thing we're seeing

22:14

is that details of ongoing cryptocurrency

22:16

theft that started with the 2022

22:18

LastPass hack where hackers stole vault

22:20

data that has now affected over 25

22:22

million users. They're now disclosing

22:24

that crypto is being swept

22:27

over, dust and all, from

22:29

at least

22:31

95 confirmed victims of

22:33

the LastPass hack, who lost over

22:35

$50 million. Oh,

22:37

yeah. Part of the larger

22:39

$430 million theft total

22:41

from LastPass. Well, here's

22:43

what we know. We know that LastPass was breached

22:46

at some point, and that's an episode on

22:48

its own. In fact, remember when we did the

22:50

whole Pegasus episode? You should probably do an

22:52

episode on LastPass at some point, because it was

22:54

such a massive breach, and there were a

22:56

lot of mistakes that were made, Chris. Yeah,

22:58

the only problem with doing an episode with it

23:00

is new information just keeps always coming out. I

23:02

love doing a wrap-up. Yeah, I always love

23:04

doing a wrap-up once we know everything. And who

23:07

knows when LastPass is going to end. But you're

23:09

right. We should do an episode on it. Yeah.

23:11

I mean, we could try to bring

23:13

the audience up to speed, but you're right.

23:15

This could go on for the next

23:17

five years minimum because these adversaries stole so

23:19

much plain text data from these vaults.

23:21

Remember, LastPass is great for

23:24

what it does, but if there's a

23:26

plain text section there, you have to

23:28

assume that it's also going to be

23:30

plain text for the adversary if the

23:32

platform is compromised. And that's exactly what

23:34

happened here. So, yeah. So Secret Service,

23:36

FBI, DOJ all collaborated to link the

23:38

LastPass breach to at least $150 million

23:40

in cyber heists. But it sounds like it's

23:42

going up more and more and more

23:44

and more. Even Ripple's co-founder, Chris

23:46

Larsen, was part of that. You

23:49

know, the hack began in August of

23:51

2022 with hackers stealing the vault data

23:53

and the backups. And now we're up

23:56

to almost $430 million in losses from

23:58

these guys. So I'm going to guess

24:00

we're going to see some civil complaints

24:02

on this whole thing. Well, I mean,

24:04

this happened in 2022. We haven't seen

24:06

shit since then, right? Yeah, that doesn't

24:08

make sense to me. You

24:11

know, we saw what happened with

24:13

Oracle last week. Oracle last week said,

24:15

hey, we were not breached. Some

24:18

guys said they were breached. And then

24:20

immediately, once there was enough evidence

24:22

posted online, boom, lawsuit, right?

24:24

We saw a series of lawsuits against Oracle. But

24:27

LastPass, they're getting a pass,

24:29

apparently. Damn. I

24:31

don't know, Hector. There's so much to talk about

24:33

going to cybersecurity. We could go on and on

24:35

about this. We got a Windows zero-day exploit

24:37

out there going around. WordPress plug

24:39

-in vulnerability. Do you see that

24:41

one going around with at least

24:43

affecting 100,000 WordPress sites? Yeah. Ransomware

24:47

attack going around against a pension fund

24:49

in Australia. Spyware targeting

24:51

advocacy groups. And, you know, and they also

24:53

put out a report of AI-powered cybercrime

24:55

surge. You know, research shows a 55% increase in

24:57

spear phishing effectiveness. Have you seen this? I've

24:59

been getting a lot more sophisticated spear phishing

25:01

attacks lately. A ton. And I'm getting on

25:03

the text messages too. Like these guys. Text

25:05

is out of control. Oh, yeah. The text

25:08

is out of control. And, you know, I

25:10

would hope, I would have hoped right now

25:12

we would have protections set in place to

25:14

kind of deal with this. But

25:16

it just highlights the fact that

25:18

our mobile communication systems are still

25:20

30 years behind. It really is.

25:22

Have you ever been sent a

25:24

text that had a legitimate link

25:26

in it that wasn't from, well,

25:28

not even a known entity? I

25:30

don't get any texts with links

25:32

in them. No, I get texts

25:34

with links, actually. We get Signal

25:37

back and forth, but I feel

25:39

a little bit more secure with

25:41

signal. But if I get a

25:43

text with a link in it, well, I

25:45

guess I have an iPhone that auto-loads

25:47

some of that shit. Well, that's a problem,

25:49

right? You don't want your phone to automatically

25:51

create a URL redirector for you. But here's

25:53

another thing. You bring up a good point.

25:55

If an adversary, and you know what's funny?

25:57

This is actually a really funny point. And

25:59

this is for the audience here that's into

26:01

web application security. Or you guys manage or

26:04

own or develop web apps. If

26:06

you are to use, like,

26:08

I love Burp Suite. Burp Suite.

26:11

OWASP ZAP, these are, like,

26:13

web application, you know, in-between

26:15

proxies to help you identify vulnerabilities, et cetera, et

26:17

cetera. You use these tools

26:19

or use an automated tool to, you

26:21

know, do an assessment of your

26:24

web application and it might find maybe

26:26

an open redirect or something. And

26:28

in most cases, that open redirect is

26:30

considered a low severity issue. So

26:32

from a developer's point of view, you're

26:34

like, eh, I can fix that

26:36

next week, next month, next year. It's

26:38

not critical until your open redirect

26:41

is used for a phishing campaign. Now

26:43

it's a critical, especially if it's

26:45

against your clients. If it's against your users,

26:47

your customers, and they're getting a

26:49

legitimate link from your website, your web

26:51

application, because you did not validate

26:53

the destination URL in your redirect, it's

26:55

a wrap. It's going to be

26:58

super successful. And yeah, we've seen that.

27:00

Absolutely. Again,

27:02

we're in 2025. That's

27:04

an issue that's been

27:06

targeted, exploited, used, and abused since

27:09

like the 1990s. You know, phishing

27:11

campaigns in the 90s, listen to this,

27:13

it's crazy. This is back when

27:15

AOL was still very popular. And

27:17

the AOL web servers, brother, were

27:20

so vulnerable. Shout out to Steve Case

27:22

from AOL. LOL. Those

27:25

servers were so vulnerable, brother, that you could find

27:27

an open redirect on anything. And when you send

27:29

a phishing link to somebody on AOL, it had

27:31

an AOL link. So you're thinking like, oh, this

27:33

is legitimate. Until you click it, you get phished,

27:35

and you're done. You're owned. Yeah,

27:37

it's like, you know, then it morphed into

27:39

putting things on Google Drive, so then it

27:41

had a Google link to it and all

27:43

that. Yeah, then eventually it would be Google links

27:46

and then Box and, you know, yeah. It

27:48

just keeps evolving. Guys, don't click on

27:50

links. Just don't click on them. Yeah,

27:53

look what happened when I clicked the link. I

27:55

got locked up. That's right. That's right. Thank God you

27:57

clicked on that link. Shout out to Tropo, my

27:59

boy. All right,

28:01

guys. I'll spit it out this

28:03

time a little better. Questions at hackerandthefed.com.

28:05

If you guys got anything you want

28:07

to ask us, reach out to us.

28:09

Questions at hackerandthefed.com. Hector is excellent

28:12

about answering your questions. I am shit.

28:14

I am not good at reaching out.

28:16

I owe people still some answers. I

28:18

will do that. I'll say

28:20

tonight, but I'm going to say tomorrow. Bro,

28:22

you have an email there from March 14th.

28:24

It's specific to you. I know. I know.

28:26

I am a dick about getting back to

28:28

you on email. Every time I go

28:30

to sit down, it's just, I don't know. I

28:33

got to do it. I will do it. I promise. I owe

28:35

it to you. I owe it to the listeners. Well, let

28:37

me tell you. You know what makes me happy? I

28:39

love that last email that came

28:41

in today. You know what I'm talking

28:43

about. That is such a beautiful

28:45

email. And I'm happy to hear that,

28:48

you know, some of you guys

28:50

are listening to us. You guys are

28:52

getting the bug. You're getting motivated

28:54

to get into the industry or swap

28:56

from blue team to red team.

28:58

I love that. And I love to

29:00

hear when you guys are having

29:02

success in that space. God bless. Hector,

29:04

it was such a positive message

29:06

you just gave the listeners about writing

29:08

those questions at hackerandthefed.com. But now

29:10

it's time for Hector's weekly rant.

29:12

Oof. Is it going to be

29:14

a good one? I'm excited. It's going to

29:16

be a good one. It's going to

29:18

be a tough one. It's tough because the

29:20

last time I did a rant that

29:22

had a political alignment, we got an email

29:24

from a listener that said, hey, guys.

29:26

I am never listening to you guys ever

29:28

again. And I wrote the

29:30

person back. I said, listen, brother, I

29:33

understand. And that's your prerogative. I appreciate you.

29:35

Thank you for listening. And then I

29:37

sent him some corrections. I think he was

29:39

wrong with something. But nonetheless. But I

29:41

don't understand that because I like listening to

29:43

the opposite side of me. I do

29:45

not want to live in an echo chamber.

29:47

Like, to me, it's just, it's great.

29:49

Now, sometimes I can't watch movies if it's

29:51

over the top on the other side.

29:53

Sure. But, like, I like talking to educated

29:55

people and hearing educated people speak about

29:57

the other side. I love it. I agree.

29:59

You know, I am somebody that will

30:01

watch. I go to Drudge Report all the

30:03

time. Drudge Report is a conservative -leaning news

30:05

site. You know, I

30:08

listen to conservative podcasts. You

30:10

know, I listen to liberal podcasts. As you guys know,

30:12

I'm in the middle, right? You know,

30:14

I'm not specific to any side. But I

30:16

try to get as much perspective on both sides

30:19

because I learned one thing very early on

30:21

in my life. And that is I can never

30:23

take one source of information as truth. I

30:25

have to look at, you know, multiple sources because,

30:27

you know, I want some correlation. I want

30:29

to be able to correlate some of this stuff.

30:31

Well, you and I have been in the

30:33

media enough to realize how much the media gets

30:35

wrong. Like stories that either one of us

30:37

have been involved in, you know, I've never read

30:40

one that was completely truthful. Even

30:42

to this day, like I

30:44

look at my Wikipedia sometimes and...

30:47

Or I get reminded of it. I keep forgetting I have a

30:49

Wikipedia. And I'm like, damn, half this shit is wrong. But

30:51

I'm not going to be the weirdo that's going to update and

30:53

try to fix this. Because then I'd look like a narcissist. So

30:55

I just let it be. But even to

30:58

this day, 12 plus years later, people still

31:00

get my story wrong. And I'm like, I

31:02

don't know how else to correct it. I've

31:04

answered all the questions. But

31:07

it is what it is. But going back to

31:09

the rant, guys. As

31:12

you can imagine, Chris and I

31:14

are very opinionated people. And

31:16

we're not scared to speak our minds. When

31:19

it comes to cybersecurity, I've been very passionate

31:21

about it since I was a kid. Ever

31:23

since I saw WarGames and

31:25

I watched The Net. And, you

31:27

know, I got really big into the scene, into the

31:30

community. I thought I wanted to

31:32

be a hacker. I thought the scene was

31:34

cool until I realized it wasn't. I even

31:36

became radicalized along the way. And I admit that

31:38

and obviously I've changed. But there's

31:40

one thing that I don't like, my friends. And

31:43

that is a lack of consistency. And

31:46

so tonight's topic is going to

31:48

be on a recent story that happened.

31:50

And it's kind of ongoing. It's

31:52

a big deal in the cybersecurity industry.

31:55

And it's on the... Let's

31:57

see how I can put

32:00

this. So you guys know

32:02

there's an organization called CISA

32:04

that is, the Cybersecurity and

32:06

Infrastructure Security Agency. CISA is

32:08

really fantastic. Go check the

32:10

website out. It's awesome. In

32:12

fact, CISA was heavily supported...

32:14

pushed by our president here

32:17

in the United States, Donald

32:19

Trump. God bless him. And,

32:21

you know, I personally

32:23

feel it's one of the things that

32:25

he did right, right? Pushing CISA and

32:27

pushing cybersecurity initiatives. And

32:29

it was mostly

32:31

agnostic politically, right? Because

32:33

not everyone at CISA was like conservative.

32:35

It was a mix of people. And I

32:37

love that because it was a reminder

32:39

to us, my friends. that cybersecurity does not

32:41

need to be politicized. I'm sure you

32:43

heard me say that before. Unfortunately,

32:46

things have changed. And

32:48

there's a story where

32:50

the former director of

32:52

CISA, Chris Krebs, who

32:54

now heads SentinelOne, or

32:57

is associated with SentinelOne, there's

32:59

been a story that back in

33:01

2020, when CISA was overseeing the

33:03

election, Mr.

33:05

Krebs posted a tweet saying, We

33:08

found no evidence of election

33:10

interference. The election systems and

33:12

everything worked as intended. And

33:15

that's that. But unfortunately,

33:17

and as reported on April 10th,

33:19

a couple of days ago, a

33:21

few days ago, our president has

33:23

decided to revoke the security clearances

33:25

of both Krebs and SentinelOne,

33:27

the organization that he works for

33:29

or is associated with, because

33:31

of that tweet. It

33:33

is a political decision. I

33:36

think I would have loved if Trump would have

33:38

sat down with Mr. Krebs and said, hey, buddy,

33:41

I know back in 2020 you said X, Y, and Z.

33:43

Here's what I think. Here's the evidence that we have.

33:45

I want to see your evidence. Let's

33:47

sort this out. Let's clear this up. But

33:49

unfortunately, it's not the case. Now, that's

33:52

not the problem that I have, Chris. Chris,

33:54

that's not the problem I have. What's

33:57

the problem here? The

33:59

problem that I have

34:01

is this cybersecurity industry

34:03

is mostly silent. My

34:05

friends, I like

34:07

consistency. The cybersecurity industry is

34:09

very vocal. They like to open their

34:11

fucking chops. Pardon my language. But your

34:13

boy Hector's upset. You

34:16

have people in the cybersecurity industry,

34:18

they like to talk. They like to

34:20

tweet. They'll make a YouTube video

34:22

when something's not right until something like

34:24

this happens. And what

34:26

that tells me, Chris, you know, and

34:29

let me just point out something. Because

34:31

I was the bad guy in my former

34:33

life. I'm used to being the bad guy.

34:35

I get it. I'm not afraid to

34:37

say what I feel. I

34:40

might lose a customer. That would suck. But

34:43

the one thing I'll tell you is I'm

34:45

not biased when it comes to stuff like

34:47

this. I try to be as neutral and

34:49

objective as humanly possible. But

34:52

the rest of the industry,

34:54

they're kind of hightailing it. They're

34:57

not really saying what they were saying two

34:59

weeks ago about other topics. They're

35:01

totally cool with this because they're

35:03

afraid that Donald Trump is going

35:05

to turn against them. And that's

35:08

just the reality. Our friend

35:10

Jeffrey Carr, you know Jeffrey Carr. I

35:12

love Jeffrey. You know, he's been in the

35:14

game for a long time. He's not

35:16

in information security. He's in intelligence. He's a

35:18

different part of the industry. But

35:20

even he went on LinkedIn. And, you

35:22

know, my friend Jeff, you

35:24

know, he's dealing with a cancer situation. He's

35:26

publicized it. I can talk about it. And

35:29

even he had to come out of recovery

35:31

to make some posts and say, dude, what the

35:33

hell is going on with this industry? Nobody's

35:35

saying anything. You guys are so vocal

35:37

with everything else, and all of a sudden you're tongue-tied.

35:40

It's okay to have dialogue, my

35:42

friends. It's okay to offer an

35:45

opinion. But when it

35:47

comes to politics, folks start to fold

35:49

because they want something in return. They

35:51

want to play the game. They enjoy that

35:54

pay-to-play, and I don't like that. I

35:56

don't pay to play. Nobody pays me to

35:58

play, Chris. So, because

36:00

of that, my rant

36:02

today is on the inconsistency

36:05

and straight-up foolery of

36:07

the cybersecurity industry and

36:09

the leaders within. And

36:11

many of them are my friends, and they're quiet about

36:13

it, and that's okay. Until now.

36:15

Speak the fuck up. Don't be afraid to

36:17

talk. Because at the end of the day,

36:19

what was that great quote? Remember that great

36:21

quote? You know, first they

36:23

came for this, then they came for that. And then

36:25

by the time they got to me, there was no one

36:27

left to defend me. Remember that quote? Sure. Really famous,

36:29

right? Really popular. That's where the fuck we're at. Right?

36:32

Speak up. It's okay. All

36:34

right? So your problem isn't necessarily with

36:36

Trump. You don't like what he

36:38

did, but you think it's bullshit. The

36:40

cybersecurity experts aren't standing up for

36:42

Chris. That's exactly right.

36:44

And by the way, Chris, the person, he

36:46

may not even like me. He may be like,

36:48

why the hell is Hector trying to

36:50

defend me. No, no, Chris, I'm not defending you,

36:52

brother. Right? That's not what this is. I'm

36:55

speaking up because the industry is very

36:57

vocal with one thing, but when it

36:59

comes to things like this, they're not.

37:02

They put their heads in the sand

37:04

and wait for the drama to kind

37:06

of, you know, slide off, right? To

37:09

move on. And that's not realistic.

37:11

That's not real. We are all in this

37:13

together. How many times have I said

37:15

on the podcast, Chris, that we all, all

37:17

of us play a very important role in

37:19

the overall security posture of the United

37:21

States. And for our listeners, our friends,

37:23

my boys in Australia, Aussie, Aussie, Aussie,

37:25

right? And all my friends in Deutschland

37:27

and all my friends in France, you

37:29

guys get it. It's okay to speak

37:31

up. You don't have to defend Mr.

37:33

Krebs and you don't have to hate

37:35

Donald Trump, but it's okay to be

37:37

able to speak. And I'm not keen

37:39

on what I'm seeing. Infosec

37:42

Twitter, the ones that are speaking up

37:44

are the ones that are usually the fringe.

37:47

But the ones that are out there

37:49

creating content and making videos and posing

37:51

for pictures with a nice suit on,

37:53

with nice, beautiful, pearly teeth, they're

37:55

hiding, Chris. And that's bullshit. So that's

37:57

my rant. Be consistent, cybersecurity industry, because

38:00

what you're going to do is you're going

38:02

to start pushing people away. And there

38:04

we go. I hope not, Hector. I

38:06

hope that's not true. Well, unfortunately,

38:08

it is. Because, look, you have these young

38:10

people. Look at these young people now. We

38:12

get a lot of emails asking us, hey,

38:14

how do we jump into the cybersecurity industry?

38:16

Okay? If

38:18

those young people start to see this

38:20

kind of inconsistency, they're going to start

38:22

questioning whether or not they even want

38:24

to get involved in the cybersecurity industry.

38:26

And that's the last thing. And you

38:28

might doubt me, fellas and ladies, right?

38:31

But that's the last thing we want, especially

38:33

in a space. Get this, Chris. Something

38:35

I want to kind of point out there. I

38:38

love Chinese people. I love Chinese

38:40

history. I love China. I may

38:42

not like their government. But

38:45

there's a difference right now between the

38:47

U .S. and China that we're kind of

38:49

overlooking. There's two main differences. One, China's

38:51

looking 100 years ahead. The

38:53

United States looks two years ahead. That's

38:55

one. Our system, that's as far

38:57

as they can look. That's as far as we

38:59

can look, and that's really going to hurt us

39:01

in the long term. Two,

39:07

China, on a whim, because of the

39:09

way it's structured over there and

39:11

the government and all that. They could

39:13

push an entire generation of people

39:15

into STEM, right? Science

39:17

and technology, right? While

39:20

we are pushing potential

39:22

employees or rather potential

39:24

students away from STEM.

39:27

We're doing it actively. We're

39:29

scaring people away, right? And

39:32

if you like what it is

39:34

that you have in this

39:36

country right now, right,

39:38

you may want to rethink your strategy.

39:41

in the long term. So with that

39:43

being said, I want folks to get

39:45

into this industry. I want us all

39:47

to work together. I don't care if

39:49

you're black or white, if you're religious

39:51

or not, or politically aligned or not. We

39:54

need to be able to work together, and

39:56

that includes being able to have a conversation. And

39:58

that's all. Well said,

40:00

friend. Well said. All right, guys,

40:02

five-star reviews, share us on social media.

40:04

Tell your coworkers and your friends about

40:06

us. We're just trying to make cybersecurity

40:08

better. If you want to support the

40:10

show, join our Patreon. Hopefully Phineas and Will

40:13

have it up soon so you guys can join. Friend,

40:15

I've loved catching up with you. I look

40:17

forward to our next show. Cheers,

40:19

friend. Cheers.
