Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
A few years back, a listener wrote to me
0:02
to tell me about a problem they're facing. Okay,
0:05
check this out. They went to buy
0:07
a house, right? And when you
0:09
go to buy a house, there's like a little dance
0:11
that everyone does. Like, do you
0:13
give them the money first?
0:16
Or do they give you the deed first and the keys?
0:18
Or do you do like a quick swap at the same time? What
0:22
if it's a phony check or the deed is made
0:24
up? This is where escrow
0:26
comes in. Both the seller
0:29
and buyer hand their things to a third
0:31
party. Someone that both sides
0:33
trust and waits for everything to
0:35
clear. If the check clears and
0:37
the deed is valid, then escrow says, okay,
0:39
the deal is done and gives the money
0:41
to the seller and the keys to the
0:43
buyer. So this guy, a
0:45
listener of mine, says he bought a house.
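That little escrow dance boils down to one rule. Here's a toy sketch of it in code (purely illustrative, the function and messages are made up):

```python
# Toy model of the escrow dance described above: both sides hand their
# assets to a trusted third party, which releases them only once BOTH
# the buyer's check and the seller's deed check out.

def settle_escrow(check_clears: bool, deed_is_valid: bool) -> str:
    """Release assets when both sides clear; otherwise unwind the deal."""
    if check_clears and deed_is_valid:
        return "money to seller, keys to buyer"
    return "deal canceled: money back to buyer, deed back to seller"

print(settle_escrow(True, True))   # the happy path
print(settle_escrow(True, False))  # phony deed: everything is returned
```

The scam in the story happens one step later, after both sides have cleared: the payout destination itself gets spoofed.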
0:48
And during this process, he gave $250,000 to the escrow
0:50
company. But
0:54
then someone scammed the
0:56
escrow company. They posed as the
0:59
seller and said, hey, could
1:02
you just deposit the money into our bank account
1:04
directly? And escrow's like, oh yeah, of course,
1:06
no problem. We do this all the time. Here you go.
1:09
And they deposited the $250,000 into the scammer's account
1:11
instead of the actual seller. But
1:16
here's the crazy part. Because
1:18
the seller never got
1:21
the money, escrow wouldn't
1:23
give the keys to the buyer. They
1:25
were being jerks about it. They were trying
1:27
to say, oh, sorry, we lost the money.
1:30
No house for you. The deal has been canceled.
1:33
The buyer's like, whoa, whoa, whoa, no,
1:35
no, no. That's what escrow is for.
1:38
You're our trusted third party. We trusted
1:40
you to do this deal. You
1:42
screwed up. And that's not our
1:44
problem. That's yours. But escrow's like,
1:46
no. I
1:51
never got an update on what happened here and
1:53
if this got resolved. I think the buyer took
1:55
escrow to court to try to get their money
1:57
back. What a
2:00
nightmare, though, to send a huge check
2:02
somewhere only for it to go to
2:04
the wrong place and then someone else runs off with
2:06
the money. Arrrgh! These
2:12
are true stories from the dark side of
2:14
the internet. I'm Jack Rhysider. This
2:19
is Darknet Diaries. This
2:40
episode is sponsored by NetSuite. Your business was
2:42
humming, but now you're falling behind. Teams are
2:44
buried in manual work. It's taking
2:46
forever to close the books. Getting
2:48
one source of truth is like pulling teeth. If
2:51
this is you, you should know these
2:53
three numbers. That's
2:59
the number of businesses that have
3:02
upgraded to NetSuite by Oracle. NetSuite
3:04
is one of the top cloud
3:06
financial systems, streamlining accounting, financial management,
3:08
inventory, HR, and more. NetSuite
3:12
turns 25 this year. That's
3:14
25 years of helping businesses do
3:16
more with less. Close
3:18
their books in days and
3:21
drive down costs. Because
3:24
your business is one of a kind. Manage
3:26
risk, get reliable forecasts, and
3:29
improve margins. Everything you
3:31
need, all in one place. Right
3:33
now, download NetSuite's popular KPI checklist.
3:35
It's designed to give you consistently
3:37
excellent performance, absolutely free. Get
3:41
it at netsuite.com/darknet. That's
3:44
netsuite.com/darknet to get
3:46
your own KPI
3:49
checklist. netsuite.com/darknet.
4:00
Some things in life are no-brainers. You always
4:02
choose the window over the middle seat. You
4:04
always go to the restaurant with the free
4:06
breadsticks. You outsource business tasks that you hate.
4:09
But what about selling with Shopify? From
4:12
the launch your online shop stage to the
4:14
first real-life store stage all the way to
4:17
the did we just hit a million orders
4:19
stage? Shopify is there to help you grow
4:21
and running a growing business means getting the
4:23
insights you need wherever you are. With Shopify's
4:26
single dashboard you can manage orders, shipping and
4:28
payments from anywhere. What I think is cool
4:30
about Shopify is it really doesn't matter how
4:32
big you want to grow. They have a
4:35
huge list of integrations and third-party apps that
4:37
are going to help you revolutionize your business.
4:39
You want to integrate on-demand printing or accounting
4:41
or chatbots? Shopify's got it covered. Sign
4:43
up for a $1 per month trial
4:46
period at shopify.com/darknet and that's
4:48
all lowercase. Go to shopify.com/darknet
4:50
now to grow your business
4:53
no matter what stage you're
4:55
in. shopify.com/darknet.
5:05
I was clicking around the other day and
5:07
came across this story on Good Morning America.
5:10
Shreya Datta thought she'd met the man of
5:12
her dreams on a dating app only to
5:14
find out her Prince Charming was a scam
5:16
and she was out more than $450,000. What
5:19
the hell? How in the
5:21
world does some guy on a
5:24
dating app scam someone out of $450,000?
5:26
That's insane. This person presented themselves
5:28
to be everything I was looking
5:30
for.
5:33
was the victim of a scam
5:35
known as pig butchering. A scammer
5:38
pretends to be looking for love
5:40
online. They find a love interest
5:42
casually encourage them to invest in
5:44
crypto via a fake app but
5:46
eventually they can't access the money
5:48
at all. The money is gone.
5:50
The investment's not real. Dang,
5:54
the things we do for love, huh? Or
5:57
maybe it was for money, or
5:59
maybe it was for the love of money. I don't
6:02
even know. Yeah, so hearing that
6:04
story, I've heard it a thousand times over.
6:06
Okay, hold on, who are you and what
6:08
do you do? Oh yeah, yeah, so my
6:10
name is Ronnie Tokazowski. I've been fighting business
6:12
email compromise for the last eight years now.
6:15
So my role in this is I work behind
6:17
the scenes with a lot of people who are working
6:19
with the romance scam victims. I do
6:21
a lot of work with Secret Service, FBI.
6:23
I also work back and forth with victims
6:25
too, because a lot of what happens is
6:28
the scammers will go to different dating websites,
6:30
they will go and find people
6:32
in order to date, they will move the discussions
6:34
off of the platform, just because most of the
6:36
platforms cost, but they'll move them to like WhatsApp
6:38
and then from there they'll start grooming the person.
6:41
They'll say loving things. We've had cases
6:43
where some of the victims might send new
6:45
pictures over to their lover. And
6:47
once they go and are exchanging those
6:49
sweet nothings, the scammers directly build that
6:51
relationship, build those emotions. So
6:53
I heard this term pig butchering and
6:56
I just, I'm not connecting the dots
6:58
here. And nowhere in this romance or
7:00
crypto or cold-sending
7:02
money to people is there a pig involved. Where
7:05
is this term pig butchering coming from? Yeah,
7:07
so the term pig butchering comes from
7:10
a Chinese phrase called sha zhu pan, which
7:12
essentially means, I think it's "butchered pig,"
7:14
if I remember right, I forget the exact translation.
7:17
But what the concept is, is the scammers
7:19
will go and try and fatten the pig
7:21
if you will. So what they will do
7:23
is extract as
7:25
much money as they can out of
7:28
a victim. And where the
7:30
pig butchering comes in is that
7:32
once the scammers get to a point where they feel
7:34
like they can't get any more money out of
7:37
the victim, they will take the pig in
7:39
for slaughter or slaughter the pig. And
7:41
what they mean by that is actually pulling the rug
7:43
out from under the victims and like walking away and
7:46
essentially be like, I got all the money that we could. So
7:49
that's kind of where the phrase pig butchering comes from.
7:51
Okay, so for some reason, Ronnie is
7:53
attracted to this type of scam or
7:55
fraud or whatever you wanna call it
7:58
and zooms in whenever he sees these stories
8:00
come up. And one day he
8:02
heard about a colleague who got pig butchered
8:06
and wanted to help him out. Him and his
8:08
girlfriend, they were dating for several years. Like they've
8:10
been together for as long as I've known him. It's
8:13
probably, probably about eight years now
8:15
that they've been together. So
8:18
they were engaged to be married, they
8:20
had a house together. And unfortunately, things
8:22
happened and that relationship kind of flopped.
8:24
So they went their separate ways. He
8:27
lost the house. And unfortunately,
8:29
it wasn't really a good circumstance. Breakups
8:32
are hard. It's a tough time for anyone.
8:34
You can sink into deep levels of depression.
8:37
Your defenses are weak,
8:40
and your vulnerabilities are exposed. So
8:43
he went to go online and go date
8:45
somebody. So he went onto a dating platform
8:47
found this really pretty French girl who was
8:49
very involved with him and very kind of
8:52
attached to him. So
8:52
the two of them really hit it off. And at
8:56
some point, she popped the question to say, Hey,
8:58
I'm also doing a lot of crypto investments. Is
9:01
that something that you'd be interested in? Okay,
9:04
I don't see any red flags yet. And
9:06
he didn't either. At this point, they were
9:08
just chatting through texts, like a lot. She
9:10
seemed to be into everything he was interested
9:13
in. And he was liking that. He
9:15
was coming out of his breakup and she seemed to
9:17
be caring and helpful. Yeah, okay.
9:19
So she's into crypto investments. That's fine.
9:22
She could be into that. But
9:24
he was curious. Was it
9:26
really working for her? He
9:29
had some crypto somewhere. He
9:31
was like, tell me more about what you're
9:34
invested in. So she tells him,
9:36
man, there's this hot investment. It's making
9:38
mad bank. And he's like, Yeah,
9:40
okay. Well, what is it? Show me. So
9:43
she keeps talking it up: I'm basically just
9:45
living off the profit from this thing. It's nuts.
9:48
And he's like, you got to show
9:50
me what you're talking about. So she's like,
9:52
okay, so you know how your savings account
9:54
makes interest, right? This is like that, but
9:56
it just pays much more. You put your
9:58
money in and then daily. It makes
10:00
interest and you could just take that interest out
10:02
if you want or leave it in and it
10:04
adds up and you make even more. So
10:06
he's like, well, how much interest are you
10:08
earning? And she's like 20%. If
10:12
you have $1,000 invested, it'll earn you $200 in interest a day. And
10:17
at any time you could just take your $1,000 out if you want. And
10:20
he's like, man, that does sound too
10:22
good to pass up. So she
10:24
gives him the links to read up on. Being
10:27
in the field, he knew a good bit about crypto. He's
10:29
naturally a very skeptical person. So
10:33
he did his research on a lot of the ways
10:35
that they presented the money. So
10:37
he went, they provided links and information
10:39
for him to check once he went
10:41
and submitted his money. This
10:44
scheme was very, very clever. I
10:46
mean, this guy was a cybersecurity
10:48
professional. He knew about the dangers
10:50
of cryptocurrency and was suspicious
10:52
about all this. But
10:54
this had a mix of legitimate
10:56
information with just a small dash
10:58
of fraud. See, the
11:00
way they had this set up was they
11:02
made it look like it was using a
11:04
legitimate exchange, in this case, crypto.com. And
11:07
the way that the application was presented to him
11:09
was, and this is his perspective, I'm still trying
11:11
to get the full scope here, but
11:13
there was actually a browser that they
11:15
could use within crypto.com that will
11:17
have it show up that actually looks like
11:19
the application. And looking at some
11:21
of the screenshots, it looks like it was right
11:24
within the crypto.com application. And because of that, when
11:27
your user goes and clicks that stuff, it appears to
11:29
be 100% legitimate. I
11:31
looked at some of these screenshots myself. It's hard to
11:33
tell what's going on, but one thing is clear. They
11:36
social engineered him and tricked him into
11:38
sending his crypto to the scammers' wallet.
11:40
They just disguised the wallets to look
11:43
trustworthy. Basically, he would
11:45
buy cryptocurrencies on crypto.com with real
11:47
money and then send those crypto
11:49
coins to this investment project, investment
11:51
in quotes there. Really
11:53
it was a scam and it looked
11:55
really good. It didn't look like a scam at
11:58
all. You could see your balance and earnings,
12:00
you could interact with it, you could pull your money
12:02
out at any moment. So he decided
12:04
to give it a try. He
12:06
put some money in, sent the crypto,
12:09
and when he saw it was generating interest,
12:11
he tested it by taking some out and
12:13
was like, wow, this is actually working because
12:15
it looked like it was. But
12:18
this is where the pig butchering scam comes in.
12:20
The scammers wanted him to take the bait,
12:22
start with putting in a little, see that
12:24
it's working and then hopefully put in some
12:26
more and more and more and hope that
12:28
he dumps a ton of money into this.
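As a quick aside, the 20%-a-day "interest" pitched earlier is itself the tell. A few lines of arithmetic (a sketch using the $1,000 figure from the story) show why no real investment pays this:

```python
# The pitch from the story: $1,000 invested earns 20% interest per day,
# and you can leave the interest in so it compounds. Run the numbers:

principal = 1_000.0
daily_rate = 0.20  # the rate the scammer quoted

balance = principal
for day in range(30):
    balance *= 1 + daily_rate  # interest left in, compounding daily

print(f"After 30 days: ${balance:,.2f}")            # about $237,376
print(f"After a year: {principal * 1.2**365:.2e}")  # an absurd ~8e31
```

A legitimate savings account pays a few percent per year. Double-digit daily returns are mathematically impossible to sustain, which is exactly why the number only ever exists inside the fake app.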
12:31
And when they think he's put in enough, they'll
12:33
take the money and run. So
12:36
as he starts watching the money grow on this
12:38
site, the scammers start ramping
12:40
up the pressure. They tell him
12:42
if he invests a little bit more within this
12:44
timeframe, he'll get locked in for bonus interest, basically
12:47
presenting him with more exciting
12:49
opportunities that were time sensitive. In
12:52
addition to putting his own money in there
12:54
because of the high returns that
12:56
were being shown, he also went and
12:59
got a loan. So he
13:01
actually used a loan to go and put
13:03
more money into it, because again, if you
13:05
can use that loan to go and get
13:07
more money, who wouldn't do that? So
13:09
that's another common thing we see with a lot of people is they'll
13:11
go take loans out from a financial
13:14
institution. They'll take a second mortgage
13:16
out on their homes in order
13:18
to go and get
13:20
more money based on those investments. Oh
13:23
wow, taking loans out. Now
13:25
I see why someone can end up losing a ton
13:27
of money in this scam. But not
13:30
only that, these scammers were really tricky.
13:32
They would sometimes tell him, look, we
13:34
locked your account because there's not enough
13:36
funds to cover withdrawals. Please deposit
13:38
another $40,000 in the next 96 hours to unlock your account.
13:43
And he's like, well, wait a minute. What if I
13:45
don't deposit that? Then you risk losing
13:47
your money. So he's like, oh no, I don't
13:49
want that. And so he goes scrambling,
13:52
looking for even more money to put into this. So
13:54
this guy eventually goes all
13:57
in and then some, putting in all
13:59
his savings and taking a loan out
14:01
to add more, because to
14:03
him, this was a way to get
14:06
out of debt, a path to financial freedom,
14:08
and it was very exciting. From there, the
14:11
scammers were able to successfully collect about $90,000
14:13
out of him. Oh, how cruel. And yeah,
14:15
this $90,000 was a nice fat pig. And the
14:22
scammers were like, okay, that's right.
14:24
Let's take it. And they did.
14:26
They took his money leaving him
14:28
high and dry. Ouch. He
14:31
saw his money disappear and he knew
14:33
he was screwed. But
14:37
he sat and thought about
14:39
it for a bit. Is
14:43
there a way to get any of this money
14:45
back from the scammers?
14:47
What he did was he
14:49
used the exact same emotional
14:52
manipulation tactics against the scammers.
14:54
And what he did was he was like, hey, I'm going to
14:56
go ahead and invest more, but I need to
14:59
pull this little bit of money out in order to help
15:01
with this loan. So if you can let me pull some
15:03
of my money out or wire it over here, I'll go
15:05
ahead and do that. So he was able to get $10,000
15:07
of his money back by, again, deploying the
15:11
same tactics against the scammers. And he was able
15:13
to build up enough trust with them to where
15:15
he was able to get that money back. He
15:18
scammed them back. Hilarious.
15:21
Man, that reminds me of this story I have. Okay.
15:23
So this one time I was in Vegas,
15:25
right? Yeah, I was actually going there for
15:28
a Def Con. And when I went, I
15:30
brought a burner phone with me, right? It's
15:32
just a phone that I paid with cash.
15:34
You got a prepaid plan, all that stuff.
15:36
It was a new phone number. And when
15:38
I got to Vegas, I was getting text
15:40
messages from a scammer. I sniffed it out
15:42
right away. They were trying to play on
15:44
my empathy, saying things like, we
15:46
can't afford money to buy food for our kids
15:48
and medicine and clothes and something. And they specifically
15:51
asked for $749 to get themselves sorted. And I'd
15:55
be an absolute angel if I could help.
15:57
And
16:00
I was like, hmm. I
16:02
said, look,
16:06
I'd love to help, but I'm currently stranded.
16:09
My boyfriend and I got in a fight and he
16:11
dumped me off in the middle of nowhere. And I
16:13
don't know anyone here who can help me. I don't
16:15
have any money to get home. I am screwed. I
16:18
was trying to use the scammers' tactics on
16:20
themselves, trying to be someone in distress, just
16:22
like they were saying. It
16:25
did not work. They kept asking me for
16:27
money. And I was like, okay, listen, I'm
16:29
happy to help you. I have money to
16:31
help you. But my boyfriend took my purse
16:34
and all I have is my phone and
16:36
there's strangers all around me. So
16:38
unless you can help me get home, like, I don't know,
16:40
send me $200. Then
16:42
once I get home, then I can help you. They
16:48
stopped texting after that and
16:50
just left me alone. So when you run
16:52
into someone who's been a victim of this, how do you help
16:55
them? So the way I help
16:57
them is I help them a couple of
16:59
ways. So the first place is that when
17:01
it comes to understanding
17:03
the emotions in our body tied back to a
17:05
lot of the way the scam works, people
17:08
feel a lot of shame. They feel a
17:10
lot of hurt. They feel a lot of
17:12
disconnect because of the stigmas associated with it.
17:14
What I mean by that is when
17:17
you're a victim like this, people don't want to
17:19
come forward on this. So I try and help
17:21
them learn how to work with their own bodies
17:24
in that regard. So that's one way that I
17:26
help them. The second way
17:28
is I point them to the resources where they
17:30
can go and submit a lot of
17:32
requests. So they may be working with IC3 and
17:34
may be working with colleagues who also
17:36
work with romance scams, or
17:38
it may be helping introduce them over to some
17:40
of the crypto assets where they can start pulling
17:43
some of that money back. The
17:45
third thing I do is, again, just trying to
17:47
help put them in contact with the right people
17:49
because what happens is when
17:51
you're in this scam, your head's spinning
17:54
a thousand miles an hour. You
17:56
don't know which way is up. Don't know which way is down.
17:58
You don't know who you trust. And many
18:00
of us work behind the scenes to try and help
18:02
be that good driving force for many
18:05
of these victims. And when we go and
18:07
we try and help them out, that's kind of where we do our assistance.
18:09
In addition to that, we've also been running a mailing
18:13
list for the last seven years, talking
18:15
on many things as a result of business
18:17
email compromise and kind of overlapping things with
18:20
that. And we have close contacts
18:22
with a lot of the banks and financial
18:24
institutions to help either try and reverse some
18:26
of that money or do what we
18:28
can to get some of that money back or
18:30
try and flag those assets where we know, hey,
18:32
these are actually part of a scam. $90,000,
18:35
that's a lot of money to lose. Is
18:38
that kind of the upper limit of where you've
18:40
seen people losing stuff or people losing more? I
18:43
really wish I could say that that was the upper
18:45
limits, but I have seen so much more. I'm
18:48
working with one victim now, I've been working with him for the
18:50
last two weeks, where he was
18:53
suicidal and didn't know which way to turn. Jeez,
18:56
you really take some heavy phone calls. So
18:59
how did this guy lose his money? So very
19:01
much the same way as the first
19:04
person. He found the relationship
19:06
and as the relationship built, they're like,
19:08
hey, I have this great investment opportunity.
19:11
They strung him along as far as they could. And
19:14
once he went and put some of the money in, he
19:16
saw his returns. It was the same story. This
19:19
individual actually was ready
19:21
to retire. He had
19:23
several homes as well. So because of that,
19:25
he ended up opening and doing a second
19:27
mortgage on a couple of his homes in order
19:29
to pull some money out. So
19:31
because of that, and because of what he was able to pull
19:33
out on those homes, he may now be facing
19:36
losing those homes as well. And
19:38
as it stands right now, he has lost over $1.7
19:40
million. Dang. I
19:46
mean, I've heard of people losing their life savings,
19:48
but for some reason this feels worse than that.
19:51
I guess it's one thing to lose all your stuff when you're
19:53
young, but it's different when
19:55
you've worked your entire life to
19:58
save up for retirement, and lose all of
20:00
that. Your retirement is
20:03
now gone. Poof. You were
20:05
financially stable and now super in
20:07
debt and your whole future is
20:10
screwed. It's
20:12
awful. I was
20:14
at RSA last year, this year as a
20:16
matter of fact, and got to speaking with somebody.
20:19
It was a grandfather
20:21
who had committed suicide. And
20:23
they didn't know why. And they ended up going to
20:25
look through his records. And it was over $5 million
20:28
that he had lost. People
20:31
are actually killing themselves over
20:33
pig butchering scams. This
20:35
is nuts. Whoever is
20:38
behind this is just ruthless. I wish
20:40
that was an isolated case. But I've
20:42
also had another victim out
20:45
at DEF CON a couple years ago.
20:47
And for her, she ended up losing
20:49
her house, losing custody of her kids, her
20:52
relationship with her ex,
20:54
her husband, and her business.
20:57
And she was in it for over
20:59
two million dollars. And when
21:01
I asked her what kept her in, she said her husband was
21:03
abusive and she just wanted to feel loved. And like
21:05
that's the reality of many of these crimes is
21:07
that people don't realize that you have
21:09
two factors at play here, you have
21:12
the financial losses, and then you have
21:14
the emotional hurt that goes along with it. And
21:16
somebody may lose $90,000 that
21:18
may mean nothing to them. Or
21:21
you may have somebody who loses $8,000. That's
21:23
the entire world to them. So it really,
21:26
right now, we're not accounting for the emotional losses
21:28
on this or the emotional damages for many of
21:30
the victims. So,
21:33
so in these
21:35
first few stories we've heard,
21:39
it keeps getting back to
21:41
romance, right? Do you
21:43
see like kind of a pattern
21:46
of who the victims typically are? Are they
21:48
usually people who are looking for love? Or
21:50
what are some other, you
21:53
know, like if we're gonna watch our own
21:55
back, like we've got to
21:57
know when we're in a vulnerable state. And what
21:59
makes a person more vulnerable to this sort of stuff.
22:02
Yeah. So first and foremost, one of the
22:04
constant patterns I've seen, and this is something
22:06
I've seen with many victims, I've kind of
22:08
discussed over the course of this topic: many
22:11
of them tend to be extremely trusting, where
22:13
if you were to be walking on the
22:15
side of the street, this is the
22:18
type of person who would go and help a homeless person
22:20
in need. If a dog was hurt on the side of
22:22
the road, they would go and help them out. And
22:25
they're some of the most kind of souls you'll
22:27
ever meet. And because of
22:29
that trust, the scammers have
22:31
figured out that they can go and manipulate
22:33
and abuse that person and get them to
22:35
do things that they want. A lot
22:38
of what happens is from that
22:40
control perspective, they will actually quote
22:42
unquote, I'm going to use a term that one of the victims
22:44
used, is that they'll
22:46
essentially hijack their own consciousness and
22:49
give them a different perspective of reality and
22:51
a different perception of reality. And
22:54
what happens is, is the victims will be
22:56
manipulated to a point where they will be
22:58
pulled away from friends, they'll be pulled away
23:00
from family and only put all their trust
23:02
in this one person. And because
23:05
of that, and because of the kind words that
23:07
they're saying, the victims will want to go
23:09
and be with that person. In
23:11
addition to that, you've also got a case
23:13
where they will say the right words in the right way to make
23:16
the victims want to stay in it even longer.
23:18
So like I said, it's a matter of working
23:21
with the emotions to kind of manipulate the people in
23:23
that way too.
23:26
Another piece I'll also note is
23:28
that when it comes to how we as
23:30
humans process our emotions, so many of us
23:32
are just disconnected and we don't even know
23:34
how our emotions work. It's like, we might
23:36
feel this one way about this one thing,
23:38
we might feel this one way about another,
23:40
but we don't realize that we
23:42
actually pick up emotions from other people. And
23:45
because of that, it's something where we don't
23:47
understand how those mechanics work in our own
23:49
bodies, let alone how we are emotionally manipulated
23:51
to go and do this thing or influence
23:53
to go and do that thing. Yeah.
23:56
So what are some of the
23:58
skill sets that these scammers or
23:58
thieves have? Because
24:02
it sounds like they understand psychology
24:04
a bit. So that would give
24:06
them social engineering skills, right?
24:08
To trick people posing as someone
24:10
on the dating app, whatever. But
24:13
also being able to set up these websites
24:16
and understanding crypto and putting
24:18
malware on systems or whatever the case is. What
24:20
do you see as their skill sets in these
24:22
cases at least? Yeah, so
24:24
I'll kind of talk on the geography
24:27
of where some of these skill sets
24:29
are. So for the pig butchering angle,
24:31
which is mostly out of
24:33
like Southeast Asia, we see
24:36
scammers who are skilled in setting up
24:38
websites. They're skilled at working with cryptocurrencies.
24:40
They understand that they need
24:42
to influence a person's emotions
24:44
and play on the emotions.
24:47
We have some tutorials and documents from the scammers where
24:49
it's like a 30-page PowerPoint
24:53
in Chinese that essentially comes out to,
24:55
here's where you go and tell them this
24:57
piece. Here's where you influence their emotions and
25:00
do this. So they understand
25:02
that emotional manipulation piece there. For
25:05
some of your romance scammers in Nigeria, they're
25:08
a whole different basket. For them,
25:11
they're sophisticated in money laundering. They
25:13
know how check systems work. They know
25:16
how to wire money from a
25:18
United States bank out to another bank. And
25:20
they also understand the underlying
25:22
cryptocurrency networks to go and
25:24
cash out a gift card or move
25:26
money over here for Bitcoin.
25:29
So it's something where depending on the geography of
25:31
where the scammers are coming from, it really
25:33
depends on what that skill set is.
25:36
And that's just two of the top countries
25:38
that we see, but there's probably four
25:41
more that I could list off that we
25:43
see elements of social engineering scams coming
25:46
out of. That, again, goes back to that
25:48
human emotion and kind of those human pieces.
25:51
The thing that strikes me,
25:53
I think
25:56
it should strike us all with a bit of fear,
25:59
is this: you see the
26:02
cybersecurity news every day. It's ransomware
26:04
hitting this company and
26:07
this other company got hacked and all that. This
26:09
is us getting hacked. This is you and me.
26:11
This is each one of our neighbors. This is
26:13
individuals of the world, the citizens of the United
26:15
States or wherever they are. And
26:18
that is just such a close to home thing. It's
26:20
not far away in some other
26:22
company that I don't have to deal with.
26:24
It's me and my personal assets are being
26:27
attacked. And
26:29
I don't know, like when you
26:31
realize that the threat actor is
26:33
right here in my bedroom on
26:35
my computer, it gives
26:37
us a different sense of safety.
26:40
Yeah, and the other thing too, because of
26:42
that safety, we will go and place
26:45
so much trust in
26:47
the social media providers, to be like, okay, this social
26:49
media provider has a really big name. So that means
26:51
they have to be safe and I can trust anything
26:53
that's coming from there. So because of
26:55
how large many of these providers are, there's
26:57
inherent trust of using these platforms.
27:00
And so many victims will go and
27:02
be like, okay, I'm gonna go and trust Facebook for
27:04
seeing this stuff. Yet there was an article that
27:06
came out a couple weeks ago that said, no, eight out of 10
27:09
cybercrimes, or eight out of 10
27:11
cases of cyber fraud, originate on Facebook.
27:14
So when you see numbers like that,
27:17
it's something where the scammers are going
27:19
to use those trusted platforms to try
27:22
and go after people on that. But no, I agree with
27:24
you 100% is that it definitely adds
27:26
a different level of fear to
27:29
how the scam actually works. Because yeah, it's like that
27:31
scammer's now in your bedroom with you. And
27:33
they're now stuck in your head as
27:35
you're ruminating over all of the ways
27:38
where they're like, okay, does this person love me? Or are
27:40
they trying to build this relationship? What else is going on?
27:42
And the victims run through their head over and over again.
27:45
With these victims, you've talked to like, the $90,000
27:47
one, the $1.7 million one. Are
27:50
they actually, like, how
27:55
close are they to these people, right? Are they
27:57
having video calls with them? Are they having phone
27:59
calls? Are they texting? Yeah.
28:02
So many of them will be texting back and
28:04
forth or using WhatsApp to communicate. Like
28:06
I said, we know that that's how some of them
28:08
are and many of them are receiving like multiple
28:11
messages per day. The
28:13
one colleague who was in for $90,000, I'm
28:17
pretty sure they would have been sending pictures back and forth. Just
28:20
because again, you're
28:22
not thinking of it in the case of, okay, this is
28:24
a victim. You're now trying to think of it as somebody
28:26
who believes they're in a relationship. So you're
28:28
going to go and do everything that you would do if
28:30
you believed you were in a relationship. Like
28:32
I had one victim who was sending pictures
28:35
of his food to his girlfriend. And
28:37
the scammers do all kinds of weird things. Like
28:39
they'll send photos of two different outfits and ask,
28:41
which outfit should I wear today? And then when
28:44
the victim picks one, it gives them just that
28:46
little bit more of information to know about them.
28:48
Like they like formal clothes more than casual clothes.
28:50
So let's send them more photos of that. Keep
28:52
them on the hook. And just
28:54
think about how much you share about yourself
28:57
on a personal level. When you have a
28:59
new love interest, a scammer could
29:02
easily write all that down and figure out
29:04
your vulnerabilities and play on that if
29:06
they're really good. But I
29:08
still think one way to sniff out
29:11
these scammers is just to pick up
29:13
the phone and call them. I'm betting that a lot
29:15
of these scammers are guys posing as women, you know,
29:18
how do they sound on the phone? Even if
29:20
they grabbed someone else to just pose as them and get on
29:22
the phone, that person isn't going to
29:24
know your whole chat history and won't be able
29:26
to carry on a conversation in any way that
29:28
makes sense. Or even more, let's
29:30
do a video call and see what you really
29:32
look like. And so just keep
29:34
that in your head that it's probably a
29:37
red flag. If your love interest refuses to
29:39
answer the call or get on video
29:41
chat with you. Yep. So,
29:43
sometimes that is a red flag.
29:45
However, some scammers have figured ways around
29:47
that. I know in the concept
29:49
of like deep fakes and AI, and I know it's
29:51
a whole buzzword right now, but some
29:54
scammers are using that technology in
29:56
order to generate video messages
29:58
back and forth. The other thing
30:00
too, some of them will also use online
30:03
video without audio and they'll just
30:05
be kind of like moving in the camera, like, oh, my
30:07
microphone is not working. Or they'll
30:09
go and have a
30:11
phone call with them and they won't share
30:13
video and just say, hey, for this part here,
30:15
my video isn't working. So
30:18
they know that that's a metric that people use,
30:20
but they will go and try
30:22
and find different ways to
30:24
bypass that. Oh, yeah. Dang, I didn't
30:26
even think of that. So I've
30:28
done video interviews with people a lot,
30:30
you know, but I use a Snapchat
30:33
filter on my video to obscure my
30:35
face. In real time,
30:37
on a live video call, my
30:39
face gets distorted. And
30:41
yeah, you could absolutely just use a filter
30:43
to change your face to be a pretty
30:45
lady, even though you're just some dude
30:47
who doesn't even speak English. We're
30:49
going to take a quick ad break here, but stay with us
30:51
because when we come back, we're going to
30:54
talk about black acts. And you're not
30:56
going to want to miss this. Support
31:00
for this show comes from Drata. Far
31:03
too often, security teams are tasked with
31:05
manual evidence collection to become compliant, dedicating
31:07
hundreds of hours that could otherwise be
31:09
automated. And I'd like to think
31:11
that you'd rather use your time for implementing the
31:13
privacy and security program. As
31:15
one of G2's highest rated cloud
31:17
compliance software, Drata streamlines your SOC
31:19
2, ISO 27001,
31:22
PCI DSS, HIPAA, GDPR, and many
31:24
other framework compliances and provides 24
31:26
hour continuous control monitoring
31:29
so you can focus on scaling
31:31
securely. That's like having your cake
31:33
and securing it too. Countless
31:35
security professionals from companies including Notion, Lemonade,
31:37
and Bamboo HR have shared how crucial
31:39
it has been to have Drata as
31:41
a trusted partner in their compliance process.
31:44
Listeners of Dark Net Diaries can get
31:46
10% off Drata and
31:49
waived implementation fees at
31:51
drata.com/darknetdiaries. That's
31:53
spelled drata.com/darknetdiaries, which
31:56
is all one word.
32:02
Okay, so I'm looking you up online.
32:04
You're known as that BEC guy. What's
32:07
BEC? BEC is
32:09
a business email compromise. Okay, so let's stop
32:11
there. Okay, sounds good. Sounds
32:14
good. BEC, we break down the term business
32:16
email compromise, right? So let's, the compromise part
32:18
makes me think somebody has taken over my
32:21
Office 365, you know,
32:24
email server and is in
32:26
my emails. They've compromised my emails. But
32:29
that's not what you say is BEC. No,
32:32
so if you go and look up the
32:34
history of BEC, business email
32:36
compromise has been the number one crime seven years in
32:39
a row, minus last year. But
32:41
the way most people know it as is
32:44
if you receive an email that
32:46
says, Hi, I'm the CEO of your company. Hey,
32:49
I need you to do this urgent wire transfer for me. Can
32:51
you wire $40,000 out to this account? And
32:55
that's what most people think of as
32:57
business email compromise. The
32:59
problem is that... Well, when you tell me that story, I
33:01
just think that's a phishing. I don't
33:03
call phishing BEC. I just call
33:05
it phishing. Right. And
33:08
phishing is kind of the overarching
33:10
term for any email based threat
33:12
like that. Is BEC always money
33:14
related or is it sometimes, no, we're
33:17
just going to phish them so that
33:19
we can get our malware on to
33:21
steal their intellectual property? Yeah, yeah. Business
33:23
email compromise, in most of the cases,
33:25
it does not use malware. It does
33:27
not employ any of those tactics around
33:29
trying to install software on the computer.
33:32
At most, they will do credential phishing where
33:34
they'll try and harvest the email credentials and
33:36
email passwords. But for a vast
33:38
majority of business email compromise, there is no
33:40
malware tied to that. There's
33:42
only been a handful of cases that have
33:44
been publicly documented specific to BEC
33:47
actors using malware or something like
33:49
that. And just for
33:51
the most part, there is just no malware that's tied
33:53
back to those types of crime. So
33:56
if we're going to classify something, let's say
33:58
we get phished. Somebody sends
34:00
us a phish, we click the link, we install
34:02
malware, you'd say, oh yeah, that wasn't BEC. But
34:06
if it was, okay, we got
34:08
phished, it was send money to this,
34:11
and I sent the money, then you'd say, oh yeah, that was BEC.
34:14
Yep. Okay. So
34:16
it typically, if you're going to
34:18
classify as BEC, it's likely going to be
34:20
financial related. Yeah. Yeah. So now
34:23
this pivots the whole thing in my head,
34:25
right? Instead of you and me being targeted.
34:27
Now they're like, well, why target somebody who
34:29
has thousands of dollars when we can target
34:32
a business who has hundreds of millions of
34:34
dollars. Yep. And that, and that is exactly
34:36
what it is. So when you, so we
34:38
did a study, what we found was that
34:41
when you go and think of your Nigerian prince
34:43
scams, your 419 scams, your,
34:45
you have this long lost relative in
34:47
Nigeria, you go send me this money.
34:50
Um, what we found was that
34:52
business email compromise was not some
34:54
new crime. It was
34:57
a symptom of ignoring your
34:59
quote unquote easy 419 scams. And
35:02
we've had direct confirmation that
35:05
the scammers behind business email compromise are
35:07
the same people who have been doing
35:09
these Nigerian prince scams for years. By
35:11
the way, 419 scams are those
35:13
Nigerian prince scams. You know, the ones where they
35:15
send you an email saying, if you pay us
35:17
some money, we'll release the inheritance that we owe you.
35:20
And the reason why it's called 419 scams
35:22
is because specifically in Nigerian law,
35:24
section 419 makes it illegal to
35:26
do this. We've all laughed
35:28
at these scams in the past, but they're
35:31
getting more sophisticated now they're evolving. So
35:33
very much with what you said, they realize, Oh
35:35
wait, no, I can go and get $40,000 out
35:37
of this company, as opposed to
35:40
going to hit this one victim over here. And
35:43
that's where we see the overlap between the
35:45
romance scams is that when
35:48
the is when they go
35:50
and send that phishing email to that company,
35:52
they will use those romance scam
35:55
victims as the money muling networks
35:57
to send money for these scams. So
35:59
the victims will be the ones who'll
36:01
be receiving the money who then wire it from
36:03
the United States elsewhere in order to launder it
36:05
up the chain. I mean, what I was... That's
36:09
amazing, but what I am surprised
36:11
of is just like hearing the evolution
36:14
of it. It sounds like
36:16
they've really honed their skills over
36:18
time. They have, they have. Yeah.
36:21
And it's a combination of honing their skill,
36:23
yet still keeping the stigma that these things
36:25
are simple and unsophisticated. And
36:27
that's the thing is that quote unquote simple
36:29
and unsophisticated crime minus... Again, minus
36:32
last year, it was number one
36:34
crime seven years in a row based on
36:36
financial losses. What's the number one crime? Business
36:39
email compromise. So from 2015 to
36:41
2021, it was the number one cybercrime based on losses year after
36:43
year. And
36:50
the only reason it was not number one for 2022 was
36:52
because we had this crime called
36:55
pig butchering that came up. So the
36:57
way it was ranked was pig butchering was number
37:00
one, business email compromise was number two. Wow.
37:03
So this is the number one crime. I
37:06
guess I'm just so surprised that it's those
37:08
awful Nigerian scammers who are doing this. And
37:11
when I say awful, I mean the least
37:13
sophisticated phishing emails I've ever seen. You know
37:16
the ones. Sir, you had a long lost
37:18
relative who was the Prince of Nigeria and
37:20
he has recently died and left a large
37:22
inheritance for you. Send us
37:25
$500 so we can process this and we'll
37:27
get the money over to you. Like
37:29
who in their right mind thinks their long lost relative
37:34
is the Prince of Nigeria and you never knew it?
37:34
It's just the absolute dumbest attempt at
37:36
a phishing scam that everyone laughs
37:38
at. And it's those guys who
37:40
are number one. This is
37:42
the biggest criminal financial loss for
37:45
companies today. Now
37:47
getting a business to pay a fake invoice can
37:49
take a lot of prep. You got to
37:52
figure out who this company normally pays large bills
37:54
to and then try to pose as them.
37:56
And one way to pose as them is to register
37:58
a domain that's one letter off from the real
38:01
one. So at first glance, it looks
38:03
like it's from that person you normally do business
38:05
with. But it's not. Or
38:08
sometimes you can pose as like the CTO
38:10
sending a bill to the CEO of the
38:12
same company. But still to know who
38:14
the CTO and CEO are, you got to know who
38:17
the people are that work with this company and what
38:19
their emails look like and what their invoices look like,
38:21
so that it can be as close to the original
38:23
as possible for this to work. And
38:25
that takes a lot of work. We've seen
38:28
cases where they will go and
38:31
find and use different lead generation services
38:33
in order to identify the key controllers
38:36
and the key stakeholders within the company.
38:38
And when they do that, that's where
38:40
they get that information on who's
38:43
the person within the company that they can
38:45
go ahead and target. And based
38:47
on some intelligence that we've seen, we
38:49
know they'll target the controllers of companies,
38:51
they will target different financial
38:55
advisors. So they will go and do
38:57
that recon in order to identify who
38:59
they can target within the company. Oh, it's
39:01
not always bill paying. Sometimes they try to scam
39:03
these companies to send them gift cards. The
39:06
scammers will pose as like some manager in
39:08
the company and they'll ask someone higher up,
39:11
Hey, the company did such a great year.
39:13
I'd like to give my employees gift cards
39:15
as rewards. And the person's like, it's
39:17
a great idea. Then the scammers like,
39:20
okay, well, since everyone's remote, could you just purchase
39:22
the gift cards and then send me a photo
39:24
of the back of the cards and I'll just
39:26
pass those gift cards out to the employees. And
39:29
that's how these companies end up sending
39:31
gift cards to Nigerian scammers.
39:33
It's crazy. Mm hmm. And we
39:36
actually did a study where we gave
39:38
gift cards to the scammers and
39:40
tracked where they clicked
39:42
from. Crazy, crazy insights that
39:44
we were able to gain from that. But
39:46
it was such a different perspective
39:48
from what we thought we were going to get.
39:51
But like I say, it was really fascinating, some of the
39:53
data that came back from that. Now,
39:55
email providers or system admins need to
39:57
work to protect users from all this.
40:00
You can't just present every email that comes into
40:02
the user. That used to be the case in
40:04
the old days when we didn't filter any emails
40:06
at all. But think about this.
40:08
Suppose you do get an email, but it's
40:11
one letter off. They switch the lowercase L
40:13
for the capital I, and it
40:15
looks the exact same to the human eye to
40:18
make you think this email is from someone
40:20
you normally get email from, but that one
40:22
letter off means it's not. So
40:24
if a human can't detect it, we
40:27
better have machines that are detecting it.
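One way a machine can catch what the eye can't is fuzzy string comparison on sender domains. A minimal sketch in Python; the function names and the one-edit threshold here are illustrative, not any real provider's implementation:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits to turn a into b."""
    # Classic dynamic-programming edit distance, computed one row at a time.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(
                prev[j] + 1,               # delete ca
                cur[j - 1] + 1,            # insert cb
                prev[j - 1] + (ca != cb),  # substitute ca -> cb
            ))
        prev = cur
    return prev[-1]


def is_lookalike(sender_domain: str, trusted_domains: list[str]) -> bool:
    """Flag domains that are close to, but not exactly, a trusted one."""
    if sender_domain in trusted_domains:
        return False  # exact match: a sender you normally see, nothing to flag
    return any(levenshtein(sender_domain, d) <= 1 for d in trusted_domains)


# "paypaI.com" (capital I swapped for lowercase l) is distance 1 from "paypal.com"
print(is_lookalike("paypaI.com", ["paypal.com"]))  # True
```

In a real filter, the trusted list would be built from the baseline of domains the user already exchanges mail with, as described next.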
40:29
There's a thing called the Levenshtein distance, which
40:31
is an algorithm that will compare two words
40:34
to tell you how different they are. And
40:36
I sure hope that email providers today
40:39
are using this to first develop a
40:41
baseline of who you're normally getting email
40:43
from, and then look for emails coming
40:45
in with a very similar domain. If
40:48
the Levenshtein distance is very low, meaning
40:50
it's only one letter off from someone
40:53
you normally see email from, then that
40:55
should be flagged, maybe rejected or
40:57
quarantined, and let the user know. Another
40:59
area to look at for
41:02
a lot of domains is how long
41:04
has the domain been registered? If
41:06
it's been registered within like the last month,
41:09
more than likely it's going to be a phishing email. So
41:12
looking for the reputation, the age of
41:14
domain is a very, very successful
41:16
way to do stuff because most scammers will
41:19
go and just like get one month's worth
41:21
of domain time and then use
41:23
that for their attack. And
41:25
now that I think about it, I'm disappointed that
41:27
there's not better information on these emails I get.
41:30
Sure, I have a spam folder and stuff gets thrown in
41:32
there, but I'd love to
41:34
see reasons for why my email provider
41:37
put it in spam. To me, spam
41:39
is ads I don't want. So
41:41
why not have a second folder of threats? Spam
41:46
and threats are two different things in my
41:48
mind, but they all seem to end up
41:50
in the same bucket in my email. I
41:52
would love, love, love to get threat intelligence
41:54
on my inbox. I could
41:56
see a little dashboard that says we've
41:58
blocked 24 phishing emails for you
42:01
this month. In there, we had five
42:03
BEC attempts, two pig butchering emails and
42:05
13 emails containing malware from a threat
42:07
actor known for targeting journalists. At
42:10
a bare minimum, just show me a big
42:12
bright red banner on the email that says,
42:15
look out, this email comes from a domain that
42:17
was registered two days ago. That
42:19
would be really cool. Google, if
42:21
you're listening, make that picture happen, and
42:23
fix the Google dot bug too. I
42:28
mean, they might be already filtering it out and putting
42:30
in a spam, but stuff
42:32
that gets through, I'm like, Hey, that
42:34
is a good tip. Yeah.
42:37
And just from the way
42:39
BEC is, so many
42:41
of these emails still get through. There's a reason
42:43
it's been the number one crime seven years in
42:45
a row. So many email gateways are trying to
42:47
put protections in place, and a lot
42:50
of information security focuses on the
42:52
malware, the APTs, the blinky boxes
42:54
and like this stuff still
42:56
gets past because there's no malware, there's no
42:59
malicious URLs or content in there. It's manipulating
43:02
the human. So many of these
43:04
attacks just bypass your email gateways.
43:08
With a lot of your BEC actors from
43:11
an attribution perspective, this ties back to groups
43:13
such as like Black Axe, where they will
43:15
go and use those type of manipulation in
43:17
order to gain that foothold. Wait, so what's
43:20
Black Axe? So Black
43:22
Axe is one of the larger
43:25
Nigerian confraternities that dabble in
43:27
this. So if you're
43:30
unfamiliar with the term confraternity, think
43:32
of a college fraternity here in
43:34
the States, but mixed with Black
43:36
magic and voodoo. And
43:38
what I mean by that is some of the
43:41
hazing rituals for Black Axe include
43:43
a human sacrifice or trying to use
43:45
those type of techniques in order to
43:47
quote unquote, gain extra powers to become
43:49
a better scammer. Are
43:52
you still on the same podcast? What is
43:54
going on here? Hey, hey, trust me, trust
43:56
me. Yeah, no, I'm dead serious on it.
43:58
No, I know it sounds like I went off,
44:00
not into Cyberland, but no,
44:02
Black Axe is one of the larger groups
44:05
who's doing a lot of the business email compromise
44:07
activity. Okay. Are we really
44:09
going here? I
44:11
mean, when someone tells me they're
44:14
using voodoo and black magic to
44:16
become a better scammer, I'm
44:19
like skeptical and just want to
44:21
move on past that. I don't even want to pick
44:23
that up. For some
44:25
reason, I'm feeling compelled to look this one
44:27
up. So first of
44:29
all, I watched an hour
44:31
long BBC documentary on who
44:34
Black Axe is and it's
44:36
absolutely bonkers. I mean, just listen to
44:38
the first 40 seconds of their documentary.
44:42
This morning, several bodies along with their hessic
44:44
happening that were listed around the city of
44:46
Anthony who had been killed in all of
44:48
their lives and killings within the last week.
44:52
The Black Axe cult is thriving
44:54
in Nigeria, more terrifying than
44:56
anything I've ever seen. Around
44:59
the world, crime agencies are cracking down
45:01
on their multi-million dollar internet fraud
45:03
and human trafficking network. Nigeria
45:08
is a private nightmare too. But
45:12
here in their homeland, the
45:14
cult is in the store. In
45:17
thousands of young boys, I'm in this
45:19
store. This documentary
45:22
explains that Black Axe is a
45:24
cult full of gang
45:26
violence. They
45:30
have agreed to let us film what they call
45:32
a gyration, a cult ceremony.
45:42
And these guys are really dangerous.
45:44
They go around murdering people all
45:47
the time, sometimes shooting up
45:49
buildings or causing massacres, which I guess in
45:51
the US is called mass shootings. The Black
45:53
Axe has killed thousands of people. this
46:00
violence began. The
46:02
Black Axe formed here 40 years ago
46:05
and students have still been murdered on
46:07
campus today. The Black Axe emerged out
46:10
of a student fraternity known as the
46:12
Neo-Black Movement of Africa or NBM. The
46:15
movement initially stood for peace but
46:17
over time became linked to crime.
46:21
Today many people use the
46:23
names Black Axe and NBM
46:25
interchangeably. This has been going on
46:27
for 40 years? What? That's
46:31
interesting because they initially started as a Neo-Black
46:34
Movement to fight oppression
46:38
but it's very different now and it's unclear
46:40
to me what their motives are now. Something
46:43
something freedom, something something defend
46:46
but even though Wikipedia thinks
46:49
NBM and Black Axe are
46:51
the same the people within
46:53
NBM don't agree. He's the
46:55
president of NBM. NBM is
46:57
not Black Axe. NBM has
47:00
nothing to do with criminality.
47:02
NBM is an
47:05
organization that tends to help
47:08
achieve greatness in the world. Despite
47:12
the president's denial, the NBM
47:14
is facing mounting international pressure.
47:17
Since our interview, the FBI arrested
47:19
more than 35 NBM members in
47:21
the US and South Africa charged
47:23
with multi-million dollar internet fraud. But
47:25
a US Department of Justice statement
47:27
named the Neo-Black Movement of Africa
47:29
as a criminal organization and part
47:31
of the Black Axe. You've
47:37
got this extremely violent street
47:39
gang, a cult, Black Axe,
47:41
slash NBM but
47:44
they seem to also be involved
47:46
with internet scams. Here's
47:48
Vice explaining what they found. How
48:00
many people are trying to get
48:02
out of you? Like 96,000 people are
48:05
going to go to jail? In
48:08
October 2021, eight men were
48:10
arrested in Cape Town on
48:12
serious fraud charges. The men
48:14
were allegedly members of the Black Axe,
48:16
a notorious Nigerian organized crime group.
48:19
And specific to the
48:22
human sacrifice, the way that that plays out is
48:25
for your Nigerian scammer, they're
48:27
called a Yahoo boy. So
48:29
in order to become a better scammer, a
48:31
Yahoo boy plus, there
48:33
is a human sacrifice ritual where you
48:36
have to kill somebody to gain better
48:38
powers to go and continue this
48:40
type of scamming. And like
48:42
I said, sounds far out there, but
48:44
it's widely documented that this is
48:46
unfortunately one of those cases. And that's why
48:48
I get so bitter towards ransomware is that people
48:51
like, oh, somebody might die here over here, somebody
48:53
might go over here because of this ransomware attack.
48:55
I'm like, no, we have people literally sacrificing each
48:57
other because of this stuff. And like, that's where
48:59
the problems are on some of these cases. Holy
49:03
moly. Yes. Yeah.
49:08
I also watched a few videos about
49:11
Yahoo boys. I guess they get
49:13
their name because they started out using Yahoo messenger
49:15
to conduct their scams over. And
49:17
they interviewed some of the Yahoo boys who then
49:20
explained how they do it. And they were open
49:22
about what they were doing. They're like, yeah, we
49:24
scam people and steal lots of money from them.
49:27
In fact, they even
49:29
posted a video of one of their
49:31
victims on the verge of suicide. Here,
49:33
listen. So
49:45
even though they're ruining people's lives
49:48
and know that some of these victims
49:50
that they have are committing suicide and
49:53
they say they're all addicted to drugs, they
49:55
deny their involvement with
49:58
human bloodshed. It
50:00
wasn't exactly clear from these interviews I
50:02
watched, but it did seem
50:05
like they were killing cows or other animals
50:07
to try to level up their scamming. Which
50:09
I have to admit, at first I'm just
50:12
like shocked that anyone would think that they'd
50:14
become a better scammer because of an animal
50:16
sacrifice. But the thing is,
50:18
the culture of Nigeria is rich with
50:20
a lot of this voodoo and hexing
50:22
and charms and stuff. In
50:24
fact, when the BBC reporter went to investigate
50:26
the Black Axe cult, he found a vigilante
50:29
group who was trying to stop the Black
50:31
Axe and they gave him a charm to
50:33
protect him during his investigation. They
50:35
gave him an amulet to
50:38
protect him from gunshots. He
50:42
still wore a bulletproof vest
50:45
though, but
51:00
this is what I mean. The culture there
51:02
is really big into this. And
51:04
you know, luck is a weird thing. It
51:07
feels like a mysterious force. Can
51:10
it be changed in any way? So I
51:13
can see why somebody would want to do
51:15
weird stuff to try to improve their luck.
51:18
And if you really, really, really want
51:20
to improve your luck, then maybe you've
51:22
got to do something a little insane. And
51:25
I can see how bloodshed can get mixed up
51:28
in all this. It's very
51:30
awful and strange though. How
51:34
the hell did we get from romance scams to
51:36
this? Man, the places we go on this show. Now
51:40
I can see why you're so fascinated by all
51:42
this. These stories are crazy. Tell
51:45
us about that one story
51:47
you heard about going on in South Africa.
51:50
Okay. Yeah. So this
51:52
was a Black Axe case they had down
51:54
in South Africa. Like
51:57
I mentioned earlier, I do a lot of work back and forth
51:59
with law enforcement. So I get to hear a lot
52:01
of the good stories as a result
52:03
of this, but they were
52:05
doing the case. They went down to go and
52:07
arrest the individuals and they were
52:09
kind of at this compound down in South Africa
52:11
and they didn't really, and they were
52:14
able to get into most of the houses and
52:16
most of the buildings. And there was one building
52:18
in the, or one window in the back that
52:20
they couldn't get into. So they busted it
52:22
down and got in there. And in that building,
52:25
what they found was they found a pile of
52:27
money covered with blood and dead chickens. So
52:29
as they came out and unlocked the door to get in
52:32
there, they kind of got talking to the
52:34
people that they were arresting and they were like,
52:36
what's this? Because you don't really
52:39
expect to find that on a
52:41
law enforcement engagement. So what
52:43
the scammers had said was, well,
52:45
it turns out that the magic here in South
52:47
Africa is not as strong as the juju in
52:49
Nigeria. So we need a larger pile of money.
52:52
And that's one of the things that most people don't realize
52:54
is that there is a spiritual aspect that plays on this
52:57
that many of the scammers believe. And
52:59
when you account for that and you account for a
53:02
lot of the way that they perceive a lot of
53:04
that stuff, it gets really, really interesting. And because of,
53:06
again, that spiritual aspect, it's, like I
53:08
said, it's there's so many other things that the
53:11
scammers are kind of playing with and
53:13
using or believing in things that they don't
53:16
fully understand, or are playing with, in my opinion.
53:19
Man, Ronnie, I don't even know what to ask
53:21
you at this point. You've just
53:23
got me going down Jack rabbit
53:26
holes or something. Yeah.
53:30
Yeah. Yeah. I, yeah. I'm the
53:32
kind of guy who's at a dinner table. I was
53:34
like, Hey, let's talk about blood sacrifices and voodoo. So,
53:37
okay. So while looking up these
53:39
Nigerian scammers, I saw
53:41
something about this group called Scattered
53:44
Canary. Do you, can you tell us
53:46
about this? Yeah. Scattered
53:48
Canary was a mostly
53:51
Nigerian cyber fraud group that we found back
53:53
in 2018 that was engaging in
53:57
business email compromise. The reason we named
53:59
them Scattered Canary was because, one, they were
54:01
very scattered in their targeting. And
54:03
two, they were kind of our canary in the
54:06
coal mine that led us to identify a lot of
54:08
things around 419 scams and
54:10
business email compromise. One of the
54:12
things that happened during the pandemic was
54:15
unemployment money was
54:17
given out fairly easily. And
54:20
whenever one of these programs happened, the scammers
54:22
are quick to jump on that and
54:25
they quickly jumped on that bandwagon for
54:27
a lot of the unemployment funds. What
54:30
scattered canary did was
54:32
they used different email accounts
54:34
or email accounts that had the Google dot bug
54:36
in them. And they went
54:38
and hit the unemployment fraud systems. And
54:41
at the peak, we saw them hitting 14
54:43
different states for unemployment
54:46
fraud. In general, where that stands,
54:49
we are upwards of around $400 billion
54:52
that's been stolen as a
54:54
result of some of these things. And there's some new
54:56
information coming out from about ID.me
54:59
and how some of the
55:01
stolen money may not have been fully
55:04
articulated. But what we know of
55:06
right now is that $100 billion
55:08
was confirmed from Secret Service. We
55:10
know that $400 billion is up
55:12
in question for the
55:15
money that was taken. $100 billion was confirmed?
55:17
Yep, $100 billion. So the scam
55:19
was: I'll
55:24
submit unemployment on behalf of some American
55:26
and then I'll tell them to send
55:28
the money here to me in Nigeria.
55:30
But it's probably money muled through
55:32
and then sent to Nigeria. But that's where
55:34
the $100 billion went. That's
55:36
what I'm... Yeah, billion with a B. Billion with a B. Yeah.
55:39
Yeah. So and that's kind of
55:41
where the lines get muddy between
55:43
business email compromises because we
55:45
know that Scattered Canary again, who is doing
55:47
business email compromise, we know they were doing
55:49
romance scams. We know they were doing unemployment
55:51
fraud. And that's kind of why I say
55:54
BEC is the number one crime that's out
55:56
there because that's over $500 billion
55:58
that's... We
56:00
know our tied back to business
56:03
email compromise scammers who are doing this and
56:05
we know other scammers were involved in that too
56:07
But no, like I said, it was a hundred
56:09
billion dollars that was confirmed from Secret Service. There's
56:12
a possible 400 billion
56:14
dollars that is up for discussion
56:17
and kind of being pushed through for
56:19
Congress. But it looks like the
56:21
new number is going to lay out at about
56:23
400 billion dollars that has been confirmed. I
56:25
mean, I've got to try to understand these numbers
56:28
more. Okay, so I'm just walking through it in
56:30
my mind. So a hundred billion is
56:32
coming from the US Treasury. That's
56:37
a lot of money that's just gone. Because
56:39
not only is that a lot of
56:41
money that the US Treasury lost, that's a lot of
56:43
money that came out of... are you an American citizen?
56:46
Yeah, okay. So that's a lot of money that came
56:48
out of mine and your pocket. In addition to that,
56:50
it looks like scammers may have taken upwards of
56:52
about 400 billion dollars. And
56:55
the other kicker here too is that that fraud
56:57
is still happening. According to my intelligence sources
57:00
out in Nigeria, within
57:02
the last two weeks, they're still
57:04
stealing money from the government. The
57:06
average salary for a Nigerian is
57:10
a hundred US dollars per month. So when
57:12
you go and you have that much money coming
57:14
in, it becomes very enticing for the
57:16
youth out there to want to go and try and do this
57:18
fraud. But still, I can't
57:21
fathom this amount of money coming
57:23
in like the entire GDP of
57:25
Nigeria is 500 billion
57:27
dollars. You're telling me that
57:29
this one group has stolen almost the
57:32
equivalent to the whole country's GDP from
57:35
the US government, almost doubling
57:37
Nigeria's GDP. It's just
57:39
nuts. The Secret Service says
57:41
nearly a hundred billion dollars in pandemic relief funds
57:43
have been stolen. That adds up to about 3%
57:46
of the cash handed out by the government
57:49
Most of the lost money is from unemployment
57:51
fraud. Right now, the Secret Service says it
57:53
has more than 900 active criminal investigations into
57:55
pandemic fraud with cases in every single state
57:58
Man the more I look into the stolen.
1:00:01
Who out there thinks it's totally fine that we
1:00:03
lost a trillion dollars? I want
1:00:06
my voice to be clear. As an American,
1:00:08
this is unacceptable to me. I'm very
1:00:11
disappointed that the US government handed this
1:00:13
much money to the same Nigerian scammers
1:00:15
who tried to convince us all that
1:00:17
our long-lost relative was the Prince
1:00:20
of Nigeria. I would
1:00:22
be understanding if the government fell victim
1:00:24
to some sophisticated cyber attack like a
1:00:26
ruthless, unstoppable... but
1:00:28
you got taken by the least
1:00:31
sophisticated scammers on the planet. You
1:00:34
need to do better. When
1:00:36
you're handing out this much money as fast as
1:00:38
you can, you've got to look at who you're
1:00:40
handing it to. At the
1:00:42
very least, give it to an American. What
1:00:44
is this? Your first day on the internet?
1:00:48
Listen to Secret Service agent Roy Dotson here.
1:00:51
He's the lead investigator of this case. Fast
1:00:53
money equals fast crime. I
1:01:00
mean, at this point in the interview, I'm just
1:01:03
kind of feeling defeated. Welcome to
1:01:05
the last seven years of my
1:01:08
life. Because it's something where it's
1:01:10
like, it's very disheartening. And like I said,
1:01:12
staring at this stuff for so long, it's
1:01:15
something where it's like, it is very disheartening
1:01:17
because you do feel defeated. You do feel
1:01:19
like, okay, we've literally lost $500 billion.
1:01:22
And that's just what we know. If we
1:01:24
were to actually piece together what we knew,
1:01:26
I'm just going to throw this out there.
1:01:28
We're easily over a trillion dollars that we
1:01:30
lost here. And a lot of
1:01:32
what it comes down to is admitting
1:01:35
that there was a problem, admitting that something
1:01:37
needs to be fixed, admitting that something needs
1:01:39
to give. Because if you keep having this
1:01:41
much money that's going out and you don't
1:01:43
admit that it's a problem, you're
1:01:45
just going to be stuck. And when you
1:01:48
go and look at the 20, 25 years
1:01:50
of Nigerian Prince scams, this is the whole
1:01:52
reason that we're here right now is because
1:01:54
no one wanted to admit that no, this
1:01:56
is actually something that's happening. Yes, there are
1:01:58
people who are actually being socialized. engineering to
1:02:00
this. We have to work with those people in
1:02:02
order to identify some of that. So
1:02:04
trust me, I totally resonate with you. I
1:02:07
totally feel you when you feel
1:02:09
defeated on that because a lot of times I do too.
1:02:12
But knowing that I'm on the right side of
1:02:14
this, knowing that I'm helping victims, knowing I'm helping
1:02:16
them recover their money, and knowing that
1:02:18
I'm helping reshape a lot of the way that the
1:02:20
industry thinks about this stuff. That's what keeps me fighting
1:02:22
this stuff every day. A
1:02:32
big thank you to Ronnie Tokazowski for sharing
1:02:34
his stories with us. He works at a
1:02:36
place called Intelligence for Good, and he's the
1:02:38
chief fraud fighter there. If you run into
1:02:40
any of the problems that you heard today,
1:02:42
you might want to check out Intelligence for
1:02:44
Good because they might be able to help
1:02:46
you. This episode was created by me, the
1:02:48
master of disaster, Jack Rhysider, assembled by the
1:02:50
juicy smoocher Tristan Ledger, mixing done by Proximity
1:02:52
Sound, and our theme music is by the
1:02:54
mysterious Breakmaster Cylinder. You might be wondering what
1:02:56
my political association is. I'm
1:02:59
Alt-Tab. This
1:03:01
is Darknet Diaries.