Episode Transcript
0:02
Morning! It's a rough
0:04
start. It's an early
0:06
morning, 6am here. When I
0:09
get up, or something, I
0:11
close all the distractions.
0:14
It is a travel day
0:16
as well. So today, we're
0:18
off to London. Haven't been
0:21
to London since February
0:24
2020. Now if you recall,
0:27
that was an odd
0:29
time. Remember sitting there.
0:31
in Heathrow about to fly home
0:33
and we're looking at the news
0:35
and everyone's like making
0:37
jokes about this new virus and
0:40
then things sort of escalated
0:42
and here we are five years
0:44
later yeah so anyway first trip
0:47
back to London if you're not
0:49
from Australia you're probably
0:51
aware we are on the
0:53
other side of the world
0:56
to absolutely everything so it's a 14
0:58
hour flight well Drive an
1:00
hour and a half up
1:02
to the airport. 14 hour
1:04
flight to Dubai. Hang around
1:07
Dubai for a couple of
1:09
hours. Seven hour flight to
1:11
London. I'll get there. Door
1:14
to door, when you got
1:16
two hops, about 24 hours,
1:18
25 hours, something like
1:21
that. Chris is there.
1:23
How's the weather? It's
1:25
a stupid question. How's
1:27
the weather? Do you have
1:29
two digits in the temperature? What
1:31
is the temperature in London? What
1:33
am I getting myself in for?
1:35
Problem is, it's not a problem, but
1:38
we're going to London and then to
1:40
Iceland and I expect Iceland to be
1:42
cold. So you've got to dress for
1:44
Iceland, but the stuff that I wear
1:46
when you're like freezing and below is
1:49
quite different to the stuff you wear
1:51
when it's like five, six, seven degrees.
1:53
So, we've got big suitcases. Let's have a
1:56
look. Where are we? Where are we?
1:58
London, 13. Wow, you got two
2:00
numbers in your temperature. That's impressive.
2:02
Let's see what the forecast. Cloud,
2:05
rain, rain, rain, cloud, cloud, cloud,
2:07
rain. What's that yellow thing behind
2:09
the cloud? Oh yeah. That's what
2:11
it is. Ah. So I said
2:14
we do. Said 14 earlier today.
2:16
I am seeing, I'm seeing on
2:18
this forecast. Oh holy. Today, a
2:21
high of 19 for London. What's
2:23
going on there? Geez. Should pack
2:25
my thongs. Okay, so there's that,
2:28
where's that? So actually, let's go
2:30
through his lunch break. I like
2:32
the fact that that's newsworthy. Instead
2:35
of my hay food for the
2:37
first time in 2025. Yeah, Alastair
2:39
19. Great, well, okay. So there's
2:42
that, where's that? So, actually, let's
2:44
go through events and things. Yeah,
2:46
so we'll get to London. Saturday,
2:49
your time, somewhere in the middle
2:51
of the day. I have some
2:53
meetings with our friends in high
2:55
places on Monday and Tuesday, Wednesday,
2:58
going to be in Oxford. I'm
3:00
doing a talk at the University
3:02
of Oxford on Wednesday. It is
3:05
on my events list on the
3:07
front of my blog. It is
3:09
an open event, so if you
3:12
are in or around Oxford. That
3:14
one's there. That's organized by my
3:16
good friend Ciaran Martin, who some
3:19
of you know. Ciaran started the
3:21
NCSC in the UK. When did he
3:23
start that? Must be 10 years
3:26
plus ago. Anyway, so I'll be
3:28
there. Scott's there, good day Scott.
3:30
I'm going to see Scott on
3:32
Tuesday. Bring the one where you'll
3:35
be sweating, mate. 19 degrees. You'll
3:37
need a cold one. Have to
3:39
find some cold ones. Richard's here
3:42
as well. Only 8C on the
3:44
coast. Oh yeah. Yeah, it's not
3:46
good. So anyway, we'll be at
3:49
University of Oxford on Wednesday doing
3:51
an event there. That's a free
3:53
open event. That's, I think that's
3:56
going to be a lot of
3:58
fun. I think they're a good crowd
4:00
there. Next day we're going to
4:02
go to Iceland, so we will
4:04
be in Iceland. I think we
4:07
might have an event in Iceland.
4:09
There seems to be a
4:11
little bit of misinterpretation as
4:13
to what it means to
4:15
come into a pro bono
4:17
event. We're trying to sort
4:19
that out in advance. So
4:21
if there is an event
4:24
Monday week... in Reykjavik. We'll
4:26
get back to you on that.
4:28
We'll be there and then we'll
4:30
be off to Dublin. I don't
4:32
have anything public planned in Dublin
4:34
yet. If anyone does want to
4:36
stand up an open community event, let
4:39
me know. We've got a couple
4:41
of days in Dublin. So that is
4:43
that. And then we will come home for
4:45
a day. Literally, literally 24 hours
4:48
and a little bit. And
4:50
then go to Perth. We
4:52
now have an event in
4:54
Perth. Now let me get
4:56
the exact term right this
4:58
is a Microsoft
5:01
event, the Microsoft Student Accelerator
5:03
meet-up. Scott, I'll reply
5:05
to you — I'll get to
5:07
your message as soon as I
5:09
finish this. Actually, you know what
5:12
I'll do, Scott, I'll just
5:14
forward you something and then
5:16
I can talk to you
5:18
after this. Cheers. If people
5:20
could see Scott the things
5:23
that we talk about offline
5:25
Which one are you? Which
5:27
scots? Just a lot of
5:29
Scots. You notice that? We've
5:31
got to not mix up
5:34
the Scots. Okay, there you
5:36
go mate. Whichever other later?
5:38
Where was I? Yes, Microsoft.
5:40
So, Perth. So, on
5:43
the 14th of April
5:45
in Perth, we are
5:48
doing a Microsoft Student
5:50
Accelerator meet-up. Famous security
5:53
expert. Try hard. Lessons
5:55
from Billions of Breached
5:58
Records. Now, little... little
6:00
inside bit of insight. I give
6:02
every event I do the same title
6:04
and then I figure out what
6:06
I'm going to talk about at the
6:08
time. I call them all lessons
6:10
from Billions of Breached Records. It's the
6:13
same in Oxford and I will
6:15
probably have a slide deck with some
6:17
interesting stuff on it. I just
6:19
don't know, like, for Oxford though. Can
6:23
I talk about ShitExpress? Does everyone
6:25
know what ShitExpress is? All right, if
6:27
you haven't seen this before, this is
6:29
hilarious. I have gone and spoken at this
6:31
at many different law enforcement agencies as
6:33
well and taken them through this. I think
6:35
the ones that I have a good
6:38
understanding of where their level of sense of
6:40
humour is, go to shitexpress.com. It is
6:42
safe for work, so long as you
6:44
can say that word. A
6:47
simple way to send a
6:49
piece of shit around the world.
6:51
Imagine all the people who annoy
6:53
you most, an irritating colleague, a
6:55
schoolteacher, your ex-wife, a filthy
6:57
boss, a jealous neighbour, that
6:59
successful former classmate, all those pesky
7:01
haters. What
7:04
if you could send them a
7:06
smelly surprise? There's nothing that can
7:08
replace the expression on the recipient's
7:10
face after opening the bag of
7:13
shit. Now anyway, the relevancy to
7:15
ShitExpress and me going around
7:17
doing talks is that there's
7:20
a four step process when you want to
7:22
send someone a box of shit. Number one
7:24
is you choose an animal. Different
7:27
animals produce different types
7:29
of excrement. We particularly love
7:31
organic wet horse poop. Now
7:35
it turns out it's all horse shit.
7:37
We'll come back to that. Number two, give
7:39
us an address. Number three, pick a
7:41
stick. And number four, and this is where
7:43
the rub is, pay and stay anonymous.
7:45
The service is 100 % anonymous. We will
7:47
never reveal your identity even if you pay
7:49
by credit card or PayPal. Now have
7:51
you ever paid by credit card or PayPal
7:53
for anything and felt that you were
7:55
anonymous? They
7:57
had a data breach. Shit
8:00
everywhere — bad example, bad term —
8:02
data everywhere and there
8:05
are many interesting insightful
8:07
things that you could glean from
8:09
the data. One of them being
8:11
that they only ever sent horseshit.
8:14
So somewhere in here they were
8:16
talking about like pigs and donkeys
8:18
and whatever else, but the data
8:21
shows it is all horseshit.
8:23
Occasionally when I do this, people
8:25
like, oh so and so like do
8:27
they send... Human shit. It's like,
8:29
no, that would be weird. It's
8:32
all just horse shit. Yeah. So,
8:34
can I talk about that at
8:37
Oxford? You get to this prestigious
8:39
university and talk
8:41
about sending boxes of
8:43
shit to people. Anyway, we
8:46
got on that topic because
8:48
I tend to just pick
8:50
and choose bits. Put them on
8:52
a slide. And depending on
8:54
the audience, and apparently this is
8:56
an audience that's very engaged and
8:59
active, which is good, it will
9:01
just become a lot more, a
9:03
lot more interactive. I feel that
9:05
going and speaking to a bunch
9:07
of students in Australia is like
9:10
that as well, I think if
9:12
we get the Iceland event over
9:14
the line, I feel like it
9:16
will be a more demure audience
9:18
just purely based on my experiences
9:20
in the Nordic. We'll see.
9:23
That's good comments here. I
9:25
think people like the shit
9:28
story. Fritz is in Germany.
9:30
Good day Fritz. Scott
9:32
says if my chat history
9:34
is ever leaked, we're doomed.
9:36
We do have these regular...
9:39
Wow, don't you think this
9:41
is time that maybe we
9:44
just delete the history? Mind
9:46
you. Mind you. I've
9:48
had another discussion this
9:50
morning, which I think is partly explainable by
9:53
the fact that the chat history was
9:55
automatically deleting and some of the context
9:57
of things that were said quite some
9:59
time ago was completely lost. So there
10:01
are pros and cons to deleting
10:04
your chat history I don't think
10:06
there's any con with Scott
10:08
not doing it though. Stephen Jones
10:10
says Troy's way of saying who
10:12
the F is Scott Helme. I
10:14
have packed some of those stickers
10:16
too so if you're around me
10:19
I do have some Scott Helme
10:21
stickers for some reason I have
10:23
way more Scott Helme stickers than
10:25
Have I Been Pwned stickers so
10:27
I will bring some of those
10:29
with me as well. Richard so
10:31
did it really hit the fan?
10:33
So it really did hit the
10:36
fan, yeah. Okay, ShitExpress. Hmm.
10:38
Hmm. I'd love someone to actually
10:40
try and use that service and
10:42
see what happens. The funny thing
10:44
is, like somewhere in here, they
10:46
say, we deliver packages to all
10:48
countries in the world. You can
10:51
send a box of shit anywhere
10:53
in the world. And I find
10:55
that fascinating, for many reasons. One
10:57
of which is, you try getting
10:59
a banana into Australia. and see
11:01
how you go with that. It's
11:03
not going to work out well
11:05
for you. They will find it
11:08
and you will be fined. So
11:10
how do you get a box
11:12
of shit? Unless they have some
11:14
sort of affiliate model, like if
11:16
you had affiliates in Australia that
11:18
could obtain local shit and send
11:20
it out for a fee, of
11:23
course. That would make sense. Actually,
11:25
I'm glad he's here because we
11:27
can get onto the Cloudflare
11:29
discussion, and he, like me,
11:31
does have a
11:33
lot of... A lot of time
11:35
spent with Cloudflare. Now, I don't
11:37
know. There are many fascinating things
11:40
about this story we're about to
11:42
tell. Let's just go to the Cloudflare blog,
11:44
and I'll find where it is.
11:46
Just as a recap, I've been
11:48
using Cloudflare. It must be almost
11:50
10 years now. Wrapped around Have
11:52
I Been Pwned. And I largely
11:55
started using them to... to try
11:57
and keep abuse of the underlying
11:59
services out. Being able to do
12:01
things like, what's the word I'm
12:03
looking for here on the site
12:05
I'm looking for, being able to
12:07
do things like put rate limits
12:09
on individuals coming to the service
12:12
or block DDoS attacks. I regularly
12:14
get notifications from Cloudflare. And there's
12:16
just these, like, ridiculous volumes of
12:18
traffic coming through, you know, like
12:20
tens or hundreds of thousands of
12:22
requests per second, and they get
12:24
requests to the home page. And
12:27
I'm like, what are you doing,
12:29
mate? Like, it's all cached, on
12:31
the edge. I don't care how
12:33
many times you request it —
12:35
Cloudflare can serve it. But anyway, the point
12:37
is they do good stuff sitting
12:39
in there in the middle of
12:41
that connection, making sure that the
12:44
bad stuff stays out. And of
12:46
course then there's all these other
12:48
things that they've implemented over the
12:50
years that we make really extensive
12:52
use of: Cloudflare Workers,
12:54
R2 storage, KV. What's
12:56
the word they use? It's like
12:59
that persistent key-value store which replicates everywhere.
13:01
Loads and loads of things that
13:03
we use on Cloudflare. A really
13:05
important part of what Scott does
13:07
for Report URI as well. So
13:09
all of that is wonderful stuff
13:11
the sun has just come up
13:13
hang on. Wonderful stuff. So one of
13:16
the things that Cloudflare has been
13:18
involved in with myself for many
13:20
many years is Pwned Passwords. Now,
13:22
Pwned Passwords is our massive corpus
13:24
of password hashes — hashes of passwords
13:26
from previous data breaches.
13:28
That sits there in Cloudflare,
13:31
massively cached. We're serving 10 billion
13:33
requests a month from organizations like
13:35
say, Telstra in Australia. Now I
13:37
can pick Telstra as an example
13:39
because they're calling us async. You
13:41
can literally see the request in
13:43
dev tools. You go to Telstra,
13:45
which is our largest telco, and
13:48
then you go to put in
13:50
a password and as you type
13:52
the password it hits Have I
13:54
Been Pwned's API: has the
13:56
password been seen in a data
13:58
breach before? And it does it
14:00
using a really cool anonymity model,
14:03
which without going through the whole
14:05
thing again, hashes the entire password,
14:07
takes the first five characters of
14:09
the 40-character SHA-1 hash, sends
14:11
it to Have I Been Pwned, which
14:13
comes back with everything that
14:15
possibly matches, and then the client
14:17
figures out if the typed
14:20
password exists or not. Really cool
14:22
privacy-preserving anonymity model.
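Just to make that flow concrete, here is a minimal sketch in Python of the k-anonymity check described above. The api.pwnedpasswords.com/range endpoint is the real public Pwned Passwords API; the little client itself is only an illustration, not the official implementation.

```python
# Minimal sketch of the k-anonymity check: only the first five characters of
# the SHA-1 hash ever leave the client; the API returns every suffix sharing
# that prefix and the match happens locally.
import hashlib
import urllib.request

def pwned_count(password: str) -> int:
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    with urllib.request.urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        body = resp.read().decode()
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)  # seen this many times in breached data
    return 0  # never seen in the corpus

# e.g. pwned_count("P@ssw0rd") returns a large count; a long random passphrase returns 0.
```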
14:24
So they helped develop that back with
14:26
Junade Ali, I think, in about
14:28
2018, and Junade, I see, has since
14:30
moved on to other things.
14:32
But he was awesome in helping
14:35
us get that set up We
14:37
have grown and grown and grown
14:39
and grown that service. Cloudflare
14:41
provides it for free to the
14:43
community which is good on them.
14:45
That is awesome and last year
14:47
they spoke about something cool that
14:49
they had implemented Now I'm going
14:52
to find the correct term. It's
14:54
effectively leaked credential, what do I
14:56
call it, leaked credential check. I'm
14:58
going to go back to the
15:00
start. I'm finding the purpose of
15:02
comparing goes leaked outside. Okay, that
15:04
goes back to that one. So
15:07
you're in the comments while I'm
15:09
sliding. Stephen, super tempted to drive
15:11
to Oxford. No, my work had
15:13
been hoping you come up to
15:15
Cambridge. I did hear about someone,
15:17
yeah. Are you Redgate? The Seven?
15:19
Because Redgate had said, could you,
15:21
if you go to Cambridge, I
15:24
have been to Cambridge before and
15:26
seen Redgate and Cambridge, but always
15:28
that was a while again. So
15:30
anyway, mate, I'm flying the other
15:32
side of the world, it's not
15:34
far to go for you. Okay,
15:36
yeah, got you, all right. Mm.
15:39
Alfred, I got trying to defend
15:41
my network connections to the best
15:43
mobility is ages ago. Your attack
15:45
surface is just way too big
15:47
for me. Yeah, and like this
15:49
is the value proposition of
15:51
Cloudflare. This is in response to
15:53
James' message: I use Cloudflare for
15:56
my home lab. It stopped a
15:58
bunch of credential stuffing attacks on
16:00
my home VPN. They started slowing
16:02
down my network several times in
16:04
the past. Now just because this
16:06
is part of the the story I'm
16:08
going to tell as a recap of cloud
16:11
flare's value proposition: they're
16:13
a reverse proxy. So you have your
16:15
website that sits on a server somewhere
16:18
in the world, let's keep it simple.
16:20
One server, somewhere in the world. Have
16:22
I Been Pwned sits on a server in
16:24
the West US data centre. We route traffic
16:27
through Cloudflare, which means they control all
16:29
of our DNS they have some funky networking
16:31
tricks let's just call it that for the
16:33
sake of simplicity whereby no matter where you
16:35
are in the world you connect to the
16:37
Cloudflare edge node that's closest to you
16:39
there are more than 300 edge nodes around
16:41
the world so what it means is that
16:43
for me is the guy running the service
16:45
I put it out there put it behind
16:47
Cloudflare and then no matter where you
16:49
are in the world you connect to a
16:51
Cloudflare edge node that's close to you.
16:53
There's one up the road here in Brisbane.
16:55
There is one, there's probably many in London,
16:58
they've got an office in London. Why aren't
17:00
we going there next week? All right,
17:02
different story. I'm sure there's one
17:04
in Reykjavik and there's one in
17:06
Dublin, everywhere else we go. And the very
17:08
proposition of connecting somewhere really close to
17:10
you is that you can get a
17:12
very fast response from something like cache.
17:15
Now, as everyone knows, edge nodes
17:17
can inspect traffic and apply security
17:19
rules and this is a really
17:21
really critical thing because how do
17:23
you do that in an era
17:26
of ubiquitous transport layer security? We've
17:28
got the padlock, everything is
17:30
encrypted, how can they see the traffic?
17:32
And again this is a key part of
17:34
where things have gone off the rails the
17:36
last few days. When you make a request
17:38
to Have I Been Pwned, let's say you're
17:40
doing a domain search on Have I Been
17:43
Pwned and you're adding that domain to your
17:45
dashboard, you're going to have to talk to
17:47
the database, we can't just serve all that
17:49
from cache. Your request is encrypted on
17:51
your device, it goes to Cloudflare's
17:53
edge node close to you, that's where
17:55
TLS initially terminates. It decrypts the
17:58
traffic, it looks at the traffic. It
18:00
then re-encrypts the traffic, sends it to
18:02
the origin server, and then the server
18:04
responds and the same thing happens the
18:06
other way. They are able to inspect
18:09
traffic. That's the value proposition.
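To make the "where TLS terminates" point concrete, here is a minimal origin-side sketch in Python. CF-Connecting-IP is a real header Cloudflare adds at the edge; everything else is just an illustration of the fact that, by the time your own code runs, the edge has already decrypted and inspected the request and re-encrypted it for the hop to you.

```python
# Minimal origin sketch: the TCP peer here is a Cloudflare edge node, not the
# visitor, and whatever encryption covered the edge-to-origin hop has already
# ended by the time this handler runs, so the request is readable.
from http.server import BaseHTTPRequestHandler, HTTPServer

class OriginHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The original visitor's IP arrives in a header added at the edge,
        # because the direct connection comes from Cloudflare's network.
        visitor_ip = self.headers.get("CF-Connecting-IP", self.client_address[0])
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(f"Origin sees the request in the clear; visitor was {visitor_ip}\n".encode())

if __name__ == "__main__":
    HTTPServer(("", 8080), OriginHandler).serve_forever()
```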
18:11
And everyone putting their website behind it with the
18:13
expectation of getting anything from caching on
18:16
the edge through to using their WAF
18:18
tools to be able to look for
18:20
malicious requests through to doing anything that
18:22
involves them being able to inspect what's
18:25
in the traffic, that is how
18:27
it works. And for the longest time,
18:29
we've had these issues pop up, where
18:31
people will be like, hang on a
18:34
second, they're breaking encryption. It's like, no,
18:36
they're not. That's just where the encryption
18:38
ends, and then it gets re-encrypted. And
18:41
then it goes, like, that's literally the
18:43
way the service works. And then people
18:45
say, well, I thought my data was
18:47
encrypted all the way. The normal everyday
18:50
people using the service are not giving
18:52
that a second thought if you're saying
18:54
that as a site operator if you're
18:56
saying that as someone who's a Subject
18:59
matter expert and you're thinking that the
19:01
padlock somehow implies that your data is
19:03
encrypted all the way to where it
19:06
sits at rest Well you're wrong because
19:08
it was never like that you get
19:10
assurance that you're going to have encryption
19:12
and as a result confidentiality integrity and
19:15
all the other bits between your device
19:17
and the point where it terminates. And even if
19:19
you did have encryption all the way
19:21
to your origin service, the padlock doesn't
19:24
tell you what happens after that. It
19:26
doesn't tell you if the data is
19:28
encrypted when it's sent to the database.
19:30
It doesn't tell you if it's encrypted
19:33
when there's an API call made from
19:35
the web app out to some other
19:37
service. It was never meant to do
19:40
that. It only does that bit. And
19:42
the bits behind that are not clear.
19:44
That's the nature of the internet. There's
19:46
lots of moving parts. Now
19:49
I feel like we've been having
19:51
this discussion, you know what we've
19:53
been having it for, for a
19:56
very long time because I wrote
19:58
a blog post called Cloudflare SSL,
20:00
well put it this way, it
20:02
was that bloody... long ago I
20:04
was calling it SSL and not
20:06
TLS. Cloudflare, SSL and Unhealthy Security
20:08
Absolutism and the Unhealthy Security Absolutism
20:10
which I put in the description
20:12
for this weekly video as well
20:15
was effectively where people were saying
20:17
if it is not encrypted all
20:19
the way then it's no good.
20:21
But again like if it was
20:23
encrypted all the way they couldn't
20:25
inspect the traffic and they couldn't
20:27
do the WAF things and the
20:29
cache things and all the other
20:31
bits and pieces that
20:34
they do. Everyone understands, they sit
20:36
there in the middle by design,
20:38
people putting their services behind that
20:40
have an expectation that they will
20:42
be able to do things that
20:44
require them to be able to
20:46
do the decryption bit. Right, now
20:48
leads us to, what's the correct
20:50
word here, leaked password notification, leaked
20:53
credential check, back in June last
20:55
year, they announced this and this
20:57
is something that... that does use
20:59
our data as well, for Have
21:01
I Been Pwned, among other sources
21:03
of data, and that their value
21:05
proposition is essentially this. Now that
21:07
we all understand that they sit
21:09
in the middle of the traffic
21:11
and they are able to inspect
21:14
the traffic as people browse the
21:16
target website, if someone is logging
21:18
on to a target website and
21:20
their credentials are flowing through Cloudflare's
21:22
infrastructure, then Cloudflare can see the
21:24
credentials. Now there's a parallel thing
21:26
here around are they allowed to
21:28
do it but what about privacy
21:30
but GDPR. GDPR always seems like
21:33
the hammer and then everyone's just
21:35
looking for a nail to hit
21:37
with it. One would imagine an
21:39
organisation the size of
21:41
Cloudflare has enough lawyers to ensure
21:43
that when you sign up and
21:45
you put your services through them
21:47
that all of the relevant boxes
21:49
around the fact that they can
21:52
inspect traffic are checked. And just
21:54
as a recap of the size
21:56
of cloud flare as well, I
21:58
think they're saying here it's effectively
22:00
20% of the world's websites that
22:02
run through there. Let's see if
22:04
it was 20. What was it?
22:06
20%? Is it in the blog
22:08
post? I'm going to search
22:11
for percent. Ah, percent.
22:13
There are different percent.
22:15
I'm not, oh no, I'm looking
22:17
at the wrong page. Let's
22:19
try here. Now we'll find
22:21
percent. Here you go. With
22:23
30 million internet properties
22:26
comprising some 20% of
22:28
the web. 20% of the web
22:30
is routed through Cloudflare, which is massive.
22:33
Separate arguments we had about
22:35
monopolies and everything else. But let's
22:37
just recognise it is huge. And
22:40
I think I raised that point
22:42
because I'm sure they have enough
22:44
lawyers writing enough terms and conditions
22:46
that they have thought about ticking
22:48
the appropriate boxes such that when
22:50
people put their websites behind cloud
22:53
flare, it is well and truly
22:55
all tied up with a nice bow that
22:57
Cloudflare is able to inspect the traffic.
23:00
Anywho, where were we? Leaked
23:02
Credential Check. So, so let me
23:04
look at the comments, because a
23:06
lot of stuff flying through here,
23:09
and then we'll talk about how
23:11
they do this and where people
23:13
lost their mind. James: was
23:15
the Cloudflare, the free Cloudflare
23:17
tier, too generous? One
23:19
of the reasons I started
23:21
using them, is back in the
23:23
day, it was the fastest, easiest
23:25
way to get... The padlock icon,
23:28
let's just call that, the padlock
23:30
icon in the browser, made it
23:32
super easy to implement SSL slash
23:34
TLS. And I did a video
23:36
series, just off my own back,
23:38
it wasn't commercial, I just thought
23:40
it was interesting, called HTTPS Is
23:42
Easy. It's probably still there. httpsiseasy
23:44
.com — that itself sits behind
23:46
Cloudflare — yeah, here we go. Oh, how
23:48
long it was that I did that?
23:50
There's YouTube videos, there'll be
23:53
dates, dates, dates, dates, dates,
23:55
dates, dates, and those, and
23:57
those. Six years ago.
24:00
Oh, it's actually a little
24:02
bit more than that. June
24:04
2018, let's call it almost
24:06
seven years ago. So, yeah,
24:08
yeah, super, super easy and
24:10
free as well. Everyone likes
24:12
free. Thomas says,
24:14
funky networking tricks, which I recently
24:16
learned can make it in other
24:18
CDNs look like malware using fast
24:20
fluxing. Funky network
24:22
tricks do then carry with it
24:24
all sorts of other issues.
24:26
Yeah, one of them is that
24:28
it is very easy to
24:30
front malicious things when you have
24:32
access to a service like
24:34
Cloudflare. One of the massive criticisms
24:36
of Cloudflare is that there's
24:38
nasty stuff behind there, whether it
24:40
be stuff that's malicious content
24:42
or stuff which is just content
24:44
that we don't like. They
24:46
had a lot of controversy, I'm
24:48
sorry, in 2018 because of
24:50
the Daily Stormer, the white supremacist
24:53
website that was running behind
24:55
Cloudflare. And there's this whole other
24:57
massive discussion that we're not
24:59
gonna fit in the rest of
25:01
the show around what is
25:03
their responsibility for content that is
25:05
legal but recalcitrant. Massive
25:08
big, big, big issue over there.
25:10
So, there are issues there. As Scott
25:12
says, crazy how people don't understand
25:14
the basics of things they use. Milford
25:17
says, I suspect the criticism goes
25:19
to the heart of the issue, not
25:21
the structural cleverness of Edge. I
25:23
think it's very much going to the
25:25
philosophy of it. I will get
25:27
more into that in a sec. Scott
25:32
says, the key phrase is
25:34
transport encryption. And it's in the
25:36
name, TLS, transport, sorry, transport
25:38
encryption. As Scott says, it's in
25:40
the name: TLS, Transport Layer Security. It
25:44
only provides protection on the network, not on
25:46
the server. And it only provides protection up
25:48
to the point where it terminates as well,
25:50
which is again where people have the issue.
25:56
Okay, let's go on, credentials. They
26:00
can sit in the middle and look at
26:02
the credentials that are being submitted to
26:04
a website. They can start to do
26:06
some cool value and stuff. And what
26:08
they're doing with their leaked password check
26:11
here is that when someone's submitting
26:13
the form to log on to a website and
26:15
that request comes through encrypted
26:17
from the client through to Cloudflare's
26:19
edge node, and they are then able
26:21
to inspect the request and do all of
26:23
the WAF things that they're already there for
26:26
and the rate limits and all the rest
26:28
of it. They can... Also, look at
26:30
that password and they can apply the
26:32
same logic that we're using with
26:34
k-anonymity to do the 10 billion
26:36
searches a month with Pwned Passwords.
26:39
They can see whether that password
26:41
has been in a data breach before. And
26:43
they can do it super super super
26:45
super fast and then they can
26:47
add a request header such that when
26:50
that request is then routed onto the
26:52
origin, you've got a header somewhere. I'm
26:54
pretty sure that's how they do it.
26:56
Search for header. I'd have to read
26:59
the technical bits. How did they do
27:01
that? There's a link through to the
27:03
details. Let's imagine it's a header. I'm
27:05
quite sure it's there. Add the
27:07
header so that when you're writing
27:09
your code for running on your
27:11
origin server and you're receiving that
27:13
authentication request, not only can you
27:15
get the username and password, which
27:18
again is in plain text by
27:20
the time it arrives on your origin server,
27:22
after, as Scott has said, the
27:24
transport component finishes and has
27:26
decrypted — you can also see whether the password has been in a
27:28
data breach before. Now you could do the
27:30
same thing with Pwned Passwords by
27:32
getting the request through and then
27:34
making an outbound request to do
27:36
the Pwned Passwords check, but that's
27:38
going to add latency. You're then
27:40
going to have to make an
27:42
outbound request back to Cloudflare's
27:45
infrastructure to get that response. So
27:47
for them being able to do it on the
27:49
fly is super cool. I'm happy with that.
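For what it's worth, the origin-side shape of that would be something like the sketch below. The header name is an assumption for illustration only (as noted above, you would have to check Cloudflare's leaked credentials detection docs for the exact mechanism); the point is simply that the edge attaches the verdict so your login code doesn't need its own round trip to Pwned Passwords.

```python
# Hypothetical origin-side handling of an edge-attached leaked-credential signal.
# NOTE: "Exposed-Credential-Check" is an assumed header name for illustration;
# consult Cloudflare's docs for the real mechanism.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

class LoginHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        creds = parse_qs(self.rfile.read(length).decode())  # credentials arrive here in the clear
        flagged = self.headers.get("Exposed-Credential-Check")  # verdict added at the edge (assumed name)
        if flagged:
            # Reject and prompt a reset; a real app might instead allow the login
            # and then force a password-change flow.
            self.send_response(403)
            self.end_headers()
            self.wfile.write(b"That password has appeared in a data breach - please choose a new one.\n")
            return
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Logged in; the edge did not flag these credentials.\n")

if __name__ == "__main__":
    HTTPServer(("", 8080), LoginHandler).serve_forever()
```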
27:51
I was excited about that. They launched it
27:53
in the middle of last year. So
27:55
where's it all gone wrong? This week.
27:58
So it's all gone wrong this week. Because
28:02
they wrote a blog post
28:04
reporting on what they had
28:06
observed. Observed? Observed. Now, when
28:08
I initially saw this, I
28:11
was just interested in the
28:13
figures, because I know how
28:15
it works. Based on Cloudflare's
28:18
observed traffic between September
28:20
and November 2024, 41% of
28:22
successful log-ins across websites protected
28:24
by cloud flare involved compromised
28:27
passwords. That is an interesting
28:29
statistic. 41% of successful logins
28:31
are using a password that's
28:33
been in a data breach
28:36
before. In other words, 41%
28:38
of logins are at some
28:40
degree of greater risk than
28:42
the remaining 59%. I think
28:45
that's interesting. They talk about
28:47
scope of analysis, they've got
28:49
some charts, which are nice,
28:52
some graphs, some... flows here
28:54
about stuff, some cloud flare
28:56
value proposition, it's obviously a
28:58
marketing effort on their behalf
29:01
too. Happy days, no problem
29:03
with that. And then here's
29:05
where it gets political, because
29:07
it's always political these days.
29:10
I am finding myself spread
29:12
across X, Mastodon, and Blue
29:14
Sky. And there are very
29:16
different people on all of
29:19
these. Now nobody on X
29:21
seemed to mind this, nobody
29:23
in Blue Sky seemed to
29:26
mind this, but then a
29:28
bunch of people on Mastodon
29:30
were very upset. And the
29:32
upsetness seems to be centred
29:35
around the fact that people
29:37
were surprised that Cloudflare was
29:39
able to do this. And
29:41
there seems to be some
29:44
misinterpretation that they have gone
29:46
too far in accessing people's
29:48
passwords or something to that
29:50
effect. Now before
29:53
I delve in there
29:55
further I do think
29:57
that there are some
29:59
optical problems as they
30:01
would say with the way this
30:03
has been written evidently based on
30:05
the feedback and probably it
30:07
could have been clear up front around
30:10
everything from how they're able
30:12
to do this to the fact that the
30:14
site owners are agreeing to somewhere in
30:16
the terms and conditions it allows them
30:18
to do this and that there is
30:21
no privacy risk around using the k-
30:23
anonymity model. It is there in
30:25
the blog post and some of the
30:27
discussions I've seen in the blog post.
30:30
Apparently one thing wasn't originally in
30:32
the blog post but then it was in
30:34
there by the time I read it and
30:37
then people were upset about that as well.
30:39
I'm angry this isn't in the blog post.
30:41
All right here it is. I'm angry
30:43
it's now in the blog post and
30:45
they didn't say in the blog post
30:47
after they fixed it that before they
30:49
hadn't. Oh for fuck's sake like where
30:51
does it end? Anyway, so there's some
30:53
contention around there. There's a lot
30:55
of argument then around whether or not
30:57
they should be able to do this. because
31:00
passwords are PII. They're not, but
31:02
I'm going to come back to that.
31:04
And if you go and read some of
31:07
my engagements on Mastodon, you'll
31:09
see these discussions. I
31:11
just think that there's sort of
31:13
a sensitivity around passwords where on
31:15
cloud flares end, they've got to
31:18
be a bit more cautious about how
31:20
they communicate this. And on the
31:22
end of people commenting on it,
31:24
they've got to have some... understanding
31:27
of the technical implementation of this,
31:29
and also the fact that cloud
31:31
flare inevitably have ticked off the
31:33
legal boxes they need to be
31:36
able to intercept traffic, given
31:38
that that is their entire
31:40
value proposition. The bit about PII, I
31:42
had a couple of people argue that
31:44
passwords are PII. Now the... I think the
31:46
hint to the reason why I believe this... I'm
31:48
going to caveat it. I believe this is
31:50
wrong — is that PII is personally identifiable
31:53
information, information that can
31:55
be used to reasonably
31:57
identify a person, an email
31:59
address. A name in combination with
32:02
other factors. John Smith can't
32:04
reasonably identify one person. In
32:06
combination with the physical address
32:08
it can. An IP address
32:10
is relatively unique. A password
32:12
is not. It's personal information,
32:14
it's your information in the
32:16
same way as if you
32:18
were to enter your favourite
32:20
colour into a website, that
32:22
is personal information, but if
32:24
you see blue, you're not
32:26
like, oh that's definitely Troy,
32:28
you know, you're going through
32:31
a database, someone likes blue,
32:33
oh that's Troy. No, it's
32:35
not, it might be, could
32:37
be. But what if someone
32:39
enters, when the website asks
32:41
for your favourite colour, And
32:43
it says, Troy Hunt at this address, with
32:46
this IP address and this email address,
32:48
likes blue. Well, that's a different story.
32:50
But the request for the favourite colour
32:53
alone is not the thing that makes
32:55
it personal identifiable. And if you do
32:57
go through the Mastodon thread, there's a
33:00
couple of folks there sort of saying,
33:02
GDPR. Again, everyone uses GDPR as a
33:04
hammer. I know
33:07
you're very fond of that in your
33:09
corner of the world. There are other
33:11
privacy laws in other parts of the
33:14
world as well, and a lot of
33:16
them do align around things like what
33:18
is sensitive data, medical information, specifically pathology
33:21
reports or doctors. These sorts of things
33:23
are very clearly defined as a particular
33:25
class of data. But when pushed, and
33:28
I said to people, can you point
33:30
me to the bit which says that
33:32
passwords are PII, and I get sent
33:35
a link, and I searched through the
33:37
page. Well the word password doesn't appear
33:39
anywhere. Ah it's implied. No it's not
33:42
implied. Being personal data doesn't make it
33:44
personally identifiable. And the reason we went
33:46
down this rabbit hole is because there
33:49
was concern that cloud flare were taking
33:51
something that could identify individuals and ultimately
33:53
compromise their privacy by doing what they're
33:56
doing and it just simply doesn't work
33:58
that way. Someone suggested
34:00
that Cloudflare engaged me
34:02
to respond on this or paid me
34:04
or something to that effect, which is also
34:06
bullshit. It doesn't work that way. It's
34:08
literally, it's a little bit like the HTTPS
34:11
Is Easy thing from nearly six years ago.
34:13
We're just like, I have an opinion on this
34:15
and I'm going to put that forward and
34:17
here it is. So no, it
34:19
is not a Cloudflare shill. But I think
34:21
Scott would pretty much be on the, Scott's
34:24
saying, someone did say GDPR. Yeah, right.
34:27
So I think Scott's reading the thread.
34:29
Like you read the thread and feel
34:31
free to chime in here with your
34:33
own comments on this, mate, but you
34:35
just get that tone. Now to go back
34:37
to the bit about the social platforms
34:39
and the politicization, it
34:42
does feel to often be the way that
34:45
the folks that have gone to mastodon
34:47
get very cranky about this stuff in
34:50
just the same way that the folks that
34:52
went to blue sky get particularly cranky about Elon.
34:55
Isn't it weird at the moment? Certainly don't have people
34:58
like, burning Teslas. It's not so much
35:00
the keying Teslas and burning superchargers and
35:02
everything, which is just dumb. It's
35:06
seeing people on the other
35:08
side of the political spectrum
35:10
then being defensive of the
35:12
people doing that. Well, they're
35:14
very upset because Elon. The
35:16
world's gone mad. Anyway, good day,
35:19
Wayne. Thanks for coming. Stephen,
35:21
good day, mate. We're just finishing on the
35:23
Cloudflare stuff. I know you've been looking at
35:25
some of the Cloudflare stuff this week
35:27
as well. Around the
35:29
leak financial. Yeah, people got very upset about
35:31
that. There
35:35
was one person in particular, again, if you
35:37
go back through the thread, who
35:40
seemed to have a lot of trouble having
35:42
a discussion without swearing. Now
35:45
we all do this from
35:47
time to time, but as I said
35:49
at one point in time in the discussion, he
35:54
seems to have started with the premise of
35:56
fuck cloud flare and then just worked backwards. This
36:00
is my view. Let me find
36:02
ways to support my view and
36:04
even as the discussion progresses and
36:06
it's it's very clear that yes of
36:08
course they have the ability to inspect
36:11
traffic and of course they would have
36:13
had the legal right to do so
36:15
and people would have opted into that
36:17
even if you didn't read the
36:20
terms and conditions if you're trying
36:22
to find reasons to support your...
36:24
view of not liking the organization
36:26
and there are many reasons people
36:28
don't like Cloudflare. People call it
36:31
Crimeflare because it can sit
36:33
in front of malicious websites
36:35
or yeah again we had
36:37
the discussion about sitting in
36:39
front of otherwise legally
36:41
operating but recalcitrant services that we
36:43
don't like. Again a whole other discussion
36:46
but clearly people wanted to work
36:48
backwards and find a way of
36:50
being upset with Cloudflare. I kind of
36:52
feel like... I think when you're having a
36:54
discussion on a public forum and you're
36:57
trying to articulate a point around, they'd
36:59
be pissed. James: an Aussie complaining about
37:01
swearing, I thought that's how you all
37:03
said hello, I can say how we
37:06
say hello, but I probably shouldn't
37:08
say it on this video.
37:10
Times and places, you know,
37:13
I think when you're having
37:15
a discussion on a public
37:17
forum and you're trying to
37:19
articulate a point around What
37:22
is ultimately a combination of
37:24
technical implications or implementations, privacy
37:26
concerns, legal terminology, I don't
37:28
think that it really helps
37:30
the conversation to be
37:33
using abusive language, particularly
37:35
not in the angry
37:37
sense. Lots of different
37:39
ways of swearing, which are
37:41
more light-hearted and fun. It's
37:44
pretty much mandatory here in
37:46
Australia. Note to self:
37:48
stop doing that before I
37:50
go to Oxford next week All
37:52
righty Yeah, so that's clear. I
37:54
will fault them on one thing
37:56
just to show everyone I'm not a
37:59
Cloudflare shill: I went and checked
38:01
my Cloudflare assets that
38:03
I've had there for quite
38:06
some time and the leaked
38:08
credential check was not enabled.
38:10
In fact we can do
38:13
this now. So if I
38:15
go, let's pick something here.
38:17
Let's pick a website I've
38:19
had there for a while.
38:22
Let's go to Scott Helme
38:24
sucks.com. That's a popular site.
38:26
So come let's to make
38:29
sure that no one typos
38:31
that or drop it in
38:33
the chat. So if we
38:36
go there, you're going to
38:38
see a very insightful message
38:40
on the website. If I
38:42
go into my security settings
38:45
on Cloudflare, and I
38:47
go into settings, security settings,
38:49
where's the leaked credential check?
38:52
Is it there? Is
38:56
it somewhere else? Switch between dashboards.
38:58
If you want to try the
39:00
new dashboard later on, press the
39:03
try new dashboard. Let's try the
39:05
new dashboard. Maybe it's there. This
39:07
feels like the azure portal all
39:09
over again. Oh, all settings. Oh,
39:11
that's handy. That's actually better. Where
39:13
now where is that setting? Let
39:16
me pick another cloud flare property.
39:18
I definitely know it's not turned
39:20
on. Which is... Troyhunt.com which is
39:22
also here now I have no
39:24
authentication going on to Troyhunt.com anyway
39:26
so it would sort of be
39:29
pointless to have that over there.
39:31
Let's have a look. Leaked credential
39:33
detection is off. I wonder if
39:35
it's because Troyhunt.com is on Enterprise
39:37
and Scott Helm sucks is not.
39:39
I don't think there's a lot
39:42
of... point in putting him on
39:44
enterprise there. Okay, that could have
39:46
been my interpretation. Anyway, here's the
39:48
bit where I will give them
39:50
some criticism. Apparently this is on
39:52
by default on the free tier.
39:55
And I had a sudden recollection
39:57
where 1Password does a similar
39:59
thing insofar as 1Password has
40:01
their Watchtower feature. And that plugs
40:03
into Pwned Passwords and is
40:05
able to take all of the
40:08
passwords and using the same anonymity
40:10
model so it doesn't disclose anything
40:12
about the password itself. It can
40:14
check every single one of your
40:16
passwords in your password manager to
40:18
see if it's been exposed before.
40:20
My recollection is that they had
40:23
that on by default to begin
40:25
with, and then probably based on
40:27
community feedback, they went, no, we
40:29
should leave it, even though it's
40:31
not a privacy risk, we should
40:33
turn that off, and then give
40:36
people the ability to turn it
40:38
on, and then we'll explain how
40:40
it works that they want to
40:42
turn it on. And I do
40:44
think that is what cloud flare
40:46
should do, particularly based on the
40:49
feedback we saw. Even though I
40:51
don't agree with it all, I
40:53
think that when you're dealing with
40:55
a privacy-related thing like this, you're
40:57
better off erring on the side of
40:59
caution. But honestly, that and potentially
41:02
not being clear enough in their
41:04
wording to make sure that people
41:06
understand it wasn't a privacy risk
41:08
is the only thing I can
41:10
fault them on. And I think
41:12
that even if they did all
41:15
that, there'd still be a bunch
41:17
of people that are very cranky
41:19
because they're working backwards from cloud
41:21
flare. I'm just
41:23
glad you. Oh yeah, I had
41:26
references for why Scott sucks. That
41:28
was you in the tree, wasn't
41:30
it? Oh dude, I got so
41:33
much more material now. How long
41:35
how did I do this? Oh
41:38
yeah. That was funny. All right.
41:40
Can I watch that later? Oh
41:42
and you're on the wakeboard as
41:45
well. Oh shit, oh that hurt.
41:47
Oh, nice. Okay, what was next?
41:50
Let's be serious here. Two data
41:52
breaches this week and in all
41:54
honesty we have been very
41:57
very focused on a combination of
41:59
getting ready for this trip and
42:01
doing the things I needed to before going
42:03
on this trip, two of which included doing
42:06
my Microsoft Regional Director renewal stuff I
42:08
have to do every two years and my
42:10
Microsoft MVP renewal stuff I have to do
42:12
every one year and I know Scott and
42:14
Stefan have got that probably on their agenda
42:16
too. I just wanted to have all of
42:18
that out because it was due by the end of
42:21
the month. So a lot of time went on that a
42:23
lot of time gone on the Have I Been Pwned UX
42:25
stuff as well, which we'll talk about more in
42:27
just a moment.
42:29
One of those, it's quite an
42:32
odd one, Lexipol. I'll
42:34
explain what's odd in a
42:36
moment. Well, part of the
42:38
reason it's odd, is that it's
42:41
attributed to puppygirl
42:43
hacker polycule. Now,
42:45
here's the thing. That's
42:47
sometimes the way with hackers,
42:50
or I suspect in
42:52
this case, they'd classify
42:54
themselves as hacktivists,
42:56
because
43:00
they have some activist-related
43:03
reasons for why they
43:05
didn't like the company. But
43:07
if I go to lexipol.com,
43:09
we make performance excellence
43:11
the heartbeat of public safety,
43:14
mission critical tools, ensure
43:16
the well-being and effectiveness
43:18
of the public servants
43:21
who safeguard communities.
43:23
So this is a service
43:26
which creates policy manuals, hence
43:28
the poll part. Is that the
43:30
poll part or is police the poll
43:32
part of Lexi poll? I don't know.
43:35
Anyway, they do a bunch of policy
43:37
documents that are everything like law enforcement
43:39
agencies to first responders, to fire crews.
43:42
I believe predominantly in the US, I'm
43:44
not sure if it's exclusively in
43:46
the US, use. Oh, here we
43:48
go. Law enforcement, fire and rescue,
43:50
corrections, EMS, local government. Now they
43:52
create a lot of
43:54
this material. The hacktivist
43:57
angle on this is
43:59
that... Puppy Girl Hacker
44:01
Polycule, their position here on
44:03
this is anonymous group of
44:05
hackers calling us on a
44:07
link more than 8,500,000 files
44:09
from the first responder training
44:11
company Lexipole. Now this goes
44:13
back to last month. So
44:15
why did they do this?
44:17
They had a reason. As
44:20
well as emails, phone numbers,
44:22
hence the Have I Been Pwned
44:24
connection — over 600,000 records in
44:26
there. They then published this
44:28
data publicly. This is very
44:30
very public. The group told
44:32
the Daily Dot in a
44:34
statement, they chose to target
44:36
Lexipol because they are
44:38
not, there are not enough
44:40
hacks against the police. So
44:42
we took matters into our
44:45
own paws. Is that a
44:47
good reason? We hacked them
44:49
because there are not many
44:51
hacks, or not enough hacks
44:53
against them. As I
44:55
recall from memory, it was
44:57
a little bit more... more
44:59
activist related reasons that I
45:01
read somewhere. Where was it?
45:03
Lots of stuff. Yeah, there's
45:05
obviously some contention. Puppygirl
45:07
hacker polycule is not known to
45:10
be formally connected with other
45:12
previous data breaches, but hacker
45:14
groups with fuzzy monikers are
45:16
not in short supply. Last
45:18
summer the famed gay furry
45:20
hackers, SiegedSec, known for
45:22
taking down government websites, and cheekily
45:24
demanding research into IRL
45:26
cat girls. Anyway, yeah, it
45:28
attracts all kinds of this
45:30
industry. So, that's gone into
45:33
Have I Been Pwned. That I
45:35
flagged as sensitive. And the
45:37
reason I flagged as sensitive
45:39
is that there was a
45:41
very significant skew of law
45:43
enforcement in... that data set.
45:45
And in fact I did
45:47
go through a verification process,
45:49
an offline verification process, I
45:51
couldn't find an enumeration vector
45:53
that made it easy to
45:55
see whether or not someone
45:58
had an account. Due
46:00
to the nature of it and the
46:02
fact that it's policing manuals and so
46:04
on, I pinged a bunch of Have I Been
46:06
Pwned subscribers in there and said, look,
46:08
your data's turned up in here, is
46:10
this your data, is it legit? And
46:13
most of the replies I got were
46:15
from police officers in the US.
46:17
So good on them for being
46:19
Have I Been Pwned subscribers as well. I
46:21
flagged it as sensitive because this is
46:23
a group that is at higher risk and
46:25
I guess that's sort of one of
46:27
the... triggers that
46:30
I use for flagging a breach
46:32
as sensitive. So whether you're someone
46:34
in Ashley Madison who you are
46:36
now at high risk or if
46:38
it's someone who has a religious
46:40
affiliation or a political affiliation, if
46:42
it's anywhere in the space of
46:44
sensitive PII, like actual PII,
46:46
not passwords mate, the thing that had
46:48
me a little bit worried about law
46:51
enforcement officers is that they do
46:53
tend to get targeted more as
46:55
a result of their job. Whether
46:57
that's right or wrong in different
46:59
circumstances is a totally different issue.
47:01
I just don't want to have
47:03
Have I Been Pwned be the vector
47:05
by which someone, let's say, takes
47:07
a cop's email address and plugs it
47:10
into Have I Been Pwned and then
47:12
figures out a lot more personal information
47:15
about them. So that's why that one
47:17
is sensitive. Now we'll go on the
47:19
next one in a second. One of the
47:21
comments here. Yep, nothing new. Next
47:24
one here is the gazillionth spyware
47:27
service to have a data
47:29
breach lately: SpyX. Now I
47:31
say gazillionth because even when I
47:33
look at the front page
47:35
of Have I Been Pwned
47:37
at the moment where we
47:39
see the 10 most recent
47:42
data breaches four of them
47:44
are spyware companies. Working
47:47
in chronological
47:49
order: Cocospy, Spyic,
47:51
Spyzie, and
47:53
now SpyX. Now interestingly, SpyX —
47:55
is that large? Oh,
47:58
actually, that's just a little bit
48:00
larger than Cocospy. Cocospy is
48:02
1.8 million unique accounts.
48:04
SpyX was just under two. Now,
48:08
I worked with Zack Whittaker on
48:10
this, a journalist at TechCrunch. Zack has
48:12
covered a lot of spyware breaches
48:14
in the past. Zach's preferred term
48:16
to use is stalkerware, because inevitably
48:18
there is a bunch of it,
48:20
which is used for stalking people.
48:23
And again, like the challenge we got
48:25
here is that the value proposition
48:27
that is put forward for this class
48:29
of software is that it can
48:31
be used to say monitor your children,
48:33
because who doesn't want to monitor
48:35
their children? Well,
48:38
me. Okay, but this is
48:40
the way it's put. I'm just
48:42
making sure I remember what
48:44
I'm talking about here. And then
48:46
I'll load up the spy
48:48
X page here. The
48:51
best phone monitoring software for parental control.
48:53
Know everything. Know everything happens. Just know
48:55
this is bad English. Know everything happens
48:57
on your kid's cell phones or tablet
48:59
without installing an app, set up monitoring
49:01
in just two minutes. Now, how do you
49:03
do this without installing an app? Because
49:06
this is where things get kind of
49:08
interesting. One of
49:10
the ways that spyware can spy
49:12
is that you just give it
49:14
the iCloud credentials and it connects
49:16
directly to the iCloud. And then
49:18
it grabs everything that gets synced
49:20
to iCloud. I mean, what could
49:22
go wrong with that? The
49:26
data breach has a large number
49:28
of iCloud credentials in plain text.
49:30
So this is your Apple account,
49:32
Apple email, Apple password. And I
49:34
know it's correct because I pinged
49:36
a bunch of have I been
49:38
punished as groves and this one
49:41
I was like, Hey, I just
49:43
want to make sure this is
49:45
what it looks like. And they're
49:47
like, yep, yep, that is what
49:49
it looks like. Now, curiously, a
49:51
bunch of people have pinged on
49:53
this for like, yeah, we signed
49:55
up, we tried to use it,
49:57
it didn't work. So we no
49:59
longer use it. The people that
50:01
responded now, maybe they wouldn't respond
50:04
if they are of the more stalkerish kind. But the people that
50:06
responded predominantly said they were setting this up to monitor
50:08
their kids. That's in the legal
50:10
bucket. We'll get to the illegal
50:12
bucket in the moment. A reliable
50:14
parental control app, apparently
50:17
not, lets you monitor over 40
50:19
essential activities on a child's phone,
50:21
ensuring their safety at all times.
50:23
Now, I mean, like look at what's
50:25
being tracked here. Calls and SMS.
50:28
With SpyX, know exactly who your
50:30
child is messaging and calling as
50:32
well as the contact list and
50:34
deleted message history in the target's
50:37
phone. We pivoted very quickly from
50:39
child there didn't we? Child just
50:41
became target. Social media apps
50:43
monitor your child's social
50:46
media activities including Facebook,
50:48
Instagram, Twitter, Snapchat, TikTok,
50:50
WhatsApp — every keystroke
50:53
and tap to ensure your child's
50:55
safety. We're back to child, we've pivoted
50:57
from target, back to child. Photos
50:59
and videos, keep track of media stored
51:01
in the cloud and shared through
51:03
social apps. You can easily view their
51:06
phone photos and videos remotely. Now
51:08
how does this magic work? Well, it
51:10
puts it in their cloud where it then ends
51:12
up in a data breach. Now this data
51:14
breach did not have that information.
51:16
I didn't see photos and other
51:18
things like that. We have seen
51:20
other data breaches of other spyware
51:22
slash stalkerware apps that have had
51:25
a lot of very personal
51:27
things and let's just say that
51:29
I have seen these with my
51:31
own eyes and this is the risk
51:33
that anyone takes on board when
51:36
they use an app of this
51:38
kind so yeah look it's just
51:40
it's just a super shitty
51:42
business model we know
51:44
empirically that it is
51:46
regularly used this class
51:49
of application of which I assume
51:51
Spy-X is included in this. These
51:53
sorts of things are regularly used
51:55
for non-consensual tracking of other parties,
51:58
particularly in domestic violence. of scenarios.
52:00
And one of the reasons we
52:02
know that is that when mSpy
52:05
had their data breach it was
52:07
their Zendesk logs that got leaked
52:09
and in the Zendesk logs was
52:12
all of this stuff around people
52:14
trying to install the software not
52:16
just on someone's device so they
52:19
can't see it, oh maybe it's
52:21
a child, but on my wife's
52:23
device or on my husband's device.
52:26
Multiple references to I think my
52:28
partner is having an affair, how
52:30
can I track them? without them
52:33
knowing. It's just a shitty, shitty
52:35
business model. And it's legal, too.
52:38
This is what drives me nuts.
52:40
Oh, this is fun. So there's
52:42
a comparison further down. Why SpyX
52:45
is the only one to
52:47
try. Now, SpyX. This is
52:49
like Stalkerware Bingo. SpyX is
52:52
being compared here to Spyic.
52:54
One of the last 10 data
52:56
breaches in Have I Been Pwned. They're
52:59
also being compared to mSpy, a
53:01
recent data breach in Have I
53:03
Been Pwned, not just once, but
53:06
I think they're in there either two
53:08
or three times. The third comparison
53:10
here, SpyX versus Kids Guard Pro.
53:13
I will make some space in
53:15
Have I Been Pwned for Kids
53:17
Guard Pro. They're not there yet.
53:20
Who else? XNSPY. Make space
53:22
for them. uMobix. How many
53:24
are there? Auto forward. And then
53:27
we're back to Spyic. Didn't we
53:29
do Spyic already? Number one or
53:31
Spyic. It's almost like these companies
53:34
misrepresent things. Yeah, look, you go
53:36
down here. Helping protect teenage kids
53:38
from bullying. Wonder if your daughter
53:41
is in a healthy relationship. Now
53:43
it's getting weird. Concerned about the
53:45
mental health of elders. Also weird.
53:48
Are you worried about online fraud
53:50
targeting? Oh mate, I'm worried about
53:52
you targeting elders. Geez. Anyway, so
53:55
they're in there. Right. Where are
53:57
we other comments here? Someone says
53:59
spy being the name of a
54:02
kids' monitoring app upsets me just a
54:04
little tiny bit. But this is
54:06
the thing, it's like they're constantly
54:09
using the word spy. I mean
54:11
monitoring is a different story and
54:13
I do wonder how conscious the
54:16
decision is to use the word
54:18
spy. Do they get more hits
54:20
from people saying how do I
54:23
spy on my partner? As opposed
54:25
to how do I monitor my
54:27
child? It just super sucks. I
54:31
missed a bunch of comments
54:33
here. Scott says, I love
54:35
when Zack quotes people with
54:37
snark. Quote, we did not
54:39
get breached, said the company's
54:41
representatives, citing no evidence.
54:43
But that's not on SpyX.
54:46
Well, where are we
54:48
up to? SpyX, is
54:50
it? Because he didn't get
54:52
a response from the company,
54:54
which was part of the
54:56
problem. Is that here? "We
54:58
did"... No, so where did
55:00
you pull that one from?
55:02
Or maybe that's just a
55:04
general thing, this is Zack's style.
55:06
Milford says, in all fairness,
55:08
that's like Trump saying, we
55:10
did not invade Zimbabwe, citing
55:12
no evidence. I think it's,
55:14
yeah, so Scott's more like
55:16
a general quote. I feel
55:18
like that's a more... general
55:20
thing, particularly in some parts
55:22
of the world. Where, to
55:24
bridge the gap between Scott
55:26
and James here, it's a
55:28
company saying we did not
55:30
get breached, citing no evidence;
55:32
meanwhile we're saying you did
55:34
get breached, here's the evidence,
55:36
so yeah, maybe no explanation
55:38
for the evidence is more
55:40
what it's about. James says:
55:42
Troy, law enforcement have
55:44
to watch their backs a
55:46
bit more, I recommend
55:48
using something like DeleteMe,
55:50
at least it makes it more
55:52
difficult. I've had discussions, Scott,
55:54
we can have this chat
55:56
offline, but I've had discussions
55:58
with law enforcement before, in
56:00
a very well-known agency, where someone
56:03
did get targeted as a result
56:05
of the work they're doing.
56:07
I was sitting there having a
56:09
chat to the guy, I think it was like
56:11
over dinner or something, and like it
56:13
was genuinely stressful for
56:15
them, because these are people
56:17
who put their pants on one leg at
56:20
a time and go to work and try
56:22
and do a job to make the world
56:24
a better place, and then they
56:26
get targeted by anonymous adversaries, you
56:28
know, when they're just trying to do
56:31
their job. So yeah, it's super
56:33
super shitty for those guys Milford
56:35
no they don't they have the law
56:37
on their side in the event of
56:39
turned into a gangster star war they
56:42
have the edge and for far more
56:44
group lawyer yeah not sure exactly
56:46
what that angle is what else
56:48
is here Kim says spoke bad yep
56:50
How do parents manage kids'
56:53
devices and online activities? Using
56:55
the native parental controls.
56:58
Also, I have a blog post about
57:00
this. There are native controls
57:02
in all of the popular operating
57:04
systems. On iThings, there
57:06
are family controls that will not
57:08
show you their photos and their
57:11
text messages, but will let you
57:13
set screen times for the amount
57:15
of time that can be spent, or
57:17
what time of day they can pick
57:19
it up. Are they allowed to have
57:22
conversations with new people? A whole bunch
57:24
of other things. Dropping a link into
57:26
the chat here about my blog post
57:28
from 2020 on sharing, BYOD and kids online:
57:30
10 digital tips for modern day parents.
57:33
And one of the big things on there,
57:35
if this is a genuine question here,
57:37
is actually like being around your kids.
57:39
When they use devices, and I know by
57:41
virtue of them being mobile, you're not going
57:43
to be there a lot of the time,
57:46
but it is guiding them in the direction
57:48
of what is healthy device use. And
57:50
that's certainly a big thing for Charlotte and
57:53
I with our kids at 12 and 15
57:55
as well. Anything more?
57:57
Are we going on to...
58:00
Milford says, why are parents managing
58:02
their kids' devices? Would you really
58:04
want to be tied to a
58:06
tree like a dog back in
58:09
the 60s? I'd suggest that's going
58:11
a little bit further than what
58:13
was meant by that. What I
58:15
will say is in the various
58:17
reporting I've done on spyware incidents,
58:19
there have been a few occasions
58:21
where I could justify parents having
58:23
this sort of thing on their
58:25
device, recognizing that every one of
58:27
these decisions is like a risk
58:29
and reward thing. Massive risks, I
58:31
think it's a bad idea in
58:33
about 99% of cases. The remaining
58:35
1% of cases is when people
58:37
have said things like, my kid
58:39
has all sorts of mental issues
58:42
that put them at a very
58:44
high risk or they've got a
58:46
drug addiction or they've got something
58:48
where having visibility into what they
58:50
do on their device might actually
58:52
be a life-saving thing. So I'm
58:54
sympathetic in that 1% of cases.
58:56
But if you cut out 99%
58:58
of the business of these organizations,
59:00
I don't think that exists anymore,
59:02
which I don't think is a
59:04
bad thing, if I'm completely honest.
59:06
So yeah, there's that. Kim's
59:08
been looking into Qustodio and
59:10
Bark in the last month. Yeah,
59:13
I don't know those. He says,
59:15
he or she says, my son
59:17
hacked Microsoft Family Safety on his
59:19
PC when he was 10. Well,
59:21
first of all if he can
59:23
genuinely do that, go to Microsoft
59:25
and get yourself a nice bug
59:27
bounty. I suspect that's "hack" in
59:29
the very liberal sense of the
59:31
word. I'll give you an example
59:33
of how my son hacked around
59:35
the parental controls on iOS. He
59:37
wanted to keep watching YouTube after
59:39
YouTube was on the banned list after
59:41
a certain time of day, and
59:43
he found that if
59:46
you messaged the YouTube link to
59:48
someone, then in the
59:50
Messages app you could watch YouTube,
59:52
because it effectively had an
59:54
in-app player and that wasn't subject
59:56
to the parental controls
59:58
of the dedicated
1:00:00
standalone YouTube app.
1:00:03
Good on him, proud parenting moment. And
1:00:06
then we had a discussion about when you should
1:00:08
be watching YouTube. Milford
1:00:13
says, I'm not suggesting it's personal or
1:00:15
there aren't exceptions. I'm just saying it's
1:00:17
ridiculous at scale. We're securing the planet into
1:00:20
complete immobility and lack of freedom and spontaneity.
1:00:22
Yeah, look, I agree with that. I would
1:00:24
argue that's also a very cultural thing. We
1:00:27
had a discussion here just the other day. I
1:00:30
can't remember who it was with. But, assuming
1:00:32
most of you are probably
1:00:34
in the UK and Australia and places that
1:00:36
are very, very similar, let's
1:00:38
say you went out with
1:00:40
your baby in a pram and
1:00:42
you wanted to have a coffee, you know, you
1:00:44
take the baby into the coffee shop and
1:00:46
you keep a good eye on them in case
1:00:48
someone stole the baby. Now in Norway, and
1:00:50
it's probably the same in Iceland, Stefan can confirm,
1:00:52
because a lot of the Nordics are very
1:00:55
similar, you see
1:00:57
a whole bunch of parents going out to
1:00:59
the coffee shop, they park the pram out the front,
1:01:02
it's minus whatever. This was Norway: you park the
1:01:04
pram out the front, baby's wrapped up, and then
1:01:06
you all go to the coffee shop and you
1:01:08
have a coffee and you come back out and
1:01:10
the baby's still there, which just blows my mind.
1:01:12
But that's normal. Or you go to the playground
1:01:15
and all the parents are not hovering around, waiting
1:01:17
to catch the child when they fall off the
1:01:19
swings. But they're over there having a coffee and
1:01:21
the kids are doing their own thing. Very different thing
1:01:23
there. I feel like I'm much more on that
1:01:25
sort of Nordic scale of stuff. I
1:01:28
saw an interview with Jimmy
1:01:30
Carr recently, who's obviously a comedian
1:01:32
and he's always doing a lot of
1:01:34
funny edgy stuff. But
1:01:37
for some reason, this was a more serious interview and
1:01:39
the interviewer was talking about kids and they said
1:01:41
something to the effect of, how
1:01:43
much freedom
1:01:45
should you give your
1:01:47
children? And his response was as
1:01:49
much as they can handle. And
1:01:51
I think that's a really good response. Just
1:01:54
yesterday, we were having
1:01:56
the... Chris says, my daughter did the
1:01:58
same thing to avoid the block, and I
1:02:00
was a proud father. Ha, proud father. At home, I was talking
1:02:03
to my 12-year-old daughter and said, do you
1:02:05
think you're ready to fly yourself to
1:02:07
Norway on your own? Our son did
1:02:09
this last year at 14. And we
1:02:11
did that after we felt he was
1:02:13
ready to do that and she was
1:02:15
like, no, not really sure, but I
1:02:17
think when we have a need to
1:02:19
do that and we feel she's ready,
1:02:21
which might be at the
1:02:23
end of the year, for argument's sake,
1:02:25
we'll be
1:02:27
like, do it. Like, what's
1:02:29
going to happen? You get lost
1:02:31
in Dubai and you need help in
1:02:33
the airport. You know, you can't accidentally
1:02:35
slip out into the street. You'll be
1:02:38
fine. All right, last thing here, before
1:02:40
we wrap it up, because we're at
1:02:42
our end already. Have I Been Pwned UX
1:02:44
stuff. So, we are forging ahead on
1:02:46
this. We have Ingiberg in Iceland doing
1:02:48
all of the static front end. If
1:02:50
we head on over to preview, I'll
1:02:52
drop this in the chat as well,
1:02:54
preview.haveibeenpwned.com, we will
1:02:56
see the current state of the front
1:02:58
end. There, it is static, it will
1:03:00
not hit data on the back end.
1:03:02
You can put in a string that
1:03:04
matches an email address and click on
1:03:06
check and you'll see the results come
1:03:08
back. We've got the about us that's
1:03:10
in there, the terms of use are
1:03:13
in there, the domain search pages are
1:03:15
in there. We've got a whole bunch
1:03:17
of different paradigms that we have now
1:03:19
built into this new model, which is
1:03:21
really cool. And what we're doing is
1:03:23
we're just getting to the point where
1:03:25
the static site can represent enough of
1:03:27
what needs to actually be integrated into
1:03:29
the live running site that we are,
1:03:31
I think Stefan might have already done
1:03:33
this, but we're branching the code base
1:03:35
of the actual live site. We'll have
1:03:37
a private... deployable website that we can
1:03:39
keep checking. We're making it private because
1:03:41
this is literally integration work and I
1:03:43
don't know how much we're going to
1:03:45
screw it up. We'll keep
1:03:48
preview.haveibeenpwned.com public and
1:03:50
we'll just start baking this in and
1:03:52
the goal, I forget what
1:03:54
date we said, Stefan,
1:03:56
but it's like around
1:03:58
the middle of
1:04:00
May, is to have
1:04:02
all of this
1:04:04
live. And I think we'll be
1:04:06
able to do that. We've got a bit of stuff
1:04:08
coming up because Stefan's got MVP Summit next week. I've got
1:04:10
a month now of travel, including
1:04:13
hanging out in Iceland with you, Stefan. But
1:04:15
we'll keep iterating on this.
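To make the idea concrete, here's a minimal sketch, in TypeScript, of what a static front end like that preview might do: instead of calling the real back end, the check button just returns canned results so the UI flow can be exercised in isolation. The element IDs, mock data and function names here are assumptions for illustration only, not the actual preview.haveibeenpwned.com code.

```typescript
// Minimal sketch of a static preview's "check" flow: no back-end call,
// just canned results so the front end can be exercised in isolation.
// Element IDs, mock data and function names are illustrative assumptions.

interface BreachSummary {
  name: string;        // machine name of the breach
  title: string;       // display title
  breachDate: string;  // ISO date the breach occurred
  pwnCount: number;    // number of accounts in the breach
}

const MOCK_BREACHES: BreachSummary[] = [
  { name: "ExampleBreach", title: "Example Breach", breachDate: "2024-01-01", pwnCount: 123456 },
];

// Loose "looks like an email address" check, mirroring "a string that
// matches an email address" described above.
function looksLikeEmail(value: string): boolean {
  return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(value.trim());
}

// In the live site this would hit the real back end; the static preview
// just resolves with mock data after a short, simulated delay.
async function checkAccount(account: string): Promise<BreachSummary[]> {
  if (!looksLikeEmail(account)) return [];
  await new Promise((resolve) => setTimeout(resolve, 300));
  return MOCK_BREACHES;
}

document.querySelector<HTMLFormElement>("#check-form")?.addEventListener("submit", async (event) => {
  event.preventDefault();
  const input = document.querySelector<HTMLInputElement>("#account-input");
  const results = await checkAccount(input?.value ?? "");
  console.log(results.length > 0 ? "Pwned in:" : "No breaches in the mock data.", results);
});
```

The point of a stub like this is that the static pages stay fully clickable for design review, and the mocked call is the one piece that later gets swapped for the real integration when the branch is merged into the live site.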
1:04:18
And we do have an open source repository
1:04:20
with all of this code. We have an
1:04:22
issues list. We have a discussions list. We're
1:04:24
inviting people to contribute to that. We're
1:04:27
getting a lot of good stuff.
1:04:29
It's just simple things like,
1:04:31
on the about page, when
1:04:34
you're looking at it in mobile
1:04:36
view, the social icons, the spacing
1:04:38
isn't right. Okay,
1:04:40
that's minor. But thank you for finding
1:04:42
that now. And
1:04:44
creating an issue that Ingiberg can now go and
1:04:46
fix. And it saves us from having to
1:04:49
do it later on, potentially when it's
1:04:51
live, and then we're checking things more rigorously.
1:04:54
So go and check that
1:04:56
out. I feel like there
1:04:58
are obviously things missing, but
1:05:00
I feel like what we
1:05:02
really want to do as
1:05:05
part of this is to
1:05:07
make sure that we get
1:05:09
new features, new stuff in there.
1:05:11
So I want to see stuff like
1:05:13
more visualizations of data breaches. We've got
1:05:15
a dedicated breach page where for
1:05:17
every individual breach, we end up having
1:05:19
a lot more information, we're going
1:05:21
to have some recommended actions. I think
1:05:23
we're going to have some other
1:05:25
partners on that page in the right
1:05:27
context where they make sense. That's
1:05:31
the sort of stuff I want to get like more
1:05:33
useful stuff for people. So
1:05:35
please, if you have ideas to that effect, go and
1:05:37
drop them in the issues or discussions
1:05:39
log. Last bits here,
1:05:41
Scott, I've seen your messages, I'll give
1:05:43
you a call after this. Wayne
1:05:47
says, love the animated logo when adding a
1:05:49
domain. Yeah, me too. And I don't
1:05:51
know why I love this so much, but
1:05:54
if you go to the domain search
1:05:56
page and click on verify new domain and
1:05:58
you put in, you know, like example.com
1:06:00
and then you click on Verify
1:06:02
Now, that new logo, just
1:06:04
the icon part, animates.
1:06:06
How's he done that? Like it's
1:06:08
really cool. This is what I've
1:06:11
loved about the work that Ingiberg
1:06:14
is doing, where he's taking
1:06:16
initiative to go and do,
1:06:18
often just little stuff like
1:06:20
that, but I see it
1:06:22
and it's like this just
1:06:24
immediately clicks with me and
1:06:26
I'm really, really stoked about
1:06:29
that. That's the kind of stuff
1:06:31
I like. It just feels nice.
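As for how an icon animation like the one Wayne mentions might be done, one common approach is to trigger the standard Web Animations API on the logo element when the verify button is clicked. This is purely a guess at the technique, not necessarily how Ingiberg actually built it, and the selectors and keyframes below are assumptions for illustration.

```typescript
// One possible way to animate a logo icon when "Verify Now" is clicked,
// using the standard Web Animations API. The selectors and keyframes are
// illustrative assumptions, not the actual HIBP preview code.

const verifyButton = document.querySelector<HTMLButtonElement>("#verify-now");
const logoIcon = document.querySelector<SVGElement>(".logo-icon");

verifyButton?.addEventListener("click", () => {
  // Spin the icon while the (mocked) verification is "in progress".
  logoIcon?.animate(
    [
      { transform: "rotate(0deg)" },
      { transform: "rotate(360deg)" },
    ],
    { duration: 1200, iterations: Infinity, easing: "linear" }
  );
});
```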
1:06:33
Fritz says it's a nice spacious
1:06:36
layout and easy on the eyes.
1:06:38
Milford, I'm already breached. Well,
1:06:40
I didn't even know I was
1:06:43
registered on two out of three
1:06:45
of those at the time. Front end
1:06:47
looks pro, badass. Cheers Troy. Thank
1:06:49
you, mate. That's really good.
1:06:52
All right, so, um... Milford says
1:06:54
color dating at, is that a joke?
1:06:56
No, it's a real thing. It's copied
1:06:58
off the one that's in production. Yeah,
1:07:00
anyway. All right folks, I'm gonna wrap
1:07:03
it up there. I will be in
1:07:05
Iceland this time next week. So I'll
1:07:07
do it from somewhere in Iceland. I won't
1:07:09
be in Reykjavik, we're gonna go
1:07:11
and actually see some sights over
1:07:13
the end of the week and
1:07:15
the weekend before we do the
1:07:17
work stuff in Reykjavik after that.
1:07:19
So thanks very much folks, I'll
1:07:21
see you, I'll see you from a
1:07:23
place I've never been before. That'll
1:07:26
be cool. All right. Where'd my
1:07:28
stop button go?