Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:01
I'm Dr. Brian Goldman, host of the CBC
0:03
podcast, The Dose. Each week
0:05
we answer vital health questions that will help
0:07
you thrive, like, what does my mental health
0:10
have to do with my gut? How
0:12
can I prevent melanoma? How much sleep
0:14
do I really need? And how can
0:16
I manage my health without a family doctor?
0:19
I chat with the top experts to bring you
0:21
the latest evidence in plain language, all in about
0:23
20 minutes. Join The Dose on
0:25
the CBC Listen app or wherever you get your
0:28
podcasts. This
0:31
is a CBC Podcast. Hi,
0:37
I'm Nora Young. This is Spark. The
0:39
Paris Olympics kick off in a little over
0:42
a month, and while there's always lots of
0:44
new tech at the games, sports tech, broadcasting
0:46
tech, we've got our eyes on security. This
0:49
summer in Paris, officials are planning a suite
0:51
of security tools bolstered by AI, from
0:54
spotting abandoned packages to predicting the
0:57
movement of crowds. As
0:59
AI security rolls out for major
1:01
public events, how do we balance
1:03
safety, security and privacy? And
1:05
how do we guard against mission creep, where surveillance
1:07
for special events becomes the new normal?
1:10
This time on Spark, AI surveillance at
1:12
home and on our borders. In
1:22
2017, France won the bid to host
1:24
the 2024 Summer Olympics coming up next
1:27
month. In planning for
1:29
security at such a major international event,
1:31
the use of new technologies had to
1:33
be balanced with stringent laws governing not
1:36
only France, but the European Union. My
1:39
name is Mehdi Ghasemi. I'm Head
1:42
of Research and Assistant Professor of Media
1:44
and Communication at the
1:46
Lille Institute of Communication, ISTC.
1:50
And he leads a team focusing on
1:52
the impact of digital media, including AI-based
1:54
surveillance. But
10:00
they say that there hasn't been enough public
10:03
discussion about this law and there hasn't been
10:05
enough transparency in terms of how these technologies
10:07
are implemented and what the motivations are and
10:09
so on and so forth. Because
10:13
I guess, I mean, obviously there's the
10:15
great French tradition of rights, individual rights,
10:17
but then on the other hand, people
10:19
legitimately want to be safe. They're hosting
10:21
a huge international event and they don't
10:23
want a major security crisis.
10:25
So how do you balance those two
10:28
things? I mean, I appreciate the concept of mission
10:30
creep, but I also appreciate that this is an
10:32
enormous international event.
10:35
Yeah, absolutely. I think it's
10:37
important to note that the
10:39
Paris Olympics happen against a
10:41
backdrop. There is context behind this.
10:44
So the terrorist attacks that happened in Paris
10:46
in 2015, so that really marked the general
10:48
public and the general perception in terms of
10:50
security, it was the first time that France
10:52
was attacked on this scale in Paris. And
10:55
the second was basically the catastrophe
10:57
that happened during the Champions League
10:59
final. And what came
11:01
out of this was when France had
11:03
won the bid to host the events in Paris,
11:06
everybody was afraid in terms of like,
11:09
now we have the image of France
11:11
scarred and that's about our national security
11:13
in terms of reputational security. So
11:15
we need to make sure that
11:17
the terrorist attacks and what happened
11:19
during the Champions League final never
11:21
happens again, specifically never happens during
11:23
the Olympics. Yeah. Mehdi,
11:26
thanks so much for your insights on this. Thank
11:28
you, Nora. Mehdi Ghasemi is Assistant
11:30
Professor of Media and Communication at
11:32
the ISTC Institute of Communication in
11:34
Lille, France, where his work includes
11:36
researching AI surveillance and algorithmic monitoring.
11:38
We spoke to him from Paris
11:40
on June the 5th. We
11:43
reached out to four AI companies that won
11:45
the bid for the Olympics video surveillance contract,
11:47
but none of them made themselves available for
11:49
an interview.
11:55
that
16:01
was considering the privacy students had vis-a-vis
16:03
a teacher at their school. And
16:06
the court started to add a bit
16:08
of nuance to it, but it was
16:10
still in some ways connected to property.
16:13
Because it was still like while they're in the semi-private
16:15
location of their school, so they can
16:18
have certain expectations. So when
16:20
I read that decision, it doesn't clearly say, well,
16:22
if I was out on the sidewalk or if
16:24
I was in a park, I would get the
16:26
same protection as in that space. But
16:29
we might see some shifts in this in the near future.
16:32
But it seems to me when we're talking about
16:34
what happens in public space, that is one thing
16:36
when you're talking about human beings sifting through
16:38
hours of CCTV footage or
16:40
sitting in a parked car outside
16:42
of my home, and another when you can
16:45
kind of automatically gather images and have
16:47
AI analyze them. So how does
16:50
the ease and scale of image and
16:52
data gathering change things? It
16:54
changes them dramatically. I think courts and other
16:56
parties that might be involved in thinking through
16:59
privacy, protection, and law need to
17:01
be thinking about what should privacy law
17:03
look like today. And it relates
17:06
to something that a number of academics,
17:08
I'm thinking in particular a couple of
17:11
American academics, Woodrow Hartzog and Evan Selinger,
17:13
have referred to as privacy by obscurity,
17:16
which is this idea that for a long time, or
17:20
at least in many different capacities, we
17:22
could expect privacy just by virtue of the fact
17:24
that it is a
17:26
resource drain to have
17:28
a police officer or a number of police officers
17:30
follow somebody for a long period of time. So
17:33
yes, sure, somebody could sit across the street from
17:35
my house and they could watch my house for
17:37
10 days. But
17:40
if you're paying them and you're paying
17:42
them overtime, that's an incredibly resource exhaustive
17:44
task. And so that's going to
17:47
be limited to cases where, you know,
17:49
there's a really strong reason for them
17:51
to be doing that. Now, with all
17:53
kinds of different technological developments, it's
17:56
easy to do that. And there's actually a case from
17:58
the Ontario Court of Appeal. is
20:00
technology does not evolve in a vacuum.
20:02
And I think the developers of
20:04
different systems are cognizant to some degree of
20:06
the legal restrictions on what they can do
20:09
with those technologies. But we're
20:11
really seeing these types of technologies
20:13
that can identify you accurately
20:15
or inaccurately, different types of concerns with
20:17
each of those, and that
20:20
can be used by law enforcement
20:22
agencies or other government agencies, which
20:24
can be highly concerning because of
20:26
the high stakes nature of
20:28
decisions that could be made there, like border uses,
20:30
whether or not you can enter the country or
20:33
stay in the country, law enforcement, whether you might
20:35
be arrested or not. Companies
20:37
can use this. We've seen examples
20:39
of this in Canada, like in
20:41
shopping malls trying to identify, quote,
20:43
unquote, potential shoplifters. There's high risk
20:45
for profiling in that. But
20:48
also individuals. So we're also seeing a
20:50
growth in the development of
20:52
facial recognition systems that are sort of
20:54
marketed to, or at least
20:57
partly marketed to, people to use on other
20:59
people. And the legal regime
21:01
around that is seriously
21:03
lacking. And that can
21:05
be almost as high stakes, or in some cases,
21:07
just as high stakes, if you have somebody who
21:10
is interested for nefarious reasons, like stalking
21:12
or other number of reasons, in identifying
21:15
you and is capable of just downloading
21:17
an app and identifying you, riding the
21:19
bus. They can associate you with all
21:21
of your online activity, like
21:24
your Twitter and your Facebook, et cetera.
21:26
That's a very dangerous situation as well.
21:29
["The You
21:41
Are Listening To Spark"] Any photo
21:43
that exists out there on the internet has probably been scraped
21:45
up. I don't
21:47
think that more cameras equals more
21:50
safety. In fact, I
21:52
think the inverse is true, that
21:54
for particular communities, more cameras often
21:56
mean less safety. It's
21:58
everywhere. So definitely it's... surveillance culture.
22:00
Yeah. I'm Nora Young. Today
22:13
on Spark we're talking about the proliferation of
22:15
facial recognition and other forms of AI surveillance.
22:18
Right now my guest is Kristen
22:20
Thomason who specializes in Canadian law,
22:22
particularly when it comes to public
22:25
space, privacy, and AI and robotics.
22:27
One of the areas she's been researching is
22:29
what happens when images and data are gathered
22:31
not by the state but by
22:33
private individuals. A
22:36
lot of new technologies, surveillance
22:39
and information collection technologies, what I'm
22:41
really noticing as a trend is
22:43
they're being marketed to individuals to
22:45
use on other individuals. A really
22:47
good example of that, but there's
22:49
many, is Amazon Ring. So
22:52
these doorbell cameras that can record everything that
22:54
happens in front of your house. And in
22:56
a lot of instances, depending on the shape
22:58
of the property and everything, they're probably recording
23:00
the sidewalk as well. And
23:02
Amazon has had things like apps that make
23:05
it really easy for people to then upload
23:07
the footage that they film off of their
23:09
Ring camera, share it with their neighbors, engage
23:12
in sort of like an
23:14
informal vigilante type profiling, share
23:17
that footage with law enforcement agencies,
23:19
either voluntarily or when asked by
23:21
law enforcement. And then
23:23
of course, Amazon has, you know,
23:26
this is all being uploaded through Amazon. So
23:28
Amazon has some access to this information. Amazon
23:31
for a period of time was also
23:33
developing a facial recognition tool called Rekognition
23:36
to potentially be used by law enforcement
23:38
or individuals to sift through the information
23:40
collected off these cameras. So
23:42
it's actually a very sophisticated,
23:44
very invasive surveillance network or
23:46
infrastructure that is being developed
23:49
in cities, but through
23:51
private individuals who are far less
23:53
regulated and with a lot less
23:55
legal oversight than, you know, the state
23:57
or a company would have if they did the same thing.
24:00
Yeah, I know there have been cases in the
24:02
states where police departments have been sort of working
24:04
with individuals to
24:06
access that, I think
24:08
it was Ring in particular, that video camera footage.
24:10
Are you aware of any of that happening in
24:12
Canada? I'm aware
24:14
of, well, there was an attempt to
24:17
create a law enforcement Ring
24:19
partnership in Windsor, Ontario. This
24:22
was pre-pandemic. It didn't develop
24:24
at that time, and I haven't seen anything
24:26
since. We do have this federal regulation
24:29
around the commercial collection of information.
25:32
The
24:35
laws are different here than they are in the
24:37
United States in important ways. I'm not sure if
24:39
that factored into the ultimate failure of that partnership,
24:41
but there has been an effort there. Then
24:44
I'm also aware that in Vancouver, there have been
24:46
discussions with Vancouver police
24:48
trying to set up
24:51
sort of strategies. I'm not sure
24:53
the exact term they use, but
24:55
strategies or networks to be
24:57
able to have access to people's doorbell cameras. I
25:01
understand why people buy doorbell cameras. I understand
25:03
why they have them. There's a range of
25:05
reasons, and it offers some
25:07
conveniences. I think we don't
25:09
always see, especially in some of the
25:11
advertising for these types of systems, is
25:14
the way that they're then networked beyond that. Yeah, it
25:16
seems to me there's sort of ... There's
25:18
the collection of the footage, and then there's this question
25:20
of what happens to the footage after the fact. For
25:23
example, even if we say that a business has
25:25
the right to capture and analyze images on
25:27
their property, can they then
25:29
turn around and sell that footage
25:32
to a security company? Or can I
25:34
sell my video doorbell footage or my
25:36
drone footage or whatever? Can
25:39
I? That's a really important
25:41
question. Speaking
25:43
generally, it really brings us back to
25:46
another ... I wouldn't say weakness
25:49
per se, it's a challenge in
25:51
our legal structure around privacy, a
25:54
challenge that is getting increasingly problematic because
25:56
of the kinds of technologies that we're
25:58
seeing, which is that a
26:01
lot of privacy legal protection
26:03
is premised around an idea of consent,
26:06
which makes sense, because sometimes we want to
26:08
consent to being filmed, or we want to
26:10
consent to sharing our information because we get
26:12
something out of it, or we see
26:15
some benefit to it. Maybe we see
26:17
some collective benefit to sharing. I'm
26:20
thinking of even medical studies and things
26:22
that are really premised on consent and
26:24
we're benefiting other people by participating. Many
26:29
of our privacy laws are premised around this
26:31
idea of consent. And so that
26:34
idea of could a company collect information and
26:37
then sell it off to another, you'd have
26:40
to read the terms of service
26:42
really closely to see if when
26:44
you signed up to share your
26:46
information, you consented to
26:49
a subsequent use. And
26:51
some of our federal commercial privacy legislation is
26:53
in flux, so there's a bill that's
26:56
under consideration now to update and
26:58
modify some of our privacy laws.
27:00
So some of these things, if
27:03
you want the most up to date and you're listening to
27:05
this, somebody's listening to this podcast later, they might
27:07
want to just check in on what the current
27:10
legal framework is. But this
27:12
idea that individual consent can
27:14
override a privacy expectation or
27:17
interest is becoming increasingly fraught,
27:20
partly because none of us has time to
27:22
read every terms of service. Probably,
27:25
I would venture to guess, many of us don't
27:27
read any of them. I speak anecdotally at least.
27:30
So we're consenting to a lot of things that we're not necessarily
27:32
aware of. And even if we did read them, it's not always
27:34
clear. You might agree that they
27:37
can use your information in other ways and
27:39
maybe don't realize what those ways include.
27:42
That ought to be caught by the law
27:44
to some degree, but I think there's a possibility
27:46
that you're agreeing to something without entirely
27:49
realizing it. So there's
27:51
just the time commitment aspects of
27:53
consent. But increasingly with these kinds
27:55
of machine learning based or artificial
27:57
intelligence based technologies, there's also
27:59
a collective versus individual
28:01
issue. So I could individually consent
28:03
to share my information or I
28:06
might not mind you know that me
28:09
going in and out of my house is caught on
28:11
my ring camera and Amazon has access to that. Like
28:13
I might make that choice but
28:16
there is a way in which that
28:18
information collected all together. So like we're
28:20
each sort of like drops in the
28:22
bucket and when you collect it all
28:24
together into the full bucket of water
28:27
you can analyze it and draw out
28:29
perhaps profitable or insightful or
28:31
revealing insights into
28:34
the kinds of activities that people engage in,
28:36
what they might be interested in buying, where
28:38
they might be going, what their political
28:40
leanings might be, what their network of
28:42
friendships and relationships in the real world
28:44
might look like. And that can be
28:46
as invasive if not potentially more invasive
28:49
as the sort of individual loss of
28:51
personal information. Yeah. I
28:54
mean it seems to me especially in the case of
28:56
artificial intelligence which
28:58
is premised on making correlations in enormous amounts
29:00
of data. So I may not be
29:03
imagining that a system is
29:05
going to correlate what I bought at the
29:07
drugstore with what I did here and what
29:09
I did there and come up with some
29:11
problematic uses that I could not have anticipated
29:14
because I'm not anticipating the uses that AI is
29:16
going to be put to. Absolutely
29:18
and to be honest maybe those companies
29:20
aren't anticipating them yet either. Yeah. Like
29:22
there is an incentive right now to
29:24
collect a lot of information with a
29:28
sort of an expectation probably you know
29:30
a pretty well informed expectation that as
29:33
different techniques evolve and develop we can
29:35
draw out more and more rigorous
29:37
and refined insights from that data.
29:40
And so you know the
29:42
reality with consent too is that at the
29:44
time that I consented if you know three
29:46
years later machine learning techniques have
29:48
developed extensively even the
29:50
company that drafted up the consent
29:53
form wouldn't have known yet what it
29:55
is that I'm agreeing to. And the
29:57
idea that going back and seeking consent
29:59
later criminal
52:00
justice issues like predictive policing, even
52:03
the surveillance of sports stadiums. So
52:05
this is again, it's not just
52:07
about the border or migration or something
52:09
happening over there that we can't relate
52:11
to, but rather because it becomes normalized
52:13
in places like the border or a
52:15
refugee camp, it can then proliferate into
52:17
other parts of public life. And
52:21
just finally, we've talked about how technologies
52:23
are being used against people
52:25
on the move, but to what extent
52:27
can migrants use these technologies themselves, for
52:29
example, just to make TikTok
52:31
videos of their experiences, to use them as
52:33
communication tools, et cetera? Yeah. I
52:36
mean, the majority of my work and the book
52:38
looks at kind of the sharp edges of the
52:40
tech, but there's also so many ways of resisting
52:42
the kind of violent border regimes that
52:44
are happening and also to upskill communities
52:47
and empower communities on the move through
52:49
technology. Maybe that'll be a second book.
52:52
But there are amazing ways that people on
52:54
the move have been using technology to share,
52:56
for example, their experiences with their friends and
52:59
communities on TikTok, or MigrantTok, as it's
53:01
called, using different archival
53:03
methods, for example, for a psychosocial
53:05
support archive, which is a project
53:08
I know about really interesting, using
53:10
chatbots to get information into refugees'
53:12
hands directly. And so definitely
53:14
that's kind of the other side, being
53:17
creative and finding ways of kind of sitting
53:19
in this joyful resistance of what technologies can
53:21
also do and how we can maybe dream
53:23
of a different world that is also being
53:25
led by people on the move who really
53:27
are the ones who are experiencing this and
53:29
should be in the driver's seat when it
53:31
comes to development of technology, too, that can
53:34
actually assist them. Yeah. Petra,
53:36
thanks so much for talking to us. Thank you so
53:38
much, Nora. Petra Molnar is
53:40
a lawyer and anthropologist and the
53:42
author of The Walls Have Eyes,
53:44
Surviving Migration in the Age of
53:46
Artificial Intelligence. You've
54:01
been listening to Spark. The show is
54:03
made by Michelle Parise, Samraweet Yohannes, Megan Carty
54:06
and me, Nora Young. And
54:08
by Mehdi Ghasemi, Kristen Thomason and
54:10
Petra Molnar. I'm
54:14
Nora Young. You can check out back issues of
54:16
Spark, find and follow us wherever you get your
54:18
podcasts. Talk to you soon. Okay,
54:33
Corey, let's get our feet wet. Show us how to
54:35
surf the net. What
54:38
makes this cool is the fact that you
54:41
can point that camera at anything. Cameras can
54:43
be on every corner in the world. I mean,
54:46
and if you can request that data from anywhere
54:48
else in the world over the World Wide Web,
54:50
you're laughing. What
54:53
happens when I can have a look anywhere? For
55:04
more CBC podcasts,
55:06
go to cbc.ca/podcasts.