Episode Transcript
0:01
This is a Technikon podcast.
0:10
So far in this series, we have learned that
0:12
almost every technology project has
0:14
some ethical considerations attached.
0:17
To avoid unintended consequences, disenfranchisement
0:20
or potentially illegal activities,
0:23
ethics measures must be implemented.
0:25
We learned that it's up to scientists
0:27
and technology experts to go beyond
0:29
requirements to embed ethics
0:32
principles, to not only ask
0:34
the question, could we, but should
0:36
we? Ethics is no longer
0:38
an appendix or an add-on.
0:40
It's now becoming part of the project
0:42
framework. As a result, we are
0:45
seeing things like annual ethics audits
0:47
and continuous monitoring mechanisms.
0:50
Welcome to this special podcast series entitled
0:53
Ethics and Technology - A Prerequisite
0:55
for European Research. I'm
0:57
your host, Peter Balint from Technikon.
1:00
In this episode, we talk about the world
1:02
of software and hardware engineering. While
1:05
this discipline is dominated mainly by coders,
1:08
scientists and engineers, you
1:10
will soon see that an underpinning
1:12
of responsible ethics is key
1:14
to achieving effective results.
1:17
The EU funded EXFILES project is
1:19
an effort to develop new tools, methods
1:22
and training programs for law enforcement officials
1:24
seeking to gain access to encrypted
1:26
telephones that have been taken into police
1:28
custody as evidence. And
1:30
this is a big deal for a few reasons. First,
1:33
almost all mobile phones nowadays are
1:35
encrypted, which gives everyone
1:37
a highly effective way of keeping their data
1:39
under wraps. So it's widespread.
1:42
Secondly, a confiscated phone may contain
1:44
information that could solve crimes
1:47
or even prevent future criminal activity.
1:49
In this scenario, law enforcement agencies are basically
1:52
on their own, but the EXFILES project is
1:54
meeting this challenge by researching data
1:56
extraction methods for encrypted
1:58
devices. Joining
2:00
us today is Marcel Moritz. In
2:02
EXFILES he manages legal, ethical
2:05
and societal issues. He
2:07
speaks with us today from the University of Lille,
2:09
where he is a lecturer in public law. Welcome
2:12
and thanks for coming on today.
2:13
You're welcome.
2:15
We know that EXFILES is a project
2:17
necessitated by encryption. Tell
2:20
us more about this and specifically where
2:23
do technology and ethics overlap in a project
2:25
like this?
2:26
Well, I would say it's about maintaining
2:29
a balance between two things. On
2:31
the one hand, allowing law enforcement agencies
2:34
to do their work properly, conducting
2:36
investigations and fighting organized
2:39
crime and terrorism, and
2:41
on the other hand, safeguarding fundamental
2:44
rights such as the right
2:46
to privacy, confidentiality of
2:48
communications, also to
2:50
avoid mass surveillance. And this
2:52
project raises the general ethical
2:55
question of the judicious
2:57
use of technological progress,
3:00
the use that will be good for us all. There's
3:02
also the question of the transparency
3:04
of evidence gathering techniques. Indeed,
3:08
behind this issue lies
3:10
an important aspect: the right
3:12
to a fair trial, which is
3:14
a fundamental right. This raises
3:16
one major question: to what extent
3:19
should law enforcement agencies
3:21
be transparent regarding their methods?
3:25
If they are overly secretive, they
3:27
may endanger the fundamental rights of
3:29
citizens and in particular
3:31
those of suspects. But if
3:33
law enforcement agencies are too transparent,
3:36
they risk giving criminals the means
3:38
to circumvent the methods. The
3:41
general interest must also be
3:43
taken into account, of course.
3:45
So it would appear that a project like
3:47
EXFILES has to be very
3:49
sensitive about this balance between
3:52
investigative services and
3:54
fundamental rights. This makes
3:56
me wonder in the project itself,
3:59
there are developers, either software or
4:01
hardware developers. Do they
4:03
encounter ethical issues in
4:05
their day to day work?
4:08
Yeah, of course there are ethical conflicts.
4:10
The possibilities offered by technologies
4:13
are numerous, but one should never forget
4:15
that acceptability is a major
4:17
issue. In this regard the
4:20
crucial question in a democracy is
4:22
the possible opacity of decryption
4:24
tools. The debate between
4:27
the protection of individual liberties
4:29
and the effectiveness of surveillance
4:31
measures is in fact a very old one.
4:34
Well, I would say it's at least as old
4:36
as the 2001 attacks and
4:38
the concomitant development of digital
4:40
technologies. And this is why
4:42
it is extremely important and interesting
4:45
that such a project does not only bring
4:47
together researchers in the hard sciences.
4:51
To be more precise, in EXFILES these
4:53
legal, societal and ethical issues
4:56
involve two teams, the University of Lille,
4:59
with regard to fundamental rights and legal
5:01
practical aspects and Royal Holloway and Bedford
5:03
New College with regard to forensic
5:05
research and its ethical dimensions.
5:09
OK, and so we talk about these ethical
5:11
conflicts. Are you able to share
5:13
some practical examples? Let's
5:16
take a look specifically at EXFILES, which
5:19
is, of course, forensic work on decrypting
5:22
mobile phones. Can you point to anything specific
5:24
in that project?
5:26
Yeah, I'm thinking, for example, of
5:28
the issue of zero-day policies
5:30
or the introduction of backdoors,
5:33
which should allow law enforcement agencies
5:36
to exploit hardware or software
5:38
vulnerabilities, whether intentional
5:40
or not, in order to carry out their
5:43
investigations. Unfortunately,
5:45
it's difficult to imagine that these
5:47
vulnerabilities could not be used by the criminals
5:50
themselves and thus undermine
5:52
the security of general public users.
5:55
It is therefore a question of finding
5:57
a balance between the technical
5:59
capabilities of developers and
6:02
an ethical concern in order
6:04
to guarantee the privacy
6:06
and safety of users. Today,
6:09
cases like EncroChat prove
6:12
that decrypting phones raises
6:14
a lot of questions. This technology,
6:17
EncroChat, was used by tens
6:19
of thousands of people who were willing
6:21
to pay a lot of money to use sophisticated
6:23
encryption tools. Most
6:26
of those people, one imagines, were
6:28
criminals. But is monitoring
6:30
these thousands of phones targeted
6:32
surveillance or is it mass surveillance?
6:35
This is a legal issue. Proceedings
6:38
are underway, but it is also an ethical one,
6:41
obviously. Another example
6:43
of an ethical problem that may
6:45
be interesting is the one
6:48
concerning forensic scientists. How
6:50
can we be sure that the evidence
6:53
handled by an expert has
6:55
not been falsified? To what
6:57
extent can the justice system trust
7:00
an expert if it does
7:02
not have all the technical means
7:04
or knowledge to understand the
7:06
evidence he or she presents?
7:09
OK, and you say that in EXFILES there
7:11
are ethical issues coming
7:13
up. Is there a certain way to deal with
7:15
these? Is there a set
7:17
of guidelines that you can use
7:20
within the project to make sure that everyone's on the same
7:22
page and that ethical issues get
7:24
resolved?
7:25
Yeah, in EXFILES ethical
7:28
issues are dealt with in the legal,
7:30
ethical and societal issues
7:33
work package which is work
7:35
package two, and more precisely
7:37
in two specific tasks, which
7:39
will first give rise to an analysis
7:42
of the project's ethical issues
7:45
and then combine the sets of
7:47
constraints previously identified with
7:49
the practical considerations.
7:51
Task 2.2 is entitled Forensic
7:54
Research, Its Ethical Dimension
7:56
and Disclosure Criteria. This
7:59
task aims at offering a
8:02
practical, realistic, ethical
8:04
framework for the assessment
8:06
of forensic research. As an
8:08
example, law enforcement agencies
8:10
might not share their capabilities
8:13
with the general public, as device
8:15
manufacturers could adapt to
8:17
such tools and methodologies, making
8:19
them impractical. One major
8:22
ethical challenge is then to
8:24
decide whether the discovery
8:26
of a weakness in the targeted devices
8:29
should be made public or not.
8:31
Finally, we'll issue legal
8:34
and ethical recommendations to
8:36
law enforcement agencies, to lawmakers
8:39
and all the stakeholders starting
8:42
in the later stage of the project. This task
8:44
will, for instance, determine the regulation
8:47
regarding legal access to data
8:49
(authorized entities, scenarios
8:51
involved and so on) and include
8:53
confidentiality aspects (for
8:55
instance, the amount of data involved, duration
8:58
and so on). Again, the
9:00
lines between targeted
9:03
and mass surveillance are sometimes
9:05
blurred and it's important
9:07
to define the appropriate uses.
9:10
This is at the same time
9:12
the major legal, societal and ethical
9:14
issue, and that's why
9:16
all of this is brought together in
9:18
Task 2.2.
9:20
In your experience, are there any missing
9:22
components or outstanding issues
9:25
when it comes to responsible
9:27
research and innovation in European projects?
9:31
Well, this project is
9:33
for my young team, the second H2020
9:35
project after
9:38
<INAUDIBLE> and for both of
9:40
these projects, I
9:42
must say that the ethical issues have
9:45
always been given special
9:47
attention. So I would say that this
9:49
point is very satisfying.
9:52
However, in my opinion, there is still
9:54
room for further reflection,
9:57
for instance, on the type of society
10:00
we want in the European Union. Well,
10:02
it is a political question, of course,
10:04
but it's a very interesting issue
10:06
for me. For example, as I
10:08
said before, the balance between the
10:11
desire for security and the desire for privacy
10:13
is changing. And it's fragile
10:17
and the issue of decryption
10:19
illustrates this. But there are many
10:22
other technologies that today raise
10:24
the same question. Facial
10:26
recognition, for instance,
10:29
behavioral prediction algorithms
10:31
and so on. A global
10:33
study about the admissible
10:35
limits of these technologies
10:38
in the European Union would
10:40
really make sense to me.
10:42
OK, so this is something that would reach across
10:44
all sectors dealing
10:46
with security, privacy and
10:49
ethical issues in technology.
10:52
Ethics, law, of
10:55
course.
10:56
Yeah. The idea of a global study would certainly
10:58
make sense at this point and perhaps we
11:00
will be fortunate enough to see this soon. Marcel,
11:03
I feel like these discussions are just the tip
11:05
of the iceberg, but raising awareness
11:08
can sometimes be a stepwise process,
11:10
and this means that perhaps we
11:12
can chat again. But until that time,
11:14
I'd like to say thanks for telling us about your
11:17
experience with ethics and technology.
11:19
Thank you. It was my pleasure.
11:21
And to our audience, I say thanks for listening
11:23
to the TECHNIKON series on ethics and technology.
11:27
If there is one point we can make
11:29
thus far, it's that ethics issues
11:31
will always be there. There's
11:33
no changing that. So in
11:36
responsible research, which is of course
11:38
the cornerstone in these kinds of larger
11:40
EU projects, institutions
11:43
must prepare in advance. There must
11:45
be a solid ethics plan.
11:47
Partners must be willing to discuss the
11:49
issues. There is quite
11:52
a premium on risk mitigation in this
11:54
area. That
11:56
said, embedding ethics principles
11:58
into technology projects adds
12:00
a certain dimension which could
12:03
really unify partners while benefiting
12:05
the European society as a whole. And
12:08
that's the ultimate goal, right? See
12:11
you next time as we continue our journey into
12:13
the realm we call ethics and
12:15
technology.
12:18
The EXFILES project has received funding
12:20
from the European Union's Horizon 2020
12:22
Research and Innovation Program under
12:24
grant agreement number 883156.