Episode Transcript
0:01
This is a Technikon podcast.
0:09
What happens at the intersection of ethics
0:11
and technology? It's
0:14
now becoming customary to install
0:16
ethics experts in technology-related
0:18
projects everywhere, but to what
0:20
end, and what kind of ethical
0:23
architecture is required to keep pace
0:25
with the exponential growth in technology
0:27
development? I'm Peter Balint
0:29
from Technikon, and these are questions
0:31
we strive to answer in this podcast series
0:33
called Ethics and Technology -
0:36
A Prerequisite for European Research.
0:39
Each episode we look at the role of ethics
0:41
in different technology related sectors
0:43
such as cybersecurity, medical
0:46
technology, software and hardware
0:48
engineering, and many more. Today
0:50
we look at infrastructure and smart mobility.
0:53
Our guest, Hans Graux, is an attorney
0:55
in Brussels working on the mGov4EU project on
0:59
behalf of project partner Timelex.
1:01
Timelex is a niche law firm specializing
1:04
in the legal aspects of information
1:06
technology, which of course includes
1:08
ethics. Welcome Hans and
1:11
thank you for coming on today.
1:12
Thank you for inviting me.
1:14
When we talk about ethics, most discussions
1:17
quickly go to privacy protection
1:20
and data protection, but this is just
1:22
one piece of the pie. There's so
1:24
much more to ethics. Can you broaden
1:27
these perceptions by citing the role
1:29
that ethics plays in the mGov4EU
1:32
project?
1:33
Absolutely. And well, it's a very
1:35
welcome question because indeed there's a tendency
1:37
in European research projects to narrow
1:40
ethics challenges down to data protection, privacy
1:43
protection, and basically the questions of
1:45
which kinds of information about citizens
1:47
get collected and what they are used for. But
1:50
ethics is a lot broader as
1:52
a working topic. Basically, it
1:54
comprises all kinds of potential
1:57
breaches, potential risks, potential infringements
1:59
of fundamental rights. And
2:01
I think mGov4EU is an interesting project
2:04
to demonstrate that. The
2:06
core idea behind mGov4EU is
2:08
basically the whole notion
2:10
of data sovereignty - giving people control
2:13
over their personal data and
2:15
obviously privacy and data protection are
2:17
a big part of that. We want to make sure
2:19
that data still gets dealt with
2:21
safely and securely. But there is
2:23
a broader component to that which isn't purely
2:26
about data protection and privacy
2:28
protection. It's also basically about the
2:30
questions, for instance, of empowerment,
2:32
of non-discrimination, of making
2:34
sure that you're building an information society that's
2:36
accessible pretty much across the board to
2:38
all of our European citizens. Look simply
2:40
at the question, for instance, of the digital
2:43
divide. That's not a data protection issue.
2:45
That's not a privacy issue. But if
2:47
you want to build an
2:49
ecosystem like this project wants to do,
2:51
where people get more control over their
2:53
own information, where they can use their own mobile
2:56
phone to interact more securely
2:58
with public administrations, manage
3:00
their own information, manage their own official
3:03
documentation, evidentiary documents, certificates,
3:05
attestations. If you want to allow people to do
3:07
that, that can be very beneficial.
3:10
But not every citizen will be
3:12
able to do that. There is still a digital divide. Not
3:14
everybody is equally handy with, or at
3:16
home with, mobile devices. So you do need
3:19
to make sure that your project is designed in a
3:21
way that takes that into consideration
3:23
and that you still keep avenues open
3:26
for people who aren't really all that comfortable
3:28
interacting with digital
3:30
technologies and with mobile devices.
3:33
So that's a good example of an
3:35
ethics challenge that mGov4EU needs
3:38
to deal with and where we need to provide guidance
3:41
that actually isn't related to data protection
3:43
or privacy protection at all. But
3:46
that does have a very big ethics
3:48
component.
3:49
Since you broke this wide open, I have to follow
3:51
up with a question. Do we need
3:53
a continuous approach to
3:56
make sure that ethics ideas are being
3:58
adhered to or are
4:00
regular monitoring efforts and general
4:02
rules sufficient?
4:04
I think there's no way to steer
4:07
around the need for continuous assessment.
4:09
And I think European projects are generally quite
4:11
good at that, at least the ones that sort of internalize
4:14
the role of ethics. So there's
4:16
a lazy way to do ethics and there's
4:18
a proper way to do ethics. The lazy
4:20
way is to just look at, you know, what are the ethics questions
4:23
that have been imposed on us as part
4:25
of the project grant and
4:27
what are the boxes that we need to tick? And
4:30
just to say, a couple of weeks before the deadlines
4:32
are due, we'll write up the deliverables, we'll send
4:34
them in and we'll be done with it. That's the
4:36
lazy way to do it. But obviously
4:38
that will not help you monitor continuously
4:42
what lessons you're learning and what problems
4:44
you face and what new issues might come up.
4:46
To do that, you really need a continuous approach where
4:49
all of the partners more or less internalize
4:51
the ethical values of the project and
4:54
have an ethics awareness.
4:56
And that's very challenging because in
4:58
European research projects a part of the strength
5:00
is that you have a broad team, a broad
5:02
range of stakeholders, all of whom have their
5:04
own expertise, but all of them also have the
5:06
tendency to look at every part
5:09
of the project from their own narrow perspective.
5:12
If you're an application
5:14
designer, then you want to build your application
5:16
in a way that it works and functions. But you
5:18
don't want to worry too much about
5:20
the societal impacts and
5:23
what ethical problems you might be creating
5:25
with a specific solution. And yet precisely
5:28
that part is very important if
5:30
you want to make sure that ethics are
5:32
done right. You really need basic guidance
5:36
for the project. You need a short statement of what the
5:38
main ethical principles and constraints of the
5:40
project are. You need everybody to understand
5:42
those and you need a willingness within the team
5:44
to keep an eye out for those and
5:46
to communicate about them. At least flag and
5:48
say, hey, this is what we're doing right now.
5:51
I'm kind of concerned about what the implications
5:53
are going to be. Maybe it's nothing, but maybe we should have
5:55
a brainstorming session or maybe I'd like
5:57
someone else to look at it. And if
6:00
you do it that way, if you sort of create this culture
6:03
of ethically responsible
6:05
development within a project, you can do a
6:07
lot of good, also in finding out sort
6:09
of where the unintended consequences
6:11
of your project might be, which could be beneficial
6:14
or detrimental from
6:16
an ethics perspective, making sure that those
6:18
are analyzed correctly and you have a way to deal with
6:20
those challenges before you bring a final
6:22
product to the market where you have to say, you know, we
6:24
had a good idea, but the implications
6:27
of what we're doing here are pretty serious,
6:29
there are some downsides that are difficult to manage
6:31
which jeopardize the viability
6:34
of our project outcome as a market product.
6:37
So it sounds like the ethics efforts
6:40
within a project continue to grow with
6:42
it throughout the project's lifetime, and
6:45
maybe this feeds into this idea
6:47
of responsible research. Many
6:49
EU projects expect their partners
6:52
to adhere to this idea of responsible
6:54
research. Can you give examples
6:56
of what this means in general and perhaps
6:59
more specifically in the project
7:01
that you're involved in, which is mGov4EU?
7:04
Absolutely. So that's sort of the overarching
7:07
principle for European research
7:09
projects. The Commission
7:11
would like all of its funded projects to practice
7:13
responsible research, responsible innovation,
7:16
and that's actually a very hands-on
7:18
approach towards handling ethics challenges,
7:20
because the core difficulty is that
7:23
things like fundamental rights and ethics,
7:25
they sound good to everybody. The issue is that
7:27
you need to make sure that those values
7:29
are somehow ingrained into your project
7:32
and its outcomes, and into the way you approach them. And
7:35
that's difficult because usually
7:37
there isn't one solution that's
7:40
unambiguously optimal. And as a project,
7:42
you need to make some choices there on
7:44
where you place your priorities, which
7:46
values you choose to emphasize,
7:48
and how you mitigate some potential negative
7:51
impacts on other values. I'll make that
7:53
a little bit more concrete specifically for mGov4EU. So
7:57
that project is essentially about allowing
8:00
people to maintain better
8:02
control over their own personal information,
8:05
making sure that when they want to identify
8:08
themselves in a specific service, they don't reveal
8:10
more information than they need to. It also
8:12
builds on the philosophy that citizens
8:15
are generally capable of managing their
8:17
own personal information
8:20
and that they can keep control over that data
8:22
via their mobile devices. That's
8:24
a very explicit value. That's about citizen
8:27
empowerment. That's about giving people control
8:29
over their own information. That's a very powerful
8:31
idea and a very ethical
8:33
perspective on how to deal with data. But
8:36
there are
8:38
also tensions there with other fundamental values,
8:40
obviously with data protection, privacy protection.
8:43
That's one of the key priorities here. You
8:45
want to make sure that by putting
8:47
all of the data in a mobile
8:49
phone, or at least under the control of a mobile phone,
8:52
you don't accidentally lower
8:54
the bar of security. You want to make sure that
8:56
the applications and the services that you have there
8:58
are adequately protected
9:01
and that they take into account
9:03
the skill level of a reasonable,
9:06
average person, or even someone with
9:08
a lower than average skill level. If
9:10
you are going to put people in control
9:12
over their own data, you need to do it in such
9:14
a way that people don't actually harm themselves.
9:17
That's the data protection issue as well. But
9:20
there's also a bigger justice issue behind
9:22
that. What do you do, for instance, when things go wrong,
9:24
when there's an incident, when somebody steals data from
9:26
your mobile phone because they were tricked
9:28
by a phishing attempt, you know, how
9:32
do you protect people then? So empowering
9:34
citizens, you know, obviously, nobody could possibly
9:36
object to that. But how do you balance
9:39
that with the need for data protection,
9:41
making sure that people cannot get tricked? And
9:43
how do you provide people with adequate protections
9:45
if something goes wrong? What is the degree of control
9:47
you exercise over the whole
9:49
framework and over where data can
9:52
flow? So that's a good example
9:54
of where the application of an ethical framework
9:57
and making explicit what kind of values you have
9:59
will help scope the approach
10:02
to the work and the design of the architecture
10:04
around that.
10:05
So, yeah, I think those are great examples. And it kind
10:07
of makes me wonder now what
10:10
happens when you have two
10:12
different perceptions of what ethics really
10:14
means. A
10:16
prime example is cultural differences,
10:19
which may alter perceptions about ethics.
10:21
Absolutely. At the
10:24
beginning of a project, there should be a discussion
10:26
about what kind of values you endorse within a
10:28
specific project, what kind of outcomes you want to get
10:31
and what kind of fundamental rights you want to
10:33
strengthen. And that is the point where you can
10:35
have an open discussion and say, OK, well, how important
10:38
is, for instance, individual governance?
10:40
How important is governmental supervision
10:42
for us? And what are the kind of lessons that we draw
10:45
out of that in terms of architectural
10:47
requirements? And most European
10:49
projects have that sort of built into the approach
10:52
in the sense that they have a task early
10:54
on where somebody needs to look at
10:56
what are the legal requirements and what are the ethics requirements.
10:59
But it's actually quite rare that a project explicitly
11:01
says, well, OK, let's not do
11:03
that abstractly and just ask, what does
11:06
the GDPR, the General Data Protection Regulation,
11:08
what does the law impose on us as an obligation
11:10
in terms of data protection? But let's actually
11:13
reflect on what are the
11:15
different ethical values that we could be designing our
11:17
solution for, and on the choices
11:19
that we want to make as a consortium, not from the
11:21
perspective that one choice is the right one
11:23
or necessarily optimal for society,
11:26
but at least making it more explicit
11:28
what the choices are that you make as a project.
11:31
That will also serve as your basis for evaluation
11:33
later on, to see whether you did a good job. And
11:35
maybe you'll find halfway through the project that you were
11:38
too flexible or too strict, and that's perfectly
11:40
fine, that just means that you're learning on a difficult
11:42
topic.
11:43
And I think this goes back to what we mentioned
11:46
earlier about this idea of the continuous
11:48
approach or this constant evolving
11:50
of, you know, how ethics fits into
11:52
a project, so...
11:54
Exactly. And it
11:56
also, you know,
11:58
impacts a lot of the more practical aspects
12:01
of the project as well.
12:03
Yes. I'm glad you bring up the practical aspect
12:06
of ethics. This is something we will talk about
12:08
in a future episode, so for
12:10
our listeners, stay tuned. And
12:13
as for today's show, that's all we have time
12:15
for. But it was great to get an insider's view
12:17
about ethics and technology in
12:19
a real world setting. Thanks
12:21
for coming on today.
12:22
You're very welcome. Thank you for the questions. It was
12:24
very interesting.
12:26
And thank you for listening and make sure
12:28
to join us next time as we examine how
12:30
ethics fits into cybersecurity.
12:33
We will look at a huge government-funded project
12:35
which allocates quite a few resources to
12:38
ensuring effective ethics principles
12:40
are embedded in a measurable way.
12:43
You won't want to miss it. See you next
12:45
time.
12:49
The mGov4EU project has received
12:51
funding from the European Union's Horizon
12:53
2020 Research and Innovation Program
12:56
under grant agreement number 959072.