Episode Transcript
0:01
This is a Technikon podcast.
0:06
In our constantly evolving digital landscape,
0:09
it is expected that purveyors of
0:11
technology maintain certain responsibilities.
0:14
After all, these groups have the ability
0:16
to do good or harm or
0:19
influence others to do good or harm.
0:22
This evolution is a valid argument for
0:24
the integration of ethical decision making
0:26
into the strategic management of technology
0:29
projects. I'm Peter
0:31
Balint from Technikon, and I'm happy to bring
0:33
you this special podcast series entitled "Ethics
0:35
and Technology - A Prerequisite
0:38
for European Research." In
0:40
this series, we will look at ethics in the context
0:42
of cybersecurity, personalized health
0:44
care and many more disciplines. We
0:47
will talk about attitudes towards
0:49
ethics and we will examine how ELSA
0:51
is increasingly becoming part of
0:53
the framework in EU-funded projects.
0:57
As we have heard in previous episodes,
0:59
responsible ethics is more than ticking
1:01
boxes on a form, and
1:03
no one knows this better than Gonçalo Cadete. He's
1:06
an internal auditor for ELSA in
1:08
the SPARTA project. He joins
1:11
us today from INOV in Portugal. INOV
1:14
specializes in information and
1:16
communication technologies, and they
1:18
are project partners in SPARTA. Welcome
1:21
and thanks for coming on today, Gonçalo.
1:23
Thank you, Peter.
1:24
You are an internal auditor
1:27
in the SPARTA project, meaning that
1:29
you report on ethical issues. And
1:32
when we spoke offline, you told me
1:34
that there were quite a few project resources
1:36
dedicated to responsible ethics.
1:39
And this tells me that ethics
1:41
cannot be ignored in EU
1:43
projects. And in
1:45
a project like SPARTA, which is largely
1:47
a technology project in the domain
1:50
of cybersecurity, what specific
1:52
ethical issues are being addressed?
1:55
Yes, correct, Peter, so ethical,
1:59
legal and societal aspects in
2:01
SPARTA, which we call ELSA,
2:04
are absolutely fundamental. They
2:06
are indeed embedded
2:09
in 4 work packages out of 14.
2:12
There is one work package, work package 1,
2:15
that regards governance. So that
2:17
means that ELSA is embedded
2:20
in the governance framework. So that's a
2:22
very important thing. So it's not
2:25
just an appendix or something
2:27
that we thought of afterwards...
2:29
it is really embedded. Then
2:32
we have work package 2, which is a specific
2:34
work package to develop
2:37
and ensure ethical, legal
2:39
and societal aspects. We
2:42
also have a task in another work package
2:45
which deals with ensuring
2:49
gender and diversity issues. And
2:51
finally, we have an initial work
2:54
package that concentrates
2:56
on specifying ELSA requirements.
2:59
So ELSA is really big in SPARTA.
3:02
And that's a good thing, because this is a really
3:05
huge project with 44
3:07
partners. And regarding
3:09
ethics, it has been said
3:11
that ethics principles in
3:14
responsible research must be
3:16
embedded in EU projects. And
3:19
looking forward, how can the EU ensure
3:21
that this happens in the future, and
3:23
how can you measure something like this?
3:26
I think SPARTA is maybe a
3:28
success story in this respect because
3:30
we have very specific mechanisms
3:33
to provide assurance.
3:35
So it's not only, you know, making
3:38
sure we have a few assessments, etc.,
3:40
but mechanisms for providing assurance.
3:44
How did SPARTA do that? So first
3:46
of all, we focused on a few issues:
3:49
fundamental rights, that's one big important one,
3:51
plus privacy and ethics
3:54
requirements, gender and diversity,
3:56
and ultimately achieving responsible
3:58
research and innovation. So
4:02
what did SPARTA include
4:04
in the description of work? Yearly
4:07
audits, so that's one
4:09
fairly important mechanism. And
4:11
interestingly enough, the
4:13
audit reports are public. So
4:16
that really shows the commitment
4:18
of the European Commission towards
4:21
transparency. So they can be...
4:23
they are published. We are now
4:26
in the 3rd and final year of SPARTA,
4:28
so anyone can look at the first
4:31
2 yearly reports. They
4:34
can look at the outcomes,
4:36
at the conclusions, and, since we now
4:39
have 2 reports, they can even look at the progress.
4:42
So that's very important. That really
4:45
engages people. And,
4:48
as I mentioned, provides transparency.
4:51
It's not every project that has, you know,
4:53
such sensitive public
4:56
reports. The other thing
4:58
is, besides this yearly
5:01
audit, SPARTA defined that
5:04
there should be a continuous
5:06
monitoring mechanism. So
5:09
SPARTA implemented a helpdesk
5:12
with a very, very simple and accessible
5:14
interface. You just have to send an email
5:17
with the content that you wish:
5:19
a question or... well, even a
5:22
complaint; that was never the case, but
5:24
it could be done. It will
5:27
be recorded in a ticketing system
5:29
and all the issues must be raised
5:32
to an ethics committee. So
5:35
we also have specific organizational
5:38
structures to provide
5:40
assurance and to escalate any
5:42
ethical issues. Finally,
5:46
within the context of that privacy
5:48
helpdesk, we have built
5:50
what we call a privacy frequently
5:53
asked questions document, a privacy
5:55
FAQ. So it's a sort of lessons
5:57
learned and very
5:59
easy to read document on all
6:01
issues concerning ethical, legal and societal
6:04
acceptance topics.
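To make the helpdesk flow concrete, here is a minimal sketch in Python of the intake-and-escalation mechanism described above: every email is recorded as a ticket, and raised issues are queued for the ethics committee. All names here are hypothetical, since the transcript does not describe how SPARTA's actual system is implemented.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Ticket:
    """One helpdesk submission, e.g. an emailed ethics question."""
    sender: str
    subject: str
    body: str
    received: datetime = field(default_factory=datetime.now)
    escalated: bool = False

class PrivacyHelpdesk:
    """Records every incoming message; raised issues go to the ethics committee."""

    def __init__(self) -> None:
        self.tickets: List[Ticket] = []
        self.committee_queue: List[Ticket] = []

    def receive_email(self, sender: str, subject: str, body: str) -> Ticket:
        # Every message is logged in the ticketing system, however small.
        ticket = Ticket(sender, subject, body)
        self.tickets.append(ticket)
        return ticket

    def escalate(self, ticket: Ticket) -> None:
        # All raised issues must be brought before the ethics committee.
        ticket.escalated = True
        self.committee_queue.append(ticket)

# Hypothetical usage: a partner asks how to handle a conflict of interest.
helpdesk = PrivacyHelpdesk()
ticket = helpdesk.receive_email(
    "partner@example.eu",
    "Conflict of interest",
    "How should I manage a potential conflict of interest?",
)
helpdesk.escalate(ticket)
```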
6:07
So you're saying that you actually had
6:09
a help desk sort of environment
6:11
set up specifically for some of these ethical
6:13
issues, is that right?
6:15
That's correct, yes.
6:16
OK, wow, so do you think that will
6:19
be a model for the future?
6:20
Yes, absolutely. Because it provides
6:22
assurance to people that, first
6:25
there's something official in place, so
6:27
you don't just... if you have an ethical
6:29
issue or a question
6:31
regarding some ethical issue, right, you know,
6:33
you want to know how to behave in a certain situation
6:36
or how to manage a conflict of interest.
6:38
You don't just send, you know, an
6:41
email to someone without
6:44
an official process tied to
6:46
it. So you have assurance that there's
6:48
a mechanism in place. It
6:51
may be used, it should be used,
6:53
and issues will get
6:55
escalated in a professional
6:58
way. So I think that's very
7:00
comfortable for everybody. So
7:02
far in SPARTA, we maybe
7:06
luckily didn't have to
7:08
escalate any issues.
7:11
But, you know, all the pieces
7:13
were in place if that was required
7:15
or if that is still required, because we still
7:17
have some time until the end of the project.
7:20
Yeah, that was really excellent planning on
7:22
the ethics side of the SPARTA project,
7:24
yeah. Now, when
7:27
it comes to technology related projects
7:29
such as SPARTA, how
7:31
do we integrate ethical thinking into
7:34
applied research and development?
7:36
It seems like some pragmatic
7:38
approaches would have to be employed.
7:41
Oh, yes, absolutely. So our ultimate
7:43
goal is responsible research
7:45
and innovation, right? Providing
7:48
assurance for the European Commission
7:50
and the people, the Europeans that
7:52
are actually financing this project
7:55
on 2 aspects. So the first
7:57
aspect is how we conduct
8:00
our daily activities, right? So
8:02
there are many partners from many member
8:04
nations that want
8:06
to be treated fairly, right? Not
8:09
only in terms of the distribution
8:12
of work, if it's interesting,
8:14
if it's valuable, if
8:17
we are working in an honest
8:20
way, you know, the outcomes when we publish
8:22
papers, etc. So there's
8:25
a whole lot of... so
8:27
to speak, internal issues in the
8:29
way we work together, in
8:31
the way we are strong in diversity,
8:33
right? So that's working ethics.
8:36
Now, the other aspect is the actual
8:38
outcomes of what we do. You know, things
8:41
like artificial intelligence
8:43
solutions. Are
8:45
they embedding something that
8:47
is against the values of Europeans?
8:50
Are we, for example, in artificial intelligence,
8:53
are we somehow training
8:55
and testing our algorithms with
8:58
unacceptable bias, you know, gender
9:01
bias, racial bias,
9:04
nationalistic bias, whatever, right?
9:06
So we want to ensure
9:09
that we are ethical in
9:11
these 2 aspects.
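As a small illustration of the bias testing mentioned above, the sketch below computes a simple demographic-parity check in Python: the rate of positive predictions per group, where a large gap between groups flags a potential problem. The function and data are hypothetical examples, not SPARTA's actual evaluation method, which the transcript does not detail.

```python
from collections import defaultdict
from typing import Dict, Iterable, Tuple

def positive_rate_by_group(samples: Iterable[Tuple[str, int]]) -> Dict[str, float]:
    """Rate of positive (1) predictions per demographic group.

    `samples` yields (group_label, prediction) pairs with prediction in {0, 1}.
    Under demographic parity, the rates should be roughly equal across groups.
    """
    totals: Dict[str, int] = defaultdict(int)
    positives: Dict[str, int] = defaultdict(int)
    for group, prediction in samples:
        totals[group] += 1
        positives[group] += prediction
    return {group: positives[group] / totals[group] for group in totals}

# Hypothetical predictions from a classifier, split by gender.
preds = [("female", 1), ("female", 0), ("female", 1),
         ("male", 1), ("male", 1), ("male", 1)]
print(positive_rate_by_group(preds))
# -> roughly {'female': 0.67, 'male': 1.0}: a gap worth investigating.
```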
9:13
And that makes complete sense, of course.
9:16
And you speak of outcomes. Let's
9:19
look at this for a moment. In your experience,
9:21
how can bottom-up communications
9:24
in the ethics realm effectively reach
9:26
the top? Especially in
9:29
an environment where if something is not
9:31
in the description of work, it may not
9:33
get done.
9:35
Good point. Actually, one
9:37
of the mechanisms that we implemented
9:39
that we just talked about, the
9:41
privacy help desk, is a very
9:44
specific implementation of a measure
9:46
for bottom-up feedback.
9:49
So I think that's
9:52
an interesting one. It did
9:54
not get a lot of traffic.
9:57
Maybe that's good in a way. But
9:59
it may
10:02
also be a sign that people are not
10:04
used to bottom-up
10:08
approaches. So
10:10
maybe an area of opportunity for the future.
10:13
Let's look at conflict for a moment.
10:16
Is there an area of conflict between
10:18
cybersecurity and ethics? And
10:20
then how do you deal with this kind of issue?
10:24
We did not have big, big
10:26
issues at SPARTA regarding
10:29
areas of conflict, but we
10:31
came across an issue
10:33
that we identified during the audits
10:36
in the first year. It seems to have been solved
10:38
in the second, but I think it could
10:40
be an interesting lesson for the future. And it
10:43
relates to the
10:45
internal side of ethics, so how we work
10:47
together as Europeans coming from different
10:50
nations and different partners.
10:52
And the specific issue is
10:56
small partner versus big
10:58
partner. What does that mean, you know?
11:00
We have all
11:03
sorts of organizations
11:05
working as partners, some
11:08
very powerful, with big
11:11
budgets, big
11:13
lobbying power, right? Which
11:16
is not a bad thing; it's just
11:18
a fact, right? And so it's
11:22
something that does come up, right, the
11:25
relation of power and how that
11:27
affects the perception of fairness.
11:30
So small partner, big partner
11:32
is something to watch for in the future, making
11:35
sure that everyone feels that they're getting
11:37
fair treatment and a fair share in
11:39
the burdens, in the costs, in
11:41
the benefits.
11:43
And I think for a large project like SPARTA
11:46
with 44 partners, this
11:48
idea of equity for all
11:50
partners is an important facet in
11:52
moving the efforts forward in a responsible
11:54
way.
11:55
Absolutely, yes.
11:56
Thanks for giving us a glimpse of how ethics
11:59
looks in an ongoing project. It
12:01
was great to hear about some of your success stories
12:04
as well as challenges.
12:05
Thank you, Peter. Appreciate it. Thanks for
12:07
the invitation.
12:09
Join us next time, as we look at the world of
12:11
software and hardware engineering in
12:13
the EU. We will look at how
12:15
efforts in law enforcement and forensics
12:17
are aided by not only new
12:19
methods and tools, but also by ethics
12:21
principles. See you next
12:24
time.
12:27
This podcast has been brought to you by
12:30
Technikon. The SPARTA
12:32
project has received funding from the European
12:34
Union's Horizon 2020 Research
12:36
and Innovation Program under grant agreement
12:38
number 830892.