Four Responsibility Gaps with Artificial Intelligence: Why they Matter and How to Address them

Filippo Santoni de Sio*, Giulio Mecacci

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

The notion of a “responsibility gap” with artificial intelligence (AI) was originally introduced in the philosophical debate to indicate the concern that “learning automata” may make it more difficult or impossible to attribute moral culpability to persons for untoward events. Building on the literature in moral and legal philosophy and the ethics of technology, the paper proposes a broader and more comprehensive analysis of the responsibility gap. The responsibility gap, it is argued, is not one problem but a set of at least four interconnected problems (gaps in culpability, moral accountability, public accountability, and active responsibility) caused by different sources, some technical, others organisational, legal, ethical, and societal. Responsibility gaps may also arise with non-learning systems. The paper clarifies which aspect of AI may cause which gap in which form of responsibility, and why each of these gaps matters. It offers a critical review of partial and unsatisfactory attempts to address the responsibility gap: those which present it as a new and intractable problem (“fatalism”), those which dismiss it as a false problem (“deflationism”), and those which reduce it to only one of its dimensions or sources and/or present it as a problem that can be solved simply by introducing new technical and/or legal tools (“solutionism”). The paper also outlines a more comprehensive approach to addressing the responsibility gaps with AI in their entirety, based on the idea of designing socio-technical systems for “meaningful human control”, that is, systems aligned with the relevant human reasons and capacities.
Original language: English
Pages (from-to): 1057-1084
Number of pages: 28
Journal: Philosophy and Technology
Volume: 34
Issue number: 4
DOIs
Publication status: Published - 2021

Keywords

  • AI and accountability
  • AI ethics
  • Engineer responsibility
  • Meaningful human control
  • Responsibility gap
