Automated opioid risk scores: a case for machine learning-induced epistemic injustice in healthcare

Giorgia Pozzi*

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

6 Citations (Scopus)
48 Downloads (Pure)

Abstract

Artificial intelligence (AI) technologies such as machine learning (ML) systems are playing an increasingly relevant role in medicine and healthcare, bringing about novel ethical and epistemological issues that need to be addressed in a timely manner. Even though ethical questions connected to epistemic concerns have been at the center of the debate, it has gone largely unnoticed how epistemic forms of injustice can be ML-induced, specifically in healthcare. I analyze the shortcomings of an ML system currently deployed in the USA to predict patients’ likelihood of opioid addiction and misuse (PDMP algorithmic platforms). Drawing on this analysis, I aim to show that the wrong inflicted on epistemic agents involved in and affected by these systems’ decision-making processes can be captured through the lens of Miranda Fricker’s account of hermeneutical injustice. I further argue that ML-induced hermeneutical injustice is particularly harmful due to what I define as an automated hermeneutical appropriation on the part of the ML system. The latter occurs when the ML system establishes meanings and shared hermeneutical resources without allowing for human oversight, impairing understanding and communication practices among the stakeholders involved in medical decision-making. Furthermore, and crucially, an automated hermeneutical appropriation can be recognized when physicians are strongly limited in their ability to safeguard patients from ML-induced hermeneutical injustice. Overall, my paper expands the analysis of ethical issues raised by ML systems that are epistemic in nature, thus contributing to bridging the gap between these two dimensions in the ongoing debate.
Original language: English
Article number: 3
Journal: Ethics and Information Technology
Volume: 25
Issue number: 1
DOIs
Publication status: Published - 2023

Keywords

  • Automated hermeneutical appropriation
  • Epistemic injustice
  • Epistemology and ethics of ML
  • Hermeneutical injustice
  • Medical ML
  • Opioid risk score
  • PDMP
