Testimonial injustice in medical machine learning
- PMID: 36635066
- DOI: 10.1136/jme-2022-108630
Abstract
Machine learning (ML) systems play an increasingly relevant role in medicine and healthcare. As their applications move ever closer to patient care and cure in clinical settings, ethical concerns about the responsibility of their use come to the fore. I analyse an aspect of responsible ML use that bears not only an ethical but also a significant epistemic dimension. I focus on ML systems' role in mediating patient-physician relations. I thereby consider how ML systems may silence patients' voices and relativise the credibility of their opinions, which undermines their overall credibility status without valid moral and epistemic justification. More specifically, I argue that withholding credibility due to how ML systems operate can be particularly harmful to patients and, apart from adverse outcomes, qualifies as a form of testimonial injustice. I make my case for testimonial injustice in medical ML by considering ML systems currently used in the USA to predict patients' risk of misusing opioids (automated Prediction Drug Monitoring Programmes, PDMPs for short). I argue that the locus of testimonial injustice in ML-mediated medical encounters is found in the fact that these systems are treated as markers of trustworthiness on which patients' credibility is assessed. I further show how ML-based PDMPs exacerbate and further propagate social inequalities at the expense of vulnerable social groups.
Keywords: Ethics; Medical.
© Author(s) (or their employer(s)) 2023. No commercial re-use. See rights and permissions. Published by BMJ.
Conflict of interest statement
Competing interests: None declared.
Comment in
- 'Can I trust my patient?' Machine Learning support for predicting patient behaviour. J Med Ethics. 2023 Aug;49(8):543-544. doi: 10.1136/jme-2023-109094. PMID: 37188507.
- Ubuntu as a complementary perspective for addressing epistemic (in)justice in medical machine learning. J Med Ethics. 2023 Aug;49(8):545-546. doi: 10.1136/jme-2023-109097. PMID: 37188508.
- PDMP causes more than just testimonial injustice. J Med Ethics. 2023 Aug;49(8):549-550. doi: 10.1136/jme-2023-109112. PMID: 37217278.
- Testimonial injustice in medical machine learning: a perspective from psychiatry. J Med Ethics. 2023 Aug;49(8):541-542. doi: 10.1136/jme-2023-109059. PMID: 37253554.
- Epistemic virtues of harnessing rigorous machine learning systems in ethically sensitive domains. J Med Ethics. 2023 Aug;49(8):547-548. doi: 10.1136/jme-2023-109105. PMID: 37258138.
- Further remarks on testimonial injustice in medical machine learning: a response to commentaries. J Med Ethics. 2023 Aug;49(8):551-552. doi: 10.1136/jme-2023-109302. PMID: 37308279.