Relational Ethics in the Administration of Healthcare Technology: AI, Automation and Proper Distance
- PMID: 40526627
- PMCID: PMC12173213
- DOI: 10.1111/1467-9566.70055
Abstract
Automation and AI-driven decision support systems are increasingly reshaping healthcare, particularly in diagnostic and clinical management contexts. Although their potential to enhance access, efficiency and personalisation is widely recognised, ethical concerns remain, especially around the shifting dynamics of healthcare relationships. This article proposes a conceptual framework for understanding the relational ethics of healthcare automation, drawing on the work of Levinas and Silverstone to interrogate the ethical implications embedded in regulatory processes. Focusing on the Australian Therapeutic Goods Administration (TGA) database, we analyse clinical decision support system (CDSS) approvals to examine how healthcare relationships are discursively constructed within regulatory documentation. Through close reading of these technical and administrative texts, we investigate how ethical concerns such as patient autonomy, informed consent and trust are acknowledged or elided. Our findings reveal a limited framing of relational dimensions in regulatory discourse, raising important questions about how ethics are operationalised in the oversight of automated systems. By making visible the administrative practices shaping healthcare automation, this study contributes to emerging debates on AI governance and the ethical integration of automation into clinical practice.
© 2025 The Author(s). Sociology of Health & Illness published by John Wiley & Sons Ltd on behalf of Foundation for the Sociology of Health & Illness.
Similar articles
- Ethical implications of AI-driven clinical decision support systems on healthcare resource allocation: a qualitative study of healthcare professionals' perspectives. BMC Med Ethics. 2024 Dec 21;25(1):148. doi: 10.1186/s12910-024-01151-8. PMID: 39707327. Free PMC article.
- AI Through Ethical Lenses: A Discourse Analysis of Guidelines for AI in Healthcare. Sci Eng Ethics. 2024 Jun 4;30(3):24. doi: 10.1007/s11948-024-00486-0. PMID: 38833207. Free PMC article.
- Trust, Trustworthiness, and the Future of Medical AI: Outcomes of an Interdisciplinary Expert Workshop. J Med Internet Res. 2025 Jun 2;27:e71236. doi: 10.2196/71236. PMID: 40455564. Free PMC article.
- Assessing the comparative effects of interventions in COPD: a tutorial on network meta-analysis for clinicians. Respir Res. 2024 Dec 21;25(1):438. doi: 10.1186/s12931-024-03056-x. PMID: 39709425. Free PMC article. Review.
- Applications of Large Language Models in the Field of Suicide Prevention: Scoping Review. J Med Internet Res. 2025 Jan 23;27:e63126. doi: 10.2196/63126. PMID: 39847414. Free PMC article.