Mitigating patient harm risks: A proposal of requirements for AI in healthcare
- PMID: 40446590
- DOI: 10.1016/j.artmed.2025.103168
Abstract
With the rise of Artificial Intelligence (AI), mitigation strategies may be needed to integrate AI-enabled medical software responsibly, ensuring ethical alignment and patient safety. This study examines how to mitigate the key risks identified by the European Parliamentary Research Service (EPRS). To that end, we discuss how complementary risk-mitigation requirements may support the main aspects of AI in healthcare: Reliability (continuous performance evaluation, continuous usability testing, encryption and use of field-tested libraries, semantic interoperability), Transparency (AI passport, explainable AI, data quality assessment, bias check), Traceability (user management, audit trail, review of cases), and Responsibility (regulation check, academic-use-only disclaimer, clinician double check). A survey conducted among 216 medical ICT professionals (medical doctors, ICT staff and complementary profiles) between March and June 2024 revealed that these requirements were perceived positively by all profiles. Respondents deemed explainable AI and data quality assessment essential for transparency; audit trail for traceability; and regulatory compliance and clinician double check for responsibility. Clinicians rated the following requirements as more relevant (p < 0.05) than technicians: continuous performance assessment, usability testing, encryption, AI passport, retrospective case review, and academic use check. Additionally, users found the AI passport more relevant for transparency than decision-makers did (p < 0.05). We trust that this proposal can serve as a starting point for endowing future AI systems in medical practice with requirements that ensure their ethical deployment.
Keywords: AI act; Artificial intelligence; Medical software; Mitigating strategies; Patient harm; Reliability; Responsibility; Risk for patients; Software design requirements; Survey; Traceability; Transparency.
Copyright © 2025 The Authors. Published by Elsevier B.V. All rights reserved.
Conflict of interest statement
Declaration of competing interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
