Considerations for addressing bias in artificial intelligence for health equity
- PMID: 37700029
- PMCID: PMC10497548
- DOI: 10.1038/s41746-023-00913-9
Abstract
Health equity is a primary goal of healthcare stakeholders: patients and their advocacy groups, clinicians, other providers and their professional societies, bioethicists, payors and value-based care organizations, regulatory agencies, legislators, and creators of artificial intelligence/machine learning (AI/ML)-enabled medical devices. Inequitable access to diagnosis and treatment may be improved by new digital health technologies, especially AI/ML, but these technologies may also exacerbate disparities, depending on how bias is addressed. We propose an expanded Total Product Lifecycle (TPLC) framework for healthcare AI/ML, describing the sources and impacts of undesirable bias in AI/ML systems in each phase, how these can be analyzed using appropriate metrics, and how they can potentially be mitigated. The goal of these "Considerations" is to educate stakeholders on how potential AI/ML bias may impact healthcare outcomes and how to identify and mitigate inequities, and to initiate a discussion among stakeholders on these issues, in order to ensure health equity along the expanded AI/ML TPLC framework and, ultimately, better health outcomes for all.
© 2023. Springer Nature Limited.
Conflict of interest statement
M.D.A. reports the following conflicts of interest: Digital Diagnostics, Inc, Coralville, Iowa: Investor, Director, Consultant; patents and patent applications assigned to the University of Iowa and Digital Diagnostics that are relevant to the subject matter of this manuscript; Chair Healthcare AI Coalition, Washington DC; member, American Academy of Ophthalmology (AAO) AI Committee; member, AI Workgroup Digital Medicine Payment Advisory Group (DMPAG) of the American Medical Association. Z.O. reports the following conflicts of interest: Chief Scientific Officer, Dandelion Health. None of the other authors report conflicts of interest.