Ethics Inf Technol. 2022;24(3):39. doi: 10.1007/s10676-022-09658-7. Epub 2022 Aug 31.

Enabling Fairness in Healthcare Through Machine Learning

Thomas Grote et al. Ethics Inf Technol. 2022.

Abstract

The use of machine learning systems for decision-support in healthcare may exacerbate health inequalities. However, recent work suggests that algorithms trained on sufficiently diverse datasets could in principle combat health inequalities. One concern about these algorithms is that their performance for patients in traditionally disadvantaged groups exceeds their performance for patients in traditionally advantaged groups. This renders the algorithmic decisions unfair relative to the standard fairness metrics in machine learning. In this paper, we defend the permissible use of affirmative algorithms; that is, algorithms trained on diverse datasets that perform better for traditionally disadvantaged groups. Whilst such algorithmic decisions may be unfair, the fairness of algorithmic decisions is not the appropriate locus of moral evaluation. What matters is the fairness of final decisions, such as diagnoses, resulting from collaboration between clinicians and algorithms. We argue that affirmative algorithms can permissibly be deployed provided the resultant final decisions are fair.
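The performance disparity the abstract describes can be made concrete with a standard group-fairness check. The sketch below is purely illustrative and not taken from the paper: it assumes binary diagnostic predictions, binary ground-truth labels, and a group indicator (all toy data), and it compares true-positive rates across a traditionally advantaged and a traditionally disadvantaged group, i.e. the equal-opportunity criterion.

```python
# Illustrative sketch (not from the paper): a standard group-fairness metric,
# the true-positive-rate (equal-opportunity) gap, computed for two groups.
# The variable names and toy data below are assumptions for demonstration only.
import numpy as np

def true_positive_rate(y_true, y_pred):
    """Share of actual positives that the model correctly flags."""
    positives = y_true == 1
    if positives.sum() == 0:
        return float("nan")
    return (y_pred[positives] == 1).mean()

# Toy labels and predictions for an "advantaged" (A) and a "disadvantaged" (D) group.
y_true = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 1])
group  = np.array(["A", "A", "A", "A", "A", "D", "D", "D", "D", "D"])

tpr = {g: true_positive_rate(y_true[group == g], y_pred[group == g]) for g in ("A", "D")}
gap = tpr["D"] - tpr["A"]
print(f"TPR advantaged:    {tpr['A']:.2f}")
print(f"TPR disadvantaged: {tpr['D']:.2f}")
print(f"Equal-opportunity gap (D - A): {gap:+.2f}")
# A positive gap means the model performs better for the disadvantaged group,
# the situation the paper calls an "affirmative algorithm". Standard fairness
# metrics flag any nonzero gap as unfair, regardless of its direction.
```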

Keywords: Bias; Decision-making; Fairness; Healthcare; Machine learning.


