Review
Br J Radiol. 2023 Oct;96(1150):20230023.
doi: 10.1259/bjr.20230023. Epub 2023 Sep 12.

AI pitfalls and what not to do: mitigating bias in AI


Judy Wawira Gichoya et al. Br J Radiol. 2023 Oct.

Abstract

Various forms of artificial intelligence (AI) applications are being deployed and used in many healthcare systems. As the use of these applications increases, we are learning about the failures of these models and how they can perpetuate bias. With these new lessons, we need to prioritize bias evaluation and mitigation for radiology applications, while not ignoring how changes in the larger enterprise AI deployment may have downstream effects on the performance of AI models. In this paper, we provide an updated review of known pitfalls causing AI bias and discuss strategies for mitigating these biases within the context of AI deployment in the larger healthcare enterprise. We describe these pitfalls by framing them in the larger AI lifecycle, from problem definition through data set selection and curation, model training, and deployment, emphasizing that bias exists across a spectrum and is a sequela of a combination of human and machine factors.
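As a concrete illustration of the bias evaluation the abstract calls for, a minimal sketch (with hypothetical data and group labels) of a subgroup audit follows: comparing a model's false-negative rate across patient groups, since aggregate accuracy can mask underdiagnosis of under-served populations.

```python
# Minimal sketch of a subgroup bias audit (hypothetical data):
# compare false-negative rates across patient groups, since a single
# aggregate metric can hide underdiagnosis of one group.
from collections import defaultdict

def subgroup_fnr(records):
    """records: iterable of (group, y_true, y_pred) with binary labels.
    Returns the false-negative rate per group (missed positives / positives)."""
    positives = defaultdict(int)       # positive cases seen per group
    false_negatives = defaultdict(int) # missed positive cases per group
    for group, y_true, y_pred in records:
        if y_true == 1:
            positives[group] += 1
            if y_pred == 0:
                false_negatives[group] += 1
    return {g: false_negatives[g] / positives[g] for g in positives if positives[g]}

# Hypothetical predictions: group "B" has more missed positive cases,
# a pattern that aggregate accuracy alone would not surface.
data = [
    ("A", 1, 1), ("A", 1, 1), ("A", 1, 0), ("A", 0, 0),
    ("B", 1, 0), ("B", 1, 0), ("B", 1, 1), ("B", 0, 0),
]
rates = subgroup_fnr(data)
print(rates)
```

In practice the group variable, labels, and predictions would come from a held-out clinical test set, and the same disaggregation can be applied to any metric of interest.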


Figures

Figure 1.
Summarizes possible biases at every stage of AI development, illustrating the intersection of human and machine factors in causing bias. AI, artificial intelligence.
Figure 2.
The output of RoentGen, with the left two images representing “pneumothorax with chest tubes” and the right two images “pneumothorax without chest tubes”. Visual inspection shows fragmented chest drains and no obvious pneumothorax in the images without chest tubes.
