Addressing fairness in artificial intelligence for medical imaging
- PMID: 35933408
- PMCID: PMC9357063
- DOI: 10.1038/s41467-022-32186-3
Abstract
A plethora of work has shown that AI systems can be systematically and unfairly biased against certain populations in multiple scenarios. The field of medical imaging, where AI systems are increasingly being adopted, is no exception. Here we discuss the meaning of fairness in this area and comment on the potential sources of biases, as well as the strategies available to mitigate them. Finally, we analyze the current state of the field, identifying strengths and highlighting areas of vacancy, challenges and opportunities that lie ahead.
Conflict of interest statement
The authors declare no competing interests.
