Development and validation of an AI algorithm to generate realistic and meaningful counterfactuals for retinal imaging based on diffusion models
- PMID: 40373008
- PMCID: PMC12080772
- DOI: 10.1371/journal.pdig.0000853
Abstract
Counterfactual reasoning is often used by humans in clinical settings. For imaging-based specialties such as ophthalmology, it would be beneficial to have an AI model that can create counterfactual images, illustrating answers to questions like "If the subject had had diabetic retinopathy, how would the fundus image have looked?". Such an AI model could aid in the training of clinicians or in patient education through visuals that answer counterfactual queries. We used large-scale retinal image datasets containing color fundus photography (CFP) and optical coherence tomography (OCT) images to train ordinary and adversarially robust classifiers that distinguish healthy from diseased categories. In addition, we trained an unconditional diffusion model to generate diverse retinal images, including ones with disease lesions. During sampling, we then combined the diffusion model with classifier guidance to achieve realistic and meaningful counterfactual images while maintaining the subject's retinal image structure. We found that our method generated counterfactuals by introducing or removing the necessary disease-related features. We conducted an expert study to validate that the generated counterfactuals are realistic and clinically meaningful. Generated color fundus images were indistinguishable from real images and were shown to contain clinically meaningful lesions. Generated OCT images appeared realistic, but could be identified by experts with above-chance probability. This shows that combining diffusion models with classifier guidance can achieve realistic and meaningful counterfactuals even for high-resolution medical images such as CFP images. Such images could be used for patient education or the training of medical professionals.
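To make the approach described in the abstract concrete, the sketch below illustrates classifier-guided diffusion sampling for counterfactual generation: a source retinal image is partially noised and then denoised while the reverse process is steered by the gradient of a disease classifier. This is a minimal, hypothetical illustration; the object names (`diffusion.q_sample`, `diffusion.p_mean_variance`, the classifier interface) and hyperparameters are assumptions, not the authors' released code.

```python
# Minimal sketch of classifier-guided diffusion sampling for counterfactuals,
# assuming a pretrained unconditional diffusion model and a (robust) disease
# classifier. Names, interfaces, and hyperparameters are illustrative only.
import torch

@torch.enable_grad()
def classifier_gradient(classifier, x_t, t, target_class, scale):
    """Gradient of log p(target_class | x_t) with respect to the noisy image x_t."""
    x_in = x_t.detach().requires_grad_(True)
    logits = classifier(x_in, t)  # classifier evaluated on the noisy image at timestep t
    log_prob = torch.log_softmax(logits, dim=-1)[:, target_class].sum()
    return scale * torch.autograd.grad(log_prob, x_in)[0]

def guided_counterfactual(diffusion, classifier, x_source, target_class,
                          t_start=500, guidance_scale=5.0):
    """Noise the source fundus/OCT image up to t_start, then denoise while
    steering each reverse step toward the target (e.g. disease) class."""
    x_t = diffusion.q_sample(x_source, t_start)        # forward-noise the source image
    for t in reversed(range(t_start)):
        mean, var = diffusion.p_mean_variance(x_t, t)  # unconditional reverse-step statistics
        grad = classifier_gradient(classifier, x_t, t, target_class, guidance_scale)
        mean = mean + var * grad                       # shift the mean along the classifier gradient
        noise = torch.randn_like(x_t) if t > 0 else torch.zeros_like(x_t)
        x_t = mean + var.sqrt() * noise
    return x_t                                         # counterfactual image
```

Starting the reverse process from a partially noised copy of the source image (rather than pure noise) is what preserves the subject's retinal structure while the classifier gradient introduces or removes disease-related features.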
Copyright: © 2025 Ilanchezian et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Conflict of interest statement
The authors have declared that no competing interests exist.
Similar articles
- Combining transfer learning with retinal lesion features for accurate detection of diabetic retinopathy. Front Med (Lausanne). 2022;9:1050436. doi: 10.3389/fmed.2022.1050436. PMID: 36425113. Free PMC article.
- Exploring Healthy Retinal Aging with Deep Learning. Ophthalmol Sci. 2023;3(3):100294. doi: 10.1016/j.xops.2023.100294. PMID: 37113474. Free PMC article.
- Use of artificial intelligence with retinal imaging in screening for diabetes-associated complications: systematic review. EClinicalMedicine. 2025;81:103089. doi: 10.1016/j.eclinm.2025.103089. PMID: 40052065. Free PMC article. Review.
- Counterfactual MRI Generation with Denoising Diffusion Models for Interpretable Alzheimer's Disease Effect Detection. Annu Int Conf IEEE Eng Med Biol Soc. 2024;2024:1-6. doi: 10.1109/EMBC53108.2024.10782737. PMID: 40039528.
- Discriminative, generative artificial intelligence, and foundation models in retina imaging. Taiwan J Ophthalmol. 2024;14(4):473-485. doi: 10.4103/tjo.TJO-D-24-00064. PMID: 39803410. Free PMC article. Review.