On Hallucinations in Artificial Intelligence-Generated Content for Nuclear Medicine Imaging (the DREAM Report)
- PMID: 41198241
- DOI: 10.2967/jnumed.125.270653
Abstract
Artificial intelligence-generated content (AIGC) has shown remarkable performance in nuclear medicine imaging (NMI), offering cost-effective software solutions for tasks such as image enhancement, motion correction, and attenuation correction. However, these advancements carry the risk of hallucinations: outputs that appear realistic yet are factually incorrect. Hallucinations can misrepresent anatomic and functional information, compromising diagnostic accuracy and clinical trust. This paper presents a comprehensive perspective on hallucination-related challenges in AIGC for NMI, introducing the DREAM report, which covers recommendations for definition, representative examples, detection and evaluation metrics, and attributions and mitigation strategies. This position statement paper aims to establish a common understanding for discussion and future research toward enhancing AIGC applications in NMI, thereby supporting their safe and effective deployment in clinical practice.
Keywords: AIGC; NMI; artificial intelligence–generated content; hallucination; nuclear medicine imaging.
© 2025 by the Society of Nuclear Medicine and Molecular Imaging.