Neuroimage. 2021 Jan 1;224:117399.
doi: 10.1016/j.neuroimage.2020.117399. Epub 2020 Sep 21.

Approximating anatomically-guided PET reconstruction in image space using a convolutional neural network

Georg Schramm et al. Neuroimage.

Abstract

In the last two decades, it has been shown that anatomically-guided PET reconstruction can lead to improved bias-noise characteristics in brain PET imaging. However, despite promising results in simulations and first studies, anatomically-guided PET reconstructions are not yet available for routine clinical use, for several reasons. In light of this, we investigate whether the improvements of anatomically-guided PET reconstruction methods can be achieved entirely in the image domain with a convolutional neural network (CNN). An entirely image-based CNN post-reconstruction approach has the advantage that no access to PET raw data is needed and, moreover, that the prediction times of trained CNNs are extremely fast on state-of-the-art GPUs, which will substantially facilitate the evaluation, fine-tuning and application of anatomically-guided PET reconstruction in real-world clinical settings. In this work, we demonstrate that anatomically-guided PET reconstruction using the asymmetric Bowsher prior can be well-approximated by a purely shift-invariant convolutional neural network in image space, allowing the generation of anatomically-guided PET images in almost real-time. We show that applying dedicated data augmentation techniques in the training phase, in which 16 [18F]FDG and 10 [18F]PE2I data sets were used, leads to a CNN that is robust against the PET tracer used, the noise level of the input PET images, and the input MRI contrast. A detailed analysis of our CNN in 36 [18F]FDG, 18 [18F]PE2I, and 7 [18F]FET test data sets demonstrates that the image quality of our trained CNN is very close to that of the target reconstructions in terms of regional mean recovery and regional structural similarity.
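The key claim of the abstract is that the Bowsher-regularized reconstruction can be approximated by a purely shift-invariant convolution of the OSEM PET and structural MRI volumes. The paper's actual network (Fig. 1, trained in TensorFlow) is far more elaborate; the following is only a minimal numpy sketch of that idea, with all function names and kernels hypothetical, illustrating a single two-input convolutional layer acting entirely in image space:

```python
import numpy as np

def conv3d_same(vol, kernel):
    """Zero-padded 'same' 3D convolution (implemented as correlation):
    the shift-invariant building block of an image-space CNN."""
    k = kernel.shape[0]          # assumes a cubic kernel of odd size
    p = k // 2
    padded = np.pad(vol, p)
    out = np.empty(vol.shape)
    for idx in np.ndindex(vol.shape):
        window = padded[tuple(slice(i, i + k) for i in idx)]
        out[idx] = np.sum(window * kernel)
    return out

def toy_bowcnn_layer(osem_pet, t1_mri, pet_kernels, mri_kernels):
    """One hypothetical layer: filter both input volumes, sum the
    feature maps, apply a ReLU. No raw PET data is needed at any point."""
    feat = np.zeros(osem_pet.shape)
    for kp, km in zip(pet_kernels, mri_kernels):
        feat += conv3d_same(osem_pet, kp) + conv3d_same(t1_mri, km)
    return np.maximum(feat, 0.0)
```

Because every operation is a convolution plus a pointwise nonlinearity, the mapping is shift-invariant by construction, which is what makes near real-time prediction on a GPU possible once the kernels are learned.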

Keywords: Image reconstruction; Machine learning; Magnetic resonance imaging; Molecular imaging; Quantification.


Figures

Fig. 1.
Architecture of our convolutional neural network to predict a 3D anatomically-guided PET reconstruction from an input 3D OSEM PET image and a 3D structural MR image. See text for details.
Fig. 2.
Boxplots of regional values for RCmean (top) and SSIMmean (bottom) between BOWCNN and BOW in the [18F]FDG (blue), [18F]PE2I (orange), and [18F]FET (green) test cases.
Fig. 3.
Example [18F]FDG test case acquired on the mMR. (top row) Structural T1-weighted MRI used as prior image in the iterative anatomically-guided PET reconstruction using the Bowsher prior. (2nd row) Standard OSEM PET reconstruction obtained from 20 min of emission data. (3rd row) Reference iterative anatomically-guided PET reconstruction using the Bowsher prior (BOW). (4th row) Prediction of our trained convolutional neural network (BOWCNN) using the OSEM PET image and the structural MRI as input. (5th row) Absolute difference between BOWCNN and BOW. The red arrow indicates the location of the right claustrum between the insula and putamen, where BOW and BOWCNN show more anatomical detail compared to OSEM. The blue arrow shows a region of fringing artifacts in BOW that are less apparent in BOWCNN.
Fig. 4.
Same as Fig. 3 for a [18F]PE2I test case acquired on the SIGNA.
Fig. 5.
Same as Fig. 3 for a [18F]FET test case acquired on the SIGNA. In this case the acquisition time was 25 min.
Fig. 6.
Impact of the noise level in the input OSEM PET image on the image quality of the predicted anatomically-guided PET image (BOWCNN). The case shown here is the same as in Fig. 3. (top row left) Structural T1-weighted MRI used as prior image in the iterative anatomically-guided PET reconstruction using the Bowsher prior. (top row right) Reference iterative anatomically-guided PET reconstruction using the Bowsher prior (BOW). (2nd till 4th row left) OSEM PET reconstructions obtained from 20 min, 3 min, and 1 min of emission data. (2nd till 4th row right) Corresponding predictions of our trained convolutional neural network (BOWCNN) using the respective OSEM PET image and the structural MRI as input. Note that although the noise level of the input OSEM images varies considerably, the noise level and the level of detail in the BOWCNN images are remarkably constant and comparable to the BOW image from the full 20 min of emission data. The red arrows indicate a noise cluster in the 1 min and 3 min OSEM images that leads to a small focus with slightly increased signal in the BOWCNN images predicted from those OSEM images, which is seen in neither the BOWCNN nor the BOW image from the 20 min data.

