This is a preprint.
Real-time CBCT Imaging and Motion Tracking via a Single Arbitrarily-angled X-ray Projection by a Joint Dynamic Reconstruction and Motion Estimation (DREME) Framework
- PMID: 39398221
- PMCID: PMC11469417
Update in: Real-time CBCT imaging and motion tracking via a single arbitrarily-angled x-ray projection by a joint dynamic reconstruction and motion estimation (DREME) framework. Phys Med Biol. 2025 Jan 21;70(2):025026. doi: 10.1088/1361-6560/ada519. PMID: 39746309.
Abstract
Objective: Real-time cone-beam computed tomography (CBCT) provides instantaneous visualization of patient anatomy for image guidance, motion tracking, and online treatment adaptation in radiotherapy. While many real-time imaging and motion tracking methods leverage patient-specific prior information to alleviate under-sampling challenges and meet the temporal constraint (< 500 ms), that prior information can be outdated and introduce biases, compromising imaging and motion tracking accuracy. To address this challenge, we developed a framework (DREME) for real-time CBCT imaging and motion estimation that does not rely on patient-specific prior knowledge.
Approach: DREME incorporates a deep learning-based real-time CBCT imaging and motion estimation method into a dynamic CBCT reconstruction framework. The reconstruction framework reconstructs a dynamic sequence of CBCTs in a data-driven manner from a standard pre-treatment scan, without utilizing patient-specific knowledge. Meanwhile, a convolutional neural network-based motion encoder is jointly trained during the reconstruction to learn motion-related features relevant for real-time motion estimation, based on a single arbitrarily-angled x-ray projection. DREME was tested on digital phantom simulation and real patient studies.
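The abstract describes a motion model in which a motion encoder maps a single projection to motion-related features used for volumetric motion estimation. A common way to realize such a model, and a plausible reading of the approach, is to express each deformation vector field (DVF) as a linear combination of learned basis motion fields weighted by coefficients predicted from the projection. The sketch below illustrates only that combination step; the dimensions, the `estimate_dvf` helper, and the random stand-in basis fields are hypothetical, and the CNN encoder itself is not implemented.

```python
import numpy as np

# Hypothetical coarse grid and basis count, for illustration only.
D, H, W = 8, 8, 8   # CBCT volume dimensions
K = 3               # number of learned motion basis components

rng = np.random.default_rng(0)

# Stand-ins for basis deformation vector fields learned during
# reconstruction: shape (K, 3, D, H, W), 3 displacement components.
basis_dvfs = rng.normal(size=(K, 3, D, H, W))

def estimate_dvf(coefficients, basis):
    """Combine basis motion fields into one DVF: u = sum_k c_k * B_k."""
    return np.tensordot(coefficients, basis, axes=1)  # -> (3, D, H, W)

# At inference time, a CNN motion encoder (not shown) would map a single
# x-ray projection to the K coefficients; here they are fixed values.
coeffs = np.array([0.5, -0.2, 0.1])
dvf = estimate_dvf(coeffs, basis_dvfs)
print(dvf.shape)  # (3, 8, 8, 8)
```

Because inference reduces to one encoder pass plus this weighted sum, per-projection latency on the order of milliseconds is plausible, consistent with the timing reported below.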
Main results: DREME accurately resolved 3D respiration-induced anatomical motion in real time (~1.5 ms inference time per x-ray projection). In the digital phantom study, it achieved an average lung tumor center-of-mass localization error of 1.2±0.9 mm (mean±SD). In the patient study, it achieved a real-time tumor localization accuracy of 1.8±1.6 mm in the projection domain.
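The center-of-mass localization error quoted above is a standard metric: the Euclidean distance, in millimetres, between the centers of mass of the estimated and reference tumor positions. A minimal sketch of that computation, assuming binary tumor masks and an isotropic-per-axis voxel size (the function names and toy masks are illustrative, not from the paper):

```python
import numpy as np

def center_of_mass(mask):
    """Center of mass (in voxel coordinates) of a binary tumor mask."""
    coords = np.argwhere(mask)
    return coords.mean(axis=0)

def com_error_mm(mask_est, mask_ref, voxel_size_mm):
    """Euclidean center-of-mass distance in millimetres."""
    d = (center_of_mass(mask_est) - center_of_mass(mask_ref)) * voxel_size_mm
    return float(np.linalg.norm(d))

# Toy 3D masks: the estimate is shifted by one voxel along the first axis.
ref = np.zeros((10, 10, 10), dtype=bool)
ref[4:6, 4:6, 4:6] = True
est = np.roll(ref, 1, axis=0)
print(com_error_mm(est, ref, np.array([2.0, 2.0, 2.0])))  # 2.0
```

Averaging this per-frame error over all frames and cases yields summary statistics of the mean±SD form reported in the abstract.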
Significance: DREME achieves CBCT imaging and volumetric motion estimation in real time from a single x-ray projection at an arbitrary angle, paving the way for future clinical applications in intra-fractional motion management. In addition, it can be used for dose tracking and treatment assessment when combined with real-time dose calculation.
Keywords: Deep learning; Dynamic CBCT reconstruction; Motion estimation; Motion model; Real-time imaging; X-ray.