Real-time liver motion estimation via deep learning-based angle-agnostic X-ray imaging

Hua-Chieh Shao et al. Med Phys. 2023 Nov;50(11):6649-6662. doi: 10.1002/mp.16691. Epub 2023 Sep 13.

Abstract

Background: Real-time liver imaging is challenged by the short imaging time (within hundreds of milliseconds) required to meet the temporal constraint posed by rapid patient breathing, resulting in extreme under-sampling for the desired 3D imaging. Deep learning (DL)-based real-time imaging/motion estimation techniques are emerging as promising solutions, which can estimate moving 3D liver volumes from a single X-ray projection via solved deformable motion. However, such techniques were mostly developed for a specific, fixed X-ray projection angle, making them impractical for verifying and guiding arc-based radiotherapy with continuous gantry rotation.

Purpose: To enable deformable motion estimation and 3D liver imaging from individual X-ray projections acquired at arbitrary X-ray scan angles, and to further improve the accuracy of single X-ray-driven motion estimation.

Methods: We developed a DL-based method, X360, to estimate the deformable motion of the liver boundary using an X-ray projection acquired at an arbitrary gantry angle (angle-agnostic). X360 incorporated patient-specific prior information from planning 4D-CTs to address the under-sampling issue, and adopted a deformation-driven approach that deforms a prior liver surface mesh into new meshes reflecting real-time motion. The liver mesh motion was solved from motion-related image features encoded in the arbitrary-angle X-ray projection, through a sequential combination of rigid and deformable registration modules. To achieve angle agnosticism, a geometry-informed X-ray feature pooling layer was developed to allow X360 to extract angle-dependent image features for motion estimation. As a liver boundary motion solver, X360 was also combined with previously developed, DL-based optical surface imaging and biomechanical modeling techniques for intra-liver motion estimation and tumor localization.
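
To make the deformation-driven idea concrete, the following minimal sketch (in numpy, with assumed variable names and conventions; not the authors' implementation) applies a solved rigid transform followed by node-wise deformable displacements to a prior liver surface mesh:

```python
# Minimal sketch (assumed conventions, not the authors' code) of the
# deformation-driven mesh update: rigid motion first, then node-wise deformation.
import numpy as np

def apply_rigid(nodes, rotation, translation):
    """Rigidly move prior mesh nodes (N, 3) with a 3x3 rotation and a 3-vector."""
    return nodes @ rotation.T + translation

def apply_deformable(nodes, dvf):
    """Add per-node displacements (N, 3) predicted by the deformable module."""
    return nodes + dvf

def estimate_realtime_mesh(prior_nodes, rotation, translation, dvf):
    """Deform the planning-4D-CT prior mesh into the real-time mesh."""
    coarse_nodes = apply_rigid(prior_nodes, rotation, translation)
    return apply_deformable(coarse_nodes, dvf)
```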

Results: With geometry-aware feature pooling, X360 can solve the liver boundary motion from an arbitrary-angle X-ray projection. Evaluated on a set of 10 liver patient cases, the mean (± s.d.) 95-percentile Hausdorff distance between the solved liver boundary and the "ground-truth" decreased from 10.9 (±4.5) mm (before motion estimation) to 5.5 (±1.9) mm (X360). When X360 was further integrated with surface imaging and biomechanical modeling for liver tumor localization, the mean (± s.d.) center-of-mass localization error of the liver tumors decreased from 9.4 (± 5.1) mm to 2.2 (± 1.7) mm.
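
For reference, a minimal sketch of the 95-percentile Hausdorff distance used above to score the solved liver boundary against the "ground truth"; the symmetric point-set formulation and the use of scipy's cKDTree are assumptions for illustration:

```python
# Sketch of the symmetric 95-percentile Hausdorff distance between two surfaces,
# each represented here by its node coordinates (an assumption for illustration).
import numpy as np
from scipy.spatial import cKDTree

def hd95(nodes_a, nodes_b):
    """95th-percentile Hausdorff distance between two (N, 3) point sets, in mm."""
    d_ab = cKDTree(nodes_b).query(nodes_a)[0]  # each node of A to its nearest node of B
    d_ba = cKDTree(nodes_a).query(nodes_b)[0]  # each node of B to its nearest node of A
    return max(np.percentile(d_ab, 95), np.percentile(d_ba, 95))
```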

Conclusion: X360 can achieve fast and robust liver boundary motion estimation from arbitrary-angle X-ray projections for real-time imaging guidance. Serving as a surface motion solver, X360 can be integrated into a combined framework to achieve accurate, real-time, and marker-less liver tumor localization.

Keywords: X-ray; graph neural network; liver; real-time imaging.

Conflict of interest statement

The authors have no relevant conflicts of interest to disclose.

Figures

Figure 1.
Architecture of the angle-agnostic X360 model. The X360 model predicted liver boundary motion by extracting motion-related image features from a single, arbitrary-angle X-ray projection. X360 consisted of a feature extraction network and a graph neural network for image feature extraction and motion estimation, respectively. In the graph neural network, the liver boundary motion was estimated sequentially by the graph convolutional networks in the rigid and deformable modules, which solved the rigid and deformable motion, respectively, to improve model performance and robustness. The angle awareness of X360 was achieved by geometry-informed perceptual feature pooling layers capable of angle-dependent feature pooling.
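
A rough skeleton of this two-stage design, written in PyTorch purely as an assumption for illustration: the layer sizes are guesses, the pooling here is a global-average placeholder (the geometry-informed version is sketched under Figure 2), and simple MLP heads stand in for the paper's graph convolutional modules.

```python
# Hypothetical PyTorch skeleton of the Figure 1 layout (not the authors' code).
import torch
import torch.nn as nn
import torchvision

class X360Sketch(nn.Module):
    def __init__(self, feat_dim=2048):
        super().__init__()
        resnet = torchvision.models.resnet50(weights=None)
        self.backbone = nn.Sequential(*list(resnet.children())[:-2])  # keep feature maps only
        self.rigid_head = nn.Sequential(nn.Linear(feat_dim, 256), nn.ReLU(), nn.Linear(256, 6))
        self.deform_head = nn.Sequential(nn.Linear(feat_dim + 3, 256), nn.ReLU(), nn.Linear(256, 3))

    def pool_features(self, feat_map, nodes, angle_deg):
        # Placeholder: broadcast globally pooled features to every mesh node.
        pooled = feat_map.mean(dim=(2, 3))                           # (B, C)
        return pooled.unsqueeze(1).expand(-1, nodes.shape[1], -1)    # (B, N, C)

    def forward(self, xray, prior_nodes, angle_deg):
        # xray: (B, 3, H, W) projection (single channel replicated for ResNet-50)
        feat = self.backbone(xray)                                   # (B, C, h, w)
        node_feat = self.pool_features(feat, prior_nodes, angle_deg)
        rigid_params = self.rigid_head(node_feat.mean(dim=1))        # (B, 6) pose parameters
        dvf = self.deform_head(torch.cat([node_feat, prior_nodes], dim=-1))  # (B, N, 3)
        return rigid_params, dvf
```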
Figure 2.
The geometry-informed perceptual feature pooling layer. The layer pools projection angle-dependent image features by projecting the liver surface nodes onto the ResNet-50-extracted feature maps. To facilitate the angle-agnostic inference of liver boundary motion, the same cone-beam geometry as that of the X-ray projection was incorporated into the pooling layer to extract the relevant features at the current projection angle.
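
A minimal sketch of this idea under an assumed, simplified cone-beam geometry (point source on a circular orbit, flat detector; all parameter names and values below are assumptions, not the paper's calibration): each liver surface node is forward-projected onto the detector for the current gantry angle, and the feature maps are sampled bilinearly at the projected positions.

```python
# Sketch of angle-dependent feature pooling under an assumed simplified geometry.
import math
import torch
import torch.nn.functional as F

def project_nodes(nodes, angle_deg, sad=1000.0, sdd=1500.0):
    """Project (N, 3) isocenter-frame nodes (mm) to detector coordinates (mm)."""
    a = math.radians(angle_deg)
    rot = torch.tensor([[math.cos(a), -math.sin(a), 0.0],
                        [math.sin(a),  math.cos(a), 0.0],
                        [0.0,          0.0,         1.0]], dtype=nodes.dtype)
    p = nodes @ rot.T                    # world coordinates rotated into the source frame
    depth = sad - p[:, 1]                # distance from the source along the beam axis
    u = p[:, 0] * sdd / depth            # detector column
    v = p[:, 2] * sdd / depth            # detector row
    return torch.stack([u, v], dim=-1)   # (N, 2)

def pool_node_features(feat_map, det_uv, det_size_mm=400.0):
    """Sample (1, C, H, W) feature maps at projected node positions -> (N, C)."""
    grid = (det_uv / (det_size_mm / 2)).clamp(-1, 1)    # normalize to [-1, 1]
    grid = grid.view(1, 1, -1, 2)                       # layout expected by grid_sample
    sampled = F.grid_sample(feat_map, grid, align_corners=True)  # (1, C, 1, N)
    return sampled[0, :, 0, :].T                        # (N, C) node-wise features
```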
Figure 3.
Synergy of X360 with DL-based surface imaging and liver biomechanical modeling for intra-liver motion tracking and tumor localization. X360 was incorporated into a previously developed framework to assess the accuracy of liver tumor localization. (a) The Surf model first estimated an approximate liver boundary motion, using an optical body surface image of the thoracic-abdominal region. (b) Using the Surf-predicted liver surface mesh as the initial mesh, X360 further fine-tuned the liver boundary motion from an arbitrary-angle X-ray projection. (c) Finally, the Bio model performed DL-based liver biomechanical modeling to solve intra-liver DVFs, using the X360-fine-tuned liver surface DVFs as the boundary condition for tumor localization.
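
An illustrative, hypothetical orchestration of this three-step workflow; every function signature below is an assumed placeholder, not the published models' actual interface.

```python
# Hypothetical glue code for the Surf -> X360 -> Bio chain (placeholder APIs).
def localize_tumor(body_surface_image, xray_projection, gantry_angle_deg,
                   prior_mesh, surf_model, x360_model, bio_model):
    # (a) Coarse liver boundary motion from the optical body surface image
    coarse_mesh = surf_model(body_surface_image, prior_mesh)
    # (b) Fine-tune the boundary motion from the arbitrary-angle X-ray projection
    refined_mesh = x360_model(xray_projection, coarse_mesh, gantry_angle_deg)
    # (c) Biomechanical modeling: intra-liver DVFs solved with the surface DVF
    #     as the boundary condition, from which the tumor center of mass follows
    intra_liver_dvf = bio_model(boundary_dvf=refined_mesh - prior_mesh)
    return intra_liver_dvf
```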
Figure 4.
Liver surface mesh overlay and liver boundary registration errors of two patients at three projection angles. The first column presents the mesh overlay between the prior and the 'ground-truth' liver surface meshes of the same liver motion at the 0° view angle, accompanied by the zoomed-out overlays viewed at the −80° and 80° angles. The prior and the 'ground-truth' meshes are colored red and yellow, respectively. The second and the third columns present the X360-nrr-deformed and the X360-deformed liver surface meshes, color coded by the node-wise surface distance to the 'ground-truth' liver surface. Each row of the second and the third columns presents the meshes solved from an X-ray projection acquired at the corresponding view angle (−80°, 0°, or 80°), accompanied by the zoomed-out meshes viewed from the other two angles.
Figure 5.
Liver surface node projection onto onboard X-ray projections at −80°, 0° and 80° angles for two patients. The first and the second columns of each patient case present the projections of the liver surface nodes prior to and after the X360 motion inference.
Figure 6.
Comparison of the liver boundary registration accuracy between X360-nrr and X360 for 10 patient cases. The first boxplot of each patient presents the registration error before registration, and the second and the third boxplots present the registration errors of X360-nrr and X360, respectively.
Figure 7.
Comparison of liver boundary registration accuracy of angle-specific (X0) and angle-agnostic (X360) models as a function of the projection angle.
