Cardiac ultrasound simulation for autonomous ultrasound navigation
- PMID: 39193499
- PMCID: PMC11347295
- DOI: 10.3389/fcvm.2024.1384421
Abstract
Introduction: Ultrasound is a well-established imaging modality for diagnostic and interventional purposes. However, image quality varies with operator skill, as acquiring and interpreting ultrasound images requires extensive training due to imaging artefacts, the range of acquisition parameters and the variability of patient anatomies. Automating the image acquisition task could improve acquisition reproducibility and quality, but training such an algorithm requires large amounts of navigation data, which are not saved in routine examinations.
Methods: We propose a method to generate large numbers of ultrasound images from other modalities and from arbitrary probe positions, so that this pipeline can later be used by learning algorithms for navigation. We present a novel simulation pipeline that uses segmentations from other modalities, an optimized volumetric data representation and GPU-accelerated Monte Carlo path tracing to generate view-dependent and patient-specific ultrasound images.
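To make the simulation idea concrete, the following is a minimal, self-contained CPU sketch of Monte Carlo path tracing along one scanline of a labelled (segmented) volume. The tissue parameters, the scattering model and all names here are illustrative assumptions for exposition, not the authors' GPU implementation.

```python
# Sketch: Monte Carlo estimate of echoes along one scanline of a
# segmented volume. Parameter values are assumed, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-label acoustic properties: impedance in MRayl and
# attenuation in dB/(cm*MHz). Label 0 = blood pool, 1 = myocardium,
# 2 = surrounding soft tissue.
IMPEDANCE = np.array([1.61, 1.70, 1.63])
ATTENUATION = np.array([0.18, 0.52, 0.54])

def simulate_scanline(labels, n_paths=128, step_cm=0.05, freq_mhz=3.0):
    """Average backscattered intensity per depth sample over sampled paths.

    labels: 1-D integer array of tissue labels along the beam axis.
    """
    depth = len(labels)
    echo = np.zeros(depth)
    for _ in range(n_paths):
        energy = 1.0  # transmitted pulse intensity carried by this path
        for z in range(1, depth):
            # Frequency-dependent attenuation over one step (two-way)
            db = 2.0 * ATTENUATION[labels[z]] * freq_mhz * step_cm
            energy *= 10.0 ** (-db / 10.0)
            # Specular reflection at tissue interfaces
            if labels[z] != labels[z - 1]:
                z1, z2 = IMPEDANCE[labels[z - 1]], IMPEDANCE[labels[z]]
                r = ((z2 - z1) / (z2 + z1)) ** 2  # intensity reflection coeff.
                echo[z] += energy * r
                energy *= 1.0 - r
            # Stochastic diffuse scattering: the Monte Carlo source of speckle
            if rng.random() < 0.05:
                echo[z] += energy * rng.exponential(0.01)
            if energy < 1e-6:
                break  # path is exhausted
    return echo / n_paths

# Example: a crude phantom scanline of blood, then myocardium, then tissue
labels = np.concatenate([np.zeros(40), np.ones(20), np.full(60, 2)]).astype(int)
print(simulate_scanline(labels)[:10])
```

A full pipeline would trace many such paths per image in parallel on the GPU and account for beam geometry; this sketch only shows where the Monte Carlo sampling enters.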
Results: We extensively validate the correctness of our pipeline with a phantom experiment, in which the sizes of structures, contrast and speckle-noise properties are assessed. Furthermore, we demonstrate its usability for training neural networks for navigation in an echocardiography view-classification experiment, generating synthetic images from more than 1,000 patients. Networks pre-trained with our simulations achieve significantly superior performance in settings where large real datasets are not available, especially for under-represented classes.
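The pre-training scheme described above can be sketched as follows. This is a minimal illustration, assuming PyTorch, a small stand-in CNN and placeholder tensors for the simulated and (scarce) real data; the backbone, view set and hyperparameters in the paper may differ.

```python
# Sketch: pre-train a view classifier on simulated images, then
# fine-tune on a small real dataset. Architecture and data are assumed.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

N_VIEWS = 5  # placeholder number of echo views

def make_model():
    # Small CNN stand-in for whatever backbone is used in practice
    return nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        nn.Flatten(), nn.Linear(32, N_VIEWS),
    )

def train(model, loader, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

# Random tensors stand in for simulated (large) and real (small) datasets
sim = TensorDataset(torch.randn(256, 1, 64, 64), torch.randint(0, N_VIEWS, (256,)))
real = TensorDataset(torch.randn(32, 1, 64, 64), torch.randint(0, N_VIEWS, (32,)))

model = make_model()
train(model, DataLoader(sim, batch_size=32, shuffle=True), epochs=3, lr=1e-3)  # pre-train on synthetic
train(model, DataLoader(real, batch_size=8, shuffle=True), epochs=3, lr=1e-4)  # fine-tune on real
```

The benefit reported in the abstract comes from the first stage: the simulated images expose the network to views, including under-represented ones, that a small real dataset cannot cover.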
Discussion: The proposed approach allows for fast and accurate patient-specific ultrasound image generation, and its usability for training networks for navigation-related tasks is demonstrated.
Keywords: Monte-Carlo integration; echocardiography; path tracing; simulation; ultrasound.
© 2024 Amadou, Peralta, Dryburgh, Klein, Petkov, Housden, Singh, Liao, Kim, Ghesu, Mansi, Rajani, Young and Rhode.
Conflict of interest statement
PK, KP, VS, Y-HK and FCG are employed by Siemens Healthineers. RL and TM were employed by Siemens Healthineers. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.