Towards Autonomous Retinal Microsurgery Using RGB-D Images

Ji Woong Kim et al. IEEE Robot Autom Lett. 2024 Apr;9(4):3807-3814. doi: 10.1109/lra.2024.3368192. Epub 2024 Feb 21.

Abstract

Retinal surgery is a challenging procedure requiring precise manipulation of fragile retinal tissue, often at the scale of tens of micrometers. Its difficulty has motivated the development of robotic assistance platforms for precise motion and, more recently, of novel sensors such as microscope-integrated optical coherence tomography (OCT), which provides an RGB-D view of the surgical workspace. The combination of these devices opens new possibilities for robotic automation of tasks such as subretinal injection (SI), a procedure that involves precise needle insertion into the retina for targeted drug delivery. Motivated by this opportunity, we develop a framework for autonomous needle navigation during SI. The surgeon specifies waypoint goals in the microscope and OCT views, and the system autonomously navigates the needle to the desired subretinal space in real time. Our system integrates OCT and microscope images with convolutional neural networks (CNNs) that automatically segment the surgical tool and retinal tissue boundaries, and with model predictive control that generates optimal trajectories respecting kinematic constraints to ensure patient safety. We validate our system by demonstrating 30 successful SI trials on pig eyes. Preliminary comparisons to a human operator in robot-assisted mode highlight the enhanced safety and performance of our system.
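
The abstract does not spell out the controller; the following is a minimal sketch of the model-predictive-control idea, assuming simple integrator kinematics for the needle tip and a speed limit as the only safety constraint. The paper's actual formulation (including remote-center-of-motion constraints) is more involved, and every name and value below is illustrative.

import cvxpy as cp
import numpy as np

def plan_trajectory(tip_pos, goal, horizon=20, dt=0.05, v_max=0.5e-3):
    """Plan needle-tip positions toward `goal` (meters), capping speed at v_max (m/s)."""
    x = cp.Variable((horizon + 1, 3))  # tip positions over the horizon
    u = cp.Variable((horizon, 3))      # tip velocities (control inputs)
    cost = 0
    constraints = [x[0] == tip_pos]
    for t in range(horizon):
        cost += cp.sum_squares(x[t + 1] - goal) + 1e-2 * cp.sum_squares(u[t])
        constraints += [
            x[t + 1] == x[t] + dt * u[t],  # integrator kinematics (an assumption)
            cp.norm(u[t], 2) <= v_max,     # speed cap stands in for the safety constraints
        ]
    cp.Problem(cp.Minimize(cost), constraints).solve()
    return x.value  # executed in receding-horizon fashion

# Example: navigate from the current tip position toward a waypoint 1 mm away.
traj = plan_trajectory(np.zeros(3), np.array([1e-3, 0.0, 0.0]))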

Keywords: Computer Vision for Medical Robotics; Medical Robots and Systems; Vision-Based Navigation.

Figures

Fig. 1. (A) Problem statement: our imaging system provides a simultaneous view of the surgical tool and a B-scan image that dynamically tracks the surgical instrument. (B) Experimental setup.
Fig. 2. High-level workflow: 1) the microscope and iOCT B-scan images are acquired; 2) three CNNs provide needle-tip detections and retinal layer segmentations, and the surgeon provides two waypoint goals via mouse clicks, first in the microscope image and then, before insertion, in the B-scan image; 3) the task and motion are planned, and an optimized trajectory is sent to the robot for trajectory tracking.
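
As a rough illustration of the two-click interface in step 2, the sketch below captures one waypoint per view with OpenCV mouse callbacks; the window names, placeholder frames, and exit key are assumptions, not the paper's actual interface.

import cv2
import numpy as np

clicks = {}

def on_click(view_name):
    def handler(event, x, y, flags, param):
        if event == cv2.EVENT_LBUTTONDOWN:
            clicks[view_name] = (x, y)  # pixel-space waypoint goal
    return handler

# Placeholder frames standing in for the live microscope and B-scan streams.
microscope_img = np.zeros((512, 512, 3), np.uint8)
bscan_img = np.zeros((512, 512, 3), np.uint8)

cv2.namedWindow("microscope")
cv2.setMouseCallback("microscope", on_click("microscope"))
cv2.namedWindow("bscan")
cv2.setMouseCallback("bscan", on_click("bscan"))

# First click sets the retinal-surface goal in the microscope view,
# the second sets the subretinal goal in the B-scan view (cf. Fig. 2).
while len(clicks) < 2:
    cv2.imshow("microscope", microscope_img)
    cv2.imshow("bscan", bscan_img)
    if cv2.waitKey(30) == 27:  # Esc aborts
        break
cv2.destroyAllWindows()
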
Fig. 3. Key variables. Double arrows indicate that the pixel goals ig_ILM and ig_subretina correspond to pg_ILM and pg_subretina, respectively, in Euclidean space.
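
How a B-scan pixel goal such as ig_subretina maps to its Euclidean counterpart pg_subretina is not detailed in the caption; a minimal sketch, assuming a known scan origin, pixel pitch, and lateral scan direction, might look like this.

import numpy as np

def pixel_to_point(ij, scan_origin, scan_dir, lateral_pitch_m, axial_pitch_m):
    """Map a B-scan pixel (row, col) to a 3D point; depth is assumed along +z."""
    row, col = ij
    axial_dir = np.array([0.0, 0.0, 1.0])
    return (scan_origin
            + col * lateral_pitch_m * np.asarray(scan_dir)
            + row * axial_pitch_m * axial_dir)

# Example with made-up calibration values (2 um lateral, 1 um axial pitch).
pg_subretina = pixel_to_point((300, 450), np.zeros(3), [1.0, 0.0, 0.0], 2e-6, 1e-6)
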
Fig. 4. (A) Microscope-OCT system setup. FC, fiber collimator; GS, galvo scanners; DM, dichroic mirror; IL, imaging lens; OL, objective lens. (B) The mapping between the laser scanning position and the applied voltage.
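
Panel (B) reports a mapping between laser scanning position and applied voltage; a minimal sketch of such a calibration, fitting a linear map to (voltage, position) samples and inverting it to command the galvos, is shown below with made-up numbers.

import numpy as np

# Made-up calibration samples: applied galvo voltage vs. measured beam position.
volts = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])   # applied voltage (V)
pos_mm = np.array([-3.1, -1.5, 0.0, 1.6, 3.0])  # measured lateral position (mm)

slope, intercept = np.polyfit(volts, pos_mm, 1)  # least-squares linear fit

def position_to_voltage(x_mm):
    """Invert the fitted map to command a desired scan position."""
    return (x_mm - intercept) / slope
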
Fig. 5. Network architectures: (A) two networks are trained to detect the needle tip and its base (thus defining its axis), one for microscope images and another for iOCT images; (B) a third network is trained to segment the ILM and RPE layers.
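
The caption does not give the architectures themselves; the sketch below is a generic keypoint-heatmap CNN along the lines of panel (A), predicting tip and base heatmaps from a single-channel image and recovering keypoints by a per-channel argmax. It is a stand-in, not the paper's network.

import torch
import torch.nn as nn

class KeypointNet(nn.Module):
    """Generic stand-in for the tip/base detectors."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(32, 2, 1)  # channel 0: tip heatmap, channel 1: base heatmap

    def forward(self, x):
        return self.head(self.features(x))

def heatmap_argmax(hm):
    """Recover (row, col) keypoints from each heatmap channel."""
    b, c, h, w = hm.shape
    idx = hm.flatten(2).argmax(-1)
    return torch.stack((idx // w, idx % w), dim=-1)

# Example on a dummy 256x256 frame.
net = KeypointNet()
keypoints = heatmap_argmax(net(torch.zeros(1, 1, 256, 256)))
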
Fig. 6. Experimental metrics (see Section V): (A) navigation error at the retinal surface goal pg_ILM; (B) needle-insertion error at the drug delivery site pg_subretina; (C) RCM error; (D) total duration of the surgery by task; (E) 2D navigation error to the clicked goal in the microscope view during robot-assisted mode; (F) qualitative comparison between autonomous and robot-assisted modes; (G) the same comparison from a close-up side view during needle insertion; (H) deviation of the needle tip from the insertion axis during needle insertion: ADE, average displacement error; FDE, final displacement error.
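
A minimal sketch of the ADE/FDE metrics in panel (H), assuming the insertion axis is given as a point and a direction: each measures the tip path's perpendicular deviation from that axis, averaged over the trajectory (ADE) or taken at its endpoint (FDE).

import numpy as np

def axis_deviation(tip_path, axis_origin, axis_dir):
    """Perpendicular distance of each tip sample (N x 3) from the insertion axis."""
    d = axis_dir / np.linalg.norm(axis_dir)
    rel = tip_path - axis_origin
    proj = rel @ d
    return np.linalg.norm(rel - np.outer(proj, d), axis=1)

def ade_fde(tip_path, axis_origin, axis_dir):
    dists = axis_deviation(tip_path, axis_origin, axis_dir)
    return dists.mean(), dists[-1]  # (ADE, FDE)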
