Real-time surgical instrument tracking in robot-assisted surgery using multi-domain convolutional neural network

Liang Qiu et al. Healthc Technol Lett. 2019 Dec 5;6(6):159-164. doi: 10.1049/htl.2019.0068. eCollection 2019 Dec.

Abstract

Image-based surgical instrument tracking in robot-assisted surgery is an active and challenging research area. Real-time knowledge of surgical instrument location is an essential part of a computer-assisted intervention system: tracking can serve as visual feedback for servo control of a surgical robot or be transformed into haptic feedback for surgeon-robot interaction. In this Letter, the authors apply a multi-domain convolutional neural network for fast 2D tracking of multiple surgical tools and use a focal loss to reduce the effect of easy negative examples. Because, despite significant progress in this field, no established public surgical tool tracking dataset exists, they further introduce a new dataset based on m2cai16-tool and their cadaver experiments. Their method is evaluated on the introduced dataset and outperforms state-of-the-art real-time trackers.

Keywords: computer-assisted intervention system; medical robotics; multi-domain convolutional neural network; neural nets; real-time surgical instrument tracking; robot-assisted surgery; surgeon-robot interaction; surgery; surgical robot; surgical tool tracking dataset.
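The focal loss mentioned in the abstract (originally proposed by Lin et al. for dense object detection) down-weights well-classified examples so that abundant easy negatives do not dominate training. A minimal sketch of the binary form in plain Python follows; the gamma and alpha values are the commonly used defaults, chosen here for illustration and not taken from the Letter:

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss for a single prediction.

    p: predicted probability of the positive class.
    y: ground-truth label (1 = positive, 0 = negative).
    gamma (focusing) and alpha (class balancing) are standard
    hyper-parameters; the defaults here are illustrative.
    """
    p_t = p if y == 1 else 1.0 - p
    alpha_t = alpha if y == 1 else 1.0 - alpha
    # The (1 - p_t)**gamma factor shrinks the loss of easy,
    # confidently classified examples toward zero.
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)

# An easy negative (low predicted positive probability) contributes
# far less loss than a hard negative under the same formulation.
easy = focal_loss(0.1, 0)
hard = focal_loss(0.9, 0)
```

With gamma = 0 the expression reduces to alpha-weighted cross-entropy, which is why the focusing term is described as suppressing the contribution of easy negatives.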


Figures

Fig. 1
Our dataset is built from cadaver experiment videos of transoral surgery and selected laparoscopic surgery videos from the m2cai16-tool dataset; representative examples of the surgical tools are shown here
Fig. 2
Our robot-assisted surgery framework, which exploits a multi-domain CNN to track surgical tools, takes endoscopic images as input and outputs surgical tool locations. The bounding-box output is further utilised in the processing unit for 6D pose estimation, which provides additional benefits for surgical tool navigation
Fig. 3
Precision and success plots using OPE. (a, b) Ablation study: our method compared with RT-MDNet and the corresponding version without instance embedding loss on our STT dataset. (c-h) Quantitative results of six real-time trackers on the m2cai-tool-tracking sub-dataset, the robot-assisted-tracking sub-dataset and the STT dataset
Fig. 4
Success plots of six real-time trackers over eight tracking challenges: (a) illumination variation, (b) background clutter, (c) deformation, (d) occlusion, (e) in-plane rotation, (f) scale variation, (g) out-of-plane rotation, (h) motion blur
Fig. 5
Qualitative evaluation of six real-time trackers on example frames, showing that our method outperforms the state-of-the-art on the STT dataset

