Sci Data. 2021 Apr 12;8(1):101. doi: 10.1038/s41597-021-00882-2.

Heidelberg colorectal data set for surgical data science in the sensor operating room


Lena Maier-Hein et al.

Abstract

Image-based tracking of medical instruments is an integral part of surgical data science applications. Previous research has addressed the tasks of detecting, segmenting and tracking medical instruments in laparoscopic video data. However, the proposed methods still tend to fail on challenging images and do not generalize well to data they have not been trained on. This paper introduces the Heidelberg Colorectal (HeiCo) data set, the first publicly available data set to enable comprehensive benchmarking of medical instrument detection and segmentation algorithms with a specific emphasis on robustness and generalization capabilities. Our data set comprises 30 laparoscopic videos and corresponding sensor data from medical devices in the operating room for three different types of laparoscopic surgery. Annotations include surgical phase labels for all video frames as well as information on instrument presence and corresponding instance-wise segmentation masks for surgical instruments (if any) in more than 10,000 individual frames. The data set has been used to organize international competitions within the Endoscopic Vision Challenges 2017 and 2019.


Conflict of interest statement

L.M.-H., T.R., A.R., S.B. and S.Sp. worked with device manufacturer KARL STORZ SE & Co. KG in the joint research project "OP4.1," funded by the German Federal Ministry of Economic Affairs and Energy (grant number BMWI 01MT17001C). M.W., B.M. and H.G.K. worked with device manufacturer KARL STORZ SE & Co. KG in the joint research project "InnOPlan," funded by the German Federal Ministry of Economic Affairs and Energy (grant number BMWI 01MD15002E). F.N. reports receiving travel support for conference participation as well as equipment provided for laparoscopic surgery courses by KARL STORZ SE & Co. KG, Johnson & Johnson, Intuitive Surgical, Cambridge Medical Robotics and Medtronic. V.R., F.B., S.Bit. and M.M. are/were employees of the company understand.ai, which sponsored the initial labeling of the ROBUST-MIS data set. L.M. is an employee of the company KARL STORZ SE & Co. KG, which sponsored a scientific prize for the surgical workflow analysis in the sensorOR challenge. The authors not listed as employees of understand.ai or KARL STORZ SE & Co. KG have no conflicts of interest or financial ties with these companies. All other authors have no competing interests.

Figures

Fig. 1
Overview of the Heidelberg Colorectal (HeiCo) data set. Raw data comprises anonymized, downsampled laparoscopic video data from three different types of colorectal surgery along with corresponding streams from medical devices in the operating room. Annotations include surgical phase information for the full video sequences as well as information on instrument presence and corresponding instance-wise segmentation masks of medical instruments (if any) for more than 10,000 frames.
Fig. 2
Laparoscopic images representing various levels of difficulty for the tasks of medical instrument detection, binary segmentation and multi-instance segmentation. Raw input frames (a) and corresponding reference segmentation masks (b) computed from the reference contours.
Fig. 3
Examples of challenging frames overlaid with reference multi-instance segmentations created by surgical data science experts.
Fig. 4
Folder structure for the complete data set. It comprises five levels corresponding to (1) surgery type, (2) procedure number, (3) procedural data (video and device data along with phase annotations), (4) frame number and (5) frame-based data.
Fig. 5
Folder structure for the ROBUST-MIS challenge data set. It comprises five levels corresponding to (1) data type (training/test), (2) surgery type, (3) procedure number, (4) frame number and (5) case data.
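The five-level layouts described in the captions of Figs. 4 and 5 can be traversed with standard path handling. The sketch below illustrates the ROBUST-MIS layout using hypothetical folder names (the actual names in the released data set may differ):

```python
from pathlib import Path

# Hypothetical case path following the five-level ROBUST-MIS layout:
# (1) data type, (2) surgery type, (3) procedure number,
# (4) frame number, (5) case data. Folder names are illustrative only.
root = Path("ROBUST-MIS")
case_dir = root / "Training" / "Proctocolectomy" / "1" / "42"

# Recover the individual levels relative to the data set root
levels = case_dir.relative_to(root).parts
print(levels)  # ('Training', 'Proctocolectomy', '1', '42')
```

Keeping the levels as path components like this makes it straightforward to glob over all procedures of one surgery type, or over all frames of one procedure, without parsing file names.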

References

    1. Maier-Hein L, et al. Surgical data science for next-generation interventions. Nat. Biomed. Eng. 2017;1:691–696. doi: 10.1038/s41551-017-0132-7. - DOI - PubMed
    1. Islam, M., Li, Y. & Ren, H. Learning where to look while tracking instruments in robot-assisted surgery. in Med. Image Comput. Comput. Assist. Interv., 412–420 (Springer, 2019).
    1. Funke I, Mees ST, Weitz J, Speidel S. Video-based surgical skill assessment using 3D convolutional neural networks. Int. J. Comput. Assist. Radiol. Surg. 2019;14:1217–1225. doi: 10.1007/s11548-019-01995-1. - DOI - PubMed
    1. Allan, M. et al. 2017 Robotic instrument segmentation challenge. Preprint at https://arxiv.org/abs/1902.06426 (2019).
    1. Ross T, et al. Exploiting the potential of unlabeled endoscopic video data with self-supervised learning. Int. J. Comput. Assist. Radiol. Surg. 2018;13:925–933. doi: 10.1007/s11548-018-1772-0. - DOI - PubMed
