Heidelberg colorectal data set for surgical data science in the sensor operating room
- PMID: 33846356
- PMCID: PMC8042116
- DOI: 10.1038/s41597-021-00882-2
Abstract
Image-based tracking of medical instruments is an integral part of surgical data science applications. Previous research has addressed the tasks of detecting, segmenting and tracking medical instruments based on laparoscopic video data. However, the proposed methods still tend to fail when applied to challenging images and do not generalize well to data they have not been trained on. This paper introduces the Heidelberg Colorectal (HeiCo) data set, the first publicly available data set enabling comprehensive benchmarking of medical instrument detection and segmentation algorithms with a specific emphasis on method robustness and generalization capabilities. The data set comprises 30 laparoscopic videos and corresponding sensor data from medical devices in the operating room for three different types of laparoscopic surgery. Annotations include surgical phase labels for all video frames as well as information on instrument presence and corresponding instance-wise segmentation masks for surgical instruments (if any) in more than 10,000 individual frames. The data set has been used successfully to organize international competitions within the Endoscopic Vision Challenges 2017 and 2019.
Conflict of interest statement
L.M.-H., T.R., A.R., S.B. and S.Sp. worked with the device manufacturer KARL STORZ SE & Co. KG in the joint research project “OP4.1,” funded by the German Federal Ministry of Economic Affairs and Energy (grant number BMWI 01MT17001C). M.W., B.M. and H.G.K. worked with KARL STORZ SE & Co. KG in the joint research project “InnOPlan,” funded by the German Federal Ministry of Economic Affairs and Energy (grant number BMWI 01MD15002E). F.N. reports receiving travel support for conference participation as well as equipment provided for laparoscopic surgery courses by KARL STORZ SE & Co. KG, Johnson & Johnson, Intuitive Surgical, Cambridge Medical Robotics and Medtronic. V.R., F.B., S.Bit. and M.M. are or were employees of the company understand.ai, which sponsored the initial labeling of the ROBUST-MIS data set. L.M. is an employee of the company KARL STORZ SE & Co. KG, which sponsored a scientific prize for the surgical workflow analysis in the sensorOR challenge. The authors not listed as employees of understand.ai or KARL STORZ SE & Co. KG have no conflicts of interest or financial ties with these companies. All other authors declare no competing interests.