Deep learning based suture training system

Mohammed Mansour et al. Surg Open Sci. 2023 Aug 6;15:1-11. doi: 10.1016/j.sopen.2023.07.023. eCollection 2023 Sep.

Abstract

Background and objectives: Surgical suturing is a fundamental skill that all medical and dental students learn during their education. Currently, the grading of students' suture skills during general surgery training in the medical faculty is relative, and students do not have the opportunity to learn specific techniques. Recent technological advances, however, have made it possible to classify and measure suture skills using artificial intelligence methods such as Deep Learning (DL). This work aims to evaluate the success of surgical sutures using DL techniques.

Methods: Six Convolutional Neural Network (CNN) models were evaluated: VGG16, VGG19, Xception, Inception, MobileNet, and DenseNet. We used a dataset of suture images containing two classes, successful and unsuccessful, and compared the models using precision, recall, and F1 score.
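
The paper does not include code, but the transfer-learning approach it describes can be sketched as follows. This is a minimal, illustrative Keras/TensorFlow example of adapting a pretrained Xception backbone to the two-class (successful/unsuccessful) suture task; the directory layout "suture_data/", the image size, and the training hyperparameters are assumptions, not details taken from the study.

    # Minimal illustrative sketch (not the authors' code): fine-tune a
    # pretrained Xception backbone for binary suture classification.
    # Directory layout, image size, and hyperparameters are assumptions.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    IMG_SIZE = (299, 299)   # Xception's default input resolution
    BATCH_SIZE = 32

    # Assumed layout: suture_data/{successful,unsuccessful}/*.jpg
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "suture_data", validation_split=0.2, subset="training",
        seed=42, image_size=IMG_SIZE, batch_size=BATCH_SIZE)
    val_ds = tf.keras.utils.image_dataset_from_directory(
        "suture_data", validation_split=0.2, subset="validation",
        seed=42, image_size=IMG_SIZE, batch_size=BATCH_SIZE)

    # ImageNet-pretrained backbone with a new binary classification head.
    base = tf.keras.applications.Xception(
        include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
    base.trainable = False   # train only the new head

    model = models.Sequential([
        layers.Rescaling(1.0 / 127.5, offset=-1),  # Xception expects inputs in [-1, 1]
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.2),
        layers.Dense(1, activation="sigmoid"),     # one unit for the binary decision
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(train_ds, validation_data=val_ds, epochs=10)

The other five architectures can be compared by swapping the backbone (e.g., tf.keras.applications.MobileNet or DenseNet121) while keeping the same head and training loop.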

Results: The results showed that Xception had the highest accuracy at 95 %, followed by MobileNet at 91 %, DenseNet at 90 %, Inception at 84 %, VGG16 at 73 %, and VGG19 at 61 %. We also developed a graphical user interface that allows users to evaluate suture images by uploading them or using the camera. The images are then interpreted by the DL models, and the results are displayed on the screen.
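
As a companion to the metrics reported above, the sketch below shows one conventional way to compute the confusion matrix, precision, recall, and F1 score for a trained binary classifier with scikit-learn. It assumes the model and validation dataset from the previous sketch and a 0.5 decision threshold, none of which are specified in the paper.

    # Sketch: compute the confusion matrix, precision, recall, and F1 score
    # for the fine-tuned model above; the 0.5 threshold is an assumption.
    from sklearn.metrics import classification_report, confusion_matrix

    y_true, y_pred = [], []
    for images, labels in val_ds:                  # ideally a held-out test set
        probs = model.predict(images, verbose=0)   # sigmoid outputs in [0, 1]
        y_pred.extend((probs.ravel() >= 0.5).astype(int))
        y_true.extend(labels.numpy().astype(int))

    print(confusion_matrix(y_true, y_pred))        # rows: true class, cols: predicted class
    print(classification_report(y_true, y_pred, target_names=val_ds.class_names))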

Conclusions: The initial findings suggest that the use of DL techniques can minimize errors due to inexperience and allow physicians to use their time more efficiently by digitizing the process.

Keywords: Classification; Deep learning; Suture training.

Conflict of interest statement

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Figures

Fig. 1. Suture training example.
Fig. 2. CNN structure example.
Fig. 3. Study flow chart.
Fig. 4. System setup: 1) laptop, 2) camera, 3) suture pad.
Fig. 5. Successful suture images example.
Fig. 6. Failed suture images example.
Fig. 7. Data augmentation.
Fig. 8. Xception model training and validation accuracy and loss.
Fig. 9. Inception model training and validation accuracy and loss.
Fig. 10. VGG16 model training and validation accuracy and loss.
Fig. 11. VGG19 training and validation accuracy and loss.
Fig. 12. MobileNet training and validation accuracy and loss.
Fig. 13. DenseNet training and validation accuracy and loss.
Fig. 14. Basic confusion matrix.
Fig. 15. Confusion matrix of Xception.
Fig. 16. Confusion matrix of Inception.
Fig. 17. Confusion matrix of VGG16.
Fig. 18. Confusion matrix of VGG19.
Fig. 19. Confusion matrix of MobileNet.
Fig. 20. Confusion matrix of DenseNet.
Fig. 21. Accuracy chart.
Fig. 22. Evaluation metrics for CNN architectures.
Fig. 23. The graphical user interface.
