Developing an artificial intelligence model for phase recognition in robot-assisted radical prostatectomy
- PMID: 40693331
- DOI: 10.1111/bju.16862
Abstract
Objectives: To develop and evaluate a convolutional neural network (CNN)-based model for recognising surgical phases in robot-assisted laparoscopic radical prostatectomy (RARP), with an emphasis on model interpretability and cross-platform validation.
Methods: A CNN using EfficientNet B7 was trained on video data from 75 RARP cases with the hinotori robotic system. Seven phases were annotated: bladder drop, prostate preparation, bladder neck dissection, seminal vesicle dissection, posterior dissection, apical dissection, and vesicourethral anastomosis. A total of 808 774 video frames were extracted at 1 frame/s for training and testing. Validation was performed on 25 RARP cases using the da Vinci robotic system to assess cross-platform generalisability. Gradient-weighted class activation mapping was used to enhance interpretability by identifying key regions of interest for phase classification.
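The paper does not publish code, but the pipeline described in the Methods can be sketched in PyTorch/torchvision: sampling video at 1 frame/s and fine-tuning an ImageNet-pretrained EfficientNet-B7 with a seven-class phase head. The use of OpenCV, the helper name `extract_frames_1fps`, and all hyperparameter choices are assumptions for illustration, not details taken from the study.

```python
import cv2                      # assumption: OpenCV is used for frame extraction
import torch.nn as nn
from torchvision import models

NUM_PHASES = 7  # bladder drop ... vesicourethral anastomosis

def extract_frames_1fps(video_path):
    """Yield roughly one frame per second of surgical video as BGR arrays."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0      # fall back if metadata is missing
    step = max(int(round(fps)), 1)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            yield frame
        idx += 1
    cap.release()

# ImageNet-pretrained EfficientNet-B7 with its classifier replaced by a 7-phase head.
model = models.efficientnet_b7(weights=models.EfficientNet_B7_Weights.IMAGENET1K_V1)
model.classifier[1] = nn.Linear(model.classifier[1].in_features, NUM_PHASES)
```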
Results: The CNN achieved 0.90 accuracy on the hinotori test set but dropped to 0.64 on the da Vinci dataset, indicating limited cross-platform generalisability. Phase-specific F1 scores ranged from 0.77 to 0.97, with lower performance in the seminal vesicle dissection and apical dissection phases. Gradient-weighted class activation mapping visualisations showed that the model focused on central pelvic structures rather than transient instruments, enhancing interpretability and providing insight into how phases were classified.
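Gradient-weighted class activation mapping (Grad-CAM) of the kind reported here can be reproduced with standard PyTorch hooks. The minimal sketch below assumes the EfficientNet-B7 model defined above and hooks its final convolutional block (`model.features[-1]`); the function name `grad_cam` and the choice of target layer are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

# Hypothetical Grad-CAM sketch for the EfficientNet-B7 phase classifier above.
# A forward hook captures the activations of the last convolutional block and
# a tensor hook captures the gradients flowing back into it.
activations, gradients = {}, {}

def _hook(module, inputs, output):
    activations["value"] = output.detach()
    output.register_hook(lambda grad: gradients.update(value=grad.detach()))

model.features[-1].register_forward_hook(_hook)  # final conv block of the backbone

def grad_cam(frame_tensor):
    """Return (H x W heat-map in [0, 1], predicted phase index) for one frame.

    frame_tensor: a single preprocessed frame, shape (3, H, W).
    """
    model.eval()
    logits = model(frame_tensor.unsqueeze(0))             # (1, 7) phase logits
    phase = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, phase].backward()                           # gradients w.r.t. that phase

    acts = activations["value"]                           # (1, C, h, w)
    grads = gradients["value"]                            # (1, C, h, w)
    weights = grads.mean(dim=(2, 3), keepdim=True)        # per-channel gradient average
    cam = F.relu((weights * acts).sum(dim=1, keepdim=True))   # (1, 1, h, w)
    cam = F.interpolate(cam, size=frame_tensor.shape[-2:],
                        mode="bilinear", align_corners=False)[0, 0]
    cam -= cam.min()
    cam /= cam.max().clamp(min=1e-8)                      # normalise to [0, 1]
    return cam, phase
```

Overlaying the returned heat-map on the input frame highlights which anatomical regions drove the phase prediction, which is how the focus on central pelvic structures described above can be visualised.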
Conclusions: The model demonstrated high accuracy on a single robotic platform but requires further refinement for consistent cross-platform performance. Interpretability techniques will foster clinical trust and integration into workflows, advancing robotic surgery applications.
Keywords: artificial intelligence; convolutional neural network; deep learning; phase recognition; prostatectomy; robotic surgical procedures.
© 2025 BJU International.
