Int J Comput Assist Radiol Surg. 2021 Aug;16(8):1393-1401.
doi: 10.1007/s11548-021-02437-7. Epub 2021 Jul 5.

Toward autonomous robotic prostate biopsy: a pilot study


Bogdan Maris et al. Int J Comput Assist Radiol Surg. 2021 Aug.

Abstract

Purpose: We present the validation of PROST, a robotic device for prostate biopsy. PROST is designed to minimize human error by introducing some autonomy into the key steps of the procedure: target selection, image fusion, and needle positioning. The robot performs targeted biopsy under ultrasound (US) guidance, fused with the magnetic resonance (MR) images on which the target was defined.

Methods: PROST is a parallel robot with 4 degrees of freedom (DOF) to orient the needle and 1 DOF to rotate the US probe. We achieved a calibration error of less than 2 mm, computed as the difference between the needle position in robot coordinates and its position in the US image. The robot's autonomy comes from its image-analysis software, which employs deep learning techniques, integrated image-fusion algorithms, and automatic computation of the needle trajectory. For safety reasons, needle insertion is left to the doctor.
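As a rough illustration of how a calibration error of this kind could be computed (the point values and the identity calibration transform below are hypothetical, not taken from the paper):

```python
import math

def calibration_error(robot_point, us_point, transform):
    """Euclidean distance (mm) between a needle-tip position mapped from
    robot coordinates into the US image frame and the position actually
    observed in the US image."""
    mapped = transform(robot_point)
    return math.dist(mapped, us_point)

# Hypothetical values: an identity calibration and a 1.5 mm offset along x.
identity = lambda p: p
err = calibration_error((10.0, 20.0, 5.0), (11.5, 20.0, 5.0), identity)
print(err < 2.0)  # within the reported 2 mm calibration bound
```

In practice the transform would be a rigid (rotation plus translation) mapping estimated during calibration, not the identity used here for brevity.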

Results: System performance was evaluated in terms of positioning accuracy. Tests were performed on a 3D printed object with nine 2-mm spherical targets and on an anatomical commercial phantom that simulates human prostate with three lesions and the surrounding structures. The average accuracy reached in the laboratory experiments was [Formula: see text] in the first test and [Formula: see text] in the second test.

Conclusions: We introduced a first prototype of a prostate biopsy robot that has the potential to increase the detection of clinically significant prostate cancer and, through its partial autonomy, to simplify the procedure, reduce human error, and shorten training time. A robotic biopsy platform also opens the possibility of delivering treatment, such as focal ablation, through the same system.

Keywords: Automatic segmentation; Image fusion; Medical robotics; Prostate biopsy; Robot-assisted biopsy.


Conflict of interest statement

The authors declare that they have no conflict of interest.

Figures

Fig. 1
Artist's rendering of the PROST system. Left: the robotic head, used for biopsy procedures. Right: the head as a component of a larger robotic system, used for more advanced procedures. In this paper, we describe the robotic head, which can be used independently of the base
Fig. 2
a: PROST CAD model shown with the US probe. The probe is actuated to rotate in the sagittal plane (roll). A time-of-flight (TOF) sensor measures the depth of needle insertion. b: Schematic view of the SCARA structure. Each pair of same-color links is actuated by one of the motors. c: PROST's workspace is a cone with its apex at the insertion point. d: PROST prototype with the synthetic phantom during a needle insertion test
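The conical workspace described in panel c can be expressed as a simple geometric membership test; the apex, axis, and half-angle values below are hypothetical illustrations, not PROST's actual parameters:

```python
import math

def in_conical_workspace(target, apex, axis, half_angle_deg):
    """True if `target` lies inside the cone defined by its apex, a unit
    axis direction, and a half-aperture angle in degrees."""
    v = [t - a for t, a in zip(target, apex)]
    norm = math.sqrt(sum(c * c for c in v))
    if norm == 0:
        return True  # the apex itself is reachable
    cos_angle = sum(c * d for c, d in zip(v, axis)) / norm
    return cos_angle >= math.cos(math.radians(half_angle_deg))

# A point 50 mm straight along the cone axis is inside a 20-degree cone.
print(in_conical_workspace((0.0, 0.0, 50.0), (0.0, 0.0, 0.0),
                           (0.0, 0.0, 1.0), 20.0))  # True
```

A check of this form would let planning software reject needle targets that fall outside the reachable cone before any motion is commanded.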
Fig. 3
Left: The needle (the hypo-echogenic segment encircled in the figure) is aligned with the vertical plane, represented as a dotted line in the middle of the image. Right: The needle before alignment with the vertical plane
Fig. 4
Graphical user interface and the biopsy procedure. Top left: US image of the anatomical phantom registered with MR contours during a needle insertion; the US image contains 2 targets, one of which is selected (in pink); the needle reached the target and can be seen in the US image as a hyper-echoic line; the whole procedure can be seen in 3D; on the right, a progress bar shows the distance to the target, represented by a green circle. Top right: before reaching the target, a green path shows the projected trajectory both in 2D and 3D. Bottom left: targeting the synthetic phantom. Bottom right: overall view of the system during the insertion in the anatomical phantom
Fig. 5
Left: 3D-printed phantom with 9 targets. The phantom is embedded in a silicone shell (yellow) transparent to US. Right: the ‘Target points’ are chosen by the user in the US images, the CAD (CT or MR) points come from the 3D model, and the ‘Biopsy points’ are those reached by the needle tip during a robot-assisted puncture. The error is computed as the difference between ‘Target points’ and ‘Biopsy points’
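The error computation described in this caption amounts to per-target Euclidean distances between selected and reached points. A minimal sketch with made-up measurements (not the paper's data):

```python
import math
import statistics

def targeting_errors(target_points, biopsy_points):
    """Per-target Euclidean error (mm) between user-selected US targets
    and the needle-tip positions reached by the robot."""
    return [math.dist(t, b) for t, b in zip(target_points, biopsy_points)]

# Hypothetical measurements for three of the nine targets.
targets = [(10, 10, 10), (20, 15, 12), (30, 25, 18)]
reached = [(10.5, 10, 10), (20, 16, 12), (30, 25, 19.2)]
errs = targeting_errors(targets, reached)
print(round(statistics.mean(errs), 2), round(statistics.stdev(errs), 2))
```

Summarizing the per-target distances with a mean and standard deviation matches the accuracy-style figures reported in the Results.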
Fig. 6
CIRS 053L commercial phantom. It includes: perineal membrane, prostate, urethra, seminal vesicles, lesions, and rectal wall
Fig. 7
Architecture diagram for segmentation, with detail of the ResNet backbone and the Region Proposal Network for ROI selection. The last layers perform prostate segmentation
Fig. 8
Segmentation of the prostate in US images, in axial (left) and sagittal (right) view: the blue line shows the ground truth, whereas the green line shows the result of automatic segmentation
Fig. 9
Workflow for phantom experiments: synthetic (top row); anatomical (bottom row)
