Front Robot AI. 2022 Feb 2;9:800232. doi: 10.3389/frobt.2022.800232. eCollection 2022.

Visual Haptic Feedback for Training of Robotic Suturing


François Jourdes et al. Front Robot AI. 2022.

Abstract

Current surgical robotic systems are teleoperated and do not provide force feedback. Considerable practice is required to learn how to use visual input, such as tissue deformation upon contact, as a substitute for the tactile sense. As a result, novices apply unnecessarily high forces before receiving specific robotic training, and visual force feedback studies have demonstrated a reduction of the applied forces. Simulation exercises with realistic suturing tasks can provide training outside the operating room. This paper presents contributions to realistic interactive suture simulation for training of the suturing and knot-tying tasks commonly used in robotically assisted surgery. To improve the realism of the simulation, we developed a global-coordinate wire model with a new constraint formulation for elongation. We demonstrated that continuous modeling of the contacts avoids instabilities during knot tightening. Visual cues, computed from the mechanical forces or constraints, are additionally provided to support learning how to dose the forces. The results are integrated into a powerful system-agnostic simulator, and a comparison with equivalent tasks performed on the da Vinci Xi system confirms its realism.
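For readers unfamiliar with inextensibility constraints on wires, a conventional elongation constraint between consecutive nodes p_i and p_{i+1} with rest length l_i can be written as below; this is standard background for illustration only, not the new constraint formulation introduced in the paper:

    C_i(\mathbf{q}) = \lVert \mathbf{p}_{i+1} - \mathbf{p}_i \rVert - l_i = 0,
    \qquad
    \frac{\partial C_i}{\partial \mathbf{p}_{i+1}}
        = \frac{(\mathbf{p}_{i+1} - \mathbf{p}_i)^{\top}}{\lVert \mathbf{p}_{i+1} - \mathbf{p}_i \rVert}
        = -\frac{\partial C_i}{\partial \mathbf{p}_i}.

Enforcing C_i = 0 on every segment, for instance with Lagrange multipliers, keeps the simulated thread inextensible.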

Keywords: collision detection; haptics; knot-tying; minimally invasive surgery; robotic surgery training; surgical simulation; virtual reality.


Conflict of interest statement

FJ, BV, and JA were employed by the company InSimo SAS. The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

FIGURE 1
The computation of the deformation is done in the local reference frame F of the beam element (in red). Using this frame, the 6D local displacements of the nodes are computed between the current configuration and their rest configuration (in orange).
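To make the caption concrete, here is a minimal sketch, not the authors' implementation, of how the 6D local displacement of a beam end node could be computed in the element frame F; the function name and the use of NumPy/SciPy are assumptions:

    # Sketch only: express the current and rest configurations of a beam end
    # node in the local element frame F and take their 6D difference
    # (3 translational + 3 rotational components).
    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def local_displacement_6d(p_cur, q_cur, p_rest, q_rest, p_F, q_F):
        """p_*: node positions (3,); q_*: node orientations as quaternions
        (x, y, z, w); p_F, q_F: origin and orientation of the element frame F."""
        R_F = R.from_quat(q_F)
        # Translational part: difference of the node positions expressed in F.
        du = R_F.inv().apply(p_cur - p_F) - R_F.inv().apply(p_rest - p_F)
        # Rotational part: relative rotation from rest to current orientation,
        # expressed in F as a rotation vector (axis * angle).
        d_rot = R_F.inv() * R.from_quat(q_cur) * R.from_quat(q_rest).inv() * R_F
        return np.concatenate([du, d_rot.as_rotvec()])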
FIGURE 2
Construction of the spline control points from the beam end nodes.
FIGURE 3
To construct R_F, we first evaluate the tangent at the mid-point of the cubic Bézier curve (A). We then transform each beam end-point orientation to align its first axis with the mid-point tangent (B). We finally compute the orientation of F by taking the average of these two rotations (C).
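The three steps (A, B, C) can be illustrated with a short sketch; the helper names are hypothetical and SciPy is used for the rotation algebra, so this is not the paper's code:

    # Sketch only: build the element frame orientation R_F from the two beam
    # end orientations and the mid-point tangent of the cubic Bezier curve.
    import numpy as np
    from scipy.spatial.transform import Rotation as R, Slerp

    def bezier_mid_tangent(P0, P1, P2, P3):
        # (A) Derivative of the cubic Bezier curve at t = 0.5, normalized.
        t = 0.5
        d = 3*(1 - t)**2*(P1 - P0) + 6*(1 - t)*t*(P2 - P1) + 3*t**2*(P3 - P2)
        return d / np.linalg.norm(d)

    def align_first_axis(rot, tangent):
        # (B) Smallest rotation taking the frame's first axis onto the tangent.
        x = rot.apply([1.0, 0.0, 0.0])
        axis = np.cross(x, tangent)
        s, c = np.linalg.norm(axis), np.dot(x, tangent)
        if s < 1e-12:
            return rot  # already (anti-)aligned; degenerate case kept simple
        return R.from_rotvec(axis / s * np.arctan2(s, c)) * rot

    def element_frame_orientation(rot0, rot1, P0, P1, P2, P3):
        t_mid = bezier_mid_tangent(P0, P1, P2, P3)
        r0, r1 = align_first_axis(rot0, t_mid), align_first_axis(rot1, t_mid)
        # (C) Average of the two aligned rotations by spherical interpolation.
        slerp = Slerp([0.0, 1.0], R.from_quat([r0.as_quat(), r1.as_quat()]))
        return slerp(0.5)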
FIGURE 4
Alignment of u_xi with the mid-point tangent.
FIGURE 5
Visual stress on a thread from a floppy (A) to a tight (B,C) condition, until suture breakage with tension release (D).
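One possible realization of such a visual stress cue, with hypothetical thresholds and colors rather than the values used in the paper, is a simple blend from a nominal to a warning color as the tension grows, plus a rupture threshold:

    # Sketch of a visual stress cue: blend the thread color from nominal to
    # warning as tension increases, and signal breakage above a threshold.
    import numpy as np

    NOMINAL = np.array([0.85, 0.85, 0.85])  # floppy thread (light grey)
    WARNING = np.array([1.00, 0.30, 0.00])  # highly stressed (orange)

    def thread_color(tension, t_warn=1.0, t_break=2.0):
        """tension, t_warn and t_break share the same (arbitrary) force unit."""
        if tension >= t_break:
            return None                     # suture breaks, tension released
        a = np.clip(tension / t_warn, 0.0, 1.0)
        return (1.0 - a) * NOMINAL + a * WARNING

The same kind of mapping could drive the cues on tissue and instrument tips described in Figures 13 and 14.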
FIGURE 6
We test four different discretizations of a 12 cm suture thread subjected to gravity, in increasing order of discretization: red, orange, blue, green. The dynamics of the pendular motion are captured at times 0 s (A), 0.1 s (B), 0.2 s (C), and 0.3 s (D). The model converges to a unique solution as the level of discretization increases.
FIGURE 7
We apply a torsional motion at the end points of the thread. The coupling between bending and torsion allows the formation of plectonemes.
FIGURE 8
A suture thread is wrapped around a fixed rigid cylinder by pulling at its ends. The wrapping motion is sampled at times 0 s (A), 0.5 s (B), and 1 s (C). The model remains inextensible during this simulation, which is why the loops around the rigid cylinder get closer and closer.
FIGURE 9
Gradual tightening of a simple knot. The motion is sampled at times 0 s (A), 1 s (B), and 2 s (C). The accurate contact model makes it possible to form the knot and to tighten it.
FIGURE 10
First step of a surgeon’s knot. The double overhand knot is created by wrapping the thread around an instrument tip (A), grasping the other end, and pulling it through the double loop, resulting in a pattern with two twists (B). The knot position can be adjusted by exerting traction on one end, thus moving the knot over to the other end (C,D). The collision handling is sufficiently robust to support interactions of the thread with itself and with the other scene elements.
FIGURE 11
Simulation accurately reproducing knot-tying errors. An accidental loop is created when one end of the thread is not completely pulled through the loop before tightening the knot on the rings (A). An air knot is created when the knot is not properly guided down onto the rings before tightening it with traction on both ends of the thread (B).
FIGURE 12
Interrupted suturing (A,B) and continuous suturing (C,D).
FIGURE 13
Visual stress gradients on a thread when tightening a continuous suture (A), and on tissue when driving a needle through tissue (right instrument) while applying counter-pressure (left instrument) (B). Combined visual cues on thread and tissue when applying excessive force (C), leading to a tear in the tissue and subsequently reduced tension on the thread (D).
FIGURE 14
Visual cues for excessive forces during collisions: instrument tips turning from silver to orange during collision when forming a loop with a thread for knot-tying (A); right instrument tip turning orange during collision with the dome, which turns from blue to pink (B); double grasping with shearing forces during needle manipulation, with both instrument tips turning orange and the needle approaching its breakage point, as represented by its color change to red (C). Animated scenes are shown in the Supplementary Video S1.
FIGURE 15
Task performance comparison between a real robotic system and the simulation. The knot-tying sequence is based on a surgeon’s knot. The task with the da Vinci Xi system is shown on the left, and the simulated task is shown without (middle) and with visual haptic cues (right).

