Global Spine J. 2020 Apr;10(2 Suppl):41S-55S.
doi: 10.1177/2192568219868217. Epub 2020 May 28.

Spine Surgery Supported by Augmented Reality


Barbara Carl et al. Global Spine J. 2020 Apr.

Abstract

Study design: A prospective, case-based, observational study.

Objectives: To investigate how microscope-based augmented reality (AR) support can be utilized in various types of spine surgery.

Methods: In 42 spinal procedures (12 intra- and 8 extradural tumors, 7 other intradural lesions, 11 degenerative cases, 2 infections, and 2 deformities), AR was implemented using operating microscope head-up displays (HUDs). Intraoperative low-dose computed tomography was used for automatic registration. Nonlinear image registration was applied to integrate multimodality preoperative images. Target and risk structures displayed by AR were defined in preoperative images by automatic anatomical mapping and additional manual segmentation.

Results: AR could be applied successfully in all 42 cases. Low-dose protocols ensured low radiation exposure for registration scanning (effective dose cervical 0.29 ± 0.17 mSv, thoracic 3.40 ± 2.38 mSv, lumbar 3.05 ± 0.89 mSv). A low registration error (0.87 ± 0.28 mm) resulted in a reliable AR representation with close matching of visualized objects and reality, distinctly supporting anatomical orientation in the surgical field. Flexible AR visualization, using either the microscope HUD or video superimposition, with the ability to selectively activate objects of interest and to switch between display modes, allowed smooth integration into the surgical workflow without disturbing the actual procedure. On average, 7.1 ± 4.6 objects were displayed, reliably visualizing target and risk structures.

Conclusions: Microscope-based AR can be applied successfully to various kinds of spinal procedures. AR improves anatomical orientation in the surgical field, supporting the surgeon, and also offers a potential tool for education.

Keywords: augmented reality; head-up display; intraoperative computed tomography; low-dose computed tomography; microscope-based navigation; navigation registration; nonlinear registration.


Conflict of interest statement

Declaration of Conflicting Interests: The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Barbara Carl has received speaker fees from B. Braun and Brainlab; Christopher Nimsky is a consultant for Brainlab. For the remaining authors, none were declared.

Figures

Figure 1.
Total effective dose (ED; scout and intraoperative computed tomography [iCT] scan) visualized for all 42 procedures in relation to the scanned levels and scan protocol (light blue: sinus-80%; green: c-spine-70%; red: t-spine-70%; orange: neonate full body; dark blue: l-spine-70%) (note that the length of each bar represents the vertebra included in the scan range and not the actual scan length).
Figure 2.
In a 58-year-old female patient with destruction of T8 and T9 due to spondylodiscitis and previous fixation T4-T11 (case 16), a neonate protocol was used for intraoperative computed tomography (iCT)–based patient registration. Registration with preoperative image data was possible due to the previous instrumentation, which was visible in the preoperative as well as in the blurry iCT images; note that the outline of the vertebrae is not clearly visible in the neonate protocol images. In A-D, the pointer is placed on the rod segmented in blue; in E-H, the pointer is placed in the head of the right screw of T10 (A/B/E/F: neonate protocol iCT; C/D/G/H: preoperative CT) (A/C, B/D, E/G, and F/H show corresponding images after registration) (A/C/E/G: axial; B/D/F/H: sagittal view).
Figure 3.
In a 59-year-old male patient, a glioma was biopsied with augmented reality (AR) support (case 33); the tumor object is segmented in yellow, the brain stem and medulla in green, and C0/C1/C2 in different shades of violet; additionally, tractography data are visualized (A: axial; B: coronal; C: sagittal T2-weighted images; D: AR visualization; E: probe's eye view; F: target view; G: 3-dimensional overview).
Figure 4.
A 58-year-old male patient (case 31) with an arteriovenous fistula below the right pedicle of L4, visualized with augmented reality (AR). The surgical situation was complicated by a spina bifida occulta with an intra- and extradural lipoma and a tethered cord. The area where the fistula was expected was segmented in orange; additionally, the dural sac and the vertebrae T12-S5 were segmented in individual colors and visualized by AR. The situation after laminectomy of L4 and preparation of the extra- to intradural transition of the lipoma is displayed; the blue crosshair in E depicts the center of the microscope view and corresponds to the position depicted in A-D, F, and H (A: axial, B: sagittal view of registration intraoperative computed tomography [iCT]; corresponding axial [C] and sagittal [D] views of preoperative T2-weighted images; E: AR view with all objects activated; F: probe's eye view of T2-weighted images; G: enlarged target view; H: 3-dimensional [3D] overview depicting how the video frame relates to the 3D anatomy).
Figure 5.
The same patient as in Figure 4 after dural opening; the fistula is clearly visible in the enlarged view and enclosed by the orange contour (E); note that the blue lines representing the microscope viewing field are much smaller compared with Figure 4, reflecting the higher microscope magnification (A: axial, B: sagittal view of registration intraoperative computed tomography [iCT]; corresponding axial [C] and sagittal [D] views of preoperative T2-weighted images; E: enlarged augmented reality [AR] view, only the target object activated; F: probe's eye view of T2-weighted images; G: enlarged target view; H: 3-dimensional [3D] overview depicting how the video frame relates to the 3D anatomy).
Figure 6.
A 64-year-old male patient (case 14) undergoing corpectomy of T1 via an anterior approach for removal of a small cell lung carcinoma metastasis and stabilization with an expandable implant; the neighboring vertebrae are visualized (C7: blue; T2: green); A/C allow a comparison of the 2 augmented reality (AR) display modes (A: AR as 3-dimensional [3D] representation; B: 3D overview display visualizing how the video frame relates to the 3D anatomy with the objects rendered in 3D; C: AR as line-mode representation; D: probe's eye view of intraoperative computed tomography [iCT] images, the blue circle representing the microscope viewing field).
Figure 7.
A 50-year-old female patient (case 38) with an osteoclastoma in L1; the tumor was removed and an expandable implant inserted via a lateral approach. A: augmented reality (AR) display with the 3-dimensional (3D) representation of the vertebrae T11-L3 and the tumor outline in L1 (orange); additionally, the previously implanted fixation T11/T12-L2/L3 is visualized in blue. B: enlarged view of A with the pointer in the surgical field; the pointer tip is visualized as a green crosshair, while the microscope focus point is visualized as a blue crosshair. C: after removal of the tumor and insertion of the expandable cage, a repeated intraoperative computed tomography (iCT) was performed, in which the implant was segmented and subsequently visualized by AR (dark blue), showing the close matching of AR object and implant. D: overview display of C, depicting how the video frame is placed in relation to the 3D image anatomy.
Figure 8.
The same patient (case 16) as in Figure 2; a posterior vertebral body replacement was performed via a posterior approach. A: augmented reality (AR) view with the 3-dimensional (3D) outline of the vertebrae T7-T11; the myelon is segmented in violet and the implants in blue (screws and rod on the left side; for the approach, the right rod was removed); a close matching of the screw head and the AR representation is visible. B: probe's eye view of preoperative computed tomography (CT) images. C: target view. D: 3D video overview. E-H: navigation view of preoperative images with the pointer inserted in the resection cavity at the ventral border of T8/T9 (E: axial, F: sagittal view; G/H: 3D representation in different viewing angles).
Figure 9.
The same patient as in Figures 2 and 8 after implantation of the expandable cage and repeated intraoperative computed tomography (iCT) documenting the high registration accuracy in updated augmented reality (AR; the new implant is segmented in green); A/B: axial and sagittal view with the navigation pointer tip placed on the implant; C-F: navigation view with the operating microscope (C: axial; D: sagittal view; E/F: 3-dimensional (3D) rendering in different viewing angles, the microscope field of view is visualized as a blue oval); G: AR view with the new implant demonstrating the close matching of AR and reality; H: 3D video overview, showing the relation of the video frame and 3D anatomy.
Figure 10.
A 60-year-old male patient (case 13) with a recurrent disc in L4/L5 on the right side; the disc fragment and the vertebrae L4, L5, and S1 are visualized by augmented reality (AR); A: AR view after exposure of the spinal canal and removal of scar tissue; B: enlarged AR view with the visible spinal dura; C: probe’s eye view; D: target view; E: 3-dimensional video overview; F: AR view while the disc fragment is removed showing the close matching.
