gms | German Medical Science

60th Annual Meeting of the German Society of Neurosurgery (DGNC)
Joint Meeting with the Benelux countries and Bulgaria

German Society of Neurosurgery (DGNC)

24 - 27 May 2009, Münster

Neurosurgical navigation with 3D augmented reality technology: First clinical experiences

Meeting Abstract

  • R. Kockro - Neurochirurgische Klinik, Universitätsklinikum Mainz
  • T. Yeo - National University Hospital, Singapore
  • I. Ng - National Neuroscience Institute, Singapore
  • A. Stadie - Neurochirurgische Klinik, Universitätsklinikum Mainz
  • C. Gerardo - Hospital del Mar, Barcelona
  • L. Serra - Volume Interactions

Deutsche Gesellschaft für Neurochirurgie. 60. Jahrestagung der Deutschen Gesellschaft für Neurochirurgie (DGNC), Joint Meeting mit den Benelux-Ländern und Bulgarien. Münster, 24.-27.05.2009. Düsseldorf: German Medical Science GMS Publishing House; 2009. DocMO.07-02

doi: 10.3205/09dgnc037, urn:nbn:de:0183-09dgnc0370

Published: May 20, 2009

© 2009 Kockro et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by-nc-nd/3.0/deed.en). You are free: to Share – to copy, distribute and transmit the work, provided the original author and source are credited.



Objective: Currently available navigation systems rely on the display of 2D multi-planar images or simple 3D graphics that are not displayed in the context of the actual surgical scene. We have developed a navigation system that displays segmented 3D imaging data and integrates augmented reality technology in order to “see” 3D navigational information beneath the visible surgical field. We report on our development process with phantoms as well as on our experience with the first 30 patients undergoing tumor-related and vascular procedures.

Methods: The system incorporates a lipstick-sized micro camera in a hand-held navigation probe and displays 3D computer graphics representing surgically relevant structures superimposed on a real-time video stream. The transparency of the 3D graphics and the video image can be adjusted to enhance depth perception, and distances between the tip of the probe and superimposed 3D targets can be displayed. Three-dimensional workstations were used to create a pre-operative surgical plan based on co-registered segmentations of imaging data from CT, CTA, MRI, MRA and fMRI, including the display of tumors, vasculature, parts of the skull base, cranial nerves, the ventricles, and the sulcal and gyral patterns of the cortex. The navigation system described here transfers the multi-modal 3D graphic data of the surgical plan into the context of the operation and allows navigation either in Augmented Reality mode or with 3D graphics displayed on a stereoscopic monitor.
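Two elements of the method above lend themselves to a brief illustration: blending the rendered 3D graphics over the live video with an adjustable transparency, and reporting the distance from the probe tip to a superimposed target. The following Python sketch is purely illustrative; the abstract does not publish the system's implementation, and all names here are hypothetical.

```python
import math

def blend_pixel(video_rgb, graphics_rgb, alpha):
    """Alpha-blend one rendered-graphics pixel over one video pixel.
    alpha = 0.0 shows only the video frame; alpha = 1.0 only the graphics.
    (Hypothetical helper; the actual system's renderer is not described.)"""
    return tuple(round((1 - alpha) * v + alpha * g)
                 for v, g in zip(video_rgb, graphics_rgb))

def tip_to_target_mm(tip_xyz, target_xyz):
    """Euclidean distance (in mm) between the tracked probe tip and a
    superimposed 3D target, as shown in the navigation display."""
    return math.dist(tip_xyz, target_xyz)
```

At alpha = 0.5, for example, `blend_pixel((100, 100, 100), (200, 0, 0), 0.5)` yields (150, 50, 50), i.e. an even mix of video and graphics, which is the adjustable see-through effect the method describes.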

Results: The system integrated smoothly into the surgical workflow and achieved an average clinical registration accuracy of 1.2 mm. The see-through effect of the Augmented Reality feature enabled navigation with 3D graphics beyond the visible surface of the surgical site while remaining in direct visual contact with it. Navigating with 3D graphics instead of 2D image planes, and perceiving those graphics in direct relation to the surgical site, improved spatial understanding and was found to be especially useful in surgery of complex tumors, aneurysms and AVMs.

Conclusions: Navigating with 3D objects instead of 2D planes allows a straightforward understanding of the surgical scene and direct guidance towards surgical targets. Video-based Augmented Reality technology enables a novel type of image guidance that appears more intuitive and comprehensive than standard navigation systems.