Slicer VR

This breakout aims to assess the current state of VR capabilities in Slicer, identify the areas most in need of improvement, and brainstorm potential future applications of VR in Slicer.

Attendees

  1. Adam Rankin (Robarts Research Institute)
  2. Andras Lasso (Queen’s University)
  3. Sam Horvath (Kitware, Inc.)
  4. Jean-Christophe Fillion-Robin (Kitware, Inc.)
  5. Csaba Pinter (Ebatin S.L.)
  6. Steve Pieper (Isomics Inc.)
  7. Thomas Muender (University of Bremen)
  8. Verena Reinschluessel (University of Bremen)
  9. Thomas Mildner (University of Bremen)

Current State

VR is currently available in Slicer out of the box via the SlicerVirtualReality extension. This extension uses the VTK OpenVR interface, which provides input, tracking, and visualization capabilities. To use it, Steam and SteamVR must be installed and running (SteamVR is installed from within Steam).
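
With the extension installed and SteamVR running, VR can also be toggled programmatically. The sketch below is intended for the Slicer Python console and assumes the method names exposed by the extension's vtkSlicerVirtualRealityLogic; treat it as indicative rather than a stable API reference.

```python
# Minimal sketch for the Slicer Python console (SlicerVirtualReality
# extension installed, SteamVR running). `slicer` is pre-imported there.
vrLogic = slicer.modules.virtualreality.logic()
vrLogic.SetVirtualRealityConnected(True)  # create the VR view and connect to the headset
vrLogic.SetVirtualRealityActive(True)     # start rendering the 3D view to the headset
```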

Preliminary work on AR support in Slicer is functional in the SlicerAugmentedReality extension but needs improvement. Video passthrough infrastructure is available in VTK (last tested with VTK 8.2).

Extensions

Interaction

Tracking data is available for all OpenVR devices, including the headset, controllers, and generic trackers. Input from the controllers is available, and the default interactor provides navigation, zoom, and selection capabilities.

Visualization

The 3D view in Slicer can be rendered to compatible OpenVR headsets. Other views (2D slice views, charts, etc.) are not currently rendered to the headset.
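
At the VTK level, rendering to the headset uses the OpenVR counterparts of the usual renderer, render window, and interactor classes. Below is a minimal standalone sketch (outside Slicer), assuming a VTK build with the RenderingOpenVR module enabled and a running SteamVR session:

```python
import vtk

# OpenVR variants of the standard VTK rendering classes
renderer = vtk.vtkOpenVRRenderer()
window = vtk.vtkOpenVRRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkOpenVRRenderWindowInteractor()
interactor.SetRenderWindow(window)
renderer.SetActiveCamera(vtk.vtkOpenVRCamera())

# Any ordinary VTK pipeline can feed the headset; a cone as a stand-in
cone = vtk.vtkConeSource()
mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(cone.GetOutputPort())
actor = vtk.vtkActor()
actor.SetMapper(mapper)
renderer.AddActor(actor)

window.Render()     # initializes OpenVR and submits frames to the compositor
interactor.Start()  # event loop; headset tracking drives the camera
```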

The Leap Motion hand tracking device can be streamed into Slicer and visualized using the SlicerLeapMotion extension.

Hardware

Tested OpenVR headsets include:

Infrastructure/Algorithms

The SlicerVideoCameras extension provides a wrapper to OpenCV’s camera calibration techniques.
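
The underlying OpenCV workflow is the standard checkerboard calibration. The sketch below shows the generic OpenCV steps that the extension wraps; it is plain OpenCV, not the extension's actual API, and the image folder is hypothetical.

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)  # interior corner count of the printed checkerboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points, size = [], [], None
for path in glob.glob("calibration/*.png"):  # hypothetical capture folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsic matrix and distortion coefficients for the video camera
rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, size, None, None)
print("RMS reprojection error:", rms)
```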

Areas of Improvement

We have identified the following major categories as areas that need improvement in the current Slicer ecosystem.

Interaction

It is still unclear which input methods will provide the easiest navigation and action capabilities in VR; most likely the answer will be a combination of input methods.

Even once input methods have been decided on, interaction guidelines are still a moving target in VR. Using controllers, for example, does one use the tip of the device as the selection point, or use the direction of the controller as a laser pointer? There are many such questions and decisions to be settled.
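
As a concrete example of the laser-pointer option: VTK's pickers gained world-space ray picking for VR (Pick3DRay on vtkPicker and its subclasses), which takes a device pose directly. A sketch, assuming the controller pose (a world-space position plus a wxyz orientation quaternion) has already been read from the tracking system:

```python
import vtk

def pick_with_laser_pointer(renderer, controller_position, controller_wxyz):
    """Pick the first prop hit by a ray cast along the controller's axis.

    controller_position is a world-space [x, y, z]; controller_wxyz is the
    device orientation as a [w, x, y, z] quaternion. Both are assumed to
    come from the VR tracking system.
    """
    picker = vtk.vtkCellPicker()
    picker.SetTolerance(0.005)
    # Pick3DRay casts a ray from the given position along the direction
    # encoded by the orientation quaternion
    picker.Pick3DRay(controller_position, controller_wxyz, renderer)
    return picker.GetProp3D(), picker.GetPickPosition()
```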

Visualization

Hardware

Infrastructure/Algorithms

Minutes

Other References