NA-MIC Project Weeks

Back to Projects List

El Cheapo Tracking

Key Investigators

Project Description

Investigate lower-cost tracking options for image-guided therapy (IGT) that still provide adequate accuracy. Modern AR/VR devices (e.g., phones and glasses) use inside-out tracking based on IMUs, cameras, lidar, and other sensors. These sensors are small enough, and possibly becoming accurate enough, to consider for IGT. Could they be options for the NousNav or SlicerTMS projects?

Objective

  1. Make a plan for determining accuracy and utility of options
  2. Plan any implementation efforts or further experiments
  3. Consider issues like form-factor, sterilization, re-usability, etc.

Approach and Plan

  1. Survey developments in the field pushed by AR/VR devices
  2. Look at any prototypes, e.g. Steve’s WebXR experiment
  3. Determine next steps

Progress and Next Steps

Progress

  1. Improved the demo to work over HTTPS using Let's Encrypt on a Google Cloud virtual machine running Slicer
  2. Added touch-screen events to control attributes of a Slicer model (the model turns a brighter yellow while the screen is touched); see the sketch after this list
  3. Gave demos to colleagues at the Wednesday IGT breakout and discussed the tradeoffs of intrinsic tracking vs. EM and extrinsic optical tracking
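
A minimal sketch of the Slicer-side Python that such a touch event could drive. The function name, the model name, and the way the touch value reaches Slicer (e.g., through Slicer's web server) are assumptions for illustration; only the MRML calls are standard Slicer API.

```python
import slicer

def set_model_highlight(touch_strength, model_name="PhoneDemoModel"):
    """Brighten a model's yellow color based on a 0-1 touch value.

    `touch_strength` and `model_name` are hypothetical inputs that a
    request handler would pass in after receiving a touch event from
    the phone.
    """
    model_node = slicer.util.getNode(model_name)
    display_node = model_node.GetDisplayNode()
    # Interpolate from a dim yellow to full yellow as the touch strengthens.
    brightness = 0.5 + 0.5 * max(0.0, min(1.0, touch_strength))
    display_node.SetColor(brightness, brightness, 0.0)
```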

Next steps

  1. Explore the use of phone-based tracking for SlicerTMS research (see the pose sketch after this list)
  2. Experiment with local rendering and touch interactions on the phone, mixed with remote rendering and computation on CPU/GPU with Slicer
  3. Weigh developing a native phone app to avoid HTTPS request overhead against upgrading Slicer's web server to support WebSockets for faster communication
  4. Brainstorm other applications of this technology
  5. Monitor developments in intrinsic tracking systems in non-phone form factors for use in other tracking scenarios (e.g., in IGT)
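
A minimal sketch of how a phone-reported pose could drive a Slicer transform node, e.g., to move a TMS coil model. The pose source (position and rotation from the phone's inside-out tracker) and the model name are assumptions; how the pose arrives in Slicer (web request, WebSocket, or OpenIGTLink) is left open. The transform-node pattern itself is standard Slicer usage.

```python
import slicer
import vtk

def update_tracked_pose(position, rotation, transform_name="PhonePose"):
    """Apply a phone-reported pose to a Slicer linear transform node.

    `position` (length-3, in millimeters) and `rotation` (3x3 matrix as
    nested lists) are assumed to come from the phone's inside-out tracker.
    """
    transform_node = slicer.mrmlScene.GetFirstNodeByName(transform_name)
    if transform_node is None:
        transform_node = slicer.mrmlScene.AddNewNodeByClass(
            "vtkMRMLLinearTransformNode", transform_name)

    # Pack the rotation and translation into a 4x4 homogeneous matrix.
    matrix = vtk.vtkMatrix4x4()
    for row in range(3):
        for col in range(3):
            matrix.SetElement(row, col, rotation[row][col])
        matrix.SetElement(row, 3, position[row])
    transform_node.SetMatrixTransformToParent(matrix)
    return transform_node

# Example: attach a (hypothetical) coil model so it follows the phone.
# coil = slicer.util.getNode("TMSCoil")
# node = update_tracked_pose([10.0, 20.0, 30.0],
#                            [[1, 0, 0], [0, 1, 0], [0, 0, 1]])
# coil.SetAndObserveTransformNodeID(node.GetID())
```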

Illustrations

Phone controller demo (click to see video)

Background and References