MARIN is an application that can be used alongside a neuronavigation platform to enable in situ AR guidance on a mobile device. It currently runs on iOS and works with Ibis (via the additional MARIN plugins). The goal of this project is to implement the same support in Slicer.
MARIN is a mobile application that can overlay virtual structures on the live camera feed of a device, enabling in situ augmented reality navigation for surgical applications (see image below). The MARIN application itself can interface with any platform, provided that the platform supports real-time communication, handles tracking, and generates 3D renderings. Slicer has all of these capabilities. Communication between Slicer and MARIN can be set up through the OpenIGTLinkIF module. The main components that will have to be implemented are Slicer modules to handle device configuration and rendering of tracked virtual objects.
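To illustrate what travels over that link: OpenIGTLink messages start with a fixed 58-byte big-endian header (protocol version, message type, device name, timestamp, body size, CRC-64). The sketch below builds such a header with Python's `struct` module; `pack_header` is a hypothetical helper for illustration only, since in practice the OpenIGTLinkIF module assembles and parses these messages itself, and the CRC computation is omitted here.

```python
import struct

# OpenIGTLink header layout (big-endian, 58 bytes total):
# version (uint16), type name (char[12]), device name (char[20]),
# timestamp (uint64), body size (uint64), CRC-64 (uint64).
HEADER_FORMAT = ">H12s20sQQQ"
HEADER_SIZE = struct.calcsize(HEADER_FORMAT)  # 58

def pack_header(msg_type, device_name, body_size,
                version=1, timestamp=0, crc=0):
    """Build an OpenIGTLink message header (CRC left at 0 for brevity)."""
    return struct.pack(
        HEADER_FORMAT,
        version,
        msg_type.encode("ascii").ljust(12, b"\0"),
        device_name.encode("ascii").ljust(20, b"\0"),
        timestamp, body_size, crc)

# A header announcing an IMAGE message from a device named "MARIN".
header = pack_header("IMAGE", "MARIN", body_size=0)
print(len(header))  # 58
```

Any endpoint that speaks this framing (Slicer via OpenIGTLinkIF, or MARIN) can then exchange image, transform, and status messages over a plain TCP socket.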
Because MARIN and OpenIGTLinkIF don’t currently share a common video codec (MARIN supports only H.264, OpenIGTLinkIF only VP9), most of this week’s effort was focused on extending MARIN to support more codecs, as well as sending unencoded images. Further work could then be done on the Slicer side to enable more codecs as well. This would allow more flexibility and support for more devices. Unencoded frames, however, will be limited in resolution by the available bandwidth.
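A quick back-of-the-envelope calculation shows why unencoded streaming is bandwidth-bound. The figures below (720p RGB at 30 fps) are illustrative assumptions, not measurements from MARIN:

```python
def uncompressed_bitrate_mbps(width, height, bytes_per_pixel=3, fps=30):
    """Bandwidth needed to stream raw frames, in megabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

# 720p RGB at 30 fps already needs ~664 Mbit/s of sustained throughput,
# beyond what a typical Wi-Fi link to a mobile device delivers in practice.
print(round(uncompressed_bitrate_mbps(1280, 720)))  # 664
```

By comparison, an H.264 or VP9 stream at the same resolution typically fits in a few Mbit/s, which is why codec support matters for higher resolutions.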
MARIN demo, with Ibis:
Article: Léger, É., Reyes, J., Drouin, S., Popa, T., Hall, J. A., Collins, D. L., Kersten-Oertel, M., “MARIN: an Open Source Mobile Augmented Reality Interactive Neuronavigation System”, International Journal of Computer Assisted Radiology and Surgery (2020). https://doi.org/10.1007/s11548-020-02155-6
Source code repository: MARIN