An Audiovisual Augmented Reality Experience Built on Open-Source Hardware and Software (2021)

Inspiration & Rationale →


If an AR system can be thought of as one that combines real and virtual processes, is interactive in real time, and is registered in three dimensions, why do the majority of AR applications rely primarily on visual displays of information? I propose a practice-led compositional approach for developing multisensory AR experiences, arguing that, as a medium that combines real and virtual multisensory processes, AR must be explored with a multisensory approach.

This project uses the open-source Project North Star HMD from Leap Motion alongside bone-conduction headphones to deliver a spatialised audio-visual experience, called polaris~, built in Unity. This repository started as a fork of the Software Companion for Project North Star, hence the other repository contributors and the long list of commits; the experience itself, including all audio-visual, artistic, and musical content, was added afterwards.
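For readers unfamiliar with how spatialised audio works in Unity, the sketch below shows one minimal way a positional source can be configured so that its volume and panning track the listener (here, the HMD wearer's head). The component and field names are standard Unity API, but the class itself and its parameter values are illustrative assumptions, not the project's actual code.

```csharp
using UnityEngine;

// Illustrative sketch only: attaches a fully spatialised AudioSource to a
// GameObject, so the sound it plays is attenuated and panned according to
// the GameObject's position relative to the scene's AudioListener.
public class SpatialisedSource : MonoBehaviour
{
    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.spatialBlend = 1.0f;                      // 0 = 2D, 1 = fully 3D
        source.rolloffMode = AudioRolloffMode.Linear;    // silent at maxDistance
        source.minDistance = 0.1f;                       // full volume within 10 cm
        source.maxDistance = 5.0f;                       // fades out by 5 m
        source.loop = true;
        // source.clip = someAudioClip;                  // assign a clip, then:
        // source.Play();
    }
}
```

In practice a binaural spatializer plugin would typically be enabled in Unity's audio settings as well, which suits bone-conduction headphones since they leave the ears open to real-world sound.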

This page outlines my use of the system, which started around June 2020 and is ongoing. To clarify: the original design has been open-sourced by Leap Motion since 2018, but there have been a fair few community revisions and updates to it since (see more here). This page documents the development of the Combine Reality Deck X version of the Project North Star HMD. Combine Reality is run by Noah Zerkin, who has provided invaluable support to my own project, so thanks Noah! He's also pretty much the only inexpensive source for the electrical parts needed for the headset. These pages act more like a devblog of my first year with North Star as a platform; they're not to be taken as project instructions. Those can be found on the wiki guide.

{Presentation} {Demonstration} {Prototype} {Palm Synth} {Finger Synth} {LibPd Explainer}

Inspiration and Similar Projects

  • Listening Mirrors: an audio AR interactive installation by my PhD supervisors
  • Laetitia Sonami: pioneer in early glove-based interactive music systems
  • Atau Tanaka: interactive gestural synthesis using muscle sensors
  • Keijiro Takahashi: specifically, their work with audio-reactivity in Unity
  • Tekh:2: XR instruments built using granular synthesis in Unity
  • Amy Brandon: amazing musical AR performances


Acknowledgements

  • Noah Zerkin (CombineReality) for their help in understanding some of the specific workings of the North Star headset.
  • Damien Rompapas (BEERLabs / ThinkDigital) for explaining and debugging the Software Companion with me.
  • Bryan Chris Brown (CombineReality) for moderating the very friendly Discord server and for their considerable explanations of the benefits of working with the North Star headset.




Headset Documentation: Project North Star

Community: Project North Star Discord Server

Repository: Project Esky Renderer