Mobile gestural synthesizer prototype (May 2019)

This is a 360° / ambisonic video. That means that, as long as you are viewing on desktop, you can drag the video around in three dimensions and hear the sound pan around you. Wear headphones for best results.

Gestur.a is an interactive musical synthesizer app built for iOS, currently in the iterative prototyping stage. The synthesizer is controlled through a mixture of affordances: gestural motion of the device, on-screen touch parameters, and microphone input. The purpose of the app is to let users make expressive music with their iPhone in an easy, intuitive way, and to provide a platform where they can share these compositions with friends and collaborators. Gestur.a is built in Xcode and uses the libpd framework to embed Pure Data, the open-source visual programming language it relies on for audio processing and synthesis.
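To give a flavour of how gestural control can drive synthesis parameters, here is a minimal sketch of the kind of mapping involved: a tilt reading from the device's motion sensors is clamped, normalized, and scaled into a frequency, then packaged as a named control message of the sort libpd forwards to a `[receive]` object inside a Pure Data patch. This is an illustrative Python sketch under assumed names (`tilt_to_frequency`, `osc-freq`), not Gestur.a's actual code.

```python
def tilt_to_frequency(tilt, lo=110.0, hi=880.0):
    """Map a tilt reading in [-1.0, 1.0] to a frequency in [lo, hi] Hz."""
    t = max(-1.0, min(1.0, tilt))    # clamp out-of-range sensor noise
    normalized = (t + 1.0) / 2.0     # rescale to [0.0, 1.0]
    return lo + normalized * (hi - lo)

def control_message(receiver, value):
    """Pair a float with the name of a Pd [receive] object, mirroring
    the (receiver, value) control messages a libpd host sends to a patch.
    The receiver name here is hypothetical."""
    return (receiver, float(value))

# A device held flat (tilt 0.0) lands in the middle of the range.
msg = control_message("osc-freq", tilt_to_frequency(0.0))
```

In the real app, the equivalent of `control_message` would be a call into libpd's messaging API from Swift or Objective-C, with the Pure Data patch doing the actual oscillator work on the other side.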