Live Stage Monitor Mixing through Gestural Interaction in Augmented Reality

Existing stage monitor mixing systems are inefficient and poorly support communication between musicians and sound engineers. We introduce ARMixer, which lets musicians perform self-service stage monitor mixing through gestures in augmented reality, providing an intuitive mixing experience. We conducted two usability tests and found that ARMixer is acceptable to users and exhibits strong psychoacoustic intuitiveness, both in controlling mixing parameters through gestures and in identifying mixing targets.

Paper link here.