The LeapMax Gestural Interaction System is a project that uses the Leap Motion controller and the visual programming language Max to extract accurate skeletal hand-tracking data from a performer in a global 3D context. The goal of this project was to develop a simple and efficient architecture for designing dynamic and compelling digital gestural interfaces. At the core of this work is a Max external object that uses a custom API to extract data from the Leap Motion service and make it available in Max. From this data, a library of Max objects for deriving more complex gesture and posture information was developed and refined. These objects are highly flexible and modular and can be combined to create complex control schemes for a variety of systems. To demonstrate the system in a performance context, an experimental musical instrument was designed in which the Leap is combined with an absolute orientation sensor and mounted on the performer's head. This setup leverages the head-mounted Leap Motion paradigm used in VR systems to construct an interactive sonic environment situated in the user's physical surroundings. The user's gestures are mapped to the controls of a synthesis engine that employs several forms of synthesis, including granular synthesis, frequency modulation, and delay modulation.