This page contains an explanation and demonstration of a research project at the University of California, Irvine exploring gestural control of music with motion capture.
In particular, we provide details about custom software that converts data from a Vicon 8 motion capture system into musical control data, and we show examples of the software and its musical results.
The software, called MCM (Motion Capture Music), is under development for both Windows and Macintosh platforms. The Macintosh version, developed using Max/MSP/Jitter, is the primary focus of the discussion and examples here.
This demonstration is provided for the 2003 International Conference on New Interfaces for Musical Expression (NIME 2003), and supplements the article in the NIME 2003 proceedings entitled "Gestural Control of Music Using the Vicon 8 Motion Capture System", which is available here in PDF format.
In this report we identify two stages of work in this area: software to parse and map the motion capture data, and experiments to test its use for musical control.
The MCM software provides a way for an artist/programmer (or an artist-programmer team) to specify a) which location markers and coordinates to use, b) what information about those markers is essential, c) how that information will be mapped, and d) how that mapped information will be applied musically (e.g., as a controller of pitch, volume, timbre, panning, density of events, formal structure, etc.). We provide here some articles written by UCI researchers related to this effort, along with examples demonstrating basic concepts from early experiments in applying the motion capture data to gestural control of music.
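The mapping stage described above can be sketched in code. The following is a hypothetical illustration, not the actual MCM implementation: the marker name (`right_hand`), the coordinate ranges, and the linear-scaling strategy are all assumptions made for the example. It shows the general idea of selecting a marker, extracting coordinate information, and mapping each coordinate onto a musical control value.

```python
# Hypothetical sketch of a marker-to-music mapping, in the spirit of MCM.
# Marker names, coordinate ranges, and the linear mapping are illustrative
# assumptions, not details of the actual software.

def linear_map(value, in_lo, in_hi, out_lo, out_hi):
    """Scale value from [in_lo, in_hi] into [out_lo, out_hi], clamping to the range."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def map_frame(markers):
    """Map one frame of marker data (name -> (x, y, z) positions in mm)
    to MIDI-style musical control values."""
    x, y, z = markers["right_hand"]
    return {
        # vertical position of the hand controls pitch (MIDI note numbers)
        "pitch": round(linear_map(z, 800.0, 1800.0, 48, 84)),
        # lateral position controls stereo panning (0 = left, 127 = right)
        "pan": round(linear_map(x, -1000.0, 1000.0, 0, 127)),
        # forward distance from the body controls volume
        "volume": round(linear_map(y, 0.0, 600.0, 0, 127)),
    }

# One frame of (assumed) marker data: hand centered, mid-height, mid-reach.
frame = {"right_hand": (0.0, 300.0, 1300.0)}
controls = map_frame(frame)
```

In a real-time setting a function like `map_frame` would be called once per captured frame, and the resulting values sent on to a synthesizer; the choice of which coordinate drives which parameter is exactly the kind of decision MCM exposes to the artist/programmer.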
Last modified May 18, 2003.