Gestural Control of Music
Using the Vicon 8 Motion Capture System

Christopher Dobrian and Frédéric Bevilacqua

A Demonstration for NIME 2003


Motion Capture Music


This page explains and demonstrates a research project at the University of California, Irvine, exploring the gestural control of music with motion capture.

In particular, we provide details about custom software that converts data from a Vicon 8 motion capture system into musical control data, and we show examples of the software and its musical results.

The software, called MCM (Motion Capture Music), is under development for both Windows and Macintosh platforms. The Macintosh version, developed using Max/MSP/Jitter, is the primary focus of the discussion and examples here.

This demonstration is provided for the 2003 International Conference on New Interfaces for Musical Expression (NIME 2003), and supplements the article in the NIME 2003 proceedings entitled "Gestural Control of Music Using the Vicon 8 Motion Capture System", which is available here in PDF format.


The central issues addressed by this research are:

  1. How can the stream of motion capture data, which may contain upwards of 90 location parameters per frame at frame rates of 30 or more frames per second, be parsed effectively for musical control?
  2. What other types of useful information (velocity, acceleration, direction, gesture boundaries, gestural qualities, etc.) can be derived from these location parameters? (See the sketch following this list.)
  3. What are the most intuitive and/or effective mappings of bodily location parameters (and other derived information) to musical meaning for composition and/or improvisation?
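
As an illustration of item 2, the sketch below derives velocity and acceleration for a single marker by finite differences over successive frames, and computes a scalar speed that could be used to detect gesture boundaries (moments of near-zero motion). It is a minimal example in Python; the frame rate, function names, and data layout are assumptions made for illustration and are not taken from the MCM software itself.

    FRAME_RATE = 30.0          # assumed capture rate, frames per second
    DT = 1.0 / FRAME_RATE      # time between successive frames

    def derive_motion(prev, curr, prev_velocity=None):
        """Given one marker's (x, y, z) position in two successive frames,
        return its velocity vector, plus an acceleration vector if the
        velocity from the previous frame pair is supplied."""
        velocity = tuple((c - p) / DT for c, p in zip(curr, prev))
        if prev_velocity is None:
            return velocity, None
        acceleration = tuple((v - pv) / DT for v, pv in zip(velocity, prev_velocity))
        return velocity, acceleration

    def speed(velocity):
        """Scalar magnitude of a velocity vector; near-zero values can
        mark boundaries between gestures."""
        return sum(v * v for v in velocity) ** 0.5

    # One marker observed in three successive frames (coordinates in mm).
    frames = [(100.0, 950.0, 300.0), (104.0, 948.0, 310.0), (112.0, 944.0, 326.0)]

    v1, _ = derive_motion(frames[0], frames[1])
    v2, a2 = derive_motion(frames[1], frames[2], prev_velocity=v1)
    print("velocity:", v2, "acceleration:", a2, "speed:", round(speed(v2), 1))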

In this report we identify two stages of work in this area: software to parse and map the motion capture data, and experiments to test its use for musical control.

The MCM software provides a way for an artist/programmer (or an artist-programmer team) to identify a) which location markers and coordinates to use, b) what information about those markers is essential, c) how that information will be mapped, and d) how that mapped information will be applied in music (e.g., as a controller of pitch, volume, timbre, panning, density of events, formal structure, etc.). We provide here some articles written by researchers at UCI related to this effort, and some examples demonstrating basic concepts from early experiments in applying the mocap data to gestural control of music.
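
To make the mapping stage (items c and d above) concrete, the sketch below scales a derived motion parameter into a musical control range, expressed here as MIDI-style pitch and velocity values. The particular mappings (marker height to pitch, marker speed to loudness) and the numeric ranges are illustrative assumptions, not the mappings built into MCM.

    def scale(value, in_low, in_high, out_low, out_high):
        """Linearly map value from [in_low, in_high] to [out_low, out_high],
        clamping to the output range."""
        if in_high == in_low:
            return out_low
        t = (value - in_low) / (in_high - in_low)
        t = max(0.0, min(1.0, t))
        return out_low + t * (out_high - out_low)

    # Assume the height (z, in mm) of a hand marker controls pitch and its
    # speed (mm per second, derived as in the previous sketch) controls loudness.
    hand_height = 1250.0
    hand_speed = 800.0

    pitch = int(scale(hand_height, 500.0, 2000.0, 48, 84))     # C3 .. C6
    loudness = int(scale(hand_speed, 0.0, 3000.0, 0, 127))     # MIDI velocity range

    print("note:", pitch, "velocity:", loudness)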


MCM

Articles

Examples


Last modified May 18, 2003.
Christopher Dobrian
dobrian@uci.edu