Gestural Control of Music
Using the Vicon 8 Motion Capture System

Research Papers from the University of California, Irvine

Reports on Motion Capture Research at UCI
Conducted at the UCI Motion Capture Studio and the
Realtime Experimental Audio Laboratory (REALab)


These documents are in PDF format.


Gestural Control of Music Using the Vicon 8 Motion Capture System

Christopher Dobrian and Frédéric Bevilacqua

Proceedings of the New Interfaces for Musical Expression 2003 Conference, Montréal, Quebec, Canada

Abstract: This article reports on a project that uses unfettered gestural motion for expressive musical purposes. The project involves the development of, and experimentation with, software to receive data from a Vicon motion capture system and to translate and map that data into control data for music and other media such as lighting. In addition to the commercial MIDI standard, which allows direct control of external synthesizers, processors, and other devices, other mappings are used for direct software control of digital audio and video. This report describes the design and implementation of the software, discusses specific experiments performed with it, and evaluates its application in terms of aesthetic advantages and drawbacks.
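
A minimal sketch of the kind of translation this paper describes: scaling one motion capture coordinate into a MIDI control-change message. The marker, coordinate ranges, controller number, and the use of Python's mido library are illustrative assumptions, not the authors' actual software.

    # Hypothetical sketch: map a marker's height to a MIDI controller.
    # Ranges and the controller number are assumptions for illustration.
    import mido

    def scale_to_midi(value, lo, hi):
        """Linearly map value from [lo, hi] into 0-127, clamped."""
        normalized = (value - lo) / (hi - lo)
        return max(0, min(127, round(normalized * 127)))

    outport = mido.open_output()    # default system MIDI output
    hand_z_mm = 1350.0              # one frame of (hypothetical) capture data
    cc = scale_to_midi(hand_z_mm, 0.0, 2000.0)
    outport.send(mido.Message('control_change', control=1, value=cc))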


Mapping sound to human gesture: demos from video-based motion capture systems

Frédéric Bevilacqua, Christopher Dobrian, and Jeff Ridenour

Demo presented at the 5th International Workshop on Gesture and Sign Language based Human-Computer Interaction, 2003, Genoa, Italy

Abstract: We report on research into the gestural control of digital music currently being conducted at the University of California, Irvine. Experiments are performed with various motion capture systems, including a single-video-camera system and a 3D optical motion capture system (Vicon 8). We present a series of demos illustrating different approaches to motion analysis and gesture-to-sound mapping.


Aesthetic Considerations in the Use of "Virtual" Music Instruments

Christopher Dobrian

Proceedings of the Workshop on Current Research Directions in Computer Music, 2001, Institut Universitari de l'Audiovisual, Universitat Pompeu Fabra, Barcelona, Spain; also published in the Journal of the Society for Electro-Acoustic Music in the United States, Spring 2003

Abstract: Computer-mediated music control devices compel us to reexamine the relationship between performer and sound, the nature and complexity of which are theoretically unlimited. This essay attempts to formulate some of the key aesthetic issues raised by the use of new control interfaces in the development of new musical works and new performance paradigms: mapping the gesture-sound relationship, identifying successful uses of "virtual" instruments, questioning the role of "interactivity" in performance, and positing future areas of exploration.


3D motion capture data: motion analysis and mapping to music

Frédéric Bevilacqua, Jeff Ridenour, and David J. Cuccia

Proceedings of the Workshop/Symposium on Sensing and Input for Media-centric Systems 2002, Santa Barbara, California, USA

Abstract: We report on research into gesture analysis and its mapping to music. Various movements were recorded using a 3D optical motion capture system. Using this system, we produced animations from movement and dance and, in parallel, generated the soundtrack from the dancer's movements. Prior to the actual sound mapping process, we performed various motion analyses. We present here two methods, both independent of the specific orientation or location of the subject: the first deals with gestural segmentation, while the second uses pattern recognition.
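
The abstract does not detail the two analysis methods, but as one plausible illustration of an orientation- and location-independent feature, the sketch below segments a gesture stream at local minima of overall marker speed, a quantity unchanged by translating or rotating the subject. The frame rate, speed threshold, and data layout are assumptions.

    # Hypothetical sketch: orientation/location-independent segmentation.
    # Marker speed is invariant under rigid translation/rotation of the subject.
    import numpy as np

    def segment_boundaries(positions, fps=120.0, speed_floor=50.0):
        """positions: (frames, markers, 3) array in mm. Returns frame indices
        where mean marker speed (mm/s) dips to a local minimum below
        speed_floor, a simple cue for boundaries between gestures."""
        velocity = np.diff(positions, axis=0) * fps              # mm/s
        speed = np.linalg.norm(velocity, axis=2).mean(axis=1)    # per frame
        return [i for i in range(1, len(speed) - 1)
                if speed[i] < speed_floor
                and speed[i] <= speed[i - 1] and speed[i] <= speed[i + 1]]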


Virtual dance and music environment using motion capture

Frédéric Bevilacqua, Lisa Naugle, and Isabel Valverde

Proceedings of the IEEE Multimedia Technology and Applications Conference (MTAC), 2001, Irvine, California, USA

Abstract: We present a multimedia project incorporating music and dance. We used a 3D motion capture system to produce animations from dance and to generate the soundtrack from the dancer's movements. Movement analysis is performed to extract the important features of a particular gesture; based on the parameters chosen from this analysis, various mappings between gesture and music are applied. In particular, the motion capture data are used to trigger sounds and to modify their timbre. This paper describes the method and the interactive environment that are under development.
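
As a rough illustration of the trigger-and-timbre idea, the sketch below fires a MIDI note when hand speed rises through a threshold and maps hand height onto controller 74 (conventionally brightness). The threshold, ranges, and note mapping are assumptions, not the mappings used in the project.

    # Hypothetical sketch: trigger sounds and shape timbre from motion features.
    import mido

    TRIGGER_SPEED = 800.0   # mm/s; an upward crossing fires a note (assumed)
    outport = mido.open_output()
    previous_speed = 0.0

    def on_frame(hand_speed, hand_height_mm):
        """Called once per capture frame with precomputed motion features."""
        global previous_speed
        if previous_speed < TRIGGER_SPEED <= hand_speed:
            outport.send(mido.Message('note_on', note=60, velocity=100))
        # Map height (assumed 0-2000 mm) onto CC 74, often filter brightness.
        cc = max(0, min(127, round(hand_height_mm / 2000.0 * 127)))
        outport.send(mido.Message('control_change', control=74, value=cc))
        previous_speed = hand_speed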


Music control from 3D motion capture of dance

Frédéric Bevilacqua, Lisa Naugle, and Christopher Dobrian

Submitted for the CHI Workshop on New Interfaces for Musical Expression, 2001, Seattle, Washington, USA

Abstract: Research is currently being conducted at the University of California, Irvine to develop novel approaches to music performance and composition generated from dancers' gestures. We report here specifically on adapting the "Vicon 8" motion capture system to control digital music. This system allows a dancer's movement to be captured in 3D. Software is being developed in the "Max/MSP" environment to use the motion capture data for sound generation and/or alteration, either through MIDI parameters or by controlling signal-processing algorithms. This approach is promising for the extensive study of the many possible relationships between gesture and music. This paper describes the method currently under development and briefly discusses future directions for this work in progress.
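
The Max/MSP patches themselves are not reproduced here; as a language-neutral sketch of one conditioning step typically needed when low-rate capture data drives signal-processing parameters, the one-pole smoother below eases each new control value toward its target, avoiding audible stepping. The coefficient and the cutoff-frequency example are assumptions.

    # Hypothetical sketch: one-pole lowpass smoothing of control data before
    # it drives an audio parameter (e.g. a filter cutoff), to avoid zipper noise.
    class OnePoleSmoother:
        def __init__(self, coefficient=0.995):
            self.coefficient = coefficient   # nearer 1.0 = slower response
            self.state = 0.0

        def process(self, target):
            """Move the smoothed value a fraction of the way toward target."""
            self.state += (1.0 - self.coefficient) * (target - self.state)
            return self.state

    smoother = OnePoleSmoother()
    for raw_cutoff_hz in (200.0, 200.0, 1500.0, 1500.0):   # assumed targets
        print(smoother.process(raw_cutoff_hz))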


Last modified May 18, 2003.
Christopher Dobrian
dobrian@uci.edu