These documents are in PDF format.
Abstract: This article reports on a project that uses unfettered gestural motion for expressive musical purposes. The project involves the development of, and experimentation with, software to receive data from a Vicon motion capture system, and to translate and map that data into data for the control of music and other media such as lighting. In addition to the commercially standard MIDI - which allows direct control of external synthesizers, processors, and other devices - other mappings are used for direct software control of digital audio and video. This report describes the design and implementation of the software, discusses specific experiments performed with it, and evaluates its application in terms of aesthetic pros and cons.
Abstract: We report research on gestural control of digital music, currently performed at the University of California Irvine. Experiments with various motion capture systems are performed, including a single-video-camera system and a 3D optical motion capture system (Vicon 8). We present a series of demonstrations of various approaches to motion analysis and gesture-to-sound mapping.
Abstract: Computer-mediated music control devices compel us to reexamine the relationship between performer and sound, the nature and complexity of which is theoretically unlimited. This essay attempts to formulate some of the key aesthetic issues raised by the use of new control interfaces in the development of new musical works and new performance paradigms: mapping the gesture-sound relationship, identifying successful uses of "virtual" instruments, questioning the role of "interactivity" in performance, and positing future areas of exploration.
Abstract: We report research performed on gesture analysis and mapping to music. Various movements were recorded using 3D optical motion capture. Using this system, we produced animations from movement/dance and generated, in parallel, the soundtrack from the dancer's movements. Prior to the actual sound mapping process, we performed various motion analyses. We present here two methods, both independent of the specific orientation or location of the subject. The first deals with gestural segmentation, while the second uses pattern recognition.
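The abstract above mentions motion analyses that are independent of the subject's orientation or location. As one illustrative sketch (not necessarily the authors' actual method), a gesture stream can be segmented at moments of near-zero speed, since frame-to-frame speed is invariant to where the dancer stands or which way they face; the threshold and function names here are hypothetical:

```python
import math

def speeds(positions, dt):
    """Frame-to-frame speeds from a sequence of 3D marker points.
    Speed is invariant to the subject's absolute location and
    orientation, unlike the raw coordinates themselves."""
    out = []
    for p0, p1 in zip(positions, positions[1:]):
        out.append(math.dist(p0, p1) / dt)
    return out

def segment_at_pauses(positions, dt, threshold=0.05):
    """Split a movement into gesture segments wherever the speed
    drops below a threshold (a hypothetical pause criterion)."""
    segments, current = [], [positions[0]]
    for p, v in zip(positions[1:], speeds(positions, dt)):
        current.append(p)
        if v < threshold and len(current) > 1:
            segments.append(current)
            current = [p]
    if len(current) > 1:
        segments.append(current)
    return segments
```

A real system would likely smooth the speed curve first to avoid spurious segment boundaries from sensor noise.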
Abstract: We present a multimedia project incorporating music and dance. We used a 3D motion capture system to produce animations from dance and generate the soundtrack from the dancer's movements. Movement analysis is performed to extract the important features of a particular gesture. Based on the parameters chosen from this analysis, various mappings between gesture and music are applied. In particular, the motion capture data is used to trigger and modify the timbre of sounds. This paper describes the method and the interactive environment that are under development.
Abstract: Research is currently being conducted at the University of California Irvine to develop novel approaches to music performance and composition generated from a dancer's gestures. We report here specifically on the modification of the "Vicon 8" motion capture system to control digital music. This system allows for the capture of a dancer's movement in 3D. Software is being developed, in the "Max/MSP" environment, to use the motion capture data for sound generation and/or alteration, through MIDI parameters or by controlling signal processing algorithms. This approach is promising for the extensive study of various possible relationships between gesture and music. This paper describes the method currently under development, and briefly discusses future directions of this work in progress.
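Several of the abstracts above describe mapping motion capture data to MIDI parameters. A minimal sketch of that idea, assuming a simple linear mapping (the marker names, ranges, and controller assignment here are hypothetical, not taken from the papers):

```python
def scale_to_midi(value, lo, hi):
    """Linearly map a coordinate from [lo, hi] onto the MIDI
    7-bit range 0-127, clamping out-of-range input."""
    if hi == lo:
        return 0
    t = (value - lo) / (hi - lo)
    return max(0, min(127, round(t * 127)))

def hand_height_to_cc(y_mm, floor_mm=0.0, ceiling_mm=2000.0):
    """Hypothetical mapping: a dancer's hand height in millimeters
    becomes a MIDI control-change value, e.g. for filter cutoff."""
    return scale_to_midi(y_mm, floor_mm, ceiling_mm)
```

In a Max/MSP patch the same scaling would typically be done with a `scale` or `zmap` object before sending the value to a `ctlout` object.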
Last modified May 18, 2003.
Christopher Dobrian
dobrian@uci.edu