CHRISTOPHER DOBRIAN


Current Research Project


Interproviplaytions I-VI

A series of compositions for live performer and interactive computer system, focusing on realtime processing of flute sounds. Interproviplaytion I is for electric guitar, sampled flute, synthesizer, and computer. It was premiered by the composer in Buenos Aires, Argentina in October 1999. Insta-pene-playtion (Interproviplaytion II) is for computer-processed flute sounds. It was premiered at the SEAMUS 2001 conference in Baton Rouge, Louisiana in March 2001. Play for Me (Interproviplaytion III) is for flutes and an interactive audio processing system. It was premiered by Beate-Gabriela Schmitt in Irvine, CA in May 2001. In Tongues (Interproviplaytion IV) for flute and computer was premiered by James Newton at the Primavera en la Habana (Spring in Havana) conference in Havana, Cuba in March 2002. Trans (Interproviplaytion V) for electric guitar, sampled daegum (Korean flute), and computer was premiered by the composer in Seoul, Korea in June 2003. Mannam (Interproviplaytion VI) for Korean daegum and computer was premiered at the Seoul International Computer Music Festival in Seoul, Korea in November 2003.


Recent Research Projects


Motion Capture for Gestural Control of Music


Design of software for the realtime translation of motion capture data from a Vicon 8-camera system into control data for music and audio. Experimentation with effective methods of mapping motion capture data for musical input by dancers, actors, and/or audience members, to achieve truly interactive performances and installations in which human and computer participants contribute to the production of a music composition in real time. This work was done in collaboration with Frédéric Bevilacqua, with the assistance of Mark Magpayo and Maybelle Tan.
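
The mapping software itself was built in the Max/MSP environment, driven by data streamed from the Vicon system; the Python sketch below is only an illustration of the kind of mapping explored, with hypothetical scaling values and fabricated marker coordinates: a marker's height selects a pitch, and its frame-to-frame speed sets an intensity value.

```python
import math

def map_marker_to_controls(pos, prev_pos, dt,
                           pitch_range=(48, 84), room_height=2.5):
    """Map one motion-capture marker (x, y, z in metres) to musical controls.

    Hypothetical mapping: marker height chooses a MIDI pitch, and the
    marker's speed between frames sets a 0-127 'intensity' value.
    """
    x, y, z = pos

    # Height above the floor -> pitch, scaled into the given MIDI range.
    lo, hi = pitch_range
    pitch = lo + (hi - lo) * max(0.0, min(1.0, z / room_height))

    # Speed of the marker -> intensity (e.g. loudness or filter cutoff).
    speed = math.dist(pos, prev_pos) / dt           # metres per second
    intensity = max(0, min(127, int(speed * 40)))   # arbitrary scaling

    return int(round(pitch)), intensity

# Example: one frame of fabricated marker data at 120 frames per second.
print(map_marker_to_controls((0.4, 1.1, 1.8), (0.399, 1.098, 1.795), 1 / 120))
```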


Jitter Manual

Co-author of the Jitter user's manual, a set of tutorials (with online examples) for the Jitter matrix-data and video image processing extensions of the Max computer music programming language.


Video image produced with Jitter


Internet Pianos

Experimental performances with Yamaha Disklavier pianos in remote locations communicating via the Internet. The first performance was a live simulcast of a recital by Italian pianist Marino Formenti on October 5, 2001: Formenti performed works by Karlheinz Stockhausen and Morton Feldman at the Orange County Museum of Art while his performance was duplicated in real time at the University of California, Irvine. The second performance was a duo recital on October 17, 2001 by stellar jazz pianists Kei Akagi (performing in Irvine, CA) and Anthony Davis (performing in La Jolla, CA), using interactive software written by UC professors Christopher Dobrian (UCI), Miller Puckette (UCSD), and others.
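
The transport actually used between the two sites is not described here; purely as a rough illustration of the idea of relaying a performance from one Disklavier to another, the sketch below assumes MIDI-like note events carried as JSON over UDP, with a hypothetical remote host name and port.

```python
import json
import socket

REMOTE = ("remote-piano.example.edu", 9000)   # hypothetical remote site

def send_note(sock, pitch, velocity, note_on=True):
    """Forward one keyboard event to the remote site as a small JSON/UDP packet."""
    msg = {"type": "note_on" if note_on else "note_off",
           "pitch": pitch, "velocity": velocity}
    sock.sendto(json.dumps(msg).encode("utf-8"), REMOTE)

def receive_loop(port=9000):
    """Receive events from the remote site; a real system would relay each one
    to the local Disklavier instead of printing it."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        print("play locally:", json.loads(data.decode("utf-8")))

# Usage (not run here): each site runs receive_loop() in one thread and calls
# send_note() whenever its own piano reports a key event.
# sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_note(sender, 60, 90)
```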


Motion Tracking for Musical Input


Experimentation with uses of motion tracking systems for musical input by dancers, actors, and/or audience members, to achieve truly interactive performances and installations in which human and computer participants contribute to the production of a music composition in real time. These experiments were conducted using Max/MSP, Very Nervous System (VNS), BigEye, and the Vicon 8 system. Co-researchers were Lisa Naugle, Frédéric Bevilacqua, Gene Wie, and Cayci Suitt.
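
Camera-based systems such as VNS and BigEye report how much movement is seen in regions of the video image; the sketch below is not the actual patches, only an illustration of the frame-differencing idea behind that kind of measurement.

```python
import numpy as np

def region_activity(prev_frame, frame, grid=(4, 4)):
    """Estimate how much motion occurred in each cell of a grid laid over the
    camera image (frames are 2-D arrays of greyscale values, 0-255).
    Returns one value per cell in the range 0.0-1.0."""
    diff = np.abs(frame.astype(float) - prev_frame.astype(float)) / 255.0
    rows, cols = grid
    h, w = diff.shape
    activity = np.zeros(grid)
    for r in range(rows):
        for c in range(cols):
            cell = diff[r * h // rows:(r + 1) * h // rows,
                        c * w // cols:(c + 1) * w // cols]
            activity[r, c] = cell.mean()
    return activity

# Fabricated example: motion appears only in the upper-left of a 240x320 image,
# so only the first grid cell reports significant activity.
before = np.zeros((240, 320), dtype=np.uint8)
after = before.copy()
after[:60, :80] = 200
print(np.round(region_activity(before, after), 2))
```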



Microepiphanies


A computerized musical-theatrical "digital opera" by Douglas-Scott Goheen and Christopher Dobrian in collaboration with other UCI faculty and students; a multi-media performance in which music, lights, video, and environmental elements are all under the realtime control of computers, engagingly and intelligently interacting with the live performers. Premiered April 12, 2000 at the University of California, Irvine.

Goheen and Dobrian continue to collaborate on a new digital opera using many of the techniques and technologies developed in Microepiphanies.



Continually Variable Tunings and Timbres Realizable by Computer

A presentation at the MicroFest 2001 conference, demonstrating formulae for music in which the scales and tuning systems vary continuously throughout a piece, and methods of calculating artificial spectra for computer-synthesized sounds, such that the timbres are always tuned to the current tuning system of the music.

Abstract

Computer sound synthesis provides great potential for experimentation with novel musical tuning systems, a fact which has been known for decades. Yet only a fraction of this potential has been explored, due to conceptual restrictions which are inherent in most paradigms of tuning based on acoustic instruments. Computers provide the opportunity to explore abstract concepts that are not readily realizable by physical means, such as tuning systems that change gradually and continually during the course of a piece of music, scale steps that change continually in size, and timbres that can be designed specifically to suit the tuning system in use and can defy physical acoustic principles.

The author presents selected methods of tuning which are uniquely realizable by computer, focusing on tunings that can change gradually during the course of a musical passage. He also makes a case for the use of computer synthesis to design timbres that contain frequency relationships which correspond to the tuning system in use in the music. The article is accompanied by a CD of sound examples and musical excerpts.
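
By way of illustration only (this is a simplified sketch, not the formulae from the presentation), the Python fragment below tunes a scale whose step size glides from 100 cents (12-tone equal temperament) to 120 cents (10-tone equal temperament) over the course of a passage, and builds an artificial spectrum from degrees of whatever scale is in force at that moment, so that the timbre stays in tune with the tuning.

```python
import math

def degree_to_freq(degree, t, duration,
                   base_freq=261.63, start_cents=100.0, end_cents=120.0):
    """Frequency of a scale degree at time t in a tuning whose step size
    changes continuously: here the steps stretch linearly from 100 cents
    (12-tone equal temperament) to 120 cents (10-tone equal temperament)
    over the course of the passage."""
    cents_per_step = start_cents + (end_cents - start_cents) * (t / duration)
    return base_freq * 2 ** (degree * cents_per_step / 1200.0)

def matched_spectrum(degree, t, duration, n_partials=4):
    """An artificial spectrum whose 'partials' are other degrees of the scale
    currently in force, so the timbre is always in tune with the tuning.
    Stacking partials seven scale steps apart is an illustrative choice only."""
    return [degree_to_freq(degree + k * 7, t, duration) for k in range(n_partials)]

# The pitch of degree 12 (nominally an octave above middle C) drifts upward
# as the step size grows during a 60-second passage (about 523, 561, 601 Hz):
for t in (0.0, 30.0, 60.0):
    print(round(degree_to_freq(12, t, 60.0), 2))

# A four-partial spectrum built on degree 0 at the start of the passage:
print([round(f, 1) for f in matched_spectrum(0, 0.0, 60.0)])
```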


Granular Windowing Techniques in MSP

Algorithms for realtime granular processing of audio, explicated in the article "Programming New Realtime DSP Possibilities with MSP" (1999), published in the proceedings of the 2nd COST G-6 Workshop on Digital Audio Effects (DAFx99), NTNU, Trondheim, Norway.
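
As a point of reference (the article's MSP patches operate sample by sample on live input, whereas this is an offline sketch), the fragment below time-stretches a signal by overlap-adding Hann-windowed grains, the basic granular windowing operation.

```python
import numpy as np

def granular_stretch(source, stretch=2.0, grain_size=1024, hop=256):
    """Time-stretch a signal by overlap-adding Hann-windowed grains.

    Grains are read from `source` at one rate and written out at another --
    a generic granular technique, simplified for illustration.
    """
    window = np.hanning(grain_size)
    out_len = int(len(source) * stretch) + grain_size
    out = np.zeros(out_len)
    write_pos = 0
    while write_pos + grain_size < out_len:
        read_pos = int(write_pos / stretch)
        if read_pos + grain_size > len(source):
            break
        out[write_pos:write_pos + grain_size] += source[read_pos:read_pos + grain_size] * window
        write_pos += hop
    return out

# Example: stretch one second of a 440 Hz test tone to roughly twice its length.
sr = 44100
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
print(len(tone), "->", len(granular_stretch(tone)))
```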


Automated Harmonization of Melody in Real Time

Pedro Eustache

An interactive computer system, developed in collaboration with flutist/composer Pedro Eustache, for realtime melodic analysis and harmonic accompaniment. Based on a novel scheme of harmonization devised by Eustache, the software analyzes the tonal melodic function of incoming notes and instantaneously performs an orchestrated harmonization of the melody. The software was originally designed for performance by Eustache on the Yamaha WX7 wind controller, and was used in his composition "Tetelestai", premiered in Irvine, CA in March 1999.
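
Eustache's harmonization scheme itself is not described here; purely as an illustration of the general task, the sketch below classifies an incoming note's scale degree in a given key and answers immediately with a diatonic triad.

```python
# Generic illustration only -- not Eustache's scheme: classify an incoming
# note's scale degree in a major key and answer with a triad built on it.

MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]          # pitch classes of a major key
# The triad built on each scale degree (I, ii, iii, IV, V, vi, vii dim).
DEGREE_TO_TRIAD = [[0, 4, 7], [2, 5, 9], [4, 7, 11], [5, 9, 0],
                   [7, 11, 2], [9, 0, 4], [11, 2, 5]]

def harmonize(midi_pitch, key_root=0):
    """Return a triad (as MIDI pitches near middle C) for an incoming note,
    or None if the note is not in the key."""
    pc = (midi_pitch - key_root) % 12
    if pc not in MAJOR_SCALE:
        return None
    degree = MAJOR_SCALE.index(pc)
    return [60 + (p + key_root) % 12 for p in DEGREE_TO_TRIAD[degree]]

# An incoming E (MIDI 76) in C major is the third scale degree; this toy
# harmonizer answers with the triad built on E (E, G, B).
print(harmonize(76))    # -> [64, 67, 71]
```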



MSP: The Documentation

Author of the original user's manual, reference, and tutorial (with online examples) for the MSP audio signal processing extensions of the Max computer music programming language.


Frequency modulation synthesis in Max with MSP
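
For readers unfamiliar with the technique pictured above: simple frequency modulation places a sine modulator inside the phase of a sine carrier, producing sidebands at the carrier frequency plus and minus whole multiples of the modulator frequency. The short sketch below computes that waveform directly, with illustrative parameter values rather than any taken from the MSP tutorial.

```python
import math

def fm_sample(t, carrier=440.0, modulator=220.0, index=3.0):
    """One sample of simple frequency modulation: a sine carrier whose phase
    is modulated by a second sine; the modulation index controls how much
    energy goes into the sidebands."""
    return math.sin(2 * math.pi * carrier * t
                    + index * math.sin(2 * math.pi * modulator * t))

# A tenth of a second of the waveform at 44.1 kHz (sample values in -1..1).
sr = 44100
samples = [fm_sample(n / sr) for n in range(sr // 10)]
print(min(samples), max(samples))
```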


Talk to Me

A computer installation with animations by Daniel Beck, sculptural design by UCI professor Douglas-Scott Goheen, and interactive MSP audio processing by Christopher Dobrian. Sounds "heard" by the computer trigger changes in the animation, and are processed and incorporated into a musical fabric made from sounds of the immediate environment.
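
The installation's listening was done in MSP; the sketch below only illustrates one simple form such "hearing" might take: a smoothed loudness envelope, computed block by block, of the kind that could drive parameters of the animation (block size and smoothing values are arbitrary).

```python
import numpy as np

def envelope(signal, block=512, smooth=0.9):
    """Track the loudness of incoming audio block by block, returning one
    smoothed RMS value per block (a one-pole lowpass on the RMS stream)."""
    env, value = [], 0.0
    for start in range(0, len(signal) - block, block):
        rms = float(np.sqrt(np.mean(signal[start:start + block] ** 2)))
        value = smooth * value + (1.0 - smooth) * rms
        env.append(value)
    return env

# Fabricated input: half a second of silence followed by half a second of noise;
# the envelope rises only once the "environmental" sound appears.
sr = 44100
audio = np.concatenate([np.zeros(sr // 2), np.random.uniform(-1, 1, sr // 2)])
print([round(v, 3) for v in envelope(audio)[::20]])
```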


"Lightheads", from animation by Daniel Beck


Christopher Dobrian
January 7, 2004
dobrian@uci.edu