Assignments

Computer Audio and Music Programming
Music 147 / CompSci 190 / EECS 195 / ACE 277

University of California, Irvine
Winter Quarter 2007


Assignments for upcoming class sessions are posted on this page.


Thursday, March 15:

Final written exam.

For Tuesday, March 13:

Review all the material presented in this class, in preparation for the final written examination on Thursday March 15. You might want to consult the review page for last year's final exam in this class. And of course you should also consult the review page for this quarter's hypothetical midterm exam.

Note that the questions on those two pages are representative of some of the topics that might be covered on the exam, but they are not intended as a comprehensive listing of final exam topics. The final exam might include any of the topics from the full quarter of lectures and readings. Almost all questions will involve definition of terms or factual information about digital audio. Some questions on the exam might require that you perform some simple arithmetic (no calculator needed) to get the right answer. One or two questions will ask you to solve a programming problem.

For Thursday, March 8:

Read Programming New Realtime DSP Possibilities with MSP (1999) by Dobrian.

Read Strategies for Continuous Pitch and Amplitude Tracking in Realtime Interactive Improvisation Software (.pdf format) (2004) by Dobrian.

Listen to the two pieces by Dobrian that will be the primary topic of Thursday's lecture: Insta-pene-playtion for computer-processed flute, and Mannam (Encounter) for daegeum (Korean bamboo flute) and computer.

Write a C program that either generates a sound or modifies a sound. The sophistication of what you do will depend on your C programming skills. You can use some of the examples on the professor's page to get you started; however, if you use those examples, your own program should make some significant further development that shows that you have done some original thought (and hopefully your program will do something a bit more interesting and challenging than those examples).

To do actual input and/or output of audio on your computer, you will need to familiarize yourself with the PortAudio libraries for C programming of audio I/O. These libraries work for audio input/output on almost all major operating systems, with most major compilers (e.g., Visual Studio on Windows, Xcode on OS X, CodeWarrior on either platform, etc.). You can download these libraries to your own computer from the PortAudio download page. Included with the PortAudio package are documentation, tutorials, and examples that will help you understand how to use PortAudio.
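
As a complement to those examples, here is a minimal sketch of how a PortAudio program that synthesizes a continuous tone might be structured, assuming the current (V19) callback API. Error checking is omitted for brevity, and the 440 Hz frequency, buffer size, and variable names are just illustrative choices, not requirements of the assignment.

    /* Minimal sketch of a PortAudio sine-tone generator (illustrative only; no error checking). */
    #include <stdio.h>
    #include <math.h>
    #include "portaudio.h"

    #define SAMPLE_RATE 44100.0
    #define TWO_PI 6.283185307179586

    typedef struct { double phase; double freq; } SineData;

    /* Called by PortAudio whenever it needs more output samples. */
    static int sineCallback(const void *input, void *output,
                            unsigned long frameCount,
                            const PaStreamCallbackTimeInfo *timeInfo,
                            PaStreamCallbackFlags statusFlags,
                            void *userData)
    {
        SineData *data = (SineData *)userData;
        float *out = (float *)output;
        unsigned long i;
        for (i = 0; i < frameCount; i++) {
            *out++ = (float)(0.2 * sin(data->phase));   /* mono output at a modest amplitude */
            data->phase += TWO_PI * data->freq / SAMPLE_RATE;
            if (data->phase >= TWO_PI) data->phase -= TWO_PI;
        }
        return paContinue;
    }

    int main(void)
    {
        SineData data = { 0.0, 440.0 };   /* 440 Hz, an arbitrary example frequency */
        PaStream *stream;

        Pa_Initialize();
        Pa_OpenDefaultStream(&stream, 0, 1, paFloat32, SAMPLE_RATE,
                             256, sineCallback, &data);
        Pa_StartStream(stream);
        Pa_Sleep(3000);                   /* let it play for three seconds */
        Pa_StopStream(stream);
        Pa_CloseStream(stream);
        Pa_Terminate();
        return 0;
    }

Your own program should of course do something more interesting than a steady sine tone, but this shows where your synthesis or processing code goes: inside the callback, one buffer of samples at a time.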

PortAudio is already installed in the NACS Lab which is located in EG3151, for use with the Visual Studio programming environment. The PortAudio libraries on those computers are located at "D:\Apps\PortAudio".

If you have a Macintosh, you probably have the Xcode programming environment on the Developer disk that came with your computer (although you may never have installed it). If you are using Windows and don't have Visual Studio, you can download a free copy of Microsoft Visual Studio 2005 Express (and then install PortAudio with that).

It is also possible to use Java or C to program your own Max or MSP object. (This is only for people who are already quite comfortable with programming.) For Java, you write a class according to the conventions described in the document "WritingMaxExternalsInJava.pdf" (located in the "java-doc" folder inside the Max application folder), compile it as a .class file, and then load it into Max using the mxj (or mxj~) object.

To program a Max or MSP object in C, you will need to download and read the Max Software Developer Kit, located on the Cycling '74 product documentation page.

On your website, post your source code (.c file), the executable file (.exe, .app, etc.), and a brief description of what you were trying to do and how you did it.

For Tuesday, March 6:

Read "Fourier Analysis", Appendix of the textbook, pp. 1073-1112.

Read the first part of "Spectrum Analysis" Part IV, chapter 13 in the textbook, pp. 533-577.

Read the online document (in PDF format) "Fourier Notes (.pdf)" by Peter Elsea.

Study MSP Tutorials 25 (Using the FFT) and 26 (Frequency Domain Signal Processing with pfft~).
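
If it helps to see the Fourier transform outside of the graphical environment, here is a rough C sketch of a direct (slow, non-FFT) computation of the magnitude spectrum of a short signal, just to make the summation in the readings concrete; fft~ and pfft~ compute the equivalent result far more efficiently. The test signal and buffer size are arbitrary choices.

    /* Direct DFT magnitude computation, for illustration only (an FFT is much faster). */
    #include <stdio.h>
    #include <math.h>

    #define N 64
    #define TWO_PI 6.283185307179586

    int main(void)
    {
        double x[N], mag[N];
        int n, k;

        /* Fill the buffer with one cycle of a cosine, so bin 1 should dominate. */
        for (n = 0; n < N; n++)
            x[n] = cos(TWO_PI * n / N);

        /* For each analysis frequency k, sum the signal times a cosine and a sine. */
        for (k = 0; k < N; k++) {
            double re = 0.0, im = 0.0;
            for (n = 0; n < N; n++) {
                re += x[n] * cos(TWO_PI * k * n / N);
                im -= x[n] * sin(TWO_PI * k * n / N);
            }
            mag[k] = sqrt(re * re + im * im);
        }

        for (k = 0; k < N / 2; k++)
            printf("bin %2d: %f\n", k, mag[k]);
        return 0;
    }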

For Thursday, March 1:

Post on your web site the work you have achieved so far on your term project. This should include a more developed and specific description of your project -- what topic you want to explore, what materials you have researched to prepare for it, what program you plan to produce, and your plan (ideally including a detailed timeline of intermediate goals) for how to complete it by the end of the quarter -- as well as some example patches or code showing your programming work in progress. The idea is to show that you have a clear idea of what you are trying to do, what you need to do to accomplish it, and that you have work underway.

Take a look at, and try to understand all the details of, the file formats for AIFF and WAVE files, and for standard MIDI files, which we will discuss in class.
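
As a memory aid for the WAVE portion of that reading, the start of a canonical 16-bit PCM WAVE file can be pictured roughly as the C struct below. This is a sketch, not a complete parser: real files may contain additional chunks, all multi-byte fields are little-endian, and AIFF and standard MIDI files use different (big-endian) chunk layouts that we will look at separately.

    /* Rough layout of a minimal 16-bit PCM WAVE file header (little-endian fields, C99 types). */
    #include <stdint.h>

    struct MinimalWaveHeader {
        char     riffId[4];        /* "RIFF" */
        uint32_t riffSize;         /* file size minus 8 bytes */
        char     waveId[4];        /* "WAVE" */
        char     fmtId[4];         /* "fmt " */
        uint32_t fmtSize;          /* 16 for plain PCM */
        uint16_t audioFormat;      /* 1 = uncompressed PCM */
        uint16_t numChannels;      /* 1 = mono, 2 = stereo */
        uint32_t sampleRate;       /* e.g. 44100 */
        uint32_t byteRate;         /* sampleRate * numChannels * bitsPerSample/8 */
        uint16_t blockAlign;       /* numChannels * bitsPerSample/8 */
        uint16_t bitsPerSample;    /* e.g. 16 */
        char     dataId[4];        /* "data" */
        uint32_t dataSize;         /* number of bytes of sample data that follow */
    };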

For Tuesday, February 27:

Experiment with the sonic effect of one or more types of filter by feeding a sound (such as a pre-recorded sound file) into a filter object (such as lores~ or reson~ or comb~ or biquad~ or filtercoeff~ or onepole~ or svf~ or fffb~ or buffir~), and using a control signal (such as line~ or cycle~ or phasor~, or any other source) to vary some parameter of the filter (such as center frequency, Q, etc.). You can also consider using the delay objects (delay~, or tapin~ and tapout~) to construct your own delay/filter (such as flanging, chorusing, or a filter of your own design).

The goal of the assignment is a) to learn more about how different filters sound and which center frequency and Q settings are effective, and b) to make an intriguing sound with a dynamically controlled filter.

Post a plain text version of your patch on your web site, accompanied by a written description of what you were trying to achieve and how it works. If your patch requires a specific .aiff or .wav file in order to produce the sound you want, post that (small) soundfile as well, and include a link to it.

On the professor's page there are a few relevant examples from the classes on February 20 and February 22, and you can see still more examples on the professor's page from last year's class. This should provide enough food for thought to get you started. However, your patch should not be nearly identical to these; it should at least include some substantial improvement or elaboration.
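
If it helps to connect the MSP filter objects to the arithmetic they perform, here is a tiny C sketch of the kind of recurrence an object like onepole~ computes, with the smoothing coefficient swept over time the way a line~ control signal might sweep a cutoff. The coefficient mapping here is deliberately simplified and the numbers are arbitrary.

    /* Sketch of a one-pole lowpass whose coefficient changes over time (simplified mapping). */
    #include <stdio.h>
    #include <stdlib.h>

    #define SR 44100

    int main(void)
    {
        double y = 0.0;               /* previous output sample (the filter's memory) */
        int n;

        for (n = 0; n < SR; n++) {
            double x = ((double)rand() / RAND_MAX) * 2.0 - 1.0;  /* white noise input */

            /* Sweep the coefficient from heavy smoothing toward almost none over one second,
               analogous to using a control signal to open up the filter. */
            double a = 0.999 - 0.99 * ((double)n / SR);

            y = (1.0 - a) * x + a * y;   /* one-pole lowpass: y[n] = (1-a)x[n] + a*y[n-1] */

            if (n % 4410 == 0)
                printf("n = %5d  a = %.3f  y = %f\n", n, a, y);
        }
        return 0;
    }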

Familiarize yourself with the PortAudio libraries for C programming of audio I/O. These libraries should already be installed in the lab at 3151 Engineering Gateway, for use with the Visual Studio programming environment. You can download these libraries to your own computer from the PortAudio download page. Included with the PortAudio package are documentation, tutorials, and examples that will help you understand how to use PortAudio.

If you can't program in C but can program in Java, you should familiarize yourself with the Java Sound API, which you can read about at the sound API page of Sun's Java web site. I will not cover the Java Sound API specifically in class, but I will cover general issues of file and stream I/O for audio, and there are many examples on the web that can help you.

For Thursday, February 22:

Read the section on Digital Filters in the textbook, pages 397-419.

Read about the filter objects that are available in Max/MSP in Max and Filters (.pdf) by Peter Elsea of UC Santa Cruz.

Study MSP Tutorial 31 on comb filtering (the "comb~" object), as well as the help files for other MSP filter objects such as "reson~", "lores~", "teeth~", "allpass~", "fffb~", and the general-purpose filter object "biquad~" and its accompanying user interface object "filtergraph~" (and "filtercoeff~" for time-varying filtering). Other more advanced filter objects include "cascade~" and "zplane~".
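
The biquad~ object implements the standard two-pole, two-zero difference equation discussed in the filter reading. A hedged C sketch of that recurrence is below; the coefficient values are invented for illustration (in practice you would get them from filtergraph~ or filtercoeff~), and you should check the biquad~ help file for its own coefficient naming and ordering.

    /* Biquad difference equation: y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2].
       The coefficient values below are placeholders, not a designed filter. */
    #include <stdio.h>

    int main(void)
    {
        /* Feedforward (b) and feedback (a) coefficients: illustrative values only. */
        double b0 = 0.2, b1 = 0.4, b2 = 0.2, a1 = -0.5, a2 = 0.1;
        double x1 = 0.0, x2 = 0.0, y1 = 0.0, y2 = 0.0;   /* delayed input/output samples */
        int n;

        for (n = 0; n < 20; n++) {
            double x = (n == 0) ? 1.0 : 0.0;   /* an impulse, to print the impulse response */
            double y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;

            x2 = x1;  x1 = x;                  /* shift the delayed samples */
            y2 = y1;  y1 = y;

            printf("y[%2d] = %f\n", n, y);
        }
        return 0;
    }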

For Tuesday, February 20:

Read pages 432-440 of the textbook, regarding "Fixed Time Delay Effects" and "Variable Time Delay Effects".

Study the implementation of delay effects as demonstrated in MSP Tutorials 27 through 30. You might want to try building some delay effects of your own to confirm your understanding (for example, you could play a sound file and apply multiple echoes or combined delay effects -- perhaps occurring in different virtual spatial locations).

For more "fun with variable time delay", check out the example of Doppler effect using variable time delay, dopplerexample.txt, on the professor's web page. Study each part of the patch to try to understand the different aspects of the spatial motion effect.

For Thursday, February 15:

Study the list of midterm questions, answer as many as you are able, and research the ones you cannot answer. If, after researching them, you are unable to answer some of them, be prepared to ask about them in class on Thursday February 15. (This is much better than just ignoring the questions to which you don't know the answers, since any of these questions might appear on the final exam.)

Think about what you would like to do as a term project in audio/music programming. If you want to work in a collaborative team with one or two of your classmates, that's great; in that case you should consult with those people and form your team. Write a brief (one paragraph or so) proposal of what you want to do as a project. Once you have an idea of a subject you would like to pursue, you should do some research to find out a) the scope of the topic (it might be an enormous and highly developed topic, which is OK, but you might need to get more specific in that case), b) what the active questions and existing research findings already are in that topic, and c) what is an interesting and manageable subset of that topic to tackle as a programming project. Post your proposal (and your team members' names if you're working in a team) on your web page, and be ready to discuss it in class.

Some things to think about in your project are:
1) General topic that interests you: audio effects processing, spatialization, algorithmic composition, audio mixing/collage, sound/music synthesis, etc. Topics not yet covered in class are also fair game: applications of networking in audio/music, audio encoding/decoding, etc., but you will probably find it easiest to stick with a topic covered in class or in the textbook or both.
2) User interface: once you know what you want to accomplish, how will the user interact with the computer, both in terms of physical control and visual/sonic feedback?
3) Note that if you are more comfortable using C or Java, MaxMSP does have provisions for allowing you to include such code as a MaxMSP object.

For Tuesday, February 13:

No assignment. The topic of the class will be the computer music projects of TA Greg Elliott.

For Thursday, February 8:

Attend the Gassmann Electronic Music Series concert by Adam Rudolph and Michael Dessen, Wednesday, February 7 at 8:00 pm in Winifred Smith Hall (free event, no ticket required).

For Tuesday, February 6:

The topic of the lecture will be spatialization and reverberation.

Read "What is Ampitude?" by Jeffrey Hass.

Read "Sound Spatialization and Reverberation" by Curtis Roads in the textbook, part III, chapter 11, pp. 449-484.

Read "MIDI Panning", MSP Tutorial 22, in MSP Tutorials and Topics, pp. 178-184, and study the accompanying Max/MSP program in the Tutorial Patches folder.

Read "Psychoacoustics in Computer Music" by John W. Gordon in the textbook, part VII, chapter 23, pp. 1053-1069.

For Thursday, February 1:

Read about musical tuning and musical scales, beginning with the page on The Twelve-Tone Musical Scale by Keith Enevoldsen. Then read some more technical descriptions, such as the Wikipedia articles on "musical tuning", "Equal temperament" (focusing primarily on the information about 12-tone equal temperament), and "Mathematics of musical scales". Follow the available links for any terms whose meaning you are unsure of (such as "octave", "just intonation", etc.).
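
To make the equal-temperament arithmetic concrete: in 12-tone equal temperament each semitone multiplies frequency by the twelfth root of 2, so with A440 (MIDI note 69) as the reference, the frequency of MIDI note m is 440 * 2^((m-69)/12). The short C sketch below prints one octave of frequencies; the function is named after Max's mtof object, which performs the same conversion.

    /* Frequencies of 12-tone equal temperament, relative to A440 (MIDI note 69). */
    #include <stdio.h>
    #include <math.h>

    /* Convert a MIDI note number to frequency in Hz: each semitone is a factor of 2^(1/12). */
    double mtof(double midiNote)
    {
        return 440.0 * pow(2.0, (midiNote - 69.0) / 12.0);
    }

    int main(void)
    {
        int note;
        for (note = 60; note <= 72; note++)     /* one octave, from middle C up to the next C */
            printf("MIDI %d -> %.2f Hz\n", note, mtof(note));
        return 0;
    }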

For Tuesday, January 30:

Write a program that automatically plays a short melody on a synthesizer. You can use the built-in DLS synthesizer in your computer, or you can use any sort of soundmaker that you can design in MSP. (For ideas on designing your own instrument in MSP, take a look at MSP Tutorials 7, 11, 12, and 18-21. You can also read a pretty good 8-part tutorial on Synth-Building with Max/MSP by Darwin Grosse.) The definition of a "melody" should be interpreted very broadly to mean any sort of organized sequence of notes or other sound events. (So that means just about anything, as long as you can make a good case that it's an "organized sequence of events".)

There are many ways to automatically play a tune in Max: you can play a (format 0) MIDI file with the seq object, you can look up pitches (and velocities and durations) in table object(s) or a coll object, triggered by a metro -> counter combination, or you can try some other way of generating pitch numbers with counter, a mathematical formula, tempo, line, or ... . (If you have designed a synthesizer in MSP and want to play a melody on it, you can do this most easily by sending MIDI-style pitch numbers to the mtof object to translate the pitch into a frequency value.) Try to make your result sonically interesting and (arguably) musically interesting. This is not an easy assignment, so don't leave it until the night before it's due.

Read about modulation synthesis (part II, chapter 6) in the textbook, specifically pages 215-239. (Read on further if you'd like, of course.) To supplement that reading and to see how it is implemented in Max/MSP, study MSP tutorials 8, 9, 10, and 11.
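
As a concrete reference while reading the modulation chapter, here is a small C sketch of simple two-operator frequency modulation (implemented, as is common, as phase modulation), the technique the MSP tutorials build graphically. The carrier frequency, modulator frequency, and index values are arbitrary examples.

    /* Two-operator FM (phase modulation): the carrier's phase is pushed around by a
       modulating sinusoid.  All parameter values are arbitrary examples. */
    #include <stdio.h>
    #include <math.h>

    #define SR 44100
    #define TWO_PI 6.283185307179586

    int main(void)
    {
        double carrierFreq = 440.0;    /* carrier frequency in Hz */
        double modFreq     = 220.0;    /* modulator frequency in Hz (a 2:1 ratio) */
        double index       = 3.0;      /* modulation index: larger = more sidebands */
        int n;

        for (n = 0; n < 32; n++) {     /* print just the first few output samples */
            double t = (double)n / SR;
            double modulator = sin(TWO_PI * modFreq * t);
            double sample = sin(TWO_PI * carrierFreq * t + index * modulator);
            printf("%f\n", sample);
        }
        return 0;
    }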

For Thursday, January 25:

Read about MIDI in UCSC electronic music instructor Peter Elsea's article on MIDI, and on the "Exploring MIDI" website at Northwestern University, and in the textbook (part 6, chapter 21). These three articles discuss the same topic, at three different levels of technicality, so it is suggested that you read them in the order they are listed here.

To see how MIDI is handled by MaxMSP, I suggest that you consult the section on "MIDI Overview and Specification" in the manual called MaxGettingStarted.pdf in the MaxMSP "Documentation" folder. Then look at Max Tutorials 12, 13, and 16 in the manual called "Max46Tutorial.pdf".

For your own edification (not to hand in, but this will give you a head start on a future assignment), try building your own Max patch that uses MIDI. If you work at the Arts Media Center, you can use the MIDI synthesizer keyboards that are connected to the computers with MaxMSP. But even if you don't have any MIDI equipment available, you can still send MIDI data to the built-in synthesizer that's provided by your computer's operating system. MIDI output objects such as midiout, noteout, bendout, pgmout, ctlout, etc. can all send MIDI to that synth or any external synth. For example, create a 'noteout' object, then (in the locked patcher window) double-click and hold the noteout object and select your computer's DLS synth from the popup menu. That will direct the object's MIDI messages to that synth. You can then play notes on that synth by sending velocity and pitch messages into the second and first inlets (respectively) of the noteout object.
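
For reference while reading about the protocol itself: a MIDI note-on message is three bytes, a status byte (0x90 plus the channel number 0-15) followed by pitch and velocity data bytes. The small C sketch below just assembles and prints such a message; actually sending it would require a MIDI library or your operating system's MIDI services, which Max handles for you.

    /* Assemble the three bytes of a MIDI note-on message (printing only; no MIDI output here). */
    #include <stdio.h>

    int main(void)
    {
        unsigned char channel  = 0;     /* channels are 0-15 in the bytes, shown as 1-16 to users */
        unsigned char pitch    = 60;    /* middle C */
        unsigned char velocity = 100;   /* how hard the note is "struck" (1-127; 0 means note-off) */

        unsigned char message[3];
        message[0] = 0x90 | channel;    /* status byte: note-on on the given channel */
        message[1] = pitch;             /* first data byte */
        message[2] = velocity;          /* second data byte */

        printf("note-on bytes: 0x%02X 0x%02X 0x%02X\n", message[0], message[1], message[2]);
        return 0;
    }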

For Tuesday, January 23:

In the textbook, read pages 84-85 "Overview to Part II" (sound synthesis), one paragraph on page 89 "The Unit Generator Concept" (a conceptual description of MSP objects), pages 90-98 on waveform synthesis (the graphic notation for synthesis instruments looks a bit different than the way a network of objects looks in MSP, but it is conceptually identical), pages 117-128 on sampling synthesis (use of digitally recorded sounds to construct a computer instrument), and pages 134-144 on additive synthesis (combining simple tones to make a complex tone).

Make an MSP program that generates a sound (plays a sound from a file, plays sound from a RAM buffer, or synthesizes a sound from a tone-generating object), and that uses a control signal to modify that sound continuously (and automatically). Your program should also include some user interface that allows you to change the control signal, and thereby its effect on the sound.

Post your completed program as a plain text file (.txt) on your web site, and put a link to that file on your web page, along with a brief description of what you intend the program to do, how it works, and any other considerations you took into account when making it.

In doing this assignment it will be particularly useful for you to do MSP Tutorials 2, 3, and 6, and to take a look at the example files "vibrato" and "playrandomnotes" on the professor's example page and the example file speedchange.txt on the page of examples from the COSMOS 2005 summer class.
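
Returning to the additive-synthesis pages in the reading: the idea of summing simple tones to make a complex tone can be sketched in a few lines of C. The example below sums several harmonics of a fundamental with 1/n amplitudes (roughly the recipe for a sawtooth-like tone); the number of partials and the fundamental frequency are arbitrary choices, and it only prints sample values rather than producing audio.

    /* Additive synthesis sketch: sum several harmonics of a 220 Hz fundamental with 1/n amplitudes. */
    #include <stdio.h>
    #include <math.h>

    #define SR 44100
    #define TWO_PI 6.283185307179586
    #define NUM_PARTIALS 8

    int main(void)
    {
        double fundamental = 220.0;    /* arbitrary example fundamental frequency */
        int n, p;

        for (n = 0; n < 32; n++) {     /* print only the first few samples */
            double t = (double)n / SR;
            double sample = 0.0;
            for (p = 1; p <= NUM_PARTIALS; p++)
                sample += (1.0 / p) * sin(TWO_PI * fundamental * p * t);
            printf("%f\n", sample * 0.5);   /* scale down so the sum stays in a reasonable range */
        }
        return 0;
    }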

For Thursday, January 18:

Begin working with the Max/MSP programming environment. You can download Max/MSP to your own computer, or work in the Arts Media Center (Mac) on the second floor of the AITRC or in the Arts TEC on the first floor of the AITRC. (Soon it will be available in a NACS Windows lab as well.)

Taking the professor's class examples as a starting point, build a small Max/MSP program that either generates sound or plays back recorded sound, and has some onscreen control(s) that allow you to alter the sound. Some examples might be: mixing sounds together with control over the balance of the sound, dynamically altering the speed/pitch of the sound and/or the loudness of the sound, playing a sound repeatedly at a controllable rate, etc. Use your imagination and try to implement your ideas.

The result should be a single Max/MSP program that allows the user to perform a task for playing/altering some type of sound. (But if you have a more complicated idea and are able to accomplish it, go right ahead!) Save the file as plain text (choose Save As... from the File menu, specify Text as the file format, and end the file name with ".txt"), upload that text file to your web site, and put an obvious link to that file on your web page, accompanied by a description of the program and what it does, plus any other information you want to include. If you want to put a graphic picture ("screenshot") of your program on your site as a JPG file, in addition to the text file, go right ahead. (Experienced Max/MSP programmers can often tell what a program will do just by looking at it.)

The most accessible source of instruction in Max/MSP is the set of Tutorials ("Max Tutorials and Topics" and "MSP Tutorials and Topics") included in the product Documentation as PDF files with example programs (or downloadable separately). You are encouraged to work through these tutorials - as many as possible - on your own. Tutorials that are specific to this assignment are MSP Tutorials 1, 2, 7, and 16. For basic instruction in how Max works, look at Max Tutorials 1, 2, 4, 7, 10, and 15.

Remember that you can get quick help on any Max/MSP object by option-clicking (alt-clicking) on the object. You can also find out which object might be appropriate to any given task by looking up a relevant key word in the Max Object Thesaurus or MSP Object Thesaurus at the back of the Max Reference Manual or the MSP Reference Manual.

You can also post questions and ideas on the class NoteBoard.

For Tuesday, January 16:

Read about some of the popular commercial software for sound and music listed below. If you don't understand some of the terminology used in the descriptions, look it up, and/or ask about it on the course NoteBoard.

Read about the Audacity sound editing software. You might want to download the program and try it out to get a firsthand idea of how it works and what features it provides.

In addition to being a useful free program, Audacity is noteworthy in relation to this course because it is open source software. You can, if you want, download the source code for Audacity and study it.

Read about the Reason computer music software. It's like an entire electronic music studio in one application. You can download a demo version of this program, too.

(Optionally) Read about the Ableton Live software for composition and performance. Yes, you can download a demo version and the manual, if you want.

For Thursday, January 11:

Note: This class session and all future sessions will meet in Music and Media Building, Room 216.

Read the article "Digital Audio" by Christopher Dobrian.

Establish your web page for the course, and send the URL to the professor. UCI provides Web space for you to establish your own website at ea.uci.edu, or you can use any other Web space that is available to you. Wherever your site is hosted, you should create a page that is specifically devoted to this course.


This page was last modified on March 7, 2007.
Christopher Dobrian
dobrian@uci.edu