For Thursday, January 10:
Note: This class session and all future sessions will meet in Music and Media Building, Room 216.
Purchase the textbook for the class.
Read the article "Digital Audio" by Christopher Dobrian.
For Tuesday, January 15:
Begin working through the MSP tutorials in the document "MSP2.pdf" in the Documentation folder in the Max folder. The example programs for the tutorials can be found in the MSP Tutorial folder in the Tutorials folder in the Max folder. If you have trouble understanding the basic operations of Max objects, you can consult the other Max documentation manuals, particularly "Max4GettingStarted.pdf", "Max4Reference.pdf", and "Max4TutorialsAndTopics.pdf" (and the example programs in the Max Tutorial folder in the Tutorials folder in the Max folder). Come to class with your questions, discoveries, and problems.
For Thursday, January 17:
Read chapters 2 and 3 of the textbook. Bring questions to class for discussion.
Establish your web page for the course, and send the URL to the professor.
Continue working through the MSP tutorials to at least Tutorial 11 (FM synthesis). As you encounter Max objects that you don't understand, there are three good ways you can learn more about them. You can option-click on the object to open its online help file, you can look the object up in the main Max reference documentation "Max4Reference.pdf", and often you can find a further description of the object in "Max4TutorialsAndTopics.pdf".
For Tuesday, January 22:
Read chapter 4 and the first half of chapter 5 (through page 139) of the textbook, focusing particularly on the basic principles of synthesis, additive synthesis, and simple FM synthesis.
Design an "instrument" in MSP to synthesize a novel kind of sound using either additive synthesis or FM synthesis. Make some sort of simple user interface (button, keyboard, etc.) for playing notes (random pitches, specific pitches, or a sequence of pitches) on your instrument. Use the line~ object (and the function~ object, if you want) to control the sound's amplitude. You can use MSP Tutorials 7 and 11 as starting points, but try to develop your instrument in some way that goes beyond what's presented in the tutorials. When you have finished, save the file as a plain text file (using the "Save As..." command in the File menu of Max and choosing the "Max text file" option from the resulting pop-up menu), post that file on your web site, and post an obvious link to it on your web page. Come to class prepared to explain and discuss your program.
Wednesday, January 23:
Attend the free Gassmann Electronic Music Series concert by Jane Rigler at 8:00 pm in Room 220A of the Music and Media Building.
For Thursday, January 24:
Design an MSP "instrument" that uses sound read in from a soundfile--or sound read into a buffer--and modifies the sound using modulation, windowing, or some other "abnormal" method of reading the audio. For soundfile playback, use "sfplay~". For playback from a memory buffer, use "buffer~" and a buffer-reading object such as "index~", "play~", "groove~", "wave~", "cycle~", "lookup~", or ... When you have finished, save the file as a plain text file (using the "Save As..." command in the File menu of Max and choosing the "Max text file" option from the resulting pop-up menu), post that file on your web site, and post an obvious link to it on your web page. Come to class prepared to explain and discuss your program.
For Tuesday, January 29:
Read chapter 6 of the textbook, on "subtractive synthesis" and digital filters. You may omit sections 6.8-6.10. In sections 6.11-6.13 you should understand the basic filter equations, but you need not examine every equation of every filter in detail. The idea is to understand the basic filter equation, the audible effects of different types of filter (lowpass, highpass, bandpass, etc.), and such concepts as spectrum, center frequency, bandwidth, passband, stopband, etc. There are excellent supplementary readings in Roads's Computer Music Tutorial and Strawn's Digital Audio Signal Processing, both of which are on reserve in the Arts Media Center.
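A "basic filter equation" is simply a difference equation combining current input with past input and/or output samples. As a minimal plain-C illustration (not the textbook's exact notation, and not MSP code), one common form of a one-pole lowpass filter:

```c
/* One-pole lowpass filter: y[n] = (1 - c)*x[n] + c*y[n-1]
   where 0 <= c < 1. Larger c means more of the previous output is
   retained, i.e. a lower cutoff frequency (stronger smoothing). */
typedef struct {
    double c;    /* feedback coefficient, 0 <= c < 1 */
    double y1;   /* previous output sample y[n-1] */
} onepole;

/* Process one input sample, return one output sample. */
double onepole_tick(onepole *f, double x)
{
    f->y1 = (1.0 - f->c) * x + f->c * f->y1;
    return f->y1;
}
```

Because high frequencies change rapidly from sample to sample, the averaging in the feedback term attenuates them while passing slow (low-frequency) variation: the audible effect of a lowpass filter.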
For Thursday, January 31:
Use delays, echoes, flanging, chorusing, or filtering to modify a prerecorded sound in some interesting or useful way in MSP. For delays, see MSP Tutorials 27 and 28 (tapin~ and tapout~) as well as the delay~ object. For flanging and chorusing, see MSP Tutorials 29 and 30. For filtering, see such objects as comb~, reson~, biquad~, and lores~. You can use your previous assignment to generate the original sound to be delayed/filtered, if you'd like. When you have finished, save the file as a plain text file (using the "Save As..." command in the File menu of Max and choosing the "Max text file" option from the resulting pop-up menu), post that file on your web site, and post an obvious link to it on your web page. Come to class prepared to explain and discuss your program.
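All of these effects rest on the same mechanism: a circular buffer that delays the signal, optionally with feedback. A minimal plain-C sketch of that idea, not MSP code (the struct and function names are illustrative); it corresponds conceptually to what tapin~/tapout~ with feedback, or comb~, do internally:

```c
/* Circular-buffer delay line with feedback: the core of echo and
   comb-filter effects. The buffer must start zeroed. */
#define MAXDELAY 44100          /* 1 second at 44.1 kHz */

typedef struct {
    double buf[MAXDELAY];       /* circular sample buffer */
    long   len;                 /* delay time in samples (< MAXDELAY) */
    long   widx;                /* current write position */
    double feedback;            /* feedback gain, 0 <= feedback < 1 */
} delayline;

/* Process one input sample, return the delayed output sample. */
double delay_tick(delayline *d, double x)
{
    long ridx = d->widx - d->len;        /* read position, len behind */
    if (ridx < 0) ridx += MAXDELAY;      /* wrap around the buffer */
    double out = d->buf[ridx];
    d->buf[d->widx] = x + d->feedback * out;  /* write input + feedback */
    d->widx = (d->widx + 1) % MAXDELAY;
    return out;
}
```

A fixed long delay gives echo; a short delay whose length is slowly modulated gives flanging; several modulated short delays mixed together give chorusing.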
For Tuesday, February 5:
Read section 7.2 of the textbook, on applications of the Fourier transform, and read all of chapter 10 on "Reverberation, Audio Localization, and Other Sound-Processing Techniques".
For Thursday, February 7:
Post a thorough description of your proposed midterm project on your website. Come to class prepared to discuss it, and to discuss specific approaches for implementing it.
Tuesday, February 12:
Guest lecture: Professor Oscar Pablo Di Liscia, Universidad Nacional de Quilmes (Buenos Aires, Argentina), will lecture on "WDSPA: a program for Sound spatialisation using Ambisonics".
Pablo Di Liscia is Director of the Program in Electronic Composition at the Universidad Nacional de Quilmes (Argentina). He is a composer and computer music researcher, and is currently also a Research Associate at the Laboratorio de Producción e Investigación Musical (LIPM, Buenos Aires, Argentina). He will speak about his own musical and research work, and will describe the academic program and current research at UNQ.
Wednesday, February 13:
Attend the ITAC lecture by Pablo Di Liscia on the subject of "Research, Production and Education in Electronic Music at Universidad Nacional de Quilmes (Argentina)" at 5:00 pm in Room 316 of the Music and Media Building.
For Thursday, February 14:
Continue to work on your midterm project. Read chapter 12 of the textbook (sections 12.1 through 12.4) on realtime control, paying particular attention to the discussion of MIDI in section 12.3.
For Tuesday, February 19:
Complete your midterm project, and be prepared to give a very brief and concise (7-8 minutes) presentation of it in class. You may need to rehearse your presentation to be sure you have a good sense of how much you can say in that time.
Wednesday, February 20:
Recommended: Attend the Gassmann Electronic Music Series concert by innovative electric guitarist Fred Frith at 8:00 pm in Winifred Smith Hall, adjacent to the Music and Media Building. The concert is FREE!
For Thursday, February 21:
In class we will hear continued presentations of midterm projects.
Post on your website a proposed topic for your final programming project. The final programming project should be an MSP object that serves as a useful tool for a specific musical audio-processing purpose (or set of purposes). You should determine a topic area (spatialization, filtering, spectral processing, etc.), a useful task within that area, resources for studying the problem and researching prior work, and approaches for implementation.
Wednesday, February 27:
Attend the Gassmann Electronic Music Series concert of music for trumpet and electronics by trumpeter Graham Ashton at 8:00 pm in Winifred Smith Hall, adjacent to the Music and Media Building. The concert is FREE!
For Thursday, February 28:
Study the document "Writing Max/MSP Externals.pdf", especially pp. 15-38 and pp. 206-222, in preparation for an in-class discussion of coding Max/MSP external objects in C.
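Before reading the PDF, it may help to know what the heart of a signal external looks like. The sketch below shows only the sample-processing loop of a hypothetical gain object in plain C; it is not actual SDK code. A real external also needs the SDK headers, class registration, and dsp-method plumbing that the PDF describes, and the names here (t_gain, gain_perform) are invented for illustration:

```c
/* The core of an MSP signal external is a "perform" routine that the
   signal chain calls once per signal vector. This illustrates only
   that inner loop, for a simple gain (multiply) object. */
typedef struct {
    float gain;   /* stands in for a parameter stored in the object */
} t_gain;

/* Process n samples from `in` to `out`, as a perform routine would. */
void gain_perform(t_gain *x, const float *in, float *out, int n)
{
    for (int i = 0; i < n; i++)
        out[i] = in[i] * x->gain;
}
```

The rest of an external (instance creation, inlet/outlet setup, registering the perform routine with the DSP chain) is boilerplate around this loop, which is why the PDF's chapters on object structure are worth studying before the signal-processing chapters.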
Tuesday, March 5:
Guest lecture: Professor Ichiro Fujinaga, Peabody Conservatory, Johns Hopkins University, will lecture on his current work in computer music.
Also, attend the ITAC lecture by Ichiro Fujinaga on the subject of "Computer Recognition of Orchestral Instruments" at 5:00 pm in Room 316 of the Music and Media Building.
Ichiro Fujinaga, of the Peabody Conservatory of Music Computer Music Department, has been developing the use of exemplar-based computer learning techniques to teach computers how to recognize the sound of different musical instruments. He will describe and test these techniques in an engaging and entertaining demonstration.
For Thursday, March 7:
Post a thorough description of your proposed final project on your website. Come to class prepared to discuss it, and to discuss specific approaches for implementing it.
For Tuesday, March 12:
Post a draft of your final paper on your website. If your paper requires graphic examples, you will need to post your article in HTML or .pdf format. It is also acceptable to turn in hard copy of your paper; in this case, two copies are required, one for the TA and one for the professor.
Bring a copy of your compiled working MSP or Pd object on Zip or CD.
For Thursday, March 14:
Multiple choice final exam on basic principles discussed in lectures and in the textbook.
Tuesday, March 19 (Finals Week), 10:30 am - 12:30 pm, Music and Media Building, Room 216:
Final exam session: 5-minute presentations of final projects. Compiled working MSP or Pd object due. Final copy of paper due. (No deadline extension.)