Music Technology


This page contains assignments for the class
Music 215: Music Technology - Spring 2013
University of California, Irvine

The assignment for the upcoming class will be posted here after each class session.

Monday June 10, 2013:

Participate in the final critique session.

Thursday June 6, 2013:

Present and perform your final project in the Music 215 Concert, Thursday June 6, 8:00 pm, Music and Media 218.

For Wednesday June 5, 2013:

Complete your final project so that you're ready to do a technical rehearsal on Wednesday in preparation for the Thursday June 6 concert.

For Wednesday May 29, 2013:

Study the examples from the May 22 class as well as the examples provided for the upcoming May 29 class.

Continue working on your final project, and come prepared to discuss any problems or questions you may have.

For Wednesday May 22, 2013:

1) Prepare to give a 20-minute in-class presentation on the research, experimentation, and learning you have done so far in the course of working on your term project. It should not be simply a "progress report" or a "here's what I've done so far" demonstration of your project work-in-progress. Instead, you should identify and explain the specific research question or technical problem you have undertaken, and you should describe and teach what you have learned in that regard through your research and experimentation. In short, you should describe a technical problem you have tried to solve, and then provide a brief lesson in how to solve it, based on what you have learned in the past six weeks. (Yes, that's right, it will have been more than six weeks since you submitted your research topic proposal!)

If your presentation requires video projection, you should make plans to meet with Martim in advance of the class time to put all your presentation materials on the "REALab Data" hard drive of the MacPro computer in the REALab, so that you can present from that computer, which will enable you to share your presentation with the class and simultaneously with the professor via iChat.

2) Check your timeline of tasks that you created three weeks ago as part of your project proposal, and revise it as need be to be able to complete your project and have it ready for performance on June 6. Set yourself one or more specific tasks that you need to complete by May 22, and be prepared to report on those accomplishments.

For Wednesday May 15, 2013:

Catch up on past assignments you have not completed, and study every programming example provided up to this point.

Continue working on your final performance project. It's advised that you not focus exclusively on technical and programming concerns, but rather that you devote at least equal time to compositional concerns. It usually works best if the technical development and musical development are done conjointly, so that the two processes inform each other. Consult your project plan in order to divide your work into specific small tasks that need to be completed by a certain time; this will help you stay on schedule and avoid excessive last-minute panic.

Make an appointment with the professor to meet by Skype some time between Monday May 13 and Friday May 17. Send by email whatever materials will be needed for that discussion, ideally at least 24 hours in advance of the meeting.

For Wednesday May 8, 2013:

Meet with the professor to discuss your project plan on Thursday May 2, and begin working on your project.

In preparation for the upcoming class session, in which we will discuss the fundamentals of video control and 3D animation in Jitter, and audio spatialization and panning, read the following tutorials and study the following examples.

Jitter Fundamentals Tutorials:
What is a Matrix?
Attributes: Editing Jitter object parameters
Tutorial 1: Playing a QuickTime Movie
Tutorial 4: Controlling Movie Playback

Jitter Examples:
Play a QuickTime movie (simple version)
Play a QuickTime movie with Jitter (slightly more advanced)
Movie attributes
Use attributes to control video playback in Jitter
Random video editing
A-B video switcher
A-B video crossfade
Crossfade video and audio
Crossfade to new location in a video

Jitter OpenGL Tutorial:
"Open GL in Jitter" by Peter Elsea (read at least the first ten and a half pages)

Jitter OpenGL Examples:
Create a sphere in OpenGL
Apply a texture to a shape in OpenGL
Display a video on a shape in OpenGL
Display a video on a plane in OpenGL

Audio spatialization readings:
Wikipedia entry on "panning"
Spatialization and reverberation

Audio spatialization examples:
Linear amplitude panning
Constant power panning using square root of intensity
Constant power panning using table lookup
Abstraction for constant-intensity stereo panning
Abstraction for quad panning using x,y coordinates
(The quadraphonic audio aspect of the following patches won't work if you only have stereo output, but they demonstrate some basics of multi-channel panning as well as an example of displaying panning with OpenGL animation.)
Quadraphonic panning with mouse control and OpenGL visualization
Quadraphonic panning based on radial angle
Gain factors for quadraphonic panning based on radial angle
Circular quadraphonic panning
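The constant-power technique in the examples above can be sketched outside Max. Here is a minimal Python version of the square-root-of-intensity method (the function name and the 0-to-1 pan convention are my assumptions, not taken from the class patches):

```python
import math

def constant_power_pan(pan):
    """Return (left_gain, right_gain) for pan from 0.0 (hard left) to 1.0 (hard right).

    Taking the square root of the linear intensity shares keeps
    left**2 + right**2 == 1, so total acoustic power stays constant
    across the stereo field. Plain linear gains (1 - pan, pan) dip
    about 3 dB at the center, which is why these patches use
    constant-power panning instead.
    """
    return math.sqrt(1.0 - pan), math.sqrt(pan)

left, right = constant_power_pan(0.5)  # centered: equal gains, unit total power
```

The sine/cosine variant (left = cos, right = sin of pan times 90 degrees) gives the same constant-power property with a slightly different gain curve.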

For Wednesday May 1, 2013:

1) Write a program that uses the transport for tempo-relative control of metric timing to achieve a musically useful operation. Start by thinking about what kind of activities could benefit from transport control, such as synching a process to a beat or other metric unit, synching an automated process to a changing tempo, getting the computer to sync its sense of "beat" to a performer's tempo, causing particular events to occur at specific moments (timepoints), discerning one beat or pulse from another because of its placement in the metrical structure, etc. You can use the transport to control either MIDI or audio; the idea is to explore how the concept of tempo-relative timing can be useful to you.

By the end of the day on Monday April 29, post your patch (and any other essential files) on the class MessageBoard, and upload the file(s) to the EEE DropBox called "Transport". (Remember to put lots of comments in your patch!)

2) Starting from your initial plan, which you formulated in response to the assignment for April 10, outline a plan for a final project that will fulfill the course requirement for "one complete work of technological music, roughly 7-10 minutes in duration, incorporating [your] research and technological studies, and dealing with well-defined aesthetic, technological, and compositional foci." Your plan should include a) your conceptual focus, b) the technological and personnel resources you will require, c) how you intend to implement your piece with software and hardware, d) steps you will need to take and a timeline for those steps, and e) what specific things you think you will need to learn along the way.

Post your plan on the MessageBoard no later than the end of the day on Tuesday April 30.

Make an appointment to meet with the professor on Tuesday April 30 or Thursday May 2 to discuss your plan.

3) Watch the following videos, read the following articles, and study the following patches on the topic of Max for Live, in preparation for the upcoming class session.

Video Tutorials:
Introducing Max for Live
Max for Live Video Tutorial #3: Simple Delay
Max for Live Video Tutorial #4: State-Variable Filter
Max for Live Video Tutorial #6: User Interface Part II
Programming in Max for Live: Creating a Wobble Bass MIDI Instrument (3 videos)
Programming in Max for Live: Creating a Live API-based Step Sequencer (2 videos)

Useful links and readings:
Live Abstractions: Useful audio and MIDI abstractions you can use in your own devices
Creating Devices that use the Live API: Max for Live provides ways to access the Live application directly
Live API Overview: Basic information about the Live API and links to further readings
The Live Object Model (LOM): What objects you find behind the Live API and what their structure, properties and functions are

Example patches:
Tremolo effect in Max for Live
Tap to teach tempo to Max
Tap tempo utility in Max for Live

For Wednesday April 24, 2013:

Write a program in Max that algorithmically generates music you find genuinely interesting and engaging. This requires a) that you have enough self-awareness to analyze some basic characteristics of what you find interesting and engaging in music, b) that you be able to formulate that as a formal methodology or system (keep it simple!), and c) that your system can be implemented with the available technology (i.e., with Max). Be able to justify/explain why you find the resulting music interesting and pleasing, and also to analyze critically its shortcomings relative to your hopes.

Write a program that algorithmically transforms some MIDI input information into control information for a sound generator. Again start by thinking of a (simple) relationship between input control information and output sound that you think might be useful or interesting or expressive, and think about what sort of MIDI controller you have (or could easily have) at your disposal, then try to implement a match between your controller and the musical result. Bear in mind that your controller could control one or more parameters of the sound directly, or it could control some parameter(s) of an algorithmic system that is providing some of the musical information automatically.

Your result will be two Max patches: one that plays music automatically (just turn it on and it does something musically interesting using MIDI), and one that you control or influence via MIDI (but which may also have some automated aspects). By the end of the day on Monday April 22, post a compressed version of your Max patches on the MessageBoard, and turn in your Max patch files to the EEE DropBox called "AlgorithmicMIDI". If you're using another program or plugin to synthesize your sound (Reason, Live, Kontakt, etc.), include whatever other files you think will be necessary to reproduce your piece. Accompany your MessageBoard post with a short explanation of what you were trying to accomplish musically and what you like about it, and also discuss how it falls short of your goal or can be improved with more development.

For the upcoming class, in which two of the topics will be "logarithmic versus linear" and "tempo-relative timing", check out the following readings and examples.

Fechner's law definition
Fechner's law explanation
What is amplitude?
Robinson-Dadson curves
Harmonic series
12-tone equal-tempered scale

Max examples:
Exponential mapping
Linear vs. exponential change in the rate of events
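Pitch is a classic case of the "logarithmic versus linear" issue in the readings above: equal pitch steps correspond to equal frequency ratios, so the 12-tone equal-tempered scale is an exponential mapping from note number to frequency. A quick Python sketch of that mapping (the function name is mine, not from the examples):

```python
def mtof(midi_note):
    """Convert a MIDI note number to frequency in Hz, 12-tone equal temperament.

    Each semitone multiplies frequency by 2**(1/12), so pitch (which we
    perceive linearly) is an exponential function of frequency. A440 is
    MIDI note 69; note 81, one octave higher, is 880 Hz.
    """
    return 440.0 * 2.0 ** ((midi_note - 69) / 12.0)

print(mtof(69))  # 440.0
print(mtof(81))  # 880.0
```

The same logarithmic principle (Fechner's law) applies to loudness, which is why amplitude faders usually need an exponential rather than linear response to sound smooth.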

Tempo-relative timing
Time value syntax

Max examples:
Simple demonstration of the transport object
Other basic functionality of the transport object
Tempo-relative timing with the transport object
Tempo-relative timing for MSP LFO control
Synchronizing MSP audio loops with the transport
The translate object updates its output when the tempo changes
Rhythmic delays in time with a musical tempo
Using timepoints for interactive sequencing
Multiple simultaneous tempi using named transports
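The core arithmetic behind tempo-relative timing is simple: the transport translates note values into milliseconds according to the current tempo. A hedged Python sketch of what an object like translate computes (the function name and beats-based interface are my assumptions):

```python
def notevalue_to_ms(beats, bpm):
    """Convert a duration in quarter-note beats to milliseconds at a given tempo.

    At 120 bpm a quarter note (1 beat) lasts 60000 / 120 = 500 ms, so an
    eighth note (0.5 beat) lasts 250 ms. Re-running this conversion
    whenever the tempo changes is, in essence, what Max's translate
    object does when it updates its output.
    """
    return beats * 60000.0 / bpm

print(notevalue_to_ms(1, 120))    # 500.0  (quarter note at 120 bpm)
print(notevalue_to_ms(0.5, 120))  # 250.0  (eighth note at 120 bpm)
```

Scheduling in beats rather than milliseconds is what lets delays, LFOs, and sequenced events stay in sync when the tempo changes.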

For Wednesday April 17, 2013:

By the end of the day on Monday April 15, make a composition, at least 20-30 seconds long, of interesting sound shaped by linear control signal(s), employing some of the objects we discussed in class: line~, line, and matrix~. In addition to single lines, you can use line segment functions created with the function object or with list messages to line~.

Teach yourself about one additional line generator: the phasor~ object, which you are also welcome to use in this assignment. In addition to the help file and reference page, you might want to try the following two online examples: The phasor~ object and Using phasor~ directly as a control signal.

Bear in mind that the linear control can be applied to anything, or many things, not just amplitude. It could control amplitude, frequency, rate, range, delay time, anything that you would like to change gradually over any amount of time.
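Conceptually, a line segment function like the ones line~ and function produce is just a list of breakpoints interpolated linearly over time. A minimal Python sketch of that idea (the function name and (time, value) list format are my assumptions, not Max's actual message format):

```python
def line_segment(t, breakpoints):
    """Evaluate a piecewise-linear control function at time t.

    breakpoints is a list of (time, value) pairs in ascending time
    order, analogous to the destination/ramp-time pairs you send to
    line~ or draw in a function object. Before the first point or
    after the last, the nearest value is held.
    """
    if t <= breakpoints[0][0]:
        return breakpoints[0][1]
    for (t0, v0), (t1, v1) in zip(breakpoints, breakpoints[1:]):
        if t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return breakpoints[-1][1]

# A 100 ms fade-in, hold, then a 200 ms fade-out (times in ms):
env = [(0, 0.0), (100, 1.0), (800, 1.0), (1000, 0.0)]
print(line_segment(50, env))  # 0.5, halfway up the fade-in
```

The output value at time t could drive amplitude, frequency, delay time, or any other parameter, which is exactly the point of the assignment.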

Put lots of comments in your patch explaining how it works.

Post a compressed text version of your patch on the MessageBoard so that other people can see/hear it. If you use any abstractions, don't forget to provide the abstraction patch, too. Also, place the patch(es) in the EEE DropBox called "LinearControl".

In preparation for the upcoming class, read the MIDI Manufacturers Association tutorial on "MIDI and Music Synthesis". The first sections -- Introduction, MIDI vs. Digitized Audio, and MIDI Basics -- might not be particularly exciting, but read 'em anyway. Focus more closely on the subsequent sections that describe the different kinds of MIDI messages, and the General MIDI specification. We will discuss MIDI messages in some detail in class.

Two other topics we will be discussing are lookup and mapping. In preparation for those topics, take a look at these examples.

Counting through a list
Analysis of some patterns
Read MIDI pitches from a lookup table

Linear mapping of ranges
Linear mapping of MIDI to amplitude

The scale object is actually much simpler to use than the methods shown in these examples, so take a look at the help file for that object.

If you're interested in more of the underlying math of linear mapping, take a look at these examples.

More mapping:
Linear mapping of one range to another
Linear mapping equation
Linear mapping and linear interpolation
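The linear mapping equation in these examples is the same arithmetic the scale object performs. As a language-neutral sketch in Python (the function and parameter names are mine):

```python
def linmap(x, in_low, in_high, out_low, out_high):
    """Map x from the range [in_low, in_high] to [out_low, out_high].

    First normalize x to 0..1 within the input range, then scale and
    offset that into the output range -- the same math as Max's scale
    object (without its optional exponential mode).
    """
    normalized = (x - in_low) / (in_high - in_low)
    return out_low + normalized * (out_high - out_low)

print(linmap(64, 0, 127, 0.0, 1.0))  # ~0.504: a MIDI controller value mapped to amplitude
```

Note that linear mapping of MIDI values to amplitude is perceptually questionable for the Fechner's-law reasons discussed earlier; it is the starting point, not necessarily the musically best mapping.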

For Wednesday April 10, 2013:

By the end of the day on Monday April 8, post in the Discussion section of the class MessageBoard some detailed answers to the following questions.

In addition to discussing your individual projects, we'll review (and hopefully solidify) your understanding of some MSP basics. In preparation for that discussion, please read the following articles and try out the following Max examples. Come prepared with questions about whatever you don't understand.

If you haven't already done so, read the introductory explanations about MSP. Much of it will be stuff you already know, but it never hurts to review, and you'll likely improve your understanding of how MSP actually works.

[You can read these online or in the MSP Tutorial within Max.]
How MSP Works
How Digital Audio Works

[You can read this online or in the MSP Tutorial within Max (where you can also try the patch).]
MSP Tutorial 2: Adjustable Oscillator
[For the following examples, you can try the patch by clicking on the graphic, copying the JSON source code of the Max patch, then saving it as a file (or pasting it into an empty patch) in Max.]
Linear fade-in/out of audio
Line segment control functions
Smooth audio switching to bypass an audio effect
Mixing multiple audio processes
Using matrix~ for audio routing and mixing
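The fade examples above boil down to multiplying each audio sample by a gain that ramps over time. A minimal Python sketch of a linear fade-in, operating on a plain list of samples rather than a signal (the function name is my assumption):

```python
def apply_fade_in(samples, fade_len):
    """Multiply the first fade_len samples by a 0-to-1 linear ramp.

    This is the effect of a line~ ramping from 0 to 1 into a *~ in MSP:
    the gain rises gradually instead of jumping, avoiding the click an
    instantaneous amplitude change would cause.
    """
    out = list(samples)
    for i in range(min(fade_len, len(out))):
        out[i] *= i / fade_len
    return out

print(apply_fade_in([1.0] * 8, 4))  # [0.0, 0.25, 0.5, 0.75, 1.0, 1.0, 1.0, 1.0]
```

Crossfading between two sources (as in the "smooth audio switching" example) is the same idea applied twice: one gain ramps up while the other ramps down.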

For Wednesday April 3, 2013:

Come to class prepared to give a short presentation about the work you did in the previous quarter's Music 215 class. The presentation may or may not include an actual performance, but should at least document and summarize the work that was done and the future directions that are planned.

This page was last modified June 2, 2013.
Christopher Dobrian,