Music 147 / ICS 180 / ECE 198
Computer Audio: Musical Applications of Digital Signal Processing
UCI

Final Exam Preparation


Final Exam (in class): Thursday, March 14, 2002, 11:00-12:20, MMB 216


Assigned readings up to this point:

Digital Audio by Christopher Dobrian

Textbook (Dodge and Jerse), chapters 2, 3, 4, 5 (pp. 115-139), 6, 7 (section 7.2), 10, and 12 (pp. 402-416).

The MSP manual and tutorials contained in the document MSP2.pdf.

Possible supplementary readings include Roads's Computer Music Tutorial and Strawn's Digital Audio Signal Processing, both of which are on reserve in the Arts Media Center.

See also the glossary in the textbook.

Possible Final Exam Topics

What causes sound?

What does "amplitude" refer to?

What does "frequency" refer to?

What characteristics must sound have for us to hear it?

What is the range of frequencies audible to humans?

How is sound recorded digitally?

What does "sampling rate" refer to?

What is "aliasing"?

What is the physical and audible effect of aliasing?

What is the "Nyquist theorem"?

What is the definition of the "Nyquist rate"?
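
(A quick worked example: at the compact disc sampling rate of 44,100 Hz, the Nyquist frequency is 44,100 / 2 = 22,050 Hz, the highest frequency that can be represented; conversely, to record a 15,000 Hz tone without aliasing you need a sampling rate of at least 2 x 15,000 = 30,000 Hz.)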

What aspect of the recorded (or synthesized) sound is affected by the sampling rate?

What is meant by "quantization precision" or "bit precision" in digital audio?

What aspect of the sound is affected by quantization precision?

What is quantization noise?

How do you calculate the amplitude of quantization noise?
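
(As a rough rule of thumb, each bit of quantization precision improves the signal-to-quantization-noise ratio by about 6 dB, so 16-bit audio has quantization noise roughly 16 x 6 = 96 dB below a full-amplitude signal, whereas 8-bit audio has noise only about 48 dB down.)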

What is the sequence of operations by which sound is quantized and played back?

What is a "low pass filter"?

Why do we use a low pass filter before digitally recording sound?

Why do we use a low pass filter when playing digital audio (when converting it from digital to analog information)?

What is "simple harmonic motion"?

How is simple harmonic motion related to the trigonometric "sine" function?

What is a "complex" tone?

Are most sounds simple or complex?

What does "spectrum" refer to?

What does "timbre" refer to?

What is "additive synthesis"?

What is "wavetable synthesis"?

What mathematical operation corresponds to the analog phenomenon of "amplification"?

What does "amplitude envelope" refer to?

What does "envelope" in general refer to?

What are the differences in amplitude envelope between a plucked string (such as a guitar), a (gently) blown tube (such as a flute or a trumpet), and a struck surface (such as a xylophone or a drum)?

What is the relationship between "frequency" and "pitch"?

What is an "octave"?

How does one calculate "equal temperament" of the pitches used in most Western classical music (e.g., a piano or a synthesizer keyboard)?
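
(Worked example: in equal temperament each semitone multiplies the frequency by the twelfth root of 2, about 1.0595, so starting from A = 440 Hz the pitch one semitone higher is 440 x 2^(1/12), approximately 466.16 Hz, and the pitch one octave (twelve semitones) higher is 440 x 2^(12/12) = 880 Hz.)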

What is the relationship between "amplitude" and "loudness"?

What is a "decibel"?

What does a decibel measurement refer to?

How does one calculate the decibel difference between two amplitudes?
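
(Worked example: the decibel difference between two amplitudes is 20 log10(A1/A2), so an amplitude ratio of 2:1 is 20 log10(2), about 6 dB, and a ratio of 10:1 is 20 log10(10) = 20 dB.)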

What is "frequency modulation" (FM)?

What is "vibrato"?

How do we create the effect of vibrato by means of frequency modulation?

What is a "low frequency oscillator" (LFO), and why is it useful?

What happens when we increase the amplitude of the modulating wave?

What happens when the frequency of the modulating wave is at an audio rate?

How can we calculate what frequencies might be produced by FM at an audio rate?

What is the "harmonicity index"? How can you calculate it?

What is the "modulation index"? How can you calculate it?

What happens to the timbre of an FM sound as you increase the modulation index?

What kind of tone is produced when you use a non-integer harmonicity index?

What is "sampling"?

How does a sampling synthesizer work?

What MSP objects are particularly useful for sampling?

What is a "sustain loop" in a sampling synthesizer? What is it good for?

What is "ring modulation"?

What mathematical operation is used to achieve ring modulation?

What frequencies are produced when you ring modulate two sine waves of frequency f1 and f2?
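
(Worked example: ring modulating, i.e. multiplying, a 440 Hz sine wave by a 100 Hz sine wave produces the sum and difference frequencies 440 + 100 = 540 Hz and 440 - 100 = 340 Hz, and neither of the original frequencies appears in the output.)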

What would happen if you ring modulated two complex tones?

What is "clipping"?

If you play two simultaneous notes at full amplitude in MSP, what will happen?

What will the audible effect probably be?

What does the acronym MIDI stand for?

What is MIDI?

What is the transmission rate of MIDI data?

Is MIDI good for transmitting digital audio?

What are the different types of MIDI "channel messages"?

What is a "status byte"?

What is the distinguishing characteristic of a status byte?

What is a MIDI channel?

Why is a MIDI channel useful?

Where does the MIDI channel information reside in a MIDI channel message?

How many bytes are in a MIDI note-on message?

What information is contained in a MIDI note-on message?

What is the MIDI code number (key number) for the pitch middle C?

What is MIDI note-on "velocity"?

How is velocity generally measured on a keyboard?

What is MIDI "pitchbend"?

What is the difference in transmission and usage between MIDI "events" such as notes, and "continuous control" messages (such as volume)? That is, what is different between the way that a keyboard sends out MIDI note messages and volume messages, and why is that difference musically necessary?

Why do we listen to sound in stereo?

What is interaural intensity difference (IID)?

What is interaural time difference (ITD)?

What is the speed of sound (in air, at sea level)?

What is the range of time delays that are significant for simulating ITD?

How might you use delay between the left and right channels (in Csound) to evoke a sense of location?
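
(Rough worked example: assuming the speed of sound is about 344 meters per second and the extra path length around the head to the far ear is about 0.2 meters, the maximum interaural time difference is roughly 0.2 / 344, or about 0.0006 seconds; so interchannel delays from 0 up to roughly 0.7 milliseconds are the relevant range for simulating ITD.)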

How does amplitude vary with the distance of a sound source?

In addition to amplitude, what other factor is our primary cue for distance?

What is the main factor that permits us to locate sounds situated behind us?

What are head-related transfer functions (HRTFs)?

How are HRTFs measured?

What is the difference between "linear" panning and "equal power" panning?
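
(One common formulation, assuming a pan position p running from 0 for hard left to 1 for hard right: linear panning uses gains L = 1 - p and R = p, so the two gains sum to 1, while equal-power panning uses L = cos(p * 90°) and R = sin(p * 90°), so the squares of the gains sum to 1, which avoids the apparent dip in loudness at the center position.)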

What is the "Doppler effect"?

As a sound source moves toward you, does its perceived frequency shift upward or downward as its velocity relative to you increases?

What is a low-pass filter?

What is a high-pass filter?

What is the technical definition of the "cutoff frequency"?

What is a band-pass filter?

What is the technical definition of the "bandwidth"?

What is the "center frequency" of a band-pass filter?

What is Q?

How is Q calculated for a band-pass filter?
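
(Worked example: Q = center frequency / bandwidth, so a band-pass filter centered at 1000 Hz with a bandwidth of 100 Hz has a Q of 1000 / 100 = 10; narrowing the bandwidth to 20 Hz raises the Q to 50.)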

What is the audible effect of a low-pass filter?

What is the audible effect of increasing the Q?

What is the primary means of digitally filtering sounds?

What is the general equation for filtering using scaled, time-delayed versions of a digital signal?
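
(One common way of writing it, with sign conventions that vary from text to text: y(n) = a0 x(n) + a1 x(n-1) + a2 x(n-2) + ... + b1 y(n-1) + b2 y(n-2) + ..., where x is the input signal, y is the output signal, and the a and b coefficients scale the delayed input and output samples respectively.)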

What is meant by a "finite impulse response" (FIR) filter (also known as a "feedforward" filter)?

By deduction, then, what is an "infinite impulse response" (IIR) filter?

What is a comb filter?

How is a comb-filtering effect achieved?

How can one calculate the fundamental frequency of a comb filter?
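
(Worked example: the fundamental frequency of a comb filter is the reciprocal of its delay time, so a delay of 1 millisecond gives peaks spaced at multiples of 1 / 0.001 = 1000 Hz, and doubling the delay to 2 ms lowers the fundamental to 500 Hz.)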

What is "white noise"?

How is white noise generated in a computer?

What is meant by the terms "time domain" and "frequency domain", when viewing discrete audio signals?

Given that sound occurs only in passing time, what is actually depicted in a two-dimensional plot of a sound spectrum (amplitude plotted against frequency) for a single "instant"?

What is the "Fourier theorem"?

What is "Fourier analysis"?

What is the "discrete Fourier transform"?

How does the number of discrete samples used in a Fourier transform affect the number of frequency "bins" in the resulting spectrum?
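
(Worked example, assuming a 44,100 Hz sampling rate: a 1024-sample Fourier transform produces frequency bins spaced 44,100 / 1024, about 43 Hz, apart, with 512 unique bins below the Nyquist frequency; a 4096-sample transform narrows the spacing to about 11 Hz, but requires a longer, and therefore less time-precise, stretch of the signal.)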

What are the advantages and disadvantages of increasing the number of discrete samples in a Fourier transform?

How can the Fourier transform of a sound be used for resynthesis of new sounds?

What is convolution?

What is the relationship between convolution in the time domain and multiplication in the frequency domain?

How are multi-tracking, mixing, and filtering used to enhance clarity in a complex recorded texture?

What are some of the uses of delay in audio processing for commercial music recordings?

What is "flanging"?

What is "chorusing"?


Posted February 27, 2002
Christopher Dobrian dobrian@uci.edu