EURO-SERGE - SYSTEM MODULES
OUTPUT MIXING
For three million years, in fact up until the early 1900s, EVERY sound heard
by a man or a woman was integrally associated with its source. The sound of
a lion implied a lion, the sound of a snapping twig implied a snapping twig,
and the sound of somebody calling out your name implied somebody calling
out your name. This changed suddenly and irrevocably with the invention of the
record player. The sound of a lion could now be a speaker cone vibrating.
The sound of somebody calling out your name could be a telephone.
Despite this recent change, we humans still hear sounds as distinct entities.
Even though all the sounds in a room combine to form a single complex
pressure wave which vibrates our ear drum, we still hear the tap dripping, the
clomp of shoes in the apartment above, the cars whooshing outside, the
conversation in the other room, and if the radio is on not only do we hear the
music, we can hear the singer, the bass player, the piano, the drums and
even something else called "hiss". All these separate "sound sources" are
MIXED in the air to impinge on our ears as a single complex waveform. Our
brain easily sorts them out. This ability to sort out different sounds holds even
for electronic sounds. A synthesizer can create two different sounds and mix
them together, and the ear, upon hearing them, can separate them out again.
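In electronic terms this mixing is nothing more exotic than adding the signals together. The short sketch below (in Python with NumPy; the 48 kHz sample rate and the two pitches are arbitrary choices for illustration, not anything specified by the Serge system) sums two sine tones into one waveform, which a listener would still hear as two distinct notes.

```python
import numpy as np

SAMPLE_RATE = 48000                          # assumed sample rate (Hz)
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE     # one second of sample times

# Two distinct "sound sources": sine tones at 220 Hz and 330 Hz
source_a = 0.4 * np.sin(2 * np.pi * 220 * t)
source_b = 0.4 * np.sin(2 * np.pi * 330 * t)

# Mixing is simple addition: the two waveforms combine into a single
# complex waveform, just as sounds combine in the air
mix = source_a + source_b
```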
For three million years not only could we tell that a lion was roaring, but we
could tell that the lion was roaring over there, that the twig broke behind that
bush, and that somebody called your name behind you. It was possible to
localise the sound in space. While it is important to know that a big cat is
around, it is just as important to know where (you run the other way!).
A "sound entity" is located in space by hearing the sound twice, once with
each ear. Because sound takes time to move through the air, it reaches one
ear before the other - just enough to create a phase difference. The brain can
process these phase differences to locate the direction of the sound source.
The relative loudness and quality of the sound help to determine the distance
of the sound source. A phenomenon called the "Doppler Shift" helps to
determine whether the sound is coming or going. Because the brain
discovers the direction of a sound by phase differences and distance by
relative loudness, location can be simulated with two speakers. With two
speakers appropriately placed, the brain can locate a recorded lion pacing
back and forth. To accomplish this, a single sound is PANNED back and forth
between the two speakers.
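As a rough sketch of how such panning might be done digitally (the equal-power pan law used here is a common convention, not something the Serge documentation prescribes, and the function names are invented for this example):

```python
import numpy as np

def pan(mono, position):
    """Split a mono signal between two speakers.

    position runs from 0.0 (hard left) to 1.0 (hard right).
    An equal-power (sine/cosine) law keeps the perceived loudness
    roughly constant as the sound moves across the stereo field.
    """
    angle = position * np.pi / 2
    return np.cos(angle) * mono, np.sin(angle) * mono

def auto_pan(mono, rate_hz, sample_rate=48000):
    """Sweep the pan position back and forth so the sound appears
    to pace from one speaker to the other, like the recorded lion."""
    n = np.arange(len(mono))
    position = 0.5 * (1.0 + np.sin(2 * np.pi * rate_hz * n / sample_rate))
    angle = position * np.pi / 2
    return np.cos(angle) * mono, np.sin(angle) * mono
```

Feeding the `mix` signal from the earlier sketch into `auto_pan(mix, 0.25)` would slowly sweep it between the left and right channels at a quarter of a cycle per second.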