We’ve gotten started putting up some video tips, despite some rather annoying technical difficulties. The audio is currently mono only. Capturing not only the voice over but also the output of Pro Tools turned out to be surprisingly buggy and annoying. You’d think this would be easy, but five trial programs later NOTHING worked, and we decided to push ahead. These first videos are actually fine in mono, because we want to be sure that the sound works in mono!

Here is the first video in this series of recording blog posts we’re doing on “Mixing Your Home Recordings”. Below the video is some more info on the technical side of things. Hope you dig it!

Although all the tech-speak is not necessary for someone producing tracks entirely in a home studio, understanding the nomenclature of audio engineering is important for those of you who may work in a professional recording environment. An often misunderstood term is “phase”. You may hear things like “out of phase”, “sounds phasey”, or “set phasers to stun” (when dealing with the real studio geeks…). What does this mean? And what about that “phase invert” button on many console preamps, and even on plugins?

First of all, you have to understand that an audio signal is represented by a voltage over time. Such a signal has both positive and negative values that change over time. On a very short scale (that is, zooming in close on some kick drum or snare drum hits, for example), we see a sort of periodic wave:

Granted, the wave is not perfectly periodic as, let’s say, a sustained synth note would be, but over a short time, it is close enough to that to make or break your drum sound, so bear with me.

Let’s clear up what “absolute phase” means first. Phase is a relationship between two signals, but you might also want to be sure that the first “motion” of the wave is in the positive direction. That looks like “upwards” in the DAW display, and translates to an outward motion of your speaker in the physical world. Can I hear the difference between a single kick drum track with the initial motion as “positive” instead of “negative”? No. But I do need to pick some sort of reference, and why NOT have the drums start with an outward speaker motion instead of an inward one? Just superstition, I guess. Anyway, that is “absolute phase”, for our purposes: “Does it go up first, or down?”
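For the tinkerers, that “does it go up first, or down” check is easy to sketch in Python with numpy. This is a hypothetical helper – your DAW doesn’t expose anything like it – just a way to make the idea concrete:

```python
import numpy as np

def first_motion(samples, threshold=0.01):
    """Report whether the first audible motion of a waveform is
    positive (speaker pushes out) or negative (speaker pulls in)."""
    samples = np.asarray(samples)
    # Index of the first sample whose magnitude clears the noise floor
    idx = np.argmax(np.abs(samples) > threshold)
    return "positive" if samples[idx] > 0 else "negative"

# A made-up kick hit that moves downward first
hit = np.array([0.0, 0.002, -0.3, -0.8, 0.4, 0.9])
print(first_motion(hit))  # negative
```

If this reported “negative” on a kick track, a polarity flip would make it start with the outward speaker motion.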

Have another look at the picture above (click to enlarge). You’ll see that the track “bass box” (highlighted) is muted, but the two tracks below it are active, and they don’t line up very well. Those two tracks are two separate mics on the kick drum, a D12 and a D112. For some reason, the D12’s phase is inverted in relation to the D112’s. This is bad. Now the phase makes a difference! These signals are similar enough that adding the two together (and that’s all that mixing is, really – adding signals together) will cancel out a lot of important info in the combined signal. They won’t cancel completely, but what you do hear will sound “out of phase”: typically thin and undefined. That’s a great example of how the phrase is used. We could say that their “relative phase” was out of whack.
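Here’s a tiny numpy sketch of that arithmetic, using a decaying sine as a hypothetical stand-in for two identical kick mic signals, one with inverted polarity:

```python
import numpy as np

sr = 48000                      # sample rate (Hz)
t = np.arange(0, 0.05, 1 / sr)  # 50 ms of time
# Hypothetical "D112" kick signal: a decaying 60 Hz sine
d112 = np.sin(2 * np.pi * 60 * t) * np.exp(-40 * t)
# "D12" captured with inverted polarity relative to the D112
d12 = -d112

# Mixing is just adding signals together
mix = d112 + d12

print(np.max(np.abs(d112)))  # a healthy peak level
print(np.max(np.abs(mix)))   # 0.0 – total cancellation
```

Real mic pairs are never identical, so they only partially cancel – which is exactly the thin, undefined sound described above.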

How do we fix this? The first step is to flip the phase on the D12 track. That puts it right; now both signals start out in the positive direction. See the video for how I handled this. You could also process the file with a “phase invert” algorithm.

Next – and look at the image one more time – you’ll also notice that the D12 signal starts a hair earlier than the D112 signal. Most likely, the D12 mic was a bit closer to the drum. This is a time-based phase problem, and it will cause quite similar “phasey”-sounding problems, since these signals are so similar over the time frame we’re dealing with. This is one of the basic problems that crops up as you add more microphones around a drum kit. It’s also why a “less is more” approach to mic’ing drums works – fewer microphones in close proximity means fewer phase problems, and results in a clearer, fuller sound. You can, however, phase-correct and time-align the multiple mics:

Of course this looks neat, but the sound of it comes first. You can hear the difference in the video. When the multiple mics are in phase and time aligned, the drum sounds have more impact and “focus” – they sound more real and “solid”.

When cutting tracks, you can also avoid problems by sticking to the “Rule of Three”: if you have one microphone on a source, position other microphones at least three times as far from the sound source as the first microphone. Of course, multi-mic’ing a guitar amp, for instance, is common practice. The key there is to keep the microphones at almost the same distance from the speaker. Naturally, with what you have learned here, you might get the idea of listening to the combined microphones while flipping the phase of one of them, or nudging one track a few samples/micro-/milliseconds to change the time alignment. Then, trust your ears!
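The samples-to-milliseconds arithmetic for that nudge is worth having in your head: at 48 kHz, one millisecond is 48 samples. A hypothetical helper to make it concrete (your DAW’s nudge command is the real tool for the job):

```python
import numpy as np

def nudge(samples, ms, sr=48000):
    """Delay (positive ms) or advance (negative ms) a track,
    padding with silence and keeping the length the same."""
    n = int(round(ms / 1000 * sr))  # ms -> whole samples
    samples = np.asarray(samples)
    if n >= 0:
        return np.concatenate([np.zeros(n), samples])[: len(samples)]
    return np.concatenate([samples[-n:], np.zeros(-n)])

# A 0.5 ms nudge at 48 kHz moves the track by 24 samples
print(int(round(0.5 / 1000 * 48000)))  # 24
```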

Important note: very dissimilar signals will not have this problem. So don’t go insane time-aligning all your tracks… bass, keyboards, guitars, drums, etc. All you will wind up doing is removing the little timing nuances that give a song its soul. In fact, a “fatter” sound happens when the timing is a little loose. How much of this is right? It depends on the style, the performance, etc. Use your discretion; my advice is to err on the side of not DAW-editing your tracks to death. Correct obvious timing mistakes if needed, but resist the urge to line up everything like a manicured garden of sound files.