Understanding recording settings


Another question I get asked a lot is: “What settings should I use when I’m recording?”

When you open the recording settings in programs like Adobe Audition or Pro Tools, or even on a portable recorder, you'll see options like Sample Rate and Bit Depth. Clicking on them, all you'll see is a whole bunch of numbers and names, which may be a bit daunting. Which do you pick? Is the default good enough? Well, before you start, there are a few things you should take into consideration; don't just stick with the default option.

First, let me explain what each term means.

Bit Depth

Bit depth, in a nutshell, refers to the dynamic range of an audio file. Think of it like this: the higher the number, the more bits of information are collected for each sample of the recording, making it a more accurate recording, particularly when it comes to the subtle changes in the waveform.

[Figure: bit depth illustration]
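To put some numbers on it, here's a quick sketch (the function names are my own, just for illustration): each extra bit doubles the number of amplitude levels a sample can take, which works out to roughly 6dB of extra dynamic range per bit.

```python
import math

def amplitude_levels(bit_depth: int) -> int:
    """Number of discrete amplitude values available at a given bit depth."""
    return 2 ** bit_depth

def dynamic_range_db(bit_depth: int) -> float:
    """Approximate dynamic range in dB: 20 * log10(2) is about 6.02 dB per bit."""
    return 20 * math.log10(2 ** bit_depth)

print(amplitude_levels(16))             # 65536 possible levels at 16-bit
print(round(dynamic_range_db(16), 1))   # roughly 96.3 dB
print(round(dynamic_range_db(24), 1))   # roughly 144.5 dB
```

So going from 16-bit to 24-bit buys you a lot of extra headroom for those subtle waveform changes.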

Sample Rate

Similarly to bit depth, an audio file's sample rate refers to the number of samples taken per second to represent the audio digitally. When recording, you need to be able to sample at least twice every cycle. For example, a 1kHz tone takes 1/1000th of a second to complete a cycle, so if we wanted to make a recording of it, we'd need to sample at twice that speed, with a sample rate of 2kHz. (If you'd like to understand the maths behind it a little more, I suggest you follow this.)

Now, with that in mind, let's put that into a more realistic scenario. If we wanted to record something a bit more complicated, like a voice, which contains a large number of different frequencies at different times up to the limit of human hearing (20kHz), you'd need a sample rate of at least 40kHz, or what you'll most likely see in your DAW: 44.1kHz. Below is an example of what the human voice sounds like at 48kHz, 22.05kHz, 11.025kHz and 6kHz (in that order). As you'll hear, the lower the sample rate, the worse it sounds. Also notice how you lose the high frequencies the lower you go.
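The "sample at least twice every cycle" rule described above is known as the Nyquist criterion, and it can be sketched as a tiny helper (the function name here is mine, just for illustration):

```python
def min_sample_rate(max_frequency_hz: float) -> float:
    """Nyquist criterion: to capture a frequency, you must sample
    at a minimum of twice that frequency."""
    return 2 * max_frequency_hz

# A 1 kHz tone needs at least a 2 kHz sample rate...
print(min_sample_rate(1_000))    # 2000.0
# ...and the full range of human hearing (20 kHz) needs at least 40 kHz,
# which is why 44.1 kHz (with a little headroom) became the CD standard.
print(min_sample_rate(20_000))   # 40000.0
```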


Ok, cool, I know what it all means now, but what should I use?

Well, it really depends on what your delivery is. For example, Apple's Mastered for iTunes standard requires a 24-bit depth and at least a 44.1kHz sample rate, but when you're recording, it's better to bump your sample rate up to 96kHz and keep your bit depth where it is, just to get a little bit of wiggle room when editing.

Best case scenario, you'd record as high as possible for both bit depth and sample rate. But let's be honest: if you did that, you'd run out of hard drive space, and on top of that you'd need a lot of processing power. Unless your delivery requires a particular bit depth or sample rate, record at 48kHz, 16-bit, and render everything out at the same (or 44.1kHz if you're burning to a CD). This strikes a happy medium between file size and quality, and will be fine for most internet delivery and even for things like radio!
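To see why recording everything at the maximum settings eats your hard drive, here's a rough back-of-the-envelope calculation for uncompressed audio (this sketch ignores the small WAV header overhead):

```python
def wav_size_mb(sample_rate_hz: int, bit_depth: int,
                channels: int, seconds: float) -> float:
    """Approximate uncompressed PCM audio size in megabytes:
    samples per second * bytes per sample * channels * duration."""
    bytes_total = sample_rate_hz * (bit_depth // 8) * channels * seconds
    return bytes_total / (1024 ** 2)

# One minute of stereo audio at the "happy medium" vs. high-res settings:
print(round(wav_size_mb(48_000, 16, 2, 60), 1))   # roughly 11 MB
print(round(wav_size_mb(96_000, 24, 2, 60), 1))   # roughly 33 MB
```

Tripling your storage (and the processing load that goes with it) for every minute recorded adds up fast on a long session.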

As for file type, when you're recording something you want to start with a .WAV file. When you're rendering everything out, use .MP3 if you want to send it easily over the internet (or via email), but if you want to retain a lot more quality, use .WAV: it's a lossless format and is the industry standard when submitting your music (or whatever else) for publishing to iTunes, Spotify, etc., BUT it will mean you can't email it directly to people.

Well, I think that’s enough for this post. In a future post I will go over the finer details of what all the different file types mean to you.

Harry Hughes