Why A Digital Signal Is Never Accurate

The signal that comes from the output of your microphone preamplifier is pure and perfect. But as soon as it is converted to digital, something is lost. Why does that happen?

Let's assume you have a great performer, with a great instrument in a wonderful acoustic space. You choose the ideal microphone and use a really good preamplifier.

You have the perfect audio signal, right there at the output of the preamp.

That is the art you and your performer have created.

But to record that signal, it must be converted to digital form. And in that process, something is lost, and noise and distortion are added.

Why is that?

The process of digital conversion involves measuring the level of the signal tens of thousands of times every second.

So at each of these measurement points, the signal is 'sampled', which means that its voltage is held constant for a short time while it is measured.

The measurement is then 'quantized'. In a CD-quality (16-bit) digital audio signal, there are only 65,536 allowable values. So the actual value of the signal is pushed up or pulled down to the nearest allowable quantization level.

This is where the inaccuracy lies. The signal was almost certainly somewhere in between two quantization levels, but it has to be approximated by whichever was closest.

This happens at every measurement point - 44,100 per second in CD-quality audio. So in every second, there are 44,100 tiny little inaccuracies in the digital signal.
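To make that concrete, here is a minimal Python sketch of the rounding step for 16-bit audio. The function name and the -1.0 to +1.0 sample range are my own illustration, not the internals of any particular converter:

```python
import math

def quantize_16bit(x):
    """Round a sample in the range -1.0..+1.0 to the nearest of the
    65,536 allowable 16-bit values."""
    steps = 2 ** 15                                   # 32,768 steps each side of zero
    code = max(-steps, min(steps - 1, round(x * steps)))
    return code / steps                               # back to the -1.0..+1.0 range

# A 1 kHz sine wave measured 44,100 times per second, as in CD audio
sample_rate = 44_100
for n in range(5):
    original = math.sin(2 * math.pi * 1000 * n / sample_rate)
    quantized = quantize_16bit(original)
    print(f"original {original:+.9f}  quantized {quantized:+.9f}  "
          f"error {quantized - original:+.2e}")
```

Each printed error is one of those 44,100-per-second inaccuracies.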

Since these errors are mostly random, the effect is heard as noise. But since they are not entirely random - they are related to the signal - they also produce distortion, in which the shape of the waveform is changed from what it should be and additional frequencies are added.

So the process of digitization adds noise and distortion. We call these 'quantization noise' and 'quantization distortion'.
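The signal-related part of the error is easiest to hear with a very quiet signal, only a quantization step or two tall. A sketch along the same lines as the one above:

```python
import math

def quantize_16bit(x):
    steps = 2 ** 15
    return max(-steps, min(steps - 1, round(x * steps))) / steps

# A 1 kHz tone only 1.5 quantization steps tall: the quantized output
# snaps between a handful of fixed levels, and the error repeats in
# lockstep with the waveform - heard as distortion, not as random noise
amplitude = 1.5 / 2 ** 15
for n in range(0, 45, 5):
    x = amplitude * math.sin(2 * math.pi * 1000 * n / 44_100)
    print(f"sample {n:2d}: in {x:+.8f}  out {quantize_16bit(x):+.8f}")
```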

Fortunately there are two ways the problem can be lessened.

The first is to add 'dither noise' to the signal before it is quantized. Oddly enough, adding noise randomizes the quantization error and, done properly, removes the harshness.
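As a sketch of the idea, a common recipe adds triangular ('TPDF') dither spanning two quantization steps just before rounding. Again the names are my own, and this is an illustration rather than a production dithering algorithm:

```python
import math
import random

STEPS = 2 ** 15                        # 16-bit: 32,768 steps each side of zero

def quantize_with_dither(x):
    """Add triangular-PDF dither of up to one step either way, then round.
    The rounding error becomes random and unrelated to the signal."""
    dither = random.random() - random.random()    # triangular PDF over (-1, +1) steps
    code = max(-STEPS, min(STEPS - 1, round(x * STEPS + dither)))
    return code / STEPS

# The same quiet 1 kHz tone as before: with dither the output flickers
# randomly between levels, averaging out to the true waveform over time
amplitude = 1.5 / STEPS
for n in range(0, 45, 5):
    x = amplitude * math.sin(2 * math.pi * 1000 * n / 44_100)
    print(f"sample {n:2d}: in {x:+.8f}  out {quantize_with_dither(x):+.8f}")
```

Done properly, dither even preserves detail smaller than one quantization step - the average of many flickering samples tracks the true waveform - at the cost of a slightly raised noise floor.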

The second is to quantize at a higher resolution - 24 bits instead of 16, for example. More bits mean more allowable values, so the gaps between them are smaller, and so are the inaccuracies.
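The arithmetic is easy to check: each extra bit doubles the number of allowable values and halves the worst-case rounding error. A quick sketch:

```python
# Full scale runs from -1.0 to +1.0, so the step size is 2 / levels and
# the worst-case rounding error is half a step, i.e. 1 / levels
for bits in (16, 24):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels:>10,} values, worst-case error {1 / levels:.2e}")
```

At 24 bits the worst-case error is 256 times smaller than at 16 bits.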

Even so, no matter how fine the resolution, detail is always lost and noise and distortion are added.

Your signal will never again be as pure and perfect as it was at the output of the preamplifier.

Having said that, the inaccuracies produced even in CD-quality audio are tiny, and in 24-bit audio they are tinier still. It is worth knowing about, but most people would find it difficult to tell the difference between the raw output of the preamp and the digitized version.

(Note - the delta-sigma type of analog-to-digital converter operates in a rather different way to the converter described here. However, it is still subject to similar quantization errors.)

David Mellor - Record-Producer.com 2007