Hugh Robjohns wrote: In the analogue world, with multiple signals passing through console channels and tape machines multiple times, there are countless high-pass filters everywhere in the form of inter-stage coupling capacitors, which all introduce not only frequency response changes but also phase shifts. Same for coupling transformers. And then there are HF crosstalk effects within and between adjacent channels and elsewhere, introducing more phase shifts and response changes. And added noise, and added distortions.
All true - of course. All analogue circuitry behaves in a complex way, as each component interacts with every other component. In some types of circuit, such as valve amps, this complexity is extreme and easily observed, as harmonic and inharmonic distortions are introduced to the signal. These can be additional waveforms or the suppression of other waveforms. A typical example is recording audio onto tape at levels where the HF content of one source suppresses the HF content of a weaker source - thereby acting as a kind of multiband compressor. Digital calculations can mimic such effects, but only fairly roughly - but hey! They're getting better!
One issue with analogue, though, is phase. Phase compensation in things like filters is in a completely different league to, say, 20 years ago. Even the better-quality budget desks (e.g. the A&H R-Series) are just rock-solid. I tested their 24R a couple of years back and I could not get the signal to shift sideways on my multi-channel Oscar, no matter where I placed the filters or what frequency I was shoving through it.
Even with the better (and light-years more expensive!) desks from the 80s and into the 90s, turning the filters to extreme levels saw the signal from that channel drift sideways, albeit slightly.
Another issue is that all digital processes take time. Electrical events, and therefore all analogue electrical processes, take place effectively instantaneously - signals travel along cable at a sizeable fraction of the speed of light, so we'll call that instant.
Switch on an FM radio and it comes to life immediately. DAB takes a few seconds to find a transmission stream, identify the COFDM packets for Radio Four, reassemble them, turn that into an analogue audio signal and make our lives miserable with 'Woman's Hour'.
Some digital events take so long we can slip out for a coffee while some workstation boots up, finds the wi-fi signal, updates our emails, scratches its nuts and then wastes five minutes looking for some routine we only half-deleted yesterday! In fact, the time delay involved with digital is the biggest problem facing those who design such things.
This is important!
We hear in stereo and can pin-point the position of a sound source, even when blindfolded, because of time. Left-to-right positioning comes from the difference between the times a sound takes to reach the L and R ears. Sound takes roughly 1ms to travel a foot, so the difference between all-the-way-to-the-left and all-the-way-to-the-right is only around one millisecond! Now work out the time difference for a shift of just a degree or two - it's tiny: on the order of one-90th of one millisecond!
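To put rough numbers on that, here's a quick Python sketch of interaural time difference using the common (d/c)·sin(angle) approximation - the head width and speed-of-sound figures are my assumptions, not from the post, and this simple model puts hard-left at around 0.6ms, the same ballpark as the round 1ms figure above:

```python
# Rough sketch of interaural time difference (ITD) arithmetic.
# Assumed figures (not from the post): ear-to-ear distance ~0.21 m,
# speed of sound ~343 m/s, and the simple (d / c) * sin(angle) model.
import math

HEAD_WIDTH_M = 0.21      # assumed ear spacing
SPEED_OF_SOUND = 343.0   # m/s at room temperature

def itd_ms(angle_deg: float) -> float:
    """Approximate interaural time difference in milliseconds."""
    return (HEAD_WIDTH_M / SPEED_OF_SOUND) * math.sin(math.radians(angle_deg)) * 1000.0

# Hard left (90 degrees) comes out around 0.6 ms with these numbers;
# a shift of one to three degrees is on the order of 0.01-0.03 ms.
print(f"90 deg: {itd_ms(90):.3f} ms")
print(f"3 deg:  {itd_ms(3):.3f} ms")
```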
Of course, we also add the filter effect of our outer ears to help with positioning, and with resolving front-to-back and up-and-down, but the Haas effect tells us that time is the major component in our ability to locate the position of a sound source.
Now look at processing times in digital - they are catastrophic! The fastest AD/DA round-trip conversion I know of is in the various RADAR models from iZ Technology of Vancouver - it comes in at 1.6ms. PT HD and all the other usual suspects are much slower!
We talk of latency-free monitoring, when in fact we are talking about some damn box delivering the signal to the foldback path a couple of milliseconds after the event. Now imagine just how many compromises any multi-path digital summing has to battle through!
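One way to make that latency concrete: a millisecond of delay behaves much like an extra foot of air between speaker and ear. A tiny sketch (Python, with an assumed speed-of-sound figure):

```python
# Sketch: express converter round-trip latency as an equivalent acoustic distance.
# The ~1 foot-per-millisecond figure is the usual rule of thumb (~343 m/s).
SPEED_OF_SOUND_FT_PER_MS = 1.125   # assumed: feet travelled per millisecond

def latency_as_distance_ft(latency_ms: float) -> float:
    """Feet of extra 'air path' equivalent to a given latency."""
    return latency_ms * SPEED_OF_SOUND_FT_PER_MS

# A 1.6 ms round trip behaves like moving the monitor ~1.8 ft further away.
print(f"1.6 ms round trip ~= {latency_as_distance_ft(1.6):.1f} ft of air")
```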
We position a sound by identifying a time difference of about one-90th of one thousandth of a second, or 0.01111 of a millisecond. Yet most audio is delivered at either 48kHz or 44.1kHz, which means samples fall at time intervals of about DOUBLE the resolution our ears use to position a sound accurately!
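The arithmetic behind that comparison, as a sketch - sample periods at common rates against the roughly 11-microsecond cue (one-90th of a millisecond) derived above:

```python
# Sketch: sample period at common rates vs the ~11 microsecond
# localisation cue derived above (one-90th of one millisecond).
ITD_RESOLUTION_US = 1000.0 / 90.0   # ~11.1 microseconds

for rate_khz in (44.1, 48.0, 96.0):
    period_us = 1000.0 / rate_khz   # time between samples, in microseconds
    ratio = period_us / ITD_RESOLUTION_US
    print(f"{rate_khz:5.1f} kHz -> one sample every {period_us:.1f} us "
          f"({ratio:.1f}x the localisation cue)")
```

At 44.1 and 48kHz the sample period works out at about twice the cue; only at 96kHz does it drop to roughly one sample per cue.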
The Nyquist theorem tells us that such positioning issues are irrelevant - and indeed they are - UNLESS we use plugins and/or other digital processes that cause the signal to be re-positioned, albeit microscopically.
Even if we do all our processing at 96kHz, one sample period only just matches the timing resolution we need to position accurately, so any process such as compression, gating or repitching has to put our audio back in exactly the sample it came from - and they often do not!
If you are recording with one pair of mics and no summing is involved, that makes no difference, as both sides of the stereo signal are treated with (or suffer!) the same time approximations (assuming a decent recording device that is working properly and is internally clocked).
Analogue summing does not re-position bits of the audio to places where they do not belong. It works instantaneously, so everything goes to the right place time-wise.