Please could Hugh Robjohns write a comprehensive article explaining the operational advantages and disadvantages of using a word-clock signal to synchronise studio equipment as compared to alternative methods? Further, if the audio has been reference-clocked as it was recorded, does the replay chain (perhaps including multiple downstream signal processors) still require a synchronisation reference or is the clocking information embedded in the recorded data sufficient to hold all the downstream equipment in the correct relationship? Finally, with respect to jitter, will using an external master clock to synchronise the equipment chain prevent it?
SOS Forum Post
Technical Editor Hugh Robjohns replies: A new series of articles concerning various practical aspects of working with digital audio is planned for the near future, but in the meantime I'll have a bash at tackling your list of clocking questions.
A lot of equipment accepts only a simple word clock reference input, rather than AES or composite video references, purely because it is far easier and cheaper to implement. However, there is no significant technical advantage in using only the word clock format. Some might argue that the ability to daisy-chain a word clock signal around a number of devices using BNC T-pieces makes word clock superior since it provides a cheap and convenient way of distributing a reference clock. The problem is that while this approach can work in controlled situations, there are inherent dangers involved if the equipment isn't (or can't be) configured correctly, or the setup is changed without correctly re-engineering the chain. A proper star-shaped distribution of clock signals from a dedicated hub or master generator, using word clock, AES or a combination of both, is a far better and more reliable approach.
As to your second question, a digital recording has, by definition, to be clocked from a reference at the source. That reference is most often the internal crystal clock of an A-D converter. At each subsequent transfer of the digital audio from one machine to the next (assuming the use of AES, S/PDIF, SDIF3, MADI or ADAT interfaces) the clocking information is fully embedded and passed along with the audio. Assuming the equipment is configured to extract the embedded reference clock from its input signal, then it is not strictly necessary to provide separate reference clock signals from a master generator system. However, in larger setups there are significant practical and technical advantages in using a central master clock to provide stable references to the entire system.
Jitter is the enemy of all clocking systems because it introduces a degree of uncertainty in the timing of samples, which translates into a rise in the noise floor with various noise-modulation effects, and often causes a blurring or instability in the stereo image at the A-D and D-A conversion stages. However, it should be noted that these jitter effects only become an issue at the points where audio is converted between the analogue and digital worlds. Digital transfers between equipment are unaffected by even quite severe levels of jitter, provided the jitter isn't so extreme that it causes actual data errors.
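The scale of the conversion-stage effect is easy to demonstrate numerically. The short Python sketch below (all figures are illustrative assumptions, not measurements of any real converter) samples a 10kHz sine wave once with a perfect clock and once with 1ns RMS of random timing error, then compares the two to find the jitter-limited signal-to-noise ratio:

```python
import numpy as np

# Illustrative assumptions, not measurements of real hardware:
fs = 48_000          # sample rate, Hz
f = 10_000           # test-tone frequency, Hz
sigma_j = 1e-9       # RMS sampling-clock jitter, seconds (1 ns)
n = 200_000          # number of samples

rng = np.random.default_rng(0)
t = np.arange(n) / fs                    # ideal sample instants
jitter = rng.normal(0.0, sigma_j, n)     # random timing error per sample

ideal = np.sin(2 * np.pi * f * t)                # perfectly clocked samples
jittered = np.sin(2 * np.pi * f * (t + jitter))  # what a jittery ADC captures

noise = jittered - ideal                 # jitter-induced error signal
snr_db = 10 * np.log10(np.mean(ideal**2) / np.mean(noise**2))

# Theory predicts SNR ~= -20*log10(2*pi*f*sigma_j), about 84 dB here
print(f"jitter-limited SNR: {snr_db:.1f} dB")
```

Note that the penalty doubles with signal frequency as well as with jitter: each doubling of either costs roughly 6dB of dynamic range, which is why high frequencies and wide stereo content suffer first.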
A good-quality master clock should have less intrinsic jitter than most individual devices, but that isn't a guarantee that you'll have a jitter-free system. There are three main causes of jitter: poor clock design, poor clock-recovery circuits (the part of the A-D/D-A converter which extracts the clock data from an incoming digital audio or reference signal), and the effects of the interconnecting cables. Of these, cable effects and poor clock-recovery circuits cause the most problems. The capacitance inherent in cables limits the slew rate of the data — how fast the signal can transition from one binary state to the other. At the output of a piece of digital equipment the data might switch from one state to the other in a beautifully crisp square wave, but by the time it reaches the input of another device the cable capacitance will have rounded it out into something looking more like a triangle wave. The clocking reference timing is generally taken from the points where the data transitions cross the nominal centre line of the waveform, and if these 'vertical transitions' have become sloping lines because of the cable capacitance, the precise point of transition becomes rather vague — that's jitter!
The greater the capacitance of the cable, the worse this problem becomes, so short, high-quality, low-capacitance cables will preserve clocking information far better than overly long, cheap, high-capacitance ones. Obviously, fibre optic cables don't suffer from electrical capacitance, but they have an optical equivalent, which is dispersion. If the optical quality of the plastic or glass is not optimised, the pulses of light can be degraded in such a way that the transitions between light and dark become (quite literally) blurred, and that causes exactly the same kind of jitter problems.
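The cable mechanism described above can also be sketched numerically: model the received edge as an RC exponential, add a little electrical noise, and watch the threshold-crossing time spread. The time constants and noise level below are made-up illustrative values, not characteristics of any particular cable:

```python
import numpy as np

def crossing_jitter(tau, sigma_v, trials=50_000, seed=1):
    """RMS spread (seconds) of the mid-level crossing time of an RC edge.

    The received edge is modelled as v(t) = 1 - exp(-t/tau); additive
    voltage noise sigma_v shifts the instant at which v crosses the
    0.5 decision threshold.
    """
    rng = np.random.default_rng(seed)
    dv = rng.normal(0.0, sigma_v, trials)
    # Solve 1 - exp(-t/tau) = 0.5 + dv for the crossing time t:
    t_cross = -tau * np.log(0.5 - dv)
    return t_cross.std()

sigma_v = 0.01                             # assumed noise: 1% of logic swing
short = crossing_jitter(5e-9, sigma_v)     # short cable: assumed tau = 5 ns
long_ = crossing_jitter(50e-9, sigma_v)    # long cable: assumed tau = 50 ns

# The slower edge turns the same voltage noise into ~10x the timing jitter
print(f"short cable: {short*1e12:.0f} ps RMS, long cable: {long_*1e12:.0f} ps RMS")
```

The point the sketch makes is that the jitter scales directly with the edge's time constant: a tenfold increase in RC rounding produces roughly a tenfold increase in timing uncertainty for the same amount of noise.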
Fortunately, a good clock-recovery circuit can reject the effects of cable jitter, and some companies put a lot of effort into designing good jitter-rejecting clock recovery circuits. The problem is that most techniques which reject jitter to a high degree are very slow to respond and synchronise in the first place, so a practical compromise has to be reached, trading jitter rejection for responsiveness.
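That trade-off can be illustrated with a toy model: treat the clock-recovery circuit as a first-order low-pass filter acting on the incoming clock's phase. This is a deliberate simplification of a real PLL, and the two bandwidth coefficients are arbitrary assumed values, but the tension between jitter rejection and lock speed comes through clearly:

```python
import numpy as np

def recovered_jitter(alpha, sigma_in=1.0, n=200_000, seed=2):
    """RMS phase jitter at the output of a one-pole smoother y += alpha*(x - y)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma_in, n)   # jittery incoming clock phase
    acc = 0.0
    out = np.empty(n)
    for i in range(n):
        acc += alpha * (x[i] - acc)
        out[i] = acc
    return out[n // 2:].std()          # measure after initial settling

def lock_time(alpha, tol=0.05):
    """Samples the same smoother needs to settle onto a phase step of 1.0."""
    acc, steps = 0.0, 0
    while abs(1.0 - acc) > tol:
        acc += alpha * (1.0 - acc)
        steps += 1
    return steps

tight = 0.001   # narrow-bandwidth recovery (assumed coefficient)
loose = 0.1     # wide-bandwidth recovery (assumed coefficient)

print("narrow:", recovered_jitter(tight), "RMS jitter,", lock_time(tight), "samples to lock")
print("wide:  ", recovered_jitter(loose), "RMS jitter,", lock_time(loose), "samples to lock")
```

The narrow-bandwidth setting passes only a fraction of the incoming jitter through to the recovered clock, but takes roughly a hundred times longer to lock than the wide setting, which is exactly the responsiveness compromise described above.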
So, given that cables induce clock jitter, and that some jitter often seeps through the clock-recovery circuitry, it won't come as a surprise to learn that it is often better to use the A-D converter's own internal crystal clock as the reference, both for the conversion itself and the rest of the digital system, rather than use an external reference. This assumes that the converter has a good-quality low-jitter clock, of course. If it doesn't, you might get better results clocking from a better-quality external clock, although you are then at the mercy of the jitter-rejection capability of the device's clock-recovery circuit.
Sometimes there is no choice but to externally clock an A-D converter, as is the case when you need to synchronise several separate A-D converters for a multi-channel recording, for example. Using good-quality converters linked with short clock cables to a common master reference clock would be the best and most practical solution in this case. The only alternative would be to run each converter on its internal clock, and then use sample-rate converters to resynchronise their outputs to a common reference: an expensive option, and one which might introduce a whole different set of unwanted artefacts!