Nigel wrote: 16 bit is ok for CD playback as the audio will have been optimised to fit.
Very little recorded audio has a dynamic range anywhere near 90dB, so it's not a case of 'being optimised to fit' -- 99% of all recorded audio would fit comfortably anyway.
But 16-bit is more than sufficient for CD (or any domestic format) as some simple sums quickly reveal. The typical ambient or background noise of a quiet lounge, say, in a quiet suburb in the dead of night, is going to be around 30dBA. More likely it's closer to 40dBA... (A professionally designed and constructed studio control room will often have a background noise level of 25dBA, and some can be down to 15dBA across the mid band frequencies -- but that's getting very specialised, very difficult, and very expensive).
So, if the domestic listener adjusts the volume of his mega-expensive super-dooper hi-fi to set the dither noise floor of the CD at or just below the ambient room noise, then full scale digital peaks will be around 93dB higher... which is at least 123dBA, and potentially over 130dBA. Not only is that painfully if not damagingly loud, it would be one heck of a good hi-fi that could actually generate that kind of sound pressure level (for more than one snare hit!).
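Those sums can be sketched in a few lines. This is illustrative arithmetic only: the ~6 dB-per-bit rule of thumb, a rough 3 dB-ish allowance for TPDF dither noise, and the 30 dBA ambient figure are just the round numbers used above, not measurements.

```python
def quantisation_range_db(bits):
    """Theoretical dynamic range of an undithered N-bit system (~6.02 dB per bit)."""
    return 6.02 * bits

bits = 16
ambient_dba = 30.0      # quiet lounge in the dead of night (figure from the text)
dither_loss_db = 3.3    # rough allowance for TPDF dither noise (assumption)

usable_db = quantisation_range_db(bits) - dither_loss_db   # ~93 dB
peak_spl_dba = ambient_dba + usable_db                     # ~123 dBA

print(f"{bits}-bit undithered range: {quantisation_range_db(bits):.1f} dB")
print(f"Usable range with dither:   {usable_db:.1f} dB")
print(f"Full-scale peaks land at:   {peak_spl_dba:.1f} dBA")
```

Swap in 40 dBA for the ambient level and the peaks land over 130 dBA, which is the "potentially over 130dBA" case above.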
Therefore there is zero benefit in using more than 16-bits for any consumer release format. The extra dynamic range capability would be wasted even if material could be found that made genuine use of it, and the reduced noise floor would be inaudible anyway below the ambient acoustic noise floor.
However, for source recording there are very sane arguments for using 24 bits to allow additional headroom to cope with unexpectedly loud transient peaks and the like, and for post-production even greater dynamic range can be helpful, which is why 32-bit fixed or float formats are generally employed (and now 64-bit float in some systems -- but more for convenient programming than any real sound benefit).
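The headroom point can be shown with a toy gain-staging example (a sketch only, not any particular DAW's pipeline): a fixed-point path destroys any overshoot the moment a sample exceeds full scale, whereas a float path carries the over-unity value through intact and survives a later trim.

```python
def to_16bit_fixed(x):
    """Quantise a [-1.0, 1.0] sample to 16-bit fixed point, clipping overs."""
    clipped = max(-1.0, min(1.0, x))
    return round(clipped * 32767)

sample = 0.9
boosted = sample * 2.0          # an intermediate process pushes it to 1.8

# Fixed-point path: the overshoot is clipped off at the boost stage,
# so the later gain reduction cannot recover it.
fixed_path = to_16bit_fixed(boosted) / 32767 * 0.5

# Float path: the over-unity value is retained at full precision;
# trimming afterwards restores the original shape with no clipping.
float_path = boosted * 0.5

print(f"fixed-point result: {fixed_path:.3f}")   # 0.500 (clipped)
print(f"float result:       {float_path:.3f}")   # 0.900 (intact)
```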
Is a CD track today the same as it would have been in the 80s, or can they now do additional processing purely to the digital data (eg dithering, or better dithering) to make it sound better on the playback machine?
Actually, some of those early 80s CDs didn't have any dithering at all (and if you're careful you can sometimes hear the reverb tails and track noise breaking up in the quietest end of the fade-outs as a result!). I think it's true to say that A-D performance, in particular, has improved quite significantly since the first generation of CD mastering recorders, and the introduction of psycho-acoustic noise-shaped dither systems has also been beneficial. D-A conversion has also improved with oversampling and delta-sigma techniques, as have the all-important clocking technologies, of course.
So no, a modern CD track wouldn't be the same; it would be better... except that in the 80s the loudness war hadn't really got going, so although material was being peak-normalised in mastering, it wasn't being limited and compressed until the pips squeaked, which it largely is today... So while the technical sound quality today is better than in the 80s, the musical quality is, IMHO, often significantly lower.