A new business model for mastering houses allows you to submit tracks via the Web and pay on-line. Last month, we embarked on a unique test of these services. This month: the results are in...
If you read last month's feature on on-line mastering, you'll remember that we sent the same three recordings to five different mastering studios via their Web front ends, with the aim of comparing both the service dimension and the quality of the results. What's more, if you have the DVD that accompanied last month's issue (www.soundonsound.com/dvd/), you'll have been able to listen to the mastered tracks under exactly the same blind test conditions in which we auditioned them at the SOS office. Many of you have already listed your preferences and comments in SOS Forum discussions. Note: the DVD has long since sold out, but Part 1 includes audio files in a Soundcloud Player.
What I didn't do last month was reveal the key to the blind test, so it's time to bring the suspense to an end. Did you manage to pick out the version of each track that we mastered ourselves from the professionals' efforts (we couldn't in every case)? Did you find yourself preferring one mastering house consistently, or did different engineers tackle different tracks more successfully? Well, the table below shows the answers. I've summarised the thoughts of the SOS staff on each version to save you referring back to last month's issue.
Despite our best efforts, most of us found it impossible to rate all six versions of each track in strict numerical order. And inasmuch as we were able to conclude anything from the preferences that were expressed, it was that no one mastering house stood out. So, interestingly, it seemed that our preferences weren't related either to the cost of the mastering or to the amount of interaction that was possible through the Web-based interface. Hafod Mastering, the least expensive service in the test, did about as well as the costliest, Super Audio Mastering. I criticised Metropolis' iMastering slightly for its impersonal interface (to which they now offer an alternative — see this month's News), but it scored just as highly as the services under the Mastering World banner, which I found more approachable. It does seem fair to say that the professionally mastered versions were better liked than our 'home brew' masters, but the difference was smaller than any of us expected.
**The Debbie Taylor Band: 'Melt Like This Snow'** (recorded and mixed by Dave Lockwood)

| No. | Version | Comments |
|---|---|---|
| 1 | Dave's home-brew master | Most of us spotted this. |
| 2 | Super Audio Mastering | Debbie Poyser's favourite. No-one disliked it. |
| 3 | eMasters | Debbie, Mike Senior and Hugh Robjohns disliked this. Dave Lockwood liked it. |
| 4 | Loud Mastering | Hugh's favourite, but Dave would have sent it back! |
| 5 | Metropolis iMastering | Dave and Paul White's favourite. No-one disliked it. |
| 6 | Hafod Mastering | Mike's favourite. Again, no-one disliked it. |
**The Morning People: 'Ancient History'** (recorded and mixed by Sam Inglis)

| No. | Version | Comments |
|---|---|---|
| 1 | Hafod Mastering | Dave thought this one OK. Everyone else disliked it. |
| 2 | Sam's home-brew master | Debbie's favourite. Others disliked it, but no-one picked it — not even me. |
| 3 | eMasters | No-one particularly liked it. Dave disliked it. |
| 4 | Super Audio Mastering | Most people's favourite. No-one disliked it. |
| 5 | Metropolis iMastering | Dave's favourite. Paul and Mike thought it OK. Others disliked it. |
| 6 | Loud Mastering | Matt Bell, Debbie and Mike all liked it; others felt it too harsh. |
**The Resistance: 'The Baltic Fleet'** (recorded and mixed by David Glasper)

| No. | Version | Comments |
|---|---|---|
| 1 | Metropolis iMastering | Paul's joint favourite. No-one else much liked it. |
| 2 | Loud Mastering | Paul's, Dave's and Hugh's least favourite, but Mike's, Debbie's, David Glasper's and my favourite! |
| 3 | Super Audio Mastering | Hugh liked it. Dave thought it 'congested'. David hated it. |
| 4 | eMasters | Hugh's favourite. David and Debbie liked it. Mike and I disliked it. |
| 5 | Hafod Mastering | Paul's joint favourite. Mike and Debbie liked it. David disliked it. |
| 6 | The original unmastered mix | Everyone spotted this. |
Can we draw any general conclusions about mastering from this exercise? Well, as I said last month, it really brings home two points. The first is that mastering is not a magical process for making mixes sound better: its primary functions are to make disparate tracks sit together as a coherent whole, and to do the technical work necessary to prepare music for duplication. The second is that, just as with mixing or any other aspect of record production, mastering preferences are above all a matter of individual taste. All of the engineers who took part in the test demonstrated the necessary technical competence, and beyond that, there are no absolutes.