Few artists so far have taken advantage of the Blu‑Ray format's potential to deliver stunning audio quality. A concert film by Dutch metal act Within Temptation shows what's possible.
Despite being largely ignored by the mainstream music media, symphonic metal is a genre that can boast legions of very loyal fans. Thanks to its fondness for spectacle and ambition, it also pushes technology to the limit, both in live performance and recording. These qualities are very much in evidence in Within Temptation's Black Symphony, a live recording of a special one‑off performance at a sold‑out Ahoy Arena in Rotterdam, with some 10,000 fans in attendance. For the occasion, the band worked up an elaborate stage show, and for the first time performed live with the Metropole Orchestra. The resulting concert film was then released not only as a DVD, but also in the high‑definition Blu‑Ray format, with unprecedented audio quality: viewers whose equipment permits can listen in uncompressed 24‑bit, 96kHz 5.1 surround.
The Black Symphony project was handled by a team working from Galaxy Studios in Belgium, one of relatively few studios that could offer the necessary facilities. Overseeing the recording and mixing was resident engineer Ronald Prent; the audio was also mastered at the same studio by Darcy Proper, while another Galaxy engineer, Wouter Strobbe, was responsible for managing the project and authoring the master discs.
"We used Peter Brandt's remote recording truck, which is fully analogue from the stage all the way to the truck,” relates Ronald Prent. "On stage we had Neumann mic preamps and splitters, and from there it went direct into Pro Tools at 96kHz/24‑bit, and we had 140 tracks. You can get 96 tracks onto one Pro Tools at 96/24, so we had two Pro Tools setups, and four Tascam digital recorders as backup. It was the best‑quality recording we could make, with a full analogue front end, not splitting off the Digicos [front of house mixers] but splitting at the microphone.”
As Prent explains, much thought also went into capturing the response of the crowd and the ambience of the arena. "We had an SPL Atmos spider microphone at the front of house [position], then we had seven Schoeps mics in the roof as a 7.1 — because there was talk of doing 7.1 — and we experimented with one of Peter Brandt's ideas, which was Schoeps boundary microphones stuck on a big Plexiglass plate positioned behind the stage. Those were actually the best for audience response. Then we had two B&K 4006s all the way in the rear for slap and depth, and shotguns on the front — two on the left‑hand side, two in the middle and two on the right. I used all of them. We wanted the audience to be as loud as possible.”
Even though there were 140 signals to contend with, the demands of the staging and filming meant that visible microphones, stands and cables were a no‑no, so wireless units and lavalier or clip‑on mics were the order of the day. "Everything was black, and when it wasn't black, it was blacked out with chalk!” laughs Prent. "We tried to get rid of as many microphones as we could without compromising the recording. The amps were on the sides of the stage left and right, facing outwards so they didn't have any crosstalk, miked with an SM57 and a Royer — that's what they had for the PA and that suited me perfectly. The drummer likes Audix and I do too. They have these clip‑on sets you almost don't see.”
Over the years, some orchestras have acquired a reputation for being Luddite. By contrast, the Metropole and its musicians are thoroughly at ease with technology, to the extent of having their own studio and preferred choices of microphones. "Initially I suggested some alternatives,” admits Prent, "but the orchestra was more comfortable with their own 'tried and true' approach, which, in the end, worked just fine for me.”
With the concert audio and video safely in the can, the next stage was what Ronald Prent describes as 'repair'. "For the orchestra, the only problem was the strings, because the huge temperature differences on the stage with flamethrowers and big lights meant that after five minutes, everybody was detuned. It wasn't too bad during the original performance, but for the record we re‑recorded the strings. It's something you have to do. The orchestra have their own studio in Hilversum where they rehearse, and they also record there. So in one afternoon they replayed the concert top to bottom and re‑recorded only the strings. That was done with normal miking for the orchestra. It's pretty ambient where they record, so some ambient mics were used so that we could blend it properly. That worked pretty well.”
This done, it was left to the tech‑savvy band and orchestra to tidy up their own performances where necessary. "We split up the audio onto two hard drives,” says Prent. "One went to the band, and they went through their own performance and repaired or edited whatever they wanted to, and then it came back to me. Then I just took my original recording and replaced what I had gotten from them — a guitar lick here, bass note there, the usual. Once I had that, it became one Session that was the full length of the concert; band only, including audience. From the orchestra I got back another Session with all their repairs in it, and I would put only the repairs back into my second Session; the complete show, orchestral tracks only. I spent a couple of days — I think about four — preparing the two Sessions in two different Pro Tools rigs and sync'ing them up. They were both running at 96kHz/24‑bit, bringing it back to 128 outputs, 130 outputs sometimes.”
The mix itself was done on Galaxy's custom‑built API Vision analogue console, which was designed to Ronald Prent's specifications. "The API console allows you to do a true stereo and surround mix at the same time. You have separate buses. The level automation remains the same, but the pan‑pot is designed in such a way that you can deal with it in stereo and surround simultaneously, and the bussing of the console is the same way. It's a discrete panner: left‑centre‑right, and the next one is front‑to‑back, and then left/right back. The trick in there is that from the front panner, the signal goes to two wipers. One feeds the stereo bus, the other feeds the surround bus. So if you pan something to the rear, you still have your front pan‑pot available on the stereo bus. And because it's a discrete panner it doesn't have level compensation anywhere in the image, so you can pan it anywhere without changing the balance.”
So how does one begin to approach such a vast mixing project? "I divided [the mix] up into band, audience and orchestra,” explains Prent. "The API console is an all‑discrete analogue console, so everything comes out of Pro Tools and becomes analogue and goes into the console over 128 to 130 channels. You have your drums, bass, guitars, keyboards, audience, vocal groups, the strings, woodwinds, and the percussion — just the percussion was 16 channels — and then you have the orchestral score. So I got the band going, got the orchestra going, and then got a blend and a balance. When it got to the actual mixing, first I spent two days with the music director of the orchestra, and we went with the score through the whole orchestra, and mixed each song for the orchestra, leaving the band where it was. When that was done and the orchestra was happy, then the band came and we did the same with the band: they heard what the orchestra was supposed to sound like, and I made adjustments to the band, and sometimes minor adjustments to the orchestra. And slowly it came together. Then there were a couple of days where I went through it and did the fine details and watched the picture.
"The interesting thing about this project was that at the same time I was doing audio, the guys were doing picture editing upstairs, so there could be a synergy between us — like they would say 'We don't have a shot for that, or we have a shot here, can you do something?' That gives a different dynamic from just mixing audio to a rough cut. The film guys liked that I was ahead of them, so they got a rough mix from me from whatever stage I was in, and then of course when the audience was especially exciting or the music was particularly dramatic I would change balance, and they would accommodate that in the picture.”
The fact that, strings excepted, the band and orchestra had not had to re‑play large sections of the performance made it possible to lean heavily on the many ambient mics, rather than using artificial reverb. "I like to leave the audience open as loud as I can so it's part of the sound — which is only difficult when they start replacing notes that they never played live!” laughs Prent. "But in this case, they played very well, so I could leave it loud. The hard work was getting the audience so loud that you actually have a feeling that you're in the concert, but still have the impact of a band and an orchestra. And there's no audience sweetening, as they call it, on this concert. It's the original audience. We now have the advantage of workstations where we can calculate delay times, so when we record we can measure them, and later when I mix, when there are some microphones that have too much of a delay, I can move them forward a little bit. That way I keep the big space, but I don't have all the reflections. It's not exactly true to the original but it gives you the environment, instead of using an artificial sound.”
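The delay compensation Prent describes — measuring the acoustic lag of the distant ambient mics and nudging them forward in the workstation — comes down to simple propagation arithmetic. A minimal sketch of the idea (distances, speed of sound and helper names are illustrative assumptions, not details from the session):

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degrees C; varies with arena temperature
SAMPLE_RATE = 96_000    # the project's 96kHz recording rate

def delay_in_samples(distance_m: float) -> int:
    """Acoustic propagation delay from the stage to a distant ambient mic."""
    return round(distance_m / SPEED_OF_SOUND * SAMPLE_RATE)

def advance_track(track: np.ndarray, distance_m: float) -> np.ndarray:
    """Shift an ambient-mic track earlier to compensate for its delay,
    padding the end with silence so the track length is unchanged."""
    n = delay_in_samples(distance_m)
    return np.concatenate([track[n:], np.zeros(n, dtype=track.dtype)])

# A mic 50m from the stage lags the close mics by roughly 146ms:
print(delay_in_samples(50.0))  # → 13994 samples
```

Pulling the mic forward by most (but not all) of that figure keeps the sense of a big space while taming the smearing reflections, which is the trade-off Prent describes.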
A live recording such as this poses particular challenges when it comes to the lead vocal, but again, says Prent, "Nothing was completely re‑sung. Here and there a little was added, and some words are Auto‑Tuned. We did some de‑essing and a lot of levelling, but most of the dynamic control is done by hand. Especially with live, the more you compress, the more shit you get from crosstalk, so a lot of the vocals were done with fader rides or Pro Tools automation, whichever worked better.”
Once complete, the 5.1 and stereo mixes migrated across the Galaxy building to the studio of resident mastering engineer Darcy Proper. "Mastering, on a production like this, is probably the simplest part of the process,” she says. "In general, if I get good mixes, then the mastering job is easier — there's less to be corrected and it's simply polishing. In this production that was the case: what I got was great‑sounding stuff. In spite of that, I did the surround version twice. I did it once, came back the next day and decided that in my enthusiasm for making it loud and powerful, I'd pushed it too hard and wasn't happy with it. I said 'OK, that was a practice run, that was a day for free,' and then did it again and brought the level back a bit so it didn't hit that brick wall at the top, to keep the dynamics.
"From a mastering standpoint, the biggest challenge with this project was to help it become super‑powerful, the kind of experience that makes your heart pound. To do that you need to keep the dynamic, but you don't necessarily want to have something start off with a whimper just to allow it to get bigger later on. So for the intro where it starts with the orchestra and the choir, the idea was to allow them to have this great size and impressive sound that Ronald had created with the mix, but still leave room for the band to jump in on top of that and add that extra energy that really 'puts it over the top.' But I would say technically it was all pretty easy. I took Ronald's files, 96/24 from Pyramix, and put those into Pro Tools, used with external converters — Pro Tools was just a glorified playback machine for me — and that then allowed me to record into my Pyramix. Rather than going out of Pyramix, through my processing, and looping back in, I prefer to use one workstation for playback and the other for recording my mastered files. So from Pro Tools to analogue, I worked through my mastering chain in analogue and then back to digital, running picture along with it for visual reference.”
The chain in question consisted of "Basically EQ, compressing, limiting and some loudness maximising. For this, the compression is not doing a whole lot. It's long attack times, low ratios, just a level and feel thing, it's not doing a lot of pumping. The thing to watch out for in surround is that while you're still looking for compression to do something desirable to the sound, you have to keep in mind that it's not necessarily all going to work the same way all the way around [the sound field]. You have to watch what you're doing in the front, and make sure that what's happening in the back then doesn't cause the image to teeter‑totter or something strange to happen.
"The nice thing with surround, with regard to compression, is that you generally don't need so much of it, because everything has its own space to breathe. You don't have to squash it all down into two speakers, so you can have all kinds of energy remaining. All you're looking for out of compression is to create a sort of tight and stable soundfield. You're not necessarily bound to try to control the dynamics the same way you do when you're working in stereo, because you're not carrying so much information in each channel. It's part of what makes surround more exciting — not just that you have the sound all around you but that you can leave a bit more life in all of your sound sources.”
With the audio and video complete, it fell to Wouter Strobbe to assemble the Blu‑Ray and DVD discs themselves — a job complicated by the team's desire to offer uncompressed 24‑bit 96kHz audio in both stereo and surround formats on the Blu‑Ray version. "As far as we know, it's the first and only Blu‑Ray disc that uses as much bandwidth for audio as for picture,” says Ronald Prent.
"It's 20.5 megabits per second for audio and 20 for picture,” concurs Strobbe. "The difficulty in the consumer area at the moment is that if people buy a Blu‑Ray player but they still connect their old receiver, they're not able to decode the high‑definition audio. That's the reason why we put Dolby Digital 5.1 on it, the same format as is used in DVD‑Video. We wanted to be sure that everybody could decode the surround stream. The old receivers are mostly capable of decoding DTS, so we put DTS 24/96 on it at 1.5Mbps, and then for the next‑generation environment, receivers which are capable of HDMI, we have uncompressed 24/96 PCM.”
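Strobbe's 20.5Mbps audio figure is consistent with straightforward PCM arithmetic: raw bandwidth is simply channels × sample rate × bit depth. A rough sketch of the bit budget (the exact stream line-up and the Dolby Digital rate are assumptions for illustration, not figures from the disc's specification sheet):

```python
def pcm_bitrate_mbps(channels: int, sample_rate: int = 96_000, bit_depth: int = 24) -> float:
    """Raw PCM bandwidth: channels x sample rate x bits per sample, in Mbps."""
    return channels * sample_rate * bit_depth / 1e6

surround_pcm = pcm_bitrate_mbps(6)   # 5.1 carries six discrete channels
stereo_pcm   = pcm_bitrate_mbps(2)
dts_96_24    = 1.5                   # the DTS 24/96 stream rate quoted by Strobbe
dolby_ac3    = 0.448                 # a typical Dolby Digital 5.1 rate (assumed)

print(round(surround_pcm, 2))  # → 13.82 Mbps for the uncompressed 5.1 stream alone
print(round(surround_pcm + stereo_pcm + dts_96_24 + dolby_ac3, 2))  # → 20.38
```

The total lands close to the quoted 20.5Mbps once the compatibility streams are stacked on top of the two uncompressed PCM mixes, which is what makes the audio budget rival the picture's.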
The challenge of squeezing uncompressed audio onto the disc created some difficulty in managing the video stream, which had to be high enough in quality to satisfy band and director. "You have three main video codecs for Blu‑Ray. You have MPEG2, you have Windows Media and you have AVC. MPEG2 is based on a standard which was invented 15 years ago, so it's old technology. Windows Media is closer to the HD‑DVD format, but if you look at 90 percent of all the Blu‑Ray discs on the market, it's all video encoded in AVC, which is an MPEG4 variant. This is an advanced codec and the algorithm is very complex but the efficiency is very high. With this project, I was really amazed that I could get the picture done with 20 megabits per second. The picture was shot at 25 frames for Europe and for broadcasting. We knew that this Blu‑Ray would also be released in the US, so we had to bring the picture to the 60Hz [ie. 30 frames] standard, because 60Hz is compatible with every player in the world. If you play a 50Hz Blu‑Ray in the US you have a problem. So we had to convert the picture to 60Hz, and that meant having to calculate five more pictures in a second, which of course affects the encodes for Blu‑Ray as well. You have more movement in the picture so the encoding parameters had to be adjusted accordingly, which was really a challenge.”
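The frame-rate conversion makes Strobbe's encoding problem concrete: at a fixed stream rate, every synthesised extra frame shrinks the average bit budget per frame. A back-of-envelope sketch (treating the 20Mbps figure as a constant average, which a real variable-bitrate AVC encode is not):

```python
VIDEO_BITRATE = 20e6  # bits per second, as quoted for the picture stream

def bits_per_frame(fps: float) -> float:
    """Average bits available to encode one frame at a fixed stream rate."""
    return VIDEO_BITRATE / fps

# Going from the 25fps European master to the 60Hz-friendly 30fps standard
# means inventing five frames per second, each of which must also be encoded:
print(round(bits_per_frame(25)))  # → 800000 bits per frame at 25fps
print(round(bits_per_frame(30)))  # → 666667 bits per frame after conversion
```

The interpolated frames add motion detail the encoder must spend bits on, at exactly the moment each frame's average budget drops by a sixth, hence the need to re-tune the encoding parameters.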
Since its release last year, the Black Symphony DVD has been a conspicuous commercial success, reaching the top five in no fewer than nine European countries, including the UK. As a showcase for the relatively new Blu‑Ray format, meanwhile, Black Symphony has few equals. In these days where audiophiles bemoan the trend towards ever louder CDs and lossy MP3 compression, is it too much to hope that Blu‑Ray can succeed where Super Audio CD and DVD‑Audio failed, creating a market for truly high‑fidelity music recordings?
The last few years have been depressing for anyone contemplating employment in the studio engineering line. However, as consumer formats become ever more powerful and complex, it's possible we could see a new specialism: that of 'authoring engineer', one of Wouter Strobbe's roles in the production of the Black Symphony disc. Creating a master disc for CD duplication is relatively straightforward, but the same is not true of DVD and even less so of Blu‑Ray. Not only must the audio and video be correctly encoded in multiple formats — an increasingly demanding task — but there's also the need to create menus and other interactive elements.
"I wish that people in audio and video could appreciate how important the authoring process is to what the final result is,” says Darcy Proper. "Unlike CD production, mastering isn't the end of the audio chain in DVD and Blu‑Ray. And good encoding isn't as simple as just handing the stuff to a guy who shoves it through the encoder using the same settings he uses for everything else. There's a lot of management that needs to happen and it makes a huge difference in end quality. The authoring engineer understands the details of managing bit budgets and can give you an idea of the bandwidth you're looking at based on what you want to have for audio and what you want to have for picture. He may have some advice on which format to use for shooting picture, for example, or which audio streams you should consider including — ideas that are helpful to have from the very beginning of the project in order to get the result you want at the end. I suppose it's possible that an authoring engineer is a complete audio maniac and decides to squash the picture down to practically nothing — but because most authoring people come to it from the picture side, that doesn't typically happen and it's more often the audio that suffers. For that reason, those of us in audio should pay close attention to the authoring process.”
One of the most important aspects of authoring is understanding audio and video encoding processes. "Encoders are used widely in IT infrastructures but also satellite uplinking, broadcast transmission, DVD production workflows and Blu‑Ray,” explains Strobbe. "What I see often is that an encoder is treated as though you just put something in and something else comes out by itself — but in an encoder you have to tweak a lot of parameters to get the best output. If you put uncompressed audio in and you need to get compressed audio out, there are a lot of decisions to be made in the box. In Dolby compression schemes, for example, you're talking about dialogue normalisation, or dynamic range compression, or the LFE handling. I have noticed a lot of discs in the market which are encoded using only standard settings and, frankly, a lot of them could be much better, particularly for music‑focused content.”
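Dialogue normalisation is a good example of a setting that silently changes the result if left at its default. In Dolby Digital, the dialnorm metadata value tells the player how much to attenuate playback so that programmes with different dialogue levels come out at similar loudness. A sketch of that behaviour (the -27 example is a common encoder default, not a figure from this project):

```python
def dialnorm_attenuation_db(dialnorm: int) -> int:
    """Playback attenuation implied by an AC-3 dialnorm value.

    A dialnorm of -31 dBFS means no attenuation; every dB above
    that is trimmed off the programme at playback.
    """
    if not -31 <= dialnorm <= -1:
        raise ValueError("AC-3 dialnorm must be between -31 and -1")
    return 31 + dialnorm

print(dialnorm_attenuation_db(-31))  # → 0: full level, a sensible choice for music
print(dialnorm_attenuation_db(-27))  # → 4: a common default, playing back 4dB quieter
```

Ship a music disc with the default untouched and every player quietly shaves 4dB off the carefully mastered level — exactly the kind of decision "in the box" that Strobbe argues must be made deliberately.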
The Blu‑Ray specification is, says Strobbe, much wider than that of DVD‑Video. "There are more possibilities for interactivity, in compression codecs for audio and video — a lot more possibilities and choices to make in the authoring process. The Blu‑Ray specification has two platforms. One platform is standard authoring and the other is advanced authoring, which includes even more interactivity. Advanced interactivity means that you can combine Internet applications with audio and video content — you can link them together — like adding interactive gaming to packaged media, for instance.
"The Within Temptation disc was built in HDMV, which is standard authoring. There are two tool sets for authoring HDMV — Sonic Scenarist and Sony Blu‑print. These are the spec‑compliant tools that allow you to author to the full Blu‑Ray specification. There are also some other packages for authoring Blu‑Ray, like Adobe Encore, but they don't allow you to use the complete range of high‑definition audio or video codecs.”
The small town of Mol, in rural Belgium, is not perhaps where you would expect to find a world‑class studio complex. Yet when brothers Guy and Wilfried Van Baelen outgrew the original studio they had built in their parents' barn, they decided it was as good a place as any to locate the replacement. Their quest to build the quietest studio in the world led them to employ some radical construction techniques — most notable of which are the enormous springs on which the entire building rests! The design is said to offer more than 100dB of sound insulation, even in the main live area, which can accommodate a full orchestra.