Creating music and sound effects for sophisticated video games such as Mass Effect 3 requires skills not found in film or television — and a lot of hard work.
Video game releases have become events to rival the biggest Hollywood epics, and none more so than that of Mass Effect 3. The final episode of BioWare's hugely popular trilogy was launched in March to huge fanfare, critical accolades and jaw-dropping sales figures — in its first month of release alone, over three and a half million copies were snapped up.
The game itself invites comparisons with the movie world, thanks to its intricate plotting, detailed characterisation and spectacular visual and sonic effects. All this was backed up by production values and budgets to rival any blockbuster, and the audio production was no exception. No fewer than five composers worked on the game, alongside a small army of BioWare employees. The man with the daunting task of planning and co-ordinating all of this work was Rob Blake, originally from Britain but now resident in Edmonton in Canada, where the BioWare team is located.
"Like with film and television, the bulk of our work happens at the end of production as the other departments are winding down,” explains Rob. "The challenge we face is that when other departments slip behind schedule, there is less time for us to finish, as our release schedule is usually locked, so we're always trying to start earlier in production. Doing this also has the benefit of allowing us to influence the direction of the game and its content. We're strong believers in audio being more than just a way to reinforce the visuals; we really want it to help drive the emotional components and act as deeper gameplay mechanics, so starting early allows us to do this.
"One of the great things about working on long franchises is that you get to build on, and more importantly learn from, the past. Mass Effect 3 was probably the easiest of all the games, as the tools were pretty robust and the aesthetic was clearly defined; it felt more like an iterative process rather than us reinventing the wheel again. However, a huge amount of the content had to be new, as we had new weapons, powers, levels and cinematics. As it turns out, I'd say that about 95 percent of the final audio was new in ME3, but we spent the first half of the project making a lot of source material and the second half creating the final assets from this source.
"I started work about 18 months before ship, but most of the audio team put about nine months into it on average. We had about 12 people on sound design, five composers, and a team of people and studios doing voice-overs. There was about 15 'man-years' spent on the audio of the game, if you were to add it up.”
Just as engineers who work on film often specialise in different areas, such as dialogue replacement, sound design or Foley recording, so each member of BioWare's team has individual responsibilities and expertise. "We have a pretty large audio team and want to avoid people becoming too bogged down with one specialism, but we've also seen great efficiency gains from having people specialise, especially now that things have become so complicated. So people focus on one or two areas per project, but we usually switch people around on each project and do knowledge sharing between projects.
"Generally speaking, on Mass Effect 3 we had about four people working on environmental audio, two on cinematic audio, three on conversation audio, two for music direction and implementation, one for weapons and one for 'powers'. That content usually keeps people busy for the duration of the project, but most people have additional smaller tasks, such as Foley, dialogue processing, visual effects, menus, and so on.
"Only about 25 percent of our time is spent creating the sounds themselves. Fifty percent of our time is actually spent working with the assets in the game and audio engines, and the last 25 percent is split between testing, meetings, reviews and all the other fun stuff. Things have changed a lot in game audio as the tools have become more flexible and audio-friendly; years ago we'd spend most of our time creating sounds, and the technical implementation would be minimal or carried out by programmers. Nowadays we implement almost all of the audio ourselves, and the way it's implemented has a profound effect on the way it sounds.”
Implementation is, of course, one of the key differences between audio for film and for video games. Film as a medium is completely linear and predictable, whereas game audio needs to respond to actions and choices made by the player; and while cinemas employ highly controlled and standardised playback systems, gameplayers can and do listen on anything from a mono television to a 5.1 home cinema system. Making music and sound effects fit into their context within the game is thus both a technical and a creative challenge.
"Technically speaking, we have three methods of sound playback,” explains Rob. "Streaming off the disc, which is usually reserved for music, dialogue and long ambience sounds; playing back from memory, which is usually for fast, short and repetitive sounds such as guns, explosions and footsteps; and real-time generation, which is at an interesting stage in games at the moment. In the early days [of computer games], sound was generated in real time using built-in synth chips like the SID [the notorious Commodore 64 sound engine]. Then memory became more available for audio, so games switched to storing sounds in that and playing back like a sampler. And now increases in CPU power have meant that real-time generation is becoming popular again.
"On Mass Effect 3, we used real-time generation for some wind noise, for example, which allowed us to dynamically change the wind sound as you approach a huge dust storm. It allowed us to create a much more reactive and evolving sonic response than if we'd just used crossfaded samples. That's the power of procedural audio, and I think we'll see it used more in games in the future.
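The dust-storm wind Rob describes can be illustrated with a minimal procedural-audio sketch: filtered noise whose loudness and brightness are driven continuously by a gameplay parameter, rather than crossfading between pre-recorded samples. This is an assumption-laden toy, not BioWare's actual implementation; the parameter name `proximity` and all constants are invented for illustration.

```python
import numpy as np

def wind_layer(proximity, n=48000, seed=0):
    """Procedural wind sketch: filtered white noise whose gain and
    brightness scale with how close the player is to the storm.
    `proximity` runs from 0.0 (far away) to 1.0 (inside the storm)."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n)
    # One-pole low-pass filter. A smaller coefficient passes more
    # high frequencies, so the wind grows harsher as the storm nears.
    coeff = 0.99 - 0.9 * proximity
    out = np.empty(n)
    acc = 0.0
    for i, x in enumerate(noise):
        acc = coeff * acc + (1.0 - coeff) * x
        out[i] = acc
    # Overall level also rises with proximity.
    return out * (0.1 + 0.9 * proximity)
```

Because `proximity` can be updated every audio block from the game state, the sound evolves smoothly with player movement, which is the advantage over sample crossfades that the quote describes.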
"One of the key things we learned over the years [working on the previous Mass Effect titles] was how important focusing the mix was. We implemented some tools that let us control every aspect of each individual sound's playback based on how 'on camera' it was, which really helped focus the mix on what you were looking at. We also shifted towards more tonal and less noise-based sounds, as these cut through the mix much better.”
From an artistic point of view, much of the music also needs to be open-ended, often setting a mood rather than being tightly synchronised with the unpredictable on-screen action. "About a third of the music for BioWare games is composed 'to picture' in the same way you would approach a film. The rest is used during gameplay that is not sync'ed, for example 'exploration' or 'combat'. So the composers are writing more to style guides, reference pieces and general direction.
"Usually, we'd send over a video of a whole level, with a document listing all the required assets, with timecode references from the video, some general direction and reference pieces for each track. I see our job as making the composers' lives as easy as possible: they don't have the luxury of being able to swing by the writer or level designer's office to chat [none of the composers used on Mass Effect 3 worked in-house], so we try to supply as much info as possible. Normally, the assets we ask for are actually pretty short, maybe 45 seconds each. Then the audio system mixes them together to fit the action.
"We took the unconventional approach of hand-scripting all of the music on Mass Effect 2 and 3. Most games have 'explore' and 'combat' music for each level that's controlled by global code, but we wanted to support the narrative rather than just the gameplay; perhaps a certain combat isn't significant enough to require combat music, or an explore section should feel more like combat because you're escaping an enemy base. So every section in the game is documented and hand-scripted, and we have to get the composers to write for the context, rather than just some global 'states'. I think the end result sounds much more cinematic, and you get less of the awkward and jarring music changes that we've had in the past.”
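The difference between global music states and per-section hand-scripting can be sketched in a few lines. In a hypothetical global-state system, the cue depends only on whether combat code is active; in a hand-scripted one, each documented section names its own cue regardless of the gameplay state. All section and cue names below are invented for illustration.

```python
# Global-state approach: the same rule applies everywhere.
def global_state_cue(in_combat: bool) -> str:
    return "combat" if in_combat else "explore"

# Hand-scripted approach: every section in the game is documented
# and mapped to the cue the narrative calls for, even when that
# contradicts the raw gameplay state.
SECTION_SCRIPT = {
    "cargo_bay_skirmish": "explore_tense",  # fight too minor for combat music
    "enemy_base_escape": "combat_heavy",    # 'explore' section scored as combat
}

def scripted_cue(section: str) -> str:
    # Undocumented sections fall back to a neutral ambient bed.
    return SECTION_SCRIPT.get(section, "ambient_default")
```

The scripted table is more work to author and maintain, which is presumably why Rob calls the approach unconventional, but it lets the score follow narrative intent rather than just code-detected combat.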
Not only did the music itself need to be flexible, but so too did the team producing it. Just as in film-making, the music was added late in the production cycle, and it was necessary to retain the ability to respond to late changes made to the game itself. This, as Rob Blake explains, was one reason why they chose samples and synths over the option of hiring an orchestra: "The vast majority of the music was not recorded live. We had a few sessions for vocals and individual instruments, but the rest was all in the computer. Early on we had some plans to use one of the bigger orchestras for the score, but over time it became increasingly clear that the iterative development process we have on Mass Effect was not going to suit an early lockdown for music. Not only does the content get constantly iterated on, but the director, Casey Hudson, likes to have the ability to make changes late in production as the final experience presents itself. I think this iterative process is one of the reasons BioWare have been so successful, but it makes early lockdown of assets very difficult.”
When it comes to predicting how their work will be heard by players, BioWare are not working in the dark. "The telemetry feedback from our games says that around 75 percent of players are listening in stereo, most of them probably on flat-screen TVs without much bass response,” says Rob. "Of course, that means a quarter are listening in surround, probably on home theatre setups, so it's not all bad news, and we invest a lot of time and money into surround sound.”
With this in mind, the facilities at BioWare are designed to reflect the circumstances in which the game audio will eventually be heard, instead of providing a perfectly neutral or high-fidelity reference system. "Each sound designer has a 5.1 set of Genelec speakers running bass-managed — as most home theatres are — plus a secondary low-grade stereo system that approximates an LCD TV. Some of us have an even worse-quality third system, which can be painful to listen to!
"We're based in an office building and unfortunately don't have custom-built sound studios, but generally we're trying to mimic home environments as much as possible. We're in the process of building a mix room with a wider range of listening setups and a more calibrated environment.
"Because of the amount of data we deal with and the fact that we're always working to picture, we tend to have more picture monitors than most studios; I have four, some of which have multiple inputs. We also work on multiple end platforms, so we have PS3s, Xboxes and multiple PCs. Unfortunately, the development game consoles usually don't play retail games! Aside from that, we have the usual array of synths, control surfaces and rats' nests of cabling.”
The need to support "multiple end platforms” is a big deal. Making the same game run successfully on general-purpose PCs and on two different consoles can be quite a challenge. "We used the middleware tool Wwise by Audiokinetic on Mass Effect 2 and 3. It's a very flexible and open-ended system that suits the large-scale nature of our games very well. The documentation and support are excellent, and you can download a demo if you want to check it out! Each platform has its challenges, and it tends to change depending on the nature of the game. For Mass Effect 3, with PCs it's the infinite number of hardware configurations and conflicts. With Xbox, it tends to be CPU limitations. With PS3, it's memory restrictions. Because of all these limitations, each sound is optimised for each platform, which obviously takes a fair bit of time.
"We tend to prefer working on PC because it's considerably faster; there is no 'cooking' of content required, where the game is assembled and copied to the console, and the game content is easier to access. However, the consoles are considerably more prone to problems thanks to their limitations, so we have to regularly check on each platform.”
Increasingly, says Rob, working on game audio requires plenty of specialist technical expertise. Being a talented audio engineer or sound designer is not enough: anyone wanting to get into the field will need skills that are not readily transferable from other fields. "Game audio has become increasingly detached from other audio production over the last five years. We now have highly specialised tools and skills that are quite distinct from anything in film, radio or music. As this continues, the barrier will soon become a problem for people hoping to jump from other audio disciplines; at times it feels like we're more akin to game designers than sound designers or composers.
"We look for people with a real sense of attention to detail: we work with large amounts of content, and there are many things that can go wrong, so it helps if you're obsessive over the details! Naturally, people have to be excellent sound designers, inventive and open to experimentation, as well as highly creative with audio. Like I mentioned before, our work is very technical, so in addition to the sonic side we also love working with people that are very comfortable with computers and complex systems. Lastly, game development is highly collaborative… so you need to be an excellent communicator and collaborator in addition to all that other stuff. Quite the list of requirements!”
In other words, no-one said it was going to be easy, but anyone who makes the grade will have the rare luxury of working very much at the cutting edge of new technology. "It's a really interesting time for game audio. The technological restrictions are becoming less restrictive, the tools are becoming more powerful, and we're starting to see what is possible with interactive audio as a result. I say it's interesting because I think it compares to how keyboards have evolved over the years: starting with early analogue synths, to digital samplers playing back samples, and finally back to real-time generation with physical modelling.”
Even the biggest movie scores rarely employ as many composers as the five who worked on Mass Effect 3. Writing all the music would have been too large a task for any individual to perform in the limited time available, and the modular or sectional nature of the game format lent itself to a multi-composer approach. "Each of the five composers brought something really special to the project,” insists Rob. "We'd worked with Sam Hulick on Mass Effect 1 and 2 before, so he knew the franchise really well and we focused him on some of the meaty plots that tied in to the overarching trilogy arc. Clint Mansell we hadn't worked with before; he's obviously known for his large and emotional film scores such as Requiem For A Dream, Moon and Black Swan, so he helped us build the emotional backbone and some key themes used throughout the game. Then there was Cris Velasco, Sascha Dikiciyan and Christopher Lennertz, all of whom we'd worked with before on downloadable packs, and we ensured their stylistic touches were suited to the levels. For example, Sascha has an amazing array of analogue synths, so we ensured that he worked on missions associated with the Geth, a very technologically focused race.”
All five composers work in their own studios, and take different approaches to producing finished music. Naturally, this sometimes meant a certain amount of going back and forth between BioWare and the composers. "The composers all work very differently and we try to accommodate their different production styles. Most pieces were completed with two or three iterations, but some tracks took a lot of iteration to capture the right feel; I think we maxed out on about 10 iterations for one particular track.”
With five different composers working independently, there could be a risk of the end result not seeming homogeneous from a sonic point of view, but again, the modular nature of the game means this wasn't a big problem for Blake and his team. "Cris and Sascha mix their own music, so it basically comes to us finished as soon as we approve the underlying composition. The other composers sent through unmixed versions, which we'd approve, and then they each went through a separate mixer. Each composer tends to want to use their own people and, while I had concerns over consistency, I think the end quality gain from building on existing relationships pays off. Sonic differences between levels are actually something that suits the game, because we want levels to feel unique; there's always a break between different levels as you have to return to your ship between missions, which we kept music-free as a sort of 'palate cleanser'.
"We usually ask for stems because we never know what's going to happen to a scene. We often get caught out when a scene changes at the last minute, and getting stems helps us rebuild things more easily. We can also use the stems in different sections that we normally wouldn't want to re-compose for; essentially we're fighting the fact that we have a large amount of content, so stems are a tool that help us to get more coverage.”
Rob Blake prefers to avoid using 'temp tracks' where possible, instead relying on the composers' imagination. "For Mass Effect, I think there are two aspects of music direction: aesthetic and emotional. One of the great things about working on the end of a trilogy is that we can use our own content as stylistic reference material. We have about six hours of music from Mass Effect 1 and 2, so the aesthetic direction was never going to be an issue, as our music style was pretty well set. For the emotional direction, we tend to prefer directing with words rather than reference pieces, as composers find that more liberating, and we get their interpretation of the emotion rather than their interpretation of the reference piece. So we'd send over a lot of notes and have long phone discussions about the emotions and context. Sometimes with complex pieces and cinematics it can be hard to steer the emotions in the right way, so if the composer is not capturing the feel in the way we want, we'll then send over a reference piece as an emotive guide, which often clarifies things when words don't quite cut it.”
"No sound designer likes using libraries, but schedule and accessibility make libraries inevitable for certain sounds, especially environmental audio,” says Rob Blake. Where possible, though, the sound effects in Mass Effect 3 were created specifically for the game. "In total, I'd say at least half of our sounds were sourced from original BioWare recordings, but it's hard to put a figure on it, really! We actually did a lot of effect recording sessions and the results were invaluable to the final sound. We take pride in creating unique-sounding games and always strive to create things that sound a little different. For example, two people spent almost a year each working on weapons and 'powers' [Mass Effect's 'magic'] in order to create diverse and iconic audio.
"We had a couple of sessions at a car wrecker's yard where we got to destroy an armoured truck, a tank recording session with the Canadian armed forces, and [a session] at a welding shop recording lots of interesting machinery. We've also got a small studio at BioWare, which we often use to record random items people bring in.
"We bought some high-definition microphones that record frequencies up to 100kHz, so when you play the recordings back at slower speeds you hear all these crazy frequencies that are totally inaudible to the human ear. It's interesting stuff, especially the human voice, which you wouldn't expect to have a lot going on in those higher frequencies. Many of the creature sounds are actually sourced from the sounds the team recorded with these mics, slowed down and processed with whatever random esoteric processing the team was into at the time.”
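The slowed-playback trick Rob describes follows directly from how sample rates work: playing the same samples back at a lower rate divides every frequency in the recording by the speed ratio, pulling ultrasonic content down into the audible band. The sketch below assumes a 192kHz capture rate and a four-times-slower 48kHz playback rate purely for illustration; BioWare's actual rates aren't stated in the article.

```python
import numpy as np

REC_RATE = 192_000   # assumed capture rate for the 100kHz-capable mics
PLAY_RATE = 48_000   # reinterpret the same samples four times slower

def ultrasonic_tone(freq_hz, seconds=1.0, rate=REC_RATE):
    """Stand-in for a recording: a pure tone above human hearing."""
    t = np.arange(int(rate * seconds)) / rate
    return np.sin(2 * np.pi * freq_hz * t)

def dominant_freq(samples, rate):
    """Find the strongest frequency when played at the given rate."""
    spectrum = np.abs(np.fft.rfft(samples))
    return np.fft.rfftfreq(len(samples), 1.0 / rate)[np.argmax(spectrum)]

# A 40kHz squeal is inaudible at capture speed, but reinterpreting
# the samples at 48kHz shifts it down to 40_000 * 48/192 = 10kHz.
recording = ultrasonic_tone(40_000)
```

No resampling is needed for this effect: simply telling the playback engine the file has a lower sample rate slows it down and pitches it down together, which is why material recorded with extended-bandwidth mics yields usable audible content where ordinary recordings would just reveal filtered silence.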
Simply keeping track of the vast array of sound effects was a crucial job in itself. "The game shipped with over 12,000 sound effects built out of a whole catalogue of source material, often created from the recording sessions. So with this much data, we have to be pretty organised. In 10 years' time they may rerelease Mass Effect on the iPhone of the future, and we need to be prepared for that.
"We use the librarian software Basehead at BioWare, and most of our franchise content goes in there, with metadata added for clarity. Generally, though, it comes down to careful organisation and structuring of data, and this happens at all times during production. We often have to share assets around the department, so we have careful naming conventions to make things clear. Also, it helps that we're all in the same studio — it's easy to just ask someone what a specific sound is. Organisation gets considerably more important when you're working with external studios or contractors.”