Artist interview

Top Games Mixer Talks Mixing, Reverb And Workflow

16th August 2023

Score mixer and producer Rich Aitken has earned tremendous respect as a sound production professional for around 25 years. The driving force behind UK-based Nimrod Sound, Rich manages a diverse range of recordings, collaborates with composers, and delivers sonically engaging mixes to bring out the best in a musical piece for film, TV, or gaming.

His remarkable credits include scores that have won or been nominated for Ivor Novello, Emmy, BAFTA and MASA awards, from Killzone to Casualty and Netflix’s Treason to Clumsy Ninja. His more recent work includes Horizon Zero Dawn, Missing: Lucie Blackman, Deadend: Paranormal Park, Ten Pound Poms, Silent Witness and Finding Michael.

We caught up with Rich to find out more about his career, his work processes and the distinctions between mixing for game music and film scores.

Tell us about your audio journey so far and how Nimrod came into being…

“I’ve always been into music (which is what everyone says!). In the mid-nineties, after many years of putting out independent singles with various small labels, I managed to land a publishing deal and a record deal with majors. I took that money and bought a Pro Tools rig. I learned how to use it and gained a heck of a lot of clients because I had that rig – I was ahead of the curve, at the front end of what I’d call the lower-priced side of digital recording and mixing and all that kind of stuff.

“My first commercial work, apart from putting out independent records and playing bass on other people’s records, was video games and some low-end TV shows. I lucked out by meeting the writer of the Driver video game by accident, and he contracted me to make some music for Driver 2 – that was the launchpad.”

Would you get a temp music score as they’re building the game?

“Definitely. And the composers I worked with were always complaining, ‘they want me to make this super expensive temp music score on a much lower budget with samples’. Or occasionally you work for someone you know, like SCEA (Sony) or Ubisoft, and you get given the budget to go into Abbey Road and do proper recordings, or to spend time developing a truly cool sound palette. But yeah, there’s definitely some temp. Film can be quite frustrating with temp love because it’s got a strictly adhered-to narrative element, whereas a lot of music use in games is context-based, so I think there’s some freedom in that for people. But the narrative elements of games are probably just as locked in as film, even though the music may come earlier in the cycle than it does on film.

“When I’m mixing a film or TV score, I’m right at the end of the cycle really, and we’re delivering to the dub stage just as they’re finishing doing dialogue passes. There are a couple of shows recently where we’d finished mixing the music and they were on TV two weeks later, so you really are last in the chain in TV and film.

“We’re often working with the people at the dub stage to get the stemming changed to how they prefer it. There’s often a lot of feedback coming back saying, actually, can we have this particular instrument isolated because it’s clashing with an element of dialogue or something? We can deliver that quickly and accurately.”

In terms of deliverables, how do you deliver to a game?

“It really varies. For some games it’s just a stereo, 5.1 or 7.1 mix. We did some ambisonic stuff for the Edge of Infinity Dr Who game a couple of years ago, and that was quite unusual to work on because you’re working in headphones with a head tracker, so that was interesting.

“But, generally, it’s not dissimilar to films, although sometimes when I work with Joris De Man (Dutch composer and sound designer), for example, there are guided stems – i.e. these bits are going to be exchanged for this element if it’s being reused in different ways. So, you might have a different percussion stem to go with the same piece. I think that’s getting a lot more intricate and complicated.

“Twenty years ago it was just, ‘here’s a piece of music, make sure it loops seamlessly’. When you are providing content that’s going into a game engine you are often providing intros, loop beds, outros and crossfades – a little bit more technically varied than narrative film content in that respect.”
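As a rough sketch of those timewise assets (plain Python with numpy; the buffer lengths, names and the assemble_cue helper are invented for illustration, not taken from any particular engine), here is how an intro, loop bed and outro might be stitched into one continuous cue:

```python
import numpy as np

SR = 48_000  # assumed sample rate of the delivered assets

def assemble_cue(intro: np.ndarray, loop_bed: np.ndarray,
                 outro: np.ndarray, n_loops: int) -> np.ndarray:
    """Stitch intro -> looped bed -> outro into one continuous cue.
    For the joins to feel seamless, the loop bed has to be delivered
    so that its tail butts against its head without a click."""
    return np.concatenate([intro] + [loop_bed] * n_loops + [outro])

# Illustrative use: an 8-second intro, a 16-second bed repeated while
# the player stays in the scene, then a 4-second outro.
intro, loop_bed, outro = np.zeros(SR * 8), np.zeros(SR * 16), np.zeros(SR * 4)
cue = assemble_cue(intro, loop_bed, outro, n_loops=3)
```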

And then what you deliver to them is stitched into the game engine?

“Yeah, that’s right. It either goes into some middleware like Wwise or Fabric or it goes direct into something like Unity or whatever bespoke engine the game company is using.

“Sometimes they’ll have their own specification of how they want things laid out. I did a game recently which used rock music for the gameplay stuff, and they had increasing complexity of drum tracks, while the guitar tracks and everything else would remain fairly static. Then there were stingers, which were almost like heavy metal guitar fills – basically the equivalent of jumping up and grabbing some kind of prize in the game. And then the narrative elements were more electronic and orchestral in nature, but the in-game stuff was very Iron Maiden-y really.”
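A toy version of that vertical layering (the layer names and intensity thresholds below are invented; this sketches the general technique, not the actual game’s logic): guitars stay static, drum layers fade in with gameplay intensity, and a stinger is summed on top when the player grabs the prize.

```python
import numpy as np

def layer_gains(intensity: float) -> dict[str, float]:
    """Map a 0..1 gameplay intensity to per-layer gains: a static
    guitar bed plus progressively busier drum layers."""
    def ramp(lo: float, hi: float) -> float:
        return min(1.0, max(0.0, (intensity - lo) / (hi - lo)))
    return {"guitars": 1.0,            # always on
            "drums_basic": ramp(0.0, 0.1),
            "drums_busy": ramp(0.4, 0.6),
            "drums_full": ramp(0.7, 0.9)}

def mix_layers(layers: dict[str, np.ndarray], intensity: float) -> np.ndarray:
    """Sum the layers at the gains dictated by the current intensity."""
    gains = layer_gains(intensity)
    return sum(gains[name] * buf for name, buf in layers.items())

def add_stinger(bed: np.ndarray, stinger: np.ndarray, at: int) -> np.ndarray:
    """One-shot 'guitar fill' stinger summed over the bed at sample `at`."""
    out = bed.copy()
    out[at:at + len(stinger)] += stinger[:len(out) - at]
    return out
```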

So is each project unique, in the sense that there’s no templating, because each games house might come to you with a completely different set of deliverables?

“Certainly, there are two kinds of deliverables in games. One is the vertical deliverable, which is remarkably similar to delivering stems to film anyway. And then there’s the timewise deliverable, which is ‘we need this bit for three seconds, then we need this rendering, and this has to be a two-minute loop’. That’s what I call X-axis stuff – it’s on the timeline, and we make sure we set those things up.
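One way to picture the two kinds of deliverable is as a hypothetical spec (the class and field names are ours, not an industry format): the vertical deliverable splits a cue into stems, while the timewise one pins a render to a role and duration on the timeline.

```python
from dataclasses import dataclass

@dataclass
class VerticalDeliverable:
    """Stem split of one cue - essentially the same ask as film stems."""
    cue: str
    stems: list[str]        # e.g. ["percussion", "strings", "lead brass"]

@dataclass
class TimewiseDeliverable:
    """X-axis slice: a render with a fixed role and duration."""
    cue: str
    role: str               # "intro", "loop", "outro", "stinger"
    duration_s: float       # e.g. a two-minute loop -> 120.0
    must_loop: bool = False

spec = [VerticalDeliverable("combat_theme", ["percussion", "guitars", "synths"]),
        TimewiseDeliverable("combat_theme", "loop", 120.0, must_loop=True),
        TimewiseDeliverable("combat_theme", "stinger", 3.0)]
```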

“We do use templates actually. We have a lot of templating here at Nimrod. We’ve got a whole team who set up template sessions for me in various formats – stereo, 5.1, 7.1 – and the templates provide the basis of the routing and where extra things come into play.

“We’ve got a whole load of A, B, C, D tracks, which can be used for different things, but generally we’ve got tracks called percussion, strings, lead brass, deep percussion effects, guitar, whatever – all those kinds of things. And most things will fit into that framework, whether it’s a TV advert or a video game, and then the bit that differentiates is probably more the creative element. So, listening to what the composer wants sonically seems to be the biggest element now. And then anything in games that requires a different kind of stemming will generally fit in that template in some way.
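In spirit the template is just fixed routing that survives from project to project; a simplified stand-in (the stem bus names echo the ones Rich lists, everything else here is invented):

```python
def make_template(fmt: str) -> dict:
    """Skeleton of a mix session for one delivery format: generic
    A/B/C/D input tracks get repurposed per project, but they always
    feed the same consistently named stem buses, so the routing never
    has to be rebuilt."""
    return {"format": fmt,  # "stereo", "5.1" or "7.1"
            "input_tracks": ["A", "B", "C", "D"],
            "stem_buses": ["percussion", "deep percussion", "strings",
                           "lead brass", "guitar", "synths", "sfx"]}

templates = {fmt: make_template(fmt) for fmt in ("stereo", "5.1", "7.1")}
```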

“With film and TV I tend to pretty much stem out along similar lines all of the time but games will have some unique focus because someone will say, very specifically, this instrument only happens when your hero makes a particular achievement. So that will go on to something we might label the ‘hero’ stem or something like that.”

So, in a gaming situation, you don’t have a conventional linear timeline, like you would in a song or score?

“You do in some ways, because there’s a vertical slice, to use a games-industry term. ‘Vertical’ in music is your instruments: running from top to bottom is your deep instruments, your percussion instruments, your metals (which is cymbals and clangy sounds), percussive effects, basses, guitars, acoustic guitars, strings, lead strings, brass, lead brass, wind, lead wind, synths, arpeggiated synths, pad synths, SFX. Things like that I call the vertical arrangement, because that’s stacked up vertically on my screen. And then left to right is the notes going from the beginning to the end.”

So it does have a conventional timeline like it would in a song or a movie?

“The bits of music do, and how they’re assembled does have a timeline, but the assembly could change, so bits can be swapped in and out depending on different triggers in the game. So, you would have an idea that you are playing music from A to B, but the selection of the pieces of music may change – the equivalent would be like having whole pieces of music on faders and swapping between them.

“You might just take some rock drums out and replace them with taiko drums very quickly on a crossfade. Or you might even do whole pieces of music crossfaded into each other. Or you might buttress two pieces of music up against each other and put a transition sample between the two that makes the join feel smooth.”
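At its simplest, that quick drum swap is an equal-power crossfade between two phase-aligned stems; a minimal numpy sketch (the fade point and length in the usage comment are arbitrary):

```python
import numpy as np

def crossfade(a: np.ndarray, b: np.ndarray, at: int, fade_len: int) -> np.ndarray:
    """Play stem a up to sample `at`, crossfade to stem b over
    `fade_len` samples, then continue with b. Equal-power (cos/sin)
    curves keep the perceived level steady through the join."""
    assert len(a) == len(b) and at + fade_len <= len(a)
    t = np.linspace(0.0, np.pi / 2.0, fade_len)
    out = np.empty_like(a)
    out[:at] = a[:at]
    out[at:at + fade_len] = (a[at:at + fade_len] * np.cos(t)
                             + b[at:at + fade_len] * np.sin(t))
    out[at + fade_len:] = b[at + fade_len:]
    return out

# e.g. swap rock drums for taiko drums with a 100 ms fade at 48 kHz:
# mixed = crossfade(rock_drums, taiko_drums, at=96_000, fade_len=4_800)
```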

Do you deliver a folder full of stings and samples that they’re using?

“Yes, definitely. They’re very much specified by the composition team, and often by request of the production team on the game, and will be things like beds and stingers and endings and intros and narrative elements. I would say games are more bespoke and specific in their technical deliverables than film, TV and advertising. I’ve got a set of stems I tend to deliver on films, and nine times out of ten they’ll be broadly similar.”

How does reverb work in all of this? Are you delivering all this stuff with reverb baked in?

“Yes, absolutely. We are delivering it all baked in, but we’re doing it per stem, so we don’t have a master fader of, ‘hey, here’s the plate reverb for this project’. If we’re using plate reverbs throughout the whole thing, every stem will have its own plate reverb and all the sends will be localised to the stem group. I typically have two, sometimes three, reverbs per stem group, and they do change – they’re not all going to be a hall or a small room or something, they’re going to vary depending on what the intent is for that particular reverb.

“I’ll use Seventh Heaven quite extensively, for example, on all of my orchestral buses, and I use it quite a lot on my percussive stuff as well – definitely a lot on vocals – and they’re all going back to their own stems. So, any vocals will come out to the reverb, and that reverb is included as part of that stem, baked in.

“Very occasionally we’ve been asked to supply the reverbs separately so that there can be some contextual change, but I’m not a big fan of doing that, because you end up being able to change the mix too much, drifting a fair way from the artistic intent of the composer and the artist.”

But sometimes that’s an artistic concern, not a technical one? You don’t get phasing issues and stuff like that if they don’t trigger at the same time?

“All of my stems get rendered at the same time so that they phase cancel. I don’t go through various things going, ‘right, here’s this version, here’s that version’. We don’t want that phasing going on anywhere, so everything goes down in one go so that we can mitigate those horrible sonic anomalies.

“All the stems will add up exactly to the mix that’s being supplied as well, because the mix will come from the stems. I don’t use mix buses at all apart from on groups, and those are generally side-chained together or linked in some way so that they operate as a mix bus but provide you with a proper set of stems which phase cancel with the mix. If we don’t get silence, we know something’s gone wrong.”
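That null check is easy to automate; a small sketch (the -90 dB threshold is an arbitrary choice):

```python
import numpy as np

def stems_null_against_mix(stems: list[np.ndarray], mix: np.ndarray,
                           tol_db: float = -90.0) -> bool:
    """Sum the delivered stems, subtract the printed mix, and confirm
    the residual is effectively silence. Anything above the floor
    means a routing or processing error crept in somewhere."""
    residual = sum(stems) - mix
    peak = float(np.max(np.abs(residual)))
    peak_db = 20.0 * np.log10(peak) if peak > 0.0 else -np.inf
    return peak_db < tol_db
```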

I’m guessing then that you’re not using buses for reverbs in the conventional sense – that the reverbs are sitting on separate channels or stem channels?

“We are using buses, but the buses are used within the stems. So instead of having a reverb return going to the mix bus, just imagine that each of my stems is actually its own mix plus returns. I’ll have a set of deep reverbs, a set of percussion reverbs, a set of ticky-ticky reverbs, a set of string reverbs, and they’ll all feed back into the stem itself.”
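In signal-flow terms, each stem group is self-contained: its reverb sends return inside the group rather than to a shared return on the mix bus. A simplified render of one stem under that scheme (the toy reverb merely stands in for whatever plugin sits on the return):

```python
import numpy as np

def toy_reverb(x: np.ndarray) -> np.ndarray:
    """Stand-in for a real reverb plugin: a short decaying impulse
    response convolved in, just so the sketch runs."""
    ir = 0.6 ** np.arange(24)
    return np.convolve(x, ir)[:len(x)]

def render_stem(dry_tracks: list[np.ndarray],
                send_gains: list[float]) -> np.ndarray:
    """Render one self-contained stem: sum its dry tracks, then add
    the stem's own reverb returns (two or three per group, per the
    interview) back into the same bus. With no shared mix-bus return,
    summing all stems reproduces the full mix, reverbs baked in."""
    dry = sum(dry_tracks)
    wet = sum(g * toy_reverb(dry) for g in send_gains)
    return dry + wet
```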

It sounds like you’ve got a lot of reverbs running on a session then?

“Oh, yeah. Easily on most sessions there’s twenty, on a big session maybe sixty, seventy.”

And that’s Seventh Heaven and Cinematic Rooms?

“A lot, yes. I’ve really favoured Seventh Heaven because I used to have a Bricasti and it just matches for me. It just feels like it’s got the same tonality.”

How has the CPU load been when you’ve got sixty or seventy LiquidSonics reverbs sitting on a session?

“I haven’t seen any issues. I do a lot more mixing than recording. I’m running a large buffer size (2048) on a native platform.”
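For context, assuming a 48 kHz session (he doesn’t specify the rate), a 2048-sample buffer works out to roughly 2048 / 48000 ≈ 43 ms of latency – irrelevant at mixdown, though far too high for tracking, which is why mixers can afford buffer sizes that recording engineers can’t.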

Gaming is as big as movies these days and there’s a lot of people out there who’d love to get into sound for video games. What advice would you give to somebody who’s interested in making music and mixing for games?

“I made a very conscious decision a good twenty-odd years ago that I wanted to be involved in the sonic elements. I think being a composer and being a mixer (I know a lot of people have to do both) really are two different disciplines. So, my advice would be, as early as you can, build a team. That team might be just two of you, as it was with me and Mark Cannon back in the late nineties/early 2000s, but build yourself a team, even if you’re not always using that team. Just have a team ready.”

What about workflow?

“Keep track of your sessions and be consistent in how you put sessions together. Don’t be ad hoc.

“A lot of composers use big templates, but as a mixer, use the idea of templates as well – specifically for routing, so you can repeat things over the various cycles of a project. And then the other thing I’d say is make sure you keep fully up to date with what’s going on in technology land, because there’s no point in saying, ‘I want to use blah blah mixing consoles and outboard gear’ when that’s just not what we’re doing. This is a job with a lot of recall.”

How does gaming differ from a conventional studio?

“We’re not delivering on consoles with outboard gear, even though that’s the sort of fantasy island of working in the studio context. We really need to be delivering digital assets with proper recall, so if you’re going to use any outboard gear and things like that, make sure they’re printed and that’s part of your production cycle. I really don’t think mixing these days is about using outboard gear – I know there’ll be some people out there cursing me and calling me an idiot for saying that – but we’ve got to be able to bring up a session ten months down the line and have it sound exactly the same as what was delivered, before any changes that are required. Using outboard gear is laborious, and plugins today are so good that I don’t need it on a live mix.”


Try LiquidSonics Reverbs

From the smallest one-man independent development studios to the biggest games publishing giants in the world with teams in the hundreds, there’s never been greater demand for music in video games than there is today. Whether you’re looking for a career in scoring for film, TV or games, you’ll need a good quality reverb! All of the LiquidSonics reverbs are available to try free for 14 days – just head to our demos page to drop a code into your license manager and pick up the installers from the downloads page.

If you like what you’ve seen and heard, our reverbs are available in the LiquidSonics store – and don’t forget that existing customers can use their loyalty discounts for some incredible stackable savings.