Artist interview

Inside Hans-Martin Buff’s Grammy-Winning Approach to Rooms and Reverbs

Grammy-winning engineer Hans-Martin Buff wearing headphones at Real World Studios.

Grammy-winning engineer and producer Hans-Martin Buff has spent more than three decades inside other people’s music. Prince, Scorpions, Peter Gabriel – his discography reads like a tour through modern pop and rock history.

Tearjerkers is different. It is his own record, his first album as an artist on IAN Records and now a Grammy nominee for Best Immersive Audio Album at the 2026 Awards. It is also a quiet provocation. The project was conceived, recorded and mixed for immersive formats from the start. There is no “real” stereo version, only dedicated binaural and loudspeaker mixes.

Ask Hans-Martin about reverb, though, and the conversation does not begin with a list of favourite plug-ins. It begins with a joke at his own expense.

“I’m such a princess about reverbs,” he laughs. “I’m really picky about what they do and how long they’re allowed to hang around.”

Importantly, this is not a dismissal of plug-ins. Quite the opposite. For Hans-Martin, high-end reverbs are precision tools. They are the knives and pans of the kitchen, the things that need to be sharp, reliable and well-made. He has a deep respect for good algorithms, and Cinematic Rooms Professional has become one of his most trusted tools in recent years, precisely because it lets him solve difficult spatial problems without drawing attention to itself.

His instinct, shaped by years of relatively dry Prince mixes and a later obsession with recording real rooms, is to start with acoustics whenever the opportunity exists and then use plug-ins to finesse what reality has given him.

This article follows three concrete use cases that sit at the heart of his current work:

  1. Lasse Nipkow’s “direction-agnostic” wide-spaced 3D microphone technique on “My Sweet Siren Sings” and “Not For You”, which captures only the reverberant field in large spaces while suppressing direct sound and early reflections.
  2. Guerilla reverb recording in a wartime bunker in Hamburg for “You’re With Me” and the guitar solo on “2 After 2”.
  3. The way Hans-Martin uses Cinematic Rooms Professional on Peter Gabriel’s i/o (In-Side Mix) album and on the Scorpions live album Coming Home Live, often as a sophisticated problem-solver that extends and stabilises real spaces.

On this journey into Buff’s world of reverb, his core philosophy comes into focus: keep the direct sound glued to an authentic acoustic whenever you can, but don’t be afraid to augment that room synthetically when you must.

A Different Relationship with Reverb

Q) How do you think about rooms and reverbs in general? Your approach seems quite different from many engineers.

A) I come from a fairly dry background. The Prince stuff was probably the one period in recent recording memory where reverbs weren’t really hip. As the number of tracks became less precious, I developed a habit of just over-recording. I tend to record rooms if I can, especially in immersive situations that have decent rooms or where the instrument gives off something beyond just a snare hit.

Artificial reverbs, in general, are more of a tool to me. Like a kitchen knife. It’s not a specific ingredient that I feel I really need, and realising that was a real breakthrough. I have like three or four presets in my sessions: one is Cinematic Rooms Professional, one is FabFilter, one is Exponential Audio, and then there’s a bunch of delay setups.

Q) When do you reach for synthetic reverb versus recording the actual space?

A) If I can, I’d rather record the room again. I love that moment where the direct sound is glued to the room sound, and I capture pieces of that relationship, rather than using a room as a generic sauce on something that has nothing to do with that space. 

Having said that, Cinematic Rooms Professional is great, and every time I’ve used it, it’s really, really good. I use it specifically for those types of mixes where I get the full session of somebody who worked in stereo, and then I get to make an immersive mix. I try to use either what they have in a multi-channel setup, or I try to recreate stereo reverbs that they used in immersive. And for this, I use Cinematic Rooms Professional.

Stereo room recording technique with a pair of microphones and a towel on a coat hanger for isolation.

Hans-Martin often records rooms with a direction-agnostic approach following a technique developed by Swiss engineer Lasse Nipkow.

The Nipkow 3D Technique: Capturing the Diffuse Field

Throughout our conversation, Hans-Martin repeatedly references a 3D recording technique developed by Swiss engineer Lasse Nipkow. This approach to room capture represents a radical departure from conventional microphone placement and was employed extensively on Tearjerkers.

Q) Can you explain Lasse Nipkow’s 3D technique and how it differs from traditional room miking?

A) The essence of Lasse’s technique is that you’re not capturing the direct sound and the early reflections. You’re only recording the reverberant field. If you place a traditional microphone array in front of a drum kit, you always know where the kit is. You get that direct impact first, then the early reflections, and only later the diffuse tail. With this setup, the mics are pointed away so you don’t really get that direct hit; you get only the “room” part of the room.

What he does is he records the reverberation independently from the early reflections and the direct sound. He does so by just placing tons of mics in a space and trying to put barriers between the mics and the direct sound. And the idea behind it is that it’s not directional. And then afterwards, you can listen to however many mics you put up and go, “Okay, this frequency range or transient setup is better for the front, and this is better for the heights.” So you can place them wherever and create this orb of room that doesn’t have a direction.

Q) What makes this approach so attractive for immersive work?

A) I like to call it direction-agnostic. You capture the beauty of a space without being tied to where the original source was. You’re recording the room as an instrument in itself, rather than the room as a by-product of some source at a particular place.

The other important concept is that, unlike a fixed array, Nipkow’s approach isn’t a set picture. An array is like a photograph: eight mics in specific positions, and that’s the picture you must adhere to. With Lasse’s idea, you’re free. You can put up as many or as few mics as you like, park them where the room sounds interesting, and you’re not locked into a rigid geometry. For me, that’s very attractive creatively.

One of the great advantages of that technique is that it’s not a construction of eight mics with a fixed relationship to each other that you need to preserve for the sound picture to work. You can literally line them all up with any pre-delay, or none at all, and that works really, really well.

Q) Tell us about the “My Sweet Siren Sings” recording. How did the Nipkow technique work alongside traditional recording methods?

A) I did a recording at Abbey Road Studios where I had five guitars in a round, and I recorded them each directly and then panned that. And then I had a cube array in the middle, recording them as they were, and then I had Lasse’s miking setup throughout the room. We used as many Schoeps mics as they had and then added a bunch of DPAs. I think it ended up being about sixteen mics just for his setup, which is a bit much, but that’s what it was in that case.

Guitar recording setup at Abbey Road Studios, London, by Hans-Martin Buff.

In the audio example below, you first hear the direct mics panned, just the close microphones. Then you get the array in the middle between all the players. After that, you hear the Lasse Nipkow setup recording the room. Finally, you hear everything together, as it is mixed. This is a dedicated binaural render of the immersive audio.

Hans-Martin Buff's room recording setup at Abbey Road Studio Two featuring Lasse Nipkow's recording technique.

Q) What was the array configuration you used in the middle?

A) It’s a cube made out of Sennheiser MKH 8050s. It’s basically a Günther Theile cube, just smaller. Most of the arrays I saw when I started getting into immersive were these one-metre-plus squares with Schoeps mics in each corner, which made me think, a bit sarcastically, “yeah, that’s going to be great for my punk record.”

Around 2019, Hyunkook Lee set up all these crazy arrays he’d designed, plus some from other people, in a church to record the 3D-MARCo project. It’s actually pretty similar to the 2025 ECHO project: Everything is recorded at the same time, and you can listen back and compare. You could download the recordings and just compare how different arrays behave. 

That was a big one for me, because it wasn’t just an academic exercise. Hyunkook has that university-level geekiness, but he also really thought through whether you could actually use these arrays in everyday recording situations. That practicality was important to me.

At some point, I just did a bit of research and asked myself: “How do you make immersive music if you don’t just want to do a mixing project? How do you record for immersive?” In classical music, a finished piece of music fills a room, and you capture that; in pop, you often have a small room on the snare, a huge room on the vocal, and so on. The drum room is rarely as important as the close mics unless you’re chasing a very specific vintage vibe.

I did a drum recording session where I tried out different examples that seemed sensible, using mics you can more or less afford. From that session, I ended up really liking the cube for sonic reasons. It’s compact, practical, and sounds good.

So I was looking for ways to capture room immersion in pop music in a really cool way. Hyunkook’s work just kept popping up when I searched. That, combined with Lasse’s more unorthodox ideas, pushed me toward this notion that you can rethink overheads, room mics, and arrays rather than just copying classical techniques.

Q) What are the limitations of the Nipkow technique?

A) The main thing is that it doesn’t really work in very small rooms. In a tiny space, the whole thing collapses: you get too many early reflections, not enough of that nice diffuse tail, and the direct sound and reverb smear together into one big explosion. That might be fun as a crazy effect if you want something extreme, but it’s not what I’m after with this technique.

In the Hamburg bunker case (see below), I also noticed that if the reamp signal is too loud, the whole staircase just blows up. Everything becomes this huge wash that loses definition. When my reamp level was sensible, it worked well. But if it had been a live drum kit at full tilt in there, I’m not sure it would have behaved the way I want.

Q) What about equipment requirements? Does this need expensive microphones?

A) I admit to schlepping Behringers around for just that purpose, because it’s about quantity and not about greatness. You don’t need precise setups. I’m very much in the Steve Albini tradition in that sense: he was known for walking around the room, finding the place where the kit sounded exciting, and putting a mic right there. I love that approach. You capture a bunch of perspectives, and later on, you decide what’s useful. The ones that don’t work end up in this big “unused” folder in the Pro Tools session, and that’s fine. The randomness is part of the fun.

For additional resources on Lasse Nipkow’s wide-spaced 3D microphone technique, see the links at the end of this article, which include manuscripts from VDT tonmeister conferences and ICSA events.

The Hamburg Bunker: Recording “You’re With Me”

One of the most striking examples of Hans-Martin’s room-capture philosophy appears on “You’re With Me,” a track from Tearjerkers that began as a simple piano-and-vocal piece but was transformed through an extraordinary recording session in a Hamburg bunker.

Grammy-winning engineer Hans-Martin Buff standing at the bottom of the spiral staircase of the Heiligengeistfeld Bunker in Hamburg, Germany.

Q) Where did the reverb on “You’re With Me” come from? It sounds enormous.

A) Have you been to Hamburg? You know that big bunker at the Heiligengeistfeld? That’s the one. I went in there. This was during the tail end of the pandemic, when guerrilla moves were still possible, with not too many people around. There’s an SAE school in there that I was kind of connected to. So they have these four staircases that are like five storeys up, and then I just put mics all over that.

I always had the idea that I wanted one song which is very basic at its core, essentially just piano and vocal, and then see how much space I could add by going into a really big environment.

I wanted to make it my trial package for how immersive works: if you only have a MIDI piano and a vocal, like a traditional bedroom-producer setup, how do you make it a big immersive deal?

So that’s what it is: The generic Pro Tools piano, a vocal, and a couple of things added afterwards, but then just a little guerrilla action into a big space, which, to be fair, anybody could have done.

With this song, I also wanted to prove that everybody can do something like this: if you have a DAW, a set of active speakers, and you borrow or buy a handful of cheap microphones, you can go into a space and, using something like Lasse’s approach, you suddenly have a lot of options.

Q) What was the microphone setup in the bunker?

A) I think I ended up with fourteen signals. The mics went up the spiral staircase, and I pointed them more or less upwards and towards the walls, as far as possible from the central axis where the direct sound from the speakers was coming up. That way, I stayed true to the Nipkow idea of not capturing the direct sound, just the room. At the bottom of the staircase was a stereo pair of speakers.

In the mix, the higher you go up the staircase, the further back you can place those signals in the immersive field. Physical height in the building translates to perceived distance and elevation in the playback space.

Q) What draws you to environments like that bunker?

A) I like environments like that because you get these little enclosed areas where reverb builds in pockets, and then it comes back, and then they start coupling together. You get interesting decays from stuff like that.

It’s pretty interesting because it also had one of those moments that you can’t plan. At the end of the song, during the tail end of that long, last piano note, it turns out there was a draft that literally went up the stairwell. So, it’s just a perfect end. The wind, moving from one door to the other up the stairs, was something I wanted to capture as a movement in the mix, and it worked.

Q) Is there a stereo version of the album?

A) There’s no stereo loudspeaker version. The stereo versions are dedicated binaural mixes. It’s all immersive.

Cinematic Rooms Professional: Problem-Solving with Synthetic Reverb

Despite his preference for captured acoustic spaces, Hans-Martin readily acknowledges that synthetic reverb has an essential place in his workflow, particularly when solving specific mixing challenges. Cinematic Rooms Professional has become a key tool on major projects, including Peter Gabriel’s i/o (In-Side Mix) and the recent Scorpions Coming Home Live album.

Grammy-winning engineer Hans-Martin Buff in the Big Room at Real World Studios.

Q) When do you reach for synthetic reverb like Cinematic Rooms Professional versus your captured rooms?

A) In a traditional pop sense, “reality” is basically non-existent. You create various realities at once and then break them again. That has grown exponentially in immersive, because one of the most creative things you can do is set up worlds that don’t exist. You make hard cuts between them, go from a mono thing to something totally immersive, and those jumps in perception are part of the storytelling.

If I had to put it into categories, my first approach to reverb is less about recreating a single believable room and more about creating little scene sets for sounds. Maybe there’s a sound I want to keep away from all the other sounds so it can have its own little life. I’ll create a world around it that makes zero sense in terms of realistic acoustics but makes perfect sense musically.

Second, it’s about extending and finessing actual room recordings. I recorded 3D rooms for both piano and strings on Peter Gabriel’s i/o (In-Side Mix) album. On some of them that weren’t particularly fast, the room sound wasn’t quite as long as I wanted. So I added reverb on top, but with the wet side very low in a dry/wet blend of an insert, not as a traditional send. I’d shape just the tail very carefully so that if you listen straight through, you don’t necessarily notice it, but when I hit stop, you do. There’s suddenly this graceful extra decay.

The main thing is that the reverb and the real room shouldn’t feel like two separate entities. I don’t want one thing to end and then the next thing to start; I want them to form one new acoustic object. So I do a lot of EQ work and whatever processing is necessary to really blend the reverb tail with the original room sound.

I’ll tweak pre-delay quite a bit as well. Not to create a big slap or obvious distance, but to make sure that the combined decay feels natural. If you get that wrong, you end up with the room in one temporal place and the reverb in another, and then it pulls you out of the illusion.

Third, I use artificial reverb for quoting eras and genres. If I were to remix a Scorpions record from the early ’90s, I would absolutely seek out reverbs that sound like that time. I’d go for Lexicon-type sounds, because that’s what those records lived on. Reverb becomes your link back to that era.
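The second category above, extending a recorded room with a reverb inserted at a very low wet level so the added tail fuses with the real decay, can be sketched minimally. Everything below (the toy feedback-delay "tail", the mix value) is an illustrative assumption, not a setting from the actual sessions:

```python
# Minimal sketch of tail extension via an insert-style dry/wet blend.
# The feedback-delay "tail" below is a toy stand-in for a real reverb;
# all values are illustrative, not actual session settings.

def simple_tail(dry, delay_samples=4800, feedback=0.6, taps=6):
    """Crude feedback-delay tail, for illustration only."""
    out = [0.0] * (len(dry) + delay_samples * taps)
    gain = 1.0
    for tap in range(1, taps + 1):
        gain *= feedback
        offset = delay_samples * tap
        for i, sample in enumerate(dry):
            out[i + offset] += sample * gain
    return out

def insert_blend(dry, wet, mix=0.15):
    """Insert-style blend: a low `mix` keeps the synthetic tail under the real room."""
    n = max(len(dry), len(wet))
    dry_p = dry + [0.0] * (n - len(dry))
    wet_p = wet + [0.0] * (n - len(wet))
    return [d * (1.0 - mix) + w * mix for d, w in zip(dry_p, wet_p)]

# A single impulse stands in for the moment the real room sound stops:
# the blended signal keeps decaying after the dry part ends.
dry = [1.0] + [0.0] * 100
blended = insert_blend(dry, simple_tail(dry))
```

Because the blend sits on an insert rather than a send, any EQ placed after it shapes the dry room and the added tail together, which matches the goal of making them read as one acoustic object rather than two separate entities.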

The Scorpions Coming Home Live Album: Spatial Problem-Solving at Scale

The recently released Scorpions live album Coming Home Live presented a unique challenge: maintaining the sense of a massive stadium environment when the recorded audience needed to be faded down during performances. Cinematic Rooms Professional proved essential to solving this problem.

Q) Tell us about the audience capture setup for the Scorpions live album.

A) The main audience pickup was basically a version of Hyunkook Lee’s PCMA-3D array. I had five cardioid mics horizontally for the lower layer and four supercardioids pointing up for the height layer, all on a really tall stand in the middle of the stadium. There was one hut for front of house, one for video, and in the middle, there was the camera position – that’s where my array lived.

On top of that, I had nine teams out in the crowd, each with a 4-metre pole and stereo mics on top. All in all, I ended up with nine stereo audience tracks plus the central 3D array. In the central array, the horizontals were Sennheiser MKH 8040s, and the heights were MKH 8050s. That gave me a huge, detailed audience image to work with in immersive.

Q) What was the mixing challenge you faced with all that audience capture?

A) I recorded the audience very intensively, so at the start of songs and at the end, you get this enormous stadium roar. During the songs themselves, though, I fade the audience quite a bit, because otherwise it’s just a constant wash.

What I needed was a way to keep the sense of a huge live space when the crowd is pulled down. For that, Cinematic Rooms Professional was amazing. I could create a really big, believable reverb that felt like a stadium, so I could “yin-yang” between the two: full audience plus barely any reverb when the crowd is loud, then more reverb taking over the space when the audience is faded down.

It’s kind of ironic, because I’m talking about realism after saying I don’t care about realism. But in a live album context, that sense of reality is important.

So one of my prime use cases for Cinematic Rooms Professional is exactly that: replacing or extending real spaces in a way that still feels believable at large scale, especially when I need something that can flex between “subtle glue” and “massive stadium tail” without sounding cheesy.

Q) How did you balance the real audience with the synthetic reverb?

A) The audience mics all went through a bus, and then from that bus, I had a post-fader send into the sidechain that controlled the reverb. Whenever the audience was up, that signal would push the reverb return down. When the crowd got quieter, the reverb would come back up.

I still had to do a bit of automation on top, but that sidechain setup worked amazingly well. It meant that during the show sections where the crowd’s going nuts, you’re actually hearing the real audience more than the artificial space, and when the crowd calms down, you don’t suddenly drop into a dry, small-sounding band. The reverb grows to fill the hole.

Technical Workflow: Processing and Integration

Q) Do you have any particular approaches to processing reverb returns?

A) A big eye-opener for me was pre-EQing what goes into the reverb, not just EQing the return. I saw Michael Ilbert, a Swedish mixing engineer, do that a lot. He has worked with Max Martin, and I think he did Coldplay and Taylor Swift’s Red and 1989 stuff. He’ll put quite heavy high-cuts and low-cuts on the send. You’re not EQing a finished reverb; you’re shaping what parts of the signal are allowed to excite the reverb. It’s a very different feel.

If a reverb is too clean and pristine in the top end, it can suddenly become this hyper-important entity in the mix, which I usually don’t want. By pre-EQing the send, you can stop that high-end from over-stimulating the algorithm. It still triggers the reverb in a cool way, but you take away the bits that would make the reverb itself become the star of the show.
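The idea of shaping what is allowed to excite the reverb, rather than EQing the finished return, can be sketched with two toy one-pole filters placed on the send. These are crude illustrative stand-ins for the steep high-cuts and low-cuts described above, not a model of any particular plug-in:

```python
# Sketch of pre-EQing a reverb send: band-limit the signal before the reverb
# sees it, instead of EQing the reverb's return. The one-pole filters below
# are crude illustrative stand-ins for proper high-cut and low-cut filters.

def one_pole_lowpass(signal, coeff=0.3):
    """Acts as a high-cut: smooths fast (high-frequency) changes."""
    y, out = 0.0, []
    for s in signal:
        y += coeff * (s - y)
        out.append(y)
    return out

def dc_blocker(signal, coeff=0.95):
    """Acts as a low-cut: removes slow drift and sustained offsets."""
    prev_x, y, out = 0.0, 0.0, []
    for s in signal:
        y = coeff * y + s - prev_x
        prev_x = s
        out.append(y)
    return out

def pre_eq_send(signal):
    """What the reverb is allowed to hear: low-cut, then high-cut."""
    return one_pole_lowpass(dc_blocker(signal))

# A sustained (low-frequency-like) component is removed, and a fast
# alternation (high-frequency-like) is attenuated, before either can
# over-stimulate the reverb algorithm.
sustained = pre_eq_send([1.0] * 200)
alternating = pre_eq_send([1.0, -1.0] * 100)
```

In a real session the same move is simply an EQ inserted on the send path feeding the reverb, so the algorithm is excited by the midrange rather than by airy top end or low-end rumble.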

Q) How do you set reverb plugins up in your mixing session?

A) I’m very big on running reverbs through the same processing chains as their associated sources. In stereo, if I have a Cinematic Rooms Professional instance on the drum room and another one on the lead guitar, I’ll have two separate instances, but each one runs through its respective bus processing – compression, EQ, parallel stuff, whatever I’m doing to the drums or the guitars. That makes the space feel much more homogeneous and glued to the source.

In immersive, this used to be a real CPU headache, but it’s getting easier. My solution is the same concept: make subgroups, if the CPU allows it, and feed the reverbs through those busses so they get compressed and EQed alongside their dry counterparts. That way, instead of a reverb floating above or behind the mix, it feels like part of the instrument.

One of the big problems everyone has when they first get into immersive is figuring out how to make things feel fat and punchy. In stereo, that’s relatively easy, especially if you’re a bus-compression person like me. In immersive, it’s much harder, because everything lives in a different place, and your usual tricks don’t automatically translate.

Hans-Martin Buff facing the console at Abbey Road Studios.

Looking Forward

Q) What advice would you give to engineers interested in immersive production?

A) If you look at history, every new format goes through a similar phase. When stereo appeared in the ’60s, it took a while before people really used it creatively. I was born into a world where stereo was just how pop music sounded. Mono was never a conscious thing for me until much later, when I bought the Beatles box set and realised that the mono mixes were often what they actually intended. Going backwards can be very eye-opening.

I think immersive is still in that early phase. One of the biggest challenges right now is that you cannot release an immersive-only project as an artist. You can’t say, “This album exists only as Atmos, deal with it.” You’re forced to do a stereo version as well. So, if I want to be genuinely creative in immersive, I know I’ll also have to create a different version for stereo.

That’s not a creative choice; that’s a business decision. Treat immersive as a creative option, but don’t let it dictate your entire process. Focus on writing and performing strong songs. If you later have a chance to work with someone who really understands immersive, great. But right now, the infrastructure forces you into parallel versions, so you have to be very convinced it’s worth the extra effort.


Experience Tearjerkers

Hans-Martin Buff’s Tearjerkers is nominated for Best Immersive Audio Album at the 2026 Grammy Awards. The album features contributions from an extraordinary roster of musicians, including Mikael Nord Anderson (Roxette, The Rasmus), Sarah Brown (Pink Floyd, Simple Minds), Mike Scott (Prince, Justin Timberlake), and many others.

Recorded and mixed at studios such as Abbey Road Studios, msm-studios in Berlin, and Real World Studios, Tearjerkers represents both a creative statement and a technical achievement in immersive music production. As the liner notes declare: “Not a single thought was harmed by stereo during the making of this recording.”

The Scorpions’ live album Coming Home Live, featuring extensive use of Cinematic Rooms Professional for spatial continuity, is available in both stereo and immersive formats.

To spend some time inside Hans-Martin’s world of reverb, nothing beats a full immersive playback system or a good pair of headphones, but it’s also easy to listen on Apple Music with Spatial Audio. Listen out for the stairwell bunker reverb on Tearjerkers and the way synthetic tails subtly lengthen real room sounds without ever shouting for attention on Peter Gabriel’s i/o (In-Side Mix).

A massive thank you to Hans-Martin Buff for sharing his insights and techniques.

Links

Hans-Martin Buff: 

Website | Instagram | LinkedIn | Facebook

Tearjerkers: Available on Apple Music, Tidal, and other streaming platforms in Dolby Atmos: https://buff.lnk.to/tearjerkers

Scorpions Coming Home Live: Available now in stereo and immersive formats
https://www.the-scorpions.com/coming-home-live/

3D-MARCo Project (Hyunkook Lee): Research and resources
ECHO Project: https://apl-hud.com/echo/

VDT Department of Music and Word Production manuscripts and papers on room microphone techniques, including Lasse Nipkow’s wide-spaced 3D microphone setup: https://tonmeister.org/en/departments/music-and-word-production/#block-29228

Lasse Nipkow: http://www.silentwork.com/

Photography credits: York Tillyer, Mark Craig, Steve Kraitt, and Hans-Martin Buff