Today I’m starting a series where I write a bunch about Ambisonics, in the hope it helps someone else and saves a lot of the experimentation/Google searches/crying that I did! I’ll look at what ambisonics are and how spatial sound works. Then we’ll dive into some of the fun things that can be done with Ambisonics in Unity, and the plugin pitfalls to avoid. This post will look at the magic of ambisonics, and what they can bring to an interactive game or musical experience.
To make sure I don’t go too fast or waffle too much later on, I’ll start from the beginning and cover what ambisonics do. There’s also going to be a little (and I do mean a little) bit of the science before we head on to the next steps of the journey.
Ambisonics are a wonderful way to bring a sense of space into sound. If you’ve watched a film in surround sound and turned as you heard a car behind you, you’ve enjoyed the magic of spatial audio. However, what’s possible with 3D sound goes much further than throwing sounds at different speakers. For example, ambisonics on headphones can give you a sense of height, closeness, and a better sense of the room or space. This all adds up to communicating sound in a way that makes the experience vibrant and immersive. I’m going to leave a link to the famous Virtual Barber Shop example – with headphones it is quite an experience. I’ll never forget the childish wonder I felt years ago when I first listened to it.
In gaming the benefits are even more exciting, as developers can use sound as another way to communicate with the player in unique ways. Imagine footsteps behind you making you run away from the monsters, or a noise above you enticing you to climb a ladder… these things are very hard to do with simple left/right stereo.
The issue then, of course, is… what if the player moves? In the barbershop the “player” sits still and has no control over their position. How do we make the sound track the player’s constantly changing position and orientation? This is where game engine plugins come into play: they handle the sounds automatically, whilst providing a full 360 degrees of audio.
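The core trick those plugins pull off turns out to be surprisingly compact. As a hedged illustration (not any particular plugin’s code), here’s a Python sketch of how a first-order ambisonic frame – four channels conventionally called W, X, Y and Z – can be counter-rotated by the listener’s yaw so that sources stay put in the world while the head turns. Sign conventions differ between toolkits, so treat the direction of rotation here as an assumption:

```python
import math

def rotate_soundfield_yaw(w, x, y, z, head_yaw_deg):
    """Rotate a first-order ambisonic frame (W, X, Y, Z) opposite to the
    listener's yaw, so sound sources stay fixed in the world as the head
    turns. W (omnidirectional) and Z (up/down) are unaffected by yaw."""
    a = math.radians(-head_yaw_deg)  # counter-rotate the field
    x_rot = x * math.cos(a) - y * math.sin(a)
    y_rot = x * math.sin(a) + y * math.cos(a)
    return w, x_rot, y_rot, z

# A source straight ahead (all horizontal energy in X); the listener turns 90°:
w, x, y, z = rotate_soundfield_yaw(0.707, 1.0, 0.0, 0.0, 90.0)
# The energy moves entirely from X into Y – the source is now fully to one side.
```

A real engine does this with a full 3D rotation matrix every frame, but the idea is the same: rotate the whole soundfield rather than each sound individually.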
How does this all work?
Our hearing is exceptionally clever. Imagine your ears as two microphones – we can only hear from these two places. With this in mind, the fact that I can hear my dog barking and, from just those two points, work out not only that it’s more to the left, but also that it’s behind me, below me, and a bit muffled because I’ve got the door closed, is pretty incredible. We haven’t got eyes in the back of our heads, but this is a superpower I think we take for granted.
There’s a lot of magic at play in how our head does all of this, and I’ll add some links to people who explain it in more detail. The crux of it is that your brain takes the differences between what the two ears hear – mainly tiny differences in arrival time and loudness, plus the filtering of the outer ear – and processes them so we are aware of these positions.
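To put a rough number on one of those differences, the arrival-time gap between the ears can be approximated with the classic Woodworth spherical-head formula. The head radius and the formula itself are textbook approximations, not measurements from any real head:

```python
import math

def interaural_time_difference(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Approximate the interaural time difference in seconds for a sound at
    the given azimuth (0 = straight ahead, 90 = directly to one side), using
    the Woodworth spherical-head model: ITD = (r / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + math.sin(theta))

# A sound directly to one side arrives roughly two thirds of a millisecond
# earlier at the near ear – tiny, but plenty for the brain to work with.
print(f"{interaural_time_difference(90.0) * 1000:.2f} ms")
```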
To get this magic away from simply balancing between a bunch of speakers and into true 360-degree sound that can be carried on any speaker setup (including headphones), we recreate these effects to trick the brain into experiencing those sounds as it would if they were actually in the room. This involves doing the filtering in software, so the brain thinks it has been clever and decoded the sound positions, but is in fact being informed by the sounds themselves. In real time these effects are even better, because moving sound sources – as we usually have in real life – make the brain really zero in on things.
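For the curious, the encoding side of this is genuinely small. The following is a minimal sketch of the standard first-order ambisonic equations, assuming the traditional B-format convention (an omnidirectional W channel scaled by 1/√2); real plugins add higher orders, distance cues and HRTF decoding on top:

```python
import math

def encode_first_order(sample, azimuth_deg, elevation_deg):
    """Encode one mono sample into first-order B-format (W, X, Y, Z).
    W is the omnidirectional part; X, Y and Z carry the front/back,
    left/right and up/down directional components respectively."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample * (1.0 / math.sqrt(2.0))
    x = sample * math.cos(az) * math.cos(el)
    y = sample * math.sin(az) * math.cos(el)
    z = sample * math.sin(el)
    return w, x, y, z

# A sample from straight ahead puts all directional energy into X:
print(encode_first_order(1.0, 0.0, 0.0))
```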
Here’s why games are amazing
This stuff all sounds like an awful lot of work to do for a sound… but luckily we have a calculator at our disposal that can process a bunch of these sounds in real time, every single frame. That calculator is our player’s computer or phone; it just needs a bit of magic to get there. This is where some amazing software comes into play. I’ll give a quick rundown of a few examples, and hopefully save someone the pain of setting a bunch of things up only to find something else would have been a better fit. I’m a Unity developer for most things, so I’ll focus mostly on that, although most of this will most likely carry over to Unreal or anything else. I’ll leave some links below to some middleware programs that work on both, and some info on middleware in case it’s of any help.
Ambisonics in Unity
For my master’s project I used Unity linked to FMOD and Google’s Resonance Audio to do my spatialisation… and it was a nightmare in some respects, but wonderful in others. The good news was that it worked: the sounds did pop out from where they were meant to, and FMOD’s exceptionally brilliant library, scene design and effects all really brought things to life.
However, this came with a big problem. Resonance only worked by grabbing all the audio and throwing it at the master channel, with no chance to actually adjust the volumes of these sounds in FMOD’s lovely built-in mixer. And it wasn’t just me – I found a few people discussing how such a big oversight rendered Resonance unusable for any serious work beyond tech demos.
My choice became: do I restart with something else, or jury-rig a solution together? As I was so far into the project and deadlines were looming, I soldiered on with my own fix, which was to use Unity to link how far away a sound was to the volume of that sound. It had mixed results, but on the whole it worked. If I’d had the time I’d have sacrificed FMOD in favour of Resonance within Unity (which can mix sound volumes), or even just Unity’s own built-in spatialiser, but building a sound library from scratch in a few weeks wouldn’t have left enough time to actually work on the game project.
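For anyone facing the same wall, the workaround itself is tiny. This is a hedged sketch of the idea – a distance-to-volume mapping evaluated every frame – rather than my actual project code, and the parameter names here are illustrative, not taken from any particular engine:

```python
def distance_to_volume(distance, min_distance=1.0, max_distance=50.0):
    """Map a source's distance from the listener to a 0..1 volume:
    full volume inside min_distance, linear falloff out to max_distance,
    silence beyond that. Call this each frame and feed the result to
    whatever sets the sound's volume."""
    if distance <= min_distance:
        return 1.0
    if distance >= max_distance:
        return 0.0
    return 1.0 - (distance - min_distance) / (max_distance - min_distance)
```

A linear curve is the simplest choice; an inverse-distance curve sounds more natural for many sources, which is part of why the results were mixed.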
If I could offer advice, it would be to avoid using Resonance inside FMOD – it works just fine directly in Unity, where it can mix sound. Unless Google release an update that addresses the mixing issue, I would say another plugin (possibly including the built-in one) would be a much better choice.
https://knowingneurons.com/2013/03/15/how-does-the-brain-locate-sound-sources/ – Knowing Neurons have an amazing explanation of how the brain works out where sounds are coming from.
https://www.waves.com/ambisonics-explained-guide-for-sound-engineers – Waves (maker of audio tools) have a guide on the details of Ambisonics for sound production.