Virtual Reality for your Ears

Realistic sound reproduction: Let's assume you have decent headphones on and your eyes closed, and you are listening to a good recording of a concert in which the musicians appear to be right in front of you. You hear the piano to the left, the bass to the right, and the singer center stage, somewhat closer to you. Imagine the immersive experience: you and the musicians right there, present in the moment. And then you turn your head ...

Instead of staying where they are, the whole soundscape, musicians included, moves with your head, into positions defined by your head's movement. The immersive experience is destroyed.

How can we fix this? How do we solve VR for audio?

In 2010 the iPhone 4 came out, the first smartphone with a gyroscope alongside the accelerometer, both accessible to developers via an SDK. If those sensors sat in your headphones, we could make this work, we thought. We immediately started working on a prototype.

The physics works as follows: Humans localize sounds primarily through two physical mechanisms. First, the two ears are at different distances from the sound source. Since sound propagates at a finite speed, the wave arrives at the left and right ear with a small time delay between them. Second, an ear reflects sound differently depending on the direction of the incoming sound wave and its frequency. You can think of an ear as a directional equalizer, amplifying some frequencies and suppressing others, depending on the relative angle between the ear and the sound source. This directional equalizer is referred to as the head-related transfer function (HRTF). Since outer ears vary in size and shape from person to person, HRTFs are individual.
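
To put a number on the first mechanism: a common back-of-the-envelope model (Woodworth's spherical-head approximation, chosen here for illustration, not something from the original prototype) gives the interaural time difference for a source at azimuth θ as r/c · (θ + sin θ). A minimal sketch in Python:

    import math

    def itd_woodworth(azimuth_rad: float, head_radius_m: float = 0.0875,
                      speed_of_sound: float = 343.0) -> float:
        """Interaural time difference (seconds), spherical-head model.

        Woodworth's approximation: ITD = r/c * (theta + sin(theta)),
        with theta the source azimuth in [0, pi/2]; 0 = straight ahead.
        """
        theta = abs(azimuth_rad)
        return head_radius_m / speed_of_sound * (theta + math.sin(theta))

    # A source fully to one side arrives about 0.66 ms earlier at the near ear.
    print(f"{itd_woodworth(math.pi / 2) * 1e3:.2f} ms")

Two thirds of a millisecond doesn't sound like much, but at low frequencies it is one of the strongest cues the brain has for left-right localization.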

Our ears use other cues for sound localization as well, but their effect is secondary.

After a month or so we had a prototype running: Mount the smartphone to your head, together with a pair of decent headphones. Using the motion sensor data, calculate the orientation of your head relative to the virtual sound sources. We downloaded the IRCAM database, a set of openly available HRTF measurements of 50 individuals. A little signal processing magic, and we got the virtual sounds working in real time.
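
For the curious, here is a minimal sketch of what the rendering step boils down to (the function and dictionary format are hypothetical stand-ins; the IRCAM measurements come as impulse-response pairs per direction): subtract the current head yaw from the source's world azimuth, pick the nearest measured HRTF pair, and filter the mono signal with it.

    import numpy as np
    from scipy.signal import fftconvolve

    def render_binaural(mono: np.ndarray, source_azimuth_deg: float,
                        head_yaw_deg: float, hrtf_db: dict) -> np.ndarray:
        """Render a mono signal at a world-fixed azimuth for a turned head.

        hrtf_db maps a measured azimuth (degrees) to a pair of impulse
        responses (h_left, h_right), e.g. loaded from the IRCAM set.
        """
        # Head tracking in one line: what matters is the source direction
        # relative to the head, so subtract the current head yaw.
        relative = (source_azimuth_deg - head_yaw_deg) % 360.0

        # Pick the nearest measured direction (the databases sample a grid).
        def circ_dist(az: float) -> float:
            d = abs(az - relative) % 360.0
            return min(d, 360.0 - d)

        nearest = min(hrtf_db, key=circ_dist)
        h_left, h_right = hrtf_db[nearest]

        # Convolving with the impulse-response pair applies both cues at
        # once: the interaural time delay and the directional equalization.
        left = fftconvolve(mono, h_left)[: len(mono)]
        right = fftconvolve(mono, h_right)[: len(mono)]
        return np.stack([left, right])

A real-time version does this block-wise and crossfades between neighbouring HRTFs so the sound doesn't click when you turn your head, but the principle is exactly this.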

We'd let a virtual sound source move in space, put the headphones on, and guess the direction. Now it's here. Oh, now it moved up there. It worked. It was magic.

It was fun, but then we just stopped working on it, for various reasons. And now, ten years later, Apple has the sensors in its AirPods Pro, and iOS 14 brought support for real 3D audio. Amazon Music is starting to stream pieces in 3D audio. Games are supporting it. AR with realistic video, and now audio, has arrived. What a world we live in!