
Meta’s new audio tools promise more natural and localizable sounds in VR and AR

Meta’s new audio tools promise more natural and localizable sounds in VR and AR, from distant screams to whispers in your ear.
On February 7, Meta added immersive audio capabilities to its Presence Platform. The new “XR Audio SDK” aims to make it easier for developers to integrate spatial, localizable audio. It currently works only with the Unity engine, which is widely used in VR development; support for Unreal Engine, Wwise, and FMOD is planned.
The Presence Platform is a set of development tools and APIs that enable hand tracking, voice input, and augmented reality experiences on Quest 2 and Quest Pro. A recent addition from fall 2022 is the Motion SDK for face, eye, and body tracking. The platform itself first launched in fall 2021.
Applications for the new immersive audio features span virtual, augmented, and mixed reality. For the latter, Meta’s Quest 2 and Quest Pro overlay CGI on the video from their front-facing cameras. In addition to Meta’s own devices, the new audio SDK supports “virtually any standalone mobile VR device,” as well as PC VR (such as SteamVR) and third-party headsets.
New features include improved handling of the filtering effects of the head, outer ears, and torso, which strongly shape how real sound is perceived: the head-related transfer function (HRTF) is designed to reproduce these effects so that virtual sounds can be localized as accurately as real ones. Without it, sounds in a virtual environment come across as unnatural.
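As a rough illustration of one localization cue that an HRTF captures (this is not Meta’s implementation), the sketch below uses the classic Woodworth approximation to estimate the interaural time difference for a source at a given azimuth; the head radius and speed of sound are assumed values.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, assumed for air at ~20 °C
HEAD_RADIUS = 0.0875     # m, a commonly assumed average head radius

def interaural_time_difference(azimuth_deg: float) -> float:
    """Woodworth approximation of the ITD (in seconds) for a far-field source.

    azimuth_deg: angle from straight ahead toward one ear, 0..90 degrees.
    A full HRTF also models spectral filtering by the outer ear and torso,
    which this single cue ignores.
    """
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

if __name__ == "__main__":
    for az in (0, 30, 60, 90):
        print(f"azimuth {az:2d} deg: ITD ~ {interaural_time_difference(az) * 1e6:.0f} microseconds")
```

A source directly to one side yields an ITD of roughly 650 microseconds, one of the timing cues the brain uses to place a sound in space.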
Room acoustics modeling simulates sound reflections and reverberation, which depend on room size, shape, and surface materials. So if you start an AR game in an empty office, a virtual enemy’s voice might bounce off the walls just like your own. This acoustic consistency is intended to strengthen the sense of presence.
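To make the “empty office” example concrete, here is a minimal sketch (independent of Meta’s SDK) that estimates reverberation time with Sabine’s formula; the room dimensions and absorption coefficients for bare versus treated surfaces are assumed values.

```python
def rt60_sabine(volume_m3: float, surfaces: list[tuple[float, float]]) -> float:
    """Sabine's formula: RT60 = 0.161 * V / sum(S_i * alpha_i).

    surfaces: list of (area in m^2, absorption coefficient between 0 and 1).
    """
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / total_absorption

if __name__ == "__main__":
    # Assumed empty 5 m x 4 m x 3 m office with hard, bare surfaces (alpha ~ 0.05).
    floor, ceiling = 5 * 4, 5 * 4
    walls = 2 * (5 * 3) + 2 * (4 * 3)
    bare = [(floor, 0.05), (ceiling, 0.05), (walls, 0.05)]
    print(f"RT60, empty office: {rt60_sabine(5 * 4 * 3, bare):.2f} s")

    # Carpet and acoustic ceiling tiles absorb more, shortening the reverb tail.
    treated = [(floor, 0.30), (ceiling, 0.60), (walls, 0.05)]
    print(f"RT60, treated office: {rt60_sabine(5 * 4 * 3, treated):.2f} s")
```

The bare room rings for about two seconds, while the treated one decays in under half a second, which is the kind of difference a room acoustics model must reproduce for virtual sounds to match the real space.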
The spatial audio rendering and room acoustics features are based on the previous Oculus Spatializer and will continue to evolve. Meta said on its developer blog that the system is better suited to VR than the built-in audio systems of popular game engines, which are primarily designed for consoles and PCs.
The new Audio SDK is designed for flexibility and ease of use. According to Meta, even developers without audio experience should be able to mix spatial audio, which is essential for immersion.
As an example, Meta points to Horizon Workrooms, the virtual office and conferencing app for Quest 2 and Quest Pro shown at the Connect event in October 2022: in a scene at the conference table, hearing voices come from different directions is critical to a credible sense of sitting together.
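For illustration only (the SDK itself handles this with full HRTF rendering), the simplest way to place a voice to the listener’s left or right in a stereo mix is constant-power panning; the sketch below is an assumed, minimal example of that technique.

```python
import math

def constant_power_gains(pan: float) -> tuple[float, float]:
    """Return (left, right) gains for pan in [-1.0, 1.0]; -1 = hard left, +1 = hard right.

    Constant-power panning keeps perceived loudness roughly steady as a voice
    moves across the stereo field; HRTF-based spatialization goes much further,
    adding timing and spectral cues so voices can sit anywhere around the listener.
    """
    angle = (pan + 1.0) * math.pi / 4.0   # map [-1, 1] to [0, pi/2]
    return math.cos(angle), math.sin(angle)

if __name__ == "__main__":
    for pan in (-1.0, -0.5, 0.0, 0.5, 1.0):
        left, right = constant_power_gains(pan)
        print(f"pan {pan:+.1f}: L={left:.2f}  R={right:.2f}")
```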
The previous Oculus Spatializer will continue to be supported for Unreal Engine, FMOD, and Wwise, and for developers who prefer their own native API integration. Meta does not currently recommend migrating in these cases.
For new Unity projects, Meta recommends the new “XR Audio SDK” for better long-term app maintenance and access to experimental features.
Instructions and examples can be found in the official “XR Audio SDK” documentation and in the Meta Developer Center.

