I'm working on an app that creates an immersive audio experience using mono point sources (AVAudio3DMixingSourceMode.pointSource) connected to an AVAudioEnvironmentNode.
I use AVAudio3DMixingRenderingAlgorithm.HRTFHQ for the sources and a kAudioChannelLayoutTag_Stereo channel layout for the environment output to generate a head-tracked, binaural audio stream on compatible headphones (Apple AirPods Pro/Max). For head tracking, I use the CMHeadphoneMotionManager.startDeviceMotionUpdates() callback.
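For reference, the graph setup looks roughly like this (a simplified sketch; the sample rate, source position, and the orientation sign conventions are placeholders, not the exact values from my app):

```swift
import AVFoundation
import CoreMotion

let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(player)

// Mono point source rendered with the high-quality HRTF algorithm.
player.sourceMode = .pointSource
player.renderingAlgorithm = .HRTFHQ
player.position = AVAudio3DPoint(x: 0, y: 0, z: -2)  // example position

// Sources are connected to the environment node as mono.
let mono = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)
engine.connect(player, to: environment, format: mono)

// Stereo channel layout on the environment output.
let stereoLayout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Stereo)!
let stereo = AVAudioFormat(standardFormatWithSampleRate: 48_000,
                           channelLayout: stereoLayout)
engine.connect(environment, to: engine.mainMixerNode, format: stereo)

// Head tracking: map the headphone attitude onto the listener orientation.
// (Sign conventions may need adjusting for your coordinate setup.)
let motion = CMHeadphoneMotionManager()
motion.startDeviceMotionUpdates(to: .main) { deviceMotion, _ in
    guard let attitude = deviceMotion?.attitude else { return }
    environment.listenerAngularOrientation = AVAudio3DAngularOrientation(
        yaw: Float(-attitude.yaw * 180 / .pi),
        pitch: Float(attitude.pitch * 180 / .pi),
        roll: Float(attitude.roll * 180 / .pi))
}

// …schedule buffers on `player` and start the engine as usual.
```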
It works pretty well and generates a convincing 360° immersion.
However, if I register the app with the MPRemoteCommandCenter, iOS automatically offers the user the option to spatialize my stereo output in Control Center, which leads to undesired double spatialization.
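The registration itself is nothing special; a minimal version of what triggers the toggle looks like this (assuming a simple play handler):

```swift
import MediaPlayer

// Adding any remote command handler makes the app a "Now Playing" app;
// at that point iOS surfaces the Spatial Audio toggle in Control Center.
let commandCenter = MPRemoteCommandCenter.shared()
_ = commandCenter.playCommand.addTarget { _ in
    // …resume playback here…
    return .success
}
```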
Is there a way to disallow spatialization for my app?
Based on the comment in the API documentation for AVPlayerItem, I changed the output format to kAudioChannelLayoutTag_Binaural in the hope that this would disable spatialization, but the output still shows up as Stereo in the spatialization UI.
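Concretely, the attempted change amounts to this (a sketch reusing the engine and environment node from the setup above; the function name and sample rate are mine):

```swift
import AVFoundation

// Reconnect the environment output with a Binaural layout instead of Stereo.
func connectBinauralOutput(engine: AVAudioEngine,
                           environment: AVAudioEnvironmentNode) {
    let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Binaural)!
    let format = AVAudioFormat(standardFormatWithSampleRate: 48_000,
                               channelLayout: layout)
    engine.connect(environment, to: engine.mainMixerNode, format: format)
}
```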