
I need to play an arbitrary tone that is generated based on the sample rate of the output device. The user is required to connect wired or wireless headphones in order to listen to the audio.

Since modern headphones can have a native sample rate of either 44100 or 48000 Hz, I need to somehow reinitialize my AVAudioPlayerNode with a different AVAudioFormat. Without doing anything, the audio starts to sound distorted. However, I'm facing various deadlocks when trying to attach, detach, connect or disconnectNodeOutput. Sometimes it locks up the first time I try to switch headphones (I have two headphones connected at the same time: a generic headset with a sample rate of 44100 Hz and AirPods with a sample rate of 48000 Hz), and occasionally it freezes on the second try.

Here is my code:

private let engine = AVAudioEngine()

// Custom node that generates the tone
private let audioUnit = MyAudioUnit()

// Current output format; rebuilt whenever the route / sample rate changes
private var format: AVAudioFormat!

init() {
    let audioSession = AVAudioSession.sharedInstance()
    let sampleRate = audioSession.sampleRate
    format = AVAudioFormat(
        commonFormat: .pcmFormatFloat32,
        sampleRate: sampleRate,
        channels: AVAudioChannelCount(audioSession.outputNumberOfChannels),
        interleaved: false
    )!

    engine.attach(audioUnit)
    engine.connect(
        audioUnit,
        to: engine.mainMixerNode,
        format: format
    )

    NotificationCenter.default.addObserver(
        self,
        selector: #selector(self.handleInterruption),
        name: Notification.Name.AVAudioEngineConfigurationChange,
        object: engine
    )
}

@objc
private func handleInterruption(_ notification: Notification) {
    DispatchQueue.main.async {
        let audioSession = AVAudioSession.sharedInstance()
        let sampleRate = audioSession.sampleRate
        self.format = AVAudioFormat(
            commonFormat: .pcmFormatFloat32,
            sampleRate: sampleRate,
            channels: AVAudioChannelCount(audioSession.outputNumberOfChannels),
            interleaved: false
        )!

        self.engine.detach(self.audioUnit)
        self.engine.attach(self.audioUnit)
        self.engine.connect(
            self.audioUnit,
            to: self.engine.mainMixerNode,
            format: self.format
        )
    }
}

// method called on main thread only
// BTW, using audioUnit.stop() instead of pause() would also introduce a deadlock
func setActive(_ active: Bool) {
    ...

    if active {
        try! engine.start()
        audioUnit.play()
    } else {
        audioUnit.pause()
        engine.stop()
        engine.reset()
    }
}

I've tried numerous variants of "reconnecting" the AVAudioEngine, but they all ended up in the following deadlock:

(screenshots: thread stack traces showing the deadlock)

I've also tried leaving it on a background thread, but it doesn't matter. As soon as the app tries to use the AVAudioEngine again, everything gets stuck.

So, is there a proper way to reconnect the AVAudioPlayerNode in order to update the sample rate? Or maybe it is possible not to depend on the headphones' native sample rate and make the audio sound normal at a fixed sample rate? Thanks in advance!

Roman Samoilenko
  • Have similar issues. The app hangs when I disconnect or connect instrument nodes. My current semi-workaround (not working all the time, but it seems to reduce the percentage of hangs) is to call pause and then reset on the audio engine before making any manipulations with it, and then call start. – Dmitry Klochkov Dec 04 '19 at 14:25
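
A minimal sketch of the pause/reset/restart workaround described in the comment above (illustrative only, not a guaranteed fix; engine and audioUnit refer to the properties from the question, and reconfigure is a hypothetical helper name):

// Pause and reset before touching the graph, then restart.
func reconfigure(to newFormat: AVAudioFormat) {
    engine.pause()      // stop pulling audio before changing connections
    engine.reset()      // drop any in-flight render state

    engine.disconnectNodeOutput(audioUnit)
    engine.connect(audioUnit, to: engine.mainMixerNode, format: newFormat)

    do {
        try engine.start()
    } catch {
        print("Failed to restart engine: \(error)")
    }
}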

1 Answer


With your current setup, you shouldn't need to do anything when the audio route and sample rate change.

Your current setup is as follows:

+------+    +-----+    +------+
|player| -> |mixer| -> |output|
+------+    +-----+    +------+

One of the benefits of having a mixer in between your player and output nodes is that the mixer will do sample rate conversion for you. So your player can output at 44100 Hz, even though the speaker's sample rate is 48000 Hz.
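
For instance, connecting the player with a fixed format and letting the mixer handle the conversion could look roughly like this (a sketch reusing the engine and player names from the snippets above; the 44100 Hz stereo format is just an illustrative choice):

// The player always renders at 44100 Hz; the main mixer resamples to the
// current hardware rate, so no reconnection is needed when the route changes.
let playerFormat = AVAudioFormat(
    commonFormat: .pcmFormatFloat32,
    sampleRate: 44100,
    channels: 2,
    interleaved: false
)!

engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: playerFormat)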

Another thing to note is that when connecting your player to your mixer, the format parameter should be the format that the player node is outputting. The format can also be inferred by passing nil, as in:

engine.connect(player, to: engine.mainMixerNode, format: nil)
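
Also worth noting: when AVAudioEngineConfigurationChange fires, the engine has already stopped itself, so a minimal handler under this setup can simply restart it instead of detaching and reattaching nodes (a sketch; handleConfigurationChange is a hypothetical name for the observer selector):

@objc private func handleConfigurationChange(_ notification: Notification) {
    // The engine stops when the output hardware changes; with the mixer
    // doing the rate conversion, restarting it is enough.
    if !engine.isRunning {
        do {
            try engine.start()
        } catch {
            print("Failed to restart engine after route change: \(error)")
        }
    }
}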
Ryan Maloney
  • That's a nice suggestion, and I also tried it during my testing. Unfortunately, the system generates an audio format with 44100 Hz for the AirPods, which are 48000 Hz, so the audio is distorted. I suspected this is because there are two headsets connected to the iPhone at the same time, so the system messes up the sample-rate matching, but I removed the second headset and it still behaved the same. – Roman Samoilenko Nov 15 '19 at 06:46