
I am seeing different behaviour between iOS 9/10 and iOS 11 for buffers that are future-scheduled on an AVAudioPlayerNode when an audio route change occurs (e.g. you plug in headphones). Has anyone experienced anything similar, and how did you resolve it? Note that I reported this issue on Apple's AVFoundation support forum almost two weeks ago and have received exactly zero response.

The code that exhibits this problem is shown below; a brief explanation first. The code is a simple loop that repeatedly schedules a buffer to play at a time in the future. The process is kicked off by invoking the 'runSequence' method, which schedules an audio buffer to play at a future time and sets the completion callback to the nested closure 'audioCompleteHandler'. The completion callback calls 'runSequence' again, which schedules another buffer and keeps the process going forever. The upshot is that there is always a buffer scheduled except while the completion handler is executing. The 'trace' calls in various places go to an internal debug-only print method and can be ignored.

In the audio route change notification handler (handleAudioRouteChange), when a new device becomes available (case .newDeviceAvailable), the code restarts the engine and player, reactivates the audio session and calls 'runSequence' to kick the loop back into life.
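
For context, handleAudioRouteChange is assumed to be registered for AVAudioSession's route change notification in the usual way. The registration is not part of the excerpts below, but it would look something like this:

// Register for route change notifications (typically in init or viewDidLoad)
NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleAudioRouteChange(_:)),
                                       name: .AVAudioSessionRouteChange,
                                       object: nil)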

This all works fine on iOS 9.3.5 (iPhone 5C) and iOS 10.3.3 (iPhone 6) but fails on iOS 11.1.1 (iPad Air). The nature of the failure is that the AVAudioPlayerNode does not play the audio but immediately calls the completion handler instead. This causes a runaway situation. If I remove the line that kicks off the loop again (as indicated in the code), it works fine on iOS 11.1.1 but fails on iOS 9.3.5 and iOS 10.3.3. This failure is different: the audio just stops and, in the debugger, I can see that the loop is not looping.

So, a possible explanation is that, under iOS 9.x and iOS 10.x, future-scheduled buffers are unscheduled when an audio route change occurs, while under iOS 11.x they are not.

This leads to two questions:

  1. Has anyone seen similar behaviour to this, and what was the resolution?
  2. Can anyone point me to documentation that describes the exact state of the engine, player(s) and session when an audio route change (or an audio interruption) occurs?

private func runSequence() {

    // For test only
    var timeBaseInfo = mach_timebase_info_data_t()
    mach_timebase_info(&timeBaseInfo)
    // End for test only

    let audioCompleteHandler = { [unowned self] in
        DispatchQueue.main.async {
            trace(level: .skim, items: "Player: \(self.player1.isPlaying), Engine: \(self.engine.isRunning)")
            self.player1.stop()
            switch self.runStatus {
            case .Run:
                self.runSequence()
            case .Restart:
                self.runStatus = .Run
                self.tickSeq.resetSequence()
                //self.updateRenderHostTime()
                self.runSequence()
            case .Halt:
                self.stopEngine()
                self.player1.stop()
                self.activateAudioSession(activate: false)
            }
        }
    }

    // Schedule buffer...
    if self.engine.isRunning {
        if let thisElem: (buffer: AVAudioPCMBuffer, duration: Int) = tickSeq.next() {
            self.player1.scheduleBuffer(thisElem.buffer, at: nil, options: [], completionHandler: audioCompleteHandler)
            self.player1.prepare(withFrameCount: thisElem.buffer.frameLength)
            self.player1.play(at: AVAudioTime(hostTime: self.startHostTime))
            self.startHostTime += AVAudioTime.hostTime(forSeconds: TimeInterval(Double(60.0 / Double(self.model.bpm.value)) * Double(thisElem.duration)))
            trace(level: .skim, items:
                "Samples: \(thisElem.buffer.frameLength)",
                "Time: \(mach_absolute_time() * (UInt64(timeBaseInfo.numer) / UInt64(timeBaseInfo.denom))) ",
                "Sample Time: \(player1.lastRenderTime!.hostTime)",
                "Play At: \(self.startHostTime) ",
                "Player: \(self.player1.isPlaying)",
                "Engine: \(self.engine.isRunning)")
        }
        else {
            // Sequence exhausted; nothing left to schedule
        }
    }
}
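
For completeness, the startEngine, stopEngine and activateAudioSession helpers used here and in the route change handler are thin wrappers around the corresponding AVFoundation calls. Roughly (a simplified sketch; the real implementations also trace):

private func startEngine() {
    // Start unconditionally; starting an already-running engine is harmless
    do {
        try engine.start()
    } catch {
        trace(level: .skim, items: "Engine start failed: \(error)")
    }
}

private func stopEngine() {
    engine.stop()
}

private func activateAudioSession(activate: Bool) {
    do {
        try audioSession.setActive(activate)
    } catch {
        trace(level: .skim, items: "Session (de)activation failed: \(error)")
    }
}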


@objc func handleAudioRouteChange(_ notification: Notification) {

    trace(level: .skim, items: "Route change: Player: \(self.player1.isPlaying) Engine: \(self.engine.isRunning)")
    guard let userInfo = notification.userInfo,
        let reasonValue = userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt,
        let reason = AVAudioSessionRouteChangeReason(rawValue:reasonValue) else { return }

    trace(level: .skim, items: audioSession.currentRoute, audioSession.mode)
    trace(level: .none, items: "Reason Value: \(String(describing: userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt)); Reason: \(String(describing: AVAudioSessionRouteChangeReason(rawValue:reasonValue)))")

    switch reason {
    case .newDeviceAvailable:
        trace(level: .skim, items: "In handleAudioRouteChange.newDeviceAvailable")
        for output in audioSession.currentRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
            startEngine()
            player1.play()
            activateAudioSession(activate: true)
            //updateRenderHostTime()
            runSequence() // <<--- Problem: works for iOS9,10; fails on iOS11. Remove it and iOS9,10 fail, works on iOS11
        }
    case .oldDeviceUnavailable:
        trace(level: .skim, items: "In handleAudioRouteChange.oldDeviceUnavailable")
        if let previousRoute =
            userInfo[AVAudioSessionRouteChangePreviousRouteKey] as? AVAudioSessionRouteDescription {
            for output in previousRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
                player1.stop()
                stopEngine()
                tickSeq.resetSequence()
                DispatchQueue.main.async {
                    if let pp = self.playPause as UIButton? { pp.isSelected = false }
                }
            }
        }
    default:
        break
    }
}

1 Answer

So, problem solved with some further digging/testing:

  • There is a difference in behaviour between iOS 9/10 and iOS 11 when AVAudioSession issues a route change notification. In the notification handler, the engine is reported as not running (engine.isRunning == false) about 90% of the time on iOS 9/10, whereas on iOS 11 it is always reported as running (engine.isRunning == true)
  • For the roughly 10% of the time that iOS 9/10 reports the engine as running (engine.isRunning == true), this is in fact not the case. The engine is NOT running, regardless of what engine.isRunning says
  • Because the engine has been stopped on iOS 9/10, previously prepared audio has been released, and playback will not resume just by restarting the engine; you have to reschedule the file or buffer at the sample point where the engine was stopped. Sadly, you can't read the current sample time while the engine is stopped (the player returns nil), so you have to:

    • Start the engine
    • Grab the sample time and accumulate it (+=) in a persistent property
    • Stop the player
    • Reschedule the audio (and prepare it) starting at the sample time just grabbed
    • Start the player
  • The engine state on iOS 9/10 is the same for both the headphones-plugged-in case (.newDeviceAvailable) and the headphones-removed case (.oldDeviceUnavailable), so you need to do much the same as the above for the removed case as well. Accumulating the sample time is necessary so that you can restart the audio from the point where it was stopped, since player.stop() resets the player's sample time to 0

  • None of this is needed on iOS 11, but the code below works on iOS 9/10 and iOS 11, so it's probably best to handle all versions the same way

The code below works on my test devices running iOS 9.3.5 (iPhone 5C), iOS 10.3.3 (iPhone 6) and iOS 11.1.1 (iPad Air). (I am still bothered that I can find no prior commentary on how to correctly handle a route change; there must be hundreds of folks out there who have hit this issue. Normally, when I can't find any prior discussion of a topic, I assume that I am doing something wrong or just don't get it... Oh well...)

@objc func handleAudioRouteChange(_ notification: Notification) {

    guard let userInfo = notification.userInfo,
        let reasonValue = userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt,
        let reason = AVAudioSessionRouteChangeReason(rawValue:reasonValue) else { return }

    switch reason {
    case .newDeviceAvailable:

        for output in audioSession.currentRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
            headphonesConnected = true
        }

        startEngine()   // Do this regardless of whether engine.isRunning == true

        if let lrt = player.lastRenderTime, let st = player.playerTime(forNodeTime: lrt)?.sampleTime {
            playSampleOffset += st  // Accumulate so that multiple inserts/removals move the play point forward
            stopPlayer()
            scheduleSegment(file: playFile, at: nil, player: player, start: playSampleOffset, length: AVAudioFrameCount(playFile.length - playSampleOffset))
            startPlayer()
        }
        else {
            // Unknown problem with getting sampleTime; reset engine, player(s), restart as appropriate
        }

    case .oldDeviceUnavailable:
        if let previousRoute =
            userInfo[AVAudioSessionRouteChangePreviousRouteKey] as? AVAudioSessionRouteDescription {
            for output in previousRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
                headphonesConnected = false
            }
        }

        startEngine()   // Do this regardless of whether engine.isRunning == true

        if let lrt = player.lastRenderTime, let st = player.playerTime(forNodeTime: lrt)?.sampleTime  {
            playSampleOffset += st  // Accumulate...
            stopPlayer()
            scheduleSegment(file: playFile, at: nil, player: player, start: playSampleOffset, length: AVAudioFrameCount(playFile.length - playSampleOffset))
            startPlayer()   // Test only, in reality don't restart here; set play control to allow user to start audio
        }
        else {
            // Unknown problem with getting sampleTime; reset engine, player(s), restart as appropriate
        }

...
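
For reference, stopPlayer, startPlayer and scheduleSegment above are my own helpers, not AVFoundation API. Roughly, assuming playFile is an AVAudioFile and playSampleOffset is an AVAudioFramePosition property:

private func stopPlayer() {
    player.stop()   // Note: resets the player's sample time to 0
}

private func startPlayer() {
    player.play()
}

private func scheduleSegment(file: AVAudioFile, at when: AVAudioTime?, player: AVAudioPlayerNode,
                             start: AVAudioFramePosition, length: AVAudioFrameCount) {
    // Schedule a slice of the file beginning at the accumulated sample offset
    player.scheduleSegment(file, startingFrame: start, frameCount: length,
                           at: when, completionHandler: nil)
    player.prepare(withFrameCount: length)
}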
