I am seeing different behaviour between iOS 9/10 and iOS 11 for buffers that are future scheduled on an AVAudioPlayerNode when an audio route change occurs (e.g. you plug in headphones). Has anyone experienced anything similar and how did you resolve it? Note that I reported this issue up on Apple's AVFoundation support forum almost two weeks ago and have received exactly zero response.
The code that exhibits this problem is shown below; a brief explanation first. The code is a simple loop that repeatedly schedules a buffer to play at a time in the future. The process is kicked off by invoking the 'runSequence' method, which schedules an audio buffer to play at a future time and sets the completion callback to the nested closure 'audioCompleteHandler'. The completion callback calls 'runSequence' again, which schedules another buffer and keeps the process going forever. It is therefore the case that there is always a buffer scheduled, except while the completion handler is executing. The 'trace' method that appears in various places is an internal method that prints only when debugging, so it can be ignored.
In the audio route change notification handler (handleAudioRouteChange), when a new device becomes available (case .newDeviceAvailable), the code restarts the engine and player, reactivates the audio session and calls 'runSequence' to kick the loop back into life.
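For illustration, a more defensive restart on route change might look like the following sketch. This is untested across the OS versions above; it reuses the names from the code below (`player1`, `engine`, `tickSeq`, `activateAudioSession`, `startHostTime`, `runSequence`), assumes `engine` is an `AVAudioEngine` started directly rather than via `startEngine`, and the 0.25 s lead time is an arbitrary example. The idea is to discard all scheduled-buffer state explicitly, so that the code no longer depends on whatever each iOS version preserves across a route change:

```swift
// Sketch: tear everything down before restarting, rather than relying on
// OS-version-specific behaviour for buffers scheduled before the route change.
player1.stop()                 // drops any buffers still scheduled on the node
engine.stop()
engine.reset()                 // discards the engine's internal render state
activateAudioSession(activate: true)
do {
    try engine.start()
    // Re-derive the next play time from "now" instead of a stale host time.
    startHostTime = mach_absolute_time() + AVAudioTime.hostTime(forSeconds: 0.25)
    tickSeq.resetSequence()
    runSequence()              // reschedule from a known-clean state
} catch {
    trace(level: .skim, items: "Engine restart failed: \(error)")
}
```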
This all works fine on iOS 9.3.5 (iPhone 5C) and iOS 10.3.3 (iPhone 6) but fails on iOS 11.1.1 (iPad Air). The nature of the failure is that the AVAudioPlayerNode does not play the audio but immediately calls the completion handler instead. This causes a runaway situation. If I remove the line that kicks off the loop again (as indicated in the code), it works fine on iOS 11.1.1 but fails on iOS 9.3.5 and iOS 10.3.3. This failure is different: the audio just stops and, in the debugger, I can see that the loop is not looping.
So a possible explanation is that, under iOS 9.x and iOS 10.x, future-scheduled buffers are unscheduled when an audio route change occurs, whereas under iOS 11.x future-scheduled buffers are not unscheduled.
This leads to two questions:

1. Has anyone seen similar behaviour to this, and what was the resolution?

2. Can anyone point me to documentation that describes the exact state of the engine, player(s) and session when an audio route change (or an audio interruption) occurs?
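One avenue that might at least disambiguate what is happening on iOS 11 is the `scheduleBuffer(_:at:options:completionCallbackType:completionHandler:)` overload added in iOS 11, which can fire the callback at `.dataPlayedBack` (after the audio has actually been rendered to the output) rather than the older "data consumed" semantics. A sketch, reusing `player1`, `thisElem` and `audioCompleteHandler` from the code below:

```swift
if #available(iOS 11.0, *) {
    // .dataPlayedBack only fires once the buffer has actually been played
    // out, so a route change that consumes-but-never-renders a buffer
    // should not trigger a runaway rescheduling loop.
    player1.scheduleBuffer(thisElem.buffer,
                           at: nil,
                           options: [],
                           completionCallbackType: .dataPlayedBack) { _ in
        audioCompleteHandler()
    }
} else {
    // Pre-iOS 11 fallback: classic "data consumed" completion semantics.
    player1.scheduleBuffer(thisElem.buffer,
                           at: nil,
                           options: [],
                           completionHandler: audioCompleteHandler)
}
```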
private func runSequence() {

    // For test only
    var timeBaseInfo = mach_timebase_info_data_t()
    mach_timebase_info(&timeBaseInfo)
    // End for test only

    let audioCompleteHandler = { [unowned self] in
        DispatchQueue.main.async {
            trace(level: .skim, items: "Player: \(self.player1.isPlaying), Engine: \(self.engine.isRunning)")
            self.player1.stop()
            switch self.runStatus {
            case .Run:
                self.runSequence()
            case .Restart:
                self.runStatus = .Run
                self.tickSeq.resetSequence()
                //self.updateRenderHostTime()
                self.runSequence()
            case .Halt:
                self.stopEngine()
                self.player1.stop()
                self.activateAudioSession(activate: false)
            }
        }
    }

    // Schedule buffer...
    if self.engine.isRunning {
        if let thisElem: (buffer: AVAudioPCMBuffer, duration: Int) = tickSeq.next() {
            self.player1.scheduleBuffer(thisElem.buffer, at: nil, options: [], completionHandler: audioCompleteHandler)
            self.player1.prepare(withFrameCount: thisElem.buffer.frameLength)
            self.player1.play(at: AVAudioTime(hostTime: self.startHostTime))
            self.startHostTime += AVAudioTime.hostTime(forSeconds: TimeInterval(Double(60.0 / Double(self.model.bpm.value)) * Double(thisElem.duration)))
            trace(level: .skim, items:
                "Samples: \(thisElem.buffer.frameLength)",
                "Time: \(mach_absolute_time() * (UInt64(timeBaseInfo.numer) / UInt64(timeBaseInfo.denom))) ",
                "Sample Time: \(player1.lastRenderTime!.hostTime)",
                "Play At: \(self.startHostTime) ",
                "Player: \(self.player1.isPlaying)",
                "Engine: \(self.engine.isRunning)")
        }
    }
}
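As a stopgap for the iOS 11 runaway case, one option might be a generation counter that lets stale completion callbacks be ignored after a route change, so the `runSequence()` call needed on iOS 9/10 can stay in place. This is a sketch only; `scheduleGeneration` is an assumed new property on this class, and the buffer-scheduling body is elided:

```swift
// Assumed new property on the class:
private var scheduleGeneration = 0

private func runSequenceGuarded() {
    // Capture the generation current at scheduling time.
    let generation = scheduleGeneration
    let handler = { [unowned self] in
        DispatchQueue.main.async {
            // Bail out if a route change has bumped the generation since
            // this buffer was scheduled: its callback is stale.
            guard generation == self.scheduleGeneration else { return }
            self.runSequenceGuarded()
        }
    }
    // ... schedule the next buffer with `handler` as before ...
}

// In handleAudioRouteChange, before restarting the engine and player:
// scheduleGeneration += 1
```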
@objc func handleAudioRouteChange(_ notification: Notification) {
    trace(level: .skim, items: "Route change: Player: \(self.player1.isPlaying) Engine: \(self.engine.isRunning)")
    guard let userInfo = notification.userInfo,
          let reasonValue = userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt,
          let reason = AVAudioSessionRouteChangeReason(rawValue: reasonValue) else { return }
    trace(level: .skim, items: audioSession.currentRoute, audioSession.mode)
    trace(level: .none, items: "Reason Value: \(reasonValue); Reason: \(String(describing: reason))")
    switch reason {
    case .newDeviceAvailable:
        trace(level: .skim, items: "In handleAudioRouteChange.newDeviceAvailable")
        for output in audioSession.currentRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
            startEngine()
            player1.play()
            activateAudioSession(activate: true)
            //updateRenderHostTime()
            runSequence() // <<--- Problem: works for iOS 9/10; fails on iOS 11. Remove it and iOS 9/10 fail, iOS 11 works
        }
    case .oldDeviceUnavailable:
        trace(level: .skim, items: "In handleAudioRouteChange.oldDeviceUnavailable")
        if let previousRoute = userInfo[AVAudioSessionRouteChangePreviousRouteKey] as? AVAudioSessionRouteDescription {
            for output in previousRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
                player1.stop()
                stopEngine()
                tickSeq.resetSequence()
                DispatchQueue.main.async {
                    if let pp = self.playPause as UIButton? { pp.isSelected = false }
                }
            }
        }
    default:
        break
    }
}