
I'm having an issue while trying to use both AVAudioPlayer and AVAudioEngine in my app. I use AVAudioPlayer to play remote audio files and live audio streams, and AVAudioEngine to capture microphone input and transcribe it. However, whenever I use AVAudioEngine and then go back to the player, I can't play any audio: the AVAsset comes back empty, and I have to set the player up and start it again. What is the best way to use the mic without interfering with audio playback?

import AVFoundation
import Speech

let audioEngine = AVAudioEngine()
var inputNode: AVAudioInputNode!
var recognitionRequest: SFSpeechAudioBufferRecognitionRequest?

// viewDidLoad

let audioSession = AVAudioSession.sharedInstance()
try audioSession.setCategory(.record, mode: .measurement)
try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
inputNode = audioEngine.inputNode

// Tap the mic input and feed each buffer to the speech recognition request.
let recordingFormat = inputNode.outputFormat(forBus: 0)
inputNode.removeTap(onBus: 0)
inputNode.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { buffer, _ in
    self.recognitionRequest?.append(buffer)
}
audioEngine.prepare()
try audioEngine.start()

// viewWillDisappear

self.audioEngine.stop()
self.inputNode.removeTap(onBus: 0)

  • You are the one saying `setCategory(AVAudioSession.Category.record)`. That means no playback. – matt Oct 13 '19 at 22:27
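That is the root cause: a session category of .record disables output, so the session has to be reconfigured before playback will work again. A minimal sketch of that reconfiguration after the engine is stopped (the .defaultToSpeaker option is an assumption, not taken from the original code):

// After stopping the engine and removing the tap, switch the shared
// session back to a category that allows output.
audioEngine.stop()
inputNode.removeTap(onBus: 0)
do {
    let session = AVAudioSession.sharedInstance()
    // .playAndRecord keeps the mic available; plain .playback would
    // also work if no further capture is needed.
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
    try session.setActive(true)
} catch {
    print("Failed to reconfigure audio session: \(error)")
}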

1 Answer


FWIW, matt's comment solved this for me (thanks!). I had a similar problem to the OP: I'm mixing transcription via SFSpeechRecognizer (which captures audio through the engine's inputNode), recording the actual audio via AVAudioRecorder, and playing audio back via AVPlayer. Audio files (stored in Firestore, in my case) would play fine until I recorded a new one; after that, nothing would play back. For a while I thought it was latency in my storage bucket or database, or that I was mixing up record vs. play states in my functions.

SFSpeechRecognizer uses an AVAudioEngine instance within an AVAudioSession. Somehow my code had obscured the fact that my AVAudioRecorder was also configuring that same (shared) AVAudioSession on its own. (Should I run these against one session configuration? Does that actually work for basic use cases? See the combined sketch after the fix below.)

My old code:

try audioSession.setCategory(.record, mode: .measurement, options: .mixWithOthers)

My new code that works:

try audioSession.setCategory(.playAndRecord, mode: .measurement, options: .mixWithOthers)

where...

let audioSession = AVAudioSession.sharedInstance()
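Putting it together, a minimal sketch of the working configuration, with the recorder running against the same shared session as the recognition tap (the helper names, file URL, and recorder settings here are illustrative, not from my app):

import AVFoundation

// Configure the one app-wide session so capture (SFSpeechRecognizer's
// engine tap, AVAudioRecorder) and playback (AVPlayer) can coexist.
func configureSharedAudioSession() throws {
    let audioSession = AVAudioSession.sharedInstance()
    try audioSession.setCategory(.playAndRecord,
                                 mode: .measurement,
                                 options: .mixWithOthers)
    try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
}

// Illustrative AVAudioRecorder setup under that same session.
func makeRecorder() throws -> AVAudioRecorder {
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("take.m4a")
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 1
    ]
    return try AVAudioRecorder(url: url, settings: settings)
}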

I struggled with this for a while, so I'm confirming the fix here in case others come across it.

For some reason I thought I could just tear down my AVAudioSession after transcription and let AVPlayer do its thing. Part of me suspects that's still possible, but I tried several things along that path without success.
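For reference, a rough sketch of what that teardown path would look like; given the caveat above, treat it as untested:

// Stop capture, then deactivate the shared session so other audio
// (e.g. an AVPlayer elsewhere in the app) can resume.
audioEngine.stop()
inputNode.removeTap(onBus: 0)
do {
    try AVAudioSession.sharedInstance()
        .setActive(false, options: .notifyOthersOnDeactivation)
} catch {
    print("Failed to deactivate audio session: \(error)")
}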

– Sri