
I have been trying to play signed 16-bit stream data with AVAudioEngine.

But passing that AVAudioFormat to the connect function always causes a crash.

The code looks like this:

let AUDIO_OUTPUT_SAMPLE_RATE    = 44100
let AUDIO_OUTPUT_CHANNELS       = 2
let AUDIO_OUTPUT_BITS           = 16
var audioEngine: AVAudioEngine?
var audioPlayer: AVAudioPlayerNode?

...

    audioEngine = AVAudioEngine()
    audioPlayer = AVAudioPlayerNode()

    audioEngine?.attach(audioPlayer!)

    let mixer = audioEngine?.mainMixerNode
    mixer!.outputVolume = 1.0
    let stereoFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: Double(AUDIO_OUTPUT_SAMPLE_RATE), channels: 2, interleaved: false)
    audioEngine!.connect(audioPlayer!, to: mixer!, format: stereoFormat)

...

The audioEngine!.connect(...) call is the line that crashes.

I'm using Xcode 8 beta 6 on OS X El Capitan, and the crash happens on both the simulator and real devices.

This is part of the crash message:

ERROR:    >avae> AVAudioNode.mm:751: AUSetFormat: error -10868
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'

...
3   AVFAudio  0x000000011e0a5630 _Z19AVAE_RaiseExceptionP8NSStringz + 176
4   AVFAudio  0x000000011e0f270d _ZN19AVAudioNodeImplBase11AUSetFormatEP28OpaqueAudioComponentInstancejjP13AVAudioFormat + 213
5   AVFAudio  0x000000011e0f2630 _ZN19AVAudioNodeImplBase15SetOutputFormatEmP13AVAudioFormat + 46
6   AVFAudio  0x000000011e0f9663 _ZN21AVAudioPlayerNodeImpl15SetOutputFormatEmP13AVAudioFormat + 25
7   AVFAudio  0x000000011e099cfd _ZN18AVAudioEngineGraph8_ConnectEP19AVAudioNodeImplBaseS1_jjP13AVAudioFormat + 2377
8   AVFAudio  0x000000011e09d15f _ZN18AVAudioEngineGraph7ConnectEP11AVAudioNodeS1_mmP13AVAudioFormat + 355
9   AVFAudio  0x000000011e0fc80e _ZN17AVAudioEngineImpl7ConnectEP11AVAudioNodeS1_mmP13AVAudioFormat + 348

There is no problem when I play a buffer using the format taken from an audio file.

What mistake(s) am I making?

Thanks.

oozoofrog

4 Answers


-10868 is kAudioUnitErr_FormatNotSupported, so it looks like your .pcmFormatInt16 isn't appreciated. Changing it to .pcmFormatFloat32 does work.
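In the question's code that means only the commonFormat argument needs to change; a minimal sketch, keeping the same variables:

    let stereoFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                     sampleRate: Double(AUDIO_OUTPUT_SAMPLE_RATE),
                                     channels: 2,
                                     interleaved: false)
    // Float32 matches the mixer's native format, so this connection no longer raises -10868
    audioEngine!.connect(audioPlayer!, to: mixer!, format: stereoFormat)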

Rhythmic Fistman
    I met the same situation and used Float32 to fix it. I noticed that the default audio format of the outputNode is also Float32 on 3 different iOS devices. Does that mean iOS devices only support playback in Float32? – Porter Liu Nov 12 '16 at 19:16

From Apple's AVAudioPlayerNode documentation:

When playing buffers, there's an implicit assumption that the buffers are at the same sample rate as the node’s output format.

Then print the AVAudioPlayerNode's output format, which is the input format of engine.mainMixerNode:

    open func connectNodes() {
        print(engine.mainMixerNode.inputFormat(forBus: 0))
        engine.connect(playerNode, to: engine.mainMixerNode, format: readFormat)
    }

The result is

<AVAudioFormat 0x6000024c18b0: 2 ch, 44100 Hz, Float32, non-inter>

So choose .pcmFormatFloat32 instead of .pcmFormatInt16.
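Another way to make sure the formats line up is to connect with the mixer's own input format rather than hard-coding one; a sketch, reusing the engine/playerNode names from above:

    // Connect using the mixer's native format (Float32, 44.1 kHz here),
    // so the player node's output always matches what the mixer expects
    let mixerFormat = engine.mainMixerNode.inputFormat(forBus: 0)
    engine.connect(playerNode, to: engine.mainMixerNode, format: mixerFormat)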


See also: the OSStatus error lookup website.

dengApro

You don't need to create the audio format yourself.

Get the audio format from the actual audio data (a PCM buffer or an audio file).

This audio format guidance is from WWDC 2016, Delivering an Exceptional Audio Experience.
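For example, a minimal sketch that takes the format from an audio file instead of building one by hand (audioFileURL is a placeholder for your own file URL):

    import AVFoundation

    let engine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()
    engine.attach(playerNode)

    // The file's processingFormat describes the data as Core Audio will deliver it
    let file = try AVAudioFile(forReading: audioFileURL)
    engine.connect(playerNode, to: engine.mainMixerNode, format: file.processingFormat)

    try engine.start()
    playerNode.scheduleFile(file, at: nil, completionHandler: nil)
    playerNode.play()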



The underlying audio format theory is from WWDC 2015, What's New in Core Audio:

  • the actual audio format from the output
  • the actual audio format from the input
  • do audio channel mapping, and keep the audio bit depth the same

dengST30

You can use a signed 16-bit audio format, but you need to convert it first.

// Setup your own format
let inputFormat = AVAudioFormat(
    commonFormat: .pcmFormatInt16,
    sampleRate: 44100,
    channels: AVAudioChannelCount(2),
    interleaved: true
)!

let engine = AVAudioEngine()
// Use system format as output format
let outputFormat = engine.mainMixerNode.outputFormat(forBus: 0)
self.converter = AVAudioConverter(from: inputFormat, to: outputFormat)!

self.playerNode = AVAudioPlayerNode()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

...

// Prepare input and output buffer
let inputBuffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: maxSamplesPerBuffer)!
let outputBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: maxSamplesPerBuffer)!

// When you fill your Int16 buffer with data, send it to converter
self.converter.convert(to: outputBuffer, error: nil) { inNumPackets, outStatus in
    outStatus.pointee = .haveData
    return inputBuffer
}

// outputBuffer now holds the audio in the system format, so we can play it
self.playerNode.scheduleBuffer(outputBuffer)
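
How the Int16 input buffer gets filled depends on where your stream comes from; here is a rough sketch assuming the samples arrive as interleaved Int16 values in a Data object (the fill helper and its names are illustrative, not part of the code above):

import AVFoundation

// Copy interleaved Int16 samples from incoming stream bytes into the input buffer
func fill(_ buffer: AVAudioPCMBuffer, with data: Data) {
    let bytesPerFrame = Int(buffer.format.streamDescription.pointee.mBytesPerFrame)
    let frameCount = min(data.count / bytesPerFrame, Int(buffer.frameCapacity))
    buffer.frameLength = AVAudioFrameCount(frameCount)
    data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
        guard let src = raw.baseAddress,
              let dst = buffer.int16ChannelData?[0] else { return }
        memcpy(dst, src, frameCount * bytesPerFrame)
    }
}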

nightwill