
I have my own sound engine written in C++, and it has some custom abilities. I have the decoded and processed PCM values in my engine, but I need some way to play them through a library on iOS. The engine works on a callback system: a third party requests callbacks with given parameters such as numberOfFrames, pointerToFillData, etc.

I am very new to iOS audio and do not know what gets this job done. On Android, Oboe is the library that does this.

I have found some custom libraries, like novocaine, that are very similar to what I need, but not exactly it. I looked at Core Audio, but it seems to me that the Core Audio SDK is deprecated. However, Audio Unit, which ships with Core Audio, seems like the thing I need.

There are multiple ways to do this, which is why I could really use an example plus suggestions on why your way is the preferable one...

Can anyone guide me in the right direction with examples?

I saw @hotpaw2's answer, but I need an example with a reason why his answer is better than the other available options.

cs guy

2 Answers


The iOS Audio Queue API is (currently) non-deprecated and callback-based.

See: https://developer.apple.com/documentation/audiotoolbox/audio_queue_services
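
For a sense of the shape of that API, here is a minimal sketch of an output queue in Swift. All error checking is omitted, and fillPCM is a hypothetical stand-in for a pull function on your C++ engine, not part of the API:

import AudioToolbox

// Hypothetical bridge into the C++ engine: fills `count` Float frames into `dest`.
func fillPCM(_ dest: UnsafeMutablePointer<Float>, _ count: Int) { /* ... */ }

// Mono 32-bit float PCM at 44.1 kHz.
var format = AudioStreamBasicDescription(
    mSampleRate: 44100,
    mFormatID: kAudioFormatLinearPCM,
    mFormatFlags: kLinearPCMFormatFlagIsFloat | kLinearPCMFormatFlagIsPacked,
    mBytesPerPacket: 4, mFramesPerPacket: 1, mBytesPerFrame: 4,
    mChannelsPerFrame: 1, mBitsPerChannel: 32, mReserved: 0)

// The queue invokes this callback whenever a buffer finishes playing and can be refilled.
let outputCallback: AudioQueueOutputCallback = { _, queue, buffer in
    let frames = Int(buffer.pointee.mAudioDataBytesCapacity) / MemoryLayout<Float>.size
    fillPCM(buffer.pointee.mAudioData.assumingMemoryBound(to: Float.self), frames)
    buffer.pointee.mAudioDataByteSize = buffer.pointee.mAudioDataBytesCapacity
    AudioQueueEnqueueBuffer(queue, buffer, 0, nil)
}

var queue: AudioQueueRef?
AudioQueueNewOutput(&format, outputCallback, nil, nil, nil, 0, &queue)

// Allocate and prime a few buffers so playback starts with data already queued.
for _ in 0..<3 {
    var buffer: AudioQueueBufferRef?
    AudioQueueAllocateBuffer(queue!, 4096, &buffer)
    outputCallback(nil, queue!, buffer!)
}
AudioQueueStart(queue!, nil)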

hotpaw2
  • Thank you for replying, I really appreciate it. Here are some questions on my mind: right now my audio engine only has 2 effects. I might be totally wrong here, just a wild guess, but can I somehow use Apple's built-in effects to add effects to my PCM data? Also, I've seen an example of AVAudioEngine AudioUnit with callbacks here: https://stackoverflow.com/questions/50996398/play-audio-on-ios-from-a-memory-data-stream. What are the differences between yours and this? – cs guy Mar 27 '21 at 21:05
  • Also, do you have any good examples for this? I tried to look it up, but all of the examples are from 2012 to 2015. – cs guy Mar 27 '21 at 23:31
  • @csguy Audio Queues haven't changed much in the last 10 years. – Rhythmic Fistman Mar 29 '21 at 20:16
  • @RhythmicFistman I see. Then why should one prefer Audio Queues over Audio Units? – cs guy Mar 29 '21 at 20:26
  • I use the remote IO audio unit for output when latency is important. iOS's I/O buffer size isn't explicitly tied to any API, and maybe you can configure an AQ or `AVAudioEngine` to do the right thing, but I think your chances are better with the AU. @hotpaw2 are you sure `AudioUnit`s are deprecated? I thought it was only the `AUGraph` API. – Rhythmic Fistman Mar 29 '21 at 20:52
  • Thanks for the correction. Only a few items inside the Audio Unit C API are deprecated: Inter-App Audio, etc. – hotpaw2 Mar 29 '21 at 21:39
  • So in my case I care about audio latency and would like to be able to set the buffer size and feed my PCM values through this buffer. What is the best framework/class/API for this? @RhythmicFistman – cs guy Mar 29 '21 at 22:05
  • Audio Units do not allow setting the buffer size on iOS devices, only suggesting a size via the Audio Session API. But if you care about latency and can code in real-time C (a less common skill set these days), then use the RemoteIO Audio Unit. – hotpaw2 Mar 29 '21 at 22:09
  • Does RemoteIO allow setting the buffer size? My engine is written in C++; I'm not sure if I can call C++ functions from C, I've never tried this before. Also, how do you know all of this information? I tried to look online, but there are literally zero resources besides the old archived Core Audio documentation. I can't find resources for learning the iOS audio libraries. @hotpaw2 – cs guy Mar 29 '21 at 22:24
  • Some of the information was presented in WWDC sessions over the past 10 years. More information can be found inside Apple's header files, or here on stackoverflow. – hotpaw2 Mar 30 '21 at 05:01
  • @hotpaw2 could you share an example of a RemoteIO AudioUnit that you play your PCM with? I really couldn't find a decent one... – cs guy Apr 25 '21 at 18:21
  • Some RemoteIO code I posted on GitHub is for recording PCM, not playing, and is in Swift, not C. RecordAudio.swift: https://gist.github.com/hotpaw2 – hotpaw2 Apr 26 '21 at 00:38
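
Pulling the comment thread together, here is a rough sketch of the RemoteIO route in Swift: a render callback that pulls PCM from the engine, plus the Audio Session buffer-size suggestion. engineRender is a hypothetical C/C++ bridge into the asker's engine, and all error checking is omitted:

import AudioToolbox
import AVFoundation

// Hypothetical bridge into the C++ engine: fills `frameCount` Float frames into `dest`.
func engineRender(_ dest: UnsafeMutablePointer<Float>, _ frameCount: Int) { /* ... */ }

func makeRemoteIO() -> AudioUnit? {
    var desc = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: kAudioUnitSubType_RemoteIO,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0, componentFlagsMask: 0)
    guard let component = AudioComponentFindNext(nil, &desc) else { return nil }
    var unit: AudioUnit?
    AudioComponentInstanceNew(component, &unit)
    guard let au = unit else { return nil }

    // Mono 32-bit float PCM at 44.1 kHz on the input scope of the output element.
    var format = AudioStreamBasicDescription(
        mSampleRate: 44100,
        mFormatID: kAudioFormatLinearPCM,
        mFormatFlags: kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked,
        mBytesPerPacket: 4, mFramesPerPacket: 1, mBytesPerFrame: 4,
        mChannelsPerFrame: 1, mBitsPerChannel: 32, mReserved: 0)
    AudioUnitSetProperty(au, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input, 0, &format,
                         UInt32(MemoryLayout.size(ofValue: format)))

    // The render callback runs on the real-time audio thread:
    // no locks, no allocation, nothing that can block.
    var callback = AURenderCallbackStruct(
        inputProc: { _, _, _, _, inNumberFrames, ioData in
            let buffers = UnsafeMutableAudioBufferListPointer(ioData!)
            let dest = buffers[0].mData!.assumingMemoryBound(to: Float.self)
            engineRender(dest, Int(inNumberFrames))
            return noErr
        },
        inputProcRefCon: nil)
    AudioUnitSetProperty(au, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &callback,
                         UInt32(MemoryLayout.size(ofValue: callback)))

    AudioUnitInitialize(au)
    return au
}

// Suggest a small I/O buffer (5 ms here); the system treats this as a hint, not a guarantee.
try? AVAudioSession.sharedInstance().setPreferredIOBufferDuration(0.005)
if let au = makeRemoteIO() { AudioOutputUnitStart(au) }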

You definitely need an AVAudioEngine.

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
var scheduledFrames: AVAudioFrameCount = 0

init() throws {
    engine.attach(player)
    // the player must be connected to the output chain, or scheduled buffers go nowhere
    engine.connect(player, to: engine.mainMixerNode, format: nil)
    engine.prepare()
    try engine.start()
    player.play()
}

func addBuffer(buffer: AVAudioPCMBuffer) {
    // this will add the buffer to the end of the queue, read more in the docs
    scheduledFrames += buffer.frameLength
    player.scheduleBuffer(buffer, at: nil) {
        // the buffer has finished playing here
        scheduledFrames -= buffer.frameLength
        if scheduledFrames < 44100 { // about a second; calculate depending on your format
            // request a new buffer
        }
    }
}

You can create an AVAudioPCMBuffer like this:

var data: UnsafeMutablePointer<UnsafeMutablePointer<Float>>!
// fill your data: one pointer per channel, `count` samples each
let count = 100
guard
    let format = AVAudioFormat(
        standardFormatWithSampleRate: 44100,
        channels: 1
    ),
    let buffer = AVAudioPCMBuffer(
        pcmFormat: format,
        frameCapacity: AVAudioFrameCount(count)
    )
else {
    // throw
    return
}
for channel in 0..<Int(format.channelCount) {
    memcpy(buffer.floatChannelData![channel],
           data[channel],
           count * MemoryLayout<Float>.size)
}
// frameLength defaults to 0; set it so the copied samples are actually played
buffer.frameLength = AVAudioFrameCount(count)

Replace floatChannelData with int16ChannelData or int32ChannelData if you need a different sample format.
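
Note that standardFormatWithSampleRate always produces deinterleaved Float32 buffers, so int16ChannelData would be nil on the buffer above. For 16-bit samples you would presumably also need a matching format, something like:

// Assumed variant for 16-bit PCM; int16ChannelData is only non-nil
// when the buffer's format really is .pcmFormatInt16.
let int16Format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                sampleRate: 44100,
                                channels: 1,
                                interleaved: false)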

Phil Dukhov
  • Nearly what I need. How may I schedule continuous play? I mean, I need to be able to play my data continuously with no gaps, not only once with a single buffer like here. – cs guy Mar 30 '21 at 12:31
  • I've added the scheduled frames calculation logic. `scheduleBuffer`'s callback gets called after the buffer is played, so just keep around 1 sec of buffers in the queue. – Phil Dukhov Mar 30 '21 at 21:02
  • Not quite what I was hoping for; this seems like a workaround. This code seems fit to play only one buffer, am I mistaken? – cs guy Mar 31 '21 at 00:23
  • 1
    No, you can schedule many buffers at once. This is more like a pseudocode, you need to start requesting buffers when start playing and stop when you have scheduled enough, and keep scheduling: when n buffer completion gets called you schedule buffer number n+44100. So you always keep 1 sec of buffers scheduled to play. You can schedule all the buffers at once if you don’t worry about memory usage, it’ll work too. To reset buffers queue you need to call `player.stop()` – Phil Dukhov Mar 31 '21 at 03:38
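
Taking the answer's addBuffer idea to its continuous form, a sketch might look like this. nextEngineBuffer() is a hypothetical function that pulls the next PCM chunk from the C++ engine, and since scheduleBuffer completion handlers run on an internal queue, real code should synchronize access to scheduledFrames:

// Hypothetical: wraps the C++ engine and returns the next filled AVAudioPCMBuffer.
func nextEngineBuffer() -> AVAudioPCMBuffer { fatalError("bridge to your engine here") }

func topUp() {
    // keep about one second (44100 frames at 44.1 kHz) scheduled ahead
    while scheduledFrames < 44100 {
        let buffer = nextEngineBuffer()
        scheduledFrames += buffer.frameLength
        player.scheduleBuffer(buffer, at: nil) {
            scheduledFrames -= buffer.frameLength
            topUp() // refill the queue as buffers finish playing
        }
    }
    player.play() // harmless if the player is already playing
}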