I currently have a RemoteIO Audio Unit configured and working: it simply takes input and passes it to output, so I can hear myself through my iPhone's headphones when I speak into its microphone.
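For context, the pass-through setup looks roughly like this (a sketch from memory; the audioUnit variable name is my own):

import AudioToolbox

var description = AudioComponentDescription(
    componentType: kAudioUnitType_Output,
    componentSubType: kAudioUnitSubType_RemoteIO,
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0
)
var audioUnit: AudioUnit?
AudioComponentInstanceNew(AudioComponentFindNext(nil, &description)!, &audioUnit)

// Enable input on element 1 (the mic side); output on element 0 is enabled by default.
var enable: UInt32 = 1
AudioUnitSetProperty(audioUnit!,
                     kAudioOutputUnitProperty_EnableIO,
                     kAudioUnitScope_Input,
                     1,
                     &enable,
                     UInt32(MemoryLayout<UInt32>.size))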
The next step is to add optional effects and create a chain. I understand that AUGraph has been deprecated and that I need to use kAudioUnitProperty_MakeConnection to connect things together, but I have a few key questions, and I'm unable to get audio out just yet.
Firstly: if I want to go RemoteIO Input -> Reverb -> RemoteIO Output, do I need two instances of the RemoteIO Audio Unit, or can I use the same one? I'm guessing just one, with different things connected to its input and output elements, but I'm having trouble making that happen. What I'm attempting is sketched below.
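This is my understanding of the wiring (a sketch; remoteIOUnit and reverbUnit stand in for my already-instantiated units, and the bus numbers are my best guess):

import AudioToolbox

// RemoteIO's input element (bus 1, the mic) feeds the reverb's input (bus 0).
// The connection is set on the destination unit, in its input scope:
var micToReverb = AudioUnitConnection(
    sourceAudioUnit: remoteIOUnit,
    sourceOutputNumber: 1,
    destInputNumber: 0
)
AudioUnitSetProperty(reverbUnit,
                     kAudioUnitProperty_MakeConnection,
                     kAudioUnitScope_Input,
                     0,
                     &micToReverb,
                     UInt32(MemoryLayout<AudioUnitConnection>.size))

// The reverb's output (bus 0) feeds RemoteIO's output element (bus 0, the headphones):
var reverbToOutput = AudioUnitConnection(
    sourceAudioUnit: reverbUnit,
    sourceOutputNumber: 0,
    destInputNumber: 0
)
AudioUnitSetProperty(remoteIOUnit,
                     kAudioUnitProperty_MakeConnection,
                     kAudioUnitScope_Input,
                     0,
                     &reverbToOutput,
                     UInt32(MemoryLayout<AudioUnitConnection>.size))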
Secondly: how do render callbacks play into this? I implemented a single render callback (an AURenderCallbackStruct), set it as the kAudioUnitProperty_SetRenderCallback property on my RemoteIO Audio Unit, and in the callback's implementation I do this:
func performRender(
    _ inRefCon: UnsafeMutableRawPointer,
    ioActionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>,
    inTimeStamp: UnsafePointer<AudioTimeStamp>,
    inBusNumber: UInt32,
    inNumberFrames: UInt32,
    ioData: UnsafeMutablePointer<AudioBufferList>?
) -> OSStatus {
    guard let unit = audioUnit, let ioData = ioData else {
        fatalError("Asked to render before the AURemoteIO was created.")
    }
    // Pull microphone samples from RemoteIO's input element (bus 1) into ioData.
    return AudioUnitRender(unit, ioActionFlags, inTimeStamp, 1, inNumberFrames, ioData)
}
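For completeness, I register it roughly like this (a sketch; it assumes performRender is a global function, since an instance method can't be passed as a C function pointer):

var callback = AURenderCallbackStruct(
    inputProc: performRender,
    inputProcRefCon: nil
)
AudioUnitSetProperty(audioUnit!,
                     kAudioUnitProperty_SetRenderCallback,
                     kAudioUnitScope_Input,
                     0,
                     &callback,
                     UInt32(MemoryLayout<AURenderCallbackStruct>.size))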
Do I need a render callback at all to make this work? Or do I need two: one to render from RemoteIO -> Reverb, and another to render from Reverb -> RemoteIO?
The Core Audio documentation is notoriously sketchy, and I'm having trouble finding any up-to-date information on how to do this without the deprecated AUGraph.
Any advice hugely appreciated!