
So I've gone through the Audio Unit Hosting Guide for iOS, and it hints throughout that one might always want to use an AUGraph instead of direct connections between AUs. Among the most notable reasons mentioned are a high-priority rendering thread, the ability to reconfigure the graph while it is running, and general thread safety.

My problem is that I'm trying to get as close to "making my own custom DSP effects" as possible, given that iOS doesn't really let you dynamically load custom code. So my approach is to create a Generic Output unit and write the DSP code in its render callback. The problem with this approach comes when I want to chain two of these units with custom callbacks in a graph: since a graph must have a single output AU (as its head), adding any more output units won't fly. That is, I can't have two Generic Output units and a Remote I/O unit in sequence in a graph.
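To illustrate the "DSP in a render callback" idea, here is a minimal sketch in plain C. The Core Audio types are deliberately simplified stand-ins (the real callback receives `AudioUnitRenderActionFlags`, an `AudioTimeStamp`, and an `AudioBufferList`); `apply_gain` is just a toy effect of the kind that would live inside the Generic Output unit's callback:

```c
#include <stddef.h>

/* Simplified stand-in for Core Audio's OSStatus. */
typedef int OSStatus;

/* A toy "custom effect": apply a fixed gain in place. In the real
   callback you would loop over the AudioBufferList's mBuffers and run
   your DSP on each channel's sample data. */
static OSStatus apply_gain(float *samples, size_t count, float gain) {
    for (size_t i = 0; i < count; ++i)
        samples[i] *= gain;
    return 0; /* noErr */
}
```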

If I really wanted to use AUGraphs, I can think of one solution along the lines of:

  1. A graph interface that internally keeps an AUGraph with a single output unit, plus, for each connected node in the graph, a list of "custom callback" Generic Output nodes that are notionally connected to that node. Maybe it could be a class / interface over AUNode instead, but hopefully you get the idea.
  2. If I add non-output units to this graph, it simply connects them in the usual way to the backing AUGraph.
  3. If, however, I add a Generic Output node, that really means adding the node's AU to the list, and whichever node I am connecting it to in the graph actually gets its input-scope / element-0 render callback set to something like:

    for each unit in the connected list:
        call AudioUnitRender(...) and merge into ioData
    

    So the node that is "connected" to any number of those "custom" nodes basically pulls the processed output from them and hands it on to the next non-custom node.
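The pull-and-merge loop in step 3 can be sketched as plain C. Here `pull_fn` is a hypothetical stand-in for calling `AudioUnitRender` on one of the Generic Output units in the list, and "merge" is a simple sum of the rendered buffers:

```c
#include <stddef.h>

/* Stand-in for "call AudioUnitRender on one custom unit": fills dst
   with that unit's rendered output for this slice. */
typedef void (*pull_fn)(float *dst, size_t frames);

/* The "connected" node's input callback: pull every custom unit in the
   list, mix the results into ioData. scratch is a per-pull work buffer. */
static void render_and_merge(pull_fn units[], size_t n_units,
                             float *ioData, float *scratch, size_t frames) {
    for (size_t f = 0; f < frames; ++f)
        ioData[f] = 0.0f;
    for (size_t u = 0; u < n_units; ++u) {
        units[u](scratch, frames);          /* AudioUnitRender(...) */
        for (size_t f = 0; f < frames; ++f)
            ioData[f] += scratch[f];        /* merge (simple sum) */
    }
}

/* Two toy sources, just for demonstration. */
static void ones(float *dst, size_t n)   { for (size_t i = 0; i < n; ++i) dst[i] = 1.0f; }
static void halves(float *dst, size_t n) { for (size_t i = 0; i < n; ++i) dst[i] = 0.5f; }
```

Whether "merge" should be a sum, an average, or something fancier is of course one of the loose ends.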

Sure, there might be some loose ends in the idea above, but I think it would work with a bit more thought.

Now my question is: what if, instead, I make direct connections between AUs without an AUGraph at all? Without an AUGraph in the picture, I can definitely connect many Generic Output units with callbacks to one final Remote I/O output, and this seems to work just fine. The thing is, kAudioUnitProperty_MakeConnection is a property, so I'm not sure I can reset properties once an AU is initialized. I believe it's OK if I uninitialize first. If so, couldn't I just grab GCD's high-priority queue and dispatch any changes in async blocks that uninitialize, reconnect, and initialize again?
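The sequencing I have in mind looks like this. The three functions are stubs standing in for `AudioUnitUninitialize`, `AudioUnitSetProperty` with `kAudioUnitProperty_MakeConnection`, and `AudioUnitInitialize` (they just record the call order here, since this is only a sketch of the tear-down / rewire / bring-up sequence I'd dispatch onto the queue):

```c
#include <string.h>

/* Records the order of operations so the sequencing can be checked.
   In real code these would be the AudioUnit calls named above. */
static char call_log[16];

static void au_uninitialize(void)    { strcat(call_log, "U"); }
static void au_make_connection(void) { strcat(call_log, "C"); }
static void au_initialize(void)      { strcat(call_log, "I"); }

/* The body of the block I'd dispatch_async onto the high-priority
   queue: uninitialize, reset the connection property, reinitialize. */
static void reconnect(void) {
    au_uninitialize();
    au_make_connection();
    au_initialize();
}
```

Whether it's actually safe to do this while the final Remote I/O unit keeps running is exactly what I'm unsure about.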

SaldaVonSchwartz
  • If I understand you correctly, you want to use your own audio unit. It seems starting from iOS 5.0 you can do it. Start point to look into this is AudioComponentRegister function in AudioComponent.h in AudioUnit framework. – Aliaksandr Andrashuk Oct 06 '13 at 13:37
  • What I'm doing right now is creating a higher-level wrapper API that allows me to create subclasses of a Generic Output unit (all wrapped up transparently in Objective-C land) and attach render callbacks to their input scopes. Then I can also chain these units together. This feels 'almost' like creating my own units on OS X. But I'll check out `AudioComponentRegister`! – SaldaVonSchwartz Oct 06 '13 at 21:02
  • Also you can checkout AUPublic sample https://developer.apple.com/library/mac/samplecode/CoreAudioUtilityClasses/Introduction/Intro.html#//apple_ref/doc/uid/DTS40012328-Intro-DontLinkElementID_2 – Aliaksandr Andrashuk Oct 07 '13 at 11:47
