
I've hunted high and low and cannot find a solution to this problem. I am looking for a method to change the input/output devices which an AVAudioEngine will use on macOS.

When simply playing back an audio file the following works as expected:

var outputDeviceID: AudioDeviceID = xxx
// The property data is an AudioDeviceID, so pass its size
let result: OSStatus = AudioUnitSetProperty(outputUnit,
                                            kAudioOutputUnitProperty_CurrentDevice,
                                            kAudioUnitScope_Global,
                                            0,
                                            &outputDeviceID,
                                            UInt32(MemoryLayout<AudioDeviceID>.size))
if result != 0 {
    print("error setting output device \(result)")
    return
}
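
For context, the outputDeviceID above comes from enumerating the system's audio devices. A rough sketch using the AudioObjectGetPropertyData APIs (filtering for devices that actually have output streams is omitted here):

```swift
import CoreAudio

// Sketch: list every audio device ID known to the system.
// outputDeviceID would be picked from a list like this one.
func allDeviceIDs() -> [AudioDeviceID] {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDevices,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMaster)
    var dataSize: UInt32 = 0
    guard AudioObjectGetPropertyDataSize(AudioObjectID(kAudioObjectSystemObject),
                                         &address, 0, nil, &dataSize) == noErr else { return [] }
    var ids = [AudioDeviceID](repeating: 0,
                              count: Int(dataSize) / MemoryLayout<AudioDeviceID>.size)
    guard AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                     &address, 0, nil, &dataSize, &ids) == noErr else { return [] }
    return ids
}
```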

However, if I initialize the audio input (with let input = engine.inputNode), I get an error as soon as I attempt to start the engine:

AVAEInternal.h:88 required condition is false: [AVAudioEngine.mm:1055:CheckCanPerformIO: (canPerformIO)]

I know my playback code is OK: if I don't change the output device I can hear both the microphone and the audio file, and if I change the output device but don't initialize the inputNode, the file plays to the specified destination.

In addition, I have been trying to change the input device. From various sources I understood that the following should do it:

let result1: OSStatus = AudioUnitSetProperty(inputUnit,
                                             kAudioOutputUnitProperty_CurrentDevice,
                                             kAudioUnitScope_Output,
                                             0,
                                             &inputDeviceID,
                                             UInt32(MemoryLayout<AudioDeviceID>.size))
if result1 != 0 {
    print("failed with error \(result1)")
    return
}

However, this doesn't work. In most cases it throws an error (10853), although if I select a sound card that has both inputs and outputs it succeeds. It appears that when I attempt to set the device on either the output or the input node, it actually sets the device for both.

That would suggest an AVAudioEngine instance can only deal with one device, yet it is quite happy working with the default devices (mic and speakers/headphones), so I am confident that isn't the issue. Some solutions I have seen online simply change the system default input instead, but that isn't a particularly nice solution.
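
For reference, the default-input workaround I've seen looks roughly like this. It changes the *system-wide* default input device (affecting every app), which is why I'd rather avoid it:

```swift
import CoreAudio

// Sketch of the system-wide workaround: set the system's default
// input device rather than the engine's device.
func setDefaultInputDevice(_ deviceID: AudioDeviceID) -> OSStatus {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultInputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMaster)
    var id = deviceID
    return AudioObjectSetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                      &address, 0, nil,
                                      UInt32(MemoryLayout<AudioDeviceID>.size), &id)
}
```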

Does anyone have any ideas as to whether this is possible?

It's worth noting that kAudioOutputUnitProperty_CurrentDevice is the only relevant property; there is no equivalent kAudioInputUnitProperty_CurrentDevice key because, as I understand it, both the inputNode and the outputNode are classed as "output units" (they both emit audio somewhere).

Any ideas would be much appreciated, as this is very frustrating!

Thanks

TylerP
  • I've also tried this with inputNode.auAudioUnit.setDeviceID(xxx) and seem to have the same issue. As far as I can see, you are only able to move away from the default audio device if either a) you are only outputting audio, or b) you use one device for both input and output. – Richard Williamson May 15 '20 at 22:03
  • Did you find a solution? I'm experiencing the exact same problem... – Chris May 22 '20 at 15:35
  • Had a reply from Apple - will do an answer. – Richard Williamson May 23 '20 at 16:04

1 Answer


So I filed a support request with Apple on this and another issue. The response confirms that an AVAudioEngine can only be assigned to a single aggregate device (that is, a device with both input and output channels). The system default units effectively create an aggregate device internally, which is why they work. I've also found an additional issue: if the input device has output capabilities (and you activate the inputNode), then that device must serve as both the input and the output device, as otherwise the output appears not to work.

So the answer is that, I think, there is no answer.
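
For anyone wanting to try the aggregate route programmatically, something along these lines should create an aggregate device with AudioHardwareCreateAggregateDevice, which the engine could then use for both input and output. This is a sketch only: the name and UID strings are made up, error handling is minimal, and each sub-device's UID would be looked up via kAudioDevicePropertyDeviceUID.

```swift
import CoreAudio

// Sketch: combine a chosen input device and output device into one
// aggregate device, since AVAudioEngine appears limited to a single device.
// The aggregate name/UID strings below are hypothetical.
func createAggregate(inputUID: String, outputUID: String) -> AudioObjectID? {
    let description: [String: Any] = [
        kAudioAggregateDeviceNameKey: "EngineAggregate",
        kAudioAggregateDeviceUIDKey: "com.example.engine-aggregate",
        kAudioAggregateDeviceSubDeviceListKey: [
            [kAudioSubDeviceUIDKey: inputUID],
            [kAudioSubDeviceUIDKey: outputUID]
        ]
    ]
    var aggregateID = AudioObjectID(0)
    let status = AudioHardwareCreateAggregateDevice(description as CFDictionary,
                                                    &aggregateID)
    return status == noErr ? aggregateID : nil
}
```

(When done with the device, AudioHardwareDestroyAggregateDevice should be called to remove it.)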

  • I wonder if it would be possible to wrap an `AUAudioUnit` for an input device in an `AVAudioSourceNode` and use that in lieu of `inputNode`. I wrote some code that seemed promising but didn't work properly, but the approach could be workable. – sbooth May 26 '20 at 20:32
  • I tried that, but it appears that you can't get a hardware input into an audioEngine that way. I think the solution would be to create a separate AudioUnit, write from that to a buffer, and read the buffer with a player node, but I haven't had a chance to try this yet. The CoreAudio "Play Through" example from Apple does this, but without an engine, and appears to have pretty low latency. I tried doing a tap on the inputNode, but the latency on that was at least 100ms. – Richard Williamson May 28 '20 at 13:46
  • Thank you for documenting everything that you discovered. I too am struggling with this, trying to take input device A and play it through output device B and have tried many of the things that you described. Did you get any further with this? Do you know how applications like GarageBand can do it? I will try to find that CoreAudio "Play Through" example... – chaimp Jan 18 '21 at 02:09
  • I believe most do what the Play Through example does, which is to write to a buffer and then read from that into the other device - that's what I ended up having to do. – Richard Williamson Jan 26 '21 at 13:18
  • My guess: An input unit requires an output unit using the same oscillator clock to pull from the input. Without the right clock to pull, using a copy buffer in between allows your custom software to manage any tiny differences in clock frequencies (and thus sample rates) with fancy error-concealment strategies (well outside the scope of AVAudioEngine). Otherwise, audio buffer underflow or overflow is guaranteed by differences between the two crystal oscillators. – hotpaw2 Jul 31 '22 at 07:34