
I am writing an Audio Unit (Remote IO) based app that displays waveforms at a given buffer size. The app initially starts off with a preferred buffer duration of 0.0001, which results in very small buffer frame sizes (I think it's 14 frames). Then at runtime I have a UI element that switches buffer frame sizes via AVAudioSession's setPreferredIOBufferDuration:error: method.

Here is the code. The first two cases switch from a smaller to a larger buffer; modes 3-5 are not specified yet. But the app crashes at AudioUnitRender with a -50 error code.

- (void)setBufferSizeFromMode:(int)mode {

   NSTimeInterval bufferDuration;

   switch (mode) {
      case 1:
         bufferDuration = 0.0001;
         break;
      case 2:
         bufferDuration = 0.001;
         break;
      case 3:
      case 4:
      case 5:
         bufferDuration = 0.0; // reserved
         break;
      default:
         return; // unknown mode; bufferDuration would otherwise be uninitialized
   }

   AVAudioSession *session = [AVAudioSession sharedInstance];
   NSError *audioSessionError = nil;

   BOOL ok = [session setPreferredIOBufferDuration:bufferDuration
                                             error:&audioSessionError];
   if (!ok) {
      NSLog(@"Error %ld, %@",
            (long)audioSessionError.code, audioSessionError.localizedDescription);
   }

}

Based on reading the Core Audio and AVFoundation documentation, I was led to believe that you can change the audio hardware configuration at runtime. There may be some gaps or distortion in the audio, but I am fine with that for now. Is there an obvious reason for this crash? Or must I reinitialize everything (my audio session, my audio unit, my audio buffers, etc.) for each change of the buffer duration?

Edit: I have tried calling AudioOutputUnitStop(self.myRemoteIO); before changing the session buffer duration and then starting again after it is set. I've also tried setting the session to inactive and then reactivating it, but both result in the -50 OSStatus from AudioUnitRender() in my AU input callback.


1 Answer

A -50 error (kAudio_ParamError) usually means the audio unit code is trying to set or use an invalid parameter value.

Some iOS devices don't support actual buffer durations below 5.3 ms (or 0.0058 seconds on older devices). And iOS devices appear free to switch to an actual buffer duration 4X longer than that, or to alternate between slightly different values, at times not under the app's control.

The inNumberFrames is given to the audio unit callback as a parameter; your app can't arbitrarily specify that value.

If you want to process buffers of a given size, pull them out of an intermediate lock-free circular FIFO that the audio unit callback feeds.
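The FIFO idea above can be sketched as a minimal single-producer/single-consumer ring buffer in plain C. The names, capacity, and float sample type are illustrative choices, not part of the answer: the audio callback pushes whatever inNumberFrames it receives, and the display/processing thread pops chunks of the size it wants:

```c
#include <stdatomic.h>
#include <stddef.h>

#define FIFO_CAP 4096  /* must be a power of two */

typedef struct {
    float buf[FIFO_CAP];
    _Atomic size_t head; /* advanced by the consumer */
    _Atomic size_t tail; /* advanced by the producer */
} SampleFifo;

/* Producer side (audio callback): returns frames actually written. */
static size_t fifo_write(SampleFifo *f, const float *src, size_t n) {
    size_t tail = atomic_load_explicit(&f->tail, memory_order_relaxed);
    size_t head = atomic_load_explicit(&f->head, memory_order_acquire);
    size_t space = FIFO_CAP - (tail - head);
    if (n > space) n = space;                 /* drop what won't fit */
    for (size_t i = 0; i < n; i++)
        f->buf[(tail + i) & (FIFO_CAP - 1)] = src[i];
    atomic_store_explicit(&f->tail, tail + n, memory_order_release);
    return n;
}

/* Consumer side (processing/UI thread): returns frames actually read. */
static size_t fifo_read(SampleFifo *f, float *dst, size_t n) {
    size_t head = atomic_load_explicit(&f->head, memory_order_relaxed);
    size_t tail = atomic_load_explicit(&f->tail, memory_order_acquire);
    size_t avail = tail - head;
    if (n > avail) n = avail;
    for (size_t i = 0; i < n; i++)
        dst[i] = f->buf[(head + i) & (FIFO_CAP - 1)];
    atomic_store_explicit(&f->head, head + n, memory_order_release);
    return n;
}
```

This decouples the hardware-chosen callback size from the size your waveform code wants: the callback never blocks, and the reader simply waits until enough frames have accumulated.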

Also: try waiting a second or so after calling AudioOutputUnitStop() before changing parameters or restarting. There appears to be a delay between when you call stop and when the hardware actually stops.

  • So I have a feeling that the error has more to do with switching AudioSession properties while I/O is in progress than with the buffer size problems you mention. If I don't attempt to switch `preferredIOBufferDuration` at runtime, I can achieve a buffer size of 14 frames in each callback, a latency of about 0.000317 seconds (0.317 ms). The frame size switching to 4x is associated with the app going into background mode, I believe, and that is not a scenario I'm concerned with at the moment. – Alex Bollbach Feb 16 '16 at 04:18
  • And I'm not sure what you mean by arbitrarily setting inNumberFrames. I simply pass the local variable `inNumberFrames` from the input callback to a direct call to `AudioUnitRender()`. I don't know why I would need a circular FIFO buffer. The thing is, my current code can handle any buffer size from 14 frames up to 4096 frames if it is specified at compile time. It is the switching, which involves a complex sequence of deactivating the audio session, stopping I/O rendering, etc., that is giving me trouble. – Alex Bollbach Feb 16 '16 at 04:20
  • iOS decides the number of frames at run time; the value is unknown at compile time. Sometimes it matches the preference setting, but often not. So you can't depend on that. – hotpaw2 Feb 19 '16 at 19:21