
I'm having trouble getting AVAudioEngine (OS X) to play nice with all sample rates.

Here's my code for building the connections:

- (void)makeAudioConnections {

  // Use the output hardware's sample rate for every connection format.
  auto hardwareFormat = [self.audioEngine.outputNode outputFormatForBus:0];
  auto format = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:hardwareFormat.sampleRate channels:2];
  NSLog(@"format: %@", format);

  @try {
    // input -> avNode -> main mixer, all at the hardware rate.
    [self.audioEngine connect:self.avNode to:self.audioEngine.mainMixerNode format:format];
    [self.audioEngine connect:self.audioEngine.inputNode to:self.avNode format:format];
  } @catch (NSException *e) {
    NSLog(@"exception: %@", e);
  }
}

On my audio interface, the render callback is called at 44.1, 48, and 176.4 kHz, but not at 96 or 192 kHz. On the built-in audio, the callback is called at 44.1, 48, and 88.2 kHz, but not at 96 kHz.

My AU's allocateRenderResourcesAndReturnError is being called for 96kHz. No errors are returned.

- (BOOL)allocateRenderResourcesAndReturnError:(NSError * _Nullable *)outError {

  if (![super allocateRenderResourcesAndReturnError:outError]) {
    return NO;
  }

  // Pick up whatever sample rate was negotiated on the input bus.
  _inputBus.allocateRenderResources(self.maximumFramesToRender);
  _sampleRate = _inputBus.bus.format.sampleRate;

  return YES;
}

Here's my AU's init method, which is mostly just cut & paste from Apple's AUv3 demo:

- (instancetype)initWithComponentDescription:(AudioComponentDescription)componentDescription options:(AudioComponentInstantiationOptions)options error:(NSError **)outError {

  self = [super initWithComponentDescription:componentDescription options:options error:outError];

  if (self == nil) {
    return nil;
  }

  // Initialize a default format for the busses.
  AVAudioFormat *defaultFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100. channels:2];

  // Create the input and output busses.
  _inputBus.init(defaultFormat, 8);
  _outputBus = [[AUAudioUnitBus alloc] initWithFormat:defaultFormat error:nil];

  // Create the input and output bus arrays.
  _inputBusArray  = [[AUAudioUnitBusArray alloc] initWithAudioUnit:self busType:AUAudioUnitBusTypeInput  busses: @[_inputBus.bus]];
  _outputBusArray = [[AUAudioUnitBusArray alloc] initWithAudioUnit:self busType:AUAudioUnitBusTypeOutput busses: @[_outputBus]];

  self.maximumFramesToRender = 256;

  return self;
}

To keep things simple, I'm setting the sample rate before starting the app.
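
For reference, here's a rough sketch of how the rate could be pinned in code instead of through Audio MIDI Setup. The helper name is illustrative, error checking is elided, and it assumes the default output device:

#import <CoreAudio/CoreAudio.h>

static void SetDefaultOutputDeviceSampleRate(Float64 sampleRate) {
  AudioDeviceID deviceID = kAudioObjectUnknown;
  UInt32 size = sizeof(deviceID);
  AudioObjectPropertyAddress addr = {
    kAudioHardwarePropertyDefaultOutputDevice,
    kAudioObjectPropertyScopeGlobal,
    kAudioObjectPropertyElementMaster
  };

  // Find the default output device, then set its nominal sample rate.
  AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr, 0, NULL, &size, &deviceID);
  addr.mSelector = kAudioDevicePropertyNominalSampleRate;
  AudioObjectSetPropertyData(deviceID, &addr, 0, NULL, sizeof(sampleRate), &sampleRate);
}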

I'm not sure where to begin tracking this down.

Update

Here's a small project which reproduces the issue I'm having:

Xcode project to reproduce issue

You'll get errors pulling from the input at certain sample rates.

On my built-in audio running at 96 kHz, the render block is called with alternating 511- and 513-frame counts, producing errors -10863 (kAudioUnitErr_CannotDoInCurrentContext) and -10874 (kAudioUnitErr_TooManyFramesToProcess) respectively. Increasing maximumFramesToRender doesn't seem to help.

Update 2

I simplified my test down to just connecting the input to the main mixer:

[self.audioEngine connect:self.audioEngine.inputNode to:self.audioEngine.mainMixerNode format:nil];

I also tried explicitly setting the format argument; either way, it still will not play through at 96 kHz. So I'm thinking this may be a bug in AVAudioEngine.
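
For completeness, a sketch of the whole repro around that one-liner (engine creation and start are assumed scaffolding, not shown in the sample project above):

AVAudioEngine *engine = [[AVAudioEngine alloc] init];

// Accessing inputNode creates and wires the input; a nil format lets the
// engine choose the connection format.
[engine connect:engine.inputNode to:engine.mainMixerNode format:nil];

NSError *error = nil;
if (![engine startAndReturnError:&error]) {
  NSLog(@"failed to start engine: %@", error);
}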

Taylor
  • To reproduce do I have to set my hardware sample rate in Audio MIDI Setup or change something in the code? – Rhythmic Fistman Jun 14 '16 at 14:44
  • @RhythmicFistman Sorry, for not being clear. You'll need to change the sample rate in Audio MIDI Setup. – Taylor Jun 14 '16 at 15:39
  • Ok, now I get errors too. My `AURenderPullInputBlock` is returning `kAudioUnitErr_TooManyFramesToProcess` = -10874, because there's a sample rate converter somewhere in there causing me to pull 514 frames instead of 512. Is that what you see? What happens if you break the stream into 512 frame chunks? Why it is rate converting is a mystery to me, maybe my output device doesn't actually support 96kHz. – Rhythmic Fistman Jun 14 '16 at 21:53
  • @RhythmicFistman do you mean to call the `pullInputBlock` multiple times with smaller chunks? – Taylor Jun 16 '16 at 02:47
  • Yes, with 512 frame chunks (or whatever it's expecting/asking for). – Rhythmic Fistman Jun 16 '16 at 02:48
  • @RhythmicFistman I just tried hardcoding 512-frame chunks (sketched just below these comments) and the error went away. I shouldn't have to compensate for differing chunk sizes between input and output though, should I? – Taylor Jun 16 '16 at 02:56
  • I think you always have to (in Core Audio and programming in general). Do you know any APIs where you don't have to? I used to think iOS remote IO audio units were one, but then new hardware came out and once again I had to rechunkify my audio stream. OTOH, I don't know this API and maybe this behaviour is genuinely surprising or indicative of a bug. – Rhythmic Fistman Jun 16 '16 at 03:16
  • @RhythmicFistman Apparently the `mainMixerNode` built into `AVAudioEngine` is supposed to sample rate convert. Also, when I remove my AU and just hook input to the mixer directly, it doesn't work at 96kHz. Methinks `AVAudioEngine` is just buggy. – Taylor Jun 16 '16 at 06:49
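
Here's a rough sketch of the 512-frame-chunk workaround from the comments above, meant to be called from the AU's internalRenderBlock in place of a single pull. The helper name, the 8-buffer cap, and the float-sample assumption are illustrative, not from the original project:

#import <AudioToolbox/AudioToolbox.h>

static const AUAudioFrameCount kMaxPullFrames = 512;

static AUAudioUnitStatus PullInputInChunks(AURenderPullInputBlock pullInputBlock,
                                           AudioUnitRenderActionFlags *actionFlags,
                                           const AudioTimeStamp *timestamp,
                                           AUAudioFrameCount totalFrames,
                                           NSInteger busNumber,
                                           AudioBufferList *bufferList) {
  if (bufferList->mNumberBuffers > 8) {
    return kAudioUnitErr_FormatNotSupported;
  }

  // Stack storage for a buffer list describing one chunk (up to 8 buffers).
  struct { AudioBufferList list; AudioBuffer extra[7]; } chunkStorage;

  AudioTimeStamp ts = *timestamp;  // advance a local copy per chunk
  AUAudioFrameCount framesDone = 0;

  while (framesDone < totalFrames) {
    AUAudioFrameCount remaining = totalFrames - framesDone;
    AUAudioFrameCount chunk = (remaining < kMaxPullFrames) ? remaining : kMaxPullFrames;

    // Describe a window into the caller's buffers, offset by framesDone.
    chunkStorage.list.mNumberBuffers = bufferList->mNumberBuffers;
    for (UInt32 i = 0; i < bufferList->mNumberBuffers; ++i) {
      const AudioBuffer *src = &bufferList->mBuffers[i];
      AudioBuffer *dst = &chunkStorage.list.mBuffers[i];
      dst->mNumberChannels = src->mNumberChannels;
      dst->mDataByteSize   = chunk * src->mNumberChannels * sizeof(float);
      dst->mData           = (float *)src->mData + framesDone * src->mNumberChannels;
    }

    AUAudioUnitStatus err = pullInputBlock(actionFlags, &ts, chunk, busNumber, &chunkStorage.list);
    if (err != noErr) {
      return err;
    }

    ts.mSampleTime += chunk;
    framesDone     += chunk;
  }
  return noErr;
}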

1 Answer


For play-through with AVAudioEngine, the input and output hardware formats and all the connection formats must be at the same sample rate. So the following should work:

AVAudioFormat *outputHWFormat = [self.audioEngine.outputNode outputFormatForBus:0];
AVAudioFormat *inputHWFormat  = [self.audioEngine.inputNode inputFormatForBus:0];

if (inputHWFormat.sampleRate == outputHWFormat.sampleRate) {
    // Wire the whole graph at the single, shared hardware rate.
    [self.audioEngine connect:self.audioEngine.inputNode to:self.audioEngine.mainMixerNode format:inputHWFormat];
    [self.audioEngine connect:self.audioEngine.mainMixerNode to:self.audioEngine.outputNode format:inputHWFormat];
}
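
If the rates don't agree, the mainMixerNode is supposed to convert for you, but since that's exactly the path failing here, one workaround is to tap the input and resample it yourself. A rough sketch, continuing from the snippet above; the tap approach, buffer sizing, and AVAudioConverter usage are assumptions, not a tested fix:

AVAudioConverter *converter = [[AVAudioConverter alloc] initFromFormat:inputHWFormat
                                                              toFormat:outputHWFormat];

[self.audioEngine.inputNode installTapOnBus:0 bufferSize:512 format:inputHWFormat
                                      block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
    // Size the destination buffer for the rate ratio, rounded up.
    AVAudioFrameCount capacity =
        (AVAudioFrameCount)(buffer.frameLength *
                            outputHWFormat.sampleRate / inputHWFormat.sampleRate) + 1;
    AVAudioPCMBuffer *converted =
        [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputHWFormat frameCapacity:capacity];

    __block BOOL consumed = NO;
    NSError *error = nil;
    [converter convertToBuffer:converted error:&error
            withInputFromBlock:^AVAudioBuffer *(AVAudioPacketCount count,
                                                AVAudioConverterInputStatus *outStatus) {
        // Hand the tap buffer over exactly once per tap callback.
        if (consumed) { *outStatus = AVAudioConverterInputStatus_NoDataNow; return nil; }
        consumed = YES;
        *outStatus = AVAudioConverterInputStatus_HaveData;
        return buffer;
    }];
    // `converted` now holds input audio at the output rate; schedule it on a
    // player node (or similar) to complete the play-through path.
}];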
  • Why the if statement? I would think you'd want to handle the case where sample rates don't agree. – Taylor Jun 21 '16 at 23:28