
I am currently developing a VoIP application, and one of the libraries I am using requires me to send the frames in the input callback. The requirement is that I must send a sample count, defined as the number of samples in a frame. This callback gets called whenever the microphone takes in new samples.

Valid numbers are (sample rate) * (audio length) / 1000, where the audio length can be 2.5, 5, 10, 20, 40 or 60 milliseconds.
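To make that formula concrete, here is a tiny sketch of the arithmetic (the helper name is mine, not part of any library; 24,000 Hz is the rate from the code below):

```c
#include <assert.h>

/* Valid sample counts per frame: (sample rate) * (audio length in ms) / 1000,
   where the audio length is one of 2.5, 5, 10, 20, 40 or 60 ms. */
static int valid_sample_count(int sample_rate, double audio_len_ms) {
    return (int)(sample_rate * audio_len_ms / 1000.0);
}
```

At 24,000 Hz the six valid counts come out to 60, 120, 240, 480, 960 and 1440, which is why 960 (a 40 ms frame) is the target below.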

Before using kAudioUnitProperty_MaximumFramesPerSlice to limit my inNumberFrames to 960, it was consistently coming in at around 1115 inNumberFrames. So I decided to limit it by setting that property. However, when I do, it brings the inNumberFrames count down to 512. Am I approaching this problem the wrong way?

static OSStatus inputRenderCallBack(void *inRefCon,
                                    AudioUnitRenderActionFlags  *ioActionFlags,
                                    const AudioTimeStamp    *inTimeStamp,
                                    UInt32 inBusNumber,
                                    UInt32 inNumberFrames, //512
                                    AudioBufferList *ioData)
{
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = kNumberOfChannels;
    bufferList.mBuffers[0].mData = NULL;
    bufferList.mBuffers[0].mDataByteSize = inNumberFrames * sizeof(SInt16) * kNumberOfChannels;

    OCTAudioEngine *engine = (__bridge OCTAudioEngine *)(inRefCon);
    OSStatus status = AudioUnitRender(engine.ioUnit,
                                       ioActionFlags,
                                       inTimeStamp,
                                       inBusNumber,
                                       inNumberFrames,
                                       &bufferList);
    NSError *error;
    [engine.toxav sendAudioFrame:bufferList.mBuffers[0].mData
                     sampleCount:kSampleCount //960
                        channels:kNumberOfChannels //2
                      sampleRate:[engine currentAudioSampleRate] //24000
                        toFriend:engine.friendNumber
                           error:&error];

    return status;
}
cvu
  • What is the reason you are trying to limit the number of frames going in? Most of these issues are better addressed using a circular buffer. Check out this [question](http://stackoverflow.com/questions/30691684/whats-the-reason-of-using-circular-buffer-in-ios-audio-calling-app/30698791#30698791). – dave234 Jun 16 '15 at 12:01
  • I am trying to see if I can avoid the circular buffer situation in the first place. I already am using one for receiving audio, since it's necessary when the audio is being pulled for playback. – cvu Jun 16 '15 at 16:25
  • Why avoid the circular buffer? They're cheap, and solve your problem. – dave234 Jun 16 '15 at 16:28
  • But wouldn't that mean I have to stuff in a circular buffer during the callback, then have another thread that constantly checks to see if there is enough frames in that circular buffer to consume? Sounds like a mess but guess I'll give it a try. – cvu Jun 16 '15 at 17:06
  • It is messier for sure, but that is the nature of multithreaded apps. I believe it is necessary if you require a specific frame count. When you get used to them they're fine. The hardware always wants base 2 frame counts. 1115 frames sounds odd. 1024 is the default. You can get it as low as 128, but it will almost always be a base 2 number at the remoteIO. Bottom line is that the inNumberFrames is not in your control, so you have to give yourself a cushion to work with. If you can do your work on the audio thread you can add to the head, and pull from the tail in render callback too. – dave234 Jun 16 '15 at 17:27
  • Yep that's what I did! Add to the head, and consumed the tail in the same input callback thread. To avoid buffer overflows, I called `sendAudioFrame` more than once depending on how many frames are in the buffer. Thanks a lot! – cvu Jun 16 '15 at 18:10
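The approach the comments converge on — write whatever the hardware delivers to the head of a ring buffer, then drain complete 960-sample frames from the tail, possibly several per callback — can be sketched in plain C. All names here (`ring_t`, `on_input`, `FRAME_SAMPLES`) are illustrative, not the toxav API or any particular ring-buffer library:

```c
#include <stddef.h>
#include <stdint.h>

#define RING_CAPACITY 8192   /* samples; power of two for cheap wrap-around */
#define FRAME_SAMPLES 960    /* frame size the encoder expects */

typedef struct {
    int16_t data[RING_CAPACITY];
    size_t head; /* next write index */
    size_t tail; /* next read index */
} ring_t;

/* Number of buffered samples; the power-of-two mask handles wrap-around. */
static size_t ring_available(const ring_t *r) {
    return (r->head - r->tail) & (RING_CAPACITY - 1);
}

static void ring_write(ring_t *r, const int16_t *src, size_t n) {
    for (size_t i = 0; i < n; i++) {
        r->data[r->head] = src[i];
        r->head = (r->head + 1) & (RING_CAPACITY - 1);
    }
}

static void ring_read(ring_t *r, int16_t *dst, size_t n) {
    for (size_t i = 0; i < n; i++) {
        dst[i] = r->data[r->tail];
        r->tail = (r->tail + 1) & (RING_CAPACITY - 1);
    }
}

/* In the input callback: push the inNumberFrames the hardware delivered,
   then send as many complete 960-sample frames as are buffered. */
static void on_input(ring_t *r, const int16_t *samples, size_t count) {
    ring_write(r, samples, count);
    int16_t frame[FRAME_SAMPLES];
    while (ring_available(r) >= FRAME_SAMPLES) {
        ring_read(r, frame, FRAME_SAMPLES);
        /* sendAudioFrame(frame, FRAME_SAMPLES, ...) would go here */
    }
}
```

With 512-sample callbacks this sends nothing on the first call, one frame on the second (leaving 64 samples buffered), and so on — exactly the "add to the head, consume the tail, send more than once if needed" pattern described above.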

1 Answer


kAudioUnitProperty_MaximumFramesPerSlice specifies the maximum number of frames per render slice, not the preferred number; see Apple's Technical Q&A QA1533 and QA1606. To set the preferred number of frames per callback, use the setPreferredIOBufferDuration:error: method of AVAudioSession instead. For example, if the sample rate is 32,000 Hz, you can use:

AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *error = nil;
NSTimeInterval bufferDuration = 0.008;
[session setPreferredIOBufferDuration:bufferDuration error:&error];

to request a frame length of 256, because 32,000 * 0.008 = 256.

Note, however, that setPreferredIOBufferDuration:error: sets a preferred value, not an exact one. You can suggest a buffer duration to the operating system, but due to hardware limits you may not get exactly what you ask for; you may get something close to it instead.
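Putting the two numbers together: frames = sample rate × duration, and the comments above suggest the hardware tends to deliver a nearby power of two. A sketch of that arithmetic (the rounding helper is my rough model, not a guaranteed Core Audio rule):

```c
/* Frames requested for a preferred IO buffer duration:
   frames = sample_rate * duration, rounded to the nearest integer. */
static int requested_frames(double sample_rate, double duration_s) {
    return (int)(sample_rate * duration_s + 0.5);
}

/* Nearest power of two -- a rough model of what the hardware tends
   to deliver, per the discussion above; not an exact rule. */
static int nearest_pow2(int n) {
    int p = 1;
    while (p * 2 <= n) p *= 2;              /* largest power of two <= n */
    return (n - p < 2 * p - n) ? p : 2 * p; /* pick the closer neighbour */
}
```

requested_frames(32000, 0.008) gives the 256 in the example above, and requested_frames(24000, 0.020) gives 480, whose nearest power of two is 512 — one plausible reading of the 512 the asker observed.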

ycsun