0

We have an audio unit that reads the input buffers, and I have memory growth in the callback function. It works, but I am spending a lot of time trying to solve this leak.

After eliminating everything else, I found that the basic configuration code, taken from Apple's examples, is causing the memory growth (about 100 KB per second).

This is my callback, stripped of everything else, with the problem:

AudioComponentInstance audioUnit;


static OSStatus recordingCallback(void *inRefCon, 
                                  AudioUnitRenderActionFlags *ioActionFlags, 
                                  const AudioTimeStamp *inTimeStamp, 
                                  UInt32 inBusNumber, 
                                  UInt32 inNumberFrames, 
                                  AudioBufferList *ioData) 
{
    AudioBuffer buffer;
    buffer.mNumberChannels = 1;
    buffer.mDataByteSize = inNumberFrames * 2;
    buffer.mData = malloc( inNumberFrames * 2 );

    // Put buffer in a AudioBufferList
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0] = buffer;


    OSStatus status;

// problematic block   ******

    status = AudioUnitRender(audioUnit,
                             ioActionFlags,
                             inTimeStamp,
                             inBusNumber,
                             inNumberFrames,
                             &bufferList);


//end of problem block    ******


    free(buffer.mData);
    return status; // the callback must return an OSStatus
}

Removing that block solves the problem.

Does the audioUnit need a property set? What could be my basic problem here? Thanks.

user1280535
  • 347
  • 3
  • 13

2 Answers

4

As noted in your other question (http://stackoverflow.com/questions/10278516/memory-is-growing-in-audio-buffer-code), you should not be calling malloc() inside your render callback. Allocate outside, save a reference to the buffer, and clear the buffer by setting each of the samples to zero. A couple of for() loops are much faster than a malloc().

Because of the confusing way memory management works here, you have been (erroneously) led to believe that the actual call to AudioUnitRender() is what's causing your problem. It's not. It's the malloc() call above it, coupled with the fact that this memory cannot be reclaimed in time before the next render callback fires. So the reference is kept and your memory leaks.

Generally speaking, your performance strategy for any audio programming is to do as much work outside of the render callback as possible. Consider that this function needs to complete in roughly 10 ms or less (for example, 512 frames at 44.1 kHz gives you about 11.6 ms per callback), otherwise you will hear dropouts in the audio stream.

If you are programming for the desktop, you want to greedily malloc() plenty of memory during initialization and save references to it, so you can read from it quickly and never need to allocate during rendering. Likewise, if you need to precompute any constants or lookup tables, do it before rendering starts.

If you're on mobile, the strategy is similar, but you probably don't want to malloc() as aggressively. Still, since a mobile phone (or iPad) has a limited amount of both CPU power and memory, you want to be very careful about how long your render callback takes. The shorter, the better.

Nik Reiman
  • 39,067
  • 29
  • 104
  • 160
  • First, thanks a lot. Second, this is very strange, because the Apple docs, and every other good site out there, set up the buffer in that exact way (first 4 lines) and allocate it INSIDE the callback function. Anyway, what is the best way of doing that? If I send the buffer list as an argument to another function from the callback, then create a pool there and handle the audio (DSP), would that be OK? – user1280535 Apr 25 '12 at 06:43
  • Just forget about the callback for a second -- pretend it doesn't exist at all. :) Now, create the buffers you need at the first opportunity you have in your code. Use a global variable if you must (seriously). Now, within render(), just zero out the buffer contents instead of re-allocating. – Nik Reiman Apr 25 '12 at 08:14
  • Thanks a lot for your time. So should I allocate it once at init and free it at dealloc, or not? And why zero out the buffer contents? It gets filled with new data anyway, doesn't it? – user1280535 Apr 27 '12 at 08:17
  • Oh, I see. You mean not allocating it in the callback at all? Just create it once, and on every callback, after the DSP, zero out the buffer? – user1280535 Apr 27 '12 at 08:19
  • @user1280535 Well, you still need to allocate the buffers. But you just do it during your initialization routines rather than in the callback. But yes, that is more or less what I meant to say. – Nik Reiman Apr 30 '12 at 09:08
2

If you allocate or malloc() your own audio buffers (which should be done outside the Audio Unit callback), you might want to try disabling automatic buffer allocation somewhere in your Audio Unit initialization or setup.

// Disable buffer allocation for the recorder
// (kInputBus is the input bus number, typically 1 for a RemoteIO unit)
UInt32 myDisable = 0;
OSStatus err = AudioUnitSetProperty(audioUnit,
                                    kAudioUnitProperty_ShouldAllocateBuffer,
                                    kAudioUnitScope_Output,
                                    kInputBus,
                                    &myDisable,
                                    sizeof(myDisable));
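
With allocation disabled, the render callback can point its AudioBufferList at a buffer you allocated once at startup. A sketch of that callback follows; it is framework-dependent (requires AudioToolbox) and shown as an untested fragment. g_recordBuffer is an assumed global holding the preallocated memory, and audioUnit is the instance from the question:

```c
// Assumption: g_recordBuffer was malloc'd once during initialization,
// sized for the maximum frames per slice, and is freed at teardown.
static OSStatus recordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 inNumberFrames,
                                  AudioBufferList *ioData)
{
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize = inNumberFrames * 2;
    bufferList.mBuffers[0].mData = g_recordBuffer; // reused; no malloc/free here

    return AudioUnitRender(audioUnit, ioActionFlags, inTimeStamp,
                           inBusNumber, inNumberFrames, &bufferList);
}
```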
hotpaw2
  • 70,107
  • 14
  • 90
  • 153