
I'm creating UIImage objects from CMSampleBufferRefs. I'm doing this on a separate queue (in the background), so I'm wrapping the processing in an @autoreleasepool. The problem is that memory builds up without any leak notification. Below is the method I'm using:

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    @autoreleasepool {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Get the base address of the pixel buffer
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                     colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image (retained; released by the caller)
        UIImage *image = [[UIImage imageWithCGImage:quartzImage] retain];

        // Release the Quartz image
        CGImageRelease(quartzImage);

        return image;
    }
}

And this is how I'm using it:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {

    CFRetain(sampleBuffer);
    dispatch_async(movieWritingQueue, ^{
        @autoreleasepool {

            if (self.returnCapturedImages && captureOutput != audioOutput) {

                UIImage *capturedImage = [self imageFromSampleBuffer:sampleBuffer];

                dispatch_async(callbackQueue, ^{
                    @autoreleasepool {

                        if (self.delegate && [self.delegate respondsToSelector:@selector(recorderCapturedImage:)]) {
                            [self.delegate recorderCapturedImage:capturedImage];
                        }

                        [capturedImage release];
                    }
                });
            }
            CFRelease(sampleBuffer);
        }
    });
}
  • Why are you converting to UIImage? If you are going to write the images to an AVAssetWriter, you'll want to stick with the sample buffers and CVPixelBufferRef objects only, as they'll be an order of magnitude faster than converting to UIImage. – jjxtra Feb 11 '15 at 15:04
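For illustration, here's a minimal sketch of what that comment suggests: appending the pixel buffer straight to an AVAssetWriterInputPixelBufferAdaptor and skipping the UIImage conversion entirely. The `writerInput` and `pixelBufferAdaptor` names are hypothetical placeholders, assumed to have been configured when recording started:

    // Assumes writerInput (AVAssetWriterInput) and pixelBufferAdaptor
    // (AVAssetWriterInputPixelBufferAdaptor) were set up elsewhere.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    if (writerInput.readyForMoreMediaData) {
        // Hands the raw pixel data to the writer - no CGContext, no UIImage
        [pixelBufferAdaptor appendPixelBuffer:pixelBuffer
                         withPresentationTime:presentationTime];
    }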

2 Answers


I found a temporary solution. I'm doing the same operations but on the main queue. This is not elegant or efficient at all, but at least the memory doesn't build up anymore.

I'm wondering if this is an iOS bug...?

UPDATE: This is how I'm processing the CMSampleBuffers on the main thread:

[[NSOperationQueue mainQueue] addOperationWithBlock:^{

    CGImageRef cgImage = [self cgImageFromSampleBuffer:sampleBuffer];
    UIImage *capturedImage = [UIImage imageWithCGImage:cgImage];

    // Do something with the image - I suggest in a background thread
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // do something with the image
    });

    CGImageRelease(cgImage);
    CFRelease(sampleBuffer);
}];

- (CGImageRef)cgImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);       // Lock the image buffer

    // Get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                    colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);

    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    /* CVBufferRelease(imageBuffer); */  // do not call this!

    return newImage;
}
  • Hello, Mihai. How do you do that on the main queue? Thanks! – Benoît Freslon Feb 11 '15 at 09:30
  • Oh, if you were calling the didOutputSampleBuffer method in rapid succession for multiple images and performing the imageFromSampleBuffer method in the background, chances are your operations were overlapping. The memory build up is expected in that case and it's not an iOS bug. – Lyndsey Scott Feb 11 '15 at 13:23
  • I understand. And what is the solution for this? – Mihai Feb 11 '15 at 13:28
  • So you've probably looped through an array of images (though you haven't included that in your updated code, I'm guessing this is the case), added them to an NSOperationQueue, then performed one or a few operations at a time. The problem was that since you were performing `didOutputSampleBuffer:` on a background thread, `cgImageFromSampleBuffer:` was returning before `didOutputSampleBuffer:` was complete, and you were actually performing more than just a few iterations of `didOutputSampleBuffer:` at a time, so performing `didOutputSampleBuffer:` in the foreground could in fact eliminate that issue. – Lyndsey Scott Feb 11 '15 at 13:40
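Building on that explanation, one way to keep the conversions from overlapping without moving everything to the main thread is to drop frames while a previous one is still being processed. A minimal sketch, assuming the same `movieWritingQueue` as above and a hypothetical `processingSemaphore` ivar:

    // processingSemaphore is a dispatch_semaphore_t ivar, created once
    // (e.g. in init) with dispatch_semaphore_create(1): it permits one
    // in-flight conversion at a time.

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // If the previous frame is still being converted, drop this one
        if (dispatch_semaphore_wait(processingSemaphore, DISPATCH_TIME_NOW) != 0) {
            return;
        }

        CFRetain(sampleBuffer);
        dispatch_async(movieWritingQueue, ^{
            @autoreleasepool {
                UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
                // ... hand the image to the delegate ...
                [image release];
                CFRelease(sampleBuffer);
                dispatch_semaphore_signal(processingSemaphore);
            }
        });
    }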

I actually had a similar problem a few days ago...

You're already releasing your CMSampleBufferRef, but try releasing your CVPixelBufferRef as well, e.g.:

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    @autoreleasepool {

        // ...

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image
        UIImage *image = [[UIImage imageWithCGImage:quartzImage] retain];

        // Release the Quartz image
        CGImageRelease(quartzImage);

        CVPixelBufferRelease(imageBuffer); // <-- release your pixel buffer

        return image;
    }
}
  • Thank you for your suggestion. I already tried that but it crashes on `CFRelease(sampleBuffer)` line. Getting the pixel buffer with `CMSampleBufferGetImageBuffer` doesn't retain the data. Here is the Apple documentation: "The caller does not own the returned dataBuffer, and must retain it explicitly if the caller needs to maintain a reference to it." – Mihai Jan 30 '15 at 07:54
  • @MihaiGhete That line in the `CMSampleBufferGetDataBuffer` docs has absolutely nothing to do with `CVPixelBufferRef` being retained. (Note that my answer refers to the *pixel* data buffer you've created in imageFromSampleBuffer, not the *sample* buffer.) Releasing the pixel buffer during that method shouldn't have any effect on `CFRelease(sampleBuffer)` since you're returning an image from `imageFromSampleBuffer:`. – Lyndsey Scott Jan 30 '15 at 10:01
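For reference, a minimal sketch of the ownership rule being debated here: `CMSampleBufferGetImageBuffer` follows Core Foundation's Get rule, so releasing its result is only balanced if you retained it explicitly first:

    // Returned at +0 (Get rule): the sample buffer still owns it
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Retain only if the buffer must outlive the sample buffer/callback
    CVPixelBufferRetain(imageBuffer);
    // ... use the buffer ...
    CVPixelBufferRelease(imageBuffer);  // balances the explicit retain above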