
I'm trying to convert a CMSampleBuffer to a UIImage with Swift 3.0. A popular solution is to write an extension on the CMSampleBuffer class that adds a getter to convert the buffer to an image. This is what it looks like:

import Foundation
import AVFoundation
import CoreImage
import UIKit

extension CMSampleBuffer {
    @available(iOS 9.0, *)
    var uiImage: UIImage? {
        // Grab the pixel buffer backing this sample buffer
        guard let imageBuffer = CMSampleBufferGetImageBuffer(self) else { return nil }

        // Wrap the pixel buffer in a CIImage, then in a UIImage
        let ciImage = CIImage(cvImageBuffer: imageBuffer)
        return UIImage(ciImage: ciImage)
    }
}

It works fine, but it uses a lot of memory: around 40% of the total app memory. Is there a more memory-efficient solution?
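For context, this is roughly how the getter gets called: every frame delivered to my AVCaptureVideoDataOutput ends up converted to a UIImage. MyViewController and processImage below are just placeholders for my own class and processing:

import AVFoundation
import UIKit

extension MyViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    // Called once per captured frame (MyViewController / processImage are placeholder names)
    func captureOutput(_ captureOutput: AVCaptureOutput!,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                       from connection: AVCaptureConnection!) {
        guard let image = sampleBuffer.uiImage else { return }
        processImage(image)
    }
}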

EDIT:

I have changed my code and it looks like this:

var uiImage: UIImage? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(self) else { return nil }

    // Lock the pixel buffer while we read its raw memory
    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)

    var image: UIImage?
    autoreleasepool {
        // Draw the raw pixel data into a bitmap context and copy it out as a CGImage
        guard let context = CGContext(data: baseAddress,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: bytesPerRow,
                                      space: colorSpace,
                                      bitmapInfo: bitmapInfo.rawValue) else { return }
        guard let cgImage = context.makeImage() else { return }
        image = UIImage(cgImage: cgImage)
    }
    CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

    return image
}

The memory leak has something to do with the CGContext. Is there any other way I can free/release/deallocate it besides using an autoreleasepool?
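One alternative I'm thinking about (I'm not sure whether it actually reduces the memory footprint) is to skip the CGContext entirely and render through a single, reused CIContext. This is just a sketch meant to sit alongside the extension above; sharedCIContext and uiImageViaCIContext are placeholder names:

import AVFoundation
import CoreImage
import UIKit

// Sketch only: one CIContext created once and reused (placeholder name)
private let sharedCIContext = CIContext()

extension CMSampleBuffer {
    var uiImageViaCIContext: UIImage? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(self) else { return nil }
        let ciImage = CIImage(cvImageBuffer: imageBuffer)
        // createCGImage renders and copies the pixels, so the returned UIImage
        // no longer keeps the capture pixel buffer alive
        guard let cgImage = sharedCIContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }
}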

NFarrell
  • What are the pixel dimensions of the sample buffer? How big is “the total app memory”? A “4k” image is 3840 by 2160, and (at 32 bits per pixel) requires a minimum of 33,177,600 bytes. – rob mayoff Jul 07 '17 at 19:50
  • Thanks for the response, Rob. I'm not exactly sure of the pixel dimensions. When I retrieve the width and height of the CVPixelBuffer they are 640x480, so I believe that's it. The total app memory gets to just above 1 GB before it crashes. With every frame captured by the captureOutput function of my AVCaptureVideoDataOutput, the buffer is converted into a UIImage after it is processed, so I can see how that could be taxing on the phone. Does this have anything to do with it? (A rough byte-count estimate is sketched below the comments.) – NFarrell Jul 07 '17 at 20:12
  • What are you doing with the images? When do you think your app is done with them? Have you used Instruments to try to figure out what's retaining them? – rob mayoff Jul 07 '17 at 20:48
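To make rob mayoff's numbers concrete for my 640x480 case, here is a rough back-of-the-envelope calculation (it assumes 32 bits per pixel and that every converted frame stays retained):

// Back-of-the-envelope estimate, assuming 32-bit BGRA frames that all stay retained
let bytesPerFrame = 640 * 480 * 4                        // 1,228,800 bytes, about 1.2 MB per frame
let framesToReachOneGB = 1_000_000_000 / bytesPerFrame   // roughly 813 frames
let secondsAt30FPS = framesToReachOneGB / 30             // about 27 seconds of capture

So if the converted UIImages (or the pixel buffers behind them) are being retained, reaching 1 GB within about half a minute of capture would be expected, which matches the crash I'm seeing.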

0 Answers