
I am working with AVCaptureStillImageOutput for the first time, and at some point I save a JPEG image. Instead of a JPEG image, I would like to save a PNG image. What do I need to do for that?

I have these three lines of code in the app:

let stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey:AVVideoCodecJPEG]
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)

Is there a simple way to modify those lines to get what I want? After browsing the net, it seems the answer is NO (unless I have not been lucky enough); nevertheless, I still believe there must be a good solution.

Michel

2 Answers


There is sample code in the AVFoundation Programming Guide that shows how to convert a CMSampleBuffer to a UIImage (under Converting CMSampleBuffer to a UIImage Object). From there, you can use UIImagePNGRepresentation(image) to encode it as PNG data.

Here is a Swift translation of that code:

extension UIImage
{
    // Translated from <https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/06_MediaRepresentations.html#//apple_ref/doc/uid/TP40010188-CH2-SW4>
    convenience init?(fromSampleBuffer sampleBuffer: CMSampleBuffer)
    {
        guard let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

        if CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly) != kCVReturnSuccess { return nil }
        defer { CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly) }

        let context = CGBitmapContextCreate(
            CVPixelBufferGetBaseAddress(imageBuffer),
            CVPixelBufferGetWidth(imageBuffer),
            CVPixelBufferGetHeight(imageBuffer),
            8,
            CVPixelBufferGetBytesPerRow(imageBuffer),
            CGColorSpaceCreateDeviceRGB(),
            CGBitmapInfo.ByteOrder32Little.rawValue | CGImageAlphaInfo.PremultipliedFirst.rawValue)

        guard let quartzImage = CGBitmapContextCreateImage(context) else { return nil }
        self.init(CGImage: quartzImage)
    }
}
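
With that extension in place, the PNG step is just UIImagePNGRepresentation. Here is a minimal sketch in the same Swift-2-era style, assuming imageDataSampleBuffer is the CMSampleBuffer from the question's capture call and fileURL is an NSURL destination of your choosing:

if let image = UIImage(fromSampleBuffer: imageDataSampleBuffer) {
    // UIImagePNGRepresentation returns NSData? containing the PNG bytes.
    let pngData = UIImagePNGRepresentation(image)
    // `fileURL` is a hypothetical destination NSURL.
    pngData?.writeToURL(fileURL, atomically: true)
}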
jtbandes
  • Thanks, the more I search the more it seems there is no simpler way. I'll try to work from that and see. – Michel Jan 05 '16 at 06:55
  • @Michel I just updated my answer with a Swift translation of the Apple sample code. Try it out and let me know if it works! – jtbandes Jan 05 '16 at 06:56

Here is the Swift 4 version of the above code.

import UIKit
import AVFoundation

extension UIImage
{
    convenience init?(fromSampleBuffer sampleBuffer: CMSampleBuffer)
    {
        guard let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

        if CVPixelBufferLockBaseAddress(imageBuffer, .readOnly) != kCVReturnSuccess { return nil }
        defer { CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly) }

        // Wrap the pixel buffer's base address in a bitmap context.
        guard let context = CGContext(
            data: CVPixelBufferGetBaseAddress(imageBuffer),
            width: CVPixelBufferGetWidth(imageBuffer),
            height: CVPixelBufferGetHeight(imageBuffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(imageBuffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
        else { return nil }

        guard let quartzImage = context.makeImage() else { return nil }
        self.init(cgImage: quartzImage)
    }
}
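
Usage is the same idea in Swift 4; only the Foundation calls differ. A minimal sketch, assuming sampleBuffer is the CMSampleBuffer handed to your capture completion handler and fileURL is a file URL you choose:

if let image = UIImage(fromSampleBuffer: sampleBuffer),
   let pngData = UIImagePNGRepresentation(image) {   // Data? containing the PNG bytes
    // `fileURL` is a hypothetical destination URL.
    try? pngData.write(to: fileURL)
}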
ruralcoder