
I am trying to convert the frames I'm getting from the AVCaptureVideoDataOutput delegate (as CMSampleBuffer) to UIImage. However, I'm getting a fatal error: `unexpectedly found nil while unwrapping an Optional value`. Can someone tell me what is wrong with my code? I am assuming that there is something wrong with my sampleBufferToUIImage function.

Function to convert CMSampleBuffer to UIImage:

func sampleBufferToUIImage(sampleBuffer: CMSampleBuffer) -> UIImage {
    let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    CVPixelBufferLockBaseAddress(imageBuffer!, CVPixelBufferLockFlags(rawValue: 0))

    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer!)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer!)
    let width = CVPixelBufferGetWidth(imageBuffer!)
    let height = CVPixelBufferGetHeight(imageBuffer!)

    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)
    let context = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)

    // *********Getting the error from this line***********
    let quartzImage = context!.makeImage()

    CVPixelBufferUnlockBaseAddress(imageBuffer!, CVPixelBufferLockFlags(rawValue: 0))

    let image = UIImage(cgImage: quartzImage!)
    return image
}
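(For reference, here is a defensive variant of the same conversion that returns nil instead of crashing when any step fails. This is a sketch I put together, not the original code; the function name is mine.)

```swift
// Sketch: the same conversion with the force-unwraps replaced by guards,
// so a failure yields nil instead of a crash. (Hypothetical helper.)
func sampleBufferToUIImageSafe(sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
    defer { CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0)) }

    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue
                                          | CGBitmapInfo.byteOrder32Little.rawValue)

    // CGContext(data:...) returns nil when the buffer layout does not match
    // the requested format — e.g. a biplanar 420 YCbCr buffer is not BGRA.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(imageBuffer),
                                  width: CVPixelBufferGetWidth(imageBuffer),
                                  height: CVPixelBufferGetHeight(imageBuffer),
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(imageBuffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: bitmapInfo.rawValue),
          let quartzImage = context.makeImage() else { return nil }

    return UIImage(cgImage: quartzImage)
}
```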

delegate where I'm reading frame:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    if count <= 0 {
        // Calling my function to convert to UIImage.
        let image = sampleBufferToUIImage(sampleBuffer: sampleBuffer)
        let imageData = UIImagePNGRepresentation(image)
        uploadImage(jpgData: imageData)
    }

    count = count + 1
}

Setting up AVSession:

func setupCameraSession() {

    captureSession.sessionPreset = AVCaptureSessionPresetHigh

    // Declare AVCaptureDevice, defaulting to the back camera. Does the "as" remove the optional?
    let captureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo) as AVCaptureDevice

    do {
        let deviceInput = try AVCaptureDeviceInput(device: captureDevice)

        if (captureSession.canAddInput(deviceInput) == true) {
            captureSession.addInput(deviceInput)
        }

        let dataOutput = AVCaptureVideoDataOutput()
        dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString) : NSNumber(value: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange as UInt32)]
        dataOutput.alwaysDiscardsLateVideoFrames = true

        if (captureSession.canAddOutput(dataOutput) == true) {
            captureSession.addOutput(dataOutput)
        }


    } catch {

    }
}
Lightsout

2 Answers


Try this; it works for me on Swift 3:

// Sample buffer handling delegate function
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    let myPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    myCIimage         = CIImage(cvPixelBuffer: myPixelBuffer!)
    videoImage        = UIImage(ciImage: myCIimage)
    uIimage.image = videoImage
}
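One caveat (my observation, not part of the original answer): a UIImage created with `UIImage(ciImage:)` has no backing CGImage, so encoders such as UIImagePNGRepresentation can return nil for it. If you need encoded data rather than on-screen display, a sketch is to render through a CIContext first:

```swift
import CoreImage
import UIKit

// Sketch: render the CIImage into a CGImage before wrapping it in a UIImage,
// so PNG/JPEG encoders have real bitmap data to work with.
let ciContext = CIContext()  // reuse this; creating one per frame is expensive

func renderedUIImage(from ciImage: CIImage) -> UIImage? {
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```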

// AV Session

func startVideoDisplay() {        
    do {
        let tryDeviceInput = try AVCaptureDeviceInput(device: cameraDevice)
        cameraCaptureSession.addInput(tryDeviceInput)
    } catch { print(error.localizedDescription) }

    caViewLayer = AVCaptureVideoPreviewLayer(session: cameraCaptureSession)
    view.layer.addSublayer(caViewLayer)

    cameraCaptureSession.startRunning()

    let myQueue = DispatchQueue(label: "se.paredes.FunAV", qos: .userInteractive, attributes: .concurrent)

    let theOutput = AVCaptureVideoDataOutput()
    theOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value:kCVPixelFormatType_32BGRA)]
    theOutput.alwaysDiscardsLateVideoFrames = true
    theOutput.setSampleBufferDelegate(self, queue: myQueue)

    if cameraCaptureSession.canAddOutput(theOutput) {
        cameraCaptureSession.addOutput(theOutput)
    }
    cameraCaptureSession.commitConfiguration()
}
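As an aside (my note, not part of the answer): AVCaptureSession configuration is conventionally bracketed by `beginConfiguration()`/`commitConfiguration()`, with `startRunning()` called only after all inputs and outputs have been added, rather than before as in the code above. A sketch of that ordering:

```swift
// Sketch of the conventional AVCaptureSession configuration order:
// configure everything inside begin/commitConfiguration, then start running.
cameraCaptureSession.beginConfiguration()
// ... addInput / addOutput / videoSettings calls go here ...
cameraCaptureSession.commitConfiguration()
cameraCaptureSession.startRunning()  // after configuration is committed
```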
Jose Paredes
  • Is this code just for capturing an image? Can it be used as a video recorder too? If yes, then how? – Mr.Ghamkhar Apr 17 '17 at 06:23
  • Yes, it can be used for video recording. How? Just by adding/changing the code in the captureOutput function to do the desired job; to start with such a task you can play a bit by assigning the "videoImage" variable to a UIImage. – Jose Paredes Apr 19 '17 at 04:28
  • I hope this short example can help: `@IBOutlet weak var uiImage: UIImageView!` and then `func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) { let myPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); myCIimage = CIImage(cvPixelBuffer: myPixelBuffer!); videoImage = UIImage(ciImage: myCIimage); uiImage.image = videoImage }` – Jose Paredes Apr 19 '17 at 04:39

AVCaptureVideoDataOutput's video setting was incorrect: your CGContext is configured for 32-bit BGRA (device RGB color space, byteOrder32Little, noneSkipFirst), but the output is delivering biplanar YCbCr frames, so makeImage() returns nil. Change kCVPixelFormatType_420YpCbCr8BiPlanarFullRange to kCVPixelFormatType_32BGRA.
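Concretely, the videoSettings line in the question's setupCameraSession would become:

```swift
// Request BGRA frames so the RGB CGContext in sampleBufferToUIImage
// can read the pixel buffer's base address directly.
dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString):
                            NSNumber(value: kCVPixelFormatType_32BGRA)]
```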

Lightsout