
I'm attempting to convert a CMSampleBufferRef (as part of the AVCaptureVideoDataOutputSampleBufferDelegate in iOS) to an OpenCV Mat in an attempt to stabilise the output in semi-realtime.

I'm running a test application at the moment following this, but I keep running into issues when I create and use the Mat.

Swift Controller

let wrapper : OpenCVWrapper = OpenCVWrapper()
...
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    self.wrapper.processBuffer(sampleBuffer, self.previewMat)
}

OpenCVWrapper

- (void)processBuffer:(CMSampleBufferRef)buffer :(UIImageView*)previewMat {
    // Convert current buffer to Mat
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
    CVPixelBufferLockBaseAddress( pixelBuffer, 0);

    CGFloat bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
    CGFloat bufferHeight = CVPixelBufferGetHeight(pixelBuffer);

    unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

    Mat tmp(bufferWidth, bufferHeight, CV_8UC4, pixel);
    Mat cur = tmp.clone();

    dispatch_async(dispatch_get_main_queue(), ^{
        [previewMat setImage:[UIImage imageWithCVMat:cur]];
    });
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

On the Mat cur = tmp.clone() line I'm getting an EXC_BAD_ACCESS.

Any thoughts on what I'm doing wrong here?

I've tried declaring bufferWidth and bufferHeight as both CGFloat and int, and switching them around in the Mat constructor; same issue.

Richard Poole
  • Have you tried: Mat tmp = Mat(bufferHeight,bufferWidth,CV_8UC4,pixel); Mat cur = tmp.clone(); – freshking Jan 18 '16 at 11:45
  • Yes, that didn't work either. I ended up converting the buffer to a UIImage then the UIImage to Mat. That worked, but haven't answered as it doesn't really answer the question. – Richard Poole Jan 18 '16 at 13:23
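
For reference, the UIImage route mentioned in the comment above might look roughly like the sketch below: render the pixel buffer to a UIImage through Core Image, then let OpenCV's iOS helper UIImageToMat (from opencv2/imgcodecs/ios.h) copy it into a Mat. This is only a sketch assuming the OpenCV iOS framework is linked; it copies the pixels an extra time, and in a real capture callback you would want to create the CIContext once and reuse it.

#import <opencv2/imgcodecs/ios.h>
#import <CoreImage/CoreImage.h>

- (cv::Mat)matFromBufferViaUIImage:(CMSampleBufferRef)buffer {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);

    // Render the pixel buffer to a CGImage via Core Image, then wrap it in a UIImage
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *uiImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    // Let OpenCV copy the UIImage pixels into a CV_8UC4 Mat
    cv::Mat mat;
    UIImageToMat(uiImage, mat);
    return mat;
}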

3 Answers


An improved solution that fixes the "only the top 30% of the image" problem:

- (cv::Mat)matFromBuffer:(CMSampleBufferRef)buffer {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
    CVPixelBufferLockBaseAddress( pixelBuffer, 0 );

    //Processing here
    int bufferWidth = (int)CVPixelBufferGetWidth(pixelBuffer);
    int bufferHeight = (int)CVPixelBufferGetHeight(pixelBuffer);
    unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

    // wrap the buffer in an OpenCV Mat: rows = height, cols = width, and pass the
    // stride so any row padding is respected; no pixel data is copied here
    cv::Mat mat = cv::Mat(bufferHeight, bufferWidth, CV_8UC4, pixel, CVPixelBufferGetBytesPerRow(pixelBuffer));

    // Convert to grayscale while the buffer is still locked, since `mat`
    // points directly at the pixel buffer's memory. The frames are BGRA,
    // so use the BGRA conversion code.
    cv::Mat matGray;
    cv::cvtColor(mat, matGray, CV_BGRA2GRAY);

    // End processing
    CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );

    return matGray;
}
Yun CHEN
  • If the CVPixelBuffer is coming from the iOS camera, this code will work only if you've set up the camera.videoSettings to include `[String(kCVPixelBufferPixelFormatTypeKey): kCMPixelFormat_32BGRA]`. If you're using the `kCMPixelFormat_422YpCbCr8` pixel format, then see this SO post: https://stackoverflow.com/questions/19358686/how-do-i-convert-from-a-cvpixelbufferref-to-an-opencv-cvmat – Drew H Jan 16 '23 at 18:12
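
In Objective-C, to match the wrapper above, that configuration might look like the following sketch; videoOutput is assumed to be the AVCaptureVideoDataOutput already added to the capture session.

#import <AVFoundation/AVFoundation.h>

- (void)configureVideoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    // Ask AVFoundation for 32-bit BGRA frames so wrapping them in a CV_8UC4 Mat is valid
    videoOutput.videoSettings = @{
        (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
    };
}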

Maybe this will work:

- (void)processBuffer:(CMSampleBufferRef)buffer :(UIImageView*)previewMat
{
    CVImageBufferRef imgBuf = CMSampleBufferGetImageBuffer(buffer);

    // lock the buffer
    CVPixelBufferLockBaseAddress(imgBuf, 0);

    // get the address to the image data
    void *imgBufAddr = CVPixelBufferGetBaseAddressOfPlane(imgBuf, 0);

    // get image properties
    int w = (int)CVPixelBufferGetWidth(imgBuf);
    int h = (int)CVPixelBufferGetHeight(imgBuf);

    // create the cv mat and copy the pixel data into it
    cv::Mat image;
    image.create(h, w, CV_8UC4);
    // w * h alone copies only a quarter of the BGRA data, so multiply by
    // the 4 bytes per pixel to copy every channel
    memcpy(image.data, imgBufAddr, w * h * 4);

    // unlock again
    CVPixelBufferUnlockBaseAddress(imgBuf, 0);

    dispatch_async(dispatch_get_main_queue(), ^{
        [previewMat setImage:[UIImage imageWithCVMat:image]];
    });
}
Totoro
freshking
    I'm getting only the top 3rd of the image using this code. Sample capture: http://i.imgur.com/cRQ0OmL.png – Stan James May 27 '16 at 16:29
  • Edited the code to copy the entire image. The memcpy() call was not accounting for the 4 channels of the image. – Totoro Apr 10 '20 at 10:15
  • it's better to use `CVPixelBufferGetDataSize(imgBuf)` instead of manually calculating the buffer size(width*height*4) when copying – Mohamed Salah Oct 22 '22 at 00:55
  • Good answer. However, Yun CHEN's answer accomplishes the same thing without having to do a `memcpy` and therefore should be faster. Also, later iPhone models tend to add padding bytes to the end of each row in the CVPixelBuffer, so if `memcpy` is your thing make sure to do `memcpy(image.data, imgBuffAddr, CVPixelBufferGetBytesPerRow(pixelBuffer) * h);` – Drew H Jan 16 '23 at 18:16
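
Combining the last two comments, a copy that allows for per-row padding might look like this sketch (again assuming BGRA frames, so 4 bytes per pixel):

- (cv::Mat)matByCopyingBuffer:(CMSampleBufferRef)buffer {
    CVImageBufferRef imgBuf = CMSampleBufferGetImageBuffer(buffer);
    CVPixelBufferLockBaseAddress(imgBuf, kCVPixelBufferLock_ReadOnly);

    int w = (int)CVPixelBufferGetWidth(imgBuf);
    int h = (int)CVPixelBufferGetHeight(imgBuf);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imgBuf);
    unsigned char *src = (unsigned char *)CVPixelBufferGetBaseAddress(imgBuf);

    // Copy row by row: only the w * 4 meaningful bytes of each row are taken,
    // so any padding bytes at the end of a row are skipped
    cv::Mat image(h, w, CV_8UC4);
    for (int row = 0; row < h; row++) {
        memcpy(image.ptr(row), src + row * bytesPerRow, w * 4);
    }

    CVPixelBufferUnlockBaseAddress(imgBuf, kCVPixelBufferLock_ReadOnly);
    return image;
}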

The image types are not equal; you need to do some sort of conversion, such as

cvtColor(image, image_copy, CV_BGRA2BGR);

If that doesn't fix it, try other conversion codes besides CV_BGRA2BGR.

Hope it helps.
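
As an illustration only (the helper name is made up), that conversion applied to the 4-channel Mat produced by the earlier answers could look like:

// Illustrative helper: drop the alpha channel of a BGRA camera Mat before
// handing it to code that expects a 3-channel BGR image
static cv::Mat bgrFromCameraMat(const cv::Mat &bgra) {
    cv::Mat bgr;
    cv::cvtColor(bgra, bgr, CV_BGRA2BGR);
    return bgr;
}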

Kiko Seijo