
I'm capturing the Mac screen as JPEG and then trying to get the pixelBuffer and imageBuffer out of the captured JPEG sample buffer. The pixelBuffer is always nil, yet when I convert the JPEG data to an NSImage, the image is created and displayed successfully.

-(void)createSession
{
    if (self.session == nil)
    {
        self.session = [[AVCaptureSession alloc] init];
        self.session.sessionPreset = AVCaptureSessionPresetPhoto;

        // Screen input for the display returned by getDisplayID
        CGDirectDisplayID displayId = [self getDisplayID];
        self.input = [[AVCaptureScreenInput alloc] initWithDisplayID:displayId];
        [self.session addInput:self.input];

        // Still image output configured to deliver JPEG-compressed sample buffers
        self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
        NSDictionary *outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
        [self.imageOutput setOutputSettings:outputSettings];
        [self.session addOutput:self.imageOutput];

        [self.session startRunning];
    }
}



-(void)processSampleBuffer
{
    // Find the video connection on the still image output
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.imageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [self.imageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != nil)
        {
            // This works: the JPEG data can be turned into an NSImage
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            self.image = [[NSImage alloc] initWithData:imageData];

            // This does not: the image buffer of the sample buffer is nil
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
            CVPixelBufferLockBaseAddress(imageBuffer, 0);

            size_t width = CVPixelBufferGetWidth(imageBuffer);
            size_t height = CVPixelBufferGetHeight(imageBuffer);
            NSLog(@"width %zu height %zu", width, height);

            // Same call under a different name -- also nil
            CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
            size_t width1 = CVPixelBufferGetWidth(pixelBuffer);
            size_t height1 = CVPixelBufferGetHeight(pixelBuffer);
            NSLog(@"Pixelbuffer width %zu height %zu", width1, height1);

            CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        }
        else
        {
            NSLog(@"error");
        }
    }];
}

In processSampleBuffer, self.image comes back as a valid NSImage and displays in an NSImageView without problems, but imageBuffer and pixelBuffer are both nil.
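
One thing that might help narrow it down (I'm not sure this is the right way to inspect the buffer, so treat it as a guess) is to log what the sample buffer actually contains inside the completion handler:

// Dump the block buffer, the image buffer and the format description's
// media subtype to see where the sample buffer keeps its data.
CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(imageDataSampleBuffer);
CMFormatDescriptionRef formatDesc = CMSampleBufferGetFormatDescription(imageDataSampleBuffer);
FourCharCode subType = CMFormatDescriptionGetMediaSubType(formatDesc);
NSLog(@"blockBuffer %p imageBuffer %p subtype %c%c%c%c",
      blockBuffer,
      CMSampleBufferGetImageBuffer(imageDataSampleBuffer),
      (char)(subType >> 24), (char)(subType >> 16), (char)(subType >> 8), (char)subType);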

This has me confused; could someone take a look?


1 Answer


OK, I finally found the answer here; hope it can help others.

https://developer.apple.com/documentation/avfoundation/avcapturephoto/2873914-pixelbuffer?language=objc

Discussion

If you requested photo capture in a RAW format, or in a processed format without compression such as TIFF, you can use this property to access the underlying sample buffer.

If you requested capture in a compressed format such as JPEG or HEVC/HEIF, this property's value is nil. Use the fileDataRepresentation or CGImageRepresentation method to obtain compressed image data.
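
So for a JPEG (or other compressed) capture, a nil image buffer is expected: the compressed bytes live in the sample buffer's data (block) buffer rather than in a CVImageBuffer. If you actually need a CVPixelBufferRef rather than JPEG data, one option that should work with AVCaptureStillImageOutput (a sketch only, not tested against my exact setup) is to request an uncompressed pixel format in the output settings instead of AVVideoCodecJPEG:

// Sketch: ask the still image output for uncompressed 32BGRA frames instead of JPEG,
// so the delivered sample buffer carries a CVImageBuffer.
// (Check availableImageDataCVPixelFormatTypes before relying on 32BGRA.)
NSDictionary *outputSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};
[self.imageOutput setOutputSettings:outputSettings];

// In the capture completion handler the image buffer should then be non-NULL:
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
if (imageBuffer != NULL) {
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    NSLog(@"uncompressed buffer %zu x %zu", width, height);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
}

If you want to keep JPEG output, you can instead decode the NSData from jpegStillImageNSDataRepresentation (for example with NSBitmapImageRep or CGImageSourceCreateWithData) and work with the resulting bitmap or CGImage.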
