In iOS, I am capturing a still image from AVCaptureStillImageOutput like this:
[_captureStillOutput captureStillImageAsynchronouslyFromConnection: _captureConnection completionHandler: asyncCaptureCompletionHandler];
Boiled down for simplicity, my asyncCaptureCompletionHandler block looks like this:
void(^asyncCaptureCompletionHandler)(CMSampleBufferRef imageDataSampleBuffer, NSError *error) =
    ^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (CMSampleBufferIsValid(imageDataSampleBuffer)) {
            // This is the call that throws the exception quoted below.
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
        }
    };
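For what it's worth, one diagnostic I can think of (a sketch of my own, not something already in the project) is to inspect the buffer's media subtype inside asyncCaptureCompletionHandler before asking for a JPEG representation:

// Inside the completion handler: check what the sample buffer actually holds
// before calling jpegStillImageNSDataRepresentation:.
CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(imageDataSampleBuffer);
FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(formatDescription);
if (mediaSubType == kCMVideoCodecType_JPEG) {
    NSLog(@"Buffer is JPEG encoded; jpegStillImageNSDataRepresentation: should be safe.");
} else {
    // Log the four-character code so we can see what was really delivered
    // (e.g. 'BGRA' or '420f' for uncompressed frames).
    NSLog(@"Buffer is NOT JPEG; media subtype = %c%c%c%c",
          (char)(mediaSubType >> 24), (char)(mediaSubType >> 16),
          (char)(mediaSubType >> 8),  (char)(mediaSubType));
}

If that logs anything other than JPEG, it would at least confirm the output settings are not taking effect.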
I have been through all of my code and cross-referenced it with Stack Overflow, and have not found any suggestion as to why a valid sample buffer would be captured without it being a proper JPEG. Here is how the output is configured:
_captureStillOutput = [[AVCaptureStillImageOutput alloc] init];
_captureStillOutput.outputSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecJPEG, AVVideoCodecKey,
        nil];

if ([session canAddOutput:_captureStillOutput]) {
    [session addOutput:_captureStillOutput];
}
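As a sanity check on the configuration side (again just a sketch I'm considering, not code from the project), I could verify after addOutput: that JPEG is actually among the codecs the output reports before (re)applying the settings:

// Hypothetical sanity check, run after the output has been added to the session:
// confirm JPEG is advertised before forcing it in outputSettings.
if ([_captureStillOutput.availableImageDataCodecTypes containsObject:AVVideoCodecJPEG]) {
    _captureStillOutput.outputSettings =
        [NSDictionary dictionaryWithObjectsAndKeys:
            AVVideoCodecJPEG, AVVideoCodecKey,
            nil];
} else {
    NSLog(@"JPEG not reported as available; codecs = %@",
          _captureStillOutput.availableImageDataCodecTypes);
}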
There is supplemental info in the debugger:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:] - Not a jpeg sample buffer.'
Searches on both Google and Stack Overflow for "Not a jpeg sample buffer" produced zero results. I'm stuck. Bah.