I am capturing video in preview mode and would like to display a still image captured by the camera.
I currently save the image and the still image output to ivars declared in the interface:
UIImage *snapshot;
AVCaptureStillImageOutput *stillImageOutput;
The video preview displays fine. However, when I try to capture and display a still image, nothing appears; in fact, the debugger shows that stillImageOutput and the image are nil. I suspect this is a timing issue with the asynchronous capture and that I need to use a completion handler, but I am weak on completion handlers.
What is the proper way to display a still image immediately after capturing it, without tying up the UI?
Code to capture still:
- (void)takeSnapshot {
    // Find the video connection on the still image output
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    // Capture asynchronously; the handler runs later, after takeSnapshot has already returned
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                   completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            snapshot = [UIImage imageWithData:imageData];
        }
    }];
}
Code to display the still. Note the absence of a completion handler, which may be the issue; however, I'm not sure how to write one...
[self takeSnapshot];
self.imageView.image = snapshot;
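Is something like the following the right way to do it with a block? This is just a sketch of what I'm guessing at: takeSnapshotWithCompletion: is a method name I made up, and I'm assuming I need to dispatch back to the main queue before touching the UI.

- (void)takeSnapshotWithCompletion:(void (^)(UIImage *image))completion {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                   completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        UIImage *image = nil;
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            image = [UIImage imageWithData:imageData];
        }
        // Guessing the capture handler isn't called on the main queue, so hop over before updating UIKit
        dispatch_async(dispatch_get_main_queue(), ^{
            if (completion) {
                completion(image);
            }
        });
    }];
}

and then call it like this?

[self takeSnapshotWithCompletion:^(UIImage *image) {
    self.imageView.image = image;   // set only once the still is actually available
}];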