OK, so the following code works, but I don't get why. I am capturing still images from the front camera using AVFoundation, and I run this code before initiating the capture:
if ([connection isVideoOrientationSupported]) {
    AVCaptureVideoOrientation orientation;
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            orientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        case UIDeviceOrientationLandscapeLeft:
            // The device and video orientation enums are mirrored for landscape
            orientation = AVCaptureVideoOrientationLandscapeRight;
            break;
        case UIDeviceOrientationLandscapeRight:
            orientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        default:
            orientation = AVCaptureVideoOrientationPortrait;
            break;
    }
    [connection setVideoOrientation:orientation];
}
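For context, the connection I am configuring comes from the still image output, and the capture itself is kicked off roughly like this (stillImageOutput is my own property; the session setup is omitted):

AVCaptureConnection *connection =
    [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];

// ... the orientation code above runs here ...

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                    completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    // the code below lives in this block
}];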
and then I have this inside the captureStillImageAsynchronouslyFromConnection:completionHandler: block to store the image:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *i = [UIImage imageWithData:imageData];

UIGraphicsBeginImageContext(i.size);
[i drawAtPoint:CGPointMake(0.0, 0.0)];
image.image = UIGraphicsGetImageFromCurrentImageContext();   // image is my UIImageView
UIGraphicsEndImageContext();
As you can see, I don't rotate the image or anything; I just draw it into the context and save it. But as soon as I try to use i directly, it is always rotated by 90 degrees. If I try to rotate it using
UIImage *rotated = [[UIImage alloc] initWithCGImage:i.CGImage scale:1.0f orientation:i.imageOrientation];
it doesn't work (there is no change compared to just using i).
I understand that UIImage might just be applying the orientation automatically when it draws into the context, but why does the image come out rotated in the first place, and why does re-initializing it with the same orientation change nothing?
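To make the question concrete, this is roughly how I would inspect the orientation metadata versus the raw pixel buffer inside the completion handler (just a sketch; i is the UIImage from above):

// The UIImage's size takes imageOrientation into account,
// while the underlying CGImage exposes the raw, un-rotated pixels.
NSLog(@"imageOrientation: %ld", (long)i.imageOrientation);
NSLog(@"UIImage size: %@", NSStringFromCGSize(i.size));
NSLog(@"CGImage size: %zu x %zu", CGImageGetWidth(i.CGImage), CGImageGetHeight(i.CGImage));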