I am currently rotating images using CGContextDrawImage, but there is a giant memory spike whenever it is called. Newer devices can handle it, but devices with less RAM, such as the iPhone 4s, cannot. The image is a large file captured by the user at AVCaptureSessionPresetHigh.
I'm rotating the image with this extension on UIImage:
func rotate(orientation: UIImageOrientation) -> UIImage {
    if orientation == UIImageOrientation.Up { return self }

    // Build the transform that maps the source into the requested orientation.
    var transform: CGAffineTransform = CGAffineTransformIdentity
    if orientation == UIImageOrientation.Down || orientation == UIImageOrientation.DownMirrored {
        transform = CGAffineTransformTranslate(transform, self.size.width, self.size.height)
        transform = CGAffineTransformRotate(transform, CGFloat(M_PI))
    }
    else if orientation == UIImageOrientation.Left || orientation == UIImageOrientation.LeftMirrored {
        transform = CGAffineTransformTranslate(transform, self.size.width, 0)
        transform = CGAffineTransformRotate(transform, CGFloat(M_PI_2))
    }
    else if orientation == UIImageOrientation.Right || orientation == UIImageOrientation.RightMirrored {
        transform = CGAffineTransformTranslate(transform, 0, self.size.height)
        transform = CGAffineTransformRotate(transform, CGFloat(-M_PI_2))
    }

    if orientation == UIImageOrientation.UpMirrored || orientation == UIImageOrientation.DownMirrored {
        transform = CGAffineTransformTranslate(transform, self.size.width, 0)
        transform = CGAffineTransformScale(transform, -1, 1)
    }
    else if orientation == UIImageOrientation.LeftMirrored || orientation == UIImageOrientation.RightMirrored {
        transform = CGAffineTransformTranslate(transform, self.size.height, 0)
        transform = CGAffineTransformScale(transform, -1, 1)
    }

    // Draw the source into a new bitmap context with the transform applied.
    // This is where the memory spike happens.
    let ref: CGContextRef = CGBitmapContextCreate(nil, Int(self.size.width), Int(self.size.height), CGImageGetBitsPerComponent(self.CGImage), 0, CGImageGetColorSpace(self.CGImage), CGImageGetBitmapInfo(self.CGImage))
    CGContextConcatCTM(ref, transform)
    if orientation == UIImageOrientation.Left || orientation == UIImageOrientation.LeftMirrored || orientation == UIImageOrientation.Right || orientation == UIImageOrientation.RightMirrored {
        CGContextDrawImage(ref, CGRectMake(0, 0, self.size.height, self.size.width), self.CGImage)
    }
    else {
        CGContextDrawImage(ref, CGRectMake(0, 0, self.size.width, self.size.height), self.CGImage)
    }

    let cgImg: CGImageRef = CGBitmapContextCreateImage(ref)
    let rotated: UIImage = UIImage(CGImage: cgImg)!
    return rotated
}
This code works perfectly on devices such as the iPhone 5 and 6, but almost always causes the iPhone 4s to crash due to low memory.
I am aware that I could just upload the image with EXIF data specifying the orientation, but I feel like just rotating the image before uploading it would prevent further problems down the line.
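For reference, the metadata-only route I mean would be roughly this (a sketch only, not what I'm doing; the function name is mine, and I'm assuming the EXIF/TIFF value 6 corresponds to UIImageOrientation.Right):

import ImageIO
import MobileCoreServices

// Sketch: re-encode the JPEG with an orientation tag instead of redrawing the pixels.
// No full-size bitmap context is created, so the peak memory stays low.
func jpegDataTaggedWithOrientation(image: UIImage, exifOrientation: Int) -> NSData? {
    let data = NSMutableData()
    if let destination = CGImageDestinationCreateWithData(data, kUTTypeJPEG, 1, nil) {
        // Assumed mapping: 6 == rotated 90 degrees clockwise (.Right).
        let properties: NSDictionary = [kCGImagePropertyOrientation: exifOrientation]
        CGImageDestinationAddImage(destination, image.CGImage, properties)
        if CGImageDestinationFinalize(destination) {
            return data
        }
    }
    return nil
}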
How does iOS rotate the image for display using the EXIF orientation? For example, UIImage(CGImage: image.CGImage, scale: 1.0, orientation: UIImageOrientation.Right) only changes the orientation metadata, yet the image is still displayed in the correct orientation on screen, with far less memory impact. Is there some equally low-level approach, perhaps one that uses the GPU, that achieves the same result as the code above?
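To make concrete what I mean by metadata-only display (photo and imageView are just placeholder names, scale 1.0 assumed):

// The orientation flag lives on the UIImage wrapper; the CGImage pixels are untouched,
// which is presumably why this costs almost nothing.
let raw: CGImageRef = photo.CGImage          // unrotated pixel data
let tagged = UIImage(CGImage: raw, scale: 1.0, orientation: UIImageOrientation.Right)
imageView.image = tagged                     // rendered rotated on screen; no new bitmap allocated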
I would also like to avoid scaling the image down before uploading it.
Is there any way to reduce the memory impact of rotating the image, for example by rotating it in pieces (roughly as sketched below), or will I have to find an alternative?
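For what it's worth, this is roughly what I have in mind by "rotating it in pieces", sketched for the 180-degree (.Down) case only. It is not code I'm running, the helper name is made up, and I haven't verified that Core Graphics avoids decoding the whole source when it crops a strip, so it may not actually lower the peak:

// Sketch: draw the source into the rotated context one horizontal strip at a time,
// so only a small cropped CGImage is alive at any moment (the destination context
// itself is still full size).
func rotate180InStrips(source: UIImage, stripHeight: CGFloat) -> UIImage {
    let w = source.size.width
    let h = source.size.height
    let ctx: CGContextRef = CGBitmapContextCreate(nil, Int(w), Int(h),
        CGImageGetBitsPerComponent(source.CGImage), 0,
        CGImageGetColorSpace(source.CGImage), CGImageGetBitmapInfo(source.CGImage))

    // Same 180-degree transform as in rotate(_:) above.
    var transform = CGAffineTransformTranslate(CGAffineTransformIdentity, w, h)
    transform = CGAffineTransformRotate(transform, CGFloat(M_PI))
    CGContextConcatCTM(ctx, transform)

    var top: CGFloat = 0
    while top < h {
        let sliceHeight = min(stripHeight, h - top)
        // Crop one strip out of the source (image coordinates, origin at the top)...
        let strip = CGImageCreateWithImageInRect(source.CGImage,
            CGRectMake(0, top, w, sliceHeight))
        // ...and draw it where that strip sits in the context's bottom-up coordinates;
        // the concatenated CTM then rotates it into place.
        CGContextDrawImage(ctx, CGRectMake(0, h - top - sliceHeight, w, sliceHeight), strip)
        top += sliceHeight
    }
    return UIImage(CGImage: CGBitmapContextCreateImage(ctx))!
}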