I'm generating a UIImage like this:
//scale UIView size to match underlying UIImage size
float scaleFactor = 10.0f;
UIGraphicsBeginImageContextWithOptions(self.bounds.size, self.opaque, scaleFactor);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The UIImage has a size of 3200x2400, which is what I want. However, when I convert to PNG format to send as an email attachment:
NSData* data = UIImagePNGRepresentation(image);
MFMailComposeViewController* controller;
...
[controller addAttachmentData:data mimeType:mimeType fileName:fileName];
I end up with an image that is 720 ppi and thus ~12.8 MB, which is way too large.
I don't know where the 720 ppi is coming from; the UIImage is generated from an image that is 72 ppi. It must have something to do with:
UIGraphicsBeginImageContextWithOptions(self.bounds.size, self.opaque, scaleFactor);
I need to create a UIImage from a UIView based on the underlying UIImage (which is much larger than the UIView's bounds), but I need to maintain the original ppi. A 720 ppi image is far too large to be practical as an email attachment.
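For context, here is a sketch of the full render-and-attach flow as I understand it, with the scale factor derived from the underlying image rather than hardcoded (the `self.image` property and the attachment file name are placeholders from my setup, not exact code):

```objc
// Derive the scale so the rendered bitmap matches the source image's pixel size,
// instead of hardcoding 10.0. self.image here stands in for the underlying UIImage.
UIImage* sourceImage = self.image;
CGFloat scaleFactor = sourceImage.size.width / self.bounds.size.width;

UIGraphicsBeginImageContextWithOptions(self.bounds.size, self.opaque, scaleFactor);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext(); // balance the Begin call

NSData* data = UIImagePNGRepresentation(image);
[controller addAttachmentData:data
                     mimeType:@"image/png"
                     fileName:@"rendered.png"]; // placeholder file name
```

This still produces the 3200x2400 image I want, but the resulting PNG is just as large as before.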
Any thoughts?