I was checking the allocations of my OpenCV app on iPhone and noticed that photos from the camera take about 300 KB, while photos transformed by the app take about 6 MB (20 times bigger).
Isn't that strange? Here is the function I am using to convert a cv::Mat to a UIImage:
+ (UIImage *)UIImageFromCVMat:(const cv::Mat &)cvMat withOrientation:(UIImageOrientation)orientation {
    NSData *data = [NSData dataWithBytes:cvMat.data
                                  length:cvMat.elemSize() * cvMat.total()];

    CGColorSpaceRef colorSpace;
    if (cvMat.elemSize() == 1) {
        colorSpace = CGColorSpaceCreateDeviceGray();
    } else {
        colorSpace = CGColorSpaceCreateDeviceRGB();
    }

    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);

    CGImageRef imageRef = CGImageCreate(cvMat.cols,                 // Width
                                        cvMat.rows,                 // Height
                                        8,                          // Bits per component - ok
                                        8 * cvMat.elemSize(),       // Bits per pixel - ok
                                        cvMat.step[0],              // Bytes per row - ok
                                        colorSpace,                 // Colorspace - ok
                                        kCGImageAlphaNone | kCGBitmapByteOrderDefault, // Bitmap info flags
                                        provider,                   // CGDataProviderRef
                                        NULL,                       // Decode
                                        false,                      // Should interpolate
                                        kCGRenderingIntentDefault); // Intent

    UIImage *image = [[UIImage alloc] initWithCGImage:imageRef
                                                scale:1.0
                                          orientation:orientation];

    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);

    return image;
}
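
For context, this is roughly how I call it after processing. Aside from UIImageFromCVMat:withOrientation:, the class and helper names below are placeholders for my actual code (compiled as Objective-C++ in a .mm file):

#import <opencv2/opencv.hpp>

// Hypothetical call site; OpenCVUtils and cvMatFromUIImage: are placeholders.
cv::Mat input = [OpenCVUtils cvMatFromUIImage:capturedImage];
cv::Mat processed;
cv::GaussianBlur(input, processed, cv::Size(5, 5), 0);   // example transform
UIImage *result = [OpenCVUtils UIImageFromCVMat:processed
                                withOrientation:capturedImage.imageOrientation];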
I cannot see why the generated image is so large =/
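In case it helps, this is the quick sanity check I added to log the size of the Mat's raw pixel buffer (just a sketch; processed stands for whatever cv::Mat my pipeline hands to the converter):

// Sanity check: log how many bytes back the Mat that gets converted.
size_t bytes = processed.total() * processed.elemSize();
NSLog(@"Mat is %dx%d, elemSize %zu -> %zu bytes (%.1f MB)",
      processed.cols, processed.rows,
      processed.elemSize(), bytes, bytes / (1024.0 * 1024.0));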
Any ideas?
Cheers,