I am trying to build an image-capture feature for my iOS app, but I keep getting color distortion in my CGImage result. Here is the camera preview, with the correct colors.
Cola is red, all is well.
When I run my snapshot code, I get this:
Cola is blue... where did that come from?
I tried messing with some of the parameters, but that only got me no image at all. Here is my snapshot code:
size_t bitsPerComponent = 8;
size_t bitsPerPixel = 32;
size_t bytesPerRow = [cameraVideo bufRowBytes];
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, [cameraVideo bufDataPtr],
                                                          [cameraVideo bufWidth] * [cameraVideo bufHeight] * 4, NULL);
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo bitmapInfo = kCGImageAlphaNoneSkipLast; // <-- this turned out to be the problem (see FIXED below)
CGColorRenderingIntent renderingIntent = kCGRenderingIntentPerceptual;
CGImageRef imageRef = CGImageCreate([cameraVideo bufWidth], [cameraVideo bufHeight],
                                    bitsPerComponent, bitsPerPixel, bytesPerRow,
                                    colorSpaceRef, bitmapInfo, provider,
                                    NULL, NO, renderingIntent);
CGColorSpaceRelease(colorSpaceRef);
CGDataProviderRelease(provider); // the provider must be released too, or it leaks
I am at my wit's end, so if anyone can see what I am doing wrong, please let me know.
FIXED
The problem was the bitmapInfo: the capture buffer's byte order didn't match what I was telling CGImageCreate, so the red and blue channels were swapped. Choosing the byte-order flag based on the buffer's actual pixel format fixed it. Here is the final code:
if (cameraVideo.ARPixelFormat == kCVPixelFormatType_32ARGB) {
    bitmapInfo = kCGBitmapByteOrder32Big | kCGImageAlphaNoneSkipFirst;    // ARGB in memory: alpha first, big-endian
} else {
    bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst; // BGRA in memory reads as ARGB little-endian
}
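For anyone hitting the same thing, here is the whole snapshot routine with the fix folded in. This is a minimal sketch, not a drop-in implementation: it assumes the same cameraVideo accessors used above (bufDataPtr, bufWidth, bufHeight, bufRowBytes, ARPixelFormat) and that the buffer is either kCVPixelFormatType_32ARGB or kCVPixelFormatType_32BGRA.

#import <CoreGraphics/CoreGraphics.h>
#import <CoreVideo/CoreVideo.h>

- (CGImageRef)newSnapshotImage {
    size_t width = [cameraVideo bufWidth];
    size_t height = [cameraVideo bufHeight];
    size_t bytesPerRow = [cameraVideo bufRowBytes];

    // Wrap the raw capture buffer; bytesPerRow * height covers any row padding.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, [cameraVideo bufDataPtr],
                                                              bytesPerRow * height, NULL);

    // Match the bitmap info to the buffer's actual byte order, otherwise R and B swap.
    CGBitmapInfo bitmapInfo;
    if (cameraVideo.ARPixelFormat == kCVPixelFormatType_32ARGB) {
        bitmapInfo = kCGBitmapByteOrder32Big | kCGImageAlphaNoneSkipFirst;
    } else { // assumed to be kCVPixelFormatType_32BGRA
        bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst;
    }

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef imageRef = CGImageCreate(width, height, 8, 32, bytesPerRow,
                                        colorSpace, bitmapInfo, provider,
                                        NULL, NO, kCGRenderingIntentPerceptual);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    return imageRef; // "new" prefix: the caller owns this and must call CGImageRelease()
}

One caveat: CGDataProviderCreateWithData does not copy the pixels, so with a NULL release callback the resulting CGImage is only valid while the capture buffer stays alive. If the buffer can be recycled, either copy the data first or pass a real release callback.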