I have been scratching my head over this for hours.
I am using the following method to resize two images, one after another:
CGImageRef imageReference = [image CGImage];
bytes = malloc(width * height * 4);
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(bytes, width, height, bitsPerComponent,
                                             bytesPerRow, colorSpaceReference,
                                             kCGImageAlphaPremultipliedLast);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageReference);
CGImageRelease(imageReference);
CGContextRelease(context);
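(The method then creates the resized image from the context; I trimmed that part above, but it is along these lines, reconstructed from memory, so treat the exact lines as a sketch:)

```c
// Create the output image from the bitmap context just drawn into
CGImageRef resizedReference = CGBitmapContextCreateImage(context);
// ... wrap resizedReference in a UIImage and return it ...
CGImageRelease(resizedReference); // came from a Create function, so we own this one
```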
It works fine, no problem, but only for one image. If I call this method a second time, for example:
[self resizeImageWithSize:imageSize]; //this is OK.
[self resizeImageWithSize:imageSize]; //this would not come out right
where the image size is determined by image1.size and image2.size respectively. I have tried flipping the calling sequence of the methods; the first one is always correct.
The images are not too big, 400 x 300 and 300 x 360, and I would just like to resize one to 200 x 200 and the other to, for example, 150 x 150. They are just PNGs.
Whichever image is processed second comes out wrong: it has messed-up pixels that look like water stains on paper, and sometimes it is rendered unrecognizable.
Am I missing something very obvious here? I have tried adding free(bytes), which I don't think is needed here, just for the sake of trying, but it doesn't change anything. Am I not releasing or freeing something correctly, so that old byte data persists when the method is called the second time? I am just guessing here. I am using ARC.
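To make my stale-data guess concrete: malloc does not zero memory, so if anything depended on the buffer starting out clean, garbage from a previous allocation could show through. A minimal C sketch of that idea, with no CoreGraphics involved (make_clean_buffer is a name I made up here, and the sizes are just examples):

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Allocate an RGBA pixel buffer and zero it so no stale bytes survive. */
static unsigned char *make_clean_buffer(size_t width, size_t height) {
    size_t size = width * height * 4;     /* 4 bytes per pixel, as above */
    unsigned char *bytes = malloc(size);  /* malloc contents are indeterminate */
    if (bytes != NULL)
        memset(bytes, 0, size);           /* clear before drawing into it */
    return bytes;
}
```

If stale memory were really the problem, clearing the buffer this way before each draw should make the corruption disappear.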