I have the following code:
UIImage *picture = [self scaledPicture:[APP getObject:@"pictures"][index]];
NSLog(@"picture size: %fx%f", picture.size.width, picture.size.height);
NSString *imageData = [UIImageJPEGRepresentation(picture, 1.0f) base64EncodedStringWithOptions:NSDataBase64Encoding76CharacterLineLength];
NSLog(@"imageData size: %lu", (unsigned long)[imageData length]);
NSString *percentEscapedPicture = [imageData stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLHostAllowedCharacterSet]];
NSLog(@"percentEscapedPicture size: %lu", (unsigned long)[percentEscapedPicture length]);
NSString *base64Picture = [NSString stringWithFormat:@"data:;base64,%@", percentEscapedPicture];
NSLog(@"base64Picture size: %lu", (unsigned long)[base64Picture length]);
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:base64Picture]];
NSLog(@"data size: %lu", (unsigned long)[data length]);
UIImage *image = [UIImage imageWithData:data scale:1.0f];
NSLog(@"image size: %fx%f", image.size.width, image.size.height);
Here are the log results:
picture size: 765.000000x1024.000000
imageData size: 3584062
percentEscapedPicture size: 3958522
base64Picture size: 3958535
data size: 2619123
image size: 1530.000000x2048.000000
As you can see, the image's resolution increases by a factor of 2 in both dimensions, and I don't understand why.
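One thing my logs above do not capture is each image's scale property. UIImage reports its size in points, not pixels, so if the original picture had a scale of 2.0 (my guess, not yet verified) and the decoded copy is created with scale 1.0, the same pixel data would report doubled point dimensions. A diagnostic I could add to the snippet above to check this:

// Hypothetical check, reusing `picture`, `data`, and `image` from the code above.
// UIImage.size is in points: identical pixel data at scale 2.0 vs 1.0 reports
// point dimensions that differ by a factor of 2.
NSLog(@"picture scale: %f", picture.scale);
NSLog(@"image scale: %f", image.scale);

// Decoding with the original scale should restore matching point sizes:
UIImage *rescaled = [UIImage imageWithData:data scale:picture.scale];
NSLog(@"rescaled size: %fx%f", rescaled.size.width, rescaled.size.height);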
I need this round trip (with a few extra steps between encoding and decoding) for my app, and I have used the same base64 encoding/decoding code in my apps for as long as I can remember. I understand that base64 increases the data size by roughly 33%, but why does the resolution increase?