
I have the following code:

// Load the picture and log its size (in points).
UIImage *picture = [self scaledPicture:[APP getObject:@"pictures"][index]];
NSLog(@"picture size: %fx%f", picture.size.width, picture.size.height);

// Compress to JPEG and base64-encode with 76-character lines.
NSString *imageData = [UIImageJPEGRepresentation(picture, 1.0f) base64EncodedStringWithOptions:NSDataBase64Encoding76CharacterLineLength];
NSLog(@"imageData size: %lu", (unsigned long)[imageData length]);

// Percent-escape the base64 string so it can go into a URL.
NSString *percentEscapedPicture = [imageData stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLHostAllowedCharacterSet]];
NSLog(@"percentEscapedPicture size: %lu", (unsigned long)[percentEscapedPicture length]);

// Wrap it in a data: URL.
NSString *base64Picture = [NSString stringWithFormat:@"data:;base64,%@", percentEscapedPicture];
NSLog(@"base64Picture size: %lu", (unsigned long)[base64Picture length]);

// Decode the data: URL back into the raw JPEG bytes.
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:base64Picture]];
NSLog(@"data size: %lu", (unsigned long)[data length]);

// Rebuild the image from the JPEG bytes, forcing scale 1.0.
UIImage *image = [UIImage imageWithData:data scale:1.0f];
NSLog(@"image size: %fx%f", image.size.width, image.size.height);

Here are the log results:

picture size: 765.000000x1024.000000  
imageData size: 3584062  
percentEscapedPicture size: 3958522  
base64Picture size: 3958535  
data size: 2619123  
image size: 1530.000000x2048.000000  

As you can see, the image's resolution increases by a factor of 2 in both dimensions, and I don't understand why.

I need this (with a few extra steps between encoding and decoding) for my app, and I have used the same base64 encoding/decoding code in my apps for as long as I can remember. I understand that base64 increases the data size by roughly 33%, but why does the resolution increase?
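
(For what it's worth, the logged sizes add up: 2,619,123 bytes encode to 2,619,123 / 3 × 4 = 3,492,164 base64 characters, and the 76-character line option inserts a two-character line break after each of the 45,949 full lines, giving exactly the logged 3,584,062. So the size growth is expected; it's only the resolution change I can't explain.)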

Kevin
  • Looks like your original image was a retina image, i.e. it has a resolution of 1530 x 2048 pixels and a scale factor of 2. So it pretends the size is 765 x 1024. You want to output the scale factor as well: `NSLog(@"picture size: %fx%f scale: %f", picture.size.width, picture.size.height, picture.scale);` – Codo Mar 14 '16 at 16:14
  • __1.)__ you are printing only the _point_ size of your image; your original image is probably a Retina image with scale factor `2.0`. __2.)__ after you reload your image from data, you use `1.0` for its scale, which means your image's size in _points_ is doubled. – holex Mar 14 '16 at 16:53
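
A quick way to confirm both comments is to log the scale next to the size, before encoding and after decoding (a sketch using the question's variables; the values in the trailing comments are what the comments predict):

NSLog(@"picture: %.0fx%.0f, scale %.0f", picture.size.width, picture.size.height, picture.scale); // 765x1024, scale 2
NSLog(@"image: %.0fx%.0f, scale %.0f", image.size.width, image.size.height, image.scale); // 1530x2048, scale 1

In pixels the two images are identical (765 pt × 2 = 1530 px and 1024 pt × 2 = 2048 px); only the reported point size differs.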

1 Answer


The original image had a `scale` property set to 2. When you round-tripped it as pure data into base64 and back, you threw that information away.
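
A minimal sketch of the fix, assuming the original image's scale is still available at decode time (otherwise `[UIScreen mainScreen].scale` is the usual stand-in):

// Hypothetical fix: reuse the original scale (2.0 here) instead of the
// hard-coded 1.0f, so the decoded image keeps its 765x1024 point size.
UIImage *image = [UIImage imageWithData:data scale:picture.scale];
NSLog(@"image size: %fx%f scale: %f", image.size.width, image.size.height, image.scale);

The scale is a UIKit property, not part of the JPEG bytes, so if the image crosses a wire the scale has to be sent alongside the base64 payload. As a side note, the percent-escaping and `data:` URL round trip could be replaced by decoding the base64 string directly with `-[NSData initWithBase64EncodedString:options:]`, passing `NSDataBase64DecodingIgnoreUnknownCharacters` so the line breaks are skipped.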

matt
  • See also my answer here: http://stackoverflow.com/questions/35945493/why-is-my-cgimage-3-x-the-size-of-the-uiimage/35945544#35945544 Actually your question is effectively a duplicate. – matt Mar 14 '16 at 16:17
  • Thanks so much! I tried setting the scale to 1.0f but didn't think about Retina. – Kevin Mar 15 '16 at 07:56