
When I NSLog the size of the image after loading it into a UIImage, it comes out at the expected size. When I try this with CGImageSource, however, I get dimensions twice the size I was expecting. This is the code I'm using for that:

    NSString *fullPath = [self fullPathForThumbnail];

    NSURL *imageFileURL = [NSURL fileURLWithPath:fullPath];
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)imageFileURL, NULL);

    if (imageSource == NULL) {
        // Error loading image
        return NO;
    }

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:NO], (NSString *)kCGImageSourceShouldCache,
                             nil];
    CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, (__bridge CFDictionaryRef)options);

    CGSize originalSize = CGSizeZero;

    if (imageProperties) {
        // Read the pixel dimensions reported by ImageIO.
        NSNumber *width = (__bridge NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelWidth);
        NSNumber *height = (__bridge NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelHeight);

        originalSize = CGSizeMake(width.floatValue, height.floatValue);

        CFRelease(imageProperties);
    }

    CFRelease(imageSource);

This only happens on retina images; non-retina images seem to be the correct size.
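
For reference, the UIImage check I mention above is essentially this (a simplified sketch; it assumes the thumbnail file uses the standard @2x naming so UIKit sets the scale):

    UIImage *thumbnail = [UIImage imageWithContentsOfFile:fullPath];

    // Logs the logical size, e.g. 100x100 for a 200x200 @2x file.
    NSLog(@"UIImage size: %@ (scale %.1f)", NSStringFromCGSize(thumbnail.size), thumbnail.scale);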

Andrew
  • Sounds like you are mixing points vs pixels. The methods explicitly indicate they are returning pixels. Dividing by scale will give you points and should match on both retina and non-retina displays. – bobnoble Oct 22 '12 at 01:53
  • What do you mean by 2x? Suppose A.png is 100x100, A@2x.png is 200x200. What does the program report? 200x200, or 400x400? If it reports 200x200 I think it's the expected behavior. – Yuji Oct 22 '12 at 01:55
  • I am getting a bit confused here. Let's take this scenario: you enter a 100x100 pixel (normal size) picture. What does the code return? Now you enter its 200x200 pixel (100x100 point) retina counterpart; what does it return then? – WolfLink Oct 22 '12 at 02:21
  • On a 100x100 pixel image, it returns dimensions of 100x100. On a 200x200 image (on retina) it returns dimensions of 400x400. – Andrew Oct 22 '12 at 02:49

1 Answer


To expand on what bobnoble said:

There are two different concepts here: the size of the image and the number of pixels in the image. These two concepts are related by the resolution of the image.

The size of an image is given in units such as inches, centimeters or printer's points. There are actually different definitions of the printer's point, but the one commonly used in IT is the one promoted by Adobe: 72 points = 1 inch.

In Cocoa, and previously NeXTStep, the size of an image was always the physical size; the number of pixels was a separate measure. In a device-independent graphics system, you can have a 1 cm x 1 cm image that is 72 dpi, 150 dpi, 300 dpi or 2400 dpi. The number of pixels in each of these images will be different, but the size is always the same.
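
To make that concrete, the relationship between the three quantities is simple arithmetic. A quick sketch of the 1 cm example (the numbers here are purely illustrative):

    // Illustrative arithmetic only: the 1 cm x 1 cm example from above.
    // pixels = size in inches * resolution; 1 inch = 2.54 cm, 72 points = 1 inch.
    double sizeInCm = 1.0;
    double sizeInInches = sizeInCm / 2.54;
    double resolutions[] = { 72.0, 150.0, 300.0, 2400.0 };  // dpi
    for (int i = 0; i < 4; i++) {
        double pixels = sizeInInches * resolutions[i];
        double points = sizeInInches * 72.0;  // the physical size never changes
        NSLog(@"%6.0f dpi -> %7.1f px across, always %.1f pt", resolutions[i], pixels, points);
    }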

The UIImage class on iOS used to equate the two, assuming (as do many people) that pixel-size = physical size, or in other words that the resolution is 72 dpi.

However, this changed with the Retina display; the documentation for the UIImage size property now has the following to say:

In iOS 4.0 and later, this value reflects the logical size of the image and is measured in points. In iOS 3.x and earlier, this value always reflects the dimensions of the image measured in pixels.

So the size property of UIImage is giving you the physical size, measured in points. The kCGImagePropertyPixelWidth property, on the other hand, is giving you the number of pixels, which for a 2x retina image is expected to be twice the point size in each dimension.
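
If you want the two numbers to line up, multiply the point size by the image's scale (or divide the pixel count by the scale). A rough sketch, reusing fullPath from the question and assuming the thumbnail file is named with the usual @2x suffix so UIKit picks up the scale:

    UIImage *image = [UIImage imageWithContentsOfFile:fullPath];

    // Logical size in points, as the iOS 4.0+ documentation describes.
    CGSize pointSize = image.size;

    // Pixel dimensions; these should match kCGImagePropertyPixelWidth/Height.
    CGSize pixelSize = CGSizeMake(pointSize.width * image.scale,
                                  pointSize.height * image.scale);

    NSLog(@"points: %@, pixels: %@ (scale %.0f)",
          NSStringFromCGSize(pointSize), NSStringFromCGSize(pixelSize), image.scale);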

mpw