I am developing an app in which I use @1x images on all devices (they are photos and I can't get @2x versions of them). When a 300x225 px image is displayed on a Retina display at 300x225 points (i.e. 600x450 px), I would expect it to look just as it does on a non-Retina device. But that's not what I get: the image looks way worse on Retina devices.
This is how it looks on a non-Retina device:
And this is how it looks on a Retina device:
OK, but now let's see how it looks on a Retina device when I change the frame slightly, e.g. to 301x225 or 299x225 points. The image looks way better: it's not Retina quality (of course), but it definitely looks less pixelated:
It seems that Apple uses different scaling algorithms depending on how much it has to scale the image: if the scale factor is exactly 2, a different interpolation is used than when it is 1.99, 2.01 or any other value. But why? On a real Retina device, the image scaled at 2.01 looks way better than the one scaled at 2. Maybe it's to push developers to include Retina images?
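For reference, this is roughly my test setup, as a minimal Swift sketch (the asset name and view names are placeholders):

```swift
import UIKit

// Same 300x225 px @1x photo, two slightly different frames.
let photo = UIImage(named: "photo")

// Exactly 2.0x on a Retina screen: looks pixelated.
let exactView = UIImageView(image: photo)
exactView.frame = CGRect(x: 0, y: 20, width: 300, height: 225)

// ~2.007x: noticeably smoother, even though it's the same image.
let nudgedView = UIImageView(image: photo)
nudgedView.frame = CGRect(x: 0, y: 265, width: 301, height: 225)
```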
Do you know a way to force the image to use the "smoother" scaling interpolation? I could just set a slightly different frame so that it's not exactly twice the size of the original image, but... is there a better way? I've also tried changing the CALayer's magnificationFilter and minificationFilter, but the image still looks worse on Retina devices.
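This is roughly what I tried with the filters, as a Swift sketch (view and asset names are placeholders; as far as I know, trilinear is the smoothest filter CALayer offers):

```swift
import UIKit

// `.linear` is CALayer's default; `.trilinear` is the smoothest option
// Core Animation exposes, but the image still looks bad at exactly 2.0x.
let photoView = UIImageView(image: UIImage(named: "photo"))
photoView.frame = CGRect(x: 0, y: 0, width: 300, height: 225)
photoView.layer.magnificationFilter = .trilinear
photoView.layer.minificationFilter = .trilinear
```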
EDIT: I've been playing with Photoshop and I can say that when the scaling factor is exactly 2.0, Apple uses Nearest Neighbour, and when the scaling factor is anything else, a Bicubic algorithm is used instead. I could just create @2x versions of the images with Photoshop (using Bicubic) and add them to my app, but that would dramatically increase my app's file size.
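The only runtime alternative I can think of is redrawing the @1x image into a @2x bitmap with Core Graphics when the app loads it, something like this sketch (upscaled(_:) is just a hypothetical helper, and I'm assuming .high interpolation quality is bicubic-like):

```swift
import UIKit

// Redraw the @1x photo into a scale-2 bitmap with high-quality
// interpolation, instead of bundling @2x files.
func upscaled(_ image: UIImage) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = 2  // render at Retina scale (600x450 px for a 300x225 pt image)
    let renderer = UIGraphicsImageRenderer(size: image.size, format: format)
    return renderer.image { context in
        // Ask Core Graphics for its smooth interpolation, which seems
        // to be what the non-2.0 scale factors get for free.
        context.cgContext.interpolationQuality = .high
        image.draw(in: CGRect(origin: .zero, size: image.size))
    }
}

// Usage (asset name is a placeholder):
// imageView.image = upscaled(UIImage(named: "photo")!)
```

That would keep the bundle size down, at the cost of doing the scaling work at runtime.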