
So, we have an NSImage that's cropped and resized according to user input. It works fine on both retina and non-retina Macs.

The problem is that when I connect a non-retina screen to my retina MacBook, the images come out at double their size.

The problem only occurs when launching the app on the second, non-retina screen.

While debugging, I noticed the image doubles at this point:

CGImageSourceRef finalSource = CGImageSourceCreateWithData((__bridge CFDataRef)([newImage TIFFRepresentation]), NULL);
CGImageRef finalRef = CGImageSourceCreateImageAtIndex(finalSource, 0, NULL);

In my tests, newImage.size is 800x600, but (int)CGImageGetWidth(finalRef) returns 1600.
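For context (my own instrumentation, not part of the app): NSImage.size is measured in points, while CGImageGetWidth reports pixels, so at a 2x backing scale 800 points becomes 1600 pixels. A quick check along these lines, with `newImage` standing in for the cropped image above, makes the mismatch visible:

```objc
#import <Cocoa/Cocoa.h>

// Sketch: compare the point size of an NSImage with the pixel size of
// the bitmap produced by -TIFFRepresentation.
static void LogPointVsPixelSize(NSImage *newImage) {
    NSLog(@"NSImage size (points): %@", NSStringFromSize(newImage.size));

    NSBitmapImageRep *rep =
        [NSBitmapImageRep imageRepWithData:[newImage TIFFRepresentation]];
    // pixelsWide/pixelsHigh are device pixels; on a 2x screen they can be
    // double the point size logged above.
    NSLog(@"TIFF rep size (pixels): %ldx%ld",
          (long)rep.pixelsWide, (long)rep.pixelsHigh);
}
```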

I also tried:

CGImageRef cgImage = [newImage CGImageForProposedRect:&imageRect context:[NSGraphicsContext currentContext] hints:nil];

But I still get the same result.
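One workaround I'm considering (a sketch, untested in the app itself): render the image into an NSBitmapImageRep whose pixel dimensions are pinned to the point size, so the resulting CGImage is independent of the current screen's backing scale. `CreateCGImageAtPointSize` is a name I made up for illustration:

```objc
#import <Cocoa/Cocoa.h>

// Sketch: produce a CGImage whose pixel size equals the NSImage's point
// size (e.g. 800x600), regardless of the screen the app launched on.
static CGImageRef CreateCGImageAtPointSize(NSImage *newImage) {
    NSInteger width  = (NSInteger)newImage.size.width;   // e.g. 800
    NSInteger height = (NSInteger)newImage.size.height;  // e.g. 600

    NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
        initWithBitmapDataPlanes:NULL
                      pixelsWide:width
                      pixelsHigh:height
                   bitsPerSample:8
                 samplesPerPixel:4
                        hasAlpha:YES
                        isPlanar:NO
                  colorSpaceName:NSCalibratedRGBColorSpace
                     bytesPerRow:0
                    bitsPerPixel:0];
    // Force a 1:1 point-to-pixel mapping for this rep.
    rep.size = newImage.size;

    [NSGraphicsContext saveGraphicsState];
    [NSGraphicsContext setCurrentContext:
        [NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
    [newImage drawInRect:NSMakeRect(0, 0, width, height)
                fromRect:NSZeroRect
               operation:NSCompositingOperationCopy
                fraction:1.0];
    [NSGraphicsContext restoreGraphicsState];

    return CGImageRetain(rep.CGImage);  // caller is responsible for releasing
}
```

The resulting CGImageRef could then be written out with CGImageDestination as before.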

Any ideas?

Paulo Cesar
  • You are skipping an important part as to why you need to make a CGImageRef object. I don't know what you are doing with your image, either. Are you displaying it with NSImageView? – El Tomato Jun 15 '16 at 21:52
  • @El Tomato: Oh, the CGImageRef will be used to save the image to a file later in the app – Paulo Cesar Jun 16 '16 at 15:07

0 Answers