
I am creating a simple photo filter app for OS X, displaying a photo in an NSImageView (actually two photos stacked in two NSImageViews, but the question applies to a single view too). Everything works fine, but when I resize the window that contains the NSImageViews (which resizes them as well), the window resizes very slowly, at less than 1 fps, which badly hurts the user experience. I want window resizing to be as smooth as possible. When I disable resizing of the image views, the window resizes smoothly, so the slowdown is caused by those NSImageViews.

I'm loading 20-megapixel images from my DSLR. When I scale them down to a reasonable size for the screen (e.g. 1024x768), they resize smoothly, so the problem is the way NSImageView renders the images: judging by this behavior, I assume it re-renders the full 20 MP image into the view's target frame on every redraw.

How can I make NSImageView rescale more smoothly? Should I feed it a scaled-down version of my images? I'd rather not: it's a photo editing app that also targets Retina displays, so the viewport can actually be quite large. I can do it, but it's my last resort. Other than scaling down, how can I make NSImageView resize faster?

Can Poyrazoğlu
  • Haven't really looked into this at the OSX-end, but I guess you would need to avoid resizing the NSImageViews altogether. Try doing a snapshot of the entire view and then resize this instead of the actual complex hierarchy (which you hide until the resizing is completed). Have a look into Quartz Window Services. It looks suspiciously like the kind of stuff you're looking for. – T. Benjamin Larsen Nov 02 '13 at 17:48
  • @TBlue 20 megapixels. – Can Poyrazoğlu Nov 02 '13 at 18:01

1 Answer


I believe part of the solution you are looking for is in NSImage's representations. You can add multiple representations to an image with addRepresentation:, and I believe some intelligent selection is done when drawing. In your case, you would add both representations (the scaled-down and the full-resolution bitmap) to the NSImage. I strongly suspect drawRect: would then pick the low-resolution version when the view is small. I would also make sure "scale up or down" is selected in the NSImageView, because the default is "scale down only", which may force the full-resolution image to be used most of the time. There is some discussion of "matching" under "Setting the Image Representation Selection Criteria" in Apple's NSImage documentation, although at first sight this may not be sufficient.
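Something along these lines should work. This is a minimal sketch, assuming the images are loaded from disk; the helper name and the screen-sized target are my own assumptions, not something from the question:

```objc
#import <Cocoa/Cocoa.h>

// Hypothetical helper: build one NSImage holding both a full-resolution
// representation and a pre-rendered screen-sized one.
static NSImage *ImageWithScreenRep(NSURL *url, NSSize screenSize)
{
    NSData *data = [NSData dataWithContentsOfURL:url];
    NSBitmapImageRep *fullRep = [NSBitmapImageRep imageRepWithData:data];
    if (!fullRep) return nil;

    // Render a screen-sized bitmap once, up front, so later redraws
    // never have to resample the 20 MP original.
    NSBitmapImageRep *smallRep = [[NSBitmapImageRep alloc]
        initWithBitmapDataPlanes:NULL
                      pixelsWide:(NSInteger)screenSize.width
                      pixelsHigh:(NSInteger)screenSize.height
                   bitsPerSample:8
                 samplesPerPixel:4
                        hasAlpha:YES
                        isPlanar:NO
                  colorSpaceName:NSCalibratedRGBColorSpace
                     bytesPerRow:0
                    bitsPerPixel:0];

    [NSGraphicsContext saveGraphicsState];
    [NSGraphicsContext setCurrentContext:
        [NSGraphicsContext graphicsContextWithBitmapImageRep:smallRep]];
    [fullRep drawInRect:NSMakeRect(0, 0, screenSize.width, screenSize.height)];
    [NSGraphicsContext restoreGraphicsState];

    // One NSImage, two representations: AppKit chooses the closest match
    // for the destination rect at draw time.
    NSImage *image = [[NSImage alloc] initWithSize:screenSize];
    [image addRepresentation:fullRep];
    [image addRepresentation:smallRep];
    return image;
}
```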

Then, whenever you need to work with the full image, you can retrieve the full-resolution version by going through the representations ([NSImage representations] returns an array of NSImageRep objects).
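A short sketch of digging the full-resolution bitmap back out, assuming `image` was built as above:

```objc
// Walk the representations and keep the one with the most pixels; in
// Objective-C, messaging nil returns 0, so the first bitmap rep wins.
NSBitmapImageRep *fullRes = nil;
for (NSImageRep *rep in image.representations) {
    if (![rep isKindOfClass:[NSBitmapImageRep class]]) continue;
    if (rep.pixelsWide > fullRes.pixelsWide) {
        fullRes = (NSBitmapImageRep *)rep;
    }
}
// fullRes now holds the 20 MP original, regardless of which
// representation the view chose for drawing.
```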

Daniel
  • yes, that did the trick. however, I need to pre-process and provide several different resolution versions of each image (especially considering the possible image view sizes on Retina display MacBooks), which costs CPU time, loading time, and RAM. but that's another story. otherwise your solution works perfectly: NSImageView really does grab the most appropriate size to display when resized. – Can Poyrazoğlu Nov 05 '13 at 01:44