
I need to apply a Gaussian Blur to an NSImage.
Since the image changes frequently over the lifetime of the application, the operation has to be fast enough to keep the UI responsive.

Unfortunately, I have not found a satisfying solution for this problem.

So basically, I've tried three things:

  • Core Animation (Not an option)
  • CIFilter (Terribly slow, 1s or more per operation)
  • GPUImage (Also terribly slow, 1s or more per operation)

What would be the fastest way of applying a Gaussian Blur (or any filter, for that matter) to an NSImage?


CIFilter

CIImage *imageToBlur = [CIImage imageWithData:[backgroundImage TIFFRepresentation]];
CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[gaussianBlurFilter setValue:imageToBlur forKey:kCIInputImageKey];
[gaussianBlurFilter setValue:@10 forKey:kCIInputRadiusKey];

NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:[gaussianBlurFilter valueForKey:kCIOutputImageKey]];
NSImage *nsImage = [[NSImage alloc] initWithSize:rep.size];
[nsImage addRepresentation:rep];

_backgroundImage = nsImage;
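
As an aside, a likely bottleneck in the snippet above is the round-trip through `TIFFRepresentation` and `NSCIImageRep`, not the blur itself. A hedged sketch of a faster path (assuming a macOS version where `+[CIContext contextWithOptions:]` is available, and that a single shared context can be reused across calls) might look like this:

    // Sketch: reuse one CIContext (context creation is expensive) and
    // avoid the TIFF round-trip by going through CGImage instead.
    static CIContext *blurContext;
    if (!blurContext) {
        blurContext = [CIContext contextWithOptions:nil];
    }

    CGImageRef cgImage = [backgroundImage CGImageForProposedRect:NULL
                                                         context:nil
                                                           hints:nil];
    CIImage *input = [CIImage imageWithCGImage:cgImage];

    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:input forKey:kCIInputImageKey];
    [blur setValue:@10 forKey:kCIInputRadiusKey];

    // Render once through the shared context instead of NSCIImageRep.
    CGImageRef outputCG = [blurContext createCGImage:blur.outputImage
                                            fromRect:input.extent];
    NSImage *result = [[NSImage alloc] initWithCGImage:outputCG
                                                  size:input.extent.size];
    CGImageRelease(outputCG);

Note that `CIGaussianBlur` extends the image's extent by the blur radius; cropping `blur.outputImage` back to `input.extent` (as the `fromRect:` above effectively does) keeps the output the same size as the source.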

GPUImage

GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:backgroundImage];
GPUImageGaussianBlurFilter *gaussianBlurFilter = [[GPUImageGaussianBlurFilter alloc] init];
gaussianBlurFilter.blurRadiusInPixels = 10.0;

[stillImageSource addTarget:gaussianBlurFilter];
[stillImageSource processImage];

_backgroundImage = [gaussianBlurFilter imageFromCurrentlyProcessedOutput];
asked by IluTov
  • **Why Core Animation is not an option** The image is eventually simply displayed in the background, so in theory one could apply the filter via Core Animation's `CALayer.filters`. But enabling Core Animation interacts badly with many Cocoa components, so this is not an option. Adding a layer-backed `NSImageView` sibling is also not an option, since it breaks the arrangement of the views. – IluTov Dec 09 '13 at 16:27
  • If it does not need to be a Gaussian Blur you could go with StackBlur http://incubator.quasimondo.com/processing/fast_blur_deluxe.php – lukaswelte Dec 09 '13 at 17:03
  • @lukaswelte You know what designers are like :) It needs to be a Gaussian Blur. And it's also for OSX, actually. – IluTov Dec 09 '13 at 17:13
  • Ok ;) I know that there was just a port for iOS and C++ but that should be easy to port to OSX – lukaswelte Dec 09 '13 at 17:17
    The reason for the slow blurring is most likely not the blur itself, but the act of going to and from an NSImage. The current implementation of GPUImageGaussianBlurFilter, with a 10 pixel radius, runs in around 5 ms on a 720p video frame on an iPhone 4S, so a Mac certainly would be able to handle this comfortably. The bottleneck is the fact that you need to pass through Core Graphics to convert to and from an NSImage. The output half of this can be eliminated by feeding your filter output into a GPUImageView directly (which stays on the GPU). What's the input source? – Brad Larson Dec 09 '13 at 18:12
  • @BradLarson Hey you're the guy who's programmed `GPUImage` :) It's loaded from the resources: `[NSImage imageNamed:@"someImage"]`. What can be done? – IluTov Dec 09 '13 at 18:14
  • Similarly, you can use Core Image to render out to an OpenGL context, eliminating any slowdown on output of the image. Your input will once again still be a bottleneck, so if it doesn't need to come from an NSImage, there might be a much better way to do this with Core Image, too. – Brad Larson Dec 09 '13 at 18:16
  • @NSAddict - Is the image static, with the blur radius, etc. changing? If so, you can load it once into a GPUImagePicture (the slow part) and keep that GPUImagePicture attached to your blur filter as you adjust it. If you feed the result to a GPUImageView, you should be able to obtain realtime changes in the blur. – Brad Larson Dec 09 '13 at 18:18
  • @BradLarson Ok, the image is static, the amount of blurring will never change (Although that would be cool for the future). Until now I've been loading it once, but since the image itself will be changed many times in the lifetime of the app, this part is critical. I'm definitely no expert on this subject, how do you recommend initialising the image? – IluTov Dec 09 '13 at 18:25
  • @NSAddict - Are all of these various images always loaded from the application bundle? If so, can't you save a pre-blurred version of the image already in the bundle, since the blur will never change? If the images are generated from something else, there may be a faster path to get them on the GPU, such as using IOSurfaces. – Brad Larson Dec 09 '13 at 18:33
  • @BradLarson Yeah sorry I should've clarified. This is just a demo-project, the data will come from a non-specific file on the hard drive of the user – IluTov Dec 09 '13 at 18:34
  • @NSAddict - In that case, file loading from disk and upload to the GPU will be the slow part of this. As I said earlier, you can eliminate half of the slowdown by rendering directly to the screen in a GPUImageView or using Core Image's OpenGL rendering, but you'll still need to address loading images from disk and uploading them to the GPU. I had a process for speeding this up slightly on iOS by avoiding Core Graphics, but don't think I've rolled that over to the Mac implementation of GPUImagePicture. Take a look at some of the recent commits on the iOS side, which might be applicable here. – Brad Larson Dec 09 '13 at 18:43
  • @BradLarson It looks like it's this the drawing when it actually lags: `[_backgroundImage drawInRect:...];`. Any clues? – IluTov Dec 09 '13 at 19:32
  • @NSAddict - Right, that's why you need to avoid taking the output image as an NSImage and setting it to your view. That's slow. Instead, use a GPUImageView (if working with GPUImage) and feed the output from your filter directly to that. That will avoid the entire redraw process within the view. – Brad Larson Dec 09 '13 at 19:57
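
Summarising Brad Larson's suggestion from the thread above: render the filter output directly into a `GPUImageView` so the image never leaves the GPU. A hedged sketch (the `_blurredView` outlet and the `_stillImageSource` ivar are assumed names, not part of the original code):

    // Sketch: target a GPUImageView instead of extracting an NSImage.
    GPUImagePicture *stillImageSource =
        [[GPUImagePicture alloc] initWithImage:backgroundImage];
    GPUImageGaussianBlurFilter *gaussianBlurFilter =
        [[GPUImageGaussianBlurFilter alloc] init];
    gaussianBlurFilter.blurRadiusInPixels = 10.0;

    [stillImageSource addTarget:gaussianBlurFilter];
    // _blurredView is a GPUImageView placed in the window; rendering into
    // it avoids the expensive GPU -> CPU -> NSImage readback entirely.
    [gaussianBlurFilter addTarget:_blurredView];
    [stillImageSource processImage];

    // Keep a strong reference: GPUImage targets do not retain their sources.
    _stillImageSource = stillImageSource;

With this arrangement, the remaining cost is loading the image from disk and uploading it to the GPU; per the comments, that upload is the part that still needs optimising, not the blur.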

0 Answers