
I am able to generate a CGImage from a thumbnail in MPMoviePlayer. What I want to do is apply a filter on the image and show it on the device as fast as possible (probably in a UIImageView).

The caveat here is that I need to apply the filter to every frame of the video so the user sees filtered images in a video stream, with no lag.

At the moment I get the thumbnail, apply my filter, and set my UIImageView.image to this filtered image. The filter works fine, the image shows up, but the app really lags. Is there any way to speed this up?
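The approach described above looks roughly like this (a minimal sketch using Core Image on iOS 5; the filter name, intensity, and `self.imageView` are illustrative, not from the question):

```objc
#import <CoreImage/CoreImage.h>

// Filter one frame's CGImage and push it into a UIImageView.
- (void)showFilteredFrame:(CGImageRef)frame {
    CIImage *input = [CIImage imageWithCGImage:frame];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:input forKey:kCIInputImageKey];
    [filter setValue:@0.8 forKey:kCIInputIntensityKey];

    // Creating a CIContext per frame is expensive — cache one if you keep this path.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef output = [context createCGImage:filter.outputImage
                                      fromRect:filter.outputImage.extent];
    self.imageView.image = [UIImage imageWithCGImage:output];
    CGImageRelease(output);
}
```

Even with the context cached, the CGImage → CIImage → CGImage → UIImage round-trip per frame is a lot of CPU/GPU copying, which is consistent with the lag you're seeing.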

I've also tried using a CADisplayLink, as this has helped me speed up multiple UIImages flying around on screen at once, but it doesn't do anything in this instance. Any help would be appreciated.

Thank you.

1 Answer


Use Brad Larson's GPUImage framework. In short, it's brilliant.

Here's an overview: The GPUImage framework is a BSD-licensed iOS library that lets you apply GPU-accelerated filters and other effects to images, live camera video, and movies. In comparison to Core Image (part of iOS 5.0), GPUImage allows you to write your own custom filters, supports deployment to iOS 4.0, and has a simpler interface. However, it currently lacks some of the more advanced features of Core Image, such as facial detection.

For massively parallel operations like processing images or live video frames, GPUs have some significant performance advantages over CPUs. On an iPhone 4, a simple image filter can be over 100 times faster to perform on the GPU than an equivalent CPU-based filter.

Here's the link to the Git repository, with details and a sample project that does live processing with many filters: https://github.com/BradLarson/GPUImage
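A minimal sketch of a GPUImage pipeline, assuming the framework is linked into your project: a camera source feeds a sepia filter, which renders directly into a GPUImageView, so no UIImage is created per frame. The session preset and the cast of `self.view` to a GPUImageView are illustrative assumptions.

```objc
#import "GPUImage.h"

// Camera → filter → on-screen view, all on the GPU.
GPUImageVideoCamera *camera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
camera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];

// Assumes the view controller's view is a GPUImageView (set in the storyboard).
GPUImageView *filteredView = (GPUImageView *)self.view;

[camera addTarget:sepiaFilter];
[sepiaFilter addTarget:filteredView];
[camera startCameraCapture];
```

For filtering an existing movie file rather than the live camera, the framework provides a GPUImageMovie source that can be wired into the same target chain.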

Pavan