
Question:

How can I capture a view to be modified in Metal?

My (probably incomplete) understanding is that I need to capture the view as a texture, move it into device memory, do work on it with shaders, and then render it back into the view. I'm currently stuck on the first step of capturing the view. The scope of this question is only up to capturing the view as a texture and getting it into Metal; if I run into issues with shaders I'll open a separate question for that.
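
To make that first step concrete, here's the kind of capture I have in mind (an untested sketch; I'm only guessing that `drawHierarchy(in:afterScreenUpdates:)` plus `MTKTextureLoader` is the right path, and the function name is mine):

```swift
import UIKit
import MetalKit

// Untested sketch: snapshot a UIView into a bitmap, then upload it as an MTLTexture.
// `device` is assumed to be a valid MTLDevice created elsewhere.
func captureViewAsTexture(_ view: UIView, device: MTLDevice) -> MTLTexture? {
    // Render the view hierarchy into a UIImage.
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    let snapshot = renderer.image { _ in
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
    }
    guard let cgImage = snapshot.cgImage else { return nil }

    // Load the bitmap into device memory as a Metal texture.
    let loader = MTKTextureLoader(device: device)
    return try? loader.newTexture(cgImage: cgImage, options: nil)
}
```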

If there is an easier way to do this than dropping down to the GPU level, I'm open to suggestions.

Background:

For the purpose of learning, I'm trying to recreate the ripple effect seen in this short clip with Metal (without using Qt and OpenGL like in the video). As I understand it, I need a vertex shader and a fragment shader. I think I can probably translate the OpenGL shaders in the repo to Metal with some experimentation.

Apple supplies a very similar animation with iOS, the CATransition "rippleEffect", so it appears to be possible, but the APIs they use are private and will probably get your app rejected from the App Store.

I've also come across BCMeshTransformView, which appears to be very similar to what I want to do, but I haven't had any luck trying to glean how it works from the source.

I've spent a couple of hours searching for documentation, but I'm such a beginner at graphics in general that I'm probably not googling the right terms.

Erik

1 Answer


The easiest way to accomplish this is using CoreImage. CI offers a very wide range of effects that can be applied to your view fairly easily (here's the full list).

If none of those work for you, you can actually write a custom CIFilter using GLSL (the OpenGL Shading Language) that will run directly on the GPU.
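
For example, a minimal custom kernel might look something like this (just a sketch using `CIColorKernel` directly rather than a full `CIFilter` subclass; the kernel name and math are placeholders, not the ripple effect):

```swift
import CoreImage

// Sketch of a custom color kernel written in the Core Image Kernel Language (a GLSL dialect).
// The kernel body below is a trivial tint, only meant to show the mechanics.
let tintKernel = CIColorKernel(source: """
    kernel vec4 tint(__sample pixel) {
        return vec4(pixel.r, pixel.g * 0.5, pixel.b, pixel.a);
    }
    """)

// Usage, assuming `inputImage` is a CIImage:
// let output = tintKernel?.apply(extent: inputImage.extent, arguments: [inputImage])
```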

You can apply a given CIFilter to a UIView by grabbing a UIImage of the view, converting that to a CIImage, running your filter, then showing that CIImage on screen. I believe this would be easier than trying to do this with Metal, which is a far lower-level API.
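
Roughly, that round trip could look something like this (a sketch, not production code; `CIGaussianBlur` and its radius are just stand-ins for whatever effect you end up using):

```swift
import UIKit
import CoreImage

// Sketch: snapshot a UIView, run it through a built-in CIFilter, and get back a UIImage
// that you can display (e.g. in a temporary UIImageView overlaid on the original view).
func filteredSnapshot(of view: UIView) -> UIImage? {
    // 1. Grab a UIImage of the view hierarchy.
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    let snapshot = renderer.image { _ in
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
    }

    // 2. Convert to CIImage and run a filter.
    guard let input = CIImage(image: snapshot),
          let filter = CIFilter(name: "CIGaussianBlur") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(8.0, forKey: kCIInputRadiusKey)
    guard let output = filter.outputImage else { return nil }

    // 3. Render the filtered CIImage back into a UIImage.
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```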

For a CoreImage code sample that does this, please see How to apply CIFilter to UIView?

ldoogy
  • Thanks for your thorough response. Is this method fast enough to keep up with changes in the view, or does it assume the view to be static for the period of the animation? For example, if the user swipes on a table view and I start this effect at the same time, will it be able to show the table scrolling while this effect is being applied to it? My experience with rendering the view into a CGContext is that it’s a fairly heavy piece of processing and isn’t suitable for being used in rapid succession. – Erik Jun 26 '18 at 22:11
  • Yeah it may not be efficient enough for any kind of realtime updates (but I'd definitely give it a try). On macOS you could apply a `CIFilter` to your backing `CALayer`, but that isn't supported on iOS apparently... Unfortunately, even using Metal, I'm not aware of a very quick way to capture the contents of the view and its subviews into a texture. – ldoogy Jun 27 '18 at 00:38
  • @Erik curious if it ended up being fast enough with a scrolling tableview? – Aaron Ash May 13 '20 at 19:59
  • @Aaron, I never got as far as testing it out. – Erik May 14 '20 at 23:54