
I am working on developing some FxPlug plugins for Motion and FCP X. Ultimately, I'd like to have them render in Metal as Apple is deprecating OpenGL.

I'm currently using CoreImage, and while I've been able to use the CoreImage functionality to do Metal processing outside of the FxPlug SDK, FxPlug only provides me the frame as an OpenGL texture. I've tried just passing this into the CoreImage filter, but I end up getting this error:

Cannot render image (with an input GL texture) using a metal-DG context.

After some research, I found that CVPixelBuffers can supposedly be used to share textures between the two APIs. After spending a while trying to write code using this method, though, I've come to believe it was intended as a way to WRITE (as in, create from scratch) to a shared buffer, not to convert an existing texture. I may be wrong about that, but I cannot find a way to get an existing GL texture into a CVPixelBuffer.

TL;DR: I've found ways to get a resulting Metal or OpenGL texture FROM a CVPixelBuffer, but I cannot find a way to create a CVPixelBuffer from an existing OpenGL texture. My heart is not set on this method, as my ultimate goal is to simply convert from OpenGL to Metal, then back to OpenGL (ideally in an efficient way).

Has anyone else found a way to work with FxPlug with Metal? Is there a good way to convert from an OpenGL texture to Metal/CVPixelBuffer?

genpfault
mredig
  • You can create a shared texture and use OpenGL to copy from the FxPlug-provided OpenGL texture to that shared texture, use Metal to do whatever else you want to do to the shared texture, and then use OpenGL to copy it back to the FxPlug-provided texture. – Ken Thomases Dec 18 '18 at 19:51
  • I guess my problem then would be that I don't know how to copy it to the shared texture. Do you have any sample code? The FxTexture provides `(GLuint)textureId;` and `(GLenum)target;` and I've found no way to utilize either of those to copy to the CVPixelBuffer. – mredig Dec 20 '18 at 00:28

1 Answer

I have written an FxPlug that uses both OpenGL textures and Metal textures. The thing you're looking for is an IOSurface. An IOSurface is a shared buffer that can back either a Metal or an OpenGL texture, though with some limitations. As such, if you already have a Metal or OpenGL texture, you must copy it into an IOSurface-backed texture to use it with the other system.

To create an IOSurface, you can either use CVPixelBuffers (by including kCVPixelBufferIOSurfacePropertiesKey in the buffer's attributes) or create one directly using the IOSurface class defined in <IOSurface/IOSurfaceObjC.h>.
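The CVPixelBuffer route might look roughly like this (a sketch only — `width`, `height`, and the BGRA pixel format are assumptions; adapt them to your frame):

```objc
#import <CoreVideo/CoreVideo.h>

// Asking for IOSurface backing via kCVPixelBufferIOSurfacePropertiesKey
// (an empty dictionary requests default IOSurface properties).
NSDictionary *attrs = @{
    (NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{},
    (NSString *)kCVPixelBufferMetalCompatibilityKey  : @YES,
    (NSString *)kCVPixelBufferOpenGLCompatibilityKey : @YES,
};

CVPixelBufferRef pixelBuffer = NULL;
CVReturn err = CVPixelBufferCreate(kCFAllocatorDefault,
                                   width, height,
                                   kCVPixelFormatType_32BGRA,
                                   (__bridge CFDictionaryRef)attrs,
                                   &pixelBuffer);
if (err == kCVReturnSuccess) {
    // Non-NULL because we requested IOSurface backing above.
    IOSurfaceRef surface = CVPixelBufferGetIOSurface(pixelBuffer);
    // ... hand `surface` to both the OpenGL and Metal sides ...
}
```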

Once you have an IOSurface, you can copy your OpenGL texture into it by getting an OpenGL texture from the IOSurface via CGLTexImageIOSurface2D() (defined in <OpenGL/CGLIOSurface.h>). You then use that texture as the backing texture for an FBO and, for example, draw a textured quad into it using the input FxTexture as the source texture. Be sure to call glFlush() when done!
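A sketch of that GL-side copy, assuming `cglContext`, `surface`, `width`, and `height` already exist (the rectangle-texture target and BGRA format here are assumptions that must match how the IOSurface was created):

```objc
// Bind the IOSurface as a rectangle texture.
GLuint surfaceTex = 0, fbo = 0;
glGenTextures(1, &surfaceTex);
glBindTexture(GL_TEXTURE_RECTANGLE, surfaceTex);
CGLTexImageIOSurface2D(cglContext, GL_TEXTURE_RECTANGLE, GL_RGBA,
                       width, height,
                       GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV,
                       surface, 0 /* plane */);

// Attach it to an FBO so we can render into the IOSurface.
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_RECTANGLE, surfaceTex, 0);

// ... bind the input FxTexture ([inputTexture textureId] with
// [inputTexture target]) and draw a textured quad covering the FBO ...

glFlush();  // make the pixels visible to Metal before touching the IOSurface there
```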

Next, take the IOSurface and create an MTLTexture from it via -[MTLDevice newTextureWithDescriptor:iosurface:plane:] (described here). You'll also want to create an output IOSurface to draw into and create an MTLTexture from that, too. Do your Metal rendering into the output MTLTexture. Then take the output IOSurface and create an OpenGL texture from it via CGLTexImageIOSurface2D(). Finally, copy that OpenGL texture into the output FxTexture, either by using it as the backing of a texture-backed FBO or by whatever other method you prefer.
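Wrapping the IOSurface on the Metal side might look like this (again a sketch — the pixel format and usage flags are assumptions and must match the IOSurface):

```objc
#import <Metal/Metal.h>

id<MTLDevice> device = MTLCreateSystemDefaultDevice();
MTLTextureDescriptor *desc =
    [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatBGRA8Unorm
                                                       width:width
                                                      height:height
                                                   mipmapped:NO];
desc.usage = MTLTextureUsageShaderRead | MTLTextureUsageRenderTarget;

// The MTLTexture shares storage with the IOSurface: no copy happens here.
id<MTLTexture> metalTex = [device newTextureWithDescriptor:desc
                                                 iosurface:surface
                                                     plane:0];
// Render with Metal into a second, output IOSurface-backed texture created the
// same way, then wrap that output IOSurface via CGLTexImageIOSurface2D() to
// copy it into the output FxTexture on the GL side.
```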

As you can see, the downside of this is that each render requires 2 copies: 1 of the input into an IOSurface, and 1 of the output IOSurface into the output texture the app gives you. The other downside is that this may all be moot soon: since Apple has publicly announced that they're ending support for OpenGL, they're probably already working on a Metal-based solution, so doing all this yourself may be extra work. (The upside is that you can reuse the same code in other host applications that only support OpenGL.)

user1118321
  • I assume that starting from the FxTexture provided by the host app would involve what you said about copying the texture into an IOSurface. Would that require using a CVPixelBuffer at all in that scenario? Also, does the texture duplication have any performance cost, or does Metal make up for it overall? – mredig Jan 22 '19 at 06:29
  • Yes, you need to copy the `inputImage` sent to your `-renderOutput:withInput:withInfo:` method. It will involve using an `IOSurface` which you can create via `CVPixelBuffer` if you want, but you could also create it without the `CVPixelBuffer`. – user1118321 Jan 22 '19 at 06:31
  • There is a way described in Apple's sample code: https://developer.apple.com/documentation/metal/mixing_metal_and_opengl_rendering_in_a_view – Aliaksandr Andrashuk Dec 15 '20 at 09:10