
I want to write a GStreamer pipeline where the video is rendered with OpenGL ES 2. My first implementation was a GStreamer appsink whose callback wrote each frame into a texture with glTexSubImage2D(), followed by rendering. It works, but it is extremely slow: glTexSubImage2D() is a very slow call in OpenGL ES 2.
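For reference, the slow path looks roughly like this: an appsink `new-sample` callback that maps the buffer and uploads one plane with glTexSubImage2D(). This is a minimal sketch; the names `tex_y`, `frame_w`, `frame_h` are my own placeholders, not from any real project:

```c
/* Sketch of the slow path: appsink callback uploading the Y plane
 * with glTexSubImage2D(). tex_y/frame_w/frame_h are illustrative. */
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <GLES2/gl2.h>

static GLuint tex_y;          /* pre-created GL_LUMINANCE texture */
static int frame_w, frame_h;  /* negotiated frame size            */

static GstFlowReturn on_new_sample(GstAppSink *sink, gpointer user_data)
{
    GstSample *sample = gst_app_sink_pull_sample(sink);
    GstBuffer *buf = gst_sample_get_buffer(sample);
    GstMapInfo map;

    if (gst_buffer_map(buf, &map, GST_MAP_READ)) {
        glBindTexture(GL_TEXTURE_2D, tex_y);
        /* This per-frame CPU-to-GPU copy is the bottleneck. */
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, frame_w, frame_h,
                        GL_LUMINANCE, GL_UNSIGNED_BYTE, map.data);
        gst_buffer_unmap(buf, &map);
    }
    gst_sample_unref(sample);
    return GST_FLOW_OK;
}
```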

I know there is a special glimagesink element for this purpose, but it requires a window ID to render into, and that is not what I want.

I need GStreamer to push each video frame into an OpenGL ES 2 texture, and then I will render it myself with my own OpenGL ES 2 shaders, my own transforms and conversions, etc.

I can create three DMABUF memory regions for the three YUV planes and map three textures (Y, U and V) onto them. The CPU writes into a DMABUF region and the data appears in the GPU texture.
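The usual way to map a DMABUF plane to a texture is through EGL_EXT_image_dma_buf_import: wrap the fd in an EGLImage, then bind the image as texture storage. A minimal sketch for one 8-bit plane, assuming you already have the fd, dimensions, and stride (error handling omitted; some drivers require GL_TEXTURE_EXTERNAL_OES instead of GL_TEXTURE_2D):

```c
/* Sketch: import one DMABUF plane (e.g. Y) as a GL texture via
 * EGL_EXT_image_dma_buf_import. No pixel copy happens anywhere here. */
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <drm/drm_fourcc.h>

GLuint import_plane(EGLDisplay dpy, int dmabuf_fd,
                    int width, int height, int stride)
{
    EGLint attribs[] = {
        EGL_WIDTH,                     width,
        EGL_HEIGHT,                    height,
        EGL_LINUX_DRM_FOURCC_EXT,      DRM_FORMAT_R8, /* 8-bit single plane */
        EGL_DMA_BUF_PLANE0_FD_EXT,     dmabuf_fd,
        EGL_DMA_BUF_PLANE0_OFFSET_EXT, 0,
        EGL_DMA_BUF_PLANE0_PITCH_EXT,  stride,
        EGL_NONE
    };

    PFNEGLCREATEIMAGEKHRPROC eglCreateImageKHR =
        (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
    PFNGLEGLIMAGETARGETTEXTURE2DOESPROC glEGLImageTargetTexture2DOES =
        (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
            eglGetProcAddress("glEGLImageTargetTexture2DOES");

    /* Wrap the dmabuf fd in an EGLImage (zero-copy). */
    EGLImageKHR img = eglCreateImageKHR(dpy, EGL_NO_CONTEXT,
                                        EGL_LINUX_DMA_BUF_EXT, NULL, attribs);

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* Use the EGLImage as the texture's storage -- still zero-copy. */
    glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, img);
    return tex;
}
```

Repeating this for the U and V fds gives the three textures your YUV shader samples from.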

So I would like GStreamer (after the HW video decoder) to simply write finished frames into my memory pointers. But how do I do this?

I could write my own GStreamer appsink that copies each frame into my DMABUF pointers, but that is an extra copy I want to avoid.
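One way to avoid the copy entirely is to reverse the direction: instead of making GStreamer write into your own DMABUFs, ask the decoder to output DMABuf-backed buffers and import its fds on the GL side. A hedged sketch, assuming a V4L2 decoder (`v4l2h264dec`) that supports `memory:DMABuf` caps; the file name and pipeline are illustrative:

```c
/* Sketch: request DMABuf output from the decoder and read the fd in an
 * appsink -- no CPU copy of pixel data. Adapt element names and caps
 * to your platform's decoder. */
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/allocators/gstdmabuf.h>

static GstFlowReturn on_new_sample(GstAppSink *sink, gpointer user_data)
{
    GstSample *sample = gst_app_sink_pull_sample(sink);
    GstBuffer *buf = gst_sample_get_buffer(sample);
    GstMemory *mem = gst_buffer_peek_memory(buf, 0);

    if (gst_is_dmabuf_memory(mem)) {
        int fd = gst_dmabuf_memory_get_fd(mem);
        /* Hand this fd to EGL/GLES (e.g. EGLImage import) instead of
         * copying the pixels. */
        g_print("got dmabuf fd %d\n", fd);
    }
    gst_sample_unref(sample);
    return GST_FLOW_OK;
}

int main(int argc, char **argv)
{
    gst_init(&argc, &argv);
    GstElement *pipe = gst_parse_launch(
        "filesrc location=video.mp4 ! qtdemux ! h264parse ! v4l2h264dec "
        "! video/x-raw(memory:DMABuf),format=NV12 "
        "! appsink name=sink emit-signals=true", NULL);
    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipe), "sink");
    g_signal_connect(sink, "new-sample", G_CALLBACK(on_new_sample), NULL);
    gst_element_set_state(pipe, GST_STATE_PLAYING);
    /* ... run a GMainLoop here, then shut down and unref ... */
    return 0;
}
```

Whether this works depends on the decoder: not every hardware decoder can export DMABUFs, and the plane layout (NV12 vs. three-plane YUV) is decided by the driver, not by you.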

There are some nice presentations and even YouTube videos about GStreamer zero-copy to display, but they contain no details and no code samples. How do I start?

I have already searched all over the internet but cannot find a suitable point to start from. Can someone give me advice?

nckm
