I am developing a Qt app for iOS, Android & OSX.
Situation:
I have an `std::vector` of `QOpenGLFramebufferObject`s. Each `QOpenGLFramebufferObject` can of course provide its own `QImage` or a `GLuint` texture by doing a `takeTexture()`. So you could also say that I have a collection of `QImage`s or `GLuint` textures.
Problem:
Now, I want to create an `.mp4` video file out of these that works at least on iOS, Android & OSX.
How should I do this? Any examples doing this with Qt? Which classes in Qt should I be looking into?
I am open to using ffmpeg or GStreamer, whichever works well with Qt. But I need to know how to pass these `QImage`s or `GLuint` textures into the required component or API to create the video.
Should I be using `QVideoEncoderSettings` to create the video?