I am developing a Qt app for iOS, Android & OSX.

Situation:
I have an std::vector of QOpenGLFramebufferObjects. Each QOpenGLFramebufferObject can of course provide its own QImage (via toImage()) or a GLuint texture (via takeTexture()). So you can also say that I have a collection of QImages or GLuint textures.
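
For context, this is roughly how I pull the frames out as QImages at the moment (a minimal sketch; I store pointers because QOpenGLFramebufferObject is non-copyable, and a valid OpenGL context is current when this runs):

```cpp
#include <vector>
#include <QImage>
#include <QOpenGLFramebufferObject>

// Sketch: read each FBO's colour attachment back into a QImage.
// toImage() does a GPU read-back, so an OpenGL context must be current.
std::vector<QImage> framesFromFbos(const std::vector<QOpenGLFramebufferObject*>& fbos)
{
    std::vector<QImage> frames;
    frames.reserve(fbos.size());
    for (QOpenGLFramebufferObject* fbo : fbos)
        frames.push_back(fbo->toImage());
    return frames;
}
```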

Problem:
Now, I want to create an .mp4 video file out of these frames, and it needs to work at least on iOS, Android & OSX.

How should I do this? Any examples doing this with Qt? Which classes in Qt should I be looking into?

I am fine with ffmpeg or GStreamer, whichever works with Qt. But I need to know how to pass these QImages or GLuint textures into the required component or API to create the video.
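
To make it concrete, here is a minimal sketch of the kind of thing I have in mind: piping raw RGBA frames into an ffmpeg executable and letting it encode an H.264 .mp4. This is just my guess, assuming an ffmpeg binary is available to the app and that all frames share one size; it is not working code I already have.

```cpp
#include <vector>
#include <QImage>
#include <QProcess>
#include <QStringList>

// Hypothetical sketch: feed raw RGBA frames to an external ffmpeg process
// through its stdin and let it encode them into an .mp4 file.
bool encodeWithFfmpeg(const std::vector<QImage>& frames,
                      const QString& outputPath, int fps)
{
    if (frames.empty())
        return false;

    const QSize size = frames.front().size();
    QProcess ffmpeg;
    ffmpeg.start("ffmpeg", QStringList()
        << "-y"
        << "-f" << "rawvideo"             // raw frames, no container
        << "-pix_fmt" << "rgba"           // matches QImage::Format_RGBA8888
        << "-s" << QString("%1x%2").arg(size.width()).arg(size.height())
        << "-r" << QString::number(fps)
        << "-i" << "-"                    // read frames from stdin
        << "-c:v" << "libx264"
        << "-pix_fmt" << "yuv420p"        // widest player compatibility
        << outputPath);
    if (!ffmpeg.waitForStarted())
        return false;

    for (const QImage& frame : frames) {
        const QImage rgba = frame.convertToFormat(QImage::Format_RGBA8888);
        ffmpeg.write(reinterpret_cast<const char*>(rgba.constBits()),
                     rgba.sizeInBytes());  // byteCount() before Qt 5.10
    }
    ffmpeg.closeWriteChannel();
    return ffmpeg.waitForFinished(-1) && ffmpeg.exitCode() == 0;
}
```

I realize spawning an external process like this is probably only realistic on OSX; on iOS and Android I would presumably have to link the ffmpeg (or GStreamer) libraries directly instead, which is exactly the part I need guidance on.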

Should I use QVideoEncoderSettings to create the video?

TheWaterProgrammer
  • I doubt Qt is ready to accomplish this, but I myself operate with GStreamer programmatically (what GStreamer can do for you: https://stackoverflow.com/questions/21152303/how-to-use-gstreamer-to-save-webcam-video-to-file). Look at QtGStreamer (https://gstreamer.freedesktop.org/data/doc/gstreamer/head/qt-gstreamer/html/) or write your own framework (which I did). – Alexander V Sep 27 '17 at 15:00
  • Strange. Qt has this whole [Multimedia](http://doc.qt.io/qt-5/qtmultimedia-index.html) module, but nothing to support creating a video from simple QImages. I am OK with using GStreamer if required. How can I try this [QtGStreamer](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/qt-gstreamer/html/)? Does it work on Android & iOS? – TheWaterProgrammer Sep 27 '17 at 15:38
  • The problem with the Multimedia module is that it may or may not work, and we will never know why. I cannot use it, especially in an embedded project. – Alexander V Sep 27 '17 at 16:14

0 Answers