Heads up: The goal of my project is to replace a regular Intel Core PC with a Raspberry Pi 4
I have a camera simulation that runs well on an Intel PC. It takes MP4 files and re-encodes them to JPEG with jpegenc. Using GStreamer and its plugins, namely qtdemux and avdec_h264, this works nicely. There is also the option of using vaapih264dec and its hardware JPEG encoder counterpart. This matters because CPU usage is very high with the non-hardware-accelerated plugins: the program runs on the Pi as well, but with only 4 cameras all 4 cores sit at 100% usage.
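For reference, a minimal gst-launch sketch of the working software-only path described above (my real program builds the pipeline in code; the file path and the multifilesink at the end are just placeholders to make the command self-contained):

```shell
# Software-only path (works, but CPU-heavy): demux the MP4,
# decode H.264 on the CPU, convert, and re-encode each frame to JPEG.
gst-launch-1.0 filesrc location=/home/pi/test.mp4 \
  ! qtdemux ! h264parse ! avdec_h264 \
  ! videoconvert ! jpegenc \
  ! multifilesink location=frame-%05d.jpg
```

On the Intel box, swapping avdec_h264 for vaapih264dec (and jpegenc for its VAAPI counterpart) is what keeps the CPU usage down.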
Now I have been researching quite a lot, and the first answer I found was to use omxh264dec, since that is the Raspberry Pi counterpart to vaapih264dec (or so I'm assuming). I can't get this to work, and no matter what I try, the pipeline simply won't build.
I have tried:
- Swapping the demuxer
- Changing the decoder and encoder (no combination other than the CPU-based ones seemed to work)
- Asking on the GStreamer forum (I was just told that it doesn't work that way, but got no clue where to start looking instead)
- Building the pipeline with gst-launch outside of my program, but even that doesn't seem to work with omxh264dec
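One more candidate I have seen mentioned for the Pi 4 is the V4L2 stateful decoder element from gstreamer1.0-plugins-good. I can't say whether it is available in every build or whether it behaves differently here, so treat this as an untested sketch:

```shell
# Untested variant: v4l2h264dec talks to the Pi's hardware decoder
# through the V4L2 API instead of the OMX wrapper.
gst-launch-1.0 filesrc location=/home/pi/test.mp4 \
  ! qtdemux ! h264parse ! v4l2h264dec \
  ! videoconvert ! autovideosink
```

Whether the element exists can be checked with `gst-inspect-1.0 v4l2h264dec`.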
Pipeline:
gst-launch-1.0 filesrc location=/home/pi/test.mp4 ! qtdemux ! h264parse ! omxh264dec ! autovideosink
gives this error:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux0: Internal data stream error.
Additional debug info:
qtdemux.c(6073): gst_qtdemux_loop (): /GstPipeline:pipeline0/GstQTDemux:qtdemux0:
streaming stopped, reason not-negotiated (-4)
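Since not-negotiated means two linked elements can't agree on caps, one variant worth noting is forcing a format conversion after the decoder. Again untested, just a sketch of the idea:

```shell
# Sketch: insert videoconvert so the decoder's raw output format
# (hardware decoders often emit NV12/I420) can be accepted by the sink.
gst-launch-1.0 filesrc location=/home/pi/test.mp4 \
  ! qtdemux ! h264parse ! omxh264dec \
  ! videoconvert ! autovideosink
```

Running the pipeline with `GST_DEBUG=3` should also show which link actually fails to negotiate.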
So my question really is: is it somehow possible to use GStreamer to play OMX-decoded footage on the Pi, and if not, how can I reduce the CPU load of my program so the RPi doesn't end up dying?