I'd like to transcode video from H.264 to MJPEG on the Raspberry Pi.
Schematically, there's a process that feeds a raw H.264 stream into one pipe and reads the same stream back out, in real time, encoded as MJPEG, from another pipe. Currently, the black box that takes in H.264 and spits out MJPEG is a simple transcoding program built with FFmpeg. This approach works well on modern workstations, but it is too CPU-taxing on the Raspberry Pi.
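For reference, a minimal sketch of what that FFmpeg black box might look like, reading raw H.264 from stdin and writing MJPEG to stdout (the quality value and pipe wiring are illustrative, not my exact setup):

```shell
# Read raw (Annex-B) H.264 from stdin, software-transcode to MJPEG on stdout.
# -q:v controls JPEG quality (lower = better); value here is illustrative.
ffmpeg -f h264 -i pipe:0 -c:v mjpeg -q:v 5 -f mjpeg pipe:1
```

Everything here runs in software on the ARM core, which is what makes it too slow on the Pi.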
I know that the Pi supports hardware H.264 decoding via OpenMAX, but it doesn't support MJPEG encoding out of the box.
1. Is there a way (however complicated) to do hardware MJPEG encoding as well?
2. Are there examples of how to do this kind of transcoding with OpenMAX?
(I'm aware of the existence of this project, but it doesn't answer question 1.)