I have a Linux board with a camera connected to it that records everything to MP4 files on the board's SD card. I'm using GStreamer, which reads from the /dev/video1 source and uses H.264 encoding. I run it with a command similar to this one:
gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-h264,width=640,height=480,framerate=30/1 ! h264parse ! rtph264pay ! udpsink host={host} port={port}
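For context, the recording and the UDP stream could come from one pipeline; a minimal sketch using `tee` (the file path, host, and port here are placeholders, not my exact command):

```shell
# Hedged sketch: split the camera's H.264 stream with tee so one branch
# records to MP4 while the other streams over RTP/UDP.
# -e sends EOS on Ctrl-C so mp4mux can finalize the file properly.
gst-launch-1.0 -e v4l2src device=/dev/video1 \
  ! video/x-h264,width=640,height=480,framerate=30/1 \
  ! h264parse ! tee name=t \
  t. ! queue ! mp4mux ! filesink location=/media/sdcard/capture.mp4 \
  t. ! queue ! rtph264pay ! udpsink host=192.168.1.10 port=5000
```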
The above works fine and records everything locally, but I'd also like to stream this video to an HTML5 webpage, which is meant to change camera options and show a live preview.
I tried HTTP via tcpserversink and HLS via hlssink, but both resulted in an 8-10 second delay, which is basically unusable. The only sink with no noticeable delay is udpsink. As far as I know, the only way to get the UDP stream into a browser is to put a tool like FFmpeg in the middle, which can convert the RTP/UDP stream to MJPEG, for instance, and serve it to the webpage.
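For reference, the FFmpeg-in-the-middle idea would look roughly like this; the SDP contents, port numbers, and quality setting are assumptions for illustration. FFmpeg can read the incoming RTP/H.264 stream via an SDP description and re-serve it as multipart MJPEG over HTTP:

```shell
# Hedged sketch of the FFmpeg relay. stream.sdp describes the RTP/H.264
# stream that udpsink sends (port must match the GStreamer pipeline):
#   v=0
#   o=- 0 0 IN IP4 127.0.0.1
#   s=camera
#   c=IN IP4 127.0.0.1
#   t=0 0
#   m=video 5000 RTP/AVP 96
#   a=rtpmap:96 H264/90000
# Decode the stream, re-encode to MJPEG, and serve it over HTTP so a
# browser can display it with a plain <img> tag.
ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp \
  -c:v mjpeg -q:v 5 -f mpjpeg -listen 1 http://0.0.0.0:8081/preview
```

The page would then embed the preview as `<img src="http://board-address:8081/preview">`. But this is exactly the transcoding step I'm worried about CPU-wise.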
That would probably work, but the board doesn't have a very powerful CPU and is already at 50% utilization. Transcoding the stream with FFmpeg would probably push it to 100%.
Is there any other way to stream to a webpage without delay?