
I have a Linux board with a camera connected to it that records everything to MP4 files on the board's SD card. I'm using GStreamer, which reads from the /dev/video1 source and uses H.264 encoding. I run it with a command similar to this one:

gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-h264,width=640,height=480,framerate=30/1 ! h264parse ! rtph264pay ! udpsink host={host} port={port}
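(For context, the recording and the UDP stream can come from a single pipeline with a `tee`, so the camera's H.264 output is not encoded twice. The file path is a placeholder and this exact variant is untested on my board:)

```shell
# Hypothetical combined pipeline (untested): split the already-encoded
# H.264 stream with tee, writing one branch to MP4 on the SD card and
# sending the other over RTP/UDP. No re-encoding, so CPU cost stays low.
# -e makes mp4mux finalize the file cleanly on Ctrl-C.
gst-launch-1.0 -e v4l2src device=/dev/video1 \
  ! video/x-h264,width=640,height=480,framerate=30/1 \
  ! h264parse ! tee name=t \
    t. ! queue ! mp4mux ! filesink location=/media/sd/recording.mp4 \
    t. ! queue ! rtph264pay ! udpsink host={host} port={port}
```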

The upper part works fine and records everything locally, but I'd also like to stream this video to an HTML5 webpage, which is meant to change camera options and show a live preview.

I tried using HTTP via tcpsink and HLS via hlssink, but both resulted in an 8-10 second delay, which is basically unusable. The only thing that has no delay is the UDP sink. As far as I know, the only way to catch the UDP stream is to have a tool like FFmpeg in the middle that can convert the UDP stream to MJPEG, for instance, and serve it to the webpage.
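(The kind of FFmpeg relay I have in mind would look roughly like this; the port numbers are placeholders and I have not benchmarked it:)

```shell
# Hypothetical relay (untested): describe the incoming RTP/H.264 stream
# in an SDP file, then have FFmpeg transcode it to MJPEG and serve it
# over HTTP, where a plain <img src="http://board:8080/preview.mjpg">
# could display it. The MJPEG transcode is the CPU-heavy part.
cat > stream.sdp <<'EOF'
v=0
o=- 0 0 IN IP4 127.0.0.1
s=camera
c=IN IP4 127.0.0.1
t=0 0
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000
EOF

ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp \
       -c:v mjpeg -q:v 5 -f mpjpeg \
       -listen 1 http://0.0.0.0:8080/preview.mjpg
```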

That would probably work, but the board doesn't have a very good CPU and is already at 50% utilization. Converting the stream via FFmpeg would probably push it to 100%.

Is there any other way to stream to a webpage without delay?

LostInTheEcho
  • Change the board? – 0andriy Sep 28 '20 at 22:48
  • Heh, changing the board is unfortunately not an option. – LostInTheEcho Sep 29 '20 at 08:25
  • 1
    A WebRTC tased solution might be worth looking at - it is specifically designed for low latency/real time. There are example WebRTC Raspberry PI based cameras from some quick searches so this might be a good starting point. – Mick Sep 29 '20 at 10:42
  • I was looking at WebRTC a bit, but haven't really dived into it yet. Do you have any experience using WebRTC? Can you "catch" WebRTC directly in the browser? – LostInTheEcho Sep 29 '20 at 12:34

0 Answers