
I was wondering how best to achieve a live video stream to (ideally multiple) clients on a website. One important factor is that low latency is crucial for the video in this web app. The video stream should be realized using GStreamer.

My first try to achieve this was streaming from GStreamer straight to an HTML5 `<video>`/video.js tag. This worked; however, the video always had a delay of a couple of seconds, so this won't be viable.
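
For illustration, a rough sketch of that kind of setup (an assumption on my side: HLS-style segmented delivery into video.js, which is what typically causes a delay of several seconds; camera device and output paths are placeholders):

```python
# Sketch only: segment-based (HLS) delivery into an HTML5/video.js player.
# Every segment the player buffers adds latency, hence the multi-second delay.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# v4l2src is assumed as the camera source; adjust for your hardware.
pipeline = Gst.parse_launch(
    "v4l2src ! videoconvert ! x264enc tune=zerolatency ! mpegtsmux "
    "! hlssink playlist-location=/tmp/stream/playlist.m3u8 "
    "location=/tmp/stream/segment%05d.ts target-duration=2"
)
pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()
```

The playlist and segments then still have to be served over HTTP and loaded by the video.js player.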

After that, I found that WebRTC is probably the technology made for this, as I saw some interesting links like this one or this one here.

Also, this blog mentioned that an implementation of WebRTC was added to GStreamer.
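
A quick way to check whether that implementation (the `webrtcbin` element, shipped with GStreamer's "bad" plugin set since 1.14) is available on the machine:

```python
# Check that GStreamer's WebRTC element is installed.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
if Gst.ElementFactory.find("webrtcbin") is None:
    print("webrtcbin not found - install the GStreamer 'bad' plugins")
else:
    print("webrtcbin available:", Gst.version_string())
```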

As there aren't too many examples and I am a beginner with these technologies, I am not sure how best to get started with this. That's why I would really appreciate any help.

I am unsure about the extent of the WebRTC implementation in GStreamer mentioned above. Does it already offer everything I need? Or do I need a dedicated WebRTC server like Kurento?

UPDATE 1:

Also, in case it is useful information: the application this stream should be included in uses Vue.js in the frontend and Flask in the backend. These two otherwise communicate mainly over WebSockets. I also found this implementation of WebRTC for Python. But again, I am not sure whether it would solve my problem and what components I need to achieve my goal.
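
From the examples I have seen, the server side with that Python implementation would presumably look roughly like this (just a sketch, assuming Python 3, that the browser's SDP offer reaches the server over the existing WebSocket, and a local V4L2 camera; `handle_offer` is only an illustrative name, not part of any library):

```python
# Sketch of an aiortc-based answering side; the signaling transport is not shown.
import asyncio
from aiortc import RTCPeerConnection, RTCSessionDescription
from aiortc.contrib.media import MediaPlayer

async def handle_offer(offer_sdp: str) -> str:
    pc = RTCPeerConnection()

    # MediaPlayer wraps FFmpeg; here it reads a webcam directly (assumption).
    player = MediaPlayer("/dev/video0", format="v4l2")
    pc.addTrack(player.video)

    await pc.setRemoteDescription(
        RTCSessionDescription(sdp=offer_sdp, type="offer"))
    answer = await pc.createAnswer()
    await pc.setLocalDescription(answer)

    # This SDP answer goes back to the browser over the same WebSocket,
    # e.g. asyncio.run(handle_offer(offer_sdp_from_websocket)).
    return pc.localDescription.sdp
```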

Please feel free to also suggest other technologies if they might be more suitable. Thanks in advance!

NotARobot
  • Where is the video being captured from? Are you trying to stream files, is a user sharing a webcam etc... that will change the answer to this question a lot. thanks! – Sean DuBois Jul 02 '20 at 20:55
  • @Sean DuBois Oh, I did not know that this would make a difference. The stream should come directly from a camera and show live pictures from there. – NotARobot Jul 03 '20 at 04:15
  • I think aiortc is a great solution! There are lots of WebRTC implementations available, you should use what works best for you. https://stackoverflow.com/questions/59865405/use-webrtc-getusermedia-stream-as-input-for-ffmpeg/59906617#59906617 -- It is also possible to run Chromium headless and do `getUserMedia`, another clever solution I have seen people use. – Sean DuBois Jul 03 '20 at 06:56
  • So I came back to this project after some time. Aiortc sadly was not an option, as I saw it only works with Python 3 and I am forced to use Python 2 in this project due to other dependencies. (Is there a way to make this work anyway?) – NotARobot Oct 15 '20 at 09:50
  • Also, I don't know if I understood getUserMedia incorrectly, but I tried something with that already, using a webcam. The user then had to accept in a dialog that they want to give access to the camera. However, this is not a viable solution, as the camera I am planning to use will not be directly connected to a PC with the browser on it. Or does this somehow work with accessing a GStreamer pipeline too? – NotARobot Oct 15 '20 at 09:52

1 Answer


If you are able to push data into a GStreamer pipeline and use webrtcbin, this should do the job. Search for the WebRTC "sendrecv" sample for GStreamer and you will understand what to do. There are also a few videos on YouTube explaining how GStreamer works in general and how webrtcbin works in particular.
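
A reduced sketch of that sendrecv-style setup in Python (assumptions: a V4L2 camera as the source and VP8 encoding; the signaling part, i.e. exchanging the SDP offer/answer and ICE candidates with the browser, is omitted and has to be added by your app, for example over a WebSocket):

```python
# Reduced sketch based on the upstream webrtcbin "sendrecv" example.
import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstWebRTC", "1.0")
gi.require_version("GstSdp", "1.0")
from gi.repository import Gst, GstWebRTC, GstSdp, GLib

Gst.init(None)

PIPELINE = (
    "webrtcbin name=sendrecv stun-server=stun://stun.l.google.com:19302 "
    "v4l2src ! videoconvert ! queue ! vp8enc deadline=1 ! rtpvp8pay ! queue "
    "! application/x-rtp,media=video,encoding-name=VP8,payload=96 ! sendrecv."
)

def on_offer_created(promise, webrtc):
    # The created offer becomes the local description; its SDP text is what
    # you forward to the browser through your signaling channel.
    promise.wait()
    offer = promise.get_reply().get_value("offer")
    webrtc.emit("set-local-description", offer, Gst.Promise.new())
    print(offer.sdp.as_text())

def on_negotiation_needed(webrtc):
    promise = Gst.Promise.new_with_change_func(on_offer_created, webrtc)
    webrtc.emit("create-offer", None, promise)

pipe = Gst.parse_launch(PIPELINE)
webrtc = pipe.get_by_name("sendrecv")
webrtc.connect("on-negotiation-needed", on_negotiation_needed)
# A complete version also connects "on-ice-candidate" and applies the
# browser's answer via the "set-remote-description" signal.
pipe.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()
```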

pvv