
I have a Janus (WebRTC) server, and I am using VP8/OPUS. Janus forwards the RTP packets to GStreamer. I have two questions.

Do I have to run one GStreamer process (with multiple threads) or multiple GStreamer processes? Janus sends multiple RTP streams to GStreamer. For example, if two peers are in a WebRTC room, Janus sends 4 RTP streams to GStreamer: peer1 video/audio and peer2 video/audio. If I run just one GStreamer process, it is not possible to tell which stream comes from which peer. So to separate them, I have to use different ports with multiple GStreamer processes.

Like this:

Process1:

gst-launch-1.0 \
  rtpbin name=rtpbin \
  udpsrc name=videoRTP port=5000 \
    caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
  ! rtpvp8depay ! webmmux ! queue \
  ! filesink location=track1.webm \
  udpsrc port=5002 \
    caps="application/x-rtp, media=audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
  ! rtpopusdepay ! opusparse ! oggmux \
  ! filesink location=audio.ogg

Process2:

gst-launch-1.0 \
  rtpbin name=rtpbin \
  udpsrc name=videoRTP port=5003 \
    caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
  ! rtpvp8depay ! webmmux ! queue \
  ! filesink location=track2.webm \
  udpsrc port=5005 \
    caps="application/x-rtp, media=audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
  ! rtpopusdepay ! opusparse ! oggmux \
  ! filesink location=audio2.ogg
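For comparison, the single-process alternative I am considering would combine all four streams in one pipeline, since each udpsrc still listens on its own port (a sketch only; same ports and caps as above, output filenames are placeholders I made up, untested):

```shell
# One gst-launch process handling both peers; streams are still
# distinguished by port, one udpsrc per RTP stream.
# peer1.*/peer2.* filenames are placeholders.
gst-launch-1.0 \
  udpsrc port=5000 caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
    ! rtpvp8depay ! webmmux ! queue ! filesink location=peer1.webm \
  udpsrc port=5002 caps="application/x-rtp, media=audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
    ! rtpopusdepay ! opusparse ! oggmux ! filesink location=peer1.ogg \
  udpsrc port=5003 caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
    ! rtpvp8depay ! webmmux ! queue ! filesink location=peer2.webm \
  udpsrc port=5005 caps="application/x-rtp, media=audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
    ! rtpopusdepay ! opusparse ! oggmux ! filesink location=peer2.ogg
```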

So I am confused: multiple threads or multiple processes? Please give me the details!

How do I mux VP8/OPUS into an MP4 container in real time? I have searched for this for a long time but haven't managed it yet. GStreamer has so many options in each version. I am waiting for your advice! Thank you.

I've tried as much as I can.

I expect a working approach and MP4 files.


1 Answer


Hi, one solution may be the tee plugin,

as found on the help pages:

Description

Split data to multiple pads. Branching the data flow is useful when e.g. capturing a video where the video is shown on the screen and also encoded and written to a file. Another example is playing music and hooking up a visualisation module.

One needs to use separate queue elements (or a multiqueue) in each branch to provide separate threads for each branch. Otherwise a blocked dataflow in one branch would stall the other branches.

Example launch line


gst-launch-1.0 filesrc location=song.ogg ! decodebin ! tee name=t \
  ! queue ! audioconvert ! audioresample ! autoaudiosink \
  t. ! queue ! audioconvert ! goom ! videoconvert ! autovideosink

Play the song.ogg audio file, which must be in the current working directory, and render visualisations using the goom element (this can be done more easily with the playbin element; this is just an example pipeline).
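Applied to your recording setup, tee would let one branch write the file while another branch does something else with the same stream, for example a live preview. A rough sketch based on the port 5000 video branch from your question (untested; the preview branch is just an illustration, note each branch gets its own queue):

```shell
# Split the depayloaded VP8 stream with tee:
# branch 1 records to WebM, branch 2 decodes and displays.
# Separate queues keep a stall in one branch from blocking the other.
gst-launch-1.0 \
  udpsrc port=5000 caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
  ! rtpvp8depay ! tee name=t \
  t. ! queue ! webmmux ! filesink location=track1.webm \
  t. ! queue ! vp8dec ! videoconvert ! autovideosink
```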
