
I'm currently working on a remotely controlled robot that is sending two camera streams from a Jetson Nano to a PC/Android Phone/VR Headset.

I've been able to create a stable link between the robot and PC using gst-rtsp-server running this pipeline:

./test-launch "nvarguscamerasrc sensor-id=1 ! video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1,format=NV12 ! nvvidconv flip-method=2 ! omxh264enc iframeinterval=15 ! h264parse ! rtph264pay name=pay0 pt=96"

And receiving it on PC using:

gst-launch-1.0 -v rtspsrc location=rtspt://192.168.1.239:8554/test ! application/x-rtp, payload=96 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false

On the PC there's an excellent latency of about ~120 ms, so I assumed there wouldn't be a problem running the same thing on Android. Using GStreamer's prebuilt binaries from here, plus a modification from here to be able to use rtspsrc, I've successfully managed to receive the RTSP stream. But this time the video is "slowed down" (probably some buffering problem, or missing HW acceleration?).

I worked my way around that using the latency=150 drop-on-latency=true parameters of rtspsrc, which keeps only the frames within that latency budget, but as expected the decoded image is trash.

So my question is: why is there such a difference between a phone and a PC receiving the same stream?

It seems that gst-rtsp-server defaults to sending via TCP, which I tried to change with gst_rtsp_media_factory_set_protocols(factory, GST_RTSP_LOWER_TRANS_UDP_MCAST), but after doing that I can no longer receive the stream even on the PC with the same pipeline.

Is there a way to force gst-rtsp-server to send via UDP? Or is there a way to optimize the phone's decoding performance so it runs as quickly as the PC does? (I have a Galaxy S10+, so I guess it should be able to handle that.)
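For reference, the lower transport can also be requested from the client side via the `protocols` property of rtspsrc. A sketch based on the PC receive pipeline above (the `udp` value selects unicast UDP; the `latency` value is illustrative):

```shell
# Request unicast UDP transport explicitly and shrink the jitterbuffer
# (rtspsrc defaults to a much larger latency).
gst-launch-1.0 -v rtspsrc location=rtsp://192.168.1.239:8554/test protocols=udp latency=150 \
    ! application/x-rtp, payload=96 ! rtph264depay ! avdec_h264 ! videoconvert \
    ! autovideosink sync=false
```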

My goal is clear video on the Android/VR headset with minimal latency (preferably the same ~120 ms as on the PC).

  • You may tell if the PC and Android devices are connected in the same way (is one using wired Ethernet while the other uses Wi-Fi?). The RTSP server uses TCP because your client query asked for that using `rtspt`, where the last `t` requests TCP transport. Just using `rtsp` instead should use UDP. You may have a look at the protocols property of rtspsrc for more details. Using netstat you may see the differences. – SeB Feb 10 '22 at 18:06
  • Both are using Wi-Fi. Oh, I didn't know I could choose the protocol from the client. Is there a way to look into the communication to verify which protocol is actually used? – Stanislav Svědiroh Feb 10 '22 at 18:13
  • As said above, on a Linux system you may check with netstat. Have a second terminal where you run the following for monitoring each second: `sudo watch -n 1 'netstat -laputen'`. – SeB Feb 10 '22 at 18:17
  • Also specify whether the problem is just latency (note that rtspsrc uses rtpjitterbuffer, which has a default latency of 2000 ms unless otherwise specified) or more a problem of keeping sync/throughput. – SeB Feb 10 '22 at 18:28
  • @SeB I've set up a network sniffer through my router and Wireshark and can confirm we're now running over UDP, thanks! Still trying to find the exact problem; my PC is keeping up with the stream just fine, but now it seems that GStreamer on my Android device is having a hard time decoding the video (10 FPS, ~20% usage on my 8-core CPU). Maybe `avdec_h264` isn't HW accelerated on my device? – Stanislav Svědiroh Feb 10 '22 at 18:44
  • You may use gst-inspect-1.0 to check which H.264 decoders are available on your receiver. Also note that there may be a limitation on the kernel's max socket buffer size. You may try this on the receiver side first (but it may apply on the sender side as well, especially for high pixel rates in MJPG). Try setting the `udp-buffer-size` property of rtspsrc to a huge value such as 32000000. – SeB Feb 10 '22 at 18:57
  • Also note that multicast is weird over Wi-Fi, especially at 2.4 GHz. You may want to use unicast over Wi-Fi. – SeB Feb 10 '22 at 19:00
  • Hm, I wonder how to call gst-inspect-1.0 from an Android app. Also, this question: https://stackoverflow.com/questions/41759418/gstreamer-hardware-accelerated-video-decoding-on-android suggests /etc/media_codecs.xml, but I don't have a file like that in my storage. I'll try to figure that out. Also, right now I have all protocols enabled (UDP unicast, UDP multicast, TCP); how do I know which one I'm using? I want to consume the same stream from multiple devices at once; is that possible over Wi-Fi without using multicast? – Stanislav Svědiroh Feb 10 '22 at 20:07
  • Found out that I have `OMX.Exynos.avc.dec` with HW acceleration. Trying to build that into the pipeline with `amcviddec-omxexynosavcdec`, but it doesn't work when I just put it in instead of `avdec_h264`. What am I missing here? – Stanislav Svědiroh Feb 10 '22 at 21:15

1 Answer


The RTSP server uses TCP because your client query asked for that using rtspt, where the last t requests TCP transport. Just using rtsp instead should use UDP. Have a look at the protocols property of rtspsrc for more details.
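To confirm which transport was actually negotiated, the comments suggest monitoring the receiver's sockets while the stream is running (Linux; a sniffer such as Wireshark works too):

```shell
# Refresh the socket list every second while the stream plays.
# RTP over UDP shows up as UDP sockets on ephemeral ports; with TCP
# interleaving, only the established TCP connection to :8554 carries media.
sudo watch -n 1 'netstat -laputen'
```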

The full story is in the comments here and continues to the solution here: Gstreamer Android HW accelerated H.264 encoding
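Based on the linked follow-up, a sketch of the Android-side receive pipeline using the phone's hardware decoder might look like the following. The `amcviddec-omxexynosavcdec` element name comes from the comments and is device-specific (check gst-inspect-1.0 on your device); `glimagesink` is an assumption for rendering on Android. The `h264parse` stage is what was missing when the HW decoder was swapped in directly for `avdec_h264`:

```shell
# Hardware-decoded variant (element names are device-specific assumptions):
# h264parse converts the stream into the form the Android MediaCodec
# decoder expects before handing it to the Exynos HW decoder.
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.239:8554/test protocols=udp latency=150 \
    ! rtph264depay ! h264parse ! amcviddec-omxexynosavcdec ! glimagesink sync=false
```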

SeB