On my Jetson Nano, I'm running this pipeline:

gst-launch-1.0 -vvvvv v4l2src device=/dev/video2 ! image/jpeg,width=1920,height=1080,framerate=30/1 ! rtpjpegpay ! udpsink host=224.1.2.3 port=8556

That results in the following debug output

/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, payload=(int)26, ssrc=(uint)2213689011, timestamp-offset=(uint)2954707277, seqnum-offset=(uint)5542
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, payload=(int)26, ssrc=(uint)2213689011, timestamp-offset=(uint)2954707277, seqnum-offset=(uint)5542
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:sink: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: timestamp = 2954758611
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: seqnum = 5677

I can see the traffic on my client machine (packets sent from the server to the multicast address), amounting to around 50 Mbps.
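One way to confirm the packets actually reach the client is to capture them directly (eth0 is an assumed interface name; substitute your actual NIC):

```shell
# Capture the first 10 UDP packets addressed to the multicast group on port 8556.
sudo tcpdump -i eth0 -n 'udp and dst 224.1.2.3 and port 8556' -c 10
```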

However, when trying to subscribe to that stream and display the output using this pipeline:

gst-launch-1.0 -vvvvvv udpsrc address=224.1.2.3 port=8556 ! application/x-rtp, encoding-name=JPEG, payload=26 ! rtpjpegdepay ! jpegdec ! videoconvert ! videoscale ! autovideosink

I only get this debug output and no image pops up.

/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, framerate=(fraction)0/1, width=(int)1920, height=(int)1080
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:sink: caps = image/jpeg, parsed=(boolean)true, framerate=(fraction)0/1, width=(int)1920, height=(int)1080

Everything seems to be negotiated properly except for the framerate showing "0/1". There is also no message from autovideosink or from the jpegdec src pad.

I'm also providing a UDP packet screenshot, which looks suspicious to me: the Don't Fragment flag is set, yet the packet length is only 1442 bytes (which doesn't accommodate a whole Full HD image).

I've tried lowering the image resolution to the minimum the camera supports, and I occasionally get one frame displayed on the screen, but no continuous video.

I'm using this USB camera, by the way: UC-684

  • Your two pipelines work for me. I would focus on network related issues. Is the IP resolved the same on both machines? firewall? – Christian Fritz Mar 19 '23 at 21:13
  • Other possible causes might be: 1. kernel socket max buffer size. Try setting the buffer-size property of udpsink and udpsrc to higher values. 2. Wi-Fi and multicast may not work well together... Try streaming to only one host by setting the udpsrc property auto-multicast to false – SeB Mar 21 '23 at 16:17
  • Sorry for the misleading information, in the last case above I meant setting `udpsink`'s property auto-multicast to false. – SeB Mar 21 '23 at 21:37
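The socket-buffer suggestion from the comments could be sketched like this (the 2 MB value is an arbitrary assumption; the kernel caps socket buffers at net.core.rmem_max, which may need raising first):

```shell
# Allow larger kernel receive buffers on the client (value is an assumption).
sudo sysctl -w net.core.rmem_max=2097152

# Receiver pipeline with an enlarged udpsrc buffer-size (in bytes).
gst-launch-1.0 udpsrc address=224.1.2.3 port=8556 buffer-size=2097152 ! \
  application/x-rtp,encoding-name=JPEG,payload=26 ! \
  rtpjpegdepay ! jpegdec ! videoconvert ! videoscale ! autovideosink
```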

1 Answer

My guess is that you're facing jitter/packet loss. The chance that some packets arrive out of order or go missing is substantial when sending 50 Mbps. In your current pipeline, if any packet of a frame is reordered or lost, the whole frame is probably dropped. If the network between client and server is not perfect, this can lead to no frames being decoded at all.

This also explains why you occasionally see a frame at lower resolutions: lower bitrate -> higher chance of receiving all packets of a frame in the expected order.

You can confirm this by adding rtpjitterbuffer after udpsrc. The jitter buffer takes care of packet ordering. Then you can look at its stats property, which reports the number of lost packets.
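A sketch of your receiver pipeline with the jitter buffer inserted (latency is in milliseconds; 200 is just an assumed starting value):

```shell
gst-launch-1.0 -v udpsrc address=224.1.2.3 port=8556 ! \
  application/x-rtp,media=video,clock-rate=90000,encoding-name=JPEG,payload=26 ! \
  rtpjitterbuffer latency=200 ! \
  rtpjpegdepay ! jpegdec ! videoconvert ! videoscale ! autovideosink
```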

Another test would be to connect your Jetson and PC via a wired link to hopefully reduce packet loss and see if that helps.

You might also replace v4l2src device=/dev/video2 with videotestsrc ! jpegenc to make sure it's not something with your USB camera stream. You could additionally test streaming at a lower resolution with videotestsrc and see if that helps.
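For example, a camera-free sender pipeline could look like this (the resolution and framerate are just illustrative values):

```shell
gst-launch-1.0 videotestsrc is-live=true ! \
  video/x-raw,width=640,height=480,framerate=30/1 ! \
  jpegenc ! rtpjpegpay ! udpsink host=224.1.2.3 port=8556
```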

Ivan
  • Streaming h264/h265 would probably be a better choice for jetson as it can be hardware-encoded and then shared to clients using something like https://github.com/aler9/rtsp-simple-server – Ivan Mar 19 '23 at 23:47
  • Thanks for your input! According to this theory, why does the connection work when unicasting the video stream? (I forgot to mention that in the original post; it also rules out the possibility of camera issues.) Packet loss and jitter will likely cause some problems, but they would be the same for unicast. Also thanks for the h264 suggestion, but MJPEG is crucial for me as I'm streaming the video from a mobile robot to a VR headset and require minimal latency and no compression artifacts. – Stanislav Svědiroh Mar 20 '23 at 11:34