On my Jetson Nano, I'm running this pipeline:
gst-launch-1.0 -vvvvv v4l2src device=/dev/video2 ! image/jpeg,width=1920,height=1080,framerate=30/1 ! rtpjpegpay ! udpsink host=224.1.2.3 port=8556
That results in the following debug output:
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, payload=(int)26, ssrc=(uint)2213689011, timestamp-offset=(uint)2954707277, seqnum-offset=(uint)5542
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, payload=(int)26, ssrc=(uint)2213689011, timestamp-offset=(uint)2954707277, seqnum-offset=(uint)5542
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:sink: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)NULL, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: timestamp = 2954758611
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: seqnum = 5677
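Since the caps negotiated to MJPEG at 1920x1080@30 without error, the camera presumably supports that mode; for reference, the formats the device advertises can be listed with v4l2-ctl (assuming v4l-utils is installed):

```shell
# List every pixel format, resolution, and frame interval /dev/video2 offers
v4l2-ctl --device=/dev/video2 --list-formats-ext
```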
I can see the traffic on my client machine (packets sent from the server to the multicast address), amounting to roughly 50 Mbps.
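The traffic check on the client was a plain tcpdump (the interface name `eth0` is specific to my machine):

```shell
# Capture a handful of the multicast RTP packets to confirm they arrive
sudo tcpdump -i eth0 -n -c 20 udp and dst host 224.1.2.3 and dst port 8556
```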
However, when trying to subscribe to that stream and display the output using this pipeline:
gst-launch-1.0 -vvvvvv udpsrc address=224.1.2.3 port=8556 ! application/x-rtp, encoding-name=JPEG, payload=26 ! rtpjpegdepay ! jpegdec ! videoconvert ! videoscale ! autovideosink
I only get this debug output and no image pops up.
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, framerate=(fraction)0/1, width=(int)1920, height=(int)1080
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:sink: caps = image/jpeg, parsed=(boolean)true, framerate=(fraction)0/1, width=(int)1920, height=(int)1080
Everything seems to be detected properly except for the framerate, which shows "0/1". There is also no message from autovideosink or from the jpegdec src pad.
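One variant I'm considering (not sure whether it's relevant) is spelling out the full RTP caps on udpsrc and adding an explicit jitter buffer in front of the depayloader:

```shell
# Receiver variant: explicit caps on udpsrc plus rtpjitterbuffer for reordering
gst-launch-1.0 -v udpsrc address=224.1.2.3 port=8556 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=JPEG,payload=26" \
  ! rtpjitterbuffer latency=100 \
  ! rtpjpegdepay ! jpegdec ! videoconvert ! videoscale ! autovideosink
```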
I'm also providing a UDP packet screenshot, which looks suspicious to me: the Don't Fragment flag is set, yet the packet length is only 1442 bytes (which can't accommodate a whole Full HD JPEG frame).
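From what I can tell, rtpjpegpay fragments each JPEG frame into multiple RTP packets capped by its mtu property (default 1400), and the depayloader reassembles them, so the small packet size may be expected; I checked the property like this:

```shell
# Show rtpjpegpay's mtu property (inherited from the base payloader class)
gst-inspect-1.0 rtpjpegpay | grep -i -A 2 mtu
```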
I've tried lowering the resolution to the minimum the camera supports; with that, I occasionally get a single frame displayed, but never continuous video.
By the way, I'm using this USB camera: UC-684.