You would use uridecodebin, which can handle various types of URIs, containers, protocols and codecs.
With Jetson, the decoder selected by uridecodebin for H264 would be nvv4l2decoder, which doesn't use the GPU but rather the dedicated HW decoder NVDEC.
nvv4l2decoder outputs into NVMM memory in NV12 format, while the OpenCV appsink expects BGR format in system memory. So you would use the HW converter nvvidconv for converting and copying into system memory. Unfortunately, nvvidconv doesn't support BGR format, so first convert into the supported BGRx format with nvvidconv, and finally use the CPU plugin videoconvert for the BGRx -> BGR conversion, such as:
pipeline='uridecodebin uri=rtsp://127.0.0.1:8554/test ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1'
cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
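For reference, a minimal sketch wrapping this pipeline in a helper (the helper name build_bgr_pipeline is mine, not part of any API; the elements are exactly those from the pipeline above):

```python
# Hypothetical helper building the Jetson decode pipeline for a given URI;
# uridecodebin -> nvvidconv (NVMM NV12 -> BGRx) -> videoconvert (BGRx -> BGR) -> appsink.
def build_bgr_pipeline(uri):
    return ('uridecodebin uri={} ! '
            'nvvidconv ! video/x-raw,format=BGRx ! '
            'videoconvert ! video/x-raw,format=BGR ! '
            'appsink drop=1').format(uri)

pipeline = build_bgr_pipeline('rtsp://127.0.0.1:8554/test')
print(pipeline)

# On the Jetson (OpenCV built with GStreamer support):
# import cv2
# cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
# while cap.isOpened():
#     ret, frame = cap.read()
#     if not ret:
#         break
#     ...  # process BGR frame here
# cap.release()
```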
This is the general way, though for some streaming protocols it may not be so simple.
For RTP-H264/UDP, the FFmpeg backend may only work with an SDP file.
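For example, such an SDP file might look like this (a sketch assuming H264 with dynamic payload type 96 on the multicast address/port used below; adjust to your actual stream):

v=0
o=- 0 0 IN IP4 234.0.0.0
s=RTP H264 stream
c=IN IP4 234.0.0.0
t=0 0
m=video 46002 RTP/AVP 96
a=rtpmap:96 H264/90000

You could then check it with something like: ffplay -protocol_whitelist file,udp,rtp stream.sdp (stream.sdp being whatever you name the file).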
For gstreamer backend you would instead use a pipeline such as:
pipeline='udpsrc port=46002 multicast-group=234.0.0.0 ! application/x-rtp,encoding-name=H264 ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1'
Since you can use FFmpeg, I'd speculate that the received stream is RTP-MP2T (MPEG-2 TS over RTP). So you would try:
# Using NVDEC, but this may fail depending on sender side's codec:
cap = cv2.VideoCapture('udpsrc multicast-group=234.0.0.0 port=46002 ! application/x-rtp,media=video,encoding-name=MP2T,clock-rate=90000,payload=33 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1', cv2.CAP_GSTREAMER)
# Or using CPU (may not support high pixel rate with Nano):
cap = cv2.VideoCapture('udpsrc multicast-group=234.0.0.0 port=46002 ! application/x-rtp,media=video,encoding-name=MP2T,clock-rate=90000,payload=33 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1', cv2.CAP_GSTREAMER)
[Note that I'm not familiar with 234.0.0.0, so I'm unsure whether multicast-group should be used as I did.]
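Since it's unclear which variant matches your sender, you could also just try the candidate pipelines in order and keep the first one that opens. A sketch (the open function is passed as a parameter so the logic is independent of OpenCV; on the Jetson you would pass lambda p: cv2.VideoCapture(p, cv2.CAP_GSTREAMER)):

```python
# Sketch: try candidate GStreamer pipelines in order, return the first that opens.
def open_first_working(pipelines, open_fn):
    for p in pipelines:
        cap = open_fn(p)
        if cap is not None and cap.isOpened():
            return p, cap
    return None, None

# On the Jetson:
# import cv2
# candidates = [nvdec_pipeline, cpu_pipeline]  # the two pipeline strings above
# chosen, cap = open_first_working(
#     candidates, lambda p: cv2.VideoCapture(p, cv2.CAP_GSTREAMER))
```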
If this doesn't work, you may try to get more information about the received stream with a working ffmpeg command such as:
ffmpeg -hide_banner -loglevel debug -i udp://234.0.0.0:46002 -f xv display
If you see:
Stream #0:0, 133, 1/1200000: Video: h264 (Constrained Baseline), 1 reference frame, yuv420p(progressive, left), 720x576, 0/1, 25 fps, 25 tbr, 1200k tbn, 50 tbc
you may have to change clock-rate to 1200000 (default value is 90000):
application/x-rtp,media=video,encoding-name=MP2T,clock-rate=1200000
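With that change, the first GStreamer pipeline above would become (same elements, only the caps differ; the clock-rate is still speculative):

cap = cv2.VideoCapture('udpsrc multicast-group=234.0.0.0 port=46002 ! application/x-rtp,media=video,encoding-name=MP2T,clock-rate=1200000,payload=33 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1', cv2.CAP_GSTREAMER)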
This assumes the stream is MPEG-2 TS. In that case, the first lines would show something like:
...
Opening an input file: udp://127.0.0.1:5002.
[NULL @ 0x55761c4690] Opening 'udp://127.0.0.1:5002' for reading
[udp @ 0x55761a27c0] No default whitelist set
[udp @ 0x55761a27c0] end receive buffer size reported is 131072
[mpegts @ 0x55761c4690] Format mpegts probed with size=2048 and score=47
[mpegts @ 0x55761c4690] stream=0 stream_type=1b pid=41 prog_reg_desc=HDMV
[mpegts @ 0x55761c4690] Before avformat_find_stream_info() pos: 0 bytes read:26560 seeks:0 nb_streams:1
...
ffmpeg tries to guess the format, and here it found the stream was in mpegts format. You would check what ffmpeg finds in your case. Note that the first guess may not be correct; you would have to check the whole log and see what it ends up using.
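To illustrate, a small helper (my own sketch, not part of ffmpeg) can pull the resolution, framerate and timebase (tbn) out of such a stream-info line; the tbn value is what would go into the caps as clock-rate:

```python
import re

# Hypothetical helper: extract resolution, fps and timebase (tbn) from an
# ffmpeg stream-info line, so the RTP caps clock-rate can be set accordingly.
# "1200k tbn" means 1200 * 1000 = 1200000.
def parse_ffmpeg_stream_line(line):
    res = re.search(r'(\d+)x(\d+)', line)
    fps = re.search(r'([\d.]+) fps', line)
    tbn = re.search(r'([\d.]+)(k?) tbn', line)
    clock_rate = None
    if tbn:
        clock_rate = int(float(tbn.group(1)) * (1000 if tbn.group(2) else 1))
    return {
        'width': int(res.group(1)) if res else None,
        'height': int(res.group(2)) if res else None,
        'fps': float(fps.group(1)) if fps else None,
        'clock_rate': clock_rate,
    }

line = ('Stream #0:0, 133, 1/1200000: Video: h264 (Constrained Baseline), '
        '1 reference frame, yuv420p(progressive, left), 720x576, 0/1, '
        '25 fps, 25 tbr, 1200k tbn, 50 tbc')
info = parse_ffmpeg_stream_line(line)
print(info)  # {'width': 720, 'height': 576, 'fps': 25.0, 'clock_rate': 1200000}
```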
Another speculation would be that your stream is not RTP but rather a raw H264 stream. In that case, you may be able to decode it with something like:
gst-launch-1.0 udpsrc port=46002 multicast-group=234.0.0.0 ! h264parse ! nvv4l2decoder ! autovideosink
If this works, for OpenCV you would use:
pipeline='udpsrc port=46002 multicast-group=234.0.0.0 ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1'