
I’m using this pipeline for streaming processed frames:

pipeline = Gst.parse_launch('appsrc name=m_appsrc ! capsfilter name=m_capsfilter ! videoconvert ! x264enc ! rtph264pay ! udpsink name=m_udpsink')

I can capture frames with appsink:

cap = cv2.VideoCapture(
    'udpsrc port=5004 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264"'
    ' ! rtph264depay'
    ' ! avdec_h264'
    ' ! videoconvert'
    ' ! appsink', cv2.CAP_GSTREAMER)

But I want to receive the frames on an NVR, and I need to know the URL for the connection. When I try to connect to the URL rtsp://127.0.0.1:5004 with OpenCV:

cap = cv2.VideoCapture('rtsp://127.0.0.1:5004')

I get error:

[tcp @ 0x2f0cf80] Connection to tcp://127.0.0.1:5004?timeout=0 failed: Connection refused

How can I find the URL to connect to the stream?

Thank you in advance!

UPD: I'm trying to send and receive frames on the same Jetson Nano, but in different Docker containers (run with the flag --net=host).
I found an example for RTSP streaming, added lines 276-283 of that example to my code, and ran the pipeline without errors. In the second container I run this script:

cap = cv2.VideoCapture('rtsp://localhost:8554/ds-test', cv2.CAP_FFMPEG)
if cap.isOpened():
    print('opened')

But the video does not open.

  • You may better explain your case, especially what you're trying on which system. The first pipeline has udpsink without any parameters, so it would stream to localhost on port 5004. With gstreamer on localhost you seem to be able to get it. Did you use the same host with the last VideoCapture (I guess it would use the FFMPEG backend)? It seems it used TCP transport... not sure this was expected. Did you try rtspt: ? – SeB Feb 24 '22 at 19:33
  • @SeB rtspt instead of rtsp has no effect – TheConst _ Feb 24 '22 at 21:37

1 Answer


You may be confusing RTP and RTSP. RTSP is basically an application-layer protocol for providing an SDP describing the stream properties and for establishing a network transport link for RTP (usually over UDP, but TCP may also be used if requested with an rtspt: URL, by specifying the transport protocol to gstreamer's rtspsrc, or when going through networks that prevent normal operation).

I'm not sure about your case with Docker containers, but you may try creating an SDP file test.sdp with the following content:

m=video 5004 RTP/AVP 96
c=IN IP4 127.0.0.1
a=rtpmap:96 H264/90000

saying that the RTP stream is video received on port 5004, on localhost over IPv4, where payload 96 is H264-encoded video with clock rate 90000. Then you may be able to receive it with an OpenCV VideoCapture using the FFMPEG backend:

cap = cv2.VideoCapture("test.sdp", cv2.CAP_FFMPEG)

similar to your GStreamer-backend OpenCV capture.
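If it helps, here is a minimal sketch of generating that SDP file from Python instead of writing it by hand; the host, port, payload type, and clock rate are the values assumed in the example above:

```python
# Build the SDP that describes the raw RTP/H264 stream sent by udpsink.
# Values match the pipeline above: UDP port 5004, payload type 96,
# H264 with a 90000 Hz clock (the RTP default for video).
SDP_TEMPLATE = (
    "m=video {port} RTP/AVP {pt}\n"
    "c=IN IP4 {host}\n"
    "a=rtpmap:{pt} H264/{clock}\n"
)

def write_sdp(path, host="127.0.0.1", port=5004, pt=96, clock=90000):
    """Write a minimal SDP file for a plain RTP receiver and return its text."""
    sdp = SDP_TEMPLATE.format(host=host, port=port, pt=pt, clock=clock)
    with open(path, "w") as f:
        f.write(sdp)
    return sdp

sdp_text = write_sdp("test.sdp")
```

You would then pass "test.sdp" to cv2.VideoCapture(..., cv2.CAP_FFMPEG) as shown above.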

Be sure there is no firewall rule preventing UDP/5004 to be sent or received.

Also note that with such a simple SDP the receiver may not be able to determine the resolution, etc., so it is better to have your sender periodically resend its configuration, using one of:

pipeline = Gst.parse_launch('appsrc name=m_appsrc ! capsfilter name=m_capsfilter ! videoconvert ! x264enc insert-vui=1 ! rtph264pay ! udpsink name=m_udpsink')

# or
pipeline = Gst.parse_launch('appsrc name=m_appsrc ! capsfilter name=m_capsfilter ! videoconvert ! x264enc ! h264parse config-interval=1 ! rtph264pay ! udpsink name=m_udpsink')

# or
pipeline = Gst.parse_launch('appsrc name=m_appsrc ! capsfilter name=m_capsfilter ! videoconvert ! x264enc ! h264parse ! rtph264pay config-interval=1 ! udpsink name=m_udpsink')
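Also remember that udpsink with no properties streams to 127.0.0.1 on port 5004 only, so an NVR on another machine would never receive anything. A sketch of assembling the pipeline string with an explicit destination (the host address here is a placeholder you would replace with your NVR's IP):

```python
def build_pipeline(host="192.168.1.50", port=5004):
    """Assemble the sender pipeline string with an explicit udpsink
    destination; config-interval=1 makes rtph264pay periodically resend
    SPS/PPS so receivers that join late can still decode."""
    return (
        "appsrc name=m_appsrc ! capsfilter name=m_capsfilter"
        " ! videoconvert ! x264enc"
        " ! rtph264pay config-interval=1"
        f" ! udpsink name=m_udpsink host={host} port={port}"
    )

desc = build_pipeline()
```

You would then launch it with Gst.parse_launch(desc) as before.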
SeB
  • Thank you! But I need to connect with RTSP. [deepstream-test1-rtsp-out](https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/blob/20c6b13671e81cf73ca98fa795f84cab7dd6fc67/apps/deepstream-test1-rtsp-out/deepstream_test1_rtsp_out.py#L276) works the proper way, but my [script](https://gist.github.com/f2d7120c11d568afab351f2a1d50aaaf.git) does not. When my script is running, ```cv2.VideoCapture('rtsp://localhost:8554/ds-test', cv2.CAP_FFMPEG)``` in another container waits about 40 secs and doesn't open. When the script isn't running, VideoCapture exits immediately. – TheConst _ Feb 28 '22 at 10:15
  • deepstream-test1-rtsp-out uses an .h264 file as source, while my script uses a buffer; I think that is the main difference, and it requires additional capsfilter properties, but I cannot find structured information about the correct capsfilter settings. – TheConst _ Feb 28 '22 at 10:15