
I'm using OpenCV and GStreamer 0.10.

I use this pipeline to receive MPEG-TS packets over UDP through a custom socket `sockfd` provided by Python, and display the stream with xvimagesink. It works perfectly. The pipeline definition is:

PIPELINE_DEF = "udpsrc do-timestamp=true name=src blocksize=1316 closefd=false buffer-size=5600 !" \
           "mpegtsdemux !" \
           "queue !" \
           "ffdec_h264 max-threads=0 !" \
           "ffmpegcolorspace !" \
           "xvimagesink name=video"

Now, I want to grab individual frames from this pipeline and display them with OpenCV. How can I do that? I know how to get buffer data from appsink, but I still don't know how to convert those buffers into frames that OpenCV can use. Thanks for any reply, and any help :]
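The conversion step is mostly a reinterpretation of the buffer's raw bytes. A minimal sketch, assuming the pipeline ends in an appsink with 24-bit BGR caps negotiated after ffmpegcolorspace (the helper name `buffer_to_frame` is mine, not part of pygst):

```python
import numpy as np

def buffer_to_frame(data, width, height):
    """Reinterpret the raw bytes of one pulled buffer as an OpenCV-style
    height x width x 3 uint8 array (BGR byte order assumed)."""
    return np.frombuffer(data, dtype=np.uint8).reshape((height, width, 3))

# Dummy data standing in for buf.data from appsink.emit("pull-buffer")
# in a GStreamer 0.10 pipeline: a 2x2 frame, 3 bytes per pixel.
raw = bytes(range(12))
frame = buffer_to_frame(raw, 2, 2)
```

The resulting array can be passed straight to `cv2.imshow`. In the real pipeline you would replace `xvimagesink name=video` with an `appsink name=video` and read the width/height from the buffer's caps rather than hard-coding them.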

  • Thanks for editing, Tyler. Try :] – Hao Zhou Jul 24 '16 at 18:28
  • Welcome to SO :) Please show us some work with appsink - if you already used it, that is the actual solution - or is it not working properly for you? I think you just have to extract the data part of the buffer, which represents one video frame, and that should be processable by OpenCV. Another thing: you are using a really outdated version of GStreamer which has not been supported for years. Since you are not using specific drivers that would require 0.10, you can switch to 1.x - there are Python bindings for that version as well. Good luck – nayana Jul 25 '16 at 06:50
  • Thanks, I have tried to switch to gst 1.0, but there is a crucial property called "sockfd" which I need in order to pass a custom socket to this pipeline. It worked well with gst 0.10. However, when I switched to gst 1.0 and tried to use a "GSocket" instead of "sockfd", it showed a "not linked" error on src like this: http://stackoverflow.com/questions/37795191/error-in-pipeline-porting-pygst-program-from-gstreamer-0-10-to-1-0, so I went back to gst 0.10. – Hao Zhou Jul 27 '16 at 16:24

1 Answer


Thanks. I used rtph264pay to re-broadcast the live video stream to a udpsink. The pipeline definition is:

PIPELINE_DEF = "udpsrc name=src !" \
           "mpegtsdemux !" \
           "queue !" \
           "h264parse !" \
           "rtph264pay !" \
           "udpsink host=127.0.0.1 port=5000"

Then I wrote an SDP file, 123.sdp, so the stream can be received by OpenCV via VideoCapture("123.sdp"). The SDP file contains:

c=IN IP4 127.0.0.1 
m=video 5000 RTP/AVP 96 
a=rtpmap:96 H264/90000 

It works well now. I just needed to delete "blocksize=1316 closefd=false buffer-size=5600" from udpsrc to lift those limits.
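For completeness, a small sketch that writes the SDP file from this answer and hands it to OpenCV. The cv2 lines are left commented out here because they only work while the GStreamer pipeline is actually streaming to port 5000:

```python
import os
import tempfile

# Contents of 123.sdp, exactly as shown above.
SDP_TEXT = (
    "c=IN IP4 127.0.0.1\n"
    "m=video 5000 RTP/AVP 96\n"
    "a=rtpmap:96 H264/90000\n"
)

sdp_path = os.path.join(tempfile.gettempdir(), "123.sdp")
with open(sdp_path, "w") as f:
    f.write(SDP_TEXT)

# With the pipeline running, OpenCV's FFmpeg backend reads the session
# description and joins the RTP stream:
# import cv2
# cap = cv2.VideoCapture(sdp_path)
# ok, frame = cap.read()
```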

  • But where is the OpenCV part? Is this really the answer? – nayana Jul 28 '16 at 06:44
  • Thanks, I used OpenCV like this: video_capture = cv2.VideoCapture("123.sdp"), where "123.sdp" is the name of the SDP file. – Hao Zhou Jul 28 '16 at 12:25
  • OK - maybe I am just not familiar with OpenCV, but you may want to add that to your answer: you asked about OpenCV, yet your answer does not mention it. Then you may accept the answer if you wish – nayana Jul 28 '16 at 12:35