
I am able to stream and receive a webcam feed between two terminals via UDP.

Command for streaming:

ffmpeg -i /dev/video0 -b 50k -r 20 -s 858x500 -f mpegts udp://127.0.0.1:2000  

Command for receiving:

ffplay udp://127.0.0.1:2000

Now I have to use this received video stream as input in Python/OpenCV. How can I do that? I will be doing this with RTP and RTSP as well. In the case of RTSP, however, the receiving terminal has to be started first, but if I do that the port becomes busy and my program cannot take the feed. How can this be resolved? I am currently using OpenCV 2.4.13 and Python 2.7 on Ubuntu 14.04.

vasu gupta

1 Answer


Check this tutorial and use `cv2.VideoCapture("udp://127.0.0.1:2000")`. You will need to build OpenCV with FFmpeg support for this to work.
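For reference, a minimal sketch of what that capture loop could look like with the ffmpeg sender from the question (assuming your OpenCV build has FFmpeg support; the window name and key handling are just illustrative):

```python
import cv2

# Open the MPEG-TS stream that ffmpeg is pushing to udp://127.0.0.1:2000
cap = cv2.VideoCapture("udp://127.0.0.1:2000")
if not cap.isOpened():
    raise IOError("Could not open the UDP stream - check that OpenCV was built with FFmpeg")

while True:
    ret, frame = cap.read()          # grab and decode the next frame
    if not ret:
        break                        # stream ended or a packet was dropped
    cv2.imshow("udp stream", frame)  # frame is an ordinary numpy image, use it as usual
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```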

Mohamed Moanis
  • Thank you for your help, it worked... but now I have to do the same thing using RTSP. The problem is that with RTSP you have to start the receiver first, but if I do that my program will not run because the port will be busy. – vasu gupta Jul 11 '16 at 12:12
  • I don't know much about the semantics of RTSP. But if it isn't working, you can use the FFmpeg APIs to manually receive the RTSP packets, decode them into frames, then pass them to your OpenCV code. But I guess `VideoCapture` abstracts all that away? – Mohamed Moanis Jul 11 '16 at 12:17
  • You can check this [link](http://stackoverflow.com/questions/21041370/opencv-how-to-capture-rtsp-video-stream) for RTSP, I guess he used `VideoCapture` and just passed the URL to it. – Mohamed Moanis Jul 12 '16 at 13:11
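On the port-busy problem raised in the comments: with RTSP you would not start ffplay at all; `cv2.VideoCapture` itself acts as the receiving client, so only your Python program connects to the stream. A minimal sketch, assuming a hypothetical RTSP URL published by your server:

```python
import cv2

# Hypothetical URL - replace with whatever your RTSP server actually publishes
cap = cv2.VideoCapture("rtsp://127.0.0.1:8554/live")

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    # process the frame here, exactly as in the UDP example above

cap.release()
```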