I want to process a camera stream in OpenCV (that part works fine), but then send the resulting processed frames out over the network so I can view them, since this is an embedded system without a monitor. I've spent days looking, but cannot work out how to push the frames into GStreamer from Python. It looks like I should create an appsrc -> udpsink pipeline in GStreamer (that's no problem), but I cannot find anywhere how to push frames into appsrc from OpenCV using Python. I've seen some examples in C++, but I have no idea how to translate them to Python. It seems one can either copy buffers manually, as in "Push images into Gstreamer pipeline", or, as "How to write opencv mat to gstreamer pipeline?" implies, there may be an easier way: opening an appsrc directly in an OpenCV VideoWriter.
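Based on the C++ examples, my best guess is a pipeline string along these lines (host, port, and encoder settings are placeholders I made up; I can't confirm the `cv2.VideoWriter` call works this way, which is exactly what I'm asking about):

```python
# Sketch of the pipeline string I think cv2.VideoWriter would need when
# OpenCV is built with GStreamer support. Host/port are placeholders.
host, port = "192.168.1.100", 5000
pipeline = (
    "appsrc ! videoconvert"
    " ! x264enc tune=zerolatency speed-preset=ultrafast"
    " ! rtph264pay ! udpsink host={} port={}".format(host, port)
)
print(pipeline)

# If OpenCV has GStreamer support (check cv2.getBuildInformation()),
# I assume the writer side would look something like:
#   out = cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, 30.0, (640, 480), True)
#   out.write(frame)  # frame: BGR numpy array of shape (480, 640, 3)
# but I haven't been able to confirm this from Python.
```
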
Does anyone out there have a very simple example of how to output OpenCV frames to GStreamer in Python?