
I am sending an encoded H264 stream using gstreamer and decoding it on an Intel hardware. My sender is actually an application and its pipeline elements look something like this:

caps2 = gst_caps_new_simple("video/x-raw",
               "format", G_TYPE_STRING, "I420",
               "width", G_TYPE_INT, 640,
               "height", G_TYPE_INT, 480,
               "framerate", GST_TYPE_FRACTION, 15, 1,
               "pixel-aspect-ratio", GST_TYPE_FRACTION, 1, 1, NULL);
gst_app_src_set_caps(GST_APP_SRC(app->videosrc), caps2);

gst_bin_add_many(GST_BIN(app->pipeline), app->videosrc, app->x264enc, app->rtppay, app->udpsink, NULL);
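Note that gst_bin_add_many() only adds the elements to the bin; they still have to be linked. A minimal sketch of the linking step I use, assuming the same element members as above (error handling trimmed):

    /* Link appsrc -> x264enc -> rtph264pay -> udpsink.
     * gst_element_link_many() negotiates caps between each pair
     * and returns FALSE if any link fails. */
    if (!gst_element_link_many(app->videosrc, app->x264enc,
                               app->rtppay, app->udpsink, NULL)) {
        g_printerr("Failed to link sender pipeline elements\n");
    }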

So basically an appsrc element gets data from a live camera; the frames are encoded with an H.264 encoder and sent out through a UDP sink.
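To rule out the camera/appsrc side, an equivalent sender can be sketched with gst-launch, using videotestsrc in place of the appsrc (a debugging sketch, untested on my setup; port 5002 and payload 96 match the receiver pipeline below):

    gst-launch-1.0 -v videotestsrc is-live=true \
      ! video/x-raw,format=I420,width=640,height=480,framerate=15/1 \
      ! x264enc tune=zerolatency \
      ! rtph264pay pt=96 config-interval=1 \
      ! udpsink host=127.0.0.1 port=5002

If this test pipeline displays correctly on the receiver, the problem is in how the appsrc feeds frames rather than in the RTP/UDP path.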

The receiver pipeline looks like this:

gst-launch-1.0 -v udpsrc port=5002 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96 ! rtph264depay ! vaapih264dec ! videoconvert ! vaapisink sync=FALSE

The output looks like the one shown in the figure below:

(figure: output of the receiver pipeline on my hardware)

I want to know where I am going wrong. Is it latency, or a color-format mismatch?
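For reference, a hardened variant of the receiver pipeline I could try (a sketch, untested: caps passed to udpsrc directly, an rtpjitterbuffer to absorb network jitter, and h264parse before the decoder):

    gst-launch-1.0 -v udpsrc port=5002 \
        caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" \
      ! rtpjitterbuffer latency=100 \
      ! rtph264depay ! h264parse ! vaapih264dec ! vaapisink sync=false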

vahdet
  • what have you tried? Have you recorded the camera locally? Have you tried using videotestsrc instead of the camera input (keeping the rest of the RTP/UDP logic)? Have you tried a normal xvimagesink or similar instead of vaapi? I've randomly found this https://gist.github.com/esrever10/7d39fe2d4163c5b2d7006495c3c911bb. You should probably use the caps parameter for udpsrc (not sure if it makes a difference) – nayana Mar 14 '19 at 14:22
  • rtpjitterbuffer? I would read some example concerning RTP and GStreamer. UDP transmission and hardware decoding is a complex system. I would try starting of a little less complex. – Florian Zwoch Mar 14 '19 at 15:34

0 Answers