I am sending an encoded H264 stream using GStreamer and decoding it on Intel hardware. The sender is an application, and its pipeline elements look something like this:
caps2 = gst_caps_new_simple("video/x-raw",
        "format", G_TYPE_STRING, "I420",
        "width", G_TYPE_INT, 640,
        "height", G_TYPE_INT, 480,
        "framerate", GST_TYPE_FRACTION, 15, 1,
        "pixel-aspect-ratio", GST_TYPE_FRACTION, 1, 1, NULL);
gst_app_src_set_caps(GST_APP_SRC(app->videosrc), caps2);
gst_bin_add_many(GST_BIN(app->pipeline), app->videosrc, app->x264enc, app->rtppay, app->udpsink, NULL);
So basically an appsrc element gets frames from a live camera; they are encoded with an H264 encoder, payloaded as RTP, and sent out through a UDP sink.
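To give a bit more context, the elements are created and linked roughly like this. This is a simplified sketch, not my exact code: the element/variable names, host and port, and the live-source properties are illustrative, and error handling is trimmed.

#include <gst/gst.h>

/* Illustrative struct matching the app-> fields used above. */
typedef struct {
    GstElement *pipeline;
    GstElement *videosrc, *x264enc, *rtppay, *udpsink;
} App;

static gboolean build_sender(App *app, const gchar *host, gint port)
{
    app->pipeline = gst_pipeline_new("sender");
    app->videosrc = gst_element_factory_make("appsrc",     "videosrc");
    app->x264enc  = gst_element_factory_make("x264enc",    "encoder");
    app->rtppay   = gst_element_factory_make("rtph264pay", "rtppay");
    app->udpsink  = gst_element_factory_make("udpsink",    "udpsink");
    if (!app->pipeline || !app->videosrc || !app->x264enc ||
        !app->rtppay || !app->udpsink)
        return FALSE;

    /* Typical settings for a live appsrc feeding an encoder. */
    g_object_set(app->videosrc, "is-live", TRUE, "format", GST_FORMAT_TIME, NULL);
    /* Payload type 96 matches the caps given on the receiver side. */
    g_object_set(app->rtppay, "pt", 96, NULL);
    g_object_set(app->udpsink, "host", host, "port", port, NULL);

    /* The I420 caps from the snippet above are set on app->videosrc
     * with gst_app_src_set_caps() before linking. */
    gst_bin_add_many(GST_BIN(app->pipeline), app->videosrc, app->x264enc,
                     app->rtppay, app->udpsink, NULL);
    return gst_element_link_many(app->videosrc, app->x264enc, app->rtppay,
                                 app->udpsink, NULL);
}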
The receiver pipeline looks like this:
gst-launch-1.0 -v udpsrc port=5002 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96 ! rtph264depay ! vaapih264dec ! videoconvert ! vaapisink sync=FALSE
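For comparison, a software-decode variant of the same receiver would look like the line below (assuming the gstreamer1.0-libav plugins providing avdec_h264 are installed); I could run it side by side to see whether the VAAPI decode path itself is involved:

gst-launch-1.0 -v udpsrc port=5002 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false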
The output looks like the one shown in the figure below: [screenshot: output of the receiver pipeline on my hardware]
I want to know where I am going wrong. Is it a latency problem or a color-format problem?