
I'm currently using libavcodec to encode video frames with H.264. Because I'm streaming these frames after they're encoded, minimizing latency is crucial to me. My current settings are:

avcodec_register_all();
codec = avcodec_find_encoder(AV_CODEC_ID_H264);

// allocate and set ffmpeg context
context = avcodec_alloc_context3(codec);
context->bit_rate = bitrate;
context->width = out_width;
context->height = out_height;
context->time_base.num = 1;
context->time_base.den = 30;
context->gop_size = 1; // send SPS/PPS headers every packet
context->max_b_frames = 0;
context->pix_fmt = AV_PIX_FMT_YUV420P;

// set encoder parameters to max performance
av_opt_set(context->priv_data, "preset", "ultrafast", 0);
av_opt_set(context->priv_data, "tune", "zerolatency", 0);

// open capture encoder
avcodec_open2(context, codec, NULL);

These settings work well, but I am trying to switch to hardware-based encoding to take the workload off my CPU. I currently have an NVIDIA GPU, so I tried using the following settings with h264_nvenc:

codec = avcodec_find_encoder_by_name("h264_nvenc");

// allocate and set ffmpeg context
context = avcodec_alloc_context3(codec);
context->dct_algo = FF_DCT_FASTINT;
context->bit_rate = bitrate;
context->width = out_width;
context->height = out_height;
context->time_base.num = 1;
context->time_base.den = 30;
context->gop_size = 1; // send SPS/PPS headers every packet
context->max_b_frames = 0;
context->pix_fmt = AV_PIX_FMT_YUV420P;

// set encoder parameters to max performance
av_opt_set(context->priv_data, "preset", "llhq", 0);
av_opt_set(context->priv_data, "tune", "zerolatency", 0);

The issue is that the latency with h264_nvenc is significantly higher than with the software encoder (AV_CODEC_ID_H264). I suspect my h264_nvenc settings or setup must be wrong, since GPU-based encoding should be faster than software-based encoding. Could anyone point me in the right direction? Thanks so much!

M. Ying
  • `context->gop_size = 1` --> this will code each frame as intra-coded. Possibly, both the increased encoding and the data I/O are adding to the latency. – Gyan Jan 17 '20 at 08:22
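
A minimal sketch of what that comment suggests: keep a normal GOP instead of intra-coding every frame. The specific GOP length (30, i.e. one keyframe per second at 30 fps) is an assumption for illustration, not something taken from the question.

context->gop_size     = 30;  // one keyframe per second instead of every frame (assumed value)
context->max_b_frames = 0;   // still no B-frames, so no frame-reordering delay

// If SPS/PPS need to stay in-band (e.g. so a client can join mid-stream), leave
// AV_CODEC_FLAG_GLOBAL_HEADER unset; both libx264 and h264_nvenc then typically
// repeat the parameter sets with each keyframe instead of only in extradata.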

1 Answer


av_opt_set(context->priv_data, "tune", "zerolatency", 0) worked for libx264, but when I swithed to "h264_nvenc" I had to add:

av_opt_set(m_codec_context->priv_data, "zerolatency", "1", 0);
av_opt_set(m_codec_context->priv_data, "delay", "0", 0);
puksec