I've written a QuickSync hardware encoder implementation for real-time streaming. Everything works, but now I'm stuck on encoder parametrization. Intel's docs are very poor and non-specific. I've read:
- SDK API Reference Manual
- Intel Media Developer's Guide (rev. 2017)
I've also tried to follow the NVENC guidelines (I did the same for Nvidia a while ago and everything works fine), but no luck here either.
So far with:
m_mfxEncParams.IOPattern = MFX_IOPATTERN_IN_VIDEO_MEMORY;
m_mfxEncParams.AsyncDepth = 4;
m_mfxEncParams.mfx.TargetKbps = my_target_bps_ / 1000;
m_mfxEncParams.mfx.MaxKbps = 0;
//m_mfxEncParams.mfx.InitialDelayInKB = 0;
m_mfxEncParams.mfx.BufferSizeInKB = my_width_ * my_height_ * 1.5;
//FRAME INFO PARAMS
ConvertFrameRate(30, &m_mfxEncParams.mfx.FrameInfo.FrameRateExtN, &m_mfxEncParams.mfx.FrameInfo.FrameRateExtD);
m_mfxEncParams.mfx.FrameInfo.FourCC = MFX_FOURCC_NV12;
m_mfxEncParams.mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV420;
m_mfxEncParams.mfx.FrameInfo.PicStruct = MFX_PICSTRUCT_PROGRESSIVE;
m_mfxEncParams.mfx.FrameInfo.Shift = 0;
m_mfxEncParams.mfx.FrameInfo.CropX = 0;
m_mfxEncParams.mfx.FrameInfo.CropY = 0;
m_mfxEncParams.mfx.FrameInfo.CropW = my_width_;
m_mfxEncParams.mfx.FrameInfo.CropH = my_height_;
m_mfxEncParams.mfx.FrameInfo.Width = MSDK_ALIGN16(my_width_);
m_mfxEncParams.mfx.FrameInfo.Height =
    (MFX_PICSTRUCT_PROGRESSIVE == m_mfxEncParams.mfx.FrameInfo.PicStruct)
        ? MSDK_ALIGN16(my_height_)
        : MSDK_ALIGN32(my_height_);
m_mfxEncParams.mfx.CodecId = MFX_CODEC_AVC;
m_mfxEncParams.mfx.CodecProfile = MFX_PROFILE_AVC_BASELINE;
m_mfxEncParams.mfx.CodecLevel = 0;
m_mfxEncParams.mfx.GopPicSize = 0;
m_mfxEncParams.mfx.GopRefDist = 1;
m_mfxEncParams.mfx.GopOptFlag = MFX_GOP_STRICT;
m_mfxEncParams.mfx.IdrInterval = 0;
//TRADEOFF BETWEEN QUALITY AND SPEED
m_mfxEncParams.mfx.TargetUsage = MFX_TARGETUSAGE_BALANCED;
m_mfxEncParams.mfx.RateControlMethod = MFX_RATECONTROL_CBR;
m_mfxEncParams.mfx.NumSlice = 0;
m_mfxEncParams.mfx.NumRefFrame = 0;
m_mfxEncParams.mfx.EncodedOrder = 0; // binary flag, 0 signals encoder to take frames in display order
//CODING OPTION 2
auto codingOption2 = m_mfxEncParams.AddExtBuffer<mfxExtCodingOption2>();
codingOption2->LookAheadDepth = 0;
codingOption2->MaxSliceSize = 0;
codingOption2->MaxFrameSize = 0;
codingOption2->BRefType = MFX_B_REF_OFF;
codingOption2->ExtBRC = 0;
codingOption2->IntRefType = 0;
codingOption2->IntRefCycleSize = 0;
codingOption2->IntRefQPDelta = 0;
codingOption2->AdaptiveI = 0;
codingOption2->AdaptiveB = 0;
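(For context, AddExtBuffer is a helper in my wrapper around mfxVideoParam; with the raw struct the equivalent attachment would look roughly like this sketch, which is the usual bookkeeping for ext buffers:)
// Rough equivalent with a plain mfxVideoParam (sketch only):
mfxExtCodingOption2 co2 = {};
co2.Header.BufferId = MFX_EXTBUFF_CODING_OPTION2;
co2.Header.BufferSz = sizeof(mfxExtCodingOption2);
// ...fill the co2 fields as above...
mfxExtBuffer* extParams[] = { reinterpret_cast<mfxExtBuffer*>(&co2) }; // must stay alive as long as the params
m_mfxEncParams.ExtParam = extParams;
m_mfxEncParams.NumExtParam = 1;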
Right now for FullHD I get 20 FPS at really low quality with a bitrate around 2 Mb/s (peak; static scenes are around 800 kb/s), while for FullHD on Nvidia I get 40 FPS at really good quality with a bitrate around 1.3 Mb/s (peak; static scenes are around 300 kb/s).
I need some kind of starting-point settings for real-time streaming with QuickSync, or at least some extra literature on the subject.
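To make it concrete, the kind of starting point I'm imagining is something like the sketch below (pure guesswork on my side; the values are assumptions I haven't validated on QuickSync):
// Guessed real-time CBR starting point -- assumptions, not verified:
m_mfxEncParams.mfx.RateControlMethod = MFX_RATECONTROL_CBR;
m_mfxEncParams.mfx.TargetKbps = my_target_bps_ / 1000;
m_mfxEncParams.mfx.MaxKbps = m_mfxEncParams.mfx.TargetKbps;            // for CBR the cap equals the target
m_mfxEncParams.mfx.BufferSizeInKB = m_mfxEncParams.mfx.TargetKbps / 8; // ~1 s HRD buffer (field is in KB)
m_mfxEncParams.mfx.InitialDelayInKB = m_mfxEncParams.mfx.BufferSizeInKB / 2;
m_mfxEncParams.mfx.TargetUsage = MFX_TARGETUSAGE_BEST_SPEED;           // favor speed for real time
m_mfxEncParams.mfx.GopPicSize = 60;                                    // keyframe every 2 s at 30 fps
m_mfxEncParams.mfx.NumRefFrame = 1;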
EDIT: I should specify. This hardware implementation's sole purpose is to encode video to send to a browser via WebRTC. That is why I use the Baseline profile. I know that in theory Chromium can decode Main and High profiles, but for now I just want to stay within the bounds of the officially supported Baseline profile (the same profile I use with NVENC).
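My understanding of the Baseline constraints (no B-frames, CAVLC only) is that they translate to roughly the following; this is my own reading of the H.264 profiles, not something I've found spelled out in the Intel docs:
// Baseline-profile constraints as I understand them (assumption):
m_mfxEncParams.mfx.CodecProfile = MFX_PROFILE_AVC_BASELINE;
m_mfxEncParams.mfx.GopRefDist = 1;                         // Baseline forbids B-frames
auto codingOption = m_mfxEncParams.AddExtBuffer<mfxExtCodingOption>();
codingOption->CAVLC = MFX_CODINGOPTION_ON;                 // Baseline uses CAVLC, not CABAC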