
I want to create a simple video player that will show HDR video on an HDR TV, for example the "LG Chess HDR" video. It is encoded with HEVC, its bit depth is 10 bit, the pixel format is YUV420P10LE, and it has metadata about the BT.2020 color space and the PQ transfer function.

In this NVIDIA article I found the following:

The display driver takes the scRGB back buffer, and converts it to the standard expected by the display presently connected. In general, this means converting the color space from sRGB primaries to BT. 2020 primaries, scaling to an appropriate level, and encoding with a mechanism like PQ. Also, possibly performing conversions like RGB to YCC if that display connection requires it.

It means that my player should render pixels in the scRGB color space (linear encoding, sRGB primaries, full range from -0.5 to just under +7.5). So I need to somehow get frames from the source video in this color space, preferably in an FP16 pixel format (half float, 16 bits per color channel). I came up with the following simple pipeline for rendering HDR video:

source HDR video in the BT.2020 color space with the PQ transfer function -> [some video library] ->
-> video frames with colors in the scRGB color space -> [my program] ->
-> rendered video on the HDR TV, with the conversions applied by the display driver
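If I understand the conversion correctly, the per-pixel math inside that middle step would be roughly the following. This is only a sketch, not verified: I assume scRGB 1.0 corresponds to 80 cd/m2, I use the PQ constants from BT.2100 and a rounded BT.2020-to-BT.709 primaries matrix, and I assume the YUV-to-R'G'B' step has already been done with the BT.2020 matrix:

```c
/* Sketch of one pixel: BT.2020 + PQ (SMPTE ST 2084) -> linear scRGB.
   Assumptions: input R'G'B' is already converted from YUV with the BT.2020
   non-constant-luminance matrix and normalized to [0,1]; scRGB 1.0 = 80 cd/m2. */
#include <math.h>

static double pq_eotf(double e)            /* non-linear [0,1] -> luminance in cd/m2 */
{
    const double m1 = 2610.0 / 16384.0;    /* PQ constants from BT.2100 */
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    double p   = pow(e, 1.0 / m2);
    double num = fmax(p - c1, 0.0);
    double den = c2 - c3 * p;
    return 10000.0 * pow(num / den, 1.0 / m1);
}

void bt2020_pq_to_scrgb(const double rgb2020[3], float scrgb[3])
{
    /* BT.2020 -> BT.709/sRGB primaries in linear light (rounded coefficients). */
    static const double m[3][3] = {
        { 1.6605, -0.5876, -0.0728 },
        {-0.1246,  1.1329, -0.0083 },
        {-0.0182, -0.1006,  1.1187 },
    };
    double lin[3];
    for (int i = 0; i < 3; i++)
        lin[i] = pq_eotf(rgb2020[i]);               /* absolute luminance, cd/m2 */
    for (int i = 0; i < 3; i++) {
        double v = m[i][0] * lin[0] + m[i][1] * lin[1] + m[i][2] * lin[2];
        scrgb[i] = (float)(v / 80.0);               /* scRGB: 1.0 = 80 cd/m2 */
    }
}
```

The resulting floats can go above 1.0 (and below 0.0 after the primaries conversion), which as far as I understand is exactly what the scRGB back buffer expects.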

I'm trying to use FFmpeg as this video library, but I do not understand how to get frames from the source HDR video in the scRGB color space.

I currently use FFmpeg's sws_scale to get frames, and I know about the filters API. But I have not found any information or help on how to transparently get frames in scRGB with this functionality, without parsing the metadata of every source video and creating custom video filters for each of them.
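The closest thing I could come up with so far is to build a libavfilter graph around the zscale filter (which needs FFmpeg built with libzimg). This is only a sketch and I have not verified it; the option names, the npl value, and the gbrpf32le output format are my assumptions, and I would still have to convert the float planes to FP16 myself:

```c
/* Sketch: feed decoded frames through a libavfilter graph that (hopefully)
   undoes PQ and converts the primaries. Option names are my assumptions. */
#include <libavfilter/avfilter.h>
#include <libavfilter/buffersrc.h>
#include <libavfilter/buffersink.h>
#include <libavutil/mem.h>
#include <stdio.h>

static AVFilterGraph   *graph;
static AVFilterContext *src_ctx, *sink_ctx;

int init_scrgb_graph(int w, int h, int pix_fmt, AVRational time_base)
{
    char args[256];
    int ret;

    graph = avfilter_graph_alloc();
    if (!graph)
        return AVERROR(ENOMEM);

    /* Buffer source described with the decoder's output parameters. */
    snprintf(args, sizeof(args),
             "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=1/1",
             w, h, pix_fmt, time_base.num, time_base.den);
    ret = avfilter_graph_create_filter(&src_ctx, avfilter_get_by_name("buffer"),
                                       "in", args, NULL, graph);
    if (ret < 0)
        return ret;
    ret = avfilter_graph_create_filter(&sink_ctx, avfilter_get_by_name("buffersink"),
                                       "out", NULL, NULL, graph);
    if (ret < 0)
        return ret;

    /* My guess at the chain: linearize the PQ transfer, convert to BT.709/sRGB
       primaries, output planar float RGB. npl=100 is a guess for the scaling. */
    const char *chain =
        "zscale=transfer=linear:npl=100,"
        "zscale=primaries=709,"
        "format=gbrpf32le";

    AVFilterInOut *inputs  = avfilter_inout_alloc();
    AVFilterInOut *outputs = avfilter_inout_alloc();
    outputs->name = av_strdup("in");   outputs->filter_ctx = src_ctx;
    outputs->pad_idx = 0;              outputs->next = NULL;
    inputs->name  = av_strdup("out");  inputs->filter_ctx  = sink_ctx;
    inputs->pad_idx = 0;               inputs->next = NULL;

    ret = avfilter_graph_parse_ptr(graph, chain, &inputs, &outputs, NULL);
    avfilter_inout_free(&inputs);
    avfilter_inout_free(&outputs);
    if (ret < 0)
        return ret;

    return avfilter_graph_config(graph, NULL);
}
```

Per decoded AVFrame I would then call av_buffersrc_add_frame() on src_ctx and av_buffersink_get_frame() on sink_ctx to pull out the converted frame. But I don't know whether this is the intended way, or whether it really gives scRGB.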

Please tell me what I can do to get frames in the scRGB color space using FFmpeg. Can anyone suggest other libraries with which I could do this?

1 Answer


Either I'm mistaken or you are going into too much detail. As far as I know, for UHD HDR decoding the process is the same as for SDR, plus you should send the metadata to the TV/monitor.

In the case of a PC, I believe you should check the UHD Metadata and NVAPI Functions sections of the document. The FP16 stuff probably comes from OpenGL or DirectX and is probably about games and rendering, so you should not use those formats (only 10+ bit YUV) unless you are reinventing the wheel.
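If it helps, here is roughly what I mean by sending the metadata, using NVAPI's HDR color control call. This is only a sketch from memory of nvapi.h; please verify the field names, units, and the ordering of the primary indices against the header, and replace the placeholder mastering values with real ones:

```c
/* Sketch: switch the connected display into HDR10 mode via NVAPI and pass
   static metadata (SMPTE ST 2086 style values). Field names are from my
   recollection of nvapi.h; numeric values are placeholders only.
   NvAPI_Initialize() must already have been called, and displayId obtained
   through the NVAPI display enumeration. */
#include "nvapi.h"

NvAPI_Status enable_hdr(NvU32 displayId)
{
    NV_HDR_COLOR_DATA hdr = {0};

    hdr.version = NV_HDR_COLOR_DATA_VER;
    hdr.cmd     = NV_HDR_CMD_SET;
    hdr.hdrMode = NV_HDR_MODE_UHDA;                  /* HDR10: BT.2020 + ST 2084 */
    hdr.static_metadata_descriptor_id = NV_STATIC_METADATA_TYPE_1;

    /* Chromaticities in units of 0.00002 (value * 50000), BT.2020 primaries
       and D65 white point; I am not certain of the index-to-color ordering. */
    hdr.mastering_display_data.displayPrimary_x0 = (NvU16)(0.708  * 50000); /* R? */
    hdr.mastering_display_data.displayPrimary_y0 = (NvU16)(0.292  * 50000);
    hdr.mastering_display_data.displayPrimary_x1 = (NvU16)(0.170  * 50000); /* G? */
    hdr.mastering_display_data.displayPrimary_y1 = (NvU16)(0.797  * 50000);
    hdr.mastering_display_data.displayPrimary_x2 = (NvU16)(0.131  * 50000); /* B? */
    hdr.mastering_display_data.displayPrimary_y2 = (NvU16)(0.046  * 50000);
    hdr.mastering_display_data.displayWhitePoint_x = (NvU16)(0.3127 * 50000);
    hdr.mastering_display_data.displayWhitePoint_y = (NvU16)(0.3290 * 50000);
    hdr.mastering_display_data.max_display_mastering_luminance = 1000; /* cd/m2 */
    hdr.mastering_display_data.min_display_mastering_luminance = 10;   /* units per HDR10 infoframe convention (assumption) */
    hdr.mastering_display_data.max_content_light_level         = 1000;
    hdr.mastering_display_data.max_frame_average_light_level   = 400;

    return NvAPI_Disp_HdrColorControl(displayId, &hdr);
}
```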

Hope that helps.

the kamilz
  • Thank you for the answer, kamilz. I connect the HDR TV to my PC via HDMI. I enable HDR on the TV using an NVAPI function with metadata parameters. Then, if I open my source 10-bit HDR video in the VLC player on the HDR TV display, I see the video in HDR. I want to make my own player that first does the same. I can read video frames and decode them to UC16 or FP32, and then cast them to FP16, but I don't understand which color matrix / transfer function I should apply to these frames to get the scRGB color space. – Виталий Синявский Mar 05 '18 at 13:52
  • OK, I'm assuming you already checked the VLC source; did you also check this project: https://github.com/mpv-player/mpv/blob/1f2d8ed01cfc85fb910f21e9a7290265d0dcf11c/video/out/gpu/video_shaders.c ? – the kamilz Mar 06 '18 at 08:47
  • Yes, I checked them, but didn't understand how to achieve scRGB color space. – Виталий Синявский Mar 12 '18 at 13:46