I want to create a simple video player that will show HDR video on an HDR TV; for example, the "LG Chess HDR" demo video. It is encoded with HEVC at a bit depth of 10 bits, its pixel format is YUV420P10LE, and it carries metadata about the BT.2020 color space and the PQ transfer function.
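This is how I check that metadata now, as a minimal sketch using libavformat (the enum values I expect for this kind of source are in the comment; the program itself is just an illustration, not my real player code):

```c
#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavutil/pixdesc.h>

int main(int argc, char **argv)
{
    AVFormatContext *fmt = NULL;

    if (argc < 2)
        return 1;
    if (avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
        return 1;
    if (avformat_find_stream_info(fmt, NULL) < 0)
        return 1;

    int v = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    if (v < 0)
        return 1;
    AVCodecParameters *par = fmt->streams[v]->codecpar;

    // For a source like this I expect:
    //   color_primaries == AVCOL_PRI_BT2020
    //   color_trc       == AVCOL_TRC_SMPTE2084   (PQ)
    //   color_space     == AVCOL_SPC_BT2020_NCL
    printf("primaries=%s trc=%s space=%s\n",
           av_color_primaries_name(par->color_primaries),
           av_color_transfer_name(par->color_trc),
           av_color_space_name(par->color_space));

    avformat_close_input(&fmt);
    return 0;
}
```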
In this NVIDIA article I found the following:
The display driver takes the scRGB back buffer, and converts it to the standard expected by the display presently connected. In general, this means converting the color space from sRGB primaries to BT. 2020 primaries, scaling to an appropriate level, and encoding with a mechanism like PQ. Also, possibly performing conversions like RGB to YCC if that display connection requires it.
This means that my player should render pixels in the scRGB color space (linear encoding, sRGB primaries, full range from -0.5 to just under +7.5). So I need to somehow get frames from the source video in this color space, preferably in an FP16 pixel format (half float, 16 bits per color channel). I came up with the following simple pipeline for rendering HDR video (a sketch of the per-pixel math it implies follows below):
source HDR video (BT.2020 color space, PQ applied) -> [some video library] ->
-> video frames with colors in the scRGB color space -> [my program] ->
-> rendered video on the HDR TV, with the display driver applying the final conversions
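For clarity, this is my understanding of the per-pixel math that the middle step implies, written out as a sketch. It is not code I have working: the PQ constants are from SMPTE ST 2084, the primaries matrix is the usual BT.2020-to-BT.709 conversion (as in ITU-R BT.2087), and scaling by scRGB's 80 cd/m² reference white is my assumption. The YCbCr-to-R'G'B' step (BT.2020 non-constant luminance matrix) would have to happen before this:

```c
#include <math.h>
#include <stdio.h>

// SMPTE ST 2084 (PQ) EOTF: non-linear signal in [0,1] -> luminance in cd/m^2.
static double pq_eotf(double e)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;

    double p   = pow(e, 1.0 / m2);
    double num = fmax(p - c1, 0.0);
    double den = c2 - c3 * p;
    return 10000.0 * pow(num / den, 1.0 / m1);
}

// One linear-light pixel: BT.2020 RGB in cd/m^2 -> scRGB (sRGB primaries,
// linear, 1.0 == 80 cd/m^2). The matrix is the BT.2020 -> BT.709 primaries
// conversion; its rows sum to ~1, so neutral gray stays neutral.
static void bt2020_to_scrgb(const double in[3], double out[3])
{
    static const double M[3][3] = {
        {  1.6605, -0.5876, -0.0728 },
        { -0.1246,  1.1329, -0.0083 },
        { -0.0182, -0.1006,  1.1187 },
    };
    for (int i = 0; i < 3; i++) {
        double v = M[i][0] * in[0] + M[i][1] * in[1] + M[i][2] * in[2];
        out[i] = v / 80.0; // scRGB reference white is 80 cd/m^2
    }
}

int main(void)
{
    // A PQ signal of about 0.508 corresponds to 100 cd/m^2 (SDR white),
    // so a neutral pixel should come out near 100/80 = 1.25 per channel.
    double rgb2020[3], scrgb[3];
    for (int i = 0; i < 3; i++)
        rgb2020[i] = pq_eotf(0.508);
    bt2020_to_scrgb(rgb2020, scrgb);
    printf("%f %f %f\n", scrgb[0], scrgb[1], scrgb[2]);
    return 0;
}
```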
I'm trying to use FFmpeg as this video library, but I do not understand how to get frames from the source HDR video in the scRGB color space.
I currently use FFmpeg's sws_scale() to get frames, and I know about the filters API. But I have not found any information on how to transparently get frames in scRGB with this functionality, without parsing the metadata of every source video and building a custom filter graph for each one.
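To show where I am, this is roughly my current setup (a sketch; the pixel formats and flags are real FFmpeg identifiers, the function around them is mine). As far as I can tell it only repacks pixels and applies the YCbCr matrix; the samples stay PQ-encoded with BT.2020 primaries, which is exactly the problem:

```c
#include <libswscale/swscale.h>
#include <libavutil/pixfmt.h>

// My current conversion: 10-bit YUV -> 16-bit integer RGB. swscale does not
// undo the PQ transfer function or convert the primaries, so the output is
// still "display-referred" BT.2020/PQ data, just in an RGB layout.
struct SwsContext *make_scaler(int width, int height)
{
    return sws_getContext(
        width, height, AV_PIX_FMT_YUV420P10LE, // decoded HEVC frames
        width, height, AV_PIX_FMT_RGB48LE,     // 16 bits per channel RGB
        SWS_BILINEAR, NULL, NULL, NULL);
}
```

For comparison, the kind of filter chain I imagine would look something like `zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709` (assuming an FFmpeg build with libzimg for the zscale filter), but even if that is right, it is still a hand-written graph per source rather than a transparent conversion to scRGB.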
Please tell me what I can do to get frames in the scRGB color space using FFmpeg. Can anyone suggest other libraries with which I could do this?