
I'm loading textures using CreateWICTextureFromMemoryEx from DirectXTK. Everything I load ends up in an sRGB format. Is there any way to force WIC to create an RGB surface instead?

Or maybe there is a way to convert an already loaded texture from sRGB to RGB? Back in D3DX there used to be a D3DX11_FILTER_SRGB flag for that (from what I understand), but it's now deprecated.

Any help will be much appreciated, thanks!

The Apache
Konstanty

1 Answer


The DirectX Tool Kit loader uses DXGI_FORMAT_*_SRGB when loading WIC images in any of the following cases:

  • The WIC metadata for a PNG file contains the sRGB chunk (/sRGB/RenderingIntent is true)

  • The WIC metadata for a JPG indicates sRGB (/app1/ifd/exif/{ushort=40961} is 1)

  • The WIC metadata for a TIFF indicates sRGB (/ifd/exif/{ushort=40961} is 1)

  • If you pass 'true' as the forceSRGB parameter to the Ex version of the function

So the image is in fact likely in sRGB colorspace. Therefore, the DXGI_FORMAT_*_SRGB indicates that reads from that texture should be subject to de-gamma to get them into a linear colorspace. I'm assuming you are not using gamma-correct rendering here?

Gamma-correct rendering is achieved by using a DXGI_FORMAT_*_SRGB or HDR (10:10:10:2, 16:16:16:16) backbuffer format. You also need to use linear colors for Clear. See DeviceResources, Gamma-correct rendering, The Importance of Being Linear, and Linear-Space Lighting (i.e. Gamma) for details.
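
As a rough illustration (not part of the original answer; `device`, `swapChain`, `context` and `rtv` are placeholders for your own objects, and the DeviceResources helper linked above already does this for you): the usual D3D11 pattern is to keep the swap chain buffer in a non-sRGB format, create the render target view with the matching *_SRGB format, and clear with a linear-space color.

```cpp
// Sketch only, assuming D3D11. Error checking omitted.
#include <d3d11.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CreateGammaCorrectRTV(ID3D11Device* device,
                           IDXGISwapChain* swapChain,
                           ID3D11DeviceContext* context,
                           ComPtr<ID3D11RenderTargetView>& rtv)
{
    // The swap chain buffer itself stays a non-sRGB format such as
    // DXGI_FORMAT_B8G8R8A8_UNORM (flip-model swap effects don't accept
    // *_SRGB backbuffer formats, so the conversion lives in the view).
    ComPtr<ID3D11Texture2D> backBuffer;
    swapChain->GetBuffer(0, __uuidof(ID3D11Texture2D),
                         reinterpret_cast<void**>(backBuffer.GetAddressOf()));

    // View the backbuffer as *_SRGB so pixel shader writes are gamma-encoded.
    CD3D11_RENDER_TARGET_VIEW_DESC rtvDesc(D3D11_RTV_DIMENSION_TEXTURE2D,
                                           DXGI_FORMAT_B8G8R8A8_UNORM_SRGB);
    device->CreateRenderTargetView(backBuffer.Get(), &rtvDesc,
                                   rtv.ReleaseAndGetAddressOf());

    // Clear colors are specified in linear space.
    const float linearClearColor[4] = { 0.0f, 0.2f, 0.4f, 1.0f };
    context->ClearRenderTargetView(rtv.Get(), linearClearColor);
}
```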

A quick and easy workaround if you control the texture file would be to use texconv in the DirectXTex library to convert the source image to a DDS. You can use various switches like -srgbi or -srgbo to force the SRGB behavior you are after.
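
A rough example invocation (the file name is hypothetical; run texconv with no arguments for the full switch list): -srgbi tells texconv the input data is sRGB, -srgbo requests sRGB output handling, and -f selects the DXGI format of the resulting DDS.

```
texconv -f R8G8B8A8_UNORM -srgbi mytexture.png
```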

Note that I'm also adding an option to let you ignore the sRGB metadata when using WICTextureLoader for a future release of DirectX Tool Kit. Linear rendering is best, but sometimes it's nice to have the option to avoid the DXGI_FORMAT_*_SRGB format being used.

UPDATE: More recent versions of WICTextureLoader in DirectX Tool Kit have the following option flags, which help the loader determine the right choice for your scenario (a usage sketch follows the list):

  • WIC_LOADER_FORCE_SRGB Will always return an *_SRGB format if one exists for the format.

  • WIC_LOADER_IGNORE_SRGB Will have the loader ignore the WIC colorspace metadata above if present.

  • Normally, if there's no WIC metadata, the loader assumes the image is linear (i.e. not sRGB). If you provide WIC_LOADER_SRGB_DEFAULT, it treats missing metadata as meaning *_SRGB instead.
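
As a usage sketch (not from the original answer; check WICTextureLoader.h in your DirectX Tool Kit release for the exact signature, which changed from the older bool forceSRGB parameter to a flags parameter), loading from memory while ignoring the sRGB metadata might look like this:

```cpp
// Sketch, assuming a recent DirectX Tool Kit for DX11. Loads an image from
// memory and asks the loader to ignore any WIC sRGB metadata so a plain
// UNORM (non-sRGB) DXGI format is used.
#include <cstdint>
#include <d3d11.h>
#include <wrl/client.h>
#include "WICTextureLoader.h"

using Microsoft::WRL::ComPtr;

HRESULT LoadTextureIgnoringSRGB(ID3D11Device* device,
                                const uint8_t* data, size_t dataSize,
                                ComPtr<ID3D11ShaderResourceView>& srv)
{
    ComPtr<ID3D11Resource> resource;
    return DirectX::CreateWICTextureFromMemoryEx(
        device, data, dataSize,
        0,                                  // maxsize: 0 = use device limits
        D3D11_USAGE_DEFAULT,
        D3D11_BIND_SHADER_RESOURCE,
        0,                                  // cpuAccessFlags
        0,                                  // miscFlags
        DirectX::WIC_LOADER_IGNORE_SRGB,    // or WIC_LOADER_FORCE_SRGB / WIC_LOADER_SRGB_DEFAULT
        resource.GetAddressOf(),
        srv.ReleaseAndGetAddressOf());
}
```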

Chuck Walbourn
  • I'm a bit starstruck at the moment, as I was just reading through your dx11 samples! Thank you for taking the time to reply! I'm porting a dx9 renderer to dx11 and I ran into sRGB issues. My backbuffer and textures are sRGB. All is fine until I start multiplying the texture with the vertex color. I can't find info - does the driver convert vertex colors to sRGB? Is the alpha channel affected? I get results that look correct only if I convert the vertex color (along with the alpha channel) to sRGB in the shader, then multiply with the sampled color and return the result. This doesn't seem right... – Konstanty Sep 22 '16 at 20:47
  • I guess my question should read "If my texture is sRGB and backbuffer is sRGB how do I multiply color sampled from a texture with the color passed down in the vertex in order to get correct alpha blending" since that was the root of the original problem. I was trying to move the whole rendering to linear space, which I am familiar with. – Konstanty Sep 22 '16 at 20:52
  • When you sample from a texture with an sRGB format, the sampler should de-gamma it so that your shader works entirely in linear color space. You then write your linear color values and the sRGB render target view applies gamma to get it back into the sRGB color space for display. The vertex (and clear) colors would then be specified in linear color space. – Chuck Walbourn Sep 22 '16 at 21:02
  • BTW, I've been debating changing the ``bool forceSRGB = false`` to a ``DWORD flags = 0`` with ``TEXTURE_FORCE_SRGB`` and a ``TEXTURE_IGNORE_SRGB`` for a while, as the sRGB PNG behavior is really confusing to folks just doing 2D stuff with ``SpriteBatch``. I just have to figure out how to do it without breaking a lot of existing code. – Chuck Walbourn Sep 22 '16 at 21:03
  • Thanks for clearing that up. We investigated further and we think we have pinpointed the problem. Our editor saves PNGs using GDI+ with rendering intent: perceptual, sRGB color space and gamma 0.45. Photoshop and Corel open the file correctly. However, the WIC loader seemingly ignores the gamma value and, seeing a rendering intent byte value of 1, assumes the gamma value is 2.2. This results in sampled texels being way too dark. Here on the left you can see the header of a PNG file saved with GDI+ and on the right with Corel (that one loads correctly with WIC) i.imgur.com/rtAta13.png – Konstanty Sep 23 '16 at 12:47
  • One more thing - when I set the backbuffer to _SRGB I have to manually apply gamma on the output color in the pixel shader. Is there any flag or render state I should set so it would get converted automatically? – Konstanty Sep 23 '16 at 12:57
  • The WIC codec returns the sRGB chunk metadata, but not the value. sRGB is assumed to be 2.2 which is why I set the DXGI format to ``DXGI_FORMAT_*_SRGB``. There's really not a trivial mechanism for using other gamma values. Check that you've set up the backbuffer and RTV correctly. See [DeviceResources](https://raw.githubusercontent.com/walbourn/directx-vs-templates/master/d3d11game_win32_dr/DeviceResources.cpp). – Chuck Walbourn Sep 23 '16 at 15:34
  • Addendum: It also comes out as sRGB when loading 8bpp color palette indexed bitmaps. I suppose sRGB is the default unless the loader is *really sure* that it's linear-space. Which makes sense I guess. – dialer Oct 06 '21 at 08:13