
UPDATE:

Thank you all very much for your answers. As Jesse Hall suggested, it looks like it is a driver (or hardware) problem. I tried the same app on other configurations and it worked as expected.

I tested the app on other computers which share the same GPU (ATI 4800 HD) but have different versions of the driver, and they all showed the same erroneous behavior (what seems to be a double gamma correction on write). On these computers, I have to set D3DRS_SRGBWRITEENABLE to FALSE to fix the display. Does anyone know if this is a known bug on this hardware?

Even more strange is that I get the same end results with these two configurations:

  • D3DRS_SRGBWRITEENABLE = FALSE and D3DSAMP_SRGBTEXTURE = TRUE
  • D3DRS_SRGBWRITEENABLE = FALSE and D3DSAMP_SRGBTEXTURE = FALSE

In the pixel debugger, I see that linearization is applied properly in case 1, but an (automatic) correction on write then gives the same final output as case 2 (which performs no conversion at all)...
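For reference, here is the arithmetic as I understand it, using pow(x, 2.2) and pow(x, 1.0/2.2) as rough stand-ins for the exact sRGB decode/encode (an illustrative sketch only; the encode in case 1 is the unexpected write-time correction described above):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double texel = 0.73; // grey texel as stored in the texture (sRGB-encoded)

    // Case 1: D3DSAMP_SRGBTEXTURE = TRUE linearizes the fetch (to roughly the 0.49 I see
    // in the debugger), but the unexpected correction on write re-encodes it back to ~0.73.
    const double case1 = std::pow(std::pow(texel, 2.2), 1.0 / 2.2);

    // Case 2: no conversion on read and none on write; the texel passes through unchanged.
    const double case2 = texel;

    std::printf("case 1: %.3f, case 2: %.3f\n", case1, case2); // both ~0.73
    return 0;
}
```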

// -- END OF UPDATE

I'm having some trouble fixing the gamma correction of a DirectX9 application.

When I enable texture linearization in the samplers (D3DSAMP_SRGBTEXTURE) and sRGB write for output (D3DRS_SRGBWRITEENABLE), it looks like gamma correction is applied twice.
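For reference, this is roughly how I enable the two states (a minimal sketch; device stands for my IDirect3DDevice9* and the texture is bound to sampler stage 0):

```cpp
// Decode the texture from sRGB to linear when it is sampled in the pixel shader:
device->SetSamplerState(0, D3DSAMP_SRGBTEXTURE, TRUE);

// Re-encode the pixel shader output from linear back to sRGB when writing to the render target:
device->SetRenderState(D3DRS_SRGBWRITEENABLE, TRUE);
```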

Here is my setup. I used the following texture (from here) to draw a fullscreen quad:

[image: gamma test texture]

The results were visually too bright on the right side of the picture. I used PIX to debug one of those grey pixels and, if everything were set up properly, I would have expected an output value of 0.73 (= 0.5^(1.0/2.2)). Unfortunately, the output of the pixel shader was 0.871 (which looks like it could be gamma correction applied twice?). I stepped inside the pixel shader with the debugger and the texture fetch returned a value of (0.491, 0.491, 0.491), which should mean linearization on read worked properly.

[PIX capture: pixel shader output with D3DRS_SRGBWRITEENABLE and D3DSAMP_SRGBTEXTURE enabled]

When I disable D3DRS_SRGBWRITEENABLE, the output of the pixel shader is 0.729 which looks much more correct to me.

[PIX captures: pixel shader output with D3DRS_SRGBWRITEENABLE disabled]
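To tie the numbers together, here is a quick sketch of the arithmetic, using pow(x, 1.0/2.2) as a rough approximation of the exact sRGB encode:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double fetched = 0.491;                         // linearized texture fetch seen in the debugger
    const double once    = std::pow(fetched, 1.0 / 2.2);  // ~0.72: a single encode, what I expected on write
    const double twice   = std::pow(once, 1.0 / 2.2);     // ~0.86: an encode applied twice, close to the 0.871 I get
    std::printf("once: %.3f, twice: %.3f\n", once, twice);
    return 0;
}
```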

Any idea where this conversion comes from (in the debugger, the pixel shader output was 0.491)? What other flags/render states should I check?

Thank you very much for your help!

Ozirus
    Sounds like a driver bug. Have you checked on hardware from a different vendor? If you have a solid texture, read it as sRGB and then write it out unmodified as sRGB, you should get the original value (except maybe one or two least-significant-bits due to precision loss during conversions). – Jesse Hall Jun 30 '11 at 18:47
  • Thank you very much! It effectively looks like a driver problem. I updated the question to give more feedback about the problem because I do not completely understand it yet ;-) – Ozirus Jul 07 '11 at 17:39

2 Answers


One possibility is that you are applying the linear-to-gamma transform twice: once when writing to the render target with D3DRS_SRGBWRITEENABLE, and a second time when presenting the frame buffer with D3DPRESENT_LINEAR_CONTENT (if you have specified that flag). You do not need D3DPRESENT_LINEAR_CONTENT, since you already transformed back to sRGB space with D3DRS_SRGBWRITEENABLE.
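For example, if you present through IDirect3DDevice9Ex::PresentEx (deviceEx here is an assumed IDirect3DDevice9Ex*; the plain IDirect3DDevice9::Present has no flags parameter), the back buffer already holds sRGB-encoded data and no extra flag is needed:

```cpp
// The render target already holds sRGB-encoded data thanks to D3DRS_SRGBWRITEENABLE,
// so present it as-is:
deviceEx->PresentEx(NULL, NULL, NULL, NULL, 0);

// Passing D3DPRESENT_LINEAR_CONTENT instead would request another linear-to-gamma
// conversion at present time, i.e. the second transform described above:
// deviceEx->PresentEx(NULL, NULL, NULL, NULL, D3DPRESENT_LINEAR_CONTENT);
```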

Another possibility is that your graphics hardware is filtering the texture before converting to linear space for your pixel shader. You can test for this by disabling D3DSAMP_SRGBTEXTURE and filtering, then doing the conversion to linear space and the filtering in the pixel shader, or simply by drawing the texture large enough that filtering is not an issue. A good article on gamma correction, which also mentions that GeForce 8 and later cards correctly convert to linear space before filtering, can be found here:

The Importance of Being Linear
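A rough sketch of that test (device is an assumed IDirect3DDevice9* and the texture sits in sampler stage 0); the sRGB-to-linear conversion then has to be done manually in the pixel shader, e.g. with pow(colour.rgb, 2.2) as an approximation of the sRGB curve:

```cpp
// Turn off the automatic sRGB decode and all filtering for the test:
device->SetSamplerState(0, D3DSAMP_SRGBTEXTURE, FALSE);
device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_POINT);
device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_POINT);
device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);
```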

If you are not using D3DPRESENT_LINEAR_CONTENT, then my next guess is that your graphics card does not support the gamma transformations you are doing. Check the device capabilities programmatically or with a tool like the DirectX Caps Viewer Tool:

DirectX Caps Viewer Tool
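Programmatically, the sRGB read/write capabilities can be queried per format with IDirect3D9::CheckDeviceFormat; a sketch (d3d9 is an assumed IDirect3D9* and the formats are just examples):

```cpp
// Can this texture format be sampled with sRGB-to-linear conversion?
HRESULT srgbRead = d3d9->CheckDeviceFormat(
    D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
    D3DUSAGE_QUERY_SRGBREAD, D3DRTYPE_TEXTURE, D3DFMT_A8R8G8B8);

// Can this render target format be written with linear-to-sRGB conversion?
HRESULT srgbWrite = d3d9->CheckDeviceFormat(
    D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
    D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_SRGBWRITE, D3DRTYPE_SURFACE, D3DFMT_X8R8G8B8);

// SUCCEEDED(srgbRead) / SUCCEEDED(srgbWrite) indicate support.
```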

dschaeffer

This is related to the question but not exactly an answer; still, it is worth knowing!

I have run across a bug in the D3D9 runtime on Vista/Win7. This runtime is an emulation layer of sorts written on top of D3D10. In D3D10 the sRGB state is a property of the texture format, while in D3D9 it is a sampler-based render state. When you set a D3D9 texture, the sRGB state is lost (it is always reset to 'off'), probably because the texture's format is being used and D3D9 doesn't have an sRGB texture format. This means that the D3DSAMP_SRGBTEXTURE state needs to be set after the texture is bound in order to take effect correctly.
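A sketch of that workaround (device and texture are assumed to be a valid IDirect3DDevice9* and IDirect3DTexture9*):

```cpp
// Bind the texture first...
device->SetTexture(0, texture);
// ...then (re)apply the sRGB sampler state so the runtime does not lose it:
device->SetSamplerState(0, D3DSAMP_SRGBTEXTURE, TRUE);
```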

Zoner
  • Interesting to know... Do you have any URL with more information about this? – Ozirus Jul 08 '11 at 07:58
  • Weird but correct, at least for me. After I changed my code to set the sampler states after binding the texture, the rendering result is right. As to the emulation layer, I had heard of it but didn't realize it could be such a d*ck. WTH, I'll move to D3D12 anyway. – crazii Oct 12 '16 at 03:28