I am trying to implement my own depth-stencil-like functionality in a pixel shader.
I have a normalised depth value, and I want to convert it to a uint so that I can use the SM5 InterlockedMin() operation, like so:
uint d_uint = (uint)(dnorm * 4294967295);
uint d_uint_original = 0;
InterlockedMin(depthmask[c], d_uint, d_uint_original);
if (d_uint < d_uint_original)
{
    // If true, we will have written a new depth into the mask,
    // so write the element to the field.
    field[c].x = dnorm;
    field[c].yzw = i.n;
}
Here dnorm is the normalised depth value.
However, (uint)(dnorm * 4294967295) only ever evaluates to 0.
I know this because I can inspect the buffer in RenderDoc and see that the shader only ever writes 0. If I reduce the constant to, say, 229496729, the value is written correctly; if I set it to 2294967295, it is not.
I know I am tripping over a floating-point quantisation issue, but I don't know exactly how.
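For what it's worth, here is the kind of minimal check I have been using to poke at the conversion (debugOut and TestEncode are just placeholder names for a UAV and entry point in my test setup; the comments are only my suspicion about the rounding):

RWStructuredBuffer<uint> debugOut : register(u0); // placeholder UAV so I can inspect values in RenderDoc

[numthreads(1, 1, 1)]
void TestEncode(uint3 id : SV_DispatchThreadID)
{
    float dnorm = 1.0f; // worst case: depth at the far plane
    // 4294967295 (2^32 - 1) is not exactly representable as a 32-bit float and,
    // as far as I can tell, rounds up to 4294967296.0 (2^32), so the product can
    // land outside the uint range before the cast.
    debugOut[0] = (uint)(dnorm * 4294967295.0); // this is the path that only ever gives me 0
    // 4294967040.0 = 2^32 - 256, the largest float strictly below 2^32 --
    // but I am not sure whether relying on that is the right approach.
    debugOut[1] = (uint)(dnorm * 4294967040.0);
}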
What is the correct way to encode a normalised value in an integer array?
I am aware of DX's UNORM/SNORM formats (and maybe I just need a break), but it's not clear from the documentation how they should be used here, specifically how to convert one to a UINT so that it can be used with the interlocked functions.
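To be concrete, what I am after is something along these lines (EncodeDepth is just a name I have made up for the mapping I want, not an existing intrinsic, and the body is only a guess):

// Hypothetical helper: map dnorm in [0, 1] to a uint such that ordering is
// preserved, so that InterlockedMin on the encoded value behaves like a depth test.
uint EncodeDepth(float dnorm)
{
    // Guess: scale by a constant that survives the float -> uint cast.
    // 4294967040.0 = 2^32 - 256 is the largest float below 2^32, but I don't
    // know whether this is the idiomatic way to do it.
    return (uint)(saturate(dnorm) * 4294967040.0);
}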