
I'm doing shadow mapping for directional lights, and the standard depth bias matrix doesn't give the same results on different hardware.

After some debugging, I'm pretty sure that some GPUs write normalized values [0, 1] to the depth buffer and some write orthographic projection values [-1, 1].

How do I force all GPUs to write values in the same interval? Is there an OpenGL command for that?

1 Answer


After some debugging, I'm pretty sure that some GPUs write normalized values [0, 1] to the depth buffer and some write orthographic projection values [-1, 1].

How did you infer that? It is just wrong.

How the z-values are transformed is well-defined in the GL spec. They are converted from normalized device coordinates (what you called "orthographic projection values"), which are in [-1, 1], to window space, using the range set via glDepthRange (which is [0, 1] if you don't change it). If you have an integer depth buffer (which is the typical case), the values are also converted to fixed-point integers, using the full range of the available bits (typically 24).
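
To make that concrete, here is a minimal sketch of the mapping the spec defines (plain C; the function names ndc_to_window_depth and window_depth_to_fixed24 are only illustrative, not part of any API):

    #include <math.h>

    /* Window-space depth as defined by the GL spec:
     * z_ndc in [-1, 1] is mapped into the range [n, f] set by glDepthRange(n, f).
     * With the default glDepthRange(0.0, 1.0) this is simply (z_ndc + 1) / 2. */
    static double ndc_to_window_depth(double z_ndc, double range_near, double range_far)
    {
        return range_near + (range_far - range_near) * (z_ndc + 1.0) * 0.5;
    }

    /* For a 24-bit fixed-point depth buffer the window-space value is then
     * quantized over the full range of the available bits. */
    static unsigned int window_depth_to_fixed24(double z_window)
    {
        return (unsigned int)floor(z_window * (double)((1u << 24) - 1u) + 0.5);
    }

Nothing in that chain is vendor-specific; only changing the glDepthRange values changes where the [-1, 1] range ends up.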

derhass
  • Sorry for the incorrect naming of things. Why then do I have to apply a transformation from [-1,1] -> [0,1] to the z component of shadowcoord on some cards and not on others? shadowcoord is the shadow map sampling variable –  Jun 26 '14 at 21:04
  • @Aloalo: it is not "on some cards, and not on other cards". All conforming GL implementations handle this the same way. You might have found some driver bug, but I highly doubt that. I think the bug is somewhere in your code. – derhass Jun 26 '14 at 21:43
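
For reference, the "standard depth bias matrix" from the question is not hardware-dependent either: it is just a fixed scale-by-0.5, translate-by-0.5 that remaps x, y and z from [-1, 1] to [0, 1], so the result can be used directly as shadow map texture coordinates and compare value. A minimal sketch in C (assuming column-major storage, as glUniformMatrix4fv expects with transpose set to GL_FALSE; the variable name is only illustrative):

    /* Scale by 0.5 and translate by 0.5 on x, y and z: maps [-1, 1] to [0, 1].
     * Column-major layout, so the translation lives in the last column. */
    static const float shadow_bias_matrix[16] = {
        0.5f, 0.0f, 0.0f, 0.0f,  /* column 0 */
        0.0f, 0.5f, 0.0f, 0.0f,  /* column 1 */
        0.0f, 0.0f, 0.5f, 0.0f,  /* column 2 */
        0.5f, 0.5f, 0.5f, 1.0f   /* column 3: translation */
    };

Typically this is multiplied on the left of the light's projection * view matrix, so shadowcoord already arrives in [0, 1] on every conforming implementation.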