I'm trying to use the DepthBias property of the rasterizer state in DirectX 11 (D3D11_RASTERIZER_DESC) to reduce the z-fighting that occurs when I render in wireframe mode over solid polygons (a wireframe overlay), but setting it to any value doesn't seem to change the result at all. I also noticed something strange: the value is declared as an INT rather than a FLOAT, which doesn't make sense to me, and it still doesn't work the way I expect. How do we properly set that value if it is an INT that needs to be interpreted as a UNORM in the shader pipeline?
Here's what I do (rough code sketch below the list):
- Render all geometry
- Set the rasterizer to render in wireframe
- Render all geometry again
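In code, the flow looks roughly like this (DrawScene, m_context, solidRS and wireframeRS are placeholder names for my own draw calls and state objects, not the exact code):

// Pass 1: normal solid render
m_context->RSSetState(solidRS);        // FillMode = D3D11_FILL_SOLID
DrawScene();                           // placeholder for my draw calls

// Pass 2: wireframe overlay over the same geometry
m_context->RSSetState(wireframeRS);    // FillMode = D3D11_FILL_WIREFRAME (+ DepthBias?)
DrawScene();                           // same geometry, same depth buffer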
I can clearly see the wireframe overlay, but the z-fighting is horrible. I tried setting DepthBias to a lot of different values, such as 0.000001, 0.1, 1, 10, 1000 and their negative equivalents, and still see no difference. Obviously I'm aware that when the float is cast to an integer, all the decimals get cut off... meh?
D3D11_RASTERIZER_DESC RasterizerDesc;
ZeroMemory(&RasterizerDesc, sizeof(RasterizerDesc));
RasterizerDesc.FillMode = D3D11_FILL_WIREFRAME;
RasterizerDesc.CullMode = D3D11_CULL_BACK;
RasterizerDesc.FrontCounterClockwise = FALSE;
RasterizerDesc.DepthBias = ???  // <-- what value actually has an effect here?
RasterizerDesc.SlopeScaledDepthBias = 0.0f;
RasterizerDesc.DepthBiasClamp = 0.0f;
RasterizerDesc.DepthClipEnable = TRUE;
RasterizerDesc.ScissorEnable = FALSE;
RasterizerDesc.MultisampleEnable = FALSE;
RasterizerDesc.AntialiasedLineEnable = FALSE;
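Then I create and bind the state like this (m_device, m_context and solidRS are placeholder names for my actual device/context pointers and solid-fill state, not copied from my code):

ID3D11RasterizerState* wireframeRS = nullptr;
HRESULT hr = m_device->CreateRasterizerState(&RasterizerDesc, &wireframeRS);
if (FAILED(hr))
{
    // handle error
}

m_context->RSSetState(wireframeRS);   // bind before the wireframe pass
// ... draw all geometry again ...
m_context->RSSetState(solidRS);       // restore the solid-fill state afterwards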
Has anyone figured out how to set the DepthBias properly? Or is it perhaps a bug in DirectX (which I doubt)? Or maybe there's a better way to achieve this than using DepthBias?
Thank you!