
I am using SSAO very nearly as per John Chapman's tutorial here; in fact, I'm using Sascha Willems' Vulkan example. One difference is that the fragment position is saved directly to a G-Buffer along with linear depth (so there are x, y, z, and w coordinates, w being the linear depth, calculated in the G-Buffer shader). Depth is calculated like this:

float linearDepth(float depth)
{   
    return (2.0f * ubo.nearPlane * ubo.farPlane) / (ubo.farPlane + ubo.nearPlane - depth * (ubo.farPlane - ubo.nearPlane)); 
}
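For context, here is a minimal sketch of how the G-Buffer fragment shader might pack the (assumed view-space) position and linear depth into a single RGBA attachment. The input/output names (inViewPos, outPositionDepth) and the binding index are assumptions for illustration, not taken from my actual shader, and the real G-Buffer pass would also write normals and albedo to further attachments:

#version 450

layout (location = 0) in vec3 inViewPos;          // assumed: view-space position from the vertex stage
layout (location = 0) out vec4 outPositionDepth;  // assumed G-Buffer attachment name

layout (binding = 0) uniform UBO
{
    float nearPlane;
    float farPlane;
} ubo;

float linearDepth(float depth)
{
    return (2.0f * ubo.nearPlane * ubo.farPlane) / (ubo.farPlane + ubo.nearPlane - depth * (ubo.farPlane - ubo.nearPlane));
}

void main()
{
    // Pack view-space position in .xyz and linearised depth in .w,
    // so the SSAO pass can read both with a single texture fetch.
    outPositionDepth = vec4(inViewPos, linearDepth(gl_FragCoord.z));
}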

My scene typically consists of a large, flat floor with a model in the centre. By large I mean a lot bigger than the far clip distance.

At high depth values (i.e. towards the horizon in my example), the SSAO pass generates occlusion where there should really be none - there's nothing out there except a completely flat surface. Along with that spurious occlusion there is also some banding.

Any ideas for how to prevent these occlusions occurring?


1 Answer


I found the following solution while I was writing the question; it only works because I have a flat floor.

I look up the normal at each kernel sample position and compare it to the current fragment's normal, discarding any sample whose dot product with it is close to 1. This means flat planes can't self-occlude.

Any comments on why I shouldn't do this, or better alternatives, would be very welcome! It works for my current situation but if I happened to have non-flat geometry on the floor I'd be looking for a different solution.

vec3 normal = normalize(texture(samplerNormal, newUV).rgb * 2.0 - 1.0);
<snip>
for (int i = 0; i < SSAO_KERNEL_SIZE; i++)
{
    <snip>
    float sampleDepth = -texture(samplerPositionDepth, offset.xy).w;

    // Reject samples that lie on (nearly) the same plane as the current
    // fragment, so a flat floor cannot occlude itself.
    vec3 sampleNormal = normalize(texture(samplerNormal, offset.xy).rgb * 2.0 - 1.0);
    if (dot(sampleNormal, normal) > 0.99)
        continue;
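For completeness, here is a self-contained sketch (not my exact shader) of where that normal check could sit inside a full SSAO pass, loosely following the structure of Sascha Willems' SSAO example. All binding indices, names (samplerNoise, uboSSAOKernel, ubo.projection) and constants are assumptions:

#version 450

#define SSAO_KERNEL_SIZE 64
#define SSAO_RADIUS 0.5

layout (binding = 0) uniform sampler2D samplerPositionDepth; // view-space position + linear depth
layout (binding = 1) uniform sampler2D samplerNormal;        // normals packed into 0..1
layout (binding = 2) uniform sampler2D samplerNoise;         // assumed: random rotation vectors

layout (binding = 3) uniform UBOSSAOKernel
{
    vec4 samples[SSAO_KERNEL_SIZE];
} uboSSAOKernel;

layout (binding = 4) uniform UBO
{
    mat4 projection;
} ubo;

layout (location = 0) in vec2 inUV;
layout (location = 0) out float outFragColor;

void main()
{
    vec3 fragPos = texture(samplerPositionDepth, inUV).rgb;
    vec3 normal  = normalize(texture(samplerNormal, inUV).rgb * 2.0 - 1.0);

    // Build a TBN matrix from a random rotation vector (assumed noise texture)
    ivec2 texDim   = textureSize(samplerPositionDepth, 0);
    ivec2 noiseDim = textureSize(samplerNoise, 0);
    vec2 noiseUV   = vec2(float(texDim.x) / float(noiseDim.x), float(texDim.y) / float(noiseDim.y)) * inUV;
    vec3 randomVec = texture(samplerNoise, noiseUV).xyz * 2.0 - 1.0;
    vec3 tangent   = normalize(randomVec - normal * dot(randomVec, normal));
    vec3 bitangent = cross(tangent, normal);
    mat3 TBN       = mat3(tangent, bitangent, normal);

    float occlusion = 0.0;
    for (int i = 0; i < SSAO_KERNEL_SIZE; i++)
    {
        // Kernel sample in view space around the current fragment
        vec3 samplePos = fragPos + (TBN * uboSSAOKernel.samples[i].xyz) * SSAO_RADIUS;

        // Project the sample to screen space to look up the stored depth/normal
        vec4 offset = ubo.projection * vec4(samplePos, 1.0);
        offset.xyz /= offset.w;
        offset.xyz  = offset.xyz * 0.5 + 0.5;

        float sampleDepth = -texture(samplerPositionDepth, offset.xy).w;

        // Normal rejection: skip samples on the same plane as the fragment,
        // so the flat floor cannot self-occlude.
        vec3 sampleNormal = normalize(texture(samplerNormal, offset.xy).rgb * 2.0 - 1.0);
        if (dot(sampleNormal, normal) > 0.99)
            continue;

        float rangeCheck = smoothstep(0.0, 1.0, SSAO_RADIUS / abs(fragPos.z - sampleDepth));
        occlusion += (sampleDepth >= samplePos.z ? 1.0 : 0.0) * rangeCheck;
    }

    outFragColor = 1.0 - (occlusion / float(SSAO_KERNEL_SIZE));
}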