
I am trying to implement normals for my height map, but they don't seem to work.

Look at these screenshots (normals on a slope):

[screenshots: normals on a slope]

Note that the pattern occurs along the triangle edges. Why? The vertices are shared (indexed rendering), and each vertex normal is the average of the normals of all triangles that vertex is part of.

The algorithm for the normals looks like this:

    // Compute shader snippet: one invocation per heightmap texel.
    float size = Size;
    int WGidY = int(gl_WorkGroupID.y);
    int WGidX = int(gl_WorkGroupID.x);

    // Load this vertex and its +Y and +X neighbours.
    vec4 tempVertices[3];
    tempVertices[0] = imageLoad(HeightMap, ivec2(WGidX,     WGidY));
    tempVertices[1] = imageLoad(HeightMap, ivec2(WGidX,     WGidY + 1));
    tempVertices[2] = imageLoad(HeightMap, ivec2(WGidX + 1, WGidY));

    // Accumulate the face normal of this triangle into the normal map,
    // to be averaged over all triangles sharing the vertex.
    vec4 LoadedNormal = imageLoad(NormalMap, ivec2(WGidX, WGidY));
    vec4 Normal = vec4(0.0);
    Normal.xyz = cross(tempVertices[0].xyz - tempVertices[1].xyz,
                       tempVertices[0].xyz - tempVertices[2].xyz);
    Normal.w = 1.0;
    imageStore(NormalMap, ivec2(WGidX, WGidY), Normal + LoadedNormal);
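
For the averaging to yield unit-length normals, a second pass then has to renormalize the accumulated sums once every triangle has contributed. A minimal sketch of such a pass, with the image binding and names assumed rather than taken from the posted code:

    #version 430
    // Hypothetical normalization pass (assumed, not part of the posted code):
    // turns the per-texel sums in NormalMap into unit-length averaged normals.
    layout(local_size_x = 1, local_size_y = 1) in;
    layout(rgba32f, binding = 1) uniform image2D NormalMap;

    void main()
    {
        ivec2 texel = ivec2(gl_WorkGroupID.xy);
        vec4 summed = imageLoad(NormalMap, texel);
        // Normalizing the sum gives the average direction of all
        // contributing face normals.
        imageStore(NormalMap, texel, vec4(normalize(summed.xyz), 1.0));
    }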
Kedriik

2 Answers


No need to do averaging like that. You can compute it directly in one step as follows:

    // Central differences: left/right and down/up neighbours of this texel.
    vec3 v[4] = {
        imageLoad(HeightMap, ivec2(WGidX - 1, WGidY)).xyz,
        imageLoad(HeightMap, ivec2(WGidX + 1, WGidY)).xyz,
        imageLoad(HeightMap, ivec2(WGidX, WGidY - 1)).xyz,
        imageLoad(HeightMap, ivec2(WGidX, WGidY + 1)).xyz
    };
    // The two difference vectors span the local tangent plane;
    // their cross product is the surface normal.
    vec3 Normal = normalize(cross(v[1] - v[0], v[3] - v[2]));
    imageStore(NormalMap, ivec2(WGidX, WGidY), vec4(Normal, 1));

Also you don't even need to store the HeightMap mesh explicitly. Instead you can send the same low-resolution quad to the GPU, tessellate it with a tessellation shader, apply the height map to the generated vertices by sampling from a one-channel texture, and compute the normals on-the-fly as above.
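
As a rough illustration of that approach, a tessellation evaluation shader could displace the generated vertices and compute the normals on the fly; the sampler name, height scale, patch layout and matrix below are assumptions, not code from the answer:

    #version 430
    layout(quads, equal_spacing, ccw) in;

    // Assumed inputs: a one-channel height texture, a height scale and a
    // combined model-view-projection matrix.
    uniform sampler2D HeightTex;
    uniform float HeightScale;
    uniform mat4 MVP;

    out vec3 tesNormal;

    // Displace a point of the (assumed) unit quad in the xz-plane by the
    // sampled height.
    vec3 displace(vec2 uv)
    {
        float h = textureLod(HeightTex, uv, 0.0).r * HeightScale;
        return vec3(uv.x, h, uv.y);
    }

    void main()
    {
        vec2 uv = gl_TessCoord.xy;

        // Central differences in texture space give the normal on the fly,
        // just like the four-neighbour scheme above.
        vec2 d = 1.0 / vec2(textureSize(HeightTex, 0));
        vec3 dx = displace(uv + vec2(d.x, 0.0)) - displace(uv - vec2(d.x, 0.0));
        vec3 dz = displace(uv + vec2(0.0, d.y)) - displace(uv - vec2(0.0, d.y));
        tesNormal = normalize(cross(dz, dx));

        gl_Position = MVP * vec4(displace(uv), 1.0);
    }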

Yakov Galka
  • I am using a spherified cube, so I can't use the .z component for the height. – Kedriik Nov 11 '16 at 12:46
  • @Kedriik: I've updated my answer. But it still holds: I would warp the mesh in the vertex-shader rather than storing an explicit mesh. – Yakov Galka Nov 11 '16 at 14:29
  • @Kedriik: Btw, note that you cannot sample with `imageLoad` beyond the bounds of the texture. You should use `texelFetch` for that with a `sampler2D`. – Yakov Galka Nov 11 '16 at 18:43
  • @kedriik: I don't know why you are not satisfied with that image. I personally don't see any artifacts in it. If you see otherwise, can you explain exactly what the problem is? Perhaps draw arrows pointing to those artifacts? – Yakov Galka Nov 16 '16 at 00:52
  • http://prnt.sc/d7xems http://prnt.sc/d7xetj I checked the indexing a few times and I couldn't find a mistake, but it seems that some of the normals are mismatched. – Kedriik Nov 16 '16 at 10:20
  • @Kedriik: ah, those. I wouldn't call it an artifact. But it happens because you sample the normals in the vertex shader, I guess, rather than in the fragment shader. The result is that you interpolate them linearly along the triangle edges rather than bilinearly in texture space. Although you may increase quality by sampling in the fragment shader, at the end of the day you won't gain much, because your model is simply too low-resolution compared to the features in the heightmap, so whatever you do you'll get some aliasing. – Yakov Galka Nov 16 '16 at 10:26
  • Thanks, now that is something. But I need to clarify: I compute the normals in a compute shader and store them in a texture2D (negative values are allowed because GL_FLOAT is used). So I have one normal per few triangles, and I take it from that normal-map texture. What is your hint to make it better? I mean, you said to sample it in the fragment shader, but what exactly do you mean? – Kedriik Nov 16 '16 at 14:22
  • That's exactly what I mean. You do `color = texture(normals, uv)` in the fragment shader and make sure the vertex shader passes the correct `uv` coordinates (see the sketch after these comments). I can't explain more without basing it on your shader code, which you did not post. – Yakov Galka Nov 16 '16 at 14:26
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/128269/discussion-between-kedriik-and-ybungalobill). – Kedriik Nov 16 '16 at 14:33
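
For reference, the per-fragment sampling suggested in the comments above could look roughly like this; the sampler and variable names are assumptions, since the actual shaders were not posted:

    #version 430
    // Fragment shader sketch: sample the normal map per fragment instead of
    // interpolating one normal per vertex along the triangle edges.
    in vec2 uv;                    // texture coordinates forwarded by the vertex shader
    uniform sampler2D NormalTex;   // the NormalMap texture bound as a sampler2D
    uniform vec3 LightDir;         // assumed directional light, in the same space as the normals

    out vec4 FragColor;

    void main()
    {
        // texture() filters the stored normals bilinearly in texture space.
        vec3 n = normalize(texture(NormalTex, uv).xyz);
        float diffuse = max(dot(n, -LightDir), 0.0);
        FragColor = vec4(vec3(diffuse), 1.0);
    }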

OK guys, I found the problem. This is a symptom of "greedy triangulation". The normals inside a triangle are interpolated with barycentric coordinates, but along the edges they are interpolated linearly to prevent color differences between adjacent triangles. Thank you, again, Paul Bourke: http://paulbourke.net/texture_colour/interpolation/

If you don't have enough triangles, don't use Phong shading (maybe normal mapping?). After tweaks:

http://prntscr.com/dadrue

http://prntscr.com/dadtum

http://prntscr.com/dadugf

Kedriik