
I could not find a way to generate a normal/bump map in PyOpenGL. Specifically, I want a map that I can read back with glReadPixels(), i.e. an array of shape (width, height, 3) where each pixel holds a normal vector.

How can I get such a map, and which vertex, fragment, or geometry shaders are needed for this?

I provide the following inputs to the Vertex shader:

layout (location = 0) in vec3 vertexPosition_modelspace;
layout (location = 1) in vec3 normal;

uniform mat4 ModelMatrix;
uniform mat4 ViewMatrix;
uniform mat4 ProjectionMatrix;

I would like to get a smooth normal map where every pixel corresponds to a normalized normal vector. What I need to do is render this normal map from a scene. I use marching cubes to get the triangles, normals, and vertices, and I added a normal buffer. I do not use any textures.
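One possible approach (a sketch, not a confirmed solution) is to pass the normal through the vertex shader and write it out as the fragment color, mapping the [-1, 1] component range to [0, 1] so it fits an unsigned RGB render target. The vertex shader below reuses the inputs and uniforms from the question; vNormal and fragColor are names introduced here for illustration.

```glsl
// Vertex shader (sketch): transform the position, pass the normal through.
#version 330 core
layout (location = 0) in vec3 vertexPosition_modelspace;
layout (location = 1) in vec3 normal;

uniform mat4 ModelMatrix;
uniform mat4 ViewMatrix;
uniform mat4 ProjectionMatrix;

out vec3 vNormal;

void main()
{
    gl_Position = ProjectionMatrix * ViewMatrix * ModelMatrix
                * vec4(vertexPosition_modelspace, 1.0);
    // World-space normal; assumes ModelMatrix has no non-uniform scale.
    // Otherwise use transpose(inverse(mat3(ModelMatrix))).
    vNormal = mat3(ModelMatrix) * normal;
}
```

```glsl
// Fragment shader (sketch): encode the normal as a color.
#version 330 core
in vec3 vNormal;
out vec3 fragColor;

void main()
{
    // Renormalize after interpolation, then map [-1, 1] to [0, 1]
    // so the normal fits an unsigned RGB framebuffer.
    fragColor = normalize(vNormal) * 0.5 + 0.5;
}
```

No geometry shader is required for per-pixel interpolated normals; rasterization interpolates vNormal across each triangle. A floating-point color attachment (e.g. GL_RGB32F on a framebuffer object) would avoid the 8-bit encoding step entirely.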

NormalBuffer = glGenBuffers(1)
# Normals are per-vertex attribute data, so they belong in
# GL_ARRAY_BUFFER, not GL_ELEMENT_ARRAY_BUFFER (which is for indices).
glBindBuffer(GL_ARRAY_BUFFER, NormalBuffer)
glBufferData(GL_ARRAY_BUFFER, normal_arrays[i], GL_STATIC_DRAW)
normal_buffers[i] = NormalBuffer

I use this to bind the buffer object and draw the triangles.

glBindBuffer(GL_ARRAY_BUFFER, normal_buffers[i])  # attribute data, not indices
glEnableVertexAttribArray(1)                      # enable the attribute before drawing
glVertexAttribPointer(1,         # layout (location = 1) in the vertex shader
                      3,         # three components per normal
                      GL_FLOAT,  # 32-bit floats
                      GL_FALSE,  # no normalization needed
                      0,         # tightly packed
                      None)      # offset 0 into the bound buffer

glDrawElements(GL_TRIANGLES,
               index[1]*index[0],
               GL_UNSIGNED_INT,
               None)

Is it possible to render the normal map with a shader, and if so, how? Can I then read it back like this?

glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE)
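Assuming the scene was rendered with a shader that encodes each normal as 8-bit RGB via n * 0.5 + 0.5 (an assumption, not something shown in the question), glReadPixels returns raw bytes that still need decoding. A minimal NumPy sketch of turning that readback into a (height, width, 3) array of unit normals (decode_normal_map is a hypothetical helper name):

```python
import numpy as np

def decode_normal_map(raw_bytes, width, height):
    """Decode raw GL_RGB/GL_UNSIGNED_BYTE pixels from glReadPixels
    into a (height, width, 3) float array of normals in [-1, 1]."""
    img = np.frombuffer(raw_bytes, dtype=np.uint8).reshape(height, width, 3)
    img = np.flipud(img)  # OpenGL rows start at the bottom-left corner
    normals = img.astype(np.float32) / 255.0 * 2.0 - 1.0
    # Renormalize to undo the 8-bit quantization error.
    lengths = np.linalg.norm(normals, axis=2, keepdims=True)
    return normals / np.maximum(lengths, 1e-8)

# Simulated readback: a 2x2 image where every pixel encodes roughly (0, 0, 1).
raw = bytes([128, 128, 255] * 4)
nmap = decode_normal_map(raw, 2, 2)
print(nmap.shape)  # (2, 2, 3)
```

Reading with GL_FLOAT instead of GL_UNSIGNED_BYTE from a floating-point framebuffer attachment would skip the quantization and decoding step, at the cost of setting up an FBO with a float color texture.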
  • Yes. This is what I want to do. To render a scene and store the normal vectors which correspond to the fragments of the framebuffer. I have already rendered a depth/height map of a scene. – Ferchar Sep 25 '19 at 18:58
  • @Ferchar: If that's what you want, then you should add that information to the question. You should also add in the various vertex and fragment shaders. Also, what's a "normal shader"? – Nicol Bolas Sep 25 '19 at 20:20
  • You need to add texture coordinates too, so you can store your interpolated normal at the correct position of the render-to-texture target. Also, this would only work for meshes with a single texture per whole surface/layer. Basically, you create a TBN matrix for each face from the passed vertex data in the vertex (or geometry or tessellation) shader, then in the fragment shader convert the interpolated normal into the local TBN space and store the result in the texture. – Spektre Sep 26 '19 at 07:22

0 Answers