I'm working on some 2D rendering with OpenGL (moderngl in Python).
Is there a way to render a vertex array object but have the data passed through multiple shader programs? For example, a shader that applies a normal map to the texture, and then a shader that blurs the image. Combining the two shaders into one would be messy and hard to maintain, let alone combining more than two. (I've put a rough sketch of what I was imagining at the end of the post.) This is my current code for creating the vertex array object:
# TEX_COORDS = [0, 1, 1, 1,
#               0, 0, 1, 0]
# TEX_INDICES = [0, 1, 2,
#                1, 2, 3]
# self.vertices looks something like this: [-1, -1, 1, -1, -1, 1, 1, 1], but with different coordinates
self.vbo = self.ctx.buffer(struct.pack("8f", *self.vertices))
self.uv_map = self.ctx.buffer(struct.pack("8f", *TEX_COORDS))
self.ibo = self.ctx.buffer(struct.pack("6I", *TEX_INDICES))
self.vao_content = [(self.vbo, "2f", "vertexPos"), (self.uv_map, "2f", "vertexTexCoord")]
self.vao = self.ctx.vertex_array(self.program, self.vao_content, self.ibo) # self.program is the shader program object
And I'm doing texture.use() (texture being a moderngl texture object) and then self.vao.render() to render it onto the screen.
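For reference, this is roughly what I was imagining, based on what I've read about rendering the first pass into an offscreen framebuffer and sampling its output in a second pass. The names normal_map_program, blur_program, and screen_size are just placeholders for things I'd have to create, and I don't know if this is the intended way to do it in moderngl, so treat it as an untested sketch:

# rough sketch, not tested -- one VAO per shader program, sharing the same buffers
self.normal_vao = self.ctx.vertex_array(self.normal_map_program, self.vao_content, self.ibo)
self.blur_vao = self.ctx.vertex_array(self.blur_program, self.vao_content, self.ibo)

# offscreen color texture + framebuffer for the first pass to render into
self.pass_tex = self.ctx.texture(self.screen_size, 4)
self.pass_fbo = self.ctx.framebuffer(color_attachments=[self.pass_tex])

# pass 1: normal-map shader draws into the offscreen framebuffer
self.pass_fbo.use()
self.pass_fbo.clear()
texture.use(location=0)
self.normal_vao.render()

# pass 2: blur shader samples the first pass's output and draws to the screen
self.ctx.screen.use()
self.pass_tex.use(location=0)
self.blur_vao.render()

I assume the second pass would really want its own fullscreen-quad VAO rather than reusing self.vao_content, but hopefully this shows what I mean by chaining shaders. Is something like this the right approach, or is there a cleaner way?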