
I'm doing some stuff with 2D opengl rendering.

Is there a way to render a vertex array object but pass the data through multiple shaders? For example, one shader that applies a normal map to the texture, then another shader that blurs the image. Combining the two shaders into one would be difficult and unclean, let alone combining more than two. This is my current code for creating the vertex array object:

# TEX_COORDS = [0, 1,  1, 1,
#               0, 0,  1, 0]
# TEX_INDICES = [0, 1, 2,
#                1, 2, 3]
# self.vertices looks something like this: [-1, -1,  1, -1, -1,  1,  1,  1], but with different coordinates
self.vbo = self.ctx.buffer(struct.pack("8f", *self.vertices))
self.uv_map = self.ctx.buffer(struct.pack("8f", *TEX_COORDS))
self.ibo = self.ctx.buffer(struct.pack("6I", *TEX_INDICES))
self.vao_content = [(self.vbo, "2f", "vertexPos"), (self.uv_map, "2f", "vertexTexCoord")]
self.vao = self.ctx.vertex_array(self.program, self.vao_content, self.ibo) # self.program is the shader program object

I'm calling texture.use() (texture being a moderngl texture object) and then self.vao.render() to render it onto the screen.

genpfault
DaNubCoding

1 Answer


A single rendering call will only ever use a single set of vertex, fragment, and other shaders. You cannot chain together shaders for a particular stage via the API; you must manufacture a single such shader that does those multiple things.

How you go about that process is up to you. You can have one shader that has all possible operations, with a bunch of uniform variables that define which operations will be applied. Or you can dynamically build shaders to fit particular needs.
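The "dynamically build shaders" option can be sketched in Python: assemble the fragment-shader source from optional snippets so a single program covers whichever operations a given draw needs. This is a minimal sketch; the uniform names (`normalMap`, `lightDir`) are illustrative, not taken from the question.

```python
def build_fragment_shader(use_normal_map=False):
    """Assemble GLSL fragment-shader source from optional snippets."""
    header = "#version 330\n"
    header += "uniform sampler2D tex;\n"
    header += "in vec2 uv;\nout vec4 fragColor;\n"
    body = "    vec4 color = texture(tex, uv);\n"
    if use_normal_map:
        # Hypothetical normal-mapping snippet: modulate the color by a
        # simple diffuse term computed from the normal map
        header += "uniform sampler2D normalMap;\nuniform vec3 lightDir;\n"
        body += (
            "    vec3 n = normalize(texture(normalMap, uv).rgb * 2.0 - 1.0);\n"
            "    color.rgb *= max(dot(n, normalize(lightDir)), 0.0);\n"
        )
    return header + "void main() {\n" + body + "    fragColor = color;\n}\n"
```

The returned string would then be passed as the `fragment_shader` argument of `ctx.program(...)`; building the source once per needed combination avoids both a monolithic do-everything shader and per-pixel uniform branching.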

Nicol Bolas
  • "A shader that applies a normal map to the texture, and then a shader that blurs the image". I can't imagine how a situation like this could be achieved in a single shader without sacrificing performance? – DaNubCoding Feb 02 '23 at 18:53
  • @AndrewDaNub: That *particular* circumstance fundamentally *requires* multiple rendering calls. Blurring would require reading from multiple, neighboring pixels, which you simply cannot do in a single shader. – Nicol Bolas Feb 02 '23 at 18:54
  • That particular circumstance is what I need right now. I already have a blur shader that works by averaging the colors of neighboring pixels. It would be ideal if I could apply a normal map to the texture via a fragment shader prior to the blur shader, but there doesn't seem to be a way to do that. – DaNubCoding Feb 02 '23 at 18:58
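The multi-pass approach discussed in the comments can be sketched in moderngl: render the normal-map pass into an offscreen framebuffer, then hand that framebuffer's texture to the blur pass. This is a minimal sketch, assuming `ctx` is a moderngl context and that `normal_vao` and `blur_vao` are vertex arrays already bound to the two shader programs (those names are hypothetical).

```python
def render_two_pass(ctx, size, source_tex, normal_vao, blur_vao):
    """Pass 1 draws offscreen with the normal-map shader; pass 2 blurs it."""
    # Intermediate color target that will hold the normal-mapped result
    intermediate = ctx.texture(size, 4)
    fbo = ctx.framebuffer(color_attachments=[intermediate])

    # Pass 1: the normal-map shader renders the quad into the
    # offscreen framebuffer instead of the screen
    fbo.use()
    ctx.clear()
    source_tex.use(location=0)
    normal_vao.render()

    # Pass 2: the blur shader samples the intermediate texture and
    # draws to the default framebuffer (the window)
    ctx.screen.use()
    intermediate.use(location=0)
    blur_vao.render()
```

In a real application the intermediate texture and framebuffer would be created once and reused each frame rather than allocated per call; chaining more than two shaders follows the same pattern, ping-ponging between two offscreen framebuffers.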