My situation is this: a software renderer produces a 2D image, used as a 2D texture, that depicts a "3D" scene. OpenGL is then used for essentially nothing more than displaying this 2D texture. As a result, despite rendering what appears to be a 3D visual, no shader I write can render the depth buffer, because there is really nothing in it. I would like access to the depth buffer to enable such shaders.
So, I would like to somehow populate the depth buffer based on my image. I think this should be doable, as the software renderer in question can produce a "depth map" image alongside its "regular" image as a mode of rendering; the depth map image looks exactly like a rendering of the depth buffer (greyscale, with objects closer to the camera darker). So I suppose my question is: is it possible to translate this "pre-rendered" depth image into the depth buffer, and how would I go about doing it?
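Conveniently, the convention described (black = close) matches OpenGL's default depth range, where 0.0 is the near plane and 1.0 the far plane, so a simple remap may be all that is needed. A minimal sketch, assuming an 8-bit greyscale depth map; the function name and the linear mapping are my assumptions, not something the renderer guarantees:

```c
/* Hypothetical helper: map an 8-bit greyscale depth-map sample to a
 * [0, 1] depth value. Black (0) maps to the near plane, white (255)
 * to the far plane, matching OpenGL's default depth range. This
 * assumes the renderer's greyscale is linear in window-space depth,
 * which may not hold in practice. */
float grey_to_depth(unsigned char grey)
{
    return (float)grey / 255.0f;
}
```

If the renderer's depth map uses some other encoding (inverted, or linear in eye space), this mapping would need adjusting before the values are meaningful next to GL-rendered geometry.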
Edit: If this is helpful, I am specifically working with OpenGL 3.3.
Edit 2: Continuing to research what I might be able to do here, I have found this discussion, which suggests I "either use framebuffer objects or a fragment shader which writes to gl_FragDepth." The discussion quickly becomes a bit much for me to digest. I think I understand the concept of a fragment shader that writes to gl_FragDepth, but how does this actually work in practice?
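In practice the shader itself can be quite small. A minimal sketch of such a fragment shader (GLSL 3.30, assuming a `uv` varying from the vertex shader and two samplers supplied by the application; the names `color_tex` and `depth_tex` are my own):

```glsl
#version 330 core

in vec2 uv;                   // passed from the vertex shader
uniform sampler2D color_tex;  // the regular rendered image
uniform sampler2D depth_tex;  // the greyscale depth-map image

out vec4 frag_color;

void main(void)
{
    frag_color = texture(color_tex, uv);
    // Write the depth-map sample into the depth buffer. For a
    // greyscale texture any channel works; .r is conventional.
    gl_FragDepth = texture(depth_tex, uv).r;
}
```

The shader only runs for fragments that are actually rasterized, so it must be paired with a draw call covering the region whose depth should be written.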
I am thinking of something like the following pseudocode:
program = createProgram(); //write to gl_FragDepth in the frag shader
glUseProgram(program);
glColorMask(GL_FALSE,GL_FALSE,GL_FALSE,GL_FALSE);
glEnable(GL_DEPTH_TEST);
glGenTextures(1, &depth_texture);
glBindTexture(GL_TEXTURE_2D, depth_texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, depth->width, depth->height, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, depth->pixels);
glDisable(GL_DEPTH_TEST);
glBindTexture(GL_TEXTURE_2D, 0);
Do I need to enable depth testing?
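Roughly, yes, though one step is missing: uploading the texture alone writes nothing to the depth buffer, because the fragment shader only runs when something is drawn. A draw call covering the screen is needed, commonly a single full-screen triangle. A sketch of a matching GLSL 3.30 vertex shader using gl_VertexID so no vertex buffer is required (the `uv` output name is my assumption):

```glsl
#version 330 core

out vec2 uv;

void main(void)
{
    // Generate one triangle large enough to cover the whole viewport
    // from gl_VertexID alone; draw with glDrawArrays(GL_TRIANGLES, 0, 3).
    vec2 pos = vec2((gl_VertexID << 1) & 2, gl_VertexID & 2);
    uv = pos;
    gl_Position = vec4(pos * 2.0 - 1.0, 0.0, 1.0);
}
```

As for the depth test: the depth buffer is only updated while GL_DEPTH_TEST is enabled, so yes; glDepthFunc(GL_ALWAYS) can be set if the write should be unconditional.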
Edit 3:
If I understand correctly, after doing some more reading, I think I need to do something like the following; however, I can't quite get it to work. Does something here look glaringly incorrect? What I find happening is that in the frag shader the sampler2Ds tex0 and tex1 somehow contain the same values; as a result, I can write either the color values to gl_FragDepth or the depth values to color, which creates interesting but unhelpful results.
Summarized Frag Shader:
in vec2 uv;
out vec4 color;
uniform sampler2D tex0; // color values
uniform sampler2D tex1; // depth values
void main(void) {
color = texture(tex0, uv);
gl_FragDepth = texture(tex1, uv).r; // any channel works for greyscale
}
Summarized OpenGL:
// declarations
static GLuint vao;
static GLuint texture;       // names assigned by glGenTextures below
static GLuint depth_texture;
// set up shaders
program = createProgram();
glUseProgram(program); //verified that this is working
// enable depth testing
glEnable(GL_DEPTH_TEST);
// prepare dummy VAO
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);
// prepare texture for color values
glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
// prepare texture for depth values
glActiveTexture(GL_TEXTURE1);
glGenTextures(1, &depth_texture);
glBindTexture(GL_TEXTURE_2D, depth_texture);
// disable depth mask while working with color values
glDepthMask(GL_FALSE);
// select GL_TEXTURE0 and bind the color values
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texture);
// specify texture image for color values
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tex_width, tex_height, 0, TEX_FORMAT, TEX_TYPE, fb->pixels);
// enable depth mask while working with depth values
glDepthMask(GL_TRUE);
// select GL_TEXTURE1 and bind the depth values
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, depth_texture);
// specify texture image for depth values
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tex_width, tex_height, 0, TEX_FORMAT, TEX_TYPE, fb->depth);
// draw
glViewport(win_x, win_y, win_width, win_height);
glDrawArrays(GL_TRIANGLES, 0, 3);
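One thing the summary above does not show, and a classic cause of two samplers reading identical values: sampler uniforms default to texture unit 0, so both tex0 and tex1 sample whatever is bound there unless each is pointed at its unit explicitly. Separately, in the core profile a texture with no mipmaps is incomplete (and samples as black) under the default minification filter until non-mipmap filters are set. A sketch of the calls I would expect somewhere in the setup; whether they are actually missing from the full code is an assumption on my part:

```c
/* Point each sampler uniform at its texture unit; both default to 0,
 * which would make tex0 and tex1 read the same texture. */
glUniform1i(glGetUniformLocation(program, "tex0"), 0);
glUniform1i(glGetUniformLocation(program, "tex1"), 1);

/* Without mipmaps, the default GL_NEAREST_MIPMAP_LINEAR minification
 * filter leaves a texture incomplete. Set non-mipmap filters on each
 * texture while it is bound. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
```

The glUniform1i calls must happen while the program is in use (or via glGetUniformLocation after linking), and the glTexParameteri pair applies to whichever texture is currently bound, so it belongs after each glBindTexture.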