I have a 16-bit uint texture on the C++ side, and I would like to use it for z-testing in an OpenGL ES 3.0 app. How can I achieve this?
To give some context, I am making an AR app where virtual objects can be occluded by real objects. The depth texture of the real environment is generated, but I can't figure out how to apply it.
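For reference, this is roughly how I upload the depth data on the C++ side (width, height and depthData stand in for my actual variables):

```cpp
// 16-bit unsigned-integer texture; integer formats require GL_NEAREST filtering.
GLuint envDepthTex;
glGenTextures(1, &envDepthTex);
glBindTexture(GL_TEXTURE_2D, envDepthTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16UI, width, height, 0,
             GL_RED_INTEGER, GL_UNSIGNED_SHORT, depthData);  // const uint16_t*
```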
In my app, I first upload the camera frame with glTexImage2D and render it as a backdrop image, then I draw some virtual objects on top. I would like the objects to become transparent based on a depth texture. Ideally, the occlusion test should be gradual rather than binary, so that I can alpha-blend the objects with the background near the occlusion edges.
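Simplified, my frame currently looks like this (drawFullScreenQuad and drawVirtualObjects are placeholders for my own helpers):

```cpp
// Backdrop: upload the latest camera frame and draw it across the screen.
glDisable(GL_DEPTH_TEST);
glBindTexture(GL_TEXTURE_2D, cameraTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, camWidth, camHeight, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, cameraFrame);
drawFullScreenQuad();

// Virtual objects: this is where the occlusion against the real scene should happen.
glEnable(GL_DEPTH_TEST);
glEnable(GL_BLEND);  // needed for the gradual fade near occlusion edges
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
drawVirtualObjects();
```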
I can pass the depth texture to the fragment shader and read it there, but I am not sure how to use it for z-testing rather than just rendering it.
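This is the direction I have been experimenting with in the fragment shader. The uniforms are my own, the assumption that the texture stores millimeters is mine (hence uDepthScale), and the color is a placeholder for the real shading:

```glsl
#version 300 es
precision highp float;

uniform highp usampler2D uEnvDepth;  // 16-bit uint depth of the real scene
uniform vec2  uScreenSize;           // viewport size in pixels
uniform float uDepthScale;           // e.g. 0.001 if the texture stores millimeters
uniform float uNear;                 // projection near plane
uniform float uFar;                  // projection far plane
uniform float uFeather;              // width of the soft occlusion band, in meters

out vec4 fragColor;

void main() {
    // Real-world depth at this pixel, converted to eye-space meters.
    vec2 uv = gl_FragCoord.xy / uScreenSize;
    float envDepth = float(texture(uEnvDepth, uv).r) * uDepthScale;

    // Linearize this fragment's own depth (standard perspective projection).
    float ndcZ = gl_FragCoord.z * 2.0 - 1.0;
    float fragDepth = (2.0 * uNear * uFar) / (uFar + uNear - ndcZ * (uFar - uNear));

    // Fade out gradually as the fragment passes behind the real surface.
    float visibility = 1.0 - smoothstep(envDepth - uFeather, envDepth + uFeather, fragDepth);
    if (visibility <= 0.0) discard;

    vec4 color = vec4(1.0, 0.5, 0.2, 1.0);  // placeholder shading
    fragColor = vec4(color.rgb, color.a * visibility);
}
```

Is a manual comparison like this a reasonable way to emulate z-testing against an external depth texture, or is there a more direct mechanism I am missing?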