
In short

I would like to read a single pixel value from a WebGL 2 depth texture in JavaScript. Is this at all possible?

The scenario

I am rendering a scene in WebGL 2. The renderer is given a depth texture to which it writes the depth buffer. This depth texture is used in post processing shaders and the like, so it is available to us.

However, I need to read my single pixel value in JavaScript, not from within a shader. If this had been a normal RGB texture, I would do

// Attach the texture to a temporary framebuffer and read back one RGBA pixel.
// `gl` is assumed to be the WebGL2RenderingContext in scope.
function readPixel(x, y, texture, outputBuffer) {
    const frameBuffer = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, frameBuffer);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);
    gl.readPixels(x, y, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, outputBuffer);
    gl.deleteFramebuffer(frameBuffer); // clean up the temporary framebuffer
}

This will write the pixel at x, y into outputBuffer.
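A call site might look like the following (someColorTexture is a placeholder name for any regular RGBA8 colour texture, not something from my actual code):

const pixel = new Uint8Array(4);              // room for one RGBA pixel
readPixel(120, 64, someColorTexture, pixel);  // hypothetical coordinates
console.log(pixel);                           // e.g. Uint8Array [255, 0, 0, 255]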

However, is it at all possible to do the same with a depth texture? If I just pass a depth texture to my function above, the output buffer contains only zeros, and I receive the WebGL warning GL_INVALID_FRAMEBUFFER_OPERATION: Framebuffer is incomplete. Checking the framebuffer state reveals FRAMEBUFFER_INCOMPLETE_ATTACHMENT.
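The incomplete state can be confirmed right after the attachment call with a check along these lines (sketch):

const status = gl.checkFramebufferStatus(gl.FRAMEBUFFER);
if (status !== gl.FRAMEBUFFER_COMPLETE) {
    // With the depth texture attached to COLOR_ATTACHMENT0, this reports
    // gl.FRAMEBUFFER_INCOMPLETE_ATTACHMENT.
    console.warn('Framebuffer incomplete, status 0x' + status.toString(16));
}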

Naturally, the depth texture is not an RGBA texture, but are there some other values we can give it to get our depth value, or is it impossible?

Motivation

I am aware that this question has been asked a number of times on Stack Overflow and elsewhere in some form or another, but there is always some variation that makes it hard for me to get a straight-up yes or no answer to the question in the form I ask it here. In addition, many questions and sources are quite old and concern WebGL 1 only, with some mentioning that the WEBGL_depth_texture extension makes a difference, etc.

If the answer is no, I'd welcome any suggestions for how else to easily obtain this depth pixel. As this operation is not done for every frame, I value simplicity over performance. The use case is picking, and classical ray intersection is not feasible. (I also know that I can encode a scalar depth value into and out of an RGB pixel, but I need to be able to access the pixel from within the js code in the first place.)

I'd welcome any insights.

Berthur

1 Answer


No, that is not possible. WebGL 2.0 is based on OpenGL ES 3.0.
The OpenGL ES 3.2 Specification - 4.3.2 Reading Pixels clearly specifies:

[...] The second is an implementation-chosen format from among those defined in table 3.2, excluding formats DEPTH_COMPONENT and DEPTH_STENCIL [...]
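In practice this means readPixels only supports a small set of fixed colour format/type combinations plus one implementation-chosen pair. That pair can be queried for the currently bound (complete, colour-renderable) framebuffer, and per the clause above it is never a depth format. A small sketch (variable names are mine):

const readFormat = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT);
const readType   = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE);
console.log(readFormat, readType);  // e.g. gl.RGBA, gl.UNSIGNED_BYTE - never DEPTH_COMPONENT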

Rabbid76
  • Thank you, this is already useful information. However, to a layman this in itself doesn't sound like it implies that you cannot read a depth texture. One might hope that you could e.g. somehow treat it as an RGB value, or convert, or something else. Do you know it to be completely impossible (without using a custom shader to read, pack and write it into an RGBA texture)? – Berthur May 26 '21 at 19:07
  • @Berthur It implies that you cannot read it. However, I don't want to copy a complete chapter of the specification into the answer. This is one of the differences between OpenGL ES and (desktop) OpenGL. – Rabbid76 May 26 '21 at 19:09
  • 1
    I'm wondering whether it would be possible to work around it: 1. as far as i understand, you can read the depth buffer as a texture (e.g. https://stackoverflow.com/questions/47651396/webgl2-fbo-depth-attachment-values). 2. that could be rendered to a normal texture using a screen quad, no? 3. in order to avoid copying the whole thing, the view frustum can be limited to a single pixel (like https://webgl2fundamentals.org/webgl/lessons/webgl-picking.html). obviously this is an additional round trip, but would that work? I'm quite new to webgl, and opengl was long time ago. – Adam Feb 05 '22 at 02:01
  • 1
    @Adam Yes, that sounds like a description of what I ended up doing :) I did the depth queries in a custom shader of an additional render pass, which was given the pixel xy coordinate, and rendered the result into a 1x1 render target. For WebGL1 support, I also found myself forced to encode the depth value into an RGBA colour. In this SO question, I was trying to find a definitive answer to whether it would have been possible without the additional render pass. – Berthur Feb 10 '22 at 10:41
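A minimal sketch of the workaround described in the comments above. All names (packVertSrc, packFragSrc, packProgram, readDepthPixel, depthTexture) are hypothetical, and the shader compile/link boilerplate is assumed to exist elsewhere; this illustrates the approach rather than either author's actual code:

// GLSL ES 3.00 shaders for a tiny "pack depth" pass.
// The vertex shader emits one clip-space triangle covering the target; the
// fragment shader fetches a single texel of the depth texture and packs it
// into 8-bit RGBA so it survives a readPixels round trip.
const packVertSrc = `#version 300 es
void main() {
    vec2 pos = vec2(float((gl_VertexID << 1) & 2), float(gl_VertexID & 2));
    gl_Position = vec4(pos * 2.0 - 1.0, 0.0, 1.0);
}`;

const packFragSrc = `#version 300 es
precision highp float;
uniform highp sampler2D uDepth;  // depth texture, TEXTURE_COMPARE_MODE = NONE
uniform ivec2 uPixel;            // which texel to read
out vec4 outColor;

vec4 packDepth(float d) {        // classic [0, 1) float -> RGBA8 encoding
    vec4 enc = fract(vec4(1.0, 255.0, 65025.0, 16581375.0) * d);
    enc -= enc.yzww * vec4(1.0 / 255.0, 1.0 / 255.0, 1.0 / 255.0, 0.0);
    return enc;
}

void main() {
    outColor = packDepth(texelFetch(uDepth, uPixel, 0).r);
}`;

// packProgram is assumed to be compiled and linked from the two sources above.
function readDepthPixel(gl, packProgram, depthTexture, x, y) {
    // 1x1 RGBA8 colour target: something readPixels is allowed to read from.
    const target = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, target);
    gl.texStorage2D(gl.TEXTURE_2D, 1, gl.RGBA8, 1, 1);

    const fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, target, 0);
    gl.viewport(0, 0, 1, 1);

    gl.useProgram(packProgram);
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, depthTexture);
    gl.uniform1i(gl.getUniformLocation(packProgram, 'uDepth'), 0);
    gl.uniform2i(gl.getUniformLocation(packProgram, 'uPixel'), x, y);
    gl.drawArrays(gl.TRIANGLES, 0, 3);

    // Read the packed bytes back and decode them to a float depth in [0, 1).
    const bytes = new Uint8Array(4);
    gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, bytes);
    gl.deleteFramebuffer(fbo);
    gl.deleteTexture(target);
    return bytes[0] / 255 + bytes[1] / 255 ** 2 + bytes[2] / 255 ** 3 + bytes[3] / 255 ** 4;
}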