Given a WebGL scene (created with THREE.js), how would you go about accessing the floating point values of the DEPTH_ATTACHMENT (as an array of data outside of the WebGL context), given that the framebuffer has been bound to a texture using framebufferTexture2D?
I've gathered one solution so far: render the scene to a texture target using a custom shader override that reads the depth texture and encodes it into RGB. The code is very similar to the THREE.js example found here: Depth-Texture-Example.
#include <packing>

varying vec2 vUv;
uniform sampler2D tDiffuse;
uniform sampler2D tDepth;
uniform float cameraNear;
uniform float cameraFar;

float readDepth(sampler2D depthSampler, vec2 coord) {
    float fragCoordZ = texture2D(depthSampler, coord).x;
    float viewZ = perspectiveDepthToViewZ(fragCoordZ, cameraNear, cameraFar);
    return viewZToOrthographicDepth(viewZ, cameraNear, cameraFar);
}

void main() {
    vec3 diffuse = texture2D(tDiffuse, vUv).rgb;
    float depth = readDepth(tDepth, vUv);
    gl_FragColor.rgb = vec3(depth);
    gl_FragColor.a = 1.0;
}
Once this has rendered, I can then use gl.readPixels to read the specific pixels into an array. However, this option has very low precision, restricted to 256 discrete values: vec3(depth) just replicates the same float across the red, green and blue channels, so an 8-bit RGB output still only encodes 8 bits of depth. Is there a way to get higher precision out of this specific method, or an alternative?
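One refinement I'm weighing: instead of repeating the depth value in RGB, spread it across all four 8-bit channels, which gives roughly 32 bits of precision from the same byte-sized target. The #include <packing> chunk already provides packDepthToRGBA for this; a sketch of the modified main(), assuming the same uniforms and readDepth as above:

```glsl
void main() {
    float depth = readDepth(tDepth, vUv);
    // Pack the [0, 1] depth across 4 x 8-bit channels (~32 bits of precision)
    gl_FragColor = packDepthToRGBA(depth);
}
```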
Ultimately, what I want is access to the depth buffer as an array of floating point values outside of the WebGL context, in an efficient manner. I have a custom rasterizer that can create a rather good depth buffer, but I don't want to waste any time redoing steps that are already done.
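If the packed-RGBA route works, the bytes returned by gl.readPixels would still need decoding on the CPU. The helpers below are my own sketch (names are hypothetical): the first mirrors the constants of unpackRGBAToDepth in THREE's packing.glsl, and the second inverts viewZToOrthographicDepth to recover view-space Z from the normalized depth:

```javascript
// Decode a Uint8Array filled by gl.readPixels(x, y, w, h, gl.RGBA, gl.UNSIGNED_BYTE, out)
// where each pixel was written by packDepthToRGBA() in the fragment shader.
function decodeDepthBuffer(pixels) {
  const depths = new Float32Array(pixels.length / 4);
  for (let i = 0; i < depths.length; i++) {
    // Mirrors unpackRGBAToDepth: dot(rgba / 255, (255/256) / vec4(256^3, 256^2, 256, 1))
    depths[i] =
      pixels[4 * i]     / 4294967296 + // r: least significant byte
      pixels[4 * i + 1] / 16777216 +   // g
      pixels[4 * i + 2] / 65536 +      // b
      pixels[4 * i + 3] / 256;         // a: most significant byte
  }
  return depths;
}

// Convert the normalized [0, 1] orthographic depth back to view-space Z
// (negative, since the camera looks down -Z); inverse of viewZToOrthographicDepth.
function orthographicDepthToViewZ(depth, near, far) {
  return depth * (near - far) - near;
}
```

This keeps the readback a single gl.readPixels call plus one linear pass over the bytes, which should be cheap compared with re-rasterizing.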