I would like to store frames from a webcam and then display these delayed frames as a texture on a THREE.PlaneGeometry mesh.
I already have the current webcam frame encapsulated in a THREE.Texture subclass and can apply it to a material. Now I would like to buffer these frames and play them back with a delay:
// Pseudocode within the tick() function.
// Shift the buffer: drop the oldest frame, append the newest one.
for (let i = 0; i < bufferedFrames.length - 1; i++) {
  bufferedFrames[i] = bufferedFrames[i + 1];
}
bufferedFrames[bufferedFrames.length - 1] = currentWebCamTexture;

// Now display the delayed texture: use the oldest buffered frame.
plane.material.map = bufferedFrames[0];
plane.material.map.needsUpdate = true;

renderer.render(scene, camera);
Unfortunately, it doesn't seem possible to store old frames this way. The documentation for THREE.Texture's clone() method even states that a clone is not a "deep copy" of the image, so the buffered entries all end up referencing the same live frame rather than forming an array of unique frames.
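As a quick sanity check (my own snippet, using the currentWebCamTexture from the pseudocode above; the name snapshot is just for illustration), a clone really does share the same underlying image:

// clone() copies the texture's parameters but keeps a reference to the
// same image/video source, so every buffered clone shows the live frame.
const snapshot = currentWebCamTexture.clone();
console.log(snapshot.image === currentWebCamTexture.image); // true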
Currently I can solve this with WebGLRenderTargets: I keep an array of render targets and, each frame, render a scene that contains only my centered webcam quad into the next target (roughly as sketched below). But re-rendering a whole scene just to capture a texture seems wasteful. Is this the right way of doing it? Is there a cleaner way?
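For reference, here is a minimal sketch of that render-target approach, assuming webcamTexture is the live webcam texture and that plane, renderer, scene and camera already exist; DELAY_FRAMES and the 640x480 target size are placeholder values:

import * as THREE from 'three';

// A ring buffer of render targets, one slot per delayed frame.
const DELAY_FRAMES = 30;
const targets = [];
for (let i = 0; i < DELAY_FRAMES; i++) {
  targets.push(new THREE.WebGLRenderTarget(640, 480));
}

// A tiny "copy" scene: a full-view quad textured with the live webcam frame.
const copyScene = new THREE.Scene();
const copyCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
copyScene.add(new THREE.Mesh(
  new THREE.PlaneGeometry(2, 2),
  new THREE.MeshBasicMaterial({ map: webcamTexture })
));

let writeIndex = 0;

function tick() {
  // Bake the current webcam frame into the next slot of the ring buffer.
  renderer.setRenderTarget(targets[writeIndex]);
  renderer.render(copyScene, copyCamera);
  renderer.setRenderTarget(null);

  // The slot just after the one we wrote holds the oldest frame.
  const readIndex = (writeIndex + 1) % DELAY_FRAMES;
  plane.material.map = targets[readIndex].texture;
  writeIndex = readIndex;

  renderer.render(scene, camera);
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);

Each tick this bakes the webcam image into one slot of the ring buffer and displays the oldest slot, which produces the delay but costs an extra render pass every frame, which is exactly the overhead I would like to avoid.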