
I am trying to get the pixel RGBA data from a canvas for further processing. I think the canvas is actually a Unity game if that makes a difference.

I am trying to do this with the canvas of the game Shakes and Fidget. I use the readPixels method from the context.

This is what I tried:

var example = document.getElementById('#canvas');
var context = example.getContext('webgl2');      // Also doesn't work with: ', {preserveDrawingBuffer: true}'
var pixels = new Uint8Array(context.drawingBufferWidth * context.drawingBufferHeight * 4); 
context.readPixels(0, 0, context.drawingBufferWidth, context.drawingBufferHeight, context.RGBA, context.UNSIGNED_BYTE, pixels);

But all pixels come back black (which obviously isn't what's actually on screen).

Edit: Also, I want to read the pixels multiple times. Thanks everyone for your answers. The answer provided by @Kaiido worked perfectly for me :)

TPRammus
    A WebGL canvas is always possible to read since you can't draw anything into it that's not already CORS approved – gman Jan 05 '19 at 16:59

3 Answers


You can create a canvas context only once per type. All subsequent getContext() calls on that canvas return either null (if you request an incompatible context type) or the same context that was created the first time; any options passed on later calls are ignored.

Now, the page you linked to doesn't pass the preserveDrawingBuffer option when creating its context, which means that to grab the pixel data you have to hook into the same event loop iteration as the one where the game renders.
Luckily, this particular game uses a simple requestAnimationFrame loop, so to hook into the same frame, all we need to do is also wrap our code in a requestAnimationFrame call.

Since callbacks are queued in order, and since a rAF loop has to request the next frame from inside one such callback, we can be sure our call will get queued after theirs.

I now realize it might not be obvious, so I'll try to explain further what requestAnimationFrame does, and how we can be sure our callback will get called after Unity's.

requestAnimationFrame(fn) pushes the fn callback onto a queue of callbacks that will all get called at the same time, in First-In-First-Out order, just before the browser performs its paint-to-screen operations. This happens periodically (generally 60 times per second), at the end of the nearest event loop iteration.
It can be understood as a kind of setTimeout(fn, time_remaining_until_next_paint), with the main difference that requestAnimationFrame's callback executor is guaranteed to run at the end of the event loop iteration, and thus after the other js executed in that iteration.
So if we call requestAnimationFrame(fn) in the same event loop iteration as the one where the callbacks get executed, our hypothetical time_remaining_until_next_paint would be 0, and fn gets pushed to the end of the queue (last in, last out).
And when calling requestAnimationFrame(fn) from inside the callback executor itself, time_remaining_until_next_paint would be around 16ms, so fn will get called among the first ones at the next frame.

So any call to requestAnimationFrame(fn) made from outside the callback executor is guaranteed to run in the same frame as a requestAnimationFrame-powered loop, and to run after it.
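The FIFO ordering argument above can be sketched with a tiny scheduler that mimics requestAnimationFrame's callback queue (a simulation for illustration only, not the real browser API):

```javascript
// Minimal simulation of requestAnimationFrame's FIFO callback queue.
// Callbacks queued during a frame run in the *next* frame, in the
// order they were queued.
const queue = [];
function fakeRaf(fn) { queue.push(fn); }

function runFrame(time) {
  // Snapshot the queue: callbacks added while running go to the next frame.
  const current = queue.splice(0, queue.length);
  for (const fn of current) fn(time);
}

const order = [];

// A game loop, started first: it re-queues itself from inside its callback.
function gameLoop(t) {
  order.push('game');
  fakeRaf(gameLoop);
}
fakeRaf(gameLoop);

// Our one-shot reader, queued from *outside* any callback after the game
// loop was queued: it lands later in the same frame's queue.
fakeRaf(() => order.push('reader'));

runFrame(0);  // frame 1: game runs first, then reader
runFrame(16); // frame 2: only game (the reader was one-shot)

console.log(order); // → ['game', 'reader', 'game']
```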

So all we need to do to grab these pixels is wrap the call to readPixels in a requestAnimationFrame callback, and issue it after Unity's loop has started.

var example = document.getElementById('#canvas');
var context = example.getContext('webgl2') || example.getContext('webgl');
var pixels = new Uint8Array(context.drawingBufferWidth * context.drawingBufferHeight * 4);
requestAnimationFrame(() => {
  context.readPixels(0, 0, context.drawingBufferWidth, context.drawingBufferHeight, context.RGBA, context.UNSIGNED_BYTE, pixels);
  // here `pixels` has the correct data
});
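Once the callback has run, `pixels` holds the flat RGBA data; keep in mind that readPixels fills the buffer bottom row first. A small helper (hypothetical, just for illustration) to fetch a single pixel in screen coordinates:

```javascript
// Read the RGBA value of pixel (x, y) out of the flat buffer returned by
// readPixels. WebGL fills the buffer bottom row first, so flip y.
function getPixel(pixels, width, height, x, y) {
  const i = ((height - 1 - y) * width + x) * 4;
  return [pixels[i], pixels[i + 1], pixels[i + 2], pixels[i + 3]];
}

// Example on a tiny 2x2 buffer: the on-screen top-left pixel (0,0) lives
// in the *second* row of the flat array.
const buf = new Uint8Array([
  10, 11, 12, 13,   20, 21, 22, 23,   // bottom row (y = 1 on screen)
  30, 31, 32, 33,   40, 41, 42, 43,   // top row    (y = 0 on screen)
]);
console.log(getPixel(buf, 2, 2, 0, 0)); // → [30, 31, 32, 33]
```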
Kaiido

Likely you either need to read the pixels in the same event as they are rendered, or you need to force the canvas to use preserveDrawingBuffer: true so you can read the canvas at any time.

To do the second, override getContext:

HTMLCanvasElement.prototype.getContext = function(origFn) {
  const typesWeCareAbout = {
    "webgl": true,
    "webgl2": true,
    "experimental-webgl": true,
  };
  return function(type, attributes = {}) {
    if (typesWeCareAbout[type]) {
      attributes.preserveDrawingBuffer = true;
    }
    return origFn.call(this, type, attributes);
  };
}(HTMLCanvasElement.prototype.getContext);

Put that at the top of the file before the Unity game OR put it in a separate script file and include it before the Unity game.

You should now be able to get a context on whatever canvas Unity made and call gl.readPixels anytime you want.
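The wrapping trick above (capture the original method in a closure, patch the attributes, forward the call) can be seen in isolation on a stand-in object, since the real HTMLCanvasElement only exists in a browser; the names here are made up for the demo:

```javascript
// A stand-in for HTMLCanvasElement.prototype, so the override pattern
// can run outside a browser. The fake getContext just echoes its inputs.
const fakeCanvasProto = {
  getContext(type, attributes) {
    return { type, attributes }; // stand-in for a real context object
  },
};

// Same pattern as the real override: the original method is captured as
// origFn, and attributes are patched before the call is forwarded.
fakeCanvasProto.getContext = function(origFn) {
  const typesWeCareAbout = {
    "webgl": true,
    "webgl2": true,
  };
  return function(type, attributes = {}) {
    if (typesWeCareAbout[type]) {
      attributes.preserveDrawingBuffer = true;
    }
    return origFn.call(this, type, attributes);
  };
}(fakeCanvasProto.getContext);

console.log(fakeCanvasProto.getContext('webgl2').attributes.preserveDrawingBuffer); // → true
console.log(fakeCanvasProto.getContext('2d').attributes.preserveDrawingBuffer);     // → undefined
```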

For the other method, reading the pixels in the same event, you would instead wrap requestAnimationFrame so that you can insert your gl.readPixels call after Unity's requestAnimationFrame callback has run:

window.requestAnimationFrame = function(origFn) {
  return function(callback) {
    return origFn.call(this, function(time) {
      callback(time);      // let Unity render first
      gl.readPixels(...);  // then read the freshly drawn frame
    });
  };
}(window.requestAnimationFrame);

Another solution would be to use a virtual WebGL context. This library shows an example of implementing a virtual WebGL context, including an example of post-processing the Unity output.

Note that at some point Unity will likely switch to using an OffscreenCanvas. At that point the solutions above will likely stop working and other approaches will be needed.

gman
  • Why do you overwrite requestAnimationFrame? As I explained in my answer all the callbacks are executed in the same event loop, and you are assured that a call made out of an rAF powered loop will get stacked **after** the ones in an rAF loop. So no need to call readPixels in every frame (and even possibly multiple times per frame). Also, when Unity uses an OffscreenCanvas, there will probably be no need for any workaround since BitmapRenderer contexts do not throw away their rendering buffer. – Kaiido Jan 06 '19 at 00:42
  • How do you guarantee your raf callback happens after unity's? It could just as easily happen before Unity's in which case you'll get nothing. You don't know when unity will start using raf. As for calling `gl.readPixels` every frame the code above is just an example. I shouldn't have to spell out that you can conditionally call `gl.readPixels` only when you want to read. – gman Jan 06 '19 at 01:18
  • because rAF is FIFO, so if you call from inside a callback as you do with rAF loops, you'll be among the first to get called in the next batch of callbacks (only other rAF loops may come before you). So it is guaranteed that a call to rAF made outside of a rAF callback will run after the ones made inside. – Kaiido Jan 06 '19 at 03:26
  • Sounds like you're assuming they only want to read it once on demand. I'm assuming they might want to read it every frame. – gman Jan 06 '19 at 04:22
  • Indeed that was my assumption, reinforced by the 3 lines of code provided, but even if they wanted to read it every frame, as long as the loop is initiated after Unity's, you can be sure it will get executed after it for every frame with the default rAF. – Kaiido Jan 06 '19 at 04:37
  • There is no way to know your code will get executed after unity's since you have no idea when unity will start doing rafs nor will you know if for some reason it decides to stop doing rafs (during a load for example) – gman Jan 06 '19 at 05:22
  • And your code has no way to tell if it is indeed wrapping unity's callback or an other completely unrelated one. There might very well be multiple different loops running at the same time. – Kaiido Jan 06 '19 at 05:38
  • True, but I provided 3 solutions. And, even if that's the case you could just add code to support that in the wrapper. Wrap it once per frame, add callbacks to a list. The point is not to show a perfect solution that handles every possible case. It's to provide options. None of our solutions handles OffscreenCanvas. OffscreenCanvas is not required to use a BitmapRenderer. In fact it would more likely use `transferControlToOffscreen` since that's far less overhead than BitmapRenderer. Example: http://twgljs.org/examples/offscreencanvas.html – gman Jan 06 '19 at 05:53
  • Hmm sorry about the OffscreenCanvas, *readPixels* won't be available from the main thread however the OffscreenCanvas got created. But just for your info, when you do `canvas.transferControlToOffscreen()` `canvas` is now just the same as a canvas with a BitmapRenderer context, except that you don't need to transfer the bitmap yourself at every frame, but you can still call toDataURL or toBlob whenever you want, the rendering buffer is persistent. – Kaiido Jan 06 '19 at 06:23
  • After a week of googling and getting some stupid unintelligible semi-philosophical advice on how WebGL works, redefining `HTMLCanvasElement` prototype before everything else is executed proved to be the **only working solution.** Thank you. – ttaaoossuuuu Mar 18 '19 at 08:44

Alternatively, you can stream the content of the canvas to a video element, draw the video's content to another canvas and read the pixels there.

This should be independent of the frames painted by requestAnimationFrame, but it is asynchronous.

We need a video, another canvas and a stream:

var example = document.getElementById('#canvas');
var stream = example.captureStream(0); // 0 fps: frames are only pushed when we call requestFrame()

var vid = document.createElement("video");
vid.width = example.width;
vid.height = example.height;
vid.style = "display:none;";
document.body.appendChild(vid);

var canvas2 = document.createElement("canvas");
canvas2.width = example.width;
canvas2.height = example.height;
canvas2.style = "display:none;";
var width = example.width;
var height = example.height;
document.body.appendChild(canvas2);

var ctx = canvas2.getContext('2d');

Now you can read the game canvas by requesting a frame from the stream, pushing it into the video and painting the video onto our canvas:

vid.srcObject = stream;
vid.play();
stream.requestFrame();
//wait for the video to receive the frame (e.g. in a "timeupdate" handler)
ctx.drawImage(vid, 0, 0, width, height, 0, 0, width, height);
var pixels = ctx.getImageData(0, 0, width, height).data; // Uint8ClampedArray of RGBA values
  • While this code may solve the question, [including an explanation](https://meta.stackoverflow.com/questions/392712/explaining-entirely-code-based-answers) of how and why this solves the problem would really help to improve the quality of your post, and probably result in more up-votes. Remember that you are answering the question for readers in the future, not just the person asking now. Please edit your answer to add explanations and give an indication of what limitations and assumptions apply. – PerplexingParadox May 08 '21 at 14:38