I have a canvas in my browser that displays a feed from my webcam. What I want to do is send the canvas data to my Node.js server, manipulate it, and send it back.
I can do this by sending the canvas data via socket.io like so:
socket.emit('canvas_data', canvas.toDataURL());
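For context, the browser side looks roughly like this (the video element, the 640x480 size, and the frame interval are simplified stand-ins for my actual setup):

// Browser side: draw the current webcam frame onto the canvas and emit it.
const video = document.querySelector('video');   // fed by getUserMedia
const canvas = document.querySelector('canvas'); // 640x480
const ctx = canvas.getContext('2d');
const socket = io();

setInterval(() => {
  ctx.drawImage(video, 0, 0, 640, 480);
  socket.emit('canvas_data', canvas.toDataURL()); // base64-encoded PNG data URL
}, 100); // ~10 frames per second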
Then I rebuild it on the Node.js server:
const { createCanvas, Image } = require('canvas'); // node-canvas

let img = new Image();
img.src = data; // `data` is the canvas_data data URL from the first step

const canvas = createCanvas(640, 480);
const ctx = canvas.getContext('2d');
ctx.drawImage(img, 0, 0, 640, 480);
However, this seems really wasteful: I'm taking an already rendered canvas, converting it to base64, sending it, and then rebuilding it on the other side.
The whole point of this is to use tfjs on the server side:
let converted = tfjs.browser.fromPixels(canvas);
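Put together, the working path looks roughly like this on the server (a sketch of my current setup: `io` is the socket.io server instance, and the actual model code is omitted):

const tfjs = require('@tensorflow/tfjs-node');
const { createCanvas, Image } = require('canvas');

io.on('connection', (socket) => {
  socket.on('canvas_data', (data) => {
    // Rebuild the frame from the data URL...
    const img = new Image();
    img.src = data;
    const canvas = createCanvas(640, 480);
    const ctx = canvas.getContext('2d');
    ctx.drawImage(img, 0, 0, 640, 480);

    // ...and hand it to tfjs.
    const converted = tfjs.browser.fromPixels(canvas);
    // ... run the model on `converted`, emit the result back, then clean up ...
    converted.dispose();
  });
});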
If I just send the canvas from the first step:
socket.emit('canvas_data', canvas);
And then run tfjs:
let converted = tfjs.browser.fromPixels(data);
I get the following error:
Error: pixels passed to tf.browser.fromPixels() must be either an HTMLVideoElement, HTMLImageElement, HTMLCanvasElement, ImageData in browser, or OffscreenCanvas, ImageData in webworker or {data: Uint32Array, width: number, height: number}, but was object
Is there a more efficient way to accomplish this?