
I want to overlay the content drawn on the Canvas(input) on top of the video stream and record the composited result as a movie.

Is there a way for the WebCodecs API to do this?

My current approach is to use createImageBitmap() to capture one picture each from the Video(input) and the Canvas(input) and pass them to a Web Worker.

In the Web Worker, the video and canvas bitmaps are drawn onto another canvas, Canvas(output). After drawing, the next pair of pictures is passed in, and this cycle repeats.

Then I call captureStream() on the Canvas(output).

Finally, I encode this Canvas(output) stream and write it to a FileWritable.

But I found that this approach very easily causes congestion in my program, even though I pass the ImageBitmap generated by createImageBitmap() to the Web Worker as a transferable object.
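
For reference, a minimal sketch of the pipeline described above; the worker file name compositor.js, the output size, and the pushFrame helper are placeholders:

```js
// Main thread: capture one bitmap each from the video and the canvas,
// then hand them to the worker as transferable objects (zero-copy).
const worker = new Worker('compositor.js'); // hypothetical worker file

async function pushFrame(video, canvas) {
  const [videoBitmap, canvasBitmap] = await Promise.all([
    createImageBitmap(video),
    createImageBitmap(canvas),
  ]);
  worker.postMessage({ videoBitmap, canvasBitmap }, [videoBitmap, canvasBitmap]);
}

// compositor.js (worker): draw both bitmaps onto the output canvas.
const out = new OffscreenCanvas(1280, 720); // assumed output size
const ctx = out.getContext('2d');

self.onmessage = ({ data: { videoBitmap, canvasBitmap } }) => {
  ctx.drawImage(videoBitmap, 0, 0, out.width, out.height);  // video first
  ctx.drawImage(canvasBitmap, 0, 0, out.width, out.height); // canvas overlay
  videoBitmap.close();  // free the bitmaps promptly
  canvasBitmap.close();
};
```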

I wonder if there is a better way to overlay canvas content on top of video and output a movie.

Sorry, English is not my first language.

  • _"Is there a way for the WebCodecs API to do this?"_ Yes it can. Remember though, WebCodecs is not a recorder. It will **encode** your pixels (ImageData or VideoFrame) into the chosen codec like JPEG if encoding image, or H.264 (or VP9) frame if encoding video. You will need to also "contain" the frames yourself. This means putting H.264 into an MP4 shell that you create. This means learning about fragmented-MP4 structure (check bytes in a Hex editor as you research their meaning). – VC.One Mar 16 '23 at 01:45

1 Answer


Yes, WebCodecs can output a new video from your canvas animation. The basic steps are as follows:

  1. Create and configure a video encoder instance. Check the documentation to learn how (https://developer.mozilla.org/en-US/docs/Web/API/VideoEncoder).
  2. Create and configure a muxer. There are many different muxers available as libraries. I am using this one to make WebM files because it is simple to use: https://github.com/Vanilagy/webm-muxer.
  3. Create the VideoFrames using the canvas as the argument, just adding the respective video timestamp: new VideoFrame(canvas, { timestamp });
  4. Feed each VideoFrame to the encoder, which will output an EncodedVideoChunk.
  5. Feed the EncodedVideoChunk to the muxer (a combined sketch of these steps follows this list).
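
Putting the steps together, a minimal sketch based on webm-muxer's Muxer and ArrayBufferTarget API; the codec strings, dimensions, frame rate, and the canvas, drawNextFrame, and totalFrames names are assumptions or placeholders:

```js
import { Muxer, ArrayBufferTarget } from 'webm-muxer';

// Step 2: the muxer wraps encoded chunks into a .webm container in memory.
const muxer = new Muxer({
  target: new ArrayBufferTarget(),
  video: { codec: 'V_VP9', width: 1280, height: 720 },
});

// Step 1: the encoder; its output goes straight to the muxer (steps 4 and 5).
const encoder = new VideoEncoder({
  output: (chunk, meta) => muxer.addVideoChunk(chunk, meta),
  error: (e) => console.error(e),
});
encoder.configure({ codec: 'vp09.00.10.08', width: 1280, height: 720 });

// Step 3: one VideoFrame per canvas frame; timestamps are in microseconds.
const fps = 30;
for (let i = 0; i < totalFrames; i++) {
  drawNextFrame();                                    // your canvas animation step
  const frame = new VideoFrame(canvas, { timestamp: (i * 1e6) / fps });
  encoder.encode(frame, { keyFrame: i % 150 === 0 }); // periodic key frames
  frame.close();
}

await encoder.flush(); // wait for the last chunks to reach the muxer
muxer.finalize();
const webmBuffer = muxer.target.buffer; // the finished .webm bytes
```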

This example uses the webcam stream; you can grab the code there and just change the source of the video frames to the canvas:

https://vanilagy.github.io/webm-muxer/demo-streaming/
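
And since the question mentions writing to a FileWritable: once the muxer has finalized, the finished buffer can be saved, for example with the File System Access API in Chromium-based browsers (the file name here is arbitrary):

```js
const handle = await window.showSaveFilePicker({ suggestedName: 'composite.webm' });
const writable = await handle.createWritable();
await writable.write(muxer.target.buffer); // buffer from the sketch above
await writable.close();
```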