I have multiple MJPEG streams (IP cameras) and I need to display them, animated, on a canvas (I have to apply effects to them etc., so I really need canvas).
OPTION 1: I was using setInterval and redrawing the image to the canvas like this: ctx.drawImage(img, 0, 0, 640, 360);
but only the first frame is drawn to the canvas in Firefox.
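Here is a minimal sketch of what I had for OPTION 1 (the camera URL and canvas id are placeholders):

    // Minimal sketch of OPTION 1 (camera URL and canvas id are placeholders).
    var canvas = document.getElementById('cam1');
    var ctx = canvas.getContext('2d');

    var img = new Image();
    img.src = 'http://camera-ip/mjpeg';   // MJPEG stream URL (placeholder)

    setInterval(function () {
        // In Firefox this only ever draws the first frame of the animated image.
        ctx.drawImage(img, 0, 0, 640, 360);
        // ...apply my effects to the canvas here...
    }, 1000 / 30);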
According to the spec (https://html.spec.whatwg.org/multipage/scripting.html#image-sources-for-2d-rendering-contexts:canvasimagesource-3):
[...] when a CanvasImageSource object represents an animated image in an HTMLOrSVGImageElement, the user agent must use the default image of the animation (the one that the format defines is to be used when animation is not supported or is disabled), or, if there is no such image, the first frame of the animation, when rendering the image for CanvasRenderingContext2D APIs.
So this is not a bug. In Chrome it works at the moment, but that behavior is considered a bug there and will be "fixed" soon...
OPTION 2: I was using setInterval to reload the image (my IP cameras can serve the current frame as a single snapshot, so I can fetch one frame and "refresh" the image) and then redraw it to the canvas. That works well, but the streams are ~30 fps, so re-establishing a connection to the server every 1/30 of a second is really hard and I lose a lot of frames (imagine having multiple cameras on the same page...).
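Roughly what OPTION 2 looks like (again a sketch; the snapshot URL is a placeholder, and the query string is only there to force a fresh frame instead of a cached one):

    // Minimal sketch of OPTION 2 (snapshot URL is a placeholder).
    var canvas = document.getElementById('cam1');
    var ctx = canvas.getContext('2d');

    setInterval(function () {
        var img = new Image();
        img.onload = function () {
            ctx.drawImage(img, 0, 0, 640, 360);
            // ...effects...
        };
        // Every request opens a new connection to the camera, which is the bottleneck.
        img.src = 'http://camera-ip/snapshot.jpg?_=' + Date.now();
    }, 1000 / 30);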
PS: I am unable to re-encode the MJPEG streams to MP4, WebM, or any other kind of video.
IDEA: Is there any way to manually split the MJPEG stream at the frame boundaries with JS and get each frame? I really don't know how to do this, or whether it's even possible.
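For reference, this is the kind of thing I was imagining, but it is pure guesswork on my part: fetch the raw stream (assuming fetch with readable streams is usable here), scan the bytes for the JPEG start-of-image (FF D8) and end-of-image (FF D9) markers instead of the multipart boundary, and turn each frame into a Blob I can draw. The camera URL is a placeholder, and I'm not sure scanning for FF D9 is even reliable, since those bytes can also occur inside the JPEG data:

    // Pure guesswork: read the raw MJPEG stream and cut it at the JPEG markers.
    // 'http://camera-ip/mjpeg' is a placeholder for the camera stream URL.
    var canvas = document.getElementById('cam1');
    var ctx = canvas.getContext('2d');

    fetch('http://camera-ip/mjpeg').then(function (response) {
        var reader = response.body.getReader();
        var buffer = new Uint8Array(0);

        function appendChunk(chunk) {
            var merged = new Uint8Array(buffer.length + chunk.length);
            merged.set(buffer);
            merged.set(chunk, buffer.length);
            buffer = merged;
        }

        function extractFrame() {
            // Look for the JPEG start-of-image (FF D8) and end-of-image (FF D9) markers.
            var start = -1, end = -1;
            for (var i = 0; i < buffer.length - 1; i++) {
                if (start < 0 && buffer[i] === 0xFF && buffer[i + 1] === 0xD8) start = i;
                if (start >= 0 && buffer[i] === 0xFF && buffer[i + 1] === 0xD9) { end = i + 2; break; }
            }
            if (start < 0 || end < 0) return null;
            var frame = buffer.slice(start, end);
            buffer = buffer.slice(end);   // keep the leftover bytes for the next frame
            return new Blob([frame], { type: 'image/jpeg' });
        }

        function pump() {
            return reader.read().then(function (result) {
                if (result.done) return;
                appendChunk(result.value);
                var blob = extractFrame();
                if (blob) {
                    var img = new Image();
                    img.onload = function () {
                        ctx.drawImage(img, 0, 0, 640, 360);
                        URL.revokeObjectURL(img.src);
                    };
                    img.src = URL.createObjectURL(blob);
                }
                return pump();
            });
        }

        return pump();
    });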
SO: What can I do?
Thank you very much!
PS2: jQuery or any other free JS library is accepted. Server-side: only PHP or CGI.