
I'm trying to take a snapshot from my webcam with the navigator.mediaDevices.getUserMedia() and canvas.getContext('2d').drawImage() functions.

When I do it like this, it works perfectly:

function init(){
  myVideo = document.getElementById("myVideo") 
  myCanvas = document.getElementById("myCanvas");
  videoWidth = myCanvas.width;
  videoHeight = myCanvas.height;
    
  startVideoStream();
}

function startVideoStream(){
  navigator.mediaDevices.getUserMedia({audio: false, video: { width: videoWidth, height: videoHeight }}).then(function(stream) {
    myVideo.src = URL.createObjectURL(stream);
  }).catch(function(err) {
      console.log("Unable to get video stream: " + err);
  });
}

function snapshot(){
  myCanvas.getContext('2d').drawImage(myVideo, 0, 0, videoWidth, videoHeight);
}
<!DOCTYPE html>
<html>
<head>
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <script src="debug.js"></script>
</head>
<body onload="init()">
    <div id="mainContainer">
        <video id="myVideo" width="640" height="480" autoplay style="display: inline;"></video>
        <canvas id="myCanvas" width="640" height="480" style="display: inline;"></canvas>
        <input type="button" id="snapshotButton" value="Snapshot" onclick="snapshot()"/>
    </div>
</body>
</html>

The thing is, I don't want to use a button click for taking the snapshot, but take the snapshot as soon as the camera stream is loaded. I tried calling the snapshot() function directly after setting the video source:

function init(){
  myVideo = document.getElementById("myVideo") 
  myCanvas = document.getElementById("myCanvas");
  videoWidth = myCanvas.width;
  videoHeight = myCanvas.height;
    
  startVideoStream();
}

function startVideoStream(){
  navigator.mediaDevices.getUserMedia({audio: false, video: { width: videoWidth, height: videoHeight }}).then(function(stream) {
    myVideo.src = URL.createObjectURL(stream);
    snapshot();
  }).catch(function(err) {
      console.log("Unable to get video stream: " + err);
  });
}

function snapshot(){
  myCanvas.getContext('2d').drawImage(myVideo, 0, 0, videoWidth, videoHeight);
}

But it doesn't work. My canvas stays white. I guess it's because the camera stream is not fully loaded at this point.

So is there another event that fires which I could use to draw the snapshot as soon as the camera feed is loaded? Or am I totally on the wrong track?

Thanks in advance!

xcess
  • can you print myVideo in snapshot please? – Fanyo SILIADIN Jan 25 '17 at 15:52
  • When I log myVideo in the snapshot() function it says: `` – xcess Jan 25 '17 at 15:55
  • maybe a look at this issue will help: http://stackoverflow.com/questions/12256668/html5-drawimage-from-video-to-canvas – Fanyo SILIADIN Jan 25 '17 at 16:00
  • The post is about wrong video formats, but it brought me on the right track :-) There is a "play" event listener for the video element, so I can use `myVideo.addEventListener('play', function(){ snapshot(); }, false);` Thank you very much! – xcess Jan 25 '17 at 16:17
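
For reference, a minimal sketch of the approach from the last comment above, using the names from the question's code; the listener is registered before the stream is requested so the event isn't missed:

function init(){
  myVideo = document.getElementById("myVideo");
  myCanvas = document.getElementById("myCanvas");
  videoWidth = myCanvas.width;
  videoHeight = myCanvas.height;

  // Take the snapshot as soon as the video starts playing.
  myVideo.addEventListener('play', function(){
    snapshot();
  }, false);

  startVideoStream();
}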

1 Answer


Wait for the loadedmetadata event:

navigator.mediaDevices.getUserMedia({video: true})
  .then(stream => {
    video.srcObject = stream;
    return new Promise(resolve => video.onloadedmetadata = resolve);
  })
  .then(() => canvas.getContext('2d').drawImage(video, 0, 0, 160, 120))
  .catch(e => console.log(e));
<video id="video" width="160" height="120" autoplay></video>
<canvas id="canvas" width="160" height="120"></canvas>

The above should work in all browsers (that do WebRTC).

In Chrome you can also wait for the promise returned by play() - but play() doesn't return a promise in any other browser yet.
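
A sketch of that Chrome-only variant (assuming play() returns a promise, which only Chrome did at the time):

navigator.mediaDevices.getUserMedia({video: true})
  .then(stream => {
    video.srcObject = stream;
    return video.play(); // resolves once playback has started (Chrome only)
  })
  .then(() => canvas.getContext('2d').drawImage(video, 0, 0, 160, 120))
  .catch(e => console.log(e));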

Also note that URL.createObjectURL(stream) is deprecated. Use srcObject.
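
If older browsers still need the deprecated path, a common feature-detection sketch (not part of the original answer) looks like this:

if ('srcObject' in video) {
  video.srcObject = stream;
} else {
  // Fallback for browsers without srcObject support; remember to call
  // URL.revokeObjectURL() when done, since streams can hold hardware open.
  video.src = URL.createObjectURL(stream);
}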

Update: Thanks to @KyleMcDonald in comments for pointing out the importance of registering the loadedmetadata listener synchronously with setting the srcObject!—Code updated.

jib
  • *"`URL.createObjectURL` is deprecated"* It's more *setting a video's src to a stream by using URL.createObjectURL* which is deprecated, right ? URL.createObjectURL is still the way to create an URL from a Blob. If I'm not mistaken, no one does support `srcObject = Blob` yet. – Kaiido Jan 26 '17 at 04:47
  • @Kaiido That's correct. `srcObject` support for blob and mediaSource are lacking atm. Though once that improves, here's hoping `URL.createObjectURL` will go away entirely. – jib Jan 26 '17 at 04:53
  • But then will `srcObject` be applied to ``, ` – Kaiido Jan 26 '17 at 04:57
  • @Kaiido Good points. srcObject won't work with those, so createObjectURL will probably remain for those uses. I'll update the answer. createObjectURL is particularly bad for streams, because streams can hold live hardware resources open, as few people call `revokeObjectURL` correctly. – jib Jan 26 '17 at 05:08
  • :-) Already upvoted, but glad you updated. Also, there could be a word about the WHATWG `play()` which returns a Promise, so we don't have to create one ourselves, even if currently only Chrome follows this version of the specs. (Maybe you know FF's plans about it?) And you're right about revokeObjectURL, I keep adding it to every one of my comments/answers. But for other uses, createObjectURL is just way better than readAsDataURL, especially for user files, since it's just a pointer to the hard drive. – Kaiido Jan 26 '17 at 05:10
  • @Kaiido Thanks for the pointer! I've filed [a bug](https://bugzil.la/1334017) on FF. – jib Jan 26 '17 at 05:21
  • Thank you for your help! The code works perfectly :-) – xcess Jan 26 '17 at 15:32
  • This code fails intermittently for me when the metadata loads too quickly. I solved it like this: `.then(stream => new Promise(resolve => { video.onloadedmetadata = resolve; video.srcObject = stream }))` instead. – Kyle McDonald Apr 16 '19 at 06:20
  • @KyleMcDonald Thanks! You don't have to set the listener first, but it does need to be done synchronously on the same run of the JS event loop, to not miss anything. Good catch! I've updated my code. – jib Apr 16 '19 at 23:04