
I am building a streaming service using WebRTC + Janus Gateway + captureStream().

This code starts streaming the video:

  public streamVideo() {
    const video = $('#video1').get(0);
    let stream;

    video.onplay = () => {
      if (video.captureStream) {
        stream = video.captureStream();
      } else if (video.mozCaptureStream) {
        stream = video.mozCaptureStream();
      } else {
        alert('captureStream() not supported');
        return;
      }

      console.log(stream);
      $('#secondvideoforll').get(0).srcObject = stream;

      this.sfutest.createOffer({
        media: { audioRecv: 0, videoRecv: 0, audioSend: 1, videoSend: 1 }, // Publishers are sendonly
        stream: stream,
        success: (jsep) => {
          Janus.debug('Got publisher SDP!');
          Janus.debug(jsep);
          const publish = { request: 'configure', audio: 1, video: 1 };
          this.sfutest.send({ message: publish, jsep: jsep });
        },
        error: (error) => {
          Janus.error('WebRTC111 error:', error);
        }
      });
    };
  }

Video playback works perfectly, but when I try to create an offer (and subsequently addStream), I get this error:

WebRTC111 error: DOMException [InternalError: "Cannot create an offer with no local tracks, no offerToReceiveAudio/Video, and no DataChannel."
code: 0
nsresult: 0x0]

The same offer creation (without the stream parameter) works for webcam streaming, but not for video streaming.

The main difference I found is that the webcam uses LocalMediaStream, while my captureStream() uses MediaStream.

Any ideas on this one?

2 Answers


When calling video.captureStream(), getTracks() initially returns an empty array; only after about 1.5 seconds does it return the tracks as expected.

The error produced when no tracks have been added yet is: WebRTC111 error: DOMException [InternalError: "Cannot create an offer with no local tracks, no offerToReceiveAudio/Video, and no DataChannel."

Adding this for documentation purposes, as others might find it confusing.

Solution:

var poll = setInterval(function () {
  // Wait until the media tracks have been added to the stream
  if (stream.getTracks().length > 0) {
    clearInterval(poll);
    console.log(stream.getTracks());
    // Further actions with the stream (e.g. createOffer) go here
  }
}, 1000);

Thanks!

Reference: https://github.com/w3c/webrtc-pc/issues/923
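Instead of polling on a timer, you could wait for the stream's 'addtrack' event, which the browser fires when the capture pipeline adds a track to a MediaStream. Below is a hypothetical waitForTracks() helper (not from the original post) that resolves as soon as at least one track is available:

```javascript
// Hypothetical helper: resolves once the stream reports at least one track.
// Works with any object exposing getTracks() and addEventListener (MediaStream does).
function waitForTracks(stream) {
  return new Promise(function (resolve) {
    if (stream.getTracks().length > 0) {
      // Tracks are already present; no need to wait
      resolve(stream.getTracks());
      return;
    }
    // 'addtrack' fires when a track is added to the stream by the browser
    stream.addEventListener('addtrack', function () {
      resolve(stream.getTracks());
    }, { once: true });
  });
}
```

You would then call waitForTracks(stream).then(...) and create the offer inside the callback, avoiding the fixed 1-second polling delay.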


This is because you are capturing the stream before the video has started playing. You need to capture the stream after playback has begun, so the tracks are already present.

Better than polling with setInterval is to:

  • Use await on the video.play() method and capture the stream after that.
  • Use the 'onplay' event of the video element to capture the stream once the video is playing.
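The await approach can be sketched like this; startPublishing() is a hypothetical name, and the createOffer options mirror the question's own code (this is a sketch assuming that setup, not the only way to wire it):

```javascript
// Sketch: play() returns a Promise; awaiting it guarantees playback has
// started, so captureStream() will already have live tracks to offer.
async function startPublishing(video, sfutest) {
  await video.play();

  const stream = video.captureStream
    ? video.captureStream()
    : video.mozCaptureStream(); // Firefox-prefixed fallback

  sfutest.createOffer({
    media: { audioRecv: 0, videoRecv: 0, audioSend: 1, videoSend: 1 },
    stream: stream,
    success: (jsep) => {
      // Same "configure" request as in the question
      sfutest.send({ message: { request: 'configure', audio: 1, video: 1 }, jsep: jsep });
    },
    error: (err) => {
      console.error('createOffer error:', err);
    }
  });
}
```

Because the offer is created only after play() resolves, getTracks() is no longer empty and the "no local tracks" InternalError does not occur.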