
I am using this npm package with Angular 8 [ https://www.npmjs.com/package/webrtc-adapter ] to replicate the WebRTC getDisplayMedia functionality shown here [ https://webrtc.github.io/samples/src/content/getusermedia/getdisplaymedia/ ].

I figured out how to start and stop a recording (live screen capture), but I cannot figure out, nor find any documentation on, how to download the actual recording. Please see the code below. Thank you.

import { Component, OnInit, ViewChild, ElementRef, AfterViewInit } from '@angular/core';
import adapter from "webrtc-adapter";

@Component({
  selector: 'tgh-web-rtc-screen-api',
  template: `
  <div class="row">
   <div class="col-md-12">
    <button (click)="startRecording()"> Record </button> &nbsp;
    <button (click)="stopRecording()"> Stop </button> &nbsp;
    <button (click)="resumeRecording()"> Resume </button> &nbsp;
    <button (click)="downloadRecording()"> Download</button>
   </div>
   <div class="col-md-12">
    <video #video class="video"></video>
   </div>
  </div>
  `
})

// Must implement AfterViewInit to work properly for video recording
export class WebRtcScreenApiComponent implements OnInit , AfterViewInit {

    // The HTML reference to the video element (<video></video> tag)
    // In Angular 8 @ViewChild requires the static flag as a second argument
    @ViewChild("video", { static: false }) video: ElementRef;

    _navigator = <any> navigator;
    _localStreamReference: any;

    constructor() { }

    ngOnInit() {

    }

    ngAfterViewInit() {
        // set the initial state of the video
        let video:HTMLVideoElement = this.video.nativeElement;
        video.muted = false;
        video.controls = true;
        video.autoplay = false;
    }

    // Starts the recording and calls the onSucces method with the captured stream
    startRecording() {

        // For Firefox, it requires you specify whether to present the option to share a screen or window to the user
        if (adapter.browserDetails.browser == 'firefox') {
            adapter.browserShim.shimGetDisplayMedia(window, 'screen');
        }

        // Modern way TODO: Figure out why the stream is needed
        this._navigator.mediaDevices.getDisplayMedia({video: true}).then(stream => {
            this.onSucces(stream);
        })
        .catch(e => {
            this.onError(e);
        });

    }

    //Starts the screen recording
    onSucces(stream: MediaStream): void {

        this._localStreamReference = stream;

        var video = document.querySelector('video');
        video.srcObject = stream;   
        video.onloadedmetadata = function(e) {
            video.play();
        };

    }

    // Stops the screen recording
    stopRecording(): void {

        const tracks = this._localStreamReference.getTracks();
        tracks.forEach((track) => {
            track.stop();
        });

    }

    // Resumes recording. Note: MediaStreamTrack has no play() method, and a
    // track that has been stop()ped cannot be restarted. To pause/resume a
    // live capture, toggle track.enabled instead.
    resumeRecording(): void {

        const tracks = this._localStreamReference.getTracks();
        tracks.forEach((track) => {
            track.enabled = true;
        });

    }

    // Downloads recording in browser
    downloadRecording() {
        // ?
    }

    // on WebRTC error
    onError(error: Error):void {
        console.log('Error message: ' + error.message);
        console.log('Error name: ' + error.name);
    }

}
  • Recording video on the client side is not the right way to implement recording of video calls. What if the user hard-reloads or closes the browser? The recording will be lost. My suggestion would be to go with an MCU. – msmukesh4 Mar 05 '20 at 08:20

1 Answer


Here is the code from the WebRTC samples that downloads a recording: https://github.com/webrtc/samples/blob/gh-pages/src/content/getusermedia/record/js/main.js#L47

Test page - https://webrtc.github.io/samples/src/content/getusermedia/record/

Also, in your code you are only using the getDisplayMedia API. If you want to record, you have to use the MediaStream Recording API (MediaRecorder) as well - https://developer.mozilla.org/en-US/docs/Web/API/MediaStream_Recording_API
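Putting that together, here is a minimal sketch of how the MediaRecorder flow could look. None of this is from the original post: the names (`recordedChunks`, `buildRecordingBlob`, the `video/webm` mime type, the one-second timeslice) are illustrative choices, and you would call `startMediaRecorder(stream)` from your `onSucces` handler.

```typescript
// Sketch: record a MediaStream with MediaRecorder and download the result.
// Chunk buffer and recorder instance; names here are illustrative.
let recordedChunks: Blob[] = [];
let mediaRecorder: MediaRecorder;

function startMediaRecorder(stream: MediaStream): void {
    recordedChunks = [];
    mediaRecorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
    // Each dataavailable event delivers a chunk of encoded video.
    mediaRecorder.ondataavailable = (event: BlobEvent) => {
        if (event.data && event.data.size > 0) {
            recordedChunks.push(event.data);
        }
    };
    mediaRecorder.start(1000); // emit a chunk roughly every second
}

// Pure helper: combine the recorded chunks into a single Blob.
function buildRecordingBlob(chunks: BlobPart[], mimeType = 'video/webm'): Blob {
    return new Blob(chunks, { type: mimeType });
}

function downloadRecording(): void {
    const blob = buildRecordingBlob(recordedChunks);
    const url = URL.createObjectURL(blob);
    // Trigger a browser download via a temporary anchor element.
    const a = document.createElement('a');
    a.href = url;
    a.download = 'screen-recording.webm';
    a.click();
    URL.revokeObjectURL(url);
}
```

Note that `mediaRecorder.stop()` (not stopping the tracks) is what finalizes the recording, so a stop button would typically call it before the download is triggered.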

Karthik