Questions tagged [mediastream]

The two main components in the MediaStream API are the MediaStreamTrack and MediaStream interfaces. The MediaStreamTrack object represents media of a single type that originates from one media source in the User Agent, e.g. video produced by a web camera. A MediaStream is used to group several MediaStreamTrack objects into one unit that can be recorded or rendered in a media element.

See the Media Capture and Streams specification.
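
A minimal sketch of the relationship described above, assuming a page with a <video> element and camera/microphone permission: getUserMedia resolves with a MediaStream whose individual MediaStreamTrack objects can be inspected and rendered.

    // Request camera + microphone; the result is one MediaStream
    // grouping a video track and an audio track.
    navigator.mediaDevices.getUserMedia({ video: true, audio: true })
      .then((stream) => {
        stream.getTracks().forEach((track) => {
          // Each track represents a single media type from one source.
          console.log(track.kind, track.label, track.readyState);
        });
        const video = document.querySelector('video');
        video.srcObject = stream;   // render the grouped tracks
        return video.play();
      })
      .catch((err) => console.error('getUserMedia failed:', err));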

348 questions
8
votes
1 answer

chrome.desktopCapture - can't record both system audio and microphone?

I've built a Chrome extension that captures screen activity and microphone input and outputs a video file. Since chrome.desktopCapture can't record audio input alongside screen capture, I'm getting the mic in its own, separate stream. So: //get…
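
A hedged sketch of the usual workaround: once the screen stream has been obtained via the desktopCapture flow, request the microphone separately and merge the tracks into a single MediaStream for MediaRecorder. The function name and the assumption that screenStream already exists are illustrative; mixing system audio with the mic would additionally require the Web Audio API.

    // Assumes `screenStream` came from the chrome.desktopCapture + getUserMedia flow.
    async function recordScreenWithMic(screenStream) {
      const micStream = await navigator.mediaDevices.getUserMedia({ audio: true });
      // Merge the screen's video track and the mic's audio track.
      const combined = new MediaStream([
        ...screenStream.getVideoTracks(),
        ...micStream.getAudioTracks(),
      ]);
      const chunks = [];
      const recorder = new MediaRecorder(combined);
      recorder.ondataavailable = (e) => chunks.push(e.data);
      recorder.onstop = () => {
        const blob = new Blob(chunks, { type: 'video/webm' });
        console.log('Recorded', blob.size, 'bytes');
      };
      recorder.start();
      return recorder;
    }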
8
votes
2 answers

Webcam Light Stays on Even After I run MediaStreamTrack.stop()

I am building a React app and need to access the webcam, which I get with the following code: navigator.mediaDevices.getUserMedia({ video: true, audio: false }) .then(function(stream) { video.srcObject = stream; window.localstream =…
pengcheng95
  • 292
  • 4
  • 13
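
A short sketch of the fix that usually resolves this: stop the tracks on the stream itself (not just pausing the video element) and detach srcObject so the browser releases the camera. The helper name is illustrative.

    function stopWebcam(video) {
      const stream = video.srcObject;
      if (stream) {
        // Stop every track; a still-live track keeps the camera light on.
        stream.getTracks().forEach((track) => track.stop());
      }
      video.srcObject = null;  // detach so the element drops its reference
    }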
8
votes
0 answers

Where is a comprehensive list of supported media types when recording with the Media * API?

I am trying to learn how to record media in the browser and I may be over-complicating things. There is an abundant supply of straightforward examples, but I got bogged down at the part where the recordings are pushed to a Blob object with an…
toraritte
  • 6,300
  • 3
  • 46
  • 67
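
The specification does not enumerate container/codec combinations, so in practice support is probed at runtime. A small sketch using MediaRecorder.isTypeSupported(); the candidate list below is illustrative, not exhaustive.

    const candidates = [
      'video/webm;codecs=vp9,opus',
      'video/webm;codecs=vp8,opus',
      'video/mp4;codecs=h264,aac',
      'audio/webm;codecs=opus',
      'audio/ogg;codecs=opus',
    ];
    // Logs true/false per MIME type for the current browser.
    candidates.forEach((type) => {
      console.log(type, MediaRecorder.isTypeSupported(type));
    });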
7
votes
2 answers

How to convert Audio buffer to MP3 in Javascript?

I am using MediaRecorder in ReactJS to record audio from the microphone and store it in a blob with MIME type "audio/mp3". I want to convert this blob to MP3 and upload it to an S3 bucket. I am able to convert it into WAV by using audioContext,…
Vijay Singh Kholiya
  • 383
  • 1
  • 6
  • 19
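
Browsers have no built-in MP3 encoder, so the usual route is recorded blob -> AudioBuffer -> 16-bit PCM -> a JavaScript encoder library (lamejs is one option; its API is not shown here). A sketch of the decoding half, which is the part the Web Audio API covers:

    async function blobToPcm(blob) {
      const ctx = new AudioContext();
      const audioBuffer = await ctx.decodeAudioData(await blob.arrayBuffer());
      // Convert Float32 samples in [-1, 1] to 16-bit PCM for an encoder.
      const float32 = audioBuffer.getChannelData(0);
      const pcm = new Int16Array(float32.length);
      for (let i = 0; i < float32.length; i++) {
        pcm[i] = Math.max(-1, Math.min(1, float32[i])) * 0x7fff;
      }
      return { pcm, sampleRate: audioBuffer.sampleRate };
    }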
7
votes
2 answers

ReferenceError: MediaStream is not defined - in unitTest with Jest

I am trying to run unit tests with the Jest framework. I have some WebRTC-related code in my project written in TypeScript. I keep getting this error. I tried to mock MediaStream but to no avail. My test file: import * as React from…
Gyanesh Gouraw
  • 1,991
  • 4
  • 23
  • 31
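
jsdom (Jest's default environment) does not implement MediaStream, so a minimal stub defined in a Jest setup file is usually enough to get the tests to load. A sketch, with an intentionally bare-bones mock:

    // e.g. in a setup file referenced by Jest's setupFiles option
    class MediaStreamMock {
      constructor(tracks = []) { this.tracks = tracks; }
      getTracks() { return this.tracks; }
      addTrack(track) { this.tracks.push(track); }
    }
    global.MediaStream = MediaStreamMock;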
7
votes
0 answers

Changing FocusMode not working using MediaStream API in Google Chrome

In the Google Chrome browser I was able to get a live feed from my connected USB camera using the getUserMedia() API. I have a slider to change the brightness value and this is working fine. I also want focusMode to toggle from continuous to manual (the camera…
vgokul129
  • 777
  • 12
  • 31
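
A hedged sketch of the usual approach: check track.getCapabilities() before calling applyConstraints(), because many USB cameras (or their drivers) never expose focusMode at all, in which case the constraint is silently ignored.

    async function setManualFocus(stream) {
      const [track] = stream.getVideoTracks();
      const caps = track.getCapabilities();
      if (caps.focusMode && caps.focusMode.includes('manual')) {
        await track.applyConstraints({ advanced: [{ focusMode: 'manual' }] });
      } else {
        console.warn('focusMode is not exposed by this camera/driver');
      }
    }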
7
votes
1 answer

How to create a MediaStream from an uploaded audio file or an audio file URL using JavaScript?

I know how to use navigator.getUserMedia to get the audio stream from the browser and the system's default input device (a microphone). But what if I would like to get the MediaStream from an uploaded audio file or an audio file URL? Appreciate if can…
Peiti Li
  • 4,634
  • 9
  • 40
  • 57
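
One way to do this is to play the file through the Web Audio API and capture the graph's output with createMediaStreamDestination(). A sketch; the function name is illustrative, and the source can be an object URL of an uploaded File or a same-origin URL.

    function mediaStreamFromAudioSource(src) {
      const audio = new Audio(src);
      const ctx = new AudioContext();  // some browsers require a user gesture first
      const source = ctx.createMediaElementSource(audio);
      const dest = ctx.createMediaStreamDestination();
      source.connect(dest);
      audio.play();
      return dest.stream;  // a MediaStream containing one audio track
    }

    // Usage with an uploaded file:
    // const stream = mediaStreamFromAudioSource(URL.createObjectURL(file));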
7
votes
1 answer

WebRTC: use of getStats()

I'm trying to get stats of a webRTC app to measure audio/video streaming bandwidth. I checked this question and I found it very useful; however, when I try to use it I get TypeError: Not enough arguments to RTCPeerConnection.getStats. I think that…
Don Diego
  • 1,309
  • 3
  • 23
  • 46
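
That TypeError suggests the browser expected the older signature that required arguments; current browsers support the promise-based getStats(), callable with no arguments. A sketch that iterates the report for the RTP entries carrying bandwidth-related counters:

    async function logBandwidthStats(pc) {
      const report = await pc.getStats();
      report.forEach((stat) => {
        if (stat.type === 'inbound-rtp' || stat.type === 'outbound-rtp') {
          console.log(stat.type, stat.kind, stat.bytesReceived ?? stat.bytesSent);
        }
      });
    }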
7
votes
2 answers

How to get a running MediaStream

I've created a webcam stream with navigator.getUserMedia({ "video": true }, function(stream){ videoTag.src = window.URL.createObjectURL(stream); videoTag.play(); } Can I access the MediaStream object in stream in the global scope? (something…
Breaker222
  • 1,334
  • 1
  • 14
  • 23
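
A sketch of the simplest pattern: keep the stream in a variable declared outside the callback (module- or window-scoped) so it can be reached later; videoTag is assumed to be the same element as in the question.

    let currentStream = null;  // reachable from anywhere in this script

    navigator.mediaDevices.getUserMedia({ video: true })
      .then((stream) => {
        currentStream = stream;
        videoTag.srcObject = stream;  // modern replacement for createObjectURL(stream)
        videoTag.play();
      });

    // Later, e.g. to stop it:
    // currentStream.getTracks().forEach((track) => track.stop());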
7
votes
1 answer

Capturing an image in HTML5 at full resolution

It is possible to capture an image in JavaScript using the MediaStream API. But in order to do so, it is first necessary to instantiate a video object, then paint a frame into a canvas to get an image. Unfortunately, many devices (e.g. phones)…
Michael
  • 9,060
  • 14
  • 61
  • 123
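
Where it is available (Chromium-based browsers), the ImageCapture API can take a photo directly from the video track, typically at a higher resolution than the preview frames, without going through a <video>/<canvas> pair. A hedged sketch:

    async function capturePhoto() {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true });
      const [track] = stream.getVideoTracks();
      const imageCapture = new ImageCapture(track);
      const blob = await imageCapture.takePhoto();  // resolution depends on the device
      track.stop();
      return blob;  // e.g. display via URL.createObjectURL(blob)
    }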
6
votes
0 answers

Create MediaStream from ReadableStream

I'm using puppeteer-stream to get a stream of a browser controlled by Node, running on a server. I am able to write this stream out to a file with no issues. I wanted to stream this stream via WebRTC to a browser (basically to see what the browser…
navinpai
  • 955
  • 11
  • 33
6
votes
0 answers

WebRTC: disabling a track doesn't turn off the webcam

I am trying to implement a toggle video feature using webRTC. Refer to the following code: let localVideo = document.querySelector('#local'); const…
RisingGeek
  • 83
  • 1
  • 9
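
The distinction that usually explains this: track.enabled = false only mutes the track (the peer sees black frames) while the camera stays claimed; releasing the hardware requires stopping the track and re-acquiring it later. A sketch of both:

    // Mute/unmute without releasing the camera (light stays on).
    function toggleVideo(stream, on) {
      stream.getVideoTracks().forEach((track) => { track.enabled = on; });
    }

    // Actually release the camera; getUserMedia must be called again to resume.
    function releaseCamera(stream) {
      stream.getVideoTracks().forEach((track) => track.stop());
    }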
6
votes
1 answer

HTMLImageElement - src as stream

In the past, you could use URL.createObjectURL() and pass it a MediaStream. However, this has been removed (see https://www.fxsitecompat.dev/en-CA/docs/2017/url-createobjecturl-stream-has-been-deprecated/). The replacement functionality was to…
J Trana
  • 2,150
  • 2
  • 20
  • 32
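
In current browsers <img>.srcObject is generally not implemented, so the common replacement is to render the stream in an offscreen <video>, copy a frame onto a canvas, and feed the canvas output to the image. A sketch; one-shot here, but the drawing step could be repeated for live updates:

    function streamToImage(stream, img) {
      const video = document.createElement('video');
      video.muted = true;
      video.srcObject = stream;
      video.play();
      video.addEventListener('loadeddata', () => {
        const canvas = document.createElement('canvas');
        canvas.width = video.videoWidth;
        canvas.height = video.videoHeight;
        canvas.getContext('2d').drawImage(video, 0, 0);
        img.src = canvas.toDataURL('image/png');
      });
    }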
6
votes
1 answer

Combining and position two media streams

I am trying to produce something similar to https://recordscreen.io/. It positions the user's camera over the screen recording. I've got both streams separately right now. I've tried positioning one over the other, but I want it in a single video element…
Reece Ward
  • 213
  • 2
  • 9
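
The usual way to get both into one element (and one recording) is to composite them on a canvas and capture the canvas as a new stream. A sketch with hard-coded sizes and an illustrative picture-in-picture position:

    function compositeStreams(screenStream, camStream) {
      const screenVideo = document.createElement('video');
      const camVideo = document.createElement('video');
      screenVideo.srcObject = screenStream;
      camVideo.srcObject = camStream;
      screenVideo.muted = camVideo.muted = true;
      screenVideo.play();
      camVideo.play();

      const canvas = document.createElement('canvas');
      canvas.width = 1280;
      canvas.height = 720;
      const ctx = canvas.getContext('2d');

      (function draw() {
        ctx.drawImage(screenVideo, 0, 0, canvas.width, canvas.height);
        // Camera as a small overlay in the bottom-right corner.
        ctx.drawImage(camVideo, canvas.width - 320, canvas.height - 180, 320, 180);
        requestAnimationFrame(draw);
      })();

      return canvas.captureStream(30);  // feed this to a <video> or MediaRecorder
    }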
6
votes
1 answer

MediaStream events active and inactive are not triggered in a remote WebRTC Peer Connection

I have a remote MediaStream object obtained from a remote WebRTC peer connection. I want to check when the remote MediaStream becomes inactive (independently of the reason). I have read that for this purpose I should use the events active and inactive…
Alessandro C
  • 3,310
  • 9
  • 46
  • 82
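
The active/inactive events tend to be unreliable on remote streams in shipping browsers; the events most apps end up watching instead are 'ended' (and sometimes 'mute') on the individual tracks. A sketch that reports the stream as inactive once every track has ended; the helper name is illustrative:

    function watchRemoteStream(stream, onInactive) {
      let live = stream.getTracks().length;
      stream.getTracks().forEach((track) => {
        track.addEventListener('ended', () => {
          live -= 1;
          if (live === 0) onInactive();  // all tracks ended -> stream is inactive
        });
      });
    }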