Questions tagged [mediastream]

The two main components in the MediaStream API are the MediaStreamTrack and MediaStream interfaces. The MediaStreamTrack object represents media of a single type that originates from one media source in the User Agent, e.g. video produced by a web camera. A MediaStream is used to group several MediaStreamTrack objects into one unit that can be recorded or rendered in a media element.


348 questions
3
votes
0 answers

Detecting if another tab is sharing the screen in the client's browser, and stopping it from sharing

Generally, if a browser tab or a desktop app is using the webcam, other sources cannot use that webcam until the first app stops. In a similar way, is it possible to detect whether a screen is already being shared in another tab…
Umut
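Browsers do not expose whether another tab or app is capturing the screen. Within your own origin, one workaround (an assumption, not from the question) is a BroadcastChannel handshake between tabs; the channel name `screen-share` below is hypothetical, and this cannot detect capture by other origins or native apps.

```javascript
// Sketch: coordinate screen sharing across tabs of the same origin
// via a BroadcastChannel. When one tab announces it is about to
// share, the others stop their own capture.
function createShareGuard(onStopRequested) {
  const channel = new BroadcastChannel('screen-share'); // arbitrary name
  channel.onmessage = (event) => {
    // Another tab started sharing: release our capture.
    if (event.data === 'share-started') onStopRequested();
  };
  return {
    announceShare() { channel.postMessage('share-started'); },
    close() { channel.close(); },
  };
}
```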
3
votes
0 answers

How do I sample audio from microphone in javascript, for microphone animation?

I am recording audio from the microphone in a browser using the MediaStream Recording API. I would like to provide the user with visual feedback in the form of a pulsating microphone icon. Something like this, only tied to the actual sound amplitude from the mic.…
Irina Rapoport
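One common pattern for this (an assumption, not taken from the question) is to route the microphone stream through an AnalyserNode and scale the icon from the RMS of the time-domain samples; the element id `mic-icon` below is hypothetical.

```javascript
// Map time-domain samples from an AnalyserNode (Uint8Array centred
// on 128) to a 0..1 amplitude usable for scaling an icon.
function amplitudeFromSamples(samples) {
  let sumSquares = 0;
  for (const s of samples) {
    const v = (s - 128) / 128;      // normalise to -1..1
    sumSquares += v * v;
  }
  return Math.sqrt(sumSquares / samples.length); // RMS
}

// Browser wiring (hypothetical element id "mic-icon"):
async function startMicMeter() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  ctx.createMediaStreamSource(stream).connect(analyser);
  const buf = new Uint8Array(analyser.fftSize);
  const icon = document.getElementById('mic-icon');
  (function tick() {
    analyser.getByteTimeDomainData(buf);
    icon.style.transform = `scale(${1 + amplitudeFromSamples(buf)})`;
    requestAnimationFrame(tick);
  })();
}
```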
3
votes
1 answer

What's the best way to reduce the frame rate used by a MediaRecorder?

I have a MediaStream from a WebRTC remote peer from which I would like to create a video recording in the browser. I'm currently creating the MediaRecorder like this: const recorder = new MediaRecorder(mediaStream); The original stream has a frame…
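One hedged option is to ask the track itself for a lower rate via applyConstraints; whether a remote WebRTC track honours the constraint depends on the source, and re-encoding through `canvas.captureStream(fps)` is the usual fallback when it does not.

```javascript
// Sketch: request a lower frame rate on the video track before
// handing the stream to MediaRecorder. The constraint is a hint;
// check track.getSettings().frameRate afterwards to see what the
// browser actually applied.
async function recordAtLowFps(mediaStream, fps) {
  const [videoTrack] = mediaStream.getVideoTracks();
  await videoTrack.applyConstraints({ frameRate: { ideal: fps, max: fps } });
  const recorder = new MediaRecorder(mediaStream);
  recorder.start();
  return recorder;
}
```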
3
votes
0 answers

Can't create blank video media stream track that works on all browsers using RTCPeerConnection

I can get this to work on Firefox and Chrome but not Safari. On Safari it will be created, but then the browser crashes when you use it for RTCPeerConnection.addTrack. Is there a way to create a MediaStreamTrack of type video that is just a blank…
John
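A common workaround is to capture a painted canvas. The Safari note below is an assumption based on its historical behaviour: the canvas has needed to be drawn to, and periodically redrawn, before captureStream() produces frames.

```javascript
// Sketch: a black "blank" video track from a canvas. The interval
// keeps repainting so frames continue to flow on browsers that stop
// emitting frames for a static canvas.
function blankVideoTrack(width = 640, height = 480) {
  const canvas = document.createElement('canvas');
  canvas.width = width;
  canvas.height = height;
  const ctx = canvas.getContext('2d');
  const paint = () => ctx.fillRect(0, 0, width, height); // fills black
  paint();
  setInterval(paint, 1000);
  const [track] = canvas.captureStream(5).getVideoTracks();
  return track;
}
```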
3
votes
0 answers

How can I use the browser's MediaRecorder API in React Native

I just want to use the MediaRecorder API to record the stream that I'm using with WebRTC in React Native. I want to save the stream to a file on the backend server, and hence I need to send the buffer data through the sockets once the…
3
votes
2 answers

video.captureStream stops when video is over

I'm building a custom WebRTC solution that allows you to use an mp4 file as if it were your camera. To do that, I'm creating a
Maurício Giordano
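For the file-as-camera case, the usual fix is to loop the source element so its tracks never end; a minimal sketch:

```javascript
// Sketch: keep a captured stream alive by looping the source video.
// Without loop = true, the captured tracks end when the file does.
function fileAsCamera(videoElement, mp4Url) {
  videoElement.src = mp4Url;
  videoElement.loop = true;
  videoElement.muted = true;   // allows autoplay without a user gesture
  videoElement.play();
  return videoElement.captureStream();
}
```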
3
votes
2 answers

How to mix / combine multiple WebRTC media streams (screen capture + webcam) into a single stream?

I have a live screen capture media stream returned from getDisplayMedia(), and a live webcam media stream returned from getUserMedia(). I currently render the webcam video on top of the screen share video, to create a picture-in-picture effect: I…
amiregelz
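The standard approach is to composite both video tracks onto a canvas, capture the canvas as a single video track, and add the audio tracks directly; the dimensions and picture-in-picture corner below are arbitrary choices, not taken from the question.

```javascript
// Sketch: draw screen + webcam picture-in-picture onto a canvas and
// capture the canvas as one mixed stream.
function combineStreams(screenStream, camStream, width = 1280, height = 720) {
  const canvas = document.createElement('canvas');
  canvas.width = width;
  canvas.height = height;
  const ctx = canvas.getContext('2d');

  const screenVideo = attachVideo(screenStream);
  const camVideo = attachVideo(camStream);

  (function draw() {
    ctx.drawImage(screenVideo, 0, 0, width, height);
    ctx.drawImage(camVideo, width - 330, height - 250, 320, 240); // PiP
    requestAnimationFrame(draw);
  })();

  const mixed = canvas.captureStream(30);
  for (const s of [screenStream, camStream]) {
    s.getAudioTracks().forEach((t) => mixed.addTrack(t));
  }
  return mixed;
}

// Helper: render a stream into an off-DOM video element so it can be
// drawn onto the canvas.
function attachVideo(stream) {
  const v = document.createElement('video');
  v.srcObject = stream;
  v.muted = true;
  v.play();
  return v;
}
```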
3
votes
2 answers

webRTC meaning of remote video track muted / enabled

Having implemented, a couple of years back, a mechanism for signaling via a data channel message that a remote user muted their local video (e.g., set enabled to false), and then taking the appropriate action on the remote side (e.g., showing the remote…
SBG
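For context: `track.enabled = false` is a sender-side switch, while a remote track's `mute`/`unmute` events only report that frames stopped or resumed arriving, which can also happen on a network stall. A sketch of listening for both, with the caveat that an explicit data-channel message remains the only unambiguous signal:

```javascript
// Sketch: observe a remote MediaStreamTrack. "mute" fires whenever
// frames stop arriving, whether the sender disabled the track or the
// network stalled, so treat it as "maybe muted".
function watchRemoteTrack(track, onMaybeMuted, onUnmuted) {
  track.addEventListener('mute', onMaybeMuted);
  track.addEventListener('unmute', onUnmuted);
  return () => {
    track.removeEventListener('mute', onMaybeMuted);
    track.removeEventListener('unmute', onUnmuted);
  };
}
```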
3
votes
0 answers

Adding mediastream.clone() to a video element srcObject does not fire loadedmetadata event if video track is not enabled

I have a react app configured as such: // landingpage.js const {mediaStream, setMediaStream} = useContext(StreamContext); useEffect(() => { navigator.mediaDevices.getUserMedia({ video: true, audio: true }) .then((stream) => { const video =…
Joseph K.
3
votes
0 answers

Passing mediaStream on in chrome extension

I'm building an extension, one part of whose functionality is recording the screen. After enabling recording in the popup interface, the background.js creates a mediaStream object successfully. I'm doing it in the background script because the…
3
votes
0 answers

Getting P5js AudioIn to MediaStream Object

I'm trying to merge audio & video streams. I'm using p5.js and am unable to figure out how to get a MediaStream object from p5.js's AudioIn function. Below is the code snippet: audioRecorder = new…
NazimZeeshan
3
votes
0 answers

Chrome: play audio on two outputs with HTMLAudioElement.setSinkId

I want to play the same audio MediaStream on two different audio output media devices with the following code: const audio0 = new Audio('/audio.mp3') audio0.play() const stream = audio0.captureStream() const audio1 = new Audio() audio1 …
Samuel
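A sketch of one way to do this, assuming Chrome, where setSinkId and captureStream are available: give each output device its own Audio element, with a cloned stream per consumer.

```javascript
// Sketch: route one captured audio stream to two output devices.
// deviceIdA/deviceIdB come from navigator.mediaDevices.enumerateDevices().
async function playOnTwoOutputs(sourceAudio, deviceIdA, deviceIdB) {
  const stream = sourceAudio.captureStream();
  const a = new Audio();
  a.srcObject = stream;
  const b = new Audio();
  b.srcObject = stream.clone();   // a separate consumer per sink
  await a.setSinkId(deviceIdA);
  await b.setSinkId(deviceIdB);
  await Promise.all([a.play(), b.play()]);
}
```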
3
votes
0 answers

Mediastream pipe to NodeJS socket.io stream to Google Speech API and stream back the responses

I want to implement speech-to-text using the Google Speech API, but I don't quite understand what I should do in my frontend. I am using Socket.io Stream on both the backend and frontend. Frontend (JavaScript) bindSendAudioMessage() { let me = this; …
3
votes
0 answers

Focus the camera on an object using focusMode on WebRTC

I'm trying to get the camera to focus on an object by using focusMode on the stream track received from getUserMedia. But after changing the focus mode property to manual, I don't see it reflected in the stream; I could also see that after…
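focusMode and focusDistance come from the MediaStream Image Capture extensions and are only honoured on some devices (mostly Chrome on Android); a hedged sketch that checks capabilities first and reads back getSettings() to verify what was applied:

```javascript
// Sketch: switch a camera track to manual focus if the device
// supports it, then report the settings the browser actually applied.
async function setManualFocus(videoTrack, distance) {
  const caps = videoTrack.getCapabilities();
  if (!caps.focusMode || !caps.focusMode.includes('manual')) {
    throw new Error('manual focus not supported on this track');
  }
  await videoTrack.applyConstraints({
    advanced: [{ focusMode: 'manual', focusDistance: distance }],
  });
  return videoTrack.getSettings(); // compare against the requested values
}
```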
3
votes
0 answers

Recording MediaStream server-side (with video and audio) from WebRTC to a file

I am designing an application whose aim is to stream video (with audio) from a camera (mobile application) and save it to disk on the server. Everything with the communication between client and server is OK. I am using WebRTC. One of the…