The MediaSource JavaScript API extends HTMLMediaElement to allow JavaScript to generate media streams for playback.
Questions tagged [media-source]
325 questions
12
votes
3 answers
how to play
I have two tags like this:
I want to play them one by one, i.e. call .play() on the first one and, when it fires its ended event, call .play() on the second one. Unfortunately this gives me a pop sound…

Stepan Yakovenko
- 8,670
- 28
- 113
- 206
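The question above is about chaining playback on the media element's ended event (the event name is ended; there is no onend). A minimal sketch of that approach, with hypothetical element IDs:

```javascript
// Minimal sketch (element IDs are made up): start the second element
// as soon as the first one fires 'ended'.
const first = document.getElementById('clip1');
const second = document.getElementById('clip2');

first.addEventListener('ended', () => {
  // The handoff between two separate elements is not gapless, which is
  // where the audible pop/gap described above tends to come from.
  second.play();
});

first.play();
```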
12
votes
1 answer
How to generate Initialization Segment of webm video to use with Media Source API
I'm building a small application that uses the MediaRecorder API to split the video recorded from the webcam into parts and upload them all to a server.
I see that with the Media Source API, I need to play the first part before I can play any other part.
According to…

nvcnvn
- 4,991
- 8
- 49
- 77
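For context on the excerpt above: when MediaRecorder is started with a timeslice, the first Blob it delivers carries the WebM initialization segment, which is why the later chunks only decode after that first one. A rough sketch, not the asker's code (the mimeType is an assumption):

```javascript
// Record the webcam in timesliced chunks; chunks[0] will contain the
// initialization segment that every later chunk depends on.
async function recordInChunks() {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp8' });
  const chunks = [];

  recorder.ondataavailable = (event) => {
    if (event.data.size > 0) {
      chunks.push(event.data); // chunks[0] holds the initialization segment
    }
  };

  recorder.start(1000); // request a chunk roughly every second
  return { recorder, chunks };
}
```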
11
votes
2 answers
Is it possible to add a stream as source to an html canvas element as to a html video element?
According to MDN:
The HTMLMediaElement interface adds to HTMLElement the properties and methods needed to support basic media-related capabilities that are common to audio and video.
HTMLMediaElement.captureStream(). It can be used with both…

sçuçu
- 2,960
- 2
- 33
- 60
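The answer to the question above turns on the fact that captureStream() exists on HTMLCanvasElement as well as HTMLMediaElement, so a canvas can act as a stream source, but a stream cannot be assigned to a canvas; you draw frames onto it instead. A short sketch:

```javascript
// Turn a <canvas> into a MediaStream and play it through a <video>.
const canvas = document.querySelector('canvas');
const video = document.querySelector('video');

const stream = canvas.captureStream(30); // capture at roughly 30 fps
video.srcObject = stream;
video.play();
```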
11
votes
2 answers
Unable to stream video over a websocket to Firefox
I have written some code to stream video over a websocket to a SourceBuffer, which works in Chrome and Edge.
However, when I run this in Firefox, the video never plays back; just a spinning-wheel animation is displayed. When I check the…

Hans
- 2,448
- 2
- 24
- 30
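Whatever the cause of the Firefox behaviour above, the usual skeleton for feeding WebSocket data into MSE is a queued append loop, since appendBuffer() must not be called while the SourceBuffer is still updating. A minimal sketch (the URL and MIME string are placeholders):

```javascript
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8, vorbis"');
  const queue = [];

  // Only append when the previous append has finished.
  const appendNext = () => {
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  };
  sourceBuffer.addEventListener('updateend', appendNext);

  const ws = new WebSocket('wss://example.com/stream'); // placeholder URL
  ws.binaryType = 'arraybuffer';
  ws.onmessage = (event) => {
    queue.push(event.data);
    appendNext();
  };
});
```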
10
votes
0 answers
Get cluster offsets from ebml cues
I'm attempting to stream YouTube video using Media Source Extensions (MSE), and it works fine, but when trying to get seeking working I ran into the problem that I don't know the byte ranges for given time ranges. Using EBML parsing I can get the cues…

Zachrip
- 3,242
- 1
- 17
- 32
10
votes
1 answer
Chrome: to play a video that is being downloaded via fetch/XHR
What I'm trying to achieve is to make Chrome load a video file as data (via the Fetch API, XHR, whatever) and play it using…

thorn0
- 9,362
- 3
- 68
- 96
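For the fetch-based approach above, the usual pattern is to read the response body as a stream and append each chunk to a SourceBuffer; this only works if the file is fragmented MP4 or WebM, not a regular MP4 with the moov atom at the end. A sketch under those assumptions (URL and codec string are placeholders):

```javascript
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f, mp4a.40.2"');
  const reader = (await fetch('/video/fragmented.mp4')).body.getReader();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    sourceBuffer.appendBuffer(value);
    // Wait for this append to finish before reading the next chunk.
    await new Promise((resolve) =>
      sourceBuffer.addEventListener('updateend', resolve, { once: true })
    );
  }
  mediaSource.endOfStream();
});
```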
10
votes
3 answers
Get mime type for MediaSource.isTypeSupported
How do I get the MIME type I need to pass to MediaSource.isTypeSupported using ffprobe/ffmpeg?
For instance, on my computer, this returns true:
MediaSource.isTypeSupported('video/mp4; codecs="avc1.64000d,mp4a.40.2"')
while that…

Guig
- 9,891
- 7
- 64
- 126
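The browser side of the question above is just a matter of probing candidate strings; the codec part (roughly, 'avc1.' followed by the hex profile/constraint/level bytes for H.264, 'mp4a.40.2' for AAC-LC) has to be assembled from ffprobe's profile and level output. A small sketch with example strings:

```javascript
// Probe a few candidate MIME/codec strings; the strings themselves are
// just examples, not derived from any particular file.
const candidates = [
  'video/mp4; codecs="avc1.64000d, mp4a.40.2"',
  'video/mp4; codecs="avc1.42E01E, mp4a.40.2"',
  'video/webm; codecs="vp9, opus"',
];

for (const type of candidates) {
  console.log(type, MediaSource.isTypeSupported(type));
}
```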
10
votes
4 answers
Flush & Latency Issue with Fragmented MP4 Creation in FFMPEG
I'm creating a fragmented MP4 for HTML5 streaming, using the following command:
-i rtsp://172.20.28.52:554/h264 -vcodec copy -an -f mp4 -reset_timestamps 1 -movflags empty_moov+default_base_moof+frag_keyframe -loglevel quiet -
"-i…

galbarm
- 2,441
- 3
- 32
- 52
10
votes
1 answer
Use Media Source Extensions with raw video frames
I'm trying to do live real-time streaming of H.264 video from a server to the browser.
The H.264 stream is not wrapped in an MP4 container; instead it finds its way to the browser (through web sockets) in the form of raw H.264 frames.
The…

galbarm
- 2,441
- 3
- 32
- 52
9
votes
3 answers
WebRTC video/audio streams out of sync (MediaStream -> MediaRecorder -> MediaSource -> Video Element)
I am taking a MediaStream and merging two separate tracks (video and audio) using a canvas and the WebAudio API. The MediaStream itself does not seem to fall out of sync, but after reading it into a MediaRecorder and buffering it into a video…

Jacob Greenway
- 461
- 2
- 8
- 15
9
votes
1 answer
Failed to execute 'appendBuffer' on 'SourceBuffer': The HTMLMediaElement.error attribute is not null
I am trying to stream a video file via socket.io to my client (currently using Chrome as client).
I am only getting the first frame of the video, and afterwards the 'Failed to execute appendBuffer' error appears:
Failed to execute 'appendBuffer' on…

Moti
- 462
- 2
- 6
- 18
9
votes
1 answer
Display getUserMediaStream live video with Media Source Extensions (MSE)
I am trying to display a MediaStream taken from a webcam using getUserMedia, and to relay it to a remote peer using whatever mechanism possible for it to be played (as an experiment). I am not using WebRTC directly as I want control over the raw…

Ionut Campean
- 91
- 1
- 3
9
votes
1 answer
Frame by frame decode using Media Source Extension
I've been digging through the Media Source Extension examples on the internet and haven't quite figured out a way to adapt them to my needs.
I'm looking to take a locally cached MP4/WebM video (w/ 100% keyframes and 1:1 ratio of clusters/atoms to…

Dustin Kerstein
- 450
- 5
- 13
9
votes
2 answers
MediaSource API - append/concatenate multiple videos together into a single buffer
UPDATE:
So I was able to get this to work by using the timestampOffset property (incrementing it after appending each video).
My questions now:
1) Why isn't this done properly when setting the SourceBuffer mode to 'sequence'?
2) Why is my…

Andy Hin
- 30,345
- 42
- 99
- 142
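The timestampOffset approach mentioned in the update above looks roughly like this: after each clip finishes appending, advance the SourceBuffer's timestampOffset by that clip's duration so the next clip lands after it on the media timeline. A sketch assuming, for simplicity, clips of equal known duration (the file list is a placeholder):

```javascript
// Append several clips back to back into one SourceBuffer by shifting
// timestampOffset after each append.
async function appendClips(sourceBuffer, urls, clipDurationSeconds) {
  for (const url of urls) {
    const data = await (await fetch(url)).arrayBuffer();
    sourceBuffer.appendBuffer(data);
    await new Promise((resolve) =>
      sourceBuffer.addEventListener('updateend', resolve, { once: true })
    );
    sourceBuffer.timestampOffset += clipDurationSeconds; // shift the next clip
  }
}
```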
9
votes
3 answers
mp3 support on Firefox MediaSourceExtension
I'm looking into implementing adaptive and progressive audio streaming in the browser, with no plugins.
MSE is the HTML5 API I was waiting for, available in Firefox 42, but it seems that the audio format support in Firefox is not there?…
MP3 audio is…

Hagay Lupesko
- 368
- 1
- 11