3

The situation is pretty straightforward: I am receiving a stream of NAL units via WebSockets. How do I feed them into an HTML5 video tag using MSE?

Research indicates that I should mux the data into a fragmented mp4, but I haven't found any specifics on how to accomplish that. Does anyone have specifics?

cdbfoster
  • 339
  • 4
  • 13
  • Can you reproduce example stream of NAL units? – guest271314 Oct 05 '16 at 03:42
  • I don't understand your question. – cdbfoster Oct 05 '16 at 03:44
  • Can you reproduce receiving stream of NAL units at plnkr https://plnkr.co ? Are the NAL units received as an `ArrayBuffer`? See also http://stackoverflow.com/questions/38081377/unable-to-stream-video-over-a-websocket-to-firefox/ – guest271314 Oct 05 '16 at 03:45
  • I don't think I could reproduce an example very easily; I'm receiving them from a private server. Yes, I have them in an `ArrayBuffer`, parsed out of a blob. Edit: That link just demonstrates receiving mp4 data and playing it. I'm receiving h.264 data and I need to mux it into mp4 (or something) in order to play it. – cdbfoster Oct 05 '16 at 03:49
  • _"parsed out of a blob"_ Have you tried creating a `Blob URL` from the `Blob` to set at the `<video>` element? – guest271314 Oct 05 '16 at 03:59
  • Again, I don't have an mp4. I have an h.264 stream, which, as far as I understand, is not directly playable by the browser without some other container. – cdbfoster Oct 05 '16 at 04:08
  • If you have a `Blob`, you can use `URL.createObjectURL()` to create a `Blob URL` of the stream at that point, to set as `src` of the `<video>` element – guest271314 Oct 05 '16 at 04:11
  • @cbfoster [Video decoding in OSH JS Toolkit](https://opensensorhub.org/2016/09/16/video-decoding-in-osh-js-toolkit/) – guest271314 Oct 05 '16 at 05:31
  • *Again*, setting the stream source as the blob is only useful if the browser understands the data. It's not mp4, webm, or ogg, and so it won't. **I need to mux the h.264 data into mp4**. Edit: Didn't see your newest link. I'll take a look – cdbfoster Oct 05 '16 at 05:32
  • you CAN'T. You must multiplex the elementary stream in some container like mp4. – Michael IV Nov 12 '17 at 22:24
  • ...about a year late, and the need to use a container like mp4 is stated directly in the original question... – cdbfoster Nov 14 '17 at 15:13

1 Answer

3

If you receive stream data such as HLS segments or raw H.264 NAL units, you can transmux it into a fragmented MP4 (fMP4). Then wire the HTML5 video tag up to MSE: create a `MediaSource`, call `mediaSource.addSourceBuffer()`, and push segments with `sourceBuffer.appendBuffer()`. The video will play as long as valid fMP4 fragments keep feeding the buffer.
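The MSE side of that can be sketched roughly as follows. This is a minimal browser-only sketch under assumptions: each WebSocket message is a complete, already-transmuxed fMP4 fragment (`moof` + `mdat`), the WebSocket URL is a placeholder, and the `avc1` codec string must be changed to match your stream's actual profile/level:

```javascript
// Assumed codec string -- replace with the profile/level of your H.264 stream.
const MIME_CODEC = 'video/mp4; codecs="avc1.42E01E"';

function playStream(video, wsUrl) {
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener('sourceopen', () => {
    const sourceBuffer = mediaSource.addSourceBuffer(MIME_CODEC);
    const queue = [];

    // Only one appendBuffer() may be in flight at a time, so incoming
    // fragments are queued and the queue is drained on each 'updateend'.
    sourceBuffer.addEventListener('updateend', () => {
      if (queue.length > 0 && !sourceBuffer.updating) {
        sourceBuffer.appendBuffer(queue.shift());
      }
    });

    const ws = new WebSocket(wsUrl);
    ws.binaryType = 'arraybuffer'; // receive binary frames as ArrayBuffer
    ws.onmessage = (event) => {
      // event.data is assumed to be one complete fMP4 fragment.
      if (sourceBuffer.updating || queue.length > 0) {
        queue.push(event.data);
      } else {
        sourceBuffer.appendBuffer(event.data);
      }
    };
  });
}

// Usage (in a page containing a <video> element):
//   playStream(document.querySelector('video'), 'wss://example.com/stream');
```

Note the hard part is not shown here: producing those fMP4 fragments from raw NAL units (writing the `ftyp`/`moov` init segment and per-fragment `moof`/`mdat` boxes), which is exactly what a transmuxer library does for you.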

You may check out https://github.com/ChihChengYang/wfs.js, which demonstrates transmuxing an H.264 NAL unit stream received over a WebSocket. It works directly on top of a standard HTML5 `<video>` element and MSE.

Mr. J
  • 66
  • 4