
How can I save chunks of a stream, converted into blobs, to my Node.js server in real time?

client.js | I am sending my cam stream as binary to my Node.js server

    handleBlobs = async (blob) => {
        let arrayBuffer = await new Response(blob).arrayBuffer()
        let binary = new Uint8Array(arrayBuffer)
        this.postBlob(binary)
    };

    postBlob = blob => {
        axios.post('/api', { blob })
            .then(res => {
                console.log(res)
            })
    };

server.js

app.post('/api', (req, res) => {
    console.log(req.body)
});

How can I store the incoming blobs or binary data as a single video file once the recording is complete?

Nane
  • Check this out: https://medium.com/@daspinola/video-stream-with-node-js-and-html5-320b3191a6b6 It looks like he does something similar to what you want. – Michele Jul 03 '19 at 13:31
  • Thanks for sharing I've already read that article but I am getting chunks of data from the client so this article might not help – Nane Jul 03 '19 at 14:02
  • Hi Nane, this looks like a duplicate of https://stackoverflow.com/questions/56826079/how-to-concat-chunks-of-incoming-binary-into-video-webm-file-node-js. This has a bounty on it so not sure if you can delete it now... if not, I'm happy to provide my answer here as well :) :) – willascend Jul 05 '19 at 19:20

2 Answers


This appears to be a duplicate of How to concat chunks of incoming binary into video (webm) file node js?, but it doesn't currently have an accepted answer. I'm copying my answer from that post into this one as well:

I was able to get this working by converting to base64 encoding on the front-end with the FileReader API. On the backend, create a Buffer from each data chunk sent and write it to a file stream. Some key things with my code sample:

  1. I'm using fetch because I didn't want to pull in axios.
  2. When using fetch, you have to make sure you use bodyParser on the backend.
  3. I'm not sure how much data you're collecting in your chunks (i.e. the duration value passed to the start method on the MediaRecorder object), but you'll want to make sure your backend can handle the size of the data chunk coming in. I set mine really high, at 50MB, but this may not be necessary.
  4. I never close the write stream explicitly... you could potentially do this in your /final route (see the sketch after the back-end code below). Otherwise, createWriteStream defaults to autoClose, so the node process will do it automatically.

Full working example below:

Front End:

const mediaSource = new MediaSource();
mediaSource.addEventListener('sourceopen', handleSourceOpen, false);
let mediaRecorder;
let sourceBuffer;

const INT_REC = 1000; // timeslice in ms passed to start(); pick whatever chunk duration suits your app

function customRecordStream(stream) {
  // should actually check that the given mimeType is supported in this browser (see helper below)
  let options = { mimeType: 'video/webm;codecs=vp9' };
  mediaRecorder = new MediaRecorder(stream, options);
  mediaRecorder.ondataavailable = postBlob;
  mediaRecorder.start(INT_REC);
}

function postBlob(event){
  if (event.data && event.data.size > 0) {
    sendBlobAsBase64(event.data);
  }
}

function handleSourceOpen(event) {
  // the codec here should match what the MediaRecorder produces
  sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp9"');
}

function sendBlobAsBase64(blob) {
  const reader = new FileReader();

  reader.addEventListener('load', () => {
    const dataUrl = reader.result;
    // strip the "data:<mime>;base64," prefix, keeping only the payload
    const base64EncodedData = dataUrl.split(',')[1];
    console.log(base64EncodedData)
    sendDataToBackend(base64EncodedData);
  });

  reader.readAsDataURL(blob);
};

function sendDataToBackend(base64EncodedData) {
  const body = JSON.stringify({
    data: base64EncodedData
  });
  fetch('/api', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body
  }).then(res => {
    return res.json()
  }).then(json => console.log(json));
}; 
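As the comment in customRecordStream notes, it's safer to probe for a supported mimeType before constructing the recorder. A minimal helper sketch (the candidate list and helper name are just an example, not part of the original answer):

// pick the first container/codec combination this browser can actually record
function pickSupportedMimeType() {
  const candidates = [
    'video/webm;codecs=vp9',
    'video/webm;codecs=vp8',
    'video/webm',
  ];
  return candidates.find(type => MediaRecorder.isTypeSupported(type)) || '';
}

// e.g. let options = { mimeType: pickSupportedMimeType() };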

Back End:

const fs = require('fs');
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const server = require('http').createServer(app);

app.use(bodyParser.urlencoded({ extended: true }));
// raise the JSON body limit so large base64 chunks aren't rejected
app.use(bodyParser.json({ limit: '50mb', type: 'application/json' }));

app.post('/api', (req, res) => {
  try {
    const { data } = req.body;
    // decode the base64 payload back into raw bytes
    const dataBuffer = Buffer.from(data, 'base64');
    // open in append mode so each chunk lands at the end of the same file
    const fileStream = fs.createWriteStream('finalvideo.webm', { flags: 'a' });
    fileStream.write(dataBuffer);
    console.log(dataBuffer);
    return res.json({ gotit: true });
  } catch (error) {
    console.log(error);
    return res.json({ gotit: false });
  }
});

server.listen(3000);
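If you do want to close the stream explicitly rather than rely on autoClose, here is one sketch of the /final route mentioned in point 4 above; the shared-stream pattern is an assumption on my part, not part of the original answer:

// sketch: keep one write stream open across requests and end it when the
// client signals that recording is finished
const sharedStream = fs.createWriteStream('finalvideo.webm', { flags: 'a' });

app.post('/final', (req, res) => {
  // flush any pending writes, then close the file descriptor
  sharedStream.end(() => res.json({ closed: true }));
});

If you take this route, the /api handler would write each chunk to sharedStream instead of opening a new stream per request.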
willascend
  • Hello @willascend, your source code is working fine. But there seems to be a problem. When I play the saved finalvideo.webm file, I only hear a crashing sound, and the correct sound and video are not played. Is there any solution for this? Or am I misunderstanding your source code? – user3662974 Mar 26 '22 at 06:21
  • It should work without any issue. This was also written in 2019 - not sure if things have changed. Try creating a new question to see if you can get your issue solved that way. – willascend Mar 26 '22 at 13:42

Without attempting to implement this (sorry, no time right now), I would suggest the following:

  1. Read into Node's Stream API: the express request object is an http.IncomingMessage, which is a Readable stream. It can be piped into another stream-based API. https://nodejs.org/api/stream.html#stream_api_for_stream_consumers

  2. Read into Node's Filesystem API: it contains functions such as fs.createWriteStream that can handle a stream of chunks and append them to a file at a path of your choice. https://nodejs.org/api/fs.html#fs_class_fs_writestream

  3. Once the stream to the file completes, as long as the filename has the correct extension, the file should be playable, because the Buffer sent from the browser is just a binary stream (see the sketch after this list). Further reading into Node's Buffer API will be worth your time. https://nodejs.org/api/buffer.html#buffer_buffer
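A minimal sketch of that piping approach, assuming the client POSTs the raw Blob bytes as the request body (not JSON-wrapped as in the question) and that the output filename is arbitrary:

const fs = require('fs');
const express = require('express');
const app = express();

// req is a Readable stream (http.IncomingMessage), so the raw body can be
// piped straight into an appending write stream without buffering in memory
app.post('/api', (req, res) => {
  const fileStream = fs.createWriteStream('recording.webm', { flags: 'a' });
  req.pipe(fileStream);
  // 'finish' fires after the piped data has been flushed to disk
  fileStream.on('finish', () => res.sendStatus(200));
});

app.listen(3000);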

wntwrk