
We have run into a problem trying to serve a video stream.

Since the HTML5 video tag does not support UDP multicast, we are trying to reuse an already-converted ffmpeg stream and send it to more than one response, but that does not work.

The first response gets the stream fine, but the second one does not. It seems that the stream cannot be piped to another response, nor can it be cloned.

Has anyone done that before? Any ideas?

Thanks in advance!

Here's the code:

var request = require('request');
var http = require('http');
var child_process = require("child_process");
var n = 1;
var stdouts = {};

http.createServer(function (req, resp) {

  console.log("***** url ["+req.url+"], call "+n);

  if (req.url != "/favicon.ico" && req.url != "/")
  {
    var params = req.url.substring(1).split("/");

    switch (params[0])
    {
      case "VIEW":
        if (params[1] == "C2FLOOR1" || params[1] == "C2FLOOR2" || params[1] == "C2PORFUN" || params[1] == "C2TESTCAM")
          var camera = "rtsp://192.168.16.19:554/Inter/Cameras/Stream?Camera="+params[1];
        else
          var camera = "http://192.168.16.19:8609/Inter/Cameras/GetStream?Camera="+params[1];

        // Write header
        resp.writeHead(200, {'Content-Type': 'video/ogg', 'Connection': 'keep-alive'});

        if (stdouts.hasOwnProperty(params[1]))
        {
          console.log("Getting stream already created for camera "+params[1]);

          var newStdout = Object.create(stdouts[params[1]]);

          newStdout.pipe(resp);
        }
        else
        {
          // Start ffmpeg
          var ffmpeg = child_process.spawn("ffmpeg",[
            "-i",camera,
            "-vcodec","libtheora",
            "-qscale:v","7",        // video quality
            "-f","ogg",             // File format
            "-g","1",               // GOP (Group Of Pictures) size
            "-"                     // Output to STDOUT
          ]);

          stdouts[params[1]] = ffmpeg.stdout;

          // Pipe the video output to the client response
          ffmpeg.stdout.pipe(resp);

          console.log("Initializing camera at "+camera);
        }

        // Kill the subprocesses when client disconnects
        /*
        resp.on("close",function(){
          ffmpegs[params[1]].kill();
          console.log("FIM!");
        });
        */
        break;
    }
  }
  else
  {
    resp.writeHeader(200, {"Content-Type": "text/html"});
    resp.write("WRONG CALL");
    resp.end();
  }
  n++;

}).listen(8088);

console.log('Server running at port 8088');
SergioBR
1 Answer


Streams can be thought of as fixed-size queues: they "buffer" (store) a certain pre-defined amount of data before they either refuse to accept more or let data fall off the reading end, and they hold that data in "first in, first out" (FIFO) order.

Writing to a stream makes data available to be consumed by all of the stream's current readers (those entities which have opened the stream for reading.)

Reading from a stream removes a fixed amount of data from one end of the stream, freeing up space on the other end to potentially accept more data.

Once data are read from the stream and more data have been added to the stream to fill its buffer, those data which have been read are gone.

Node.js streams can indeed have multiple readers at the same time, each of which has its own pointer into the stream's buffer indicating how much data it has consumed; but if you add a new reader after the stream has already flushed data out of its buffer, those data are no longer available.
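
As a minimal illustration (using Node's built-in PassThrough stream as a stand-in for any readable stream): data come out in the order they went in, and once read they are gone.

var stream = require('stream');

var queue = new stream.PassThrough();
queue.write('first ');
queue.write('second');
queue.end();

queue.once('readable', function () {
  console.log(queue.read().toString());  // "first second" -- FIFO order
  console.log(queue.read());             // null -- those data have already been consumed
});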

I believe the browser timeouts you're seeing when a subsequent request attempts to read from a stream are occurring because the stream has been exhausted of data and further attempts to read from the stream are waiting until new data become available.

In essence, it appears you are attempting to use streams as if they were long lived data caches, which they are not.

You could try increasing the "high water mark" of your streams so they would buffer an entire video stream in memory which would allow subsequent readers to read the entire data stream from memory without exhausting the stream's buffer. (See node's stream documentation for how to increase a stream's buffer size.)
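
For example, one way to do this would be to funnel the ffmpeg output through an intermediate PassThrough stream created with a larger highWaterMark (the 64 MB figure below is an arbitrary illustration, not a recommendation):

var stream = require('stream');

// Intermediate buffer with a large high water mark
var buffered = new stream.PassThrough({ highWaterMark: 64 * 1024 * 1024 });

ffmpeg.stdout.pipe(buffered);  // "ffmpeg" is the child process spawned in the question's code
buffered.pipe(resp);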

However, I doubt you'll see much improvement in response by doing this over, say, increasing the size of your operating system's I/O buffers or reading data from an in-memory or solid-state hard drive.

Update (per comment)

If you need to provide one or more continuous (live) video feeds to multiple viewers over time, have you considered creating and initializing your video data streams in "flowing" mode and relying on the events they emit to deliver data to your users?

I'd think about

  • creating a stream for each live feed in the global context (to ensure the stream will persist between requests) and in "flowing" (or event-driven) mode
  • assigning the new stream to a global feeds object under a unique feedName
  • setting a Stream.end event listener on it to reinitialize the stream if the feed ends unexpectedly

Then in the HTTP Request handler

  • finding the requested feed in feeds at feedName
  • registering event listeners on it for
    1. Stream.readable - to assure there are data to be read
    2. Stream.data - to pipe data to the response
    3. Stream.end and http.ServerResponse.close - to gracefully unregister the previous two handlers

I'd also set a timer to unregister all of the events registered in your request handler and send an appropriate error to the client should data not be forthcoming from the stream within some reasonable amount of time. A rough sketch of the whole approach follows below.

("flowing" mode is described in the node.js Stream documentation.)

Rob Raisch
  • Hi Rob, thanks for your answer, we are aware of that, and we don't need to cache the stream for others to pick up past data, we are streaming live camera videos, so new browsers will pick up the stream at their present point;
    – SergioBR Aug 30 '13 at 12:03
  • continuing... (had some problem on the net) we don't need to cache the stream for others to pick up past data; we are streaming live camera videos, so new browsers will pick up the stream at their present point, meaning they consume the stream data at the same time as the other request. The problem is that trying to save the stream reference (as shown in the code) and use the same stream to pipe to another response doesn't seem to work... To make things short, we need something like multicast to use with HTML5. Thank you! – SergioBR Aug 30 '13 at 12:47