
I have a background process which generates lots of output, and I want to send that output to the client as it becomes available. My expectation is that the client sends an HTTP GET/POST request, the server opens a connection and keeps it alive, and then the server keeps streaming data as it becomes available.

Real-world example: when you run a test or shell command on AWS/Heroku/Google App Engine, the output shows up in real time, as if the command were running on your local machine. How do they do that?


Server: In this sample server, the process generates a message every second.

const http = require('http')
const { execSync } = require('child_process')

function handleRequest (request, response) {
  for (let i = 0; i < 10; i++) {
    response.write('This is message.' + i)
    execSync('sleep 1') // block the event loop for one second between writes
  }
  response.end()
}

http.createServer(handleRequest).listen(3000) // port chosen arbitrarily for this example
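For comparison, here is a sketch of what an asynchronous variant of the same handler might look like (an assumption, not what I actually run): setInterval replaces the blocking execSync, so the event loop stays free and each chunk can be flushed before the next one is produced.

const http = require('http')

function handleRequestAsync (request, response) {
  let i = 0
  const timer = setInterval(function () {
    response.write('This is message.' + i)
    i++
    if (i === 10) {
      clearInterval(timer)
      response.end()
    }
  }, 1000)
}

http.createServer(handleRequestAsync).listen(3000) // same hypothetical port as above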


Client: The client should receive the data as it becomes available, i.e. one message at a time. But the output is of course not what I expected: the entire payload is collected and sent back to the client at once.

// 'request' here is the npm request library; formData holds the request
// options (URL, form fields, etc.) defined elsewhere.
const request = require('request')

request.put(formData)
       .on('data', function (data) {
         console.log(data.toString())
       })
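For what it's worth, the same check with Node's built-in http module looks roughly like this (assuming the sample server above is listening on localhost:3000); each 'data' event should fire as soon as a chunk reaches the client.

const http = require('http')

http.get('http://localhost:3000', function (response) {
  // Each chunk is logged as soon as it arrives, so with a streaming
  // server the messages should print one by one.
  response.on('data', function (chunk) {
    console.log(chunk.toString())
  })
  response.on('end', function () {
    console.log('server finished streaming')
  })
})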


I am pretty new to Node.js, but I am hoping I can return some form of writable stream to the client, so that as data is written to this stream on the server side, the client receives it on its end. Is that possible?
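From what I can tell, the response object is already a writable stream, so one pattern that looks promising (just a sketch, with ping standing in for the real background process) is to pipe a child process's stdout straight into the response:

const http = require('http')
const { spawn } = require('child_process')

http.createServer(function (request, response) {
  // 'ping' is a stand-in for the real long-running command; its stdout is a
  // readable stream that produces output over time.
  const child = spawn('ping', ['-c', '5', 'example.com'])

  // response is itself a writable stream; pipe() forwards each chunk as it
  // is produced and ends the response when the command finishes.
  child.stdout.pipe(response)
}).listen(3000)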

Rash
  • I think you are talking about WebSockets – Jarek Kulikowski Jul 16 '17 at 03:39
  • @JarekKulikowski But this is not a web application and I do not want my clients to keep on sending requests to see if more data is available. Since `response` object has `on()` method, I thought sending data in chunks from server would be very easy. – Rash Jul 16 '17 at 03:42
  • In the case of WebSockets the client (browser or another server) opens a connection and waits for data from the server. The difference between WebSockets protocol (ws://) and http:// is that ws:// connection remains open indefinitely. I've only used WebSockets between browser and WebServer, but, if I am not mistaken it would be just as easy to set it up between two node servers. – Jarek Kulikowski Jul 16 '17 at 03:49
  • @JarekKulikowski I am considering websockets, but before that I want to understand something: say I have a huge file, so on the server I call `fs.createReadStream()` and pipe that to `response`, and on the client I do `request.on('data', callback)`. In this scenario the data is chunked and delivered to the client `as it is being read` (see the sketch after these comments). My question is: how is my scenario any different from this? – Rash Jul 16 '17 at 03:54
  • I don't think it is different in principle, but I don't think I can answer this question as I've never moved large amounts of data over WebSockets. If you are connecting two node machines as client<->server, then WebSockets don't necessarily help because you can do the same thing via http in a more stable fashion. I think this link could help https://stackoverflow.com/questions/39335686/what-is-the-most-efficient-way-of-sending-files-between-nodejs-servers – Jarek Kulikowski Jul 16 '17 at 04:58
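A sketch of the file-streaming pattern described in the comment above (the file path is just a placeholder):

const http = require('http')
const fs = require('fs')

http.createServer(function (request, response) {
  // './large-file.log' is a placeholder path; each chunk read from disk is
  // forwarded to the client as soon as it is available.
  fs.createReadStream('./large-file.log').pipe(response)
}).listen(3000)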

0 Answers