
I'm writing some testing code in Node.js that repeatedly sends HTTP POST requests to a web server. In simplified form:

var http = require('http');

function doPost(opts, data) {
    var post_req = http.request(opts, function(res) {
        res.setEncoding('utf8');
        res.on('data', function (chunk) { });
    });
    post_req.write(JSON.stringify(data));
    post_req.end();
}
setInterval(doPost, interval, opts, msg);

I'd prefer that these requests be issued sequentially, i.e. that a subsequent POST is not sent until the previous POST has received a response.

My question is: due to the non-blocking architecture of the underlying libuv library used by the runtime, is it possible that this code sends one POST out over the connection to the web server, but is then able to send another POST even though the response from the server has not yet arrived?

If I imagine this with a select() loop, I'd be free to call write() for the second POST and just get EWOULDBLOCK. Or, if the network drops, will it just build up a backlog of POST requests queued up to the I/O thread pool? It's unclear to me what behavior I should expect in this case. Is there something I must do to enforce completion of a POST before the next POST can start?

  • did you look into the async library? https://www.npmjs.com/package/async `eachSeries` is the function you want to look at. – aschmid00 Jul 15 '16 at 18:17

2 Answers


Inherently, Node.js runs on a single thread; to run multiple processes you'll have to use clusters, which are somewhat akin to multi-threading in Java (see the Node.js documentation on clusters). For example, your code would look something like this:

var cluster = require('cluster');
var numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
  // Fork workers.
  for (var i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
  cluster.on('exit', function(worker, code, signal) {
    console.log('worker ' + worker.process.pid + ' died');
  });
}
else {
  // Each worker runs the doPost code from the question.
  doPost(opts, data);
}
criticalJ
  • Sorry, I think my question gave the wrong impression; I'm not trying to achieve parallel posts, I think I'd prefer subsequent POST sends to block until I receive the 200 OK response for the prior post. I just want to clarify whether that's how it works with Node.js, because based on what I know of non-blocking IO, there isn't anything guaranteeing that. – zcombinator Jul 15 '16 at 15:01
  • @zcombinator, If you want asynchronous control you'll have to use an external library. [Async](https://github.com/caolan/async) is great for tool for asynchronous control flow. – criticalJ Jul 18 '16 at 13:00
0

I think I've found my answer. I ran some tests under packet capture and found that when the network drops, it's important to throttle your POST requests; otherwise requests get enqueued to the I/O pool and, depending on the state of connectivity, some may send, others may not, and message order gets mangled.
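One simple way to throttle is to serialize the requests yourself, in the same spirit as the `eachSeries` function from the async library that aschmid00 mentioned: start the next POST only after the previous one's callback fires. A minimal hand-rolled sketch (the `postSeries` and `sendOne` names are mine, and the demo uses a fake async sender instead of a real HTTP call so it's self-contained):

```javascript
// Sketch: run one send at a time; the next starts only when the
// previous send's callback fires.
function postSeries(messages, sendOne, done) {
  var i = 0;
  function next(err) {
    if (err || i >= messages.length) return done(err);
    sendOne(messages[i++], next);
  }
  next();
}

// Demo with a fake async sender standing in for a real doPost that
// would invoke its callback on the response's 'end' event.
var order = [];
postSeries([1, 2, 3], function (msg, cb) {
  setTimeout(function () {
    order.push(msg);          // record completion order
    cb();
  }, 5);
}, function () {
  console.log('sent in order:', order.join(','));
});
```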