Node.js runs your JavaScript on a single thread. Therefore, as long as one long-running routine is executing, it cannot process any other piece of code. The offending piece of code in this instance is your double for loop, which takes up a lot of CPU time.
To understand what you're seeing, let me first explain how the event loop works.
Node.js's event loop evolved out of JavaScript's event loop, which in turn evolved out of the web browser's event loop. The browser event loop was originally implemented not for JavaScript but to allow progressive rendering of images. It looks a bit like this:
,-> is there anything from the network?
| | |
| no yes
| | |
| | '-----------> read network data
| V |
| does the DOM need updating? <-------------'
| | |
| no yes
| | |
| | v
| | update the DOM
| | |
'------'--------------'
When JavaScript was added, script processing was simply inserted into the event loop:
,-> is there anything from the network?
| | |
| no yes
| | |
| | '-----------> read network data
| V |
| any javascript to run? <------------------'
| | |
| no yes
| | '-----------> run javascript
| V |
| does the DOM need updating? <-------------'
| | |
| no yes
| | |
| | v
| | update the DOM
| | |
'------'--------------'
When the JavaScript engine is run outside of the browser, as in Node.js, the DOM-related parts are simply removed and the I/O becomes generalized:
,-> any javascript to run?
| | |
| no yes
| | |
| | '--------> RUN JAVASCRIPT
| V |
|     is there any I/O? <-----------'
| | |
| no yes
| | |
| | v
| | read I/O
| | |
'------'--------------'
Note that all of your JavaScript code is executed in the RUN JAVASCRIPT part.
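You can see this single RUN JAVASCRIPT phase directly with a small standalone script (unrelated to your server): a timer that is due immediately still cannot fire until the currently running synchronous code returns:

```javascript
var order = [];

// Due "immediately", but it can only fire once the current
// synchronous code has finished running:
setTimeout(function () {
    order.push('timer');
    console.log(order); // [ 'sync loop done', 'timer' ]
}, 0);

// A long synchronous loop: the event loop is stuck in the
// RUN JAVASCRIPT phase for its entire duration:
for (var i = 0; i < 1e8; i++) {}
order.push('sync loop done');
```

Even though the timeout is 0 ms, 'sync loop done' is always recorded first, because the event loop cannot get back to the timer queue until the loop finishes.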
So, what happens with your code when you make 10 connections?
connection1: node accepts your request, processes the double for loops
connection2: node is still processing the for loops, the request gets queued
connection3: node is still processing the for loops, the request gets queued
(at some point the for loop for connection 1 finishes)
node notices that connection2 is queued so connection2 gets accepted,
process the double for loops
...
connection10: node is still processing the for loops, the request gets queued
(at this point node is still busy processing some other for loop,
probably for connection 7 or something)
request1: node is still processing the for loops, the request gets queued
request2: node is still processing the for loops, the request gets queued
(at some point all of the connections' for loops finish)
node notices that response from request1 is queued so request1 gets processed,
console.log gets printed and res.send('over') gets executed.
...
request10: node is busy processing some other request, request10 gets queued
(at some point request10 gets executed)
This is why you see Node taking 10 seconds to answer 10 requests. It's not that the requests themselves are slow; it's that their responses are queued behind all the for loops, and the for loops get executed first (because they were already queued in the current turn of the event loop).
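The same queuing can be reproduced in a few lines (a self-contained simulation of the situation above, not your actual router): three handlers are all scheduled at the same instant, but because each one holds the CPU for about 100 ms, they finish one after another at roughly 100, 200 and 300 ms:

```javascript
var start = Date.now();
var finished = [];

// Busy-wait: holds the CPU just like the double for loop does:
function busy(ms) {
    var end = Date.now() + ms;
    while (Date.now() < end) {}
}

// Schedule three "requests" at the same time:
for (var i = 1; i <= 3; i++) {
    setTimeout(function () {
        busy(100);
        finished.push(Date.now() - start);
    }, 0);
}

// Report once everything has drained:
setTimeout(function () {
    console.log(finished); // roughly [ 100, 200, 300 ]
}, 500);
```

Each "request" only takes 100 ms of work, but the third one still completes at ~300 ms because it has to wait for the two busy loops queued ahead of it.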
To counter this, you need to make the for loops asynchronous to give Node a chance to get back to the event loop. There are several ways to do that: write them as a native C/C++ addon that runs independent threads; use one of the thread modules from npm to run JavaScript in separate threads; use worker_threads, the web-worker-like API that ships with modern Node.js; fork a cluster of processes to execute them; or, if parallelism is not critical, simply break the loops up with setTimeout:
router.use('/test/:id', function (req, res) {
    var id = req.params.id;
    console.log('start cpu code for ' + id);

    // Run `count` iterations of `callback`, yielding to the event loop
    // between iterations, then call `done_callback` when finished.
    // Note: `done_callback` must be passed along in the recursive call,
    // otherwise it is lost and never fires.
    function async_loop (count, callback, done_callback) {
        if (count) {
            callback();
            setTimeout(function () {
                async_loop(count - 1, callback, done_callback);
            }, 1);
        }
        else if (done_callback) {
            done_callback();
        }
    }

    var outer_loop_done = 0;
    var x1 = 0;
    var x2 = 0;
    async_loop(10000, function () {
        x1++;
        async_loop(30000, function () {
            x2++;
        }, function () {
            if (outer_loop_done) {
                outer_loop_done = 0; // guard: respond only once
                console.log('cpu code over for ' + id);
                // assumes `var request = require('request')` as in your code:
                request('http://terranotifier.duapp.com/wait3sec/' + id,
                    function (a, b, data) {
                        console.log('IO over for ' + data);
                        res.send('over');
                    }
                );
            }
        });
    }, function () {
        outer_loop_done = 1;
    });
});
The above code will process a response from request() as soon as possible rather than waiting for all of the async_loops to run to completion. It does this without threads (so there's no parallelism), simply by using event queue priority.