
I push 100 tasks to the queue on the client side like this:

var ref = firebase.database().ref('queue/tasks');
for (var i = 0; i < 100; i++) {
  ref.push({ 'foo': 'bar', 'i': i });  // push a simple test task
}

On the server, my worker looks like this:

var Queue = require('firebase-queue');
var queue = new Queue(ref, function(data, progress, resolve, reject) {
  console.log(data);
  resolve();
});

The issue is that processing all the tasks takes about 60 seconds, which is way too slow. Is there any way my worker can receive the tasks faster? I want to use the queue for my clients to send requests to the server, but at the current speed of the queue I will not be able to support many concurrent users. I am looking to support 50k concurrent users.


1 Answer


One thing I noticed is that you aren't passing in any options, and you can set the number of workers that will pull simultaneously from your Queue:

var options = {
  'numWorkers': 5,
  'sanitize': false,
  'suppressStack': true
};

var queue = new Queue(ref, options, function(data, progress, resolve, reject) {
  console.log(data);
  resolve();
});

I did notice a speed increase on my machine: with 1 worker my process ran through the 100 tasks in about 9 seconds, while with 5 concurrent workers it finished in about 5 seconds.

I'm on a brand new 13" MacBook Pro with an i7 chip.

I used performance-now for my benchmarks, adding this before the Queue is set up:

var start = now();

and this after the resolve() inside the Queue's processing function:

console.log(((now()-start)/1000).toFixed(3));
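
Putting the pieces together, a minimal sketch of the full benchmark could look like this (assuming firebase-queue and performance-now are installed, the Firebase app is already initialized, and ref points at the same queue location as above; the processed counter that stops the timer after 100 tasks is only for illustration):

var Queue = require('firebase-queue');
var now = require('performance-now');

var options = {
  'numWorkers': 5,
  'sanitize': false,
  'suppressStack': true
};

var start = now();   // start the clock before the Queue is set up
var processed = 0;   // illustrative counter, not part of the original benchmark

var queue = new Queue(ref, options, function(data, progress, resolve, reject) {
  resolve();
  processed++;
  if (processed === 100) {
    // elapsed seconds once all 100 test tasks have been handled
    console.log(((now() - start) / 1000).toFixed(3));
  }
});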
  • 9 seconds for 100 tasks is still very high compared to requests served directly by the Node server. I don't think the limitation here is the worker/server machine, since the rate at which it receives the requests is itself very slow. – Pankaj Jan 14 '17 at 05:47
  • I agree that it is still slow, and at that type of scale, I'm sure you need to consider other solutions for your message queue. – Timothy Johnson Jan 14 '17 at 21:11