My app uses a RabbitMQ queue to store messages, and a worker consumes those messages and inserts them into the database. The intention is to avoid stressing the database during workload peaks. The problem is that during those peaks the publish rate to the queue is really high, and the worker starts receiving more messages per second than it can process, until it crashes.
Is there any way to control the consumption rate so I can ensure the worker never receives messages faster than it can process them? Messages are not critical, so I don't mind how long they stay enqueued until the worker gets to them.
I'm using amqplib for Node.js, and this is the code I'm using for the worker:
// `open` comes from amqplib's connect(); the URL here is just a placeholder.
var amqp = require('amqplib');
var open = amqp.connect('amqp://localhost');

open.then(function(conn) {
  var ok = conn.createChannel();
  ok = ok.then(function(ch) {
    ch.assertQueue(q);                 // q holds the queue name
    ch.consume(q, function(msg) {
      if (msg !== null) {
        var message = JSON.parse(msg.content.toString());
        processMessage(message);       // inserts the message into the database
      }
    }, {noAck: true});                 // auto-ack: the broker pushes messages as fast as it can
  });
  return ok;
}).then(null, console.warn);
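To show the kind of thing I'm hoping exists: below is a rough, untested sketch of what I imagine the fix might look like, based on my (possibly wrong) understanding of amqplib's prefetch() and manual acknowledgements. The idea would be to let the broker deliver only a limited number of unacknowledged messages at a time and to ack each one only after processMessage has finished, so messages pile up in the queue instead of overwhelming the worker. Is something along these lines the right approach, or is there a better mechanism?

open.then(function(conn) {
  return conn.createChannel().then(function(ch) {
    return ch.assertQueue(q).then(function() {
      // Rough sketch (untested): allow at most 1 unacked message in flight.
      ch.prefetch(1);
      return ch.consume(q, function(msg) {
        if (msg !== null) {
          var message = JSON.parse(msg.content.toString());
          // Ack only once processMessage has finished, assuming it returns a
          // promise (or a plain value) for the database insert.
          Promise.resolve(processMessage(message)).then(function() {
            ch.ack(msg);
          });
        }
      }, {noAck: false});              // manual acks instead of auto-ack
    });
  });
}).then(null, console.warn);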