MongoDB operations are getting starved in a RabbitMQ consumer.
rabbitConn.createChannel(function(err, channel) {
  // The consumer listens for messages on, say, Queue A based on a binding key.
  channel.consume(q.queue, async function(msg) {
    await Conversations.findOneAndUpdate(
      {'_id': 'someID'},
      {'$push': {'messages': {'body': 'message body'}}},
      function(error, count) {
        // Passing a callback so that the query executes immediately, as described in the
        // Mongoose docs: http://mongoosejs.com/docs/api.html#model_Model.findOneAndUpdate
      });
  });
});
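For completeness, the same update without the trailing callback (just awaiting the query inside the async consumer) would look like the snippet below; I am not sure whether mixing await with a callback changes anything here, so flagging it in case it does:

// Inside the same async consumer callback as above, promise-only form (no trailing callback).
await Conversations.findOneAndUpdate(
  {'_id': 'someID'},
  {'$push': {'messages': {'body': 'message body'}}});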
The problem is that when a large number of messages arrive, the Mongo operations are starved and only execute once the queue has no more messages. For example, if there are 1000 messages in the queue, all 1000 messages are read first, and only then do the Mongo operations get called.
- Would running the workers in a separate Node.js process work?
Ans: I tried this, decoupling the workers from the main thread; it does not help.
- I have also written a load balancer with 10 workers, but that does not seem to help either. Is the event loop not prioritizing the Mongo operations?
Ans: It does not help either; the 10 workers read from the queue and only execute findOneAndUpdate once there is nothing left to read from the queue.
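For reference, below is roughly what the consumer would look like with explicit flow control, i.e. channel.prefetch plus a manual channel.ack after the Mongo write (both standard amqplib channel calls); the queue name and model are the same placeholders as above. Would this kind of back-pressure be the right way to stop the queue reads from starving the Mongo writes, or is the problem elsewhere?

rabbitConn.createChannel(function(err, channel) {
  // Only let RabbitMQ deliver one unacknowledged message at a time,
  // so the next message is not pushed until the current Mongo write finishes.
  channel.prefetch(1);

  channel.consume(q.queue, async function(msg) {
    await Conversations.findOneAndUpdate(
      {'_id': 'someID'},
      {'$push': {'messages': {'body': 'message body'}}});

    // Acknowledge only after the update completes; combined with prefetch(1)
    // this is what would throttle the reads from the queue.
    channel.ack(msg);
  }, {noAck: false});
});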
Any help would be appreciated.
Thank you