Suppose I have an event listener on a network connection where the underlying protocol guarantees that messages arrive in order, for example Node.js' net TCP socket or a RabbitMQ AMQP connection with a high prefetch value.
I want a way to process the messages serially, in the order they arrive from the event listener.
Here's a rough sketch of the solution I've come up with, exploiting Node.js' single-threaded event loop behaviour. I believe it will always work from a theoretical standpoint, unless I've missed something.
Are there any drawbacks to this approach, and can we do better? One issue I can see immediately is that the stack of the recursive call could grow and exceed the limit if we receive a lot of messages in quick succession.
let beingProcessed = false;
const queue = new Queue(); // synchronous in-memory queue

socket.on("messageFromTCPSocketOrAMQP", (msg) => { // sync callback
  // Very important that this callback does no async work before inserting into
  // the queue, to guarantee that the next message won't be pushed first.
  queue.push(msg);
  processSerially();
});
// sync function
function processSerially() {
  if (beingProcessed || queue.size() === 0) {
    return;
  }
  beingProcessed = true;
  // this function can be made to never throw, depending on the use case
  doSomeTaskAsyncWithMessage(queue.pop(), () => {
    beingProcessed = false;
    // this recursive call in the callback is the reason why this works
    processSerially();
  });
}