I have an Azure web application that sends messages to an Azure queue, and I have workers that poll the queue and process the messages as they become available. For the most part, this works fine and I am satisfied with the design.
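For context, here is roughly what the current setup looks like. This is a simplified sketch using the Azure.Storage.Queues SDK; the connection string, queue name, and `ProcessMessageAsync` are placeholders, not my real code:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;

public class QueueWorker
{
    // Placeholder connection string and queue name.
    private readonly QueueClient _queue =
        new QueueClient("<storage-connection-string>", "work-items");

    // Web app side: every user action drops a message onto the same shared queue.
    public async Task EnqueueAsync(string userId, string payload)
    {
        await _queue.SendMessageAsync($"{userId}:{payload}");
    }

    // Worker side: poll the single shared queue and process messages in arrival order.
    public async Task PollAsync()
    {
        while (true)
        {
            QueueMessage[] messages = await _queue.ReceiveMessagesAsync(maxMessages: 16);
            foreach (QueueMessage message in messages)
            {
                await ProcessMessageAsync(message.MessageText); // stand-in for the real work
                await _queue.DeleteMessageAsync(message.MessageId, message.PopReceipt);
            }

            if (messages.Length == 0)
            {
                await Task.Delay(TimeSpan.FromSeconds(1)); // back off when the queue is empty
            }
        }
    }

    private Task ProcessMessageAsync(string body) => Task.CompletedTask; // placeholder
}
```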
However, there is one scenario where it does not work so well, and I am looking for suggestions on how to handle this specific case:
Once in a while, a particular user (let's call him Bob) will perform a few actions that result in several messages being sent to the queue. This creates a backlog, and other users have to wait for Bob's messages to be processed before their own messages are handled. In effect, they are penalized because Bob sent many messages to the queue.
How can I improve my design so that a flood of messages from one user does not delay messages from other users? My first instinct was to create one queue per user (sketched below), but with a few thousand users I'm not sure that is a reasonable design.
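To illustrate that instinct, this is roughly what I have in mind. It is not something I have built; the per-user queue naming scheme and the round-robin polling loop are just assumptions to show why it feels unwieldy at a few thousand users:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;

public class PerUserQueues
{
    private const string ConnectionString = "<storage-connection-string>"; // placeholder

    // Hypothetical naming scheme: one queue per user, e.g. "user-bob", "user-alice", ...
    private QueueClient GetUserQueue(string userId) =>
        new QueueClient(ConnectionString, $"user-{userId.ToLowerInvariant()}");

    public async Task EnqueueAsync(string userId, string payload)
    {
        QueueClient queue = GetUserQueue(userId);
        await queue.CreateIfNotExistsAsync();
        await queue.SendMessageAsync(payload);
    }

    // A worker would then have to round-robin over every user's queue so that
    // one busy user cannot starve the others.
    public async Task PollAllAsync(IEnumerable<string> userIds)
    {
        foreach (string userId in userIds)
        {
            QueueClient queue = GetUserQueue(userId);
            QueueMessage[] messages = await queue.ReceiveMessagesAsync(maxMessages: 1);
            foreach (QueueMessage message in messages)
            {
                // process the message, then delete it
                await queue.DeleteMessageAsync(message.MessageId, message.PopReceipt);
            }
        }
    }
}
```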
Are there any established design patterns that address this scenario? If so, are there any C# implementations of the pattern that I could reuse?