
We have a requirement to implement queuing so that we can control concurrency and avoid overloading the database. Since we are in the Microsoft world, we have chosen MSMQ to store the requests. We plan to create hundreds of queues (one per customer) and process them in parallel to achieve maximum concurrency.

Our main design goal is to allow only one call per customer/queue at a time and process each customer independently.

We are stuck on choosing the right technology to process the queues. We see two options:

  1. Create a .NET Windows service and process multiple queues using the queue object's ReceiveCompleted event. Here we have the option to attach the same handler method to multiple queues, but we would need to add custom concurrency code (some sync lock) to enforce one call at a time per queue. Reference1

  2. Host multiple WCF services in WAS, one per queue (the same code copied for each queue, with only the queue name changed), and implement throttling to limit concurrency per service. It looks like a WCF service can process only one queue. Reference2
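For context on option 2, limiting each WCF service to one call at a time would normally be done with the standard serviceThrottling behavior in configuration. A minimal sketch (the behavior name is illustrative):

```xml
<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <!-- Hypothetical behavior name; apply it to each per-queue service. -->
      <behavior name="OneAtATime">
        <!-- Allow only one concurrent call and one service instance,
             so messages from the bound queue are processed serially. -->
        <serviceThrottling maxConcurrentCalls="1" maxConcurrentInstances="1" />
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>
```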

Which one would you suggest? Is there a better way? Please help us!

Sonu

1 Answer


Your option #1 is simpler and more straightforward. It should also perform well if you follow the same pattern as the code in your referenced MSDN link. That pattern uses asynchronous MSMQ calls, which are serviced by a worker thread pool. This eliminates the need to constantly poll the queues for newly arrived messages and results in essentially zero computational overhead for idle queues.

Ensuring that no more than one message per queue is being processed at a time is also quite easy. Simply do not call BeginReceive() on a message queue until after you have completed processing that queue's previous message. No custom concurrency code is required.
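A minimal sketch of that per-queue, one-message-at-a-time pattern, along the lines of the MSDN sample (class and method names here are illustrative, not from your code):

```csharp
using System.Messaging;

// One worker per customer queue; create hundreds of these and they all
// share the thread pool, with idle queues costing essentially nothing.
class QueueWorker
{
    private readonly MessageQueue _queue;

    public QueueWorker(string queuePath)
    {
        _queue = new MessageQueue(queuePath);
        _queue.ReceiveCompleted += OnReceiveCompleted;
        _queue.BeginReceive(); // start the first asynchronous receive
    }

    private void OnReceiveCompleted(object sender, ReceiveCompletedEventArgs e)
    {
        Message message = _queue.EndReceive(e.AsyncResult);
        ProcessMessage(message); // finishes before the next receive is issued
        _queue.BeginReceive();   // only now ask for this queue's next message
    }

    private void ProcessMessage(Message message)
    {
        // database work for this customer goes here
    }
}
```

Because each worker issues BeginReceive() only after the previous message has been handled, at most one message per queue is ever in flight, while different queues still process in parallel.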

Dan Hermann