I am considering using Service Broker for messaging between two .NET apps, where one app needs to reliably send messages to the other. I have a pretty good idea of how this looks on the initiator side.

However, the consumer app needs to load messages from the queue, do some processing, and then acknowledge that processing went well. The processing might fail, in which case I need to retry the message. I would also need to keep all messages that have been processed, for history purposes.

As far as I can tell, the message is removed from the queue when I do a RECEIVE. So what should the consumer app do in case of a failure? Push the message onto the queue again? Can I mark a message as "in processing" and then remove it once my task has completed successfully?

How would I model this using Service Broker?

Martin Brown
driis
  • Service Broker is a great thing - if you need messaging inside SQL Server or between SQL Server instances. For comms between two .NET apps, I would look at WCF rather than SQL Server Service Broker... – marc_s Oct 20 '11 at 20:41
  • I need an async queue that is persistent. If either or both apps die, the enqueued items cannot be lost. Is there anything in WCF that will give me that? MSMQ or other distributed message queues are (unfortunately) not an option for this project at the moment. – driis Oct 20 '11 at 21:08
  • Why is MSMQ not an option?? It's there, it's free with every Windows Server, it integrates nicely with WCF ... it would be the perfect fit for your needs, I believe... – marc_s Oct 21 '11 at 04:50
  • @marc_s, don't ask, I'd love to use the right tool for the job. Unfortunately I am constrained by stuff that is out of my hands at the moment. – driis Oct 21 '11 at 08:09

1 Answer

You can receive within a transaction and roll back the transaction in case of error. Beware, though, that the built-in poison message handling will disable the queue after five consecutive rollbacks.
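
A minimal sketch of what that looks like from the .NET consumer side, assuming a queue named `dbo.TargetQueue` and a `Process` method standing in for the application work (both names are illustrative, not from the question):

```csharp
// Receive one message inside a transaction; commit on success so the message
// is permanently removed, roll back on failure so it goes back on the queue.
using System.Data.SqlClient;

static class Consumer
{
    public static void ReceiveOne(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                var cmd = new SqlCommand(
                    @"WAITFOR (
                          RECEIVE TOP (1)
                              conversation_handle, message_type_name, message_body
                          FROM dbo.TargetQueue
                      ), TIMEOUT 5000;", conn, tx);

                byte[] body = null;
                bool received = false;
                using (var reader = cmd.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        received = true;
                        body = reader.IsDBNull(2) ? null : (byte[])reader.GetValue(2);
                    }
                }

                if (!received)
                {
                    tx.Commit();     // queue was empty within the timeout
                    return;
                }

                try
                {
                    Process(body);   // application work, e.g. insert into a history table
                    tx.Commit();     // the RECEIVE becomes permanent; message is gone
                }
                catch
                {
                    tx.Rollback();   // the message is back on the queue and will be retried
                    throw;           // repeated rollbacks trigger poison message handling
                }
            }
        }
    }

    static void Process(byte[] body) { /* application-specific */ }
}
```

In a real consumer you would also inspect `message_type_name` (end-dialog and error messages need their own handling) and issue END CONVERSATION at the appropriate point.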

Ben Thul
  • Each individual task might take seconds to complete. Wouldn't it be bad practice and cause contention in the queue to keep a transaction open for that long? – driis Oct 20 '11 at 21:01
  • 1
    @driis: Queues and RECEIVE in particular are designed to work well in the presence of long lived transactions: enqueue of new messages will not conflict with locks hold by the RECEIVE transaction, and concurrent RECEIVE can proceed in parallel. See http://msdn.microsoft.com/en-us/library/ms171615.aspx – Remus Rusanu Oct 20 '11 at 21:50
  • If processing is *really* long, then you should dequeue, update some state table, **enqueue a timer**, commit, and start processing. If you succeed, reset the timer. If you fail, the timer will fire and your app logic will read the saved state, **enqueue a new timer**, commit, and attempt again (sketched below). See http://msdn.microsoft.com/en-us/library/ms187804.aspx – Remus Rusanu Oct 20 '11 at 21:53
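
A rough sketch of the conversation-timer pattern described in the comment above, assuming a state table `dbo.MessageState`, the same `dbo.TargetQueue` as before, and a 60-second timeout (table, columns, and timeout are illustrative, not from the thread):

```csharp
// Short transaction: receive, save state, arm a conversation timer, commit.
// The slow work then runs outside any queue transaction; if it does not finish
// in time, a DialogTimer message fires and the app retries from the saved state.
using System;
using System.Data.SqlClient;

static class LongRunningConsumer
{
    // Returns the conversation handle, or null if the queue was empty.
    public static Guid? DequeueAndArmTimer(SqlConnection conn)   // conn is assumed open
    {
        using (var tx = conn.BeginTransaction())
        {
            Guid handle;
            byte[] body;

            var receive = new SqlCommand(
                @"RECEIVE TOP (1) conversation_handle, message_body
                  FROM dbo.TargetQueue;", conn, tx);
            using (var reader = receive.ExecuteReader())
            {
                if (!reader.Read())
                {
                    reader.Close();
                    tx.Commit();
                    return null;
                }
                handle = reader.GetGuid(0);
                body = (byte[])reader.GetValue(1);
            }

            // Persist enough state to resume or retry the work later.
            var save = new SqlCommand(
                @"INSERT INTO dbo.MessageState (conversation_handle, message_body, status)
                  VALUES (@h, @b, 'in progress');", conn, tx);
            save.Parameters.AddWithValue("@h", handle);
            save.Parameters.AddWithValue("@b", body);
            save.ExecuteNonQuery();

            // Arm the retry alarm: if the work is not marked done within 60 seconds,
            // a DialogTimer message arrives on the queue for this conversation.
            var timer = new SqlCommand(
                "BEGIN CONVERSATION TIMER (@h) TIMEOUT = 60;", conn, tx);
            timer.Parameters.AddWithValue("@h", handle);
            timer.ExecuteNonQuery();

            tx.Commit();   // short transaction: no long-held locks on the queue
            return handle;
        }
    }

    // Called after the slow work succeeds. When the DialogTimer later fires,
    // the receive loop sees status = 'done' and simply ends the conversation;
    // if the status is still 'in progress', it re-arms the timer and retries.
    public static void MarkDone(SqlConnection conn, Guid handle)
    {
        var done = new SqlCommand(
            "UPDATE dbo.MessageState SET status = 'done' WHERE conversation_handle = @h;",
            conn);
        done.Parameters.AddWithValue("@h", handle);
        done.ExecuteNonQuery();
    }
}
```

The key point of the design is that the queue transaction stays short: the message is safely recorded in the state table before the long-running work starts, and the timer guarantees a retry if the app dies or the work fails mid-way.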