
I have an ASP.NET website that processes requests using a 3rd-party exe. Currently my workflow is:

  1. User accesses website using any browser and fills out a form with job details

  2. Website calls a WCF self hosted windows service which is listening on a port

  3. Windows service launches 3rd party exe to process the job and returns the result to website

  4. Website displays the returned result to the user
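The synchronous flow above can be sketched roughly as follows. This is a minimal, language-neutral sketch in Python (the real system is ASP.NET/WCF); the function names are illustrative, and `echo` stands in for the real 3rd-party exe:

```python
import subprocess

def process_job(job_details):
    """Stand-in for the Windows service: launch the external
    executable synchronously and capture its output.
    'echo' is a placeholder for the real 3rd-party exe."""
    result = subprocess.run(
        ["echo", job_details],          # placeholder for the 3rd-party exe
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def handle_form_submission(job_details):
    """Stand-in for the website: call the service and return
    the result to display to the user."""
    # In the real system this is a WCF call over a TCP port;
    # here it is a direct function call for illustration.
    return process_job(job_details)
```

Note the fragility: the website blocks on the call, so if the service host is down or the process crashes mid-job, that request is simply lost.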

The above website was a prototype which now needs to be turned into a production-ready deployment. I realize that the above architecture has many points of failure. For example, if the machine is powered off, or if the Windows service crashes and is no longer listening on the port, all current requests stop processing. To make the architecture more robust, I am considering the following:

  1. User accesses website using any browser and fills out a form with job details

  2. Website writes out the job details to a database

  3. Windows service, which is polling the database every 10 seconds for new jobs, picks up the job and executes it using the 3rd-party application. The results are written back to the database.

  4. Website, which has now started polling the database, picks up the results and displays them to the user.

The second architecture gives me better logging, and queued jobs can be restarted after a failure. However, it involves a large amount of polling, which may not scale. Can anyone recommend a better architecture?

  • We did a POC for a similar requirement; we considered MSMQ at step 2 but didn't implement it in the project. Looking forward to answers on this interesting implementation. – Sunny Jan 09 '13 at 20:10
  • @Sundeep - What alternative to MSMQ did you end up using for your project? Was the project completed or is it still ongoing? Thanks –  Jan 09 '13 at 20:15
  • It was a long time ago; the application does some backend processing and sends a notification email to the user. So we wanted to replace it with instant processing. – Sunny Jan 09 '13 at 20:25
  • If you really want this to be robust, then you should either use MSMQ directly, or use one of the WCF bindings which rely on MSMQ. Don't reinvent the wheel, especially if it matters. – John Saunders Jan 10 '13 at 04:13
  • I've used both MQSeries (MSMQ is the same principle) and "save to db"; both are valid options. HOWEVER: since you already have and know WCF, look at the 'netmsmqbinding', which allows you to use WCF to write to (and read from) MSMQ. http://msdn.microsoft.com/en-us/library/system.servicemodel.netmsmqbinding.aspx – BlueChippy Jan 10 '13 at 04:16

2 Answers


I have implemented the same architecture in one of my applications, where users make multiple processing requests. So I have:

  1. Users go to the website, select parameters, etc., and submit the request
  2. The request is stored in a database table with all the details, user name, etc.
  3. A service polls the database table and picks up requests in FIFO order
  4. After a request is processed, its status is updated to Failed or Completed in the database table against that requestId, which users can see on the website
  5. The service picks up the next request if there is one; otherwise it stops
  6. The service runs every 30 minutes
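The FIFO pickup with Failed/Completed status updates in steps 3–5 could look like this. This is a Python/SQLite sketch; the schema and the `process` callable (standing in for launching the 3rd-party application) are assumptions:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE requests (
    requestId INTEGER PRIMARY KEY,
    userName  TEXT,
    details   TEXT,
    status    TEXT DEFAULT 'Queued')""")

def run_service_pass(process):
    """One scheduled pass of the service: drain queued requests in
    FIFO order, marking each 'Completed' or 'Failed'.  `process`
    stands in for launching the 3rd-party application."""
    while True:
        row = db.execute(
            "SELECT requestId, details FROM requests "
            "WHERE status = 'Queued' ORDER BY requestId LIMIT 1").fetchone()
        if row is None:                 # step 5: no more requests, stop
            break
        request_id, details = row
        try:
            process(details)
            status = 'Completed'
        except Exception:
            status = 'Failed'           # step 4: record the outcome
        db.execute("UPDATE requests SET status = ? WHERE requestId = ?",
                   (status, request_id))
        db.commit()
```

Step 6 would then be a scheduler (e.g. a timer in the Windows service) invoking `run_service_pass` every 30 minutes.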
Umesh
  • I would have gone with this method if I had the luxury of picking up requests once every 30 minutes. However, for my current application the response needs to be real-time, and there is potential for large volume (200+ requests every hour). –  Jan 10 '13 at 04:31

Instead of polling, I would go with MSMQ or RabbitMQ.

That way you could off-load your processing to multiple consumers of the queue (possibly on separate servers from the web server) and process more requests in parallel.
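The off-loading pattern can be illustrated with an in-process queue and a small worker pool. In production the `queue.Queue` below would be MSMQ or a RabbitMQ queue and each consumer a separate process, possibly on another machine; this Python sketch only shows the shape of the pattern:

```python
import queue
import threading

jobs = queue.Queue()      # stands in for MSMQ / a RabbitMQ queue
results = {}
lock = threading.Lock()

def consumer():
    """One consumer: pull jobs until a None sentinel arrives.
    Several of these run in parallel; in the real deployment
    they would be separate processes or servers."""
    while True:
        job = jobs.get()
        if job is None:
            jobs.task_done()
            break
        job_id, details = job
        outcome = details.upper()        # stand-in for the 3rd-party exe
        with lock:
            results[job_id] = outcome
        jobs.task_done()

# The web tier only enqueues and returns immediately; consumers
# drain the queue concurrently.
workers = [threading.Thread(target=consumer) for _ in range(3)]
for w in workers:
    w.start()
for i, d in enumerate(["job a", "job b", "job c", "job d"]):
    jobs.put((i, d))
for _ in workers:
    jobs.put(None)                       # one shutdown sentinel per worker
for w in workers:
    w.join()
```

The key property is the same one the message brokers give you: producers and consumers are decoupled, so adding consumers increases throughput without touching the web tier, and a dead consumer does not lose queued work.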

Pablo Romeo
  • This seems to be the best option right now. Others have suggested MSMQ too, but I can accept only one answer. I will accept this one since it gives a good reason to pick this method over polling. –  Jan 10 '13 at 16:29