
For my new project every component is going to be deployed in Azure. I have a 3rd-party application that processes events using RabbitMQ, and I want to subscribe to these events, process them, and store their data in my own database.

What would be the best way to go? Using WebJobs and writing my own custom trigger/binder for RabbitMQ?

Thanks for the advice in advance

Marcel Hoekstra

2 Answers


Based on your requirement, I'd say an Azure WebJob is a good fit for your purpose. You could use a WebJob as a consumer client to subscribe to the events and process the data. Create a WebJob, follow the link provided by Mitra to subscribe to the events, and implement your processing logic inside the WebJob.
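The consuming loop such a WebJob would run could look like the following sketch (shown in Python with the pika client, matching the linked tutorial; the queue name `events` and the JSON `id` field are assumptions about the 3rd-party application's schema):

```python
import json

def process_event(body: bytes) -> dict:
    """Turn a raw event payload into a record ready for database insertion.
    The 'id' field name is an assumption; adapt to the 3rd-party schema."""
    event = json.loads(body)
    return {"event_id": event["id"], "payload": event}

def main():
    # Requires the 'pika' package and a reachable RabbitMQ broker.
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="events", durable=True)

    def on_message(ch, method, properties, body):
        record = process_event(body)
        # ... insert `record` into your own database here ...
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue="events", on_message_callback=on_message)
    channel.start_consuming()  # blocks; the WebJob keeps this running

if __name__ == "__main__":
    main()
```

Acknowledging only after the database write succeeds means an unprocessed message is redelivered if the WebJob restarts.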

Note that WebJobs run as background processes in the context of an Azure Web App. To keep your WebJob running continuously, the Web App needs to run in the Standard tier or higher, and you need to enable the "Always On" setting.
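If you script your deployment, "Always On" can be enabled with the Azure CLI; a sketch, where the app and resource-group names are placeholders:

```shell
# Enable "Always On" so the continuously running WebJob is not unloaded when idle.
# <my-webapp> and <my-resource-group> are placeholders for your own names.
az webapp config set \
  --name <my-webapp> \
  --resource-group <my-resource-group> \
  --always-on true
```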

As for scaling, you could use the Azure Web Apps scale feature to run extra WebJob instances. For details, you could refer to this tutorial.

Bruce Chen

For subscription-based routing, you can use topics in RabbitMQ. Using topics you can push events to specific queues, and consumers on those queues can then process the events and write the data into the database. The only thing to take care of is assigning a correct routing key to each queue binding.
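To make the routing-key idea concrete, here is a small stand-alone sketch of how a topic exchange matches binding keys against routing keys (pure Python, no broker required; `matches` is a hypothetical helper mimicking AMQP topic semantics, where `*` matches exactly one word and `#` matches zero or more):

```python
def matches(binding_key: str, routing_key: str) -> bool:
    """Mimic RabbitMQ topic-exchange matching on dot-separated words."""
    def match(b, r):
        if not b:
            return not r                      # both exhausted -> match
        if b[0] == "#":
            # '#' may consume zero or more remaining words
            return any(match(b[1:], r[i:]) for i in range(len(r) + 1))
        if not r:
            return False
        if b[0] == "*" or b[0] == r[0]:       # '*' matches exactly one word
            return match(b[1:], r[1:])
        return False
    return match(binding_key.split("."), routing_key.split("."))
```

So a queue bound with `order.*` receives `order.created` but not `payment.created`, while `order.#` also receives deeper keys like `order.created.v2`.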

That gives you a subscription-based mechanism. The only caveat of this approach is that each event type gets its own queue.

The benefit of one queue per event type is that events are easy to keep track of, which makes debugging easier.

If the number of event types is very large, you can use a single queue instead, but then after consuming each message you have to dispatch it to the right handler yourself.
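A sketch of that single-queue approach, assuming a hypothetical in-process handler registry keyed by event type (the message shape with `type` and `payload` fields is an assumption):

```python
# Registry mapping event type -> handler function.
handlers = {}

def on(event_type):
    """Decorator registering a handler for one event type."""
    def register(fn):
        handlers[event_type] = fn
        return fn
    return register

@on("order.created")
def handle_order_created(payload):
    # ... write the order data to your own database here ...
    return f"stored order {payload['id']}"

def dispatch(message: dict):
    """Route a message consumed from the single queue to its handler."""
    handler = handlers.get(message["type"])
    if handler is None:
        raise KeyError(f"no handler for event type {message['type']!r}")
    return handler(message["payload"])
```

With this layout, adding a new event type means registering one more handler rather than declaring another queue.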

Here is the link for the reference: https://www.rabbitmq.com/tutorials/tutorial-five-python.html

Mitra Ghorpade