
I want to monitor some events coming from my application.

One option is to send the data to Azure Event Hubs and use Stream Analytics to do some post-processing and write the results into Cosmos DB.

Another option is to store the data in Cosmos DB directly from the application and run a periodic Azure Function to do the processing and store the results back.

What is the right way to do it? Is there a better way to do it?

Pranav Raj

3 Answers

The best architectural approach is to go from Event Hubs into Cosmos DB. I have done the same implementation as Application -> Event Hubs -> Change Feed Azure Function -> Cosmos DB.

You can read more about it here.
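As a sketch of the ingestion side of that pipeline, here is a minimal, hypothetical event envelope the application could publish to Event Hubs. Event Hubs carries opaque bytes, so the envelope shape (`id`, `type`, `timestamp`, `data`) is an assumption of mine, not something from this answer; the actual send would use the `azure-eventhub` SDK, shown only in comments:

```python
import json
import uuid
from datetime import datetime, timezone

def build_event(event_type: str, payload: dict) -> str:
    """Serialize one application event for Event Hubs.

    A JSON envelope with an id, type, and timestamp keeps downstream
    processing (Stream Analytics or an Azure Function) simple.
    Envelope field names are illustrative assumptions.
    """
    envelope = {
        "id": str(uuid.uuid4()),
        "type": event_type,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "data": payload,
    }
    return json.dumps(envelope)

# With the real SDK (not run here), the send would look roughly like:
#   from azure.eventhub import EventHubProducerClient, EventData
#   producer = EventHubProducerClient.from_connection_string(
#       conn_str, eventhub_name="events")
#   batch = producer.create_batch()
#   batch.add(EventData(build_event("page_view", {"path": "/home"})))
#   producer.send_batch(batch)
```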

Sajeetharan

The change feed is offered by Azure Cosmos DB out of the box for this use case. It works as a trigger on Cosmos DB changes.
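For illustration, a change-feed-triggered Azure Function is wired up through its binding configuration. A hypothetical `function.json` (database, container, and connection-setting names are placeholders I chose, not from the answer) might look like:

```json
{
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "direction": "in",
      "name": "documents",
      "databaseName": "monitoring",
      "collectionName": "events",
      "connectionStringSetting": "CosmosDBConnection",
      "leaseCollectionName": "leases",
      "createLeaseCollectionIfNotExists": true
    }
  ]
}
```

The lease container tracks how far each function instance has read through the change feed, so processing resumes where it left off after a restart.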

zile

It depends on the kind of processing you would like to do with the ingested events.

If it is event-at-a-time processing, a simple Azure Function with the Cosmos DB change feed processor might be enough. If you would like to do stateful processing such as windowing or event-order-based computation, Azure Stream Analytics would be better. Stream Analytics also provides native integration with Power BI dashboards; the same job can send the data to both Cosmos DB and Power BI.

If you are going to use Azure Stream Analytics, you will have to use Event Hubs for event ingestion. Using Event Hubs for ingestion also has other benefits, such as being able to archive events to Blob storage.
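As a sketch of the windowing and dual-output points above, a Stream Analytics job query could count events per type over a one-minute tumbling window and write the same aggregate to both sinks. Input and output names here (`eventhubInput`, `cosmosOutput`, `powerbiOutput`) and the `type` field are assumptions for illustration:

```sql
-- Hypothetical 1-minute tumbling-window count per event type,
-- written to a Cosmos DB output.
SELECT
    type,
    COUNT(*) AS eventCount,
    System.Timestamp() AS windowEnd
INTO cosmosOutput
FROM eventhubInput
GROUP BY type, TumblingWindow(minute, 1)

-- The same aggregate fanned out to a Power BI output.
SELECT
    type,
    COUNT(*) AS eventCount,
    System.Timestamp() AS windowEnd
INTO powerbiOutput
FROM eventhubInput
GROUP BY type, TumblingWindow(minute, 1)
```

This kind of stateful, time-windowed aggregation is what the change-feed-plus-Function approach does not give you out of the box.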

Vignesh Chandramohan