I have the following scenario that I am thinking of implementing via Azure Stream Analytics.
My input consists of:
- Events streaming in from an Azure Event Hub.
- Reference data that relates to the events. Some of this data is "slowly changing" from day to day.
I need to join the events and the reference data, process them and output tables that will make up a "data warehouse" (with Power BI in mind as the consumer).
The output would be made up of:
- A facts table where the most important events are stored.
- A few dimension tables that hold the values that make up the facts.
Is Azure Stream Analytics suitable for this kind of work? It seems to me that ASA is well suited to persisting the events from the Event Hub stream into a facts table. However, the additional work of keeping the dimension tables up to date - i.e. adding new values periodically - does not seem like a good fit.
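For context, the stream-to-facts part I have in mind would look roughly like this sketch in ASA's SQL-like query language (the input/output aliases `EventHubInput`, `ReferenceInput`, and `FactOutput`, and the column names, are placeholders, not names from an actual setup):

```sql
-- Enrich streamed events with reference data, then write rows to the fact output.
-- ReferenceInput would be configured as a reference data input (e.g. a blob that
-- is refreshed daily), so the JOIN needs no temporal window.
SELECT
    e.EventId,
    e.EventTime,
    e.ProductId,
    r.ProductCategory,   -- looked up from the reference data
    e.Quantity
INTO FactOutput
FROM EventHubInput e TIMESTAMP BY e.EventTime
JOIN ReferenceInput r
    ON e.ProductId = r.ProductId
```

This part seems straightforward; my concern is everything outside this query - maintaining the dimension tables themselves.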
Am I correct in this analysis? Should I switch over to Azure Data Factory for my project?