
I have a scenario where an Event Hub receives data every 10 seconds, which is passed to Stream Analytics and then on to Azure SQL Server. The technical team raised the concern that Azure SQL cannot handle this much data: once the table reaches 2,00,00,000 (20 million) rows, it stops working.

Can you please advise whether this is an actual limitation of Azure SQL, and if so, suggest a solution?

  • Let me get this clear: the Event Hub receives one event every ten seconds, that event is sent to Azure Stream Analytics (ASA), and from there it is stored in SQL Server. Is there any aggregation taking place in the ASA job, or are the events stored in SQL Server as-is? – Peter Bons Aug 01 '17 at 08:15
  • Multiple events come into the Event Hub, and they should be passed to Azure SQL through Stream Analytics. – Hardik Shah Aug 01 '17 at 09:01
  • That is not a complete answer to my questions. – Peter Bons Aug 01 '17 at 09:07
  • Which service tier is your Azure SQL running on? Each service tier caters for different performance requirements. If you only want to store the event data, you can also check out Azure Table Storage or Data Lake Store. Stream Analytics can point to one of these as well. – Mihir Aug 01 '17 at 09:58

1 Answer


Keep in mind that 4 TB is the absolute maximum size of an Azure SQL Premium instance. If you plan to store every raw event for your use case, it will fill up very quickly. Consider Cosmos DB or Event Hubs Capture if you really need to keep the messages indefinitely, and use SQL only for aggregates produced after processing with SQL DW or ADLS.
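For example, if the downstream consumers only need summaries, the ASA job itself can reduce the write volume before it ever reaches SQL. A minimal sketch of a tumbling-window aggregation, where the input/output aliases (`EventHubInput`, `SqlOutput`) and the payload fields (`deviceId`, `eventTime`, `value`) are hypothetical placeholders for your own job configuration:

```sql
-- Collapse raw events into one summary row per device per minute
-- before writing to Azure SQL.
-- EventHubInput / SqlOutput are hypothetical alias names defined on the ASA job;
-- deviceId, eventTime, and value are assumed fields in the event payload.
SELECT
    deviceId,
    COUNT(*) AS eventCount,
    AVG(CAST(value AS float)) AS avgValue,
    System.Timestamp() AS windowEnd
INTO
    SqlOutput
FROM
    EventHubInput TIMESTAMP BY eventTime
GROUP BY
    deviceId,
    TumblingWindow(minute, 1)
```

This turns millions of raw events into a bounded number of summary rows per window, which a single Azure SQL database can absorb far more comfortably than the raw stream.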

Remember that to get good throughput from Event Hubs you need a partitioning strategy. See the docs.
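As a sketch of what partition-aligned processing looks like on the ASA side (again with hypothetical alias names), grouping by the built-in PartitionId metadata field lets each Event Hub partition be processed in parallel rather than funnelling everything through a single node:

```sql
-- Process each Event Hub partition independently so the job scales out
-- across partitions; alias names are placeholders as above.
SELECT
    PartitionId,
    COUNT(*) AS eventCount,
    System.Timestamp() AS windowEnd
INTO
    SqlOutput
FROM
    EventHubInput PARTITION BY PartitionId
GROUP BY
    PartitionId,
    TumblingWindow(minute, 1)
```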

Murray Foxcroft