
WebApplication1 is running on a large number of instances (around 100,000) across geographic locations. WebApplication2 is running on only one machine and will be adding/updating records in the cloud storage.

Now I want to sync all of these records (added by WebApplication2) to all instances where my WebApplication1 is running (i.e. ~100,000) once or twice a day, at around the same time. I will be caching them in memory for later use.

One record is 3 KB and there are at most 1,000 records (3 MB of total storage). I am restricted to cloud storage that Azure provides.

Any suggestions on which storage to use, keeping in mind cost and minimizing throttling?

vinal
  • How are the Application1 instances connected and informed, or how do they otherwise know when to sync? – Noah Stahl Aug 27 '20 at 18:45
  • @NoahStahl: My requirement is to get that data at a fixed time, once or twice a day. That's the SLA. – vinal Aug 28 '20 at 04:17

2 Answers


You could use Azure IoT Hub.

Sending messages out to the instances requires the Standard tier.

It supports a 4 KB message size, so you are OK there.

If you need to send 400,000 messages per day, you need S1, which will cost ca. USD 30 per month.

From your question it sounds like you need to send 1,000 × 100,000 = 100,000,000 messages per day; this will require S3, costing ca. USD 3,000 per month.
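
To make that arithmetic explicit, here is a rough sketch in Python (assuming one cloud-to-device message per record, per instance, per sync; the syncs_per_day value is an illustration, not from the question):

    # Rough message-volume estimate for the IoT Hub approach.
    records = 1_000          # max number of records (from the question)
    instances = 100_000      # WebApplication1 instances
    syncs_per_day = 1        # assumption: once a day; doubles if you sync twice

    # One cloud-to-device message per record, per instance, per sync.
    messages_per_day = records * instances * syncs_per_day
    print(f"{messages_per_day:,}")  # 100,000,000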

Shiraz Bhaiji

While there are many options, if it's just a matter of delivering small records upon request from a single source, I'd suggest simply using an Azure Function with an HTTP trigger to host a public web API endpoint. That's straightforward, well-understood tech that any client can talk to over HTTP, and clients can poll it cheaply looking for the latest version of the content. The automatic scaling should handle the large client pool as demand changes.
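
As a rough illustration, a minimal HTTP-triggered Azure Function in Python could look like the sketch below (the load_records helper is hypothetical; the Blob Storage sketch further down shows one way to back it):

    import json
    import azure.functions as func

    def load_records() -> list:
        # Hypothetical helper: in practice this would read the ~3 MB record
        # set from storage (e.g. a blob) or an in-process cache.
        return [{"id": 1, "payload": "..."}]

    def main(req: func.HttpRequest) -> func.HttpResponse:
        # HTTP trigger: each WebApplication1 instance polls this endpoint
        # once or twice a day and caches the response in memory.
        records = load_records()
        return func.HttpResponse(
            body=json.dumps(records),
            mimetype="application/json",
            status_code=200,
        )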

The function itself could persist the data cheaply in a private Blob Storage container if it's just a small number of textual records, which would save the overhead of a proper database.
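
A minimal sketch of that blob-backed persistence, assuming the azure-storage-blob Python SDK and placeholder container/blob names:

    from azure.storage.blob import BlobServiceClient

    # Placeholder names for illustration only.
    CONNECTION_STRING = "<storage-account-connection-string>"
    CONTAINER = "records"
    BLOB_NAME = "records.json"

    service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
    blob = service.get_blob_client(container=CONTAINER, blob=BLOB_NAME)

    def publish_records(records_json: str) -> None:
        # WebApplication2 side: overwrite the single ~3 MB JSON blob
        # whenever records are added or updated.
        blob.upload_blob(records_json, overwrite=True)

    def load_records_json() -> str:
        # Function side: read the blob back when a client polls.
        return blob.download_blob().readall().decode("utf-8")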

The next level up could be a proper ASP.NET Core web API hosted in Azure App Service, backed by a database like Cosmos DB. But if you can keep it simple, why not?

Noah Stahl
  • Don't you think the cold start of Azure Functions will be an issue when serving this many requests? – vinal Aug 29 '20 at 16:15
  • No. Including a simple timer "warmup" function can keep the function app warm. And if requests come in more frequently than once per 15 minutes, your function app will always be warm. But even with an occasional extra second or two, I assume this isn't something a user would see? – Noah Stahl Aug 29 '20 at 17:31
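
For reference, a minimal warmup timer in Python could be as simple as the sketch below (the 10-minute schedule is an example value configured in function.json):

    import logging
    import azure.functions as func

    def main(warmup_timer: func.TimerRequest) -> None:
        # Timer trigger (e.g. "0 */10 * * * *" in function.json) whose only
        # purpose is to keep the function app warm between client polls.
        logging.info("Warmup tick")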