
I am running a website on an Azure Web App (Standard tier S1) where users can download files from blob storage in different formats (XML, JSON).

I insert these files (HTML, text, and images as Base64 strings) as block blobs into a container. Right now I download all blobs, convert the result into the desired format, zip it, and offer the archive for download on my ASP.NET website.

I see a problem here: the whole download-and-zip process runs inside the web app, which can take a while and may also hit the memory limit (1.75 GB on S1) when I am downloading, converting, and zipping a lot of files. I expect more than 100,000 files per container, with a resulting container size of about 10 GB.

What would be the best way to offer a download of a whole container in a specific format (XML or JSON), zipped?

Possible solutions I have found that could help are:

Thanks

jimbo

1 Answer


Depending on the experience you are trying to provide to your customers, one really simple solution that bypasses the need for your web app to do any processing or zipping is to create a container per customer, give each customer a SAS token that provides read-only access to that container (see here), and have them use a tool like AzCopy (see here) to download the files. This completely avoids the need to zip the files up, although it may not be the best user experience.
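A minimal sketch of that workflow, assuming the Azure CLI and AzCopy v10 are installed; the account name (`mystorageacct`), container name (`customer1`), and expiry date are placeholders:

```shell
# Generate a SAS token scoped to the customer's container.
# "rl" = read + list; list permission is needed so AzCopy can
# enumerate the blobs in the container.
SAS=$(az storage container generate-sas \
    --account-name mystorageacct \
    --name customer1 \
    --permissions rl \
    --expiry 2030-01-01T00:00:00Z \
    --output tsv)

# The customer downloads the whole container with AzCopy v10,
# using the container URL plus the SAS token as the source:
azcopy copy \
    "https://mystorageacct.blob.core.windows.net/customer1?$SAS" \
    "./downloads" \
    --recursive
```

Because the SAS token is limited to one container and to read/list permissions, the customer never needs storage account keys, and the download traffic goes straight from Blob Storage to the client without passing through the web app.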

Jason Hogg - MSFT