In my application we upload a large amount of image data at a time. The request is made through an Angular portal, and an ASP.NET Web API receives it; both are hosted on Azure. In the API I convert the image data directly to bytes and upload it to an Azure blob. Is this the proper way to upload, or do I need to save the images on my server first (e.g. to a path like 'C:/ImagesToUpload') and then upload them to the blob from there? I'm concerned because we're uploading a lot of data, and I have no idea whether my current approach will cause memory issues. Any advice would be appreciated.
2 Answers
I have implemented the same thing; we had the same requirement with a large number of files. I suggest you first compress the file on the API side and then send it to the blob using a SAS token. Also note that Azure Blob storage limits how much data you can send per request, so this sample splits large uploads into blocks of 4 MB or less. Here is sample code that worked well for me after some testing.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(SettingsProvider.Get("CloudStorageConnectionString", SettingType.AppSetting));
var blobClient = storageAccount.CreateCloudBlobClient();
var filesContainer = blobClient.GetContainerReference("your_containername");
filesContainer.CreateIfNotExists();
var durationHours = 24;
//Generate a SAS token valid for 24 hours with read/write permissions
var sasConstraints = new SharedAccessBlobPolicy
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(durationHours),
    Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Read
};
//Generate a unique, blob-safe file name (a culture-specific DateTime.ToString() can contain '/' or ':', so use an explicit format)
var storageFileName = Guid.NewGuid() + DateTime.Now.ToString("yyyyMMddHHmmss");
var blob = filesContainer.GetBlockBlobReference(storageFileName);
var blobs = new CloudBlockBlob(new Uri(string.Format("{0}/{1}{2}", filesContainer.Uri.AbsoluteUri, storageFileName, blob.GetSharedAccessSignature(sasConstraints))));
//Split the upload into 4 MB blocks when the file is larger than 4 MB
BlobRequestOptions blobRequestOptions = new BlobRequestOptions()
{
    SingleBlobUploadThresholdInBytes = 4 * 1024 * 1024, //4 MB
    ParallelOperationThreadCount = 5,
    ServerTimeout = TimeSpan.FromMinutes(30)
};
blobs.StreamWriteSizeInBytes = 4 * 1024 * 1024;
//Upload to Azure Storage; the third argument is the byte count, not the last index, so pass the full length (and await the task in an async method)
await blobs.UploadFromByteArrayAsync(item.Document_Bytes, 0, item.Document_Bytes.Length, AccessCondition.GenerateEmptyCondition(), blobRequestOptions, new OperationContext());
But before calling this code, make sure you compress the data if you have a huge amount of it. I used the "zlib" library; a free C#/.NET port is available at http://www.componentace.com/zlib_.NET.htm, and you can visit https://www.zlib.net/ if you want to know more.
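If you would rather avoid the third-party dependency, the built-in System.IO.Compression.DeflateStream implements the same DEFLATE algorithm that zlib uses; here is a minimal sketch (CompressBytes is a hypothetical helper name, not a library call):

using System.IO;
using System.IO.Compression;

//Compress a byte array with DEFLATE before uploading; swap in the zlib.NET streams if you use that library instead
public static byte[] CompressBytes(byte[] input)
{
    using (var output = new MemoryStream())
    {
        using (var deflate = new DeflateStream(output, CompressionLevel.Optimal))
        {
            deflate.Write(input, 0, input.Length);
        } //disposing the DeflateStream flushes the remaining compressed bytes into output
        return output.ToArray();
    }
}

Keep in mind that image formats such as JPEG and PNG are already compressed, so deflate typically saves little on them; it pays off mostly for raw or uncompressed data.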

Per my understanding, you could also leverage Fine Uploader to upload your files directly to Azure Blob storage without sending them to your server first. For a detailed description, you could follow Uploading Directly to Azure Blob Storage.
The script would look as follows:
var uploader = new qq.azure.FineUploader({
    element: document.getElementById('fine-uploader'),
    request: {
        endpoint: 'https://<your-storage-account-name>.blob.core.windows.net/<container-name>'
    },
    signature: {
        endpoint: 'https://yourapp.com/uploadimage/signature'
    },
    uploadSuccess: {
        endpoint: 'https://yourapp.com/uploadimage/done'
    }
});
You could follow Getting Started with Fine Uploader to install the fine-uploader package, then follow here for initializing FineUploader for Azure Blob Storage, then follow here to configure CORS for your blob container and expose the endpoint for creating the SAS token. Moreover, here is a similar issue about using FineUploader.
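For reference, a rough sketch of what that signature endpoint could look like in Web API, assuming Fine Uploader's documented contract of sending a GET request with bloburi and _method query parameters and expecting the SAS URI back as plain text (SignatureController and the credential placeholders are mine):

using System;
using System.Net;
using System.Net.Http;
using System.Web.Http;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

public class SignatureController : ApiController
{
    //Handles e.g. GET /uploadimage/signature?bloburi=<blob uri>&_method=PUT
    [HttpGet]
    public HttpResponseMessage Get(string bloburi, string _method)
    {
        var credentials = new StorageCredentials("<account-name>", "<account-key>"); //placeholders
        var blob = new CloudBlockBlob(new Uri(bloburi), credentials);
        var policy = new SharedAccessBlobPolicy
        {
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
            //PUT needs Write; add Delete if you allow users to cancel in-progress uploads
            Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Delete
        };
        //Return the blob URI with the SAS token appended as the plain-text body
        return new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StringContent(bloburi + blob.GetSharedAccessSignature(policy))
        };
    }
}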
From the API I convert the image data directly to bytes and upload it to an Azure blob.
I'm concerned because we're uploading a lot of data, and I have no idea whether my current approach will cause memory issues
For the approach of uploading the file to your Web API endpoint first and then to an Azure storage blob, I would prefer to use MultipartFormDataStreamProvider, which stores the uploaded file in a temp file on the server, instead of MultipartMemoryStreamProvider, which buffers it in memory. For details, you could follow the related code snippet in this issue. Moreover, you could follow the github sample for uploading files using the Web API; a sketch of the idea follows.
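Here is a rough sketch of that server-side approach, assuming ASP.NET Web API 2 and the classic WindowsAzure.Storage client (the controller name, container name, and connection-string placeholder are mine, and older SDK versions require an extra FileMode argument on UploadFromFileAsync):

using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class UploadController : ApiController
{
    [HttpPost]
    public async Task<IHttpActionResult> PostFile()
    {
        if (!Request.Content.IsMimeMultipartContent())
            return StatusCode(HttpStatusCode.UnsupportedMediaType);

        //Stream the multipart body to temp files on disk instead of buffering it in memory
        var root = Path.Combine(Path.GetTempPath(), "ImagesToUpload");
        Directory.CreateDirectory(root);
        var provider = new MultipartFormDataStreamProvider(root);
        await Request.Content.ReadAsMultipartAsync(provider);

        var account = CloudStorageAccount.Parse("<your-connection-string>"); //placeholder
        var container = account.CreateCloudBlobClient().GetContainerReference("<container-name>");

        foreach (var file in provider.FileData)
        {
            var blobRef = container.GetBlockBlobReference(Guid.NewGuid().ToString());
            //UploadFromFileAsync streams from disk, so large files never sit in memory
            await blobRef.UploadFromFileAsync(file.LocalFileName);
            File.Delete(file.LocalFileName); //clean up the temp file
        }
        return Ok();
    }
}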
