
Attempting to upload a file larger than 4MB results in a RequestBodyTooLarge exception being thrown with the following message:

The request body is too large and exceeds the maximum permissible limit.

While this limit is documented in the REST API reference (https://learn.microsoft.com/en-us/rest/api/storageservices/put-range), it is not documented for the SDK Upload* methods (https://learn.microsoft.com/en-us/dotnet/api/azure.storage.files.shares.sharefileclient.uploadasync?view=azure-dotnet). There are also no examples of how to work around it.

So how do you upload large files?

Brad Patton

2 Answers


After much trial and error I was able to create the following method to work around the file upload limits. In the code below _dirClient is an already initialized ShareDirectoryClient set to the folder I'm uploading to.

If the incoming stream is larger than 4MB, the code reads 4MB chunks from it and uploads them until done. The HttpRange specifies where in the Azure file the bytes will be written; the index has to be advanced after each chunk so each new range is appended after the bytes already uploaded.

public async Task WriteFileAsync(string filename, Stream stream) {

    // Azure allows at most 4MB per range upload (4 x 1024 x 1024 = 4194304)
    const int uploadLimit = 4 * 1024 * 1024;

    stream.Seek(0, SeekOrigin.Begin);   // ensure stream is at the beginning
    var fileClient = await _dirClient.CreateFileAsync(filename, stream.Length);

    // If the stream is below the limit, upload it in a single call
    if (stream.Length <= uploadLimit) {
        await fileClient.Value.UploadRangeAsync(new HttpRange(0, stream.Length), stream);
        return;
    }

    int bytesRead;
    long index = 0;
    byte[] buffer = new byte[uploadLimit];

    // Stream is larger than the limit, so upload it in chunks
    while ((bytesRead = await stream.ReadAsync(buffer, 0, buffer.Length)) > 0) {
        // Wrap the chunk in a memory stream for the upload call
        using MemoryStream ms = new MemoryStream(buffer, 0, bytesRead);
        await fileClient.Value.UploadRangeAsync(new HttpRange(index, ms.Length), ms);
        index += ms.Length; // advance the offset past the bytes already written
    }
}
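As a usage sketch: the method assumes `_dirClient` is already initialized, which might look like the following (the connection string, share name, directory name, and file path are all placeholder assumptions, not part of the original answer):

```csharp
using Azure.Storage.Files.Shares;

// Hypothetical setup for _dirClient; names are illustrative only
var share = new ShareClient(connectionString, "myshare");
ShareDirectoryClient _dirClient = share.GetDirectoryClient("uploads");
await _dirClient.CreateIfNotExistsAsync();

// Upload a large local file through the chunked method above
await using var input = File.OpenRead(@"C:\data\large-video.mp4");
await WriteFileAsync("large-video.mp4", input);
```

Note that `CreateFileAsync` requires the final length up front, which is why the method needs a seekable stream with a known `Length`.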
Brad Patton

If you want to upload larger files to a file share or blob storage, there is the Azure Storage Data Movement Library.

It provides high-performance uploading and downloading of large files. Please consider using this library for larger files.
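A minimal sketch of what that could look like against a file share, using the v11-era `Microsoft.Azure.Storage.DataMovement` package (the connection string, share name, and file names here are placeholder assumptions):

```csharp
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.DataMovement;
using Microsoft.Azure.Storage.File;

var account = CloudStorageAccount.Parse(connectionString);
var share = account.CreateCloudFileClient().GetShareReference("myshare");
CloudFile destFile = share.GetRootDirectoryReference()
                          .GetFileReference("large-file.bin");

// TransferManager splits the source into chunks and uploads them in parallel
TransferManager.Configurations.ParallelOperations = 8;
await TransferManager.UploadAsync(@"C:\data\large-file.bin", destFile);
```

The library handles chunking, parallelism, and retries itself, which is the main advantage over the manual 4MB loop in the accepted answer.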

Ivan Glasenberg
  • That looks like it's using v11.x Microsoft.Azure.Storage.File. I'm currently using v12.x Azure.Storage.Files.Shares https://learn.microsoft.com/en-us/dotnet/api/overview/azure/storage?view=azure-dotnet – Brad Patton Sep 24 '20 at 16:08
  • 1
    Either way it's a bit confusing to have these different libraries just to manage remote file shares – Brad Patton Sep 24 '20 at 16:09
  • @BradPatton, I mean if you're focusing on large file uploading/downloading, you can consider it since this library is optimized for doing such thing. But for something else, you don't need to consider this library:). – Ivan Glasenberg Sep 25 '20 at 01:29