
How to upload large files with ASP.NET Core Web API? When a file exceeds 500 MB, a System.OutOfMemoryException is thrown. I also set the "COMPlus_gcAllowVeryLargeObjects" environment variable and added the following in Startup:

services.Configure<FormOptions>(options =>
    {
        options.BufferBodyLengthLimit = Int64.MaxValue;
        options.MemoryBufferThreshold = Int32.MaxValue;
        options.MultipartBodyLengthLimit = long.MaxValue;
        options.MultipartBoundaryLengthLimit = int.MaxValue;
        options.MultipartHeadersLengthLimit = Int32.MaxValue;
    });
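If the request is being rejected by the server itself rather than failing in your code, Kestrel's own request body limit may also apply. A minimal sketch of raising it, assuming Kestrel hosting (for IIS, `maxAllowedContentLength` under `requestLimits` in web.config plays the equivalent role):

```csharp
// Program.cs: raise Kestrel's request body limit for the whole app.
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.ConfigureKestrel(options =>
            {
                // null removes the limit entirely; a concrete byte
                // count is safer in production.
                options.Limits.MaxRequestBodySize = null;
            });
            webBuilder.UseStartup<Startup>();
        });
```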

Below is my code. Is there anything I need to set up?

[HttpPost("{id}")] 
[RequestFormLimits(MultipartBodyLengthLimit = long.MaxValue)]
[DisableRequestSizeLimit]
public ActionResult UploadLargeFiles(string id, [FromForm]IFormFile files)
{
    try
    {
        string fileName = files.FileName;
        int fileSize = Convert.ToInt32(files.Length);

        var uploadProvider = new JObject();
        var res = new JArray();

        var isExistence = _mailService.GetUploadFolder(id);
        if (isExistence != HttpStatusCode.OK)
        {
            var createFolder = _mailService.CreateUploadFolder(id);
            if (createFolder != HttpStatusCode.Created)
            {
                ModelState.AddModelError("OneDriveFolderError", "");
                return BadRequest(ModelState);
            }
        }
        if (files.Length > 0)
        {
            byte[] data = new byte[fileSize];

            var uploadSessionUrl = _mailService.CreateUploadSession(id, fileName);
            if (uploadSessionUrl != null)
            {
                uploadProvider = _mailService.UploadByteFile(id, uploadSessionUrl, data, fileName, fileSize);
                res.Add(uploadProvider);

                Array.Fill(data, (byte)0);
            }
            else
            {
                ModelState.AddModelError("sessionFail", "");
                return BadRequest(ModelState);
            }
        }

        var Link = this.SaveFileDownloadLink(res);
        return Ok(Link);
    }
    catch (ArgumentNullException e)
    {
        return NotFound(e.Message);
    }
}

Upload the file to OneDrive in byte format. Can a byte array hold 2 GB or more?

public JObject LargeFileUpload(string upn, string url, byte[] file, string fileName, int fileSize)
{
    int fragSize = 4 * 1024 * 1024; // 4 MB fragments
    var bytesRemaining = fileSize;
    var numFragments = (fileSize + fragSize - 1) / fragSize; // round up
    int i = 0;
    var responseCode = HttpStatusCode.OK;
    var jObject = new JObject();

    // Reuse one HttpClient instead of creating one per fragment.
    using (var client = new HttpClient())
    {
        while (i < numFragments)
        {
            var chunkSize = fragSize;
            var offset = i * fragSize;
            var start = offset;
            var end = offset + chunkSize - 1;

            if (bytesRemaining < chunkSize)
            {
                chunkSize = bytesRemaining;
                end = fileSize - 1;
            }

            var contentRange = "bytes " + start + "-" + end + "/" + fileSize;

            // Send only this fragment, not the whole array.
            var content = new ByteArrayContent(file, offset, chunkSize);
            content.Headers.Add("Content-Range", contentRange);
            // Content-Length is set automatically from the buffer slice.

            var response = client.PutAsync(url, content);
            var strData = response.Result.Content.ReadAsStringAsync().Result;
            responseCode = response.Result.StatusCode;

            if (responseCode == HttpStatusCode.Created)
            {
                JObject data = JObject.Parse(strData);
                string downloadUrl = data["@content.downloadUrl"].ToString();
                string itemId = data["id"].ToString();

                // Use a local value; do not overwrite fileSize, which the
                // Content-Range header still needs.
                var sizeKb = fileSize / 1000.0;
                jObject = JObject.FromObject(new { name = fileName, id = itemId, url = downloadUrl, size = sizeKb });
            }
            else if (responseCode == HttpStatusCode.Conflict)
            {
                var restart = RestartByteFile(upn, url, fileName);
                responseCode = restart;
            }

            bytesRemaining -= chunkSize;
            i++;
        }
    }

    if (responseCode == HttpStatusCode.Created) { return jObject; }
    return JObject.FromObject(new { result = "Fail" });
}
김세림
  • I believe in .NET Core 2.1 I had to modify the default `web.config` to allow for large file uploads. I believe a limitation was around 2 GB. They are still working on it: https://github.com/dotnet/AspNetCore/issues/2711 – Andy Aug 04 '20 at 02:00
  • @Andy Currently, even 1GB cannot be uploaded. I am still getting System.OutOfMemoryException error. – 김세림 Aug 04 '20 at 02:09
  • I misread your question -- I thought you were hitting the limitations of Kestrel/IIS. It's a memory issue. I put up an answer that may help you. – Andy Aug 04 '20 at 03:00

1 Answer

So, your issue is quite apparent. You are trying to upload a 500+ MB file. In your controller you get the size of the file, then do this:

byte[] data = new byte[fileSize];

You don't want to do this... IFormFile already gives you a Stream that you can read from. Specifically, anything that implements IFormFile must implement these methods:

void CopyTo(Stream target);
Task CopyToAsync(Stream target, CancellationToken cancellationToken = default(CancellationToken));
Stream OpenReadStream();

So, if you have a stream ready to go, why are you copying that stream to memory?

The quick and dirty way around this is to dump the IFormFile to disk. Then open the file on disk and move/stream it to where it needs to go:

[HttpPost("{id}")] 
[DisableRequestSizeLimit]
public async Task<IActionResult> UploadLargeFiles(string id, [FromForm]IFormFile file)
{
    FileInfo fi = null;
    try
    {
        // this is from my code, but you want to store this
        // somewhere on disk that works with your hosting setup.
        fi = _fileStorageService.GetTempFile();

        // open a stream for writing and copy it over
        using (var s = fi.OpenWrite())
        {
            await file.CopyToAsync(s).ConfigureAwait(false);
        }
    }
    catch { fi?.Delete(); fi = null; }

    if (fi == null) { return StatusCode(StatusCodes.Status500InternalServerError); }

    // TODO: At this point, you have a file that you can open and
    // stream or move to wherever you want

    // Make sure that when you are done with the file, you
    // delete it if you no longer need it.
    return Ok();
}

So now you own the file, it's in your possession. You can now send it to a BackgroundService or other HostedService -- or even to another transient, scoped or singleton service for processing. The sky is the limit.

This method may not be the most efficient because you are essentially copying the file from one part of the disk to another.

You can cut that corner and take the stream returned by IFormFile.OpenReadStream() and pass it to a process that sends that stream on to another service, much like we used CopyToAsync() to send it to a file on disk.
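For instance, here is a minimal sketch of that idea, forwarding the upload without ever buffering it in a byte array. The target URL and the inline HttpClient are illustrative assumptions, not part of the original code:

```csharp
[HttpPost("{id}")]
[DisableRequestSizeLimit]
public async Task<IActionResult> StreamUpload(string id, [FromForm] IFormFile file)
{
    // Read directly from the form file -- no byte[] allocation.
    using (var source = file.OpenReadStream())
    using (var client = new HttpClient())
    {
        // StreamContent forwards the data as it is read, so the
        // whole file is never resident in memory at once.
        var content = new StreamContent(source);
        var response = await client.PutAsync(
            "https://example.com/upload/" + id, content); // hypothetical target

        if (!response.IsSuccessStatusCode)
            return StatusCode((int)response.StatusCode);
    }
    return Ok();
}
```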

Just keep in mind that once this endpoint completes, the IFormFile will be disposed. So you'd have to finish the processing while in the context of this execution.

At the end of the day, whatever you end up doing, don't read the file into memory. It's already resident in IFormFile. No reason to duplicate it again in memory.

Andy