
.NET 4.5.1 - MVC5 - EF6

The requirement is to allow large file uploads (< 200 MB). I have a ViewModel with an HttpPostedFileBase property. I set that file's InputStream on a non-mapped Stream property of a DB entity, which is then passed to the repository, where I copy the stream to a SqlFileStream. This all works.

However... when debugging, after about four large uploads I receive a System.OutOfMemoryException. I can see that HttpRawUploadedContent continues to grow and keeps the no-longer-used posted files in memory. What am I doing wrong? How can I get these things to dispose? Is there a better way? Any guidance is appreciated.
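For context, the controller action that wires the posted file to the entity looks roughly like this (a minimal sketch; the action name, the FileEntity initializer, and the repository field are my assumptions, since that code isn't shown below):

    [HttpPost]
    public ActionResult Upload(AttachmentViewModel model)
    {
        //Hypothetical wiring; names are illustrative, not the actual code
        var entity = new FileEntity
        {
            FileName = model.PostedFile.FileName,
            //Hand the request's input stream to the entity; the repository
            //copies it into the SqlFileStream
            PostedStream = model.PostedFile.InputStream
        };

        _fileRepository.Add(entity);

        return RedirectToAction("Index");
    }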

AttachmentViewModel.cs

public class AttachmentViewModel
{
    #region Public Properties

    public int Id { get; set; }

    public string FileName { get; set; }

    [MaxFileSize]
    public HttpPostedFileBase PostedFile { get; set; }

    #endregion Public Properties
}
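
The [MaxFileSize] attribute is a custom validation attribute whose implementation isn't shown; a minimal sketch of one way it could work (the 200 MB limit is an assumption taken from the stated requirement):

    using System.ComponentModel.DataAnnotations;
    using System.Web;

    public class MaxFileSizeAttribute : ValidationAttribute
    {
        //Assumed limit, based on the stated requirement of < 200 MB
        private const int MaxBytes = 200 * 1024 * 1024;

        public override bool IsValid(object value)
        {
            var file = value as HttpPostedFileBase;

            //Null passes here; pair with [Required] if the file is mandatory
            return file == null || file.ContentLength <= MaxBytes;
        }
    }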

Non-mapped property on the database FileEntity

    [NotMapped]
    public Stream PostedStream {  get; set; }

FileRepository.cs Add Method

    public override T Add(T entity)
    {
        base.Add(entity);

        //Save to generate ID
        _ctx.SaveChanges();

        using (var tx = new TransactionScope())
        {
            try
            {
                var id = entity.Id;

                var rowData =
                    _ctx.Database.SqlQuery<FileStreamData>(FileRowSelect, new SqlParameter("id", id))
                        .First();

                using (var dest = new SqlFileStream(rowData.Path, rowData.Transaction, FileAccess.Write))
                {
                    //Copy the posted stream to the SQLFileStream
                    entity.PostedStream.CopyTo(dest);
                }

                tx.Complete();
            }
            catch
            {
                //Error uploading the stream; revert the saved entity
                base.Remove(entity);
                _ctx.SaveChanges();
                throw;
            }
        }
        return entity;
    }
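
For reference, FileStreamData and FileRowSelect are used above but not shown in the question. Assuming the standard SQL Server FILESTREAM pattern (PathName() plus GET_FILESTREAM_TRANSACTION_CONTEXT()), they would look something like this; the table and column names are placeholders:

    //DTO holding the path/transaction pair that SqlFileStream needs
    public class FileStreamData
    {
        public string Path { get; set; }        //from FileData.PathName()
        public byte[] Transaction { get; set; } //from GET_FILESTREAM_TRANSACTION_CONTEXT()
    }

    //Placeholder table/column names; the actual query is not in the question
    private const string FileRowSelect =
        @"SELECT FileData.PathName() AS [Path],
                 GET_FILESTREAM_TRANSACTION_CONTEXT() AS [Transaction]
          FROM dbo.Files
          WHERE Id = @id";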

[Screenshot: memory profiler showing HttpRawUploadedContent growing with each upload]

    You need to find another solution. Web servers are not designed to handle this type of upload size. HTTP as a protocol isn't designed to handle this type of upload size, to be honest. – Chris Pratt May 15 '15 at 17:38
  • Do you have requestLengthDiskThreshold in your web.config? – Kadir May 15 '15 at 17:51
  • Yes it is set to 204800 – Tony May 15 '15 at 17:52
  • @Chris Pratt The more I think about it, the more I am annoyed by the first response. Not only is it bad information but it doesn't address the question at all. I would love to read in the HTTP specification where it says it was designed only for small file uploads. Furthermore, what does HTTP have to do with this question at all? What a stupid response. The requirement is to upload large files and store them in a database. Instead of posting your misinformed opinion as if it were fact and stating "find another solution", how about suggesting another way that fulfills the requirements? – Tony May 15 '15 at 18:30
  • Remember that HTTP as a spec was created back in the early 90's, before there even was such a thing as a "web". HTTP is designed to be quick and efficient for *short* communications. It was never designed for nor intended to be used for massive extended communication sessions uploading hundreds of megabytes worth of data. That's just a simple fact. What does this have to do with anything? It's the freaking protocol you're attempting to use to upload the files. Why do you think the FTP protocol exists? – Chris Pratt May 15 '15 at 19:34
  • As for sites like YouTube: well, they have to use what they have to use. For better or for worse, that means HTTP. But one difference between them and you is *massive* amounts of money to throw at the problem. They can buy servers with terabytes of RAM, large super-fast hard drives, CDN storage, etc. They also often buy direct-connect pipes so that there's no interchange between their datacenter and a major pipe provider like Level3. That makes a lot of things possible on a protocol that wasn't designed for it. – Chris Pratt May 15 '15 at 19:37
  • You still have provided no proof that HTTP was not designed for large file uploads. The fact you even brought up FTP shows me that you don't understand the problem. FTP gains absolutely nothing in this scenario. If I wanted the user to be able to resume uploads I would use FTP. The "freaking protocol" used has nothing to do with the question or the problem. The problem has to do with a memory issue in either .NET, IIS, or my code. I am talking about uploading ~4 files of less than 200MB each; YouTube has millions of uploads up to 2GB. Obviously, the "freaking protocol" can support my needs. – Tony May 15 '15 at 20:22
  • @kad1r Thanks for your response; you got me pointed in the right direction. If you post an answer I will accept it. – Tony May 15 '15 at 21:42
  • I'm glad it helped you fix it. – Kadir May 15 '15 at 22:31

1 Answer


The credit should go to @kad1r, as he pointed me in the right direction (i.e., the problem was not HTTP).

It turns out I did not understand what requestLengthDiskThreshold actually does.

The RequestLengthDiskThreshold should be less than the MaxRequestLength value and indicates at what point, or 'threshold', the request will begin to be buffered transparently onto disk.

http://forums.asp.net/t/1680176.aspx?httpRuntime+maxRequestLength+vs+requestLengthDiskThreshold+

By increasing this value, ASP.NET keeps more of each request buffered in memory rather than on disk. Both settings are specified in KB, so my requestLengthDiskThreshold of 204800 (200 MB) meant every upload was held entirely in memory; lowering the threshold lets large requests buffer transparently to disk instead.
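
For illustration, the relevant web.config settings might look like this. The 8 MB threshold is an arbitrary example, not a recommendation; note that both httpRuntime values are in KB, while maxAllowedContentLength (required on IIS 7+) is in bytes:

    <system.web>
      <!-- Values in KB: allow requests up to 200 MB, but buffer anything
           larger than ~8 MB to disk instead of holding it in memory -->
      <httpRuntime maxRequestLength="204800" requestLengthDiskThreshold="8192" />
    </system.web>
    <system.webServer>
      <security>
        <requestFiltering>
          <!-- Value in bytes: 200 MB -->
          <requestLimits maxAllowedContentLength="209715200" />
        </requestFiltering>
      </security>
    </system.webServer>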
