
I'm using a folder inside my project to store uploaded files:

// Requires: using System.Net.Http.Headers; (for ContentDispositionHeaderValue)
[HttpPost, DisableRequestSizeLimit]
public IActionResult Upload()
{
    try
    {
        var file = Request.Form.Files[0];
        var folderName = Path.Combine("Resources", "Images");
        var pathToSave = Path.Combine(Directory.GetCurrentDirectory(), folderName);

        // Create the target folder if it doesn't exist yet; otherwise the
        // FileStream constructor below throws DirectoryNotFoundException.
        Directory.CreateDirectory(pathToSave);

        if (file.Length > 0)
        {
            // Path.GetFileName strips any directory components the client may have
            // sent in the Content-Disposition header, preventing path traversal.
            var fileName = Path.GetFileName(
                ContentDispositionHeaderValue.Parse(file.ContentDisposition).FileName.Trim('"'));
            var fullPath = Path.Combine(pathToSave, fileName);
            var dbPath = Path.Combine(folderName, fileName);

            using (var stream = new FileStream(fullPath, FileMode.Create))
            {
                file.CopyTo(stream);
            }

            return Ok(new { dbPath });
        }
        else
        {
            return BadRequest();
        }
    }
    catch (Exception ex)
    {
        return StatusCode(500, $"Internal server error: {ex}");
    }
}
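For files saved under Resources/Images to be reachable over HTTP later, the app also needs static-file middleware pointed at that folder; a minimal sketch of such a registration (assumed, not shown above):

// In Startup.Configure; requires Microsoft.Extensions.FileProviders.
app.UseStaticFiles(new StaticFileOptions
{
    FileProvider = new PhysicalFileProvider(
        Path.Combine(Directory.GetCurrentDirectory(), "Resources")),
    RequestPath = "/Resources"
});

With that, the dbPath returned by the action maps to a URL like /Resources/Images/{fileName}.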

I was wondering if there's a risk of losing these files when we ship a new update to the customer. If there's a better solution for uploading a file and getting a link to it afterwards with .NET Core, please let me know :)

  • I would suggest putting all that upload code in a service class, which should make it easier to test and won't bloat your controller. In addition, if you are saving files to the web server itself, you are limited to just the one web server. You may want to look at Azure storage blobs and a database to store the URL to the blob. Azure storage blobs will be replicated for you, so you don't have to worry about backups, etc. – jjxtra Apr 22 '20 at 15:05
  • I thought about this, but the company is using AWS, so I tried an S3 bucket. I couldn't use it for the web API, since it doesn't provide you with a link to the file; you can only download it to your machine (that was my first try). If you know a way to use an S3 bucket in a web service, that would help a lot – Thameur Saadi Apr 22 '20 at 15:12
  • https://docs.aws.amazon.com/AmazonS3/latest/dev/HLuploadFileDotNet.html. Given the path and file, you should be able to reconstruct the full URL with a second call; see the second answer here: https://stackoverflow.com/questions/10975475/amazon-s3-upload-file-and-get-url – jjxtra Apr 22 '20 at 22:28

2 Answers


I was wondering if there's a risk of losing these files when we ship a new update to the customer

Deploying an application means you'll copy the new executables (DLLs) and other files stored in Git to the place where the old version is running. The risk is that you'll do it wrong and delete the data directory.

That said: you should not save user data together with your executables or other files that are part of your app (e.g. images used in HTML). Backups, deployments, and the like are much easier to handle if data is clearly separated.

if there's a better solution

The solution: save the files in a folder that can be configured by admins. This can be done using the so-called Options pattern: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/configuration/options?view=aspnetcore-3.1

You'll end up with a class that stores your path:

public class StorageOptions
{
    public string BasePath { get; set; }
}
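A minimal sketch of how this could be wired up and consumed; the "Storage" section name and the FileStorageService below are illustrative assumptions, not part of the Options pattern itself:

// appsettings.json (assumed section name):
// { "Storage": { "BasePath": "/var/myapp/uploads" } }

// In Startup.ConfigureServices:
// services.Configure<StorageOptions>(Configuration.GetSection("Storage"));

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Options;

public class FileStorageService
{
    private readonly string _basePath;

    public FileStorageService(IOptions<StorageOptions> options)
    {
        _basePath = options.Value.BasePath;
    }

    public async Task<string> SaveAsync(IFormFile file)
    {
        // Ensure the configured folder exists, then save under the original file name.
        Directory.CreateDirectory(_basePath);
        var fileName = Path.GetFileName(file.FileName);
        var fullPath = Path.Combine(_basePath, fileName);

        using (var stream = new FileStream(fullPath, FileMode.Create))
        {
            await file.CopyToAsync(stream);
        }

        return fileName;
    }
}

Since the path now comes from configuration, admins can point it at a directory outside the deployment folder, so a new release never touches the uploaded files.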
Christoph Lütjen

So in the end I decided to use an AWS S3 bucket, using this code:

// Requires the AWSSDK.S3 NuGet package (Amazon.S3, Amazon.S3.Model);
// "client" is an IAmazonS3 / AmazonS3Client instance.
PutObjectResponse response;

using (var stream = new MemoryStream(fileBytes))
{
    var request = new PutObjectRequest
    {
        BucketName = bucket,
        Key = "folder/" + fileName,
        InputStream = stream,
        ContentType = file.ContentType,
        // PublicRead makes the object directly reachable via its URL.
        CannedACL = S3CannedACL.PublicRead
    };

    response = await client.PutObjectAsync(request);
}

And as you mentioned in the comments, I can construct the link to the file afterwards.
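Because the object is uploaded with a public-read ACL, its URL can be built from the bucket, region, and key; a sketch (the region value is a placeholder):

// Public objects follow the standard S3 URL pattern:
var fileUrl = $"https://{bucket}.s3.{region}.amazonaws.com/folder/{fileName}";

// For private objects, a time-limited link could be generated instead:
var presignedUrl = client.GetPreSignedURL(new GetPreSignedUrlRequest
{
    BucketName = bucket,
    Key = "folder/" + fileName,
    Expires = DateTime.UtcNow.AddHours(1)
});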

Thameur Saadi