
I am trying to download a file from a remote Linux server to my local computer using SftpClient.

Here is my code to download the file

        public MemoryStream DownloadFile2(string path)
        {
            var connectionInfo = _taskService.GetBioinformaticsServerConnection();
            MemoryStream fileStream = new MemoryStream();
                        
            using (SftpClient client = new SftpClient(connectionInfo))
            {
                client.ConnectionInfo.Timeout = TimeSpan.FromSeconds(200);
                client.Connect();

                
                client.DownloadFile(path, fileStream);
                fileStream.Seek(0, SeekOrigin.Begin);
                
                var response = new MemoryStream(fileStream.GetBuffer());
                return fileStream;
            }
        }

And here is the controller that calls the method above.

        public FileResult DownloadFile(string fullPath, string fileName)
        {
            if (!string.IsNullOrEmpty(fileName))
            {
                fullPath = string.Concat(fullPath, "/", fileName);
            }
            var ms = _reportAPI.DownloadFile2(fullPath);

            var ext = Path.GetExtension(fullPath);
            if (ext == ".xlsx")
            {
                return File(ms, "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet", fileName);
            }
            return File(ms, "application/octet-stream", fileName);            
        }

I have managed to do this for most files; however, for certain large '.xlsx' files, when I try to open them after downloading, I receive the error below.

(screenshot of the Excel error prompt)

If I'm running on IIS Express, I can still open the file after clicking the 'Yes' button, but on regular IIS the file fails to open even after clicking 'Yes'.

For other file types, or for smaller Excel files, it works as expected.

Any idea how I can modify my code to solve this issue?

sicKo
  • I've encountered issues like this with large files when the ftp connection was interrupted by the server because the file was larger than some maximum number of bytes it was configured to serve (it was around 100MB). – David Waterworth Aug 26 '20 at 05:52
  • Also show how you call this code. And is the file intact when you download it directly, using a different application? How large are the files that cause problems, and does the entire file get downloaded? – CodeCaster Aug 26 '20 at 06:00
  • @DavidWaterworth how did you resolve it? – sicKo Aug 26 '20 at 06:15
  • @CodeCaster yes, the file is intact and the entire file got downloaded. The size is just 2 MB and above. This is an MVC application, so it's downloaded from the same application via the Razor UI. I have updated the post above. – sicKo Aug 26 '20 at 06:17
  • @sicKo it was a limitation at the remote end, I had to ring and point out to them that the files were larger than the maximum size limit they'd set. – David Waterworth Aug 26 '20 at 06:18
  • @DavidWaterworth in my case, I don't think that is an issue, as I can download the file successfully using PuTTY or WinSCP. Thanks anyway :) – sicKo Aug 26 '20 at 06:49
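
One way to confirm that the entire file really is transferred, as asked in the comments above, is to compare the remote file's size against the number of bytes that end up in the stream. A minimal sketch, assuming SSH.NET's SftpClient.GetAttributes, the same class context as DownloadFile2, and a hypothetical helper name VerifyDownloadSize:

        public bool VerifyDownloadSize(string path)
        {
            var connectionInfo = _taskService.GetBioinformaticsServerConnection();
            using (SftpClient client = new SftpClient(connectionInfo))
            {
                client.Connect();

                // Size the SFTP server reports for the remote file.
                long remoteSize = client.GetAttributes(path).Size;

                // Bytes actually written into the MemoryStream by the download.
                long downloadedSize = DownloadFile2(path).Length;

                return remoteSize == downloadedSize;
            }
        }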

1 Answer


I was able to resolve this by modifying my code as shown below:

        public MemoryStream DownloadFile2(string path)
        {
            var connectionInfo = _taskService.GetBioinformaticsServerConnection();
            MemoryStream fileStream = new MemoryStream();
            byte[] fileBytes = null;
            using (SftpClient client = new SftpClient(connectionInfo))
            {
                client.ConnectionInfo.Timeout = TimeSpan.FromSeconds(200);
                client.Connect();

                client.DownloadFile(path, fileStream);

                // ToArray() copies only the bytes that were actually written, unlike
                // GetBuffer(), which exposes the whole internal buffer (usually larger
                // than the data and padded with trailing zeros).
                fileBytes = fileStream.ToArray();

                // Return a fresh stream over just those bytes, positioned at the start.
                var response = new MemoryStream(fileBytes);
                return response;
            }
        }
sicKo
  • FYI that `.ToArray()` will cause the entire contents of the file to load in memory. If the files get extremely large, or this server handles many of the same type of request at once, that may become an issue. – Gus Aug 26 '20 at 15:26
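
If memory usage does become a concern, one option is to stream the remote file straight into the HTTP response instead of buffering it all in a MemoryStream. A minimal sketch, assuming classic ASP.NET MVC on System.Web (so `Response.OutputStream` is available), SSH.NET's SftpClient, and that the controller can reach the connection info the same way the service does; the action name DownloadFileStreamed is hypothetical:

        public void DownloadFileStreamed(string fullPath, string fileName)
        {
            var connectionInfo = _taskService.GetBioinformaticsServerConnection();
            using (var client = new SftpClient(connectionInfo))
            {
                client.ConnectionInfo.Timeout = TimeSpan.FromSeconds(200);
                client.Connect();

                Response.ContentType = "application/octet-stream";
                Response.AddHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");

                // DownloadFile writes into the response stream as bytes arrive from the
                // SFTP server, so only a small buffer is held in memory at any time.
                client.DownloadFile(fullPath, Response.OutputStream);
                Response.Flush();
            }
        }

Whether that extra plumbing is worth it depends on file sizes and concurrency; for files of a few megabytes, the buffered approach in the answer is fine.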