
In my file manager, I need to offer the capability of downloading files. I need to be able to select both individual files and directories. Here is an example:

/www/index.html
/www/images/
/www/styles.css

If I select those 3 items (2 files and 1 folder), I need to add them all to a ZIP archive. I already have a working example that uses DownloadFolder() and DownloadFile(). However, it goes like this:

  1. Download each file to disk
  2. If there are any folders, recursively look through them and download those files to their respective folders (automatically done)
  3. Call System.IO.Compression.ZipFile.CreateFromDirectory() to ZIP the downloaded files to a ZIP archive
  4. Delete the downloaded files from before
  5. Stream the ZIP file back using new FileStream(zipFile, FileMode.Open, FileAccess.Read, FileShare.None, 4096, FileOptions.DeleteOnClose) so the ZIP file gets deleted afterwards
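The five steps above, roughly as code. This is only a sketch: `DownloadFolder()` is presumably a custom wrapper (FluentFTP's built-in equivalent is `DownloadDirectory()`), and the path handling is illustrative.

```csharp
using System.IO;
using System.IO.Compression;
using FluentFTP;

public static class TempFolderZip
{
    public static Stream BuildZipViaTempFolder(FtpClient client, string[] remotePaths)
    {
        string tempDir = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        Directory.CreateDirectory(tempDir);

        foreach (string remote in remotePaths)
        {
            // Steps 1-2: download files (and folders, recursively) to disk.
            string local = Path.Combine(tempDir, Path.GetFileName(remote.TrimEnd('/')));
            if (remote.EndsWith("/"))
                client.DownloadDirectory(local, remote, FtpFolderSyncMode.Update);
            else
                client.DownloadFile(local, remote);
        }

        // Step 3: zip the downloaded tree.
        string zipFile = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".zip");
        ZipFile.CreateFromDirectory(tempDir, zipFile);

        // Step 4: delete the downloaded files.
        Directory.Delete(tempDir, recursive: true);

        // Step 5: stream the archive back; DeleteOnClose removes it
        // from disk once the returned stream is disposed.
        return new FileStream(zipFile, FileMode.Open, FileAccess.Read,
            FileShare.None, 4096, FileOptions.DeleteOnClose);
    }
}
```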

This is quite bad: I first have to download the files, add them to an archive, delete the files I just downloaded, stream the archive to the user, and finally delete the archive to clean up. What would be better:

  1. Tell FluentFTP which files to stream
  2. Create a ZIP archive ON DISK
  3. Add each file and directory recursively to the archive
  4. Stream the archive back and delete the file afterwards

By doing this, I should be able to produce very, very large files (100+ GB if that's the case), and all I would have to care about is temporary storage until the archive has been deleted.

I wasn't able to find any information on how to do this, so something tells me I need to call the GetListing() method with the FtpListOption.Recursive flag, then create each directory "manually", and finally call the Download() method, which writes to a stream.
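That idea could be sketched like this: stream each remote file directly into a ZIP entry in an archive on disk, so only one entry's compression buffer is in memory at a time. `GetListing()`, `GetObjectInfo()`, and `Download(Stream, path)` are real FluentFTP methods, but the selection and entry-naming logic here is illustrative, not tested against a server.

```csharp
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using FluentFTP;

public static class StreamingZip
{
    public static void ZipRemoteItems(FtpClient client, IEnumerable<string> selected, string zipPath)
    {
        using var zipStream = new FileStream(zipPath, FileMode.Create, FileAccess.Write);
        // ZipArchiveMode.Create writes entries sequentially, straight to disk.
        using var archive = new ZipArchive(zipStream, ZipArchiveMode.Create);

        foreach (string path in selected)
        {
            var item = client.GetObjectInfo(path);
            if (item != null && item.Type == FtpFileSystemObjectType.Directory)
            {
                // GetListing with FtpListOption.Recursive walks subfolders for us.
                foreach (var child in client.GetListing(path, FtpListOption.Recursive))
                {
                    if (child.Type == FtpFileSystemObjectType.File)
                        AddEntry(client, archive, child.FullName);
                }
            }
            else
            {
                AddEntry(client, archive, path);
            }
        }
    }

    static void AddEntry(FtpClient client, ZipArchive archive, string remotePath)
    {
        // Entry names must be relative, so strip the leading slash.
        var entry = archive.CreateEntry(remotePath.TrimStart('/'), CompressionLevel.Fastest);
        using Stream entryStream = entry.Open();
        client.Download(entryStream, remotePath); // streams FTP -> ZIP entry
    }
}
```

The caller can then return the finished archive with the same `FileOptions.DeleteOnClose` FileStream trick as before.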

Are there any better ways, though?

MortenMoulder
  • Do I understand correctly that you want to avoid having all the files on disk before you start compressing them? So that, in the end, you don't have everything on disk twice (compressed and uncompressed)? – Martin Prikryl Jun 18 '20 at 13:53
  • @MartinPrikryl I want to avoid writing anything to memory. I have plenty of disk space, so that won't be a problem. I would, however, prefer to stream the files via FluentFTP directly into the ZIP file on disk, so it uses as little memory as possible (if any). – MortenMoulder Jun 18 '20 at 13:55
  • Then I do not understand why you aren't happy with what you already have. Streaming files from FTP to the ZIP will only save you disk space, not memory. – Martin Prikryl Jun 18 '20 at 13:59
  • @MartinPrikryl Because if I try to zip a 50 GB file (a single file) using `ZipFile.CreateFromDirectory`, it will eventually throw an OutOfMemoryException. I was wondering if there was a way to bypass memory and write directly to disk instead. – MortenMoulder Jun 18 '20 at 14:13
  • I doubt it will help. If you want to test anyway, try `ZipFile.CreateFromDirectory` vs. `File.Open(biglocalfile).CopyTo(ZipFile.Open(archiveFileName, ZipArchiveMode.Create).CreateEntry(...).Open())` (streaming local file to ZIP). – Martin Prikryl Jun 18 '20 at 14:38
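The one-liner in the comment above, expanded into a disposal-safe form (file names and sizes are placeholders standing in for the 50 GB file). Because `ZipArchiveMode.Create` writes entries sequentially to disk, `CopyTo` moves data in small chunks rather than buffering the whole file in memory:

```csharp
using System.IO;
using System.IO.Compression;

// Stand-in for a huge local file (placeholder name and size).
string bigLocalFile = "big.bin";
File.WriteAllBytes(bigLocalFile, new byte[1024 * 1024]);

using (var archive = ZipFile.Open("big.zip", ZipArchiveMode.Create))
using (Stream entryStream = archive.CreateEntry("big.bin", CompressionLevel.Fastest).Open())
using (FileStream source = File.OpenRead(bigLocalFile))
{
    // CopyTo streams in ~80 KB chunks; the whole file is never held in memory.
    source.CopyTo(entryStream);
}
```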

0 Answers