
I have a folder structure that is about 7 levels deep and contains about 1 million files of various sizes and formats. Everything in it that could be compressed already has been. The whole thing is about 350 GB.

I want to push this entire directory into cloud storage.

I was wondering if there's a way to package the whole directory into a tar or rar file (or something similar), chunk it up, and send it. Then I could reassemble the chunks in a cloud service and unpackage if needed. I'd like to do this as quickly as possible.

Is there a way, using System.IO.Packaging for example, to NOT compress the directory structure but just package it? I think there is, but I don't know whether that approach is efficient, or in fact whether I'm thinking about this the right way.
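
For illustration, something like the following is roughly what I have in mind (an untested sketch; the paths are placeholders, and it uses CompressionOption.NotCompressed so nothing gets recompressed):

    using System;
    using System.IO;
    using System.IO.Packaging; // .NET Framework: add a reference to WindowsBase.dll

    class PackWithoutCompression
    {
        static void Main()
        {
            // Placeholder paths - substitute your own source folder and output file.
            string sourceDir = @"C:\BigFolder";
            string packagePath = @"C:\BigFolder.pack";

            using (Package package = Package.Open(packagePath, FileMode.Create))
            {
                foreach (string file in Directory.EnumerateFiles(
                             sourceDir, "*", SearchOption.AllDirectories))
                {
                    // Turn the path relative to sourceDir into a part URI ("/sub/dir/file.ext").
                    // File names with spaces or other special characters may need escaping first.
                    string relative = file.Substring(sourceDir.Length).Replace('\\', '/');
                    Uri partUri = PackUriHelper.CreatePartUri(new Uri(relative, UriKind.Relative));

                    // NotCompressed stores the bytes as-is, so already-compressed
                    // content is not recompressed.
                    PackagePart part = package.CreatePart(
                        partUri, "application/octet-stream", CompressionOption.NotCompressed);

                    using (FileStream source = File.OpenRead(file))
                    using (Stream destination = part.GetStream())
                    {
                        source.CopyTo(destination);
                    }
                }
            }
        }
    }

Whether building a single package of this size is actually practical is part of what I'm unsure about.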

Maybe there's a better way to push this large directory structure into the cloud?

Nicros

1 Answer


If your goal is to copy your folder structure from your file system to cloud storage, you can use the AzCopy command-line utility to do that. This utility is designed to simplify transferring data into and out of an Azure storage account. You can read more about AzCopy on our blog post here and here.
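
For example, with a recent AzCopy (v10-style syntax) you could upload the whole folder recursively to a blob container; the account, container, and SAS token below are placeholders:

    azcopy copy "C:\BigFolder" "https://<account>.blob.core.windows.net/<container>?<SAS-token>" --recursive

AzCopy transfers the files individually and preserves the folder hierarchy, so you shouldn't need to package or chunk the directory yourself first.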