I have a folder structure that is about 7 levels deep and contains about 1 million files of various sizes and formats. Everything in it that could be compressed already has been. In total it's about 350 GB.
I want to push this entire directory into cloud storage.
I was wondering if there's a way to package the whole directory into a tar or rar file or something similar, split it into chunks, and upload those. Then I could reassemble the chunks in the cloud service and unpackage the archive if needed, doing all of this as quickly as possible.
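For the chunking step, I was picturing something simple like the sketch below (untested; the 100 MB chunk size, the placeholder path, and the `.0000`-style suffixes are arbitrary choices of mine), where restoring the archive is just concatenating the chunks back in order:

```csharp
using System;
using System.IO;

class SplitFile
{
    static void Main(string[] args)
    {
        string packagePath = @"D:\data.tar";       // placeholder archive path
        const long chunkSize = 100L * 1024 * 1024; // 100 MB per chunk (arbitrary)
        byte[] buffer = new byte[4 * 1024 * 1024];

        using (FileStream input = File.OpenRead(packagePath))
        {
            // Emit data.tar.0000, data.tar.0001, ... until the input is exhausted.
            for (int index = 0; input.Position < input.Length; index++)
            {
                string chunkPath = packagePath + "." + index.ToString("D4");
                using (FileStream output = File.Create(chunkPath))
                {
                    long remaining = Math.Min(chunkSize, input.Length - input.Position);
                    while (remaining > 0)
                    {
                        int read = input.Read(buffer, 0, (int)Math.Min(buffer.Length, remaining));
                        output.Write(buffer, 0, read);
                        remaining -= read;
                    }
                }
            }
        }
    }
}
```

Reassembly on the other side would just open the chunks in index order and append them to a single output stream.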
Is there a way, using System.IO.Packaging for example, to NOT compress the directory structure but just package it? I think there is, but I don't know whether that's efficient, or in fact whether I'm thinking about this the right way.
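To make the question concrete, this is roughly what I had in mind (untested sketch; the paths and the `application/octet-stream` content type are placeholders I picked):

```csharp
using System;
using System.IO;
// System.IO.Packaging lives in WindowsBase.dll on .NET Framework,
// or in the System.IO.Packaging NuGet package on newer .NET.
using System.IO.Packaging;

class PackDirectory
{
    static void Main(string[] args)
    {
        string sourceDir = @"D:\data";        // placeholder source root (no trailing slash)
        string packagePath = @"D:\data.pack"; // placeholder output package

        using (Package package = Package.Open(packagePath, FileMode.Create))
        {
            foreach (string file in Directory.EnumerateFiles(
                sourceDir, "*", SearchOption.AllDirectories))
            {
                // Build a part URI from the path relative to the source root.
                // OPC part names have naming restrictions, so unusual file
                // names may need extra handling.
                string relative = file.Substring(sourceDir.Length).Replace('\\', '/');
                Uri partUri = PackUriHelper.CreatePartUri(new Uri(relative, UriKind.Relative));

                // NotCompressed should store the bytes as-is, skipping deflate.
                PackagePart part = package.CreatePart(
                    partUri, "application/octet-stream", CompressionOption.NotCompressed);

                using (Stream src = File.OpenRead(file))
                using (Stream dst = part.GetStream())
                {
                    src.CopyTo(dst);
                }
            }
        }
    }
}
```

What I don't know is whether a single package copes well with ~1 million parts and 350 GB, or whether the per-part overhead kills throughput.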
Maybe there's a better way to push this large directory structure into the cloud?