
I have a very large directory tree sitting in the file system of a Google Compute instance. If I do a "find foo -type f | wc -l" on it, it comes back with 34577, and I have no reason to assume that number is wrong.

Yet when I do a "gsutil -m cp -r foo gs://bar" on it, the bucket only shows 14557 files. Worse, if I then delete what I put in the bucket and instead do a "gsutil -m cp -r foo/* gs://bar/foo/*", there are only 14176 files, and whole branches of the tree are omitted; they don't even show up in gsutil's terminal output!
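To pin down exactly which paths were skipped, one approach is to compare a sorted local listing against a sorted listing of the bucket with comm(1). This is a sketch: the bucket-side listing would really come from something like "gsutil ls -r gs://bar/foo/**", which is simulated below with a hand-written file, and the "demo" tree and filenames are made up for illustration.

```shell
# Build a small demo tree standing in for "foo".
mkdir -p demo/foo/a demo/foo/b
touch demo/foo/a/1.txt demo/foo/a/2.txt demo/foo/b/3.txt

# Local side: one relative path per line, sorted.
(cd demo && find foo -type f | sort > local.txt)

# Remote side: in reality something like
#   gsutil ls -r gs://bar/foo/** | sed 's|gs://bar/||' | sort > remote.txt
# Here we simulate a bucket that is missing one file.
printf 'foo/a/1.txt\nfoo/b/3.txt\n' > demo/remote.txt

# comm -23 prints lines present only in the first (local) listing,
# i.e. the paths that never made it to the bucket.
comm -23 demo/local.txt demo/remote.txt
# → foo/a/2.txt
```

With the real listings in place, the comm output is the set of skipped files, which makes patterns (a common parent directory, unusual characters in names) easy to spot.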

Can somebody explain what's going on?

hbquikcomjamesl
  • What files were skipped? – Michael Hampton Dec 07 '18 at 01:24
  • Whole branches of the directory tree are not being uploaded. If the file count in the directory tree is accurate, over ten thousand files, too numerous to single out. One exception was thrown, because of a funky filename, but uploads continued after that exception, so that seems unlikely to be a root cause. – hbquikcomjamesl Dec 07 '18 at 16:34
