
I am trying to automate backups to a remote GCP bucket. I am using the gsutil CLI and Crontab to schedule transfers of all files in a given directory. I would like to compress the files as they are being transferred, and was led to believe that the following command would work:

```
gsutil -m cp -r -z * [local_path] gs://[GCP Path]
```

This, however, does not appear to be the case: the files are being copied individually without compression. Can someone provide the exact command, along with a direct link to the GCP documentation where this is noted?

Thanks!

1 Answer


Depending on the file extensions, you might need to change the lowercase z to uppercase Z in your command.

The -z (lowercase z) option applies gzip content-encoding to any file upload whose extension matches the -z extension list. This is useful when uploading files with compressible content such as .js, .css, or .html files.
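Note also that -z expects a comma-separated extension list as its argument, so in your original command the shell expands the bare `*` and the first match is consumed as that list, which would explain why the files were uploaded uncompressed. A corrected form (the extension list here is illustrative; the paths are the same placeholders from your question) would be:

```
# Compress only uploads whose extension appears in the -z list; matching
# objects are stored in the bucket with Content-Encoding: gzip
gsutil -m cp -r -z js,css,html [local_path] gs://[GCP Path]
```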

The -Z (capital Z) option applies gzip content-encoding to file uploads. This option works like the -z option described above, but it applies to all uploaded files, regardless of extension.
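Both descriptions above are quoted from the gsutil cp documentation, which is the direct link you asked for: https://cloud.google.com/storage/docs/gsutil/commands/cp. With -Z, no extension list is needed (paths again are placeholders):

```
# Apply gzip content-encoding to every uploaded file, regardless of extension
gsutil -m cp -r -Z [local_path] gs://[GCP Path]
```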

John Hanley
  • I ran `gsutil -m cp -r -Z * [local_path] gs://[GCP Path]` and this did not compress any of my files. – chargingBrontosaurus May 11 '23 at 19:33
  • @chargingBrontosaurus - Are the files compressible? Some file types are not compressible, and some are already compressed. Edit your question with more details. Test with a single file. – John Hanley May 11 '23 at 19:37
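As suggested in the comment above, a minimal single-file test (test.txt is an illustrative file name, not from the thread) would upload one known-compressible file and then inspect the stored object's metadata; a successful compressed upload shows Content-Encoding: gzip in the gsutil stat output:

```
# Upload a single compressible text file with forced gzip content-encoding
gsutil cp -Z test.txt gs://[GCP Path]/test.txt

# Inspect the stored metadata; look for "Content-Encoding: gzip"
gsutil stat gs://[GCP Path]/test.txt
```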