
I'm uploading a file that is 8.6 TB in size.

$ nohup gsutil -o GSUtil:parallel_composite_upload_threshold=150M cp big_file.jsonl gs://bucket/big_file.jsonl > nohup.mv-big-file.out 2>&1 &

At some point, it just hangs, with no error messages, nothing.

Any suggestions on how I can move this large file from the box to the GS bucket?

Bob van Luijt
    1) Unless you have a dedicated gigabit Internet connection, uploading 8 TB will take months to complete. 2) The maximum Google Cloud Storage object size is 5 TB. You cannot complete the upload of 8.6 TB successfully. – John Hanley Oct 11 '22 at 08:02

1 Answer


As @John Hanley mentioned, the maximum size for an individual object stored in Cloud Storage is 5 TB, as stated in Buckets and Objects Limits.

Here are some workarounds you can try:

  1. You can split the file and upload the pieces as separate objects in a single bucket, since there is no limit on the total size of a bucket.

  2. You can use Parallel composite uploads, which split a file into up to 32 chunks, upload them in parallel, and compose them into a single object. Note that the composed object is still subject to the 5 TB per-object limit.

  3. You may also consider Transfer Appliance for faster, higher-capacity transfers into Cloud Storage.
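The first workaround can be sketched with `split`. This is a demo on a small stand-in file, not your real data; for the actual 8.6 TB file you would use a piece size well under the 5 TB object limit (e.g. `split -b 1T`), and the bucket path is the one from your question:

```shell
# Stand-in for the real big_file.jsonl (24 bytes, so the split is visible).
printf 'line1\nline2\nline3\nline4\n' > big_file.jsonl

# Split into fixed-size pieces with numeric suffixes: .part-00, .part-01, ...
# For the real file, use e.g. `split -b 1T -d` so each piece stays under 5 TB.
split -b 12 -d big_file.jsonl big_file.jsonl.part-

ls big_file.jsonl.part-*

# Then upload all pieces in one parallel invocation; -m parallelizes
# across files (commented out here, since it needs a real bucket):
# nohup gsutil -m cp big_file.jsonl.part-* gs://bucket/ > nohup.out 2>&1 &
```

On the download side, `cat big_file.jsonl.part-* > big_file.jsonl` reassembles the original, since `split` preserves byte order across the numerically sorted pieces.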

You might also want to take a look at GCS's best practices documentation.

Marvin Lucero