
Is there a way to copy or move files in batches using the gsutil command? For example, if I want to copy 100 files from a given folder to another.

user101010

3 Answers


Another way of doing it is with the client libraries. For example, in Python:

from google.cloud import storage

storage_client = storage.Client()

bucket_name = 'my_bucket'
bucket = storage_client.get_bucket(bucket_name)

# Collect every object whose name starts with the source "folder" prefix
blobs_to_move = list(bucket.list_blobs(prefix="folder1/"))

# Batch the requests so they are sent together when the block exits
with storage_client.batch():
    for blob in blobs_to_move[:100]:
        # copy to the new destination, swapping the "folder1/" prefix for "folder2/"
        new_blob = bucket.copy_blob(blob, bucket, "folder2/" + blob.name[len("folder1/"):])
        # delete the original so the net effect is a move
        blob.delete()

This will move the first 100 files (in listing order, which is lexicographic by object name) from folder1 in the GCS bucket my_bucket to folder2.
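
If you want a quick sanity check afterwards, one option (using the same bucket and prefix names as the example above) is to count what now sits under folder2/ with gsutil:

# the "**" wildcard matches objects at any depth under the prefix
gsutil ls "gs://my_bucket/folder2/**" | wc -l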

Maxim

Try this:

gsutil ls gs://bucketA | head -n 100 | shuf | gsutil -m cp -I gs://bucketB

This gets a listing of the objects in bucketA, takes the first 100 entries, shuffles them with shuf, and pipes them into gsutil to copy to bucketB. The -I flag tells gsutil to read the list of URLs from stdin, and the top-level -m flag runs the copies in parallel.
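
If you would rather review the selection before anything is copied, a minor variation of the same pipeline (the temporary filename is just illustrative) is to stage the listing in a local file and then hand it to gsutil on stdin, which is what -I reads from:

gsutil ls gs://bucketA | head -n 100 > to_copy.txt
# inspect to_copy.txt if you like, then:
gsutil -m cp -I gs://bucketB < to_copy.txt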

Travis Webb

A slight modification so that it moves a random 100 files, instead of the first 100, from bucketA to bucketB:

gsutil ls gs://bucketA | shuf | head -n 100 | gsutil -m mv -I gs://bucketB

ZygD