I have thousands of files in one GCS bucket, and I want to copy a list of n of them using the gsutil -m cp command. From reading the documentation, I see this can be done from the shell:
cat filelist | gsutil -m cp -I gs://my_bucket
If I try the equivalent in my Python script, I get "Argument list too long" for a list of more than a few hundred files:
import subprocess

alist = [f'{_}.txt' for _ in range(1000000)]
alist_in_str = '\n'.join(alist)
# Fails with "Argument list too long" because the whole joined list is passed as a single argument
subprocess.call(['printf', alist_in_str, '| gsutil -m cp -I gs://my_bucket'])
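I was thinking that feeding the file names to gsutil on stdin, instead of putting them on the command line, might avoid the limit. Here is a rough, untested sketch of what I mean (gs://my_bucket and the .txt names are just placeholders), but I am not sure whether this is the right approach:

import subprocess

# Sketch: pass the file names to gsutil over stdin via the -I flag,
# so they never appear as command-line arguments.
filenames = [f'{i}.txt' for i in range(1000000)]
subprocess.run(
    ['gsutil', '-m', 'cp', '-I', 'gs://my_bucket'],
    input='\n'.join(filenames),
    text=True,
    check=True,
)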
What is an efficient way to copy a list of files with gsutil from a Python script?