We are migrating from S3 to GCS, and one capability we need to preserve is uploading large files in multiple parts. boto's S3 API provides methods such as `initiate_multipart_upload` and `copy_part_from_key`, which I currently use to upload large files in multiple parallel chunks.
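For context, here is a trimmed sketch of the current boto-based flow (bucket and file names are placeholders, and the per-part worker pool is elided; `copy_part_from_key` is used the same way when the source already lives in S3):

```python
import math
import os

import boto

conn = boto.connect_s3()
bucket = conn.get_bucket('my-source-bucket')  # placeholder bucket name

source_path = '/tmp/large-file.bin'           # placeholder local file
part_size = 50 * 1024 * 1024                  # 50 MB per part
file_size = os.path.getsize(source_path)
num_parts = int(math.ceil(file_size / float(part_size)))

mp = bucket.initiate_multipart_upload('large-file.bin')
try:
    with open(source_path, 'rb') as fp:
        # In the real code each part is uploaded by a separate worker.
        for i in range(num_parts):
            fp.seek(i * part_size)
            mp.upload_part_from_file(
                fp, part_num=i + 1,
                size=min(part_size, file_size - i * part_size))
    mp.complete_upload()
except Exception:
    mp.cancel_upload()
    raise
```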
I have seen similar discussions on Stack Overflow in the two questions below:
- Google cloud storage compatibility with aws s3 multipart upload
- Google Cloud Storage support of S3 multipart upload
Both discussions point to this documentation, which describes an XML API for multipart uploads. However, I'm looking for a Python-based implementation that uses `storage.Client()` methods, to stay consistent with the rest of our integrations.
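To illustrate the shape of what I'm after, the closest I can piece together from the Python client is uploading the chunks as separate objects and stitching them with `compose()` (a sketch only; the bucket and file names are hypothetical, and GCS caps `compose()` at 32 source objects per call, so larger files would need staged composes):

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket('my-gcs-bucket')  # placeholder bucket name

chunk_paths = ['/tmp/part-000', '/tmp/part-001']  # pre-split chunks
chunk_blobs = []
for i, path in enumerate(chunk_paths):
    blob = bucket.blob('large-file.bin.part-%03d' % i)
    with open(path, 'rb') as fp:
        blob.upload_from_file(fp)  # each chunk could be uploaded in parallel
    chunk_blobs.append(blob)

# Stitch the chunk objects into the final object.
destination = bucket.blob('large-file.bin')
destination.compose(chunk_blobs)

# Remove the intermediate chunk objects.
for blob in chunk_blobs:
    blob.delete()
```

If the Python client has a more direct equivalent of boto's multipart API than this compose-based workaround, that would be ideal.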
Appreciate any help on this.