Google cloud storage equivalent for S3 multipart upload

Time:04-12

We are migrating from S3 to GCS, and one of the capabilities we need to support in GCS is uploading large files in a multi-part fashion. S3's boto library provides functions such as initiate_multipart_upload and copy_part_from_key, which we currently use to upload large files in multiple parallel chunks.

I have seen similar discussions on Stack Overflow in the two questions below.

Both discussions point to documentation describing an XML API for this. However, I'm looking for a Python-based implementation that uses storage.Client() methods, to stay consistent with the rest of our integrations.

Appreciate any help on this.

CodePudding user response:

Unfortunately, multipart upload in Cloud Storage is only available through the XML API.

Alternatively, you could use resumable uploads, which are the recommended way to upload large files with the GCS client library.

Finally, you could also open a feature request in Google's Issue Tracker so that it can be considered by their product team.

CodePudding user response:

You can upload the parts simultaneously as separate objects and merge them at the end.

The merge operation is called "composition" in Google Cloud Storage: https://cloud.google.com/storage/docs/composing-objects#create-composite-rest

Boto3 has no equivalent of this operation.
