We can copy objects to a different bucket in the same namespace using:
CopyObjectRequest copyReq = CopyObjectRequest.builder()
        .sourceBucket(fromBucket)
        .sourceKey(objectKey)
        .destinationBucket(toBucket)
        .destinationKey(objectKey)
        .build();
CopyObjectResponse copyRes = s3.copyObject(copyReq);
But if I need to transfer a file to another namespace (i.e. with different connection details), how can that be achieved?
CodePudding user response:
When using copyObject(), you must use a single set of credentials that has Read permission on the source bucket and Write permission on the destination bucket.
Assuming that you want to copy the object between buckets owned by different AWS Accounts, then your options are:
Option 1: Push
This option uses credentials from the AWS Account that owns the 'source' bucket. You will need:
- Permission on the IAM User/IAM Role to GetObject from the source bucket
- A Bucket Policy on the destination bucket that permits PutObject from the IAM User/IAM Role being used
- I recommend you also set ACL=bucket-owner-full-control to give the destination bucket's owner ownership of the object
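A destination-bucket policy for the Push option might look like the following sketch. The account ID, role name, and bucket name are placeholders you would replace with your own values:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCrossAccountPut",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:role/source-copy-role" },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::destination-bucket/*"
    }
  ]
}
```

Note that the Resource must end in /* so that it applies to objects in the bucket, not the bucket itself.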
Option 2: Pull
This option uses credentials from the AWS Account that owns the 'destination' bucket. You will need:
- A Bucket Policy on the source bucket that permits GetObject from the IAM User/IAM Role being used
- Permission on the IAM User/IAM Role to PutObject to the destination bucket
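For the Pull option, the source bucket would carry a policy along these lines (again, the account ID, role name, and bucket name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCrossAccountGet",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::222222222222:role/destination-copy-role" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::source-bucket/*"
    }
  ]
}
```

The Pull option has the advantage that the destination account automatically owns the copied object, so no ACL is needed.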
Your situation
You mention "different connection details", so I assume that these are credentials that have been given to you by the owner of the 'other' account. As per the options above:
- If your account is the source account, add a Bucket Policy on your source bucket that permits GetObject using those credentials
- If your account is the destination account, add a Bucket Policy on your destination bucket that permits PutObject using those credentials, and set ACL=bucket-owner-full-control on the copy
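Putting it together, a cross-account copy in AWS SDK for Java 2.x might look like the sketch below. It uses an explicit set of static credentials (the "different connection details" you were given) and sets the bucket-owner-full-control canned ACL on the copy. The bucket names, keys, region, and credentials are all placeholder assumptions:

```java
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.CopyObjectRequest;
import software.amazon.awssdk.services.s3.model.ObjectCannedACL;

public class CrossAccountCopy {
    public static void main(String[] args) {
        // Placeholder credentials that have GetObject on the source bucket
        // and PutObject on the destination bucket (per the options above)
        S3Client s3 = S3Client.builder()
                .region(Region.US_EAST_1)
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("ACCESS_KEY_ID", "SECRET_ACCESS_KEY")))
                .build();

        // Copy the object and hand ownership to the destination bucket's account
        CopyObjectRequest copyReq = CopyObjectRequest.builder()
                .sourceBucket("source-bucket")
                .sourceKey("path/to/object")
                .destinationBucket("destination-bucket")
                .destinationKey("path/to/object")
                .acl(ObjectCannedACL.BUCKET_OWNER_FULL_CONTROL)
                .build();

        s3.copyObject(copyReq);
    }
}
```

Note that copyObject() runs server-side within S3, so this works as long as one set of credentials can both read the source and write the destination; if no single principal can do both, you would have to download and re-upload the object instead.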