How to copy a file from "folder" A to folder B in the same bucket using a Lambda



I know that S3 doesn't have folders, but I have two prefixes: abcabc/Pdf and abcabc/BackUpPdf. When the client uploads a file under abcabc/Pdf, a Lambda is triggered to copy that file to abcabc/BackUpPdf. I'm using boto3.

Here is part of my Lambda code; the source file path is stored in key:

import urllib.parse

import boto3

client = boto3.resource('s3')


def lambda_handler(event, context):
    print(event)
    print('Inside the function')

    # Get the bucket and object key from the S3 event notification
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')

    print('Bucket: ' + bucket + ', Key: ' + key)

    # Copy the uploaded object to the backup prefix
    client.Object(bucket, 'abcabc/BackUpPdf/TestFile.pdf').copy_from(CopySource=key, ACL='public-read')

The trouble is, I'm getting a ClientError: An error occurred (AccessDenied) when calling the CopyObject operation: Access Denied. I thought it was because of the IAM role, so I attached S3FullAccess to the Lambda role, but I still had the same problem. Searching different forums, I found that you can add a bucket policy like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectTagging",
                "s3:PutObject",
                "s3:PutObjectTagging"
            ],
            "Resource": "arn:aws:s3:::bucket-name/*"
        },
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::bucket-name"
        }
    ]
}

In fact, there was a mistake. It worked once I changed the instruction like this:

client.Object(bucket, 'abcabc/BackUpPdf/TestFile.pdf').copy_from(CopySource=bucket + '/' + key)

Additionally, Block all public access is enabled on the bucket.
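
For reference, a minimal sketch of the full handler after that fix; deriving the backup key from the source key instead of hard-coding the destination file name is an assumption for illustration:

import urllib.parse

import boto3

client = boto3.resource('s3')


def lambda_handler(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')

    # Illustrative assumption: mirror abcabc/Pdf/... into abcabc/BackUpPdf/...
    backup_key = key.replace('abcabc/Pdf/', 'abcabc/BackUpPdf/', 1)

    # CopySource must be '{bucket}/{key}', not just '{key}'
    client.Object(bucket, backup_key).copy_from(CopySource=bucket + '/' + key)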

CodePudding user response:

I think you may be passing the CopySource parameter to copy_from incorrectly. You can pass a string, which should be of the form:

CopySource='{bucket}/{key}'

Or you can pass a dict such as:

CopySource={'Bucket': 'bucket', 'Key': 'key'}

You seem to be passing the string {key} rather than {bucket}/{key}. I'd recommend the dict option over the string option.
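
Applied to the code in the question, a minimal sketch of the dict form (assuming client is the boto3 S3 resource and bucket/key come from the event, as in the handler above):

copy_source = {'Bucket': bucket, 'Key': key}
client.Object(bucket, 'abcabc/BackUpPdf/TestFile.pdf').copy_from(CopySource=copy_source)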

Another possible reason is that you have S3 Block Public Access enabled. That would prevent an object from being stored with a public-read ACL. Note: Block Public Access can be enabled at account level or at bucket level.
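
To check whether Block Public Access is configured at the bucket level, a sketch (assuming the caller has the s3:GetBucketPublicAccessBlock permission; 'bucket-name' is a placeholder):

import boto3

s3_client = boto3.client('s3')

# Raises a ClientError if no Block Public Access configuration exists
response = s3_client.get_public_access_block(Bucket='bucket-name')
print(response['PublicAccessBlockConfiguration'])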
