I have a CDK script that creates an S3 bucket and a Lambda function, then adds an S3 trigger to the Lambda:
```typescript
const up_bk = new s3.Bucket(this, 'cdk-st-in-bk', { // bucket for image resizing
  bucketName: `cdk-st-${targetEnv}-resource-in-bk`,
  removalPolicy: RemovalPolicy.DESTROY,
  autoDeleteObjects: true,
  cors: [{
    allowedMethods: [
      s3.HttpMethods.GET,
      s3.HttpMethods.POST,
      s3.HttpMethods.PUT,
      s3.HttpMethods.DELETE,
      s3.HttpMethods.HEAD,
    ],
    allowedHeaders: ["*"],
    allowedOrigins: ["*"],
    exposedHeaders: ["ETag"],
    maxAge: 3000,
  }],
});

const resizerLambda = new lambda.DockerImageFunction(this, "ResizerLambda", {
  code: lambda.DockerImageCode.fromImageAsset("resizer-sam/resizer"),
});

resizerLambda.addEventSource(new S3EventSource(up_bk, {
  events: [ s3.EventType.OBJECT_CREATED ],
}));
```
Now, CDK automatically creates a role for the function, `st-dev-base-stack-ResizerLambdaServiceRoleAE27CE82-1LWJL0D35A0GW`, but it only has the `AWSLambdaBasicExecutionRole` managed policy. So when I try to access the bucket from inside the Lambda, I get an error. For example:

```python
obj = s3_client.get_object(Bucket=bucket_name, Key=obj_key)
# "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied"
```

I guess I should attach `AmazonS3FullAccess` to this role. But how can I do this?
CodePudding user response:
You need to give the Lambda function permission to read from the bucket:

```typescript
up_bk.grantRead(resizerLambda);
```
If you also need it to write to the bucket, use:

```typescript
up_bk.grantReadWrite(resizerLambda);
```
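The `grant*` methods are the idiomatic CDK way because they add only the permissions the function needs. If you specifically want to attach the `AmazonS3FullAccess` managed policy you mentioned, you can add it to the auto-generated execution role instead. A minimal sketch, assuming the standard `aws-cdk-lib` import and the `resizerLambda` construct from your stack:

```typescript
import * as iam from 'aws-cdk-lib/aws-iam';

// Attach the AWS-managed AmazonS3FullAccess policy to the function's
// auto-generated execution role. `role` is typed as optional on the
// function construct, hence the `?.`.
resizerLambda.role?.addManagedPolicy(
  iam.ManagedPolicy.fromAwsManagedPolicyName('AmazonS3FullAccess'),
);
```

Note this grants access to every bucket in the account, so prefer `grantRead`/`grantReadWrite` scoped to `up_bk` unless you really need the broader policy.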