How to grant access on s3 and DynamoDB to Fargate JobDefinition?


I'm developing a CDK stack that configures an AWS Batch Fargate job which needs to access S3 and DynamoDB. I granted the access to the executionRole, but when I run the job it fails to access S3 with the following error message:

Unable to get IAM security credentials from EC2 Instance Metadata Service

This is the code:

import { ComputeEnvironment, ComputeResourceType, JobDefinition, JobQueue, PlatformCapabilities } from "@aws-cdk/aws-batch-alpha";
import { Duration, Stack, StackProps } from "aws-cdk-lib";
import { ITable } from "aws-cdk-lib/aws-dynamodb";
import { IVpc } from "aws-cdk-lib/aws-ec2";
import { ContainerImage } from "aws-cdk-lib/aws-ecs";
import { ManagedPolicy, PolicyDocument, PolicyStatement, Role, ServicePrincipal } from "aws-cdk-lib/aws-iam";
import { IBucket } from "aws-cdk-lib/aws-s3";
import { Construct } from "constructs";
import { stage } from "..";

export interface CapsJobStackProps extends StackProps {
    mainBucket: IBucket;
    capsTable: ITable;
    vpc: IVpc
}

export class CapsJobStack extends Stack {
    constructor(scope: Construct, id: string, private props: CapsJobStackProps) {
      super(scope, id, props);

      const computeEnvironment = new ComputeEnvironment(this, 'CapsComputeEnvironment', {
        computeResources: {
          type: ComputeResourceType.FARGATE,
          vpc: props.vpc,
          maxvCpus: 2
        },
      });

      const jobQueue = new JobQueue(this, 'CapsJobQueue', {
        computeEnvironments: [{computeEnvironment, order: 1}]
      });

      const jobRole = new Role(this, 'CapsJobRole', {
        assumedBy: new ServicePrincipal("ecs-tasks.amazonaws.com"),
        inlinePolicies: {
          EcsTask: new PolicyDocument({
            statements: [
              new PolicyStatement({
                actions: ["ecr:GetAuthorizationToken", "ecr:BatchCheckLayerAvailability", "ecr:GetDownloadUrlForLayer", "ecr:BatchGetImage", "logs:CreateLogStream", "logs:PutLogEvents" ],
                resources: ["*"]
              })
            ]
          })
        }
      });
      props.capsTable.grantReadWriteData(jobRole);
      props.mainBucket.grantReadWrite(jobRole);

      const jobDefinition = new JobDefinition(this, 'CapsJob', {
        container: {
          image: ContainerImage.fromAsset('../src/Caps.Uploader', {
            file: 'DockerfileCdk'
          }),
          environment: {
            "OD_ENVIRONMENT": stage,
            "OD_BUCKET": props.mainBucket.bucketName,
            "OD_CAPSTABLE": props.capsTable.tableName
          },
          executionRole: jobRole,
          vcpus: 0.25,
          memoryLimitMiB: 512
        },
        timeout: Duration.days(1),
        retryAttempts: 3,
        platformCapabilities: [PlatformCapabilities.FARGATE]
      });
    }
}

What am I missing? How do I grant the job access to S3 and DynamoDB?

CodePudding user response:

The executionRole is the role ECS itself uses, e.g. to pull the image from ECR and write logs. Permissions your container code needs at runtime (S3, DynamoDB) must go on the task role, which in the CDK JobDefinition container props is the jobRole property.
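A minimal sketch of how the container props could be split into a dedicated execution role and a task role, assuming the same construct names, bucket, and table as in the question; the CapsExecutionRole construct and its managed policy are illustrative, and the jobRole prop assumes the @aws-cdk/aws-batch-alpha version used in the question:

```typescript
// Execution role: used by AWS Batch/ECS to pull the container image and push logs.
const executionRole = new Role(this, 'CapsExecutionRole', {
  assumedBy: new ServicePrincipal('ecs-tasks.amazonaws.com'),
  managedPolicies: [
    ManagedPolicy.fromAwsManagedPolicyName('service-role/AmazonECSTaskExecutionRolePolicy'),
  ],
});

// Task role: assumed by the running container; grant S3 and DynamoDB access here.
props.capsTable.grantReadWriteData(jobRole);
props.mainBucket.grantReadWrite(jobRole);

const jobDefinition = new JobDefinition(this, 'CapsJob', {
  container: {
    image: ContainerImage.fromAsset('../src/Caps.Uploader', { file: 'DockerfileCdk' }),
    environment: {
      OD_ENVIRONMENT: stage,
      OD_BUCKET: props.mainBucket.bucketName,
      OD_CAPSTABLE: props.capsTable.tableName,
    },
    executionRole, // image pull / log delivery
    jobRole,       // runtime permissions for S3 and DynamoDB
    vcpus: 0.25,
    memoryLimitMiB: 512,
  },
  timeout: Duration.days(1),
  retryAttempts: 3,
  platformCapabilities: [PlatformCapabilities.FARGATE],
});
```

With this split, the custom inline ECR/logs policy on jobRole is no longer needed, since the execution role covers image pulls and log delivery, and jobRole only carries what the application code actually uses.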
