Access Denied when trying to PutObject to S3


I'm using the Serverless Framework to create a lambda that saves a CSV to an S3 bucket.

I already have a similar lambda that does this with another bucket.

This is where it gets weird: I can upload the CSV to the first S3 bucket I created (many months back), but I get an AccessDenied error when uploading the same CSV to the new S3 bucket, which was, as far as I can tell, created in exactly the same way via the serverless.yml config.

The error is:

Error: AccessDenied: Access Denied

These are the relevant bits from the serverless.yml:

provider:
  name: aws
  runtime: nodejs12.x
  stage: ${opt:stage, 'dev'}
  region: eu-west-1
  environment:
    BUCKET_NEW: ${self:custom.bucketNew}
    BUCKET: ${self:custom.bucket}

  iam:
    role:
      statements:
        - Effect: 'Allow'
          Action: 'lambda:InvokeFunction'
          Resource: '*'
        - Effect: 'Allow'
          Action:
            - 's3:GetObject'
            - 's3:PutObject'
          Resource:
            - 'arn:aws:s3:::*' # Added this whilst debugging
            - 'arn:aws:s3:::*/*' # Added this whilst debugging
            - 'arn:aws:s3:::${self:custom.bucket}'
            - 'arn:aws:s3:::${self:custom.bucket}/*'
            - 'arn:aws:s3:::${self:custom.bucketNew}'
            - 'arn:aws:s3:::${self:custom.bucketNew}/*'

functions:
  uploadReport:
    handler: services/uploadReport.handler
    vpc:
      securityGroupIds:
        - 000001
      subnetIds:
        - subnet-00000A
        - subnet-00000B
        - subnet-00000C

resources:
  Resources:
    Bucket:
      Type: 'AWS::S3::Bucket'
      Properties:
        BucketName: ${self:custom.bucket}
    BucketNew:
      Type: 'AWS::S3::Bucket'
      Properties:
        BucketName: ${self:custom.bucketNew}

custom:
  stage: ${opt:stage, 'dev'}
  bucket: ${self:service}-${self:custom.stage}-report
  bucketNew: ${self:service}-${self:custom.stage}-report-new

Lambda code (simplified):

const fs = require('fs')
const AWS = require('aws-sdk')

const S3 = new AWS.S3({
  httpOptions: {
    connectTimeout: 1000,
  },
})


// Wrap the callback-based putObject in a promise so the handler can await it.
const uploadToS3 = (params) => new Promise((resolve, reject) => {
  S3.putObject(params, err => (err ? reject(err) : resolve(params.Key)))
})

module.exports.handler = async () => {
  const fileName = `report-new.csv`
  const filePath = `/tmp/${fileName}`

  // Some code that creates a CSV file at filePath.

  const bucketParams = {
    Bucket: process.env.BUCKET_NEW, // Works for process.env.BUCKET, but not process.env.BUCKET_NEW.
    Key: fileName,
    Body: fs.readFileSync(filePath, 'utf-8'),
  }

  try {
    return await uploadToS3(bucketParams)
  } catch (e) {
    throw new Error(e) // Throws Error: AccessDenied: Access Denied.
  }
}

CodePudding user response:

I think you are likely missing some permissions. I often use "s3:Put*" in my serverless applications, which may not be advisable since it is so broad.

Here is a minimal list of permissions required to upload an object, which I found in the question "What minimum permissions should I set to give S3 file upload access?":

"s3:PutObject",
"s3:PutObjectAcl",
"s3:GetObjectAcl",
"s3:ListBucket",
"s3:GetBucketLocation"

CodePudding user response:

Found the solution, but it was my own mistake: my lambda was actually inside a VPC, which my original question (before the edit) did not show.

A lambda in a VPC can't talk to S3 unless the VPC has a gateway VPC endpoint for S3 whose policy allows access to the specific buckets it references.

I had previously created a gateway endpoint that allowed access to the initial bucket I created a while back, but forgot to update the endpoint's policy to also allow the new bucket.
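
For anyone hitting the same thing: the gateway endpoint can be declared next to the buckets in the resources section. A minimal sketch, with placeholder VPC and route table IDs, assuming the endpoint policy is what restricts which buckets are reachable:

resources:
  Resources:
    S3GatewayEndpoint:
      Type: 'AWS::EC2::VPCEndpoint'
      Properties:
        VpcEndpointType: Gateway
        ServiceName: com.amazonaws.eu-west-1.s3
        VpcId: vpc-00000000 # Placeholder: the VPC the lambda runs in.
        RouteTableIds:
          - rtb-00000000 # Placeholder: route table(s) of the lambda's subnets.
        PolicyDocument:
          Statement:
            - Effect: Allow
              Principal: '*'
              Action: 's3:*'
              Resource:
                # Both the old and the new bucket must be listed here,
                # otherwise requests to the new one fail with AccessDenied.
                - 'arn:aws:s3:::${self:custom.bucket}'
                - 'arn:aws:s3:::${self:custom.bucket}/*'
                - 'arn:aws:s3:::${self:custom.bucketNew}'
                - 'arn:aws:s3:::${self:custom.bucketNew}/*'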

Leaving this answer here in case anyone else spends an entire day trying to fix something this silly.
