AWS Lambda: upload code from amazon S3 location


I have configured an AWS pipeline to deploy my latest Node.js code as a zip file to S3. Instead of going to AWS Lambda and choosing the "upload from Amazon S3 location" option, is there a way to make my Lambda function pick up the latest code once it is written to S3, so that the function code is updated automatically whenever the zip file changes?

CodePudding user response:

Using Lambda triggers, you can create a trigger that is fired whenever a new object is written to the S3 location. This tutorial shows how to set this up.

Then, using the AWS SDK, you can update your function's code programmatically.

Take a look at the aptly named Update Function Code documentation.

The Python SDK documentation is here; this is the example it gives for the call:

import boto3

client = boto3.client('lambda')

# ZipFile, S3Bucket/S3Key (optionally with S3ObjectVersion) and ImageUri are
# alternative code sources; for your case you would pass only the S3 fields.
response = client.update_function_code(
    FunctionName='string',
    ZipFile=b'bytes',
    S3Bucket='string',
    S3Key='string',
    S3ObjectVersion='string',
    ImageUri='string',
    Publish=True|False,
    DryRun=True|False,
    RevisionId='string',
    Architectures=[
        'x86_64'|'arm64',
    ]
)
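Putting the two together, here is a minimal sketch (not from the documentation) of a small "deployer" Lambda that the S3 trigger could invoke. The target function name is a placeholder, and the bucket and key are read from the S3 event:

import boto3

lambda_client = boto3.client('lambda')

# Placeholder: the name of the Lambda function whose code should be replaced.
TARGET_FUNCTION = 'my-function'

def handler(event, context):
    # The S3 trigger passes one record per object that was written.
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        response = lambda_client.update_function_code(
            FunctionName=TARGET_FUNCTION,
            S3Bucket=bucket,
            S3Key=key,
            Publish=True,  # publish a new version with the updated code
        )
        print('Updated', TARGET_FUNCTION, 'to', response['CodeSha256'])

Note that the deployer function's execution role needs the lambda:UpdateFunctionCode permission on the target function.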

I found this after posting the above: there appears to be an automated solution within CodePipeline itself.

https://docs.aws.amazon.com/lambda/latest/dg/services-codepipeline.html

I imagine this would be a cleaner solution to your problem.

CodePudding user response:

I used this reference to configure my AWS CodeBuild YAML at deploy time, and the latest source code uploaded as a zip file to S3 was picked up by AWS Lambda.

https://docs.aws.amazon.com/cli/latest/reference/lambda/update-function-code.html

The sample command used in my YAML is given below:

- aws lambda update-function-code --function-name my-function --s3-bucket my-s3-bucket --s3-key sourceFileName.zip
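For reference, this is roughly how such a command can sit in a buildspec.yml; the zip name, bucket, and function name below are placeholders, not values from my project:

version: 0.2

phases:
  build:
    commands:
      # package the function source; the zip name is a placeholder
      - zip -r sourceFileName.zip .
  post_build:
    commands:
      # upload the package and point the Lambda function at the new object
      - aws s3 cp sourceFileName.zip s3://my-s3-bucket/sourceFileName.zip
      - aws lambda update-function-code --function-name my-function --s3-bucket my-s3-bucket --s3-key sourceFileName.zip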