How to share modules across multiple AWS Lambda and Elastic Beanstalk applications?


I have several Lambda functions and Elastic Beanstalk applications in a project, and they all use the same helper functions and constants.

I am trying to follow the DRY principle and avoid hard-coding these helpers and constants into each Lambda/EB application; instead, I want each application to import the modules that contain them.

I was ideally hoping to

  • put all these modules in a separate GitHub repo
  • create a CodePipeline that deploys them to an S3 bucket
  • import them into EB/Lambdas wherever needed

I have the first 2 steps done, but can't figure out how to import the modules from S3.

Does anyone have any suggestions on how to do this?

CodePudding user response:

There are a few ways I would consider:

  1. Turn the common code into a PyPI package. This probably requires the most effort, but it makes sharing very simple (just add the package to the requirements.txt of both services).
  2. Do mostly the same, but keep the package in a GitHub repo (no need to publish it to PyPI), then add git+https://github.com/path/to/package-two@41b95ec#egg=package-two to the requirements.txt (see the sketch after this list).
  3. If you are using the Serverless Framework, you can use the serverless-package-external plugin to copy the shared code into the Lambda deployment package.
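For reference, a requirements.txt along these lines would cover options 1 and 2; the package name is a hypothetical example and the GitHub path and commit hash are the placeholders from item 2 above:

    # Option 1: shared code published as a PyPI package (hypothetical name)
    my-shared-helpers==1.0.0

    # Option 2: shared code installed straight from GitHub at a pinned commit
    git+https://github.com/path/to/package-two@41b95ec#egg=package-two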

CodePudding user response:

The best way to track changes in code is a repository, but if you need to use S3 as the source, consider enabling versioning on the S3 bucket and defining an S3 event source to trigger your pipeline.
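As a minimal sketch, versioning can be enabled on the bucket with boto3 (the bucket name below is a placeholder):

    import boto3

    s3 = boto3.client("s3")

    # Keep a version history of every module artifact the pipeline uploads
    s3.put_bucket_versioning(
        Bucket="my-shared-modules-bucket",  # placeholder bucket name
        VersioningConfiguration={"Status": "Enabled"},
    )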

For consuming those dependencies, I think it's best to use a Lambda layer for the Lambda functions, or shared EFS volumes on the Beanstalk instances if the dependencies are large; a sketch of the layer approach follows.
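A rough sketch of the layer approach, assuming the shared code is packaged as a module named shared_helpers (a hypothetical name): the layer zip places the code under a top-level python/ directory, and the function imports it like any other installed package.

    # Layer zip layout (Python runtimes unpack layers under /opt, and
    # /opt/python is on sys.path):
    #   python/
    #     shared_helpers/
    #       __init__.py
    #       constants.py

    # Lambda function code, with the layer attached to the function
    from shared_helpers import constants  # hypothetical shared module

    def handler(event, context):
        # Use the shared constants/helpers just as in the EB application
        return {"region": constants.DEFAULT_REGION}  # hypothetical constant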
