We can do it at the entire-project level like this:

```yaml
custom:
  pkgfunc:
    buildDir: _build
    requirementsFile: "requirements.txt"
```
but I needed to do something like this:

```yaml
functions:
  my_func:
    name: my_func
    handler: package.handler
    requirementsFile: "my_func_requirements.txt"
```
I wanted to check whether this is possible and, if so, how good a solution it is. My primary objective is to reduce the Lambda package size as much as possible: any Lambda that does not need a library should not have it packaged.
To reduce the stack size, I've separated some of the functions that require huge libraries into their own Lambdas, and I call them internally via API/Boto calls.
Help is much appreciated.
CodePudding user response:
```python
import numpy as np
import psycopg2  # note: imported but not used in this example

def main(event, context):
    a = np.arange(15).reshape(3, 5)
    print("Your numpy array:")
    print(a)

if __name__ == "__main__":
    main('', '')
```
CodePudding user response:
Yes, you can define a separate requirements file for each Lambda function in the serverless.yml of an AWS Python project. Under the functions section, specify the path to each function's requirements file using the requirements property.
For example, if you have two Lambda functions named function1 and function2, with requirements files named requirements1.txt and requirements2.txt respectively, you could configure them as follows:
```yaml
functions:
  function1:
    requirements: requirements1.txt
  function2:
    requirements: requirements2.txt
```
When you deploy your serverless application, the Serverless Framework will automatically install the specified requirements for each Lambda function. This lets you manage each function's dependencies separately and ensures that each function has the packages and libraries it needs to run correctly.
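For comparison, here is a rough sketch of how per-function dependencies are commonly handled with the serverless-python-requirements plugin: enable individual packaging and give each function its own module directory containing its own requirements.txt. Treat the exact keys (individually, module) as an assumption to verify against the plugin version you use; the service name, runtime, and directory names below are placeholders.

```yaml
# serverless.yml -- sketch of per-function packaging with serverless-python-requirements
# (key names follow that plugin's individual-packaging pattern; verify for your version)
service: my-service           # placeholder

provider:
  name: aws
  runtime: python3.9          # placeholder runtime

plugins:
  - serverless-python-requirements

package:
  individually: true          # build a separate artifact per function

functions:
  my_func:
    handler: my_func/handler.main
    module: my_func           # directory holding my_func/requirements.txt
  heavy_func:
    handler: heavy_func/handler.main
    module: heavy_func        # directory holding heavy_func/requirements.txt (e.g. numpy, psycopg2)
```

With a layout like this, a function that never imports numpy or psycopg2 simply omits them from its own requirements.txt, which keeps its deployment package small.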