I'm trying to create an API for machine learning. For that, I'm using AWS, GitHub, and a Python project. I created a GitHub Action that saves, in a zip, all the site-packages (in the venv) plus the ML models and the scripts that run the API. The Action also uploads this zip to an S3 bucket and deploys a Lambda function that imports the zip previously placed in S3.
My problem is that the zip file is too large for Lambda:
Unzipped size must be smaller than 262144000 bytes
How can I fix this? Is there a better alternative?
PS: I need to keep GitHub connected to the API.
CodePudding user response:
Your deployment package is too big. Double-check whether you really need all the software/libraries you've zipped.
From the docs:
There is a hard limit of 50 MB for the compressed deployment package with AWS Lambda, and a hard limit of 250 MB once uncompressed.
Note that you also have a region-wide soft limit of 75 GB for the total size of all deployed AWS Lambda functions.
Otherwise, you have the following options (a minimal sketch of the first two follows the list):
- Container images
- Layers
- Using other AWS services, e.g. ECS
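A minimal boto3 sketch of the first two options, which you could call from the GitHub Action. The function name `ml-api`, layer name `ml-dependencies`, and the ECR image URI are placeholders, and the container-image path assumes the function was created with the `Image` package type:

```python
import boto3

lambda_client = boto3.client("lambda")

# Option A: point the function at a container image pushed to ECR.
# Requires the function to have been created with PackageType="Image";
# images can be up to 10 GB, which comfortably fits ML dependencies.
lambda_client.update_function_code(
    FunctionName="ml-api",  # placeholder function name
    ImageUri="123456789012.dkr.ecr.us-east-1.amazonaws.com/ml-api:latest",  # placeholder ECR URI
)

# Option B: ship the dependencies as a layer. The zip must unpack to a
# top-level python/ directory for the runtime to find the packages.
with open("dependencies.zip", "rb") as f:
    layer = lambda_client.publish_layer_version(
        LayerName="ml-dependencies",  # placeholder layer name
        Content={"ZipFile": f.read()},
        CompatibleRuntimes=["python3.9"],
    )

lambda_client.update_function_configuration(
    FunctionName="ml-api",
    Layers=[layer["LayerVersionArn"]],
)
```

Keep in mind that the unzipped size of the function code plus all attached layers still counts against the 250 MB limit, so layers mainly help you organize and reuse dependencies; container images are the option that actually raises the ceiling.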
A hacky way:
If you really need to ship all the dependencies along with your AWS Lambda, you can put the large binaries or dependency packages into an S3 bucket and have the function fetch them at runtime, as sketched below.
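A minimal sketch of that pattern, assuming a pickled model at a placeholder bucket/key. The file is downloaded once into `/tmp` (the only writable path in Lambda) and cached across warm invocations:

```python
import os
import pickle

import boto3

BUCKET = "my-ml-artifacts"       # placeholder bucket name
MODEL_KEY = "models/model.pkl"   # placeholder object key
LOCAL_PATH = "/tmp/model.pkl"    # /tmp is the only writable path in Lambda

s3 = boto3.client("s3")
_model = None  # cached for warm invocations of the same container


def _load_model():
    # Download only on cold start; warm containers reuse the file in /tmp
    if not os.path.exists(LOCAL_PATH):
        s3.download_file(BUCKET, MODEL_KEY, LOCAL_PATH)
    with open(LOCAL_PATH, "rb") as f:
        return pickle.load(f)


def handler(event, context):
    global _model
    if _model is None:
        _model = _load_model()
    # ... run inference with _model here ...
    return {"statusCode": 200}
```

Note that `/tmp` gives you 512 MB by default, so this fits artifacts the deployment package can't, but it adds S3 download latency to every cold start.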