How to run Python code stored in S3 from a Lambda function


I have multiple Python scripts (test1.py, test2.py, test3.py) that I have already uploaded to an S3 bucket (test_s3_bucket) inside the scripts directory.

scripts/test1.py

def abc():
  return "function abc"

scripts/test2.py

from test1 import abc
def xyz():
  t1 = abc()
  return "function xyz > {}".format(t1)

Now I need to run that Python code (stored in the S3 bucket) from a Lambda function.

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
  test2_obj = s3.download_file("test_s3_bucket", "scripts/test2.py", "/tmp/test2.py")
  return test2_obj.xyz()

But I am facing an issue when I run the Lambda function:

{
  "errorMessage": "'NoneType' object has no attribute 'xyz'",
  "errorType": "AttributeError",
  "requestId": "exxxxxxx-bxxe-4xxd-bxx1-xxxxxxxxxxxx",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 4, in lambda_handler\n    test2_obj.xyz()\n"
  ]
}

If this is not the correct approach, could you please suggest which approach would be better?

Thanks in advance.

CodePudding user response:

The Lambda developer guide shows you how to package your code as a zip file.

https://docs.aws.amazon.com/lambda/latest/dg/python-package.html

Individual files aren't loaded from S3.
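For context, boto3's download_file only writes the object to /tmp and returns None, which is why calling .xyz() on its return value raises the AttributeError shown above. A minimal sketch of the packaged approach, assuming test1.py and test2.py from the question are zipped next to the handler file:

lambda_function.py (packaged in the same .zip as test1.py and test2.py)

# The import resolves from the deployment package itself,
# so no S3 download is needed at runtime.
from test2 import xyz

def lambda_handler(event, context):
  return xyz()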

CodePudding user response:

I got the answer; we can achieve it in two different ways.

1st, using "Upload from" > Amazon S3 location or .zip file, where we put all the dependencies together with the Lambda code in a single deployment package, so the function can import everything it needs.

2nd, using layers, which hold the shared dependencies separately from the function code.

In my scenario I will create a different Lambda function for each operation, and those dependency modules will be used by all of the Lambda functions, so in that case layers are the better choice (see the sketch below).
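A rough sketch of the layer approach, assuming test1.py and test2.py are zipped into a layer under a python/ directory and that layer is attached to each function; every handler can then import the shared modules directly:

lambda_function.py (in any function with the shared layer attached)

# The layer zip is assumed to contain python/test1.py and python/test2.py;
# Lambda extracts layers to /opt and adds /opt/python to sys.path,
# so this import resolves at runtime.
from test2 import xyz

def lambda_handler(event, context):
  return xyz()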
