I know there are several posts about this, but I couldn't quite understand them.
I have the following project structure:
```
engine
├── alpha-api
│   ├── app.py
│   └── __init__.py
├── api_utils
│   ├── utils.py
│   └── __init__.py
└── beta_api
```
and I have the following Dockerfile:

```dockerfile
FROM public.ecr.aws/lambda/python:3.8
COPY . ${LAMBDA_TASK_ROOT}
WORKDIR ${LAMBDA_TASK_ROOT}
RUN pip install -r ./alpha-api/requirements.txt --target ./alpha-api
```
The Dockerfile is under the alpha-api directory. That cannot be changed, since I have two different Lambdas in the same repo.
When I run docker build with the engine directory as the context, the project structure is preserved inside the image:

```shell
docker build -f ./alpha-api/Dockerfile -t alpha-api .
```
However, when I run a container and call `./alpha-api/app.py`, I get the following error:

```
Traceback (most recent call last):
  File "app.py", line 9, in <module>
    from api_utils.utils import DateStringFormat
ModuleNotFoundError: No module named 'api_utils'
```

Do you know why it doesn't recognize `api_utils`?
CodePudding user response:
In order for Python to recognize a folder as a package, it must contain an `__init__.py` file. Try adding an empty `__init__.py` to your `api_utils` folder.
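A minimal, self-contained sketch of what this means (the names and files here are hypothetical, built in a temp directory; note the package's parent directory must also be on `sys.path` for the import to resolve):

```python
import os
import sys
import tempfile

# Hypothetical layout mirroring the question: a temporary project root
# containing an api_utils package with a utils module inside it.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "api_utils")
os.makedirs(pkg)

# The (empty) __init__.py marks the directory as a regular package.
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "utils.py"), "w") as f:
    f.write("GREETING = 'hello'\n")

# With the package's parent directory on sys.path, the import resolves.
sys.path.insert(0, root)
from api_utils.utils import GREETING
print(GREETING)  # prints "hello"
```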
CodePudding user response:
When you run `python ./alpha-api/app.py`, Python only has the files inside `./alpha-api` on its module search path, so the sibling `api_utils` package is invisible to it. Setting your `PYTHONPATH` to `./` (or `${LAMBDA_TASK_ROOT}`) prior to running the file could help.
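In the Dockerfile from the question, that could be a one-line addition (a sketch, assuming the Lambda base image passes `PYTHONPATH` through to the interpreter in the usual way):

```dockerfile
# Put the task root (the copied engine/ directory) on the module search
# path, so `import api_utils` resolves from inside ./alpha-api/app.py.
ENV PYTHONPATH="${LAMBDA_TASK_ROOT}"
```

Equivalently, outside the container you could run `PYTHONPATH=. python ./alpha-api/app.py` from the engine directory.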