I have multiple Python scripts that I want to run in a Docker container. From a related question, How to run multiple Python scripts and an executable files using Docker? , I found that the recommended approach is to have a shell file, run.sh, as follows:
#!/bin/bash
python3 producer.py &
python3 consumer.py &
python3 test_conn.py
and then call this file from a Dockerfile as:
FROM python:3.9
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY requirements.txt /usr/src/app
RUN pip install --no-cache-dir -r requirements.txt
COPY . /usr/src/app
CMD ["./run.sh"]
However, the container logs show the following error: exec ./run.sh: no such file or directory
This makes no sense to me, since I copied everything in the current directory, run.sh included, to /usr/src/app in my container via COPY . /usr/src/app
To reproduce, please clone my repo, run docker-compose up -d from the root directory, and check the myapp container logs.
https://github.com/Quilograma/IES_Project
Thank you!
Can't run multiple Python scripts in a single container.
CodePudding user response:
You should explicitly specify which shell interpreter is used to run your script.
Changing the last line to CMD ["bash", "-c", "./run.sh"]
might solve your issue. Note that an exec ... no such file or directory error on a file that clearly exists usually means the shebang line cannot be resolved, most often because the script was saved with Windows CRLF line endings or is missing the executable bit.
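As a sketch of that diagnosis (the first line below only reproduces the suspected CRLF problem on a throwaway copy of run.sh; your real file may differ), stripping the carriage returns and restoring the executable bit on the host, then rebuilding the image, is usually enough:

```shell
# Reproduce the failure mode: a script saved with Windows CRLF endings.
printf '#!/bin/bash\r\necho ok\r\n' > run.sh

sed -i 's/\r$//' run.sh   # strip the carriage returns (CRLF -> LF)
chmod +x run.sh           # make sure the executable bit is set
./run.sh                  # now execs cleanly and prints "ok"
```

After fixing the file on the host, rebuild so the corrected copy lands in the image.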
CodePudding user response:
If you need to run three separate long-running processes, do not try to orchestrate them from a shell script; launch three separate containers instead. If you're running this via Compose, that's straightforward: define three services that all build the same image, but override command: so each runs a different main process.
version: '3.8'
services:
  producer:
    build: .
    command: ./producer.py
  consumer:
    build: .
    command: ./consumer.py
  test_conn:
    build: .
    command: ./test_conn.py
Make sure the scripts are executable (run chmod +x producer.py
on your host system, and commit that change to source control) and that each begins with a "shebang" line, #!/usr/bin/env python3,
as its very first line.