Suppose I wrote a `docker-compose.dev.yml` file to set up the development environment of a Flask project (a web application) using Docker. In `docker-compose.dev.yml` I have defined two services: one for the database and one that runs the Flask application in debug mode (which lets me make hot changes without having to recreate or restart the containers). This makes it very easy for everyone on the development team to use the same development environment.
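For reference, a minimal `docker-compose.dev.yml` along these lines might look roughly like this (the service names, database image, port and bind mount here are just illustrative, not my exact file):

```yaml
# docker-compose.dev.yml (illustrative sketch)
services:
  db:
    image: postgres:15              # assumed database engine
    environment:
      POSTGRES_PASSWORD: dev
  web:
    build: .
    command: flask run --host=0.0.0.0
    environment:
      FLASK_DEBUG: "1"              # enables the debugger and auto-reloader
    volumes:
      - .:/app                      # bind mount so code changes apply without rebuilding
    ports:
      - "5000:5000"
    depends_on:
      - db
```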
However, there is a problem: while developing an application it is obviously necessary to install libraries and to list them in the `requirements.txt` file (in the case of Python). For this I only see two alternatives when using a Docker development environment:

- Open a console in the container where the Flask application is running and use the `pip install ...` and `pip freeze > requirements.txt` commands.
- Manually write the dependencies into the `requirements.txt` file and rebuild the containers.
The first option is a bit laborious, while the second feels a bit "dirty". Is there a more suitable approach than these two alternatives?
Edit: I don't know if I'm asking something that doesn't make sense, but I'd appreciate it if someone could give me some guidance on what I'm trying to accomplish.
CodePudding user response:
The second option is what is generally used in Python environments. You just add new packages to `requirements.txt` and rebuild/restart the container, whose Dockerfile has a `pip install -r requirements.txt` line that does the installing.
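As a rough sketch of what that Dockerfile can look like (the base image, paths and run command here are assumptions, not taken from your setup):

```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Copy requirements.txt first so this layer is cached and the
# pip install step only re-runs when requirements.txt changes.
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the rest of the application code.
COPY . .

CMD ["flask", "run", "--host=0.0.0.0"]
```

With this layout, `docker compose up --build` re-runs the `pip install` layer only when `requirements.txt` has actually changed, so rebuilding after adding a dependency is fairly cheap.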
CodePudding user response:
If the goal is to have a consistent dev environment, the safest way I can think of is to build a base image with the updated dependencies and publish it to a private registry, so that you can refer to a specific tag like `app:v1.2`. The application's Dockerfile can then look like:
FROM app:v1.2
...
This means there is no need to install the dependencies during each developer's build, which results in a quicker and more consistent dev environment setup.
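As a sketch of that approach (the base image name, registry and tag below are made-up examples, not something from your project), the shared base image installs the pinned dependencies once:

```dockerfile
# Dockerfile.base - rebuilt and re-tagged whenever requirements.txt changes
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
```

It would be built and pushed with something like `docker build -f Dockerfile.base -t my-registry.example.com/app:v1.2 .` followed by `docker push my-registry.example.com/app:v1.2`. The application's Dockerfile then only starts from that tag and copies in the source code, so developers pull the prebuilt dependency layers instead of running `pip install` themselves.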