The Dockerfile contains:
RUN /bin/bash -c "python3 -m pip install --upgrade pip && python3 -m pip install conan"
Once built, Docker never runs it again and uses the cache instead. That's fine as long as the versions (pip's and Conan's) haven't changed.
What's the best practice to handle that case? I'd like docker/buildah to detect that the layer needs to be rebuilt when a new version is available. I purposely didn't pin any versions, so that I always get the latest ones.
I struggled to find the cause of a bug I had: Conan changed their SSL certificate in a new release, and I was stuck with a previous version that prevented me from installing packages.
CodePudding user response:
This situation is exactly why you should use a version lock file that lists out the specific versions of packages to use. Even if you think you usually want the latest version, this gives you (or the packaging tool) a place to record a specific set of versions that you know works.
Using Python's basic setuptools system, for example, you can declare in your setup.cfg file that your application needs that specific package:
[options]
install_requires =
    conan
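Depending on your pip and setuptools versions, you may also need either a pyproject.toml declaring the setuptools build backend or a minimal setup.py shim next to setup.cfg for the editable install below to work; as a hedged aside, the classic shim is just:
from setuptools import setup  # setup.py shim; all metadata lives in setup.cfg
setup()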
Now, in your local (non-Docker) development environment, you can install it:
rm -rf venv # clean up the old virtual environment
python3 -m venv venv # create a new virtual environment
. venv/bin/activate # activate it
pip install -e . # install current directory and its dependencies
or alternatively, if you already have the virtual environment set up:
pip install --upgrade -e .
Now you can ask pip to dump out the requirements file:
pip freeze > requirements.txt
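The generated file pins every package in the virtual environment to an exact version, something like the following (the version number here is purely illustrative; yours will reflect whatever pip actually installed):
conan==1.59.0
# ...plus pinned lines for Conan's transitive dependencies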
In your Dockerfile, COPY the new requirements.txt file in, and use that to install packages:
FROM python:3.9
# Upgrade pip, since it likes to complain about being out of date
RUN pip install --upgrade pip
# Install package dependencies
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
# Install the rest of the application
COPY . .
CMD ["./application.py"]
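Building works the same under Docker or buildah; for example (the image tag my-app is arbitrary):
docker build -t my-app .
# or, with buildah:
buildah bud -t my-app .
On a later rebuild where only application source changed, the pip install -r requirements.txt layer is reported as cached; when requirements.txt changes, that layer and everything after it is rebuilt.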
Docker's layer caching system means that, if the requirements.txt file has changed, Docker will re-run pip install, and if it hasn't, it will skip over it (until the next COPY line with a changed file). Meanwhile, it's under your control exactly which version to use (you can either put a version constraint in setup.cfg or manually edit requirements.txt to avoid a broken version), so a breaking upstream change won't mean you can't ship code. Finally, you're using the same packaging system in your development environment and in Docker, so it's easy to keep things consistent (I'd generally discourage pip installing individual packages in a Dockerfile).
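As a sketch of the first option (the version bound below is only an example of the idea, not a recommendation), you could constrain setup.cfg to stay on a known-good major version:
[options]
install_requires =
    conan>=1.50,<2
then re-run pip install -e . and pip freeze > requirements.txt to regenerate the lock file; the changed requirements.txt will trigger a rebuild of the pip install layer.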