How to update source code without rebuilding image each time?


Is there a way to avoid rebuilding my Docker image each time I make a change in my source code?

I think I have already optimized my Dockerfile enough to decrease the build time, but it is still a few commands and some waiting for what is sometimes just one added line of code. That is longer than a simple Ctrl+S and checking the result.

The commands I have to run for each little update to my code:

docker-compose down
docker-compose build
docker-compose up

Here's my Dockerfile:

FROM python:3-slim as development

ENV PYTHONUNBUFFERED=1

COPY ./requirements.txt /requirements.txt
COPY ./scripts /scripts

EXPOSE 80

RUN apt-get update && \
    apt-get install -y \
    bash \
    build-essential \
    gcc \
    libffi-dev \
    musl-dev \
    openssl \
    wget \
    postgresql \
    postgresql-client \
    libglib2.0-0 \
    libnss3 \
    libgconf-2-4 \
    libfontconfig1 \
    libpq-dev && \
    pip install -r /requirements.txt && \
    mkdir -p /vol/web/static && \
    chmod -R 755 /vol && \
    chmod -R +x /scripts

COPY ./files /files

WORKDIR /files

ENV PATH="/scripts:/py/bin:$PATH"

CMD ["run.sh"]

Here's my docker-compose.yml file:

version: '3.9'

x-database-variables: &database-variables
  POSTGRES_DB: ${POSTGRES_DB}
  POSTGRES_USER: ${POSTGRES_USER}
  POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
  ALLOWED_HOSTS: ${ALLOWED_HOSTS}

x-app-variables: &app-variables
  <<: *database-variables
  POSTGRES_HOST: ${POSTGRES_HOST}
  SPOTIPY_CLIENT_ID: ${SPOTIPY_CLIENT_ID}
  SPOTIPY_CLIENT_SECRET: ${SPOTIPY_CLIENT_SECRET}
  SECRET_KEY: ${SECRET_KEY}
  CLUSTER_HOST: ${CLUSTER_HOST}
  DEBUG: 0

services:
  website:
    build:
      context: .
    restart: always
    volumes:
      - static-data:/vol/web
    environment: *app-variables
    depends_on:
      - postgres

  postgres:
    image: postgres
    restart: always
    environment: *database-variables
    volumes:
      - db-data:/var/lib/postgresql/data

  proxy:
    build:
      context: ./proxy
    restart: always
    depends_on:
      - website
    ports:
      - 80:80
      - 443:443
    volumes:
      - static-data:/vol/static
      - ./files/templates:/var/www/html
      - ./proxy/default.conf:/etc/nginx/conf.d/default.conf
      - ./etc/letsencrypt:/etc/letsencrypt

volumes:
  static-data:
  db-data:

CodePudding user response:

Mount your script files directly in the container via docker-compose.yml:

volumes:
  - ./scripts:/scripts
  - ./files:/files

Keep in mind that the container-side paths must match what your Dockerfile expects, so prefix them accordingly if you use a WORKDIR (here, /files).
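
As a sketch of where those mounts would go in the question's docker-compose.yml (service name and paths taken from the question; adjust them to your layout):

services:
  website:
    build:
      context: .
    volumes:
      - static-data:/vol/web
      - ./scripts:/scripts   # host edits become visible inside the container immediately
      - ./files:/files       # matches the WORKDIR /files in the Dockerfile

With these bind mounts, saving a file on the host updates it inside the running container; you only restart (or hot-reload) the process instead of rebuilding the image.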

CodePudding user response:

Quick answer

Is there a way to avoid rebuilding my Docker image each time I make a change in my source code?

If your app needs a build step, you cannot skip it.

Also, in your case, you can start the requirements before the Python app, so on each source code modification you only need to build & run your Python app, not the entire stack: postgres, proxy, etc.
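
For example, one possible sequence using the service names from the question's docker-compose.yml:

docker-compose up -d postgres            # start the database once and leave it running
docker-compose up -d --build website     # after a code change, rebuild and restart only the app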

Docker purpose

Docker's main goal is to let developers package applications into containers that are easy to deploy anywhere, simplifying your infrastructure.

So, in this sense, Docker is not strictly for the development stage. During development, the programmer should use a specialized IDE (Eclipse, IntelliJ, Visual Studio, etc.) to create and update the source code.

These IDEs have features like hot reload (automatic application updates when the source code changes), auto-completion of variables and methods, etc. These features reduce development time.

Docker for source code changes by developer

It is not Docker's main goal, but if you don't have a specialized IDE or you are in a very limited development workspace, Docker can rescue you.

If you are a Java developer (for instance), you need to install Java on your machine, an IDE like Eclipse, configure Maven, etc. With Docker, you could create an image with all the required tech and then establish a kind of connection between your source code and the Docker container. This connection in Docker is called a volume.
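
A minimal sketch of such a volume, with a hypothetical image name and source folder:

docker run -it --rm -v $(pwd)/src:/usr/src/app my-dev-image   # host ./src is mirrored at /usr/src/app inside the container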

In the case of technologies that need a build process (Java & C#), there is a time penalty because the developer must perform a build on every source code change. This is not required when using a specialized IDE, as explained above.

In the case of technologies that do not require a build process, like PHP, where only the libraries/dependencies need to be installed, Docker works almost the same as the specialized IDE.

Docker for local development with hot-reload

In your case, your app is based on Python. Python doesn't require a build process, just the installation of libraries. So if you want to develop with Python using Docker instead of the classic way (install Python, execute python app.py, etc.), you should follow these steps:

  • Don't copy your source code to the container
  • Just pass the requirements.txt to the container
  • Execute the pip install inside the container
  • Run your app inside the container
  • Create a Docker volume: your source code -> an internal folder in the container

Here is an example with a Python framework (MkDocs) that supports hot reload:

FROM python:3
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY requirements.txt /usr/src/app
RUN pip install -r requirements.txt
CMD [ "mkdocs", "serve",  "--dev-addr=0.0.0.0:8000" ]

and how to build it as a dev version:

docker build -t myapp-dev .

and how to run it with a volume so your changes sync with the container:

docker run --name myapp-dev -it --rm -p 8000:8000 -v $(pwd):/usr/src/app myapp-dev

As a summary, this would be the flow to run your apps with Docker at the development stage (a compose-based sketch follows the list):

  • start the requirements before the app (database, APIs, etc.)
  • create a special Dockerfile for the development stage
  • build the Docker image for development purposes
  • run the app, syncing the source code with the container (-v)
  • the developer modifies the source code
  • if possible, use some kind of hot-reload library for Python
    • the app is ready to be opened from a browser, reflecting changes automatically
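
If the website service is a Django app (the ALLOWED_HOSTS, SECRET_KEY and static-files setup suggest it, but that is an assumption), a docker-compose.override.yml sketch of this flow could look like the following; the manage.py location (inside ./files) and the port are hypothetical:

# docker-compose.override.yml - picked up automatically by docker-compose up
version: '3.9'

services:
  website:
    volumes:
      - ./files:/files                                 # sync source code into the container
    environment:
      DEBUG: 1
    ports:
      - 8000:8000
    command: python manage.py runserver 0.0.0.0:8000   # Django's dev server reloads on code changes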

Docker for local development without hot-reload

If you cannot use a hot-reload library, you will need to build and run whenever you want to test your source code modifications. In this case, you should copy the source code into the container instead of syncing it with volumes as in the previous approach:

FROM python:3
RUN mkdir -p /usr/src/app
COPY . /usr/src/app
WORKDIR /usr/src/app
RUN pip install -r requirements.txt
RUN mkdocs build
WORKDIR /usr/src/app/site
CMD ["python", "-m", "http.server", "8000" ]

Steps should be:

  • start the requirements before the app (database, APIs, etc.)
  • create a special Dockerfile for the development stage
  • the developer modifies the source code
  • build
docker build -t myapp-dev .
  • run
docker run --name myapp-dev -it --rm -p 8000:8000 myapp-dev