Docker - Build a service after the dependent service is up and running


I have a docker-compose file for a Django application. Below is the structure of my docker-compose.yml

version: '3.8'

volumes:
  pypi-server:


services:
  backend:
    command: "bash ./install-ppr_an_run_dphi.sh"
    build:
      context: ./backend
      dockerfile: ./Dockerfile
    volumes:
      - ./backend:/usr/src/app
    expose:
      - 8000
    depends_on:
      - db

  pypi-server:
    image: pypiserver/pypiserver:latest
    ports:
      - 8080:8080
    volumes:
      - type: volume
        source: pypi-server
        target: /data/packages
    command: -P . -a . /data/packages
    restart: always

  db:
    image: mysql:8
    ports:
      - 3306:3306
    volumes:
      - ~/apps/mysql:/var/lib/mysql
    environment:
      - MYSQL_ROOT_PASSWORD=gary
      - MYSQL_PASSWORD=tempgary
      - MYSQL_USER=gary_user
      - MYSQL_DATABASE=gary_db

  nginx:
    build: ./nginx
    ports:
      - 80:80
    depends_on:
      - backend

The Django app depends on a couple of private packages hosted on the private pypi-server, without which the app won't run. I created a separate Dockerfile for the django backend alone, which installs the packages from requirements.txt as well as the packages from the private pypi-server. But the Dockerfile of the django-backend service runs even before the private pypi server is up. If I move the installation of the private packages into the command of the django-backend service in docker-compose.yml, then it works fine. The issue then is that, if the backend is running and I want to run some commands in django-backend (e.g. ./manage.py migrate), it says that the private packages are not installed.

I'm not sure how to proceed with this. It would be really helpful if I could get all these services running at once just by running the command docker-compose up --build -d

CodePudding user response:

I created a separate docker-compose file for the pypi-server, so that it is up and running even before I build/start the other services.
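
For reference, a minimal standalone compose file for the index might look like the sketch below (the service, volume, and port definitions are copied from the original file; the file name docker-compose.pypi.yml is just an example):

version: '3.8'

volumes:
  pypi-server:

services:
  pypi-server:
    image: pypiserver/pypiserver:latest
    ports:
      - 8080:8080
    volumes:
      - type: volume
        source: pypi-server
        target: /data/packages
    command: -P . -a . /data/packages
    restart: always

It can then be started first, e.g. with docker-compose -f docker-compose.pypi.yml up -d, before building the rest of the stack.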

CodePudding user response:

Have you tried adding the pypi-server service to depends_on of the backend app?

  backend:
    command: "bash ./install-ppr_an_run_dphi.sh"
    build:
      context: ./backend
      dockerfile: ./Dockerfile
    volumes:
      - ./backend:/usr/src/app
    expose:
      - 8000
    depends_on:
      - db
      - pypi-server
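
Note that depends_on on its own only controls start-up order; it does not wait until the pypi server is actually ready to serve packages. With a recent docker-compose / Compose V2 you can combine it with a healthcheck, roughly like the sketch below (the healthcheck command is an assumption; the pypiserver image is a Python application, so a simple urllib probe should work):

  pypi-server:
    image: pypiserver/pypiserver:latest
    # ... rest of the service definition as before ...
    healthcheck:
      # poll the index until it answers on port 8080
      test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8080/')"]
      interval: 5s
      timeout: 3s
      retries: 10

  backend:
    # ... rest of the service definition as before ...
    depends_on:
      db:
        condition: service_started
      pypi-server:
        condition: service_healthy

Bear in mind this only delays the container start, not the image build; docker-compose builds all images before it starts any of the services.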

Your docker-compose file raises a few questions though.

  1. Why install the custom packages into the backend service at run time? I can see many problems arising from this, such as latency during service restarts, possibly different environments between runs of the same version of the backend service, and any problem with the installation only showing up during the deployment and bringing it down, etc. Installation should be done during the build of the docker image (see the Dockerfile sketch after this list). Could you provide your Dockerfile maybe?

  2. Is there any reason why the pypi server has to share a docker-compose file with the application? I'd suggest having it in a separate deployment, especially if it is to be shared among other projects.

  3. Is the pypi server supposed to be used for anything other than as a source of the custom packages for the backend service? If not, then I'd consider getting rid of it / using it for the builds only.

  4. Is there any good reason why you want to have all the ports exposed? This creates a significant attack surface: e.g. an attacker could bypass the reverse proxy and talk directly to the backend service using port 8000, or connect to the db on port 3306. N.B. docker-compose creates subnetworks among the containers, so they can access each other's ports even if those ports are not forwarded to the host machine.

  5. Consider using docker secrets to store the db credentials (see the sketch below).
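
Regarding point 1, a build-time installation from the private index could look roughly like the sketch below. Everything in it is an assumption (base image, working directory, requirements file, index URL), since the actual Dockerfile isn't shown. In particular, during a build the compose-network hostname pypi-server is not resolvable, so the index has to be reachable via the host (e.g. host.docker.internal on Docker Desktop, the host's IP on Linux, or build.network: host):

# backend/Dockerfile (sketch)
FROM python:3.10-slim

WORKDIR /usr/src/app

# The private index URL is passed in at build time; the default below is only an example.
ARG PRIVATE_INDEX_URL=http://host.docker.internal:8080/simple/

# Install the public requirements plus the private packages during the image build.
# --trusted-host is needed because the example index is served over plain http.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt \
        --extra-index-url "$PRIVATE_INDEX_URL" \
        --trusted-host host.docker.internal

COPY . .

The URL can be overridden per environment through build args in the compose file. This also presumes the pypi server is already running when the image is built, which is one more argument for keeping it in a separate deployment (point 2).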
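
And for point 5, a minimal sketch of file-based docker secrets with the mysql image (the secret file paths are placeholders; the official mysql image reads the _FILE variants of its password variables):

secrets:
  mysql_root_password:
    file: ./secrets/mysql_root_password.txt
  mysql_password:
    file: ./secrets/mysql_password.txt

services:
  db:
    image: mysql:8
    secrets:
      - mysql_root_password
      - mysql_password
    environment:
      - MYSQL_ROOT_PASSWORD_FILE=/run/secrets/mysql_root_password
      - MYSQL_PASSWORD_FILE=/run/secrets/mysql_password
      - MYSQL_USER=gary_user
      - MYSQL_DATABASE=gary_db

This keeps the passwords out of the compose file itself and out of docker inspect output.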
