Use Python and Node.js in the same Dockerfile and create one image that I could use for both

Time:02-21

I have written a small script in Node.js that calls a Python file and captures the Python file's output. I first build the image with docker build -t backend_airbnb . and then run docker compose up -d. After that I check whether the container is running, but it exits immediately without an error message. It only says backend_airbnb exited with code 0.

How can I build a multi-stage Dockerfile that first installs the Python requirements and then installs Node (or vice versa) and runs npm start, so that I can execute my Python file when a POST request comes in?

folder structure

|-- app.js
|-- requirements.txt
|-- test.js
|-- routes
|-- |-- model.py
|-- |-- post_price.js

Dockerfile

FROM python:3.6.8
#RUN mkdir -p /usr/src/app
COPY requirements.txt /opt/app/requirements.txt
WORKDIR /opt/app
RUN pip install -r requirements.txt

FROM node:14
WORKDIR  /opt/app
COPY package*.json ./
RUN npm install
ENV NODE_ENV=container
COPY . .
EXPOSE 4001
CMD npm start

docker-compose.yml

version: '3.8'

services:
    backend:
        container_name: backend_airbnb
        image: backend_airbnb
        expose:
            - "4001"
        ports:
            - "4001:4001"
        networks:
            - backendProxyNetwork

networks:
    backendProxyNetwork:
      external: true

EDIT: Did not work, still the same problem. New Dockerfile (after reading "Multiple FROMs - what it means"):

FROM python:3.6.8 AS build
#RUN mkdir -p /usr/src/app
COPY requirements.txt /opt/app/requirements.txt
WORKDIR /opt/app
RUN pip install -r requirements.txt

FROM node:14
WORKDIR  /opt/app
COPY --from=build package*.json ./
RUN npm install
ENV NODE_ENV=container
COPY . .
EXPOSE 4001
CMD npm start

CodePudding user response:

There are some errors in your image definition:

1st. You are trying to use Python from the node image, but that image doesn't have Python installed, so that won't work.

2nd. Even if you install your Python dependencies in the first stage of a multi-stage build, they do not exist in the next stage unless you copy them over, so it's as if you hadn't installed them at all.
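To make the second point concrete, here is a minimal sketch (paths are illustrative): nothing from a build stage reaches the final image unless it is copied with an explicit COPY --from.

```dockerfile
FROM python:3.10 AS build
# everything pip installs into this venv lives only in the build stage
RUN python -m venv /opt/venv

FROM node:14
# the final image starts from node:14 alone; the venv only appears
# here because it is copied over explicitly:
COPY --from=build /opt/venv /opt/venv
```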

There are a few ways of achieving what you want, but I will tell you what I would do.

First, you will need to agree on which Python version to use in your project; let's say you want to use Python 3.10.

Then, you will need to create a venv within the build container, since this is what you will pass on to your runtime container:

FROM python:3.10 as build

WORKDIR /opt/app
RUN python -m venv /opt/app/venv
ENV PATH="/opt/app/venv/bin:$PATH"

COPY requirements.txt .
RUN pip install -r requirements.txt

Now you will have all your dependencies installed in your venv, so you can carry them over to the runtime container (where you will need to install the same Python version that you used in your build image).

FROM node:14

# note: deadsnakes is an Ubuntu PPA; on a Debian-based node image you may
# need a different source for Python 3.10
RUN apt-get update \
    && apt-get install -y software-properties-common \
    && add-apt-repository -y ppa:deadsnakes/ppa \
    && apt-get update \
    && apt-get install -y python3.10

WORKDIR /opt/app
COPY --from=build /opt/app/venv /opt/app/venv

ENV PATH="/opt/app/venv/bin:$PATH"
ENV NODE_ENV=container

COPY package*.json ./
RUN npm install

COPY . .
EXPOSE 4001
CMD npm start

With that, you will have Python 3.10 installed in your runtime Node image, together with the dependencies that you already downloaded/compiled in your Python 3.10 build image.

CodePudding user response:

To have docker-compose build your Dockerfile, you need to specify its location by adding a build section to your service:

services:
  server:
      build:
        context: ./
        dockerfile: Dockerfile
      ports:
        - "4001:4001"
      networks:
        - backendProxyNetwork
