I have a Dash app that I'm trying to host on port 8050, serving it with gunicorn. My Dockerfile contains:
FROM airflow-update
CMD gunicorn -b 0.0.0.0:8050 /py_scripts.index:server
and then I ran
docker run -p 8050:8050 airflow
and I got the error below:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not translate host name "postgres" to address: Name or service not known
My docker-compose.yaml file looks like this:
version: '3'
x-airflow-common:
  &airflow-common
  image: ${AIRFLOW_IMAGE_NAME:-airflow:latest}
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'true'
    AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
    _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./logs:/opt/airflow/logs/pipeline-logs
    - ./py_scripts:/opt/airflow/py_scripts
    - ./data:/opt/airflow/data
    - ./dbt-redshift:/opt/airflow/dbt-redshift
    - ./output:/opt/airflow/output
  user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"
  depends_on:
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy
services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5
    restart: always
  redis:
    image: redis:latest
    ports:
      - 6379:6379
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50
    restart: always
  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    healthcheck:
      test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
  airflow-worker:
    <<: *airflow-common
    command: celery worker
    healthcheck:
      test:
        - "CMD-SHELL"
        - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
  airflow-init:
    <<: *airflow-common
    command: version
    environment:
      <<: *airflow-common-env
      _AIRFLOW_DB_UPGRADE: 'true'
      _AIRFLOW_WWW_USER_CREATE: 'true'
      _AIRFLOW_WWW_USER_USERNAME: ${_AIRFLOW_WWW_USER_USERNAME:-airflow}
      _AIRFLOW_WWW_USER_PASSWORD: ${_AIRFLOW_WWW_USER_PASSWORD:-airflow}
  flower:
    <<: *airflow-common
    command: celery flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
volumes:
  postgres-db-volume:
What am I doing wrong? Should I update my docker-compose file with regard to postgres?
CodePudding user response:
Compose creates a network named default, but your docker run command isn't attached to that network, so the postgres hostname can't be resolved from your container.
The best approach is to move this container into your Compose setup. If the only thing you're changing is the command that gets run, you don't even need a custom image. This is the same pattern you already have for the several other containers in this file.
services:
  postgres: { ... }
  airflow-server:
    <<: *airflow-common
    command: gunicorn -b 0.0.0.0:8050 /py_scripts.index:server
    ports:
      - '8050:8050'
    restart: always
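With that in place the container runs on the same Compose-managed network as postgres, so the hostname resolves, and you can start it like any other service (using the airflow-server name from the snippet above), e.g.:
docker compose up -d airflow-server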
If you really want to run it with docker run, you need to find the network Compose creates by default and specifically attach to it. docker network ls will show it; its name will end with ..._default. Note that there's a huge amount of additional setup in the airflow-common block (environment variables, volumes, depends_on) and docker run won't see any of this at all.
docker run --net airflow_default -p 8050:8050 airflow
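If you're not sure which network that is, docker network inspect will also list the containers attached to it; for example (assuming the project directory is named airflow, so the network is airflow_default):
docker network inspect airflow_default --format '{{range .Containers}}{{.Name}} {{end}}'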
CodePudding user response:
I fixed it by adjusting @davidmaze's answer as below:
airflow-server:
  <<: *airflow-common
  ports:
    - '8050:8050'
  restart: always
  entrypoint: gunicorn --chdir /opt/airflow/py_scripts -b 0.0.0.0:8050 index:server
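As a side note on why --chdir helps: gunicorn expects a module:variable target, so a filesystem-style path like /py_scripts.index:server can't be imported, while --chdir /opt/airflow/py_scripts makes index importable and gunicorn then looks for a module-level WSGI object named server. A minimal sketch of what py_scripts/index.py needs to expose for this to work (the layout below is only a placeholder, not the actual app):
# index.py - minimal sketch of the module gunicorn imports as "index:server"
from dash import Dash, html  # Dash 2.x style imports; adjust for older versions

app = Dash(__name__)
app.layout = html.Div("placeholder layout")  # the real app's layout/callbacks go here

# Dash wraps a Flask app; gunicorn serves this WSGI object
server = app.server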