I'm trying to run an Airflow image using Docker. A while ago it was working normally. However, I ran other applications on localhost without Docker (using Visual Studio), and when I reran Airflow after that, localhost no longer worked.
I've tried reinstalling Docker and the Airflow image, but with no success.
I'm using the Airflow 2.1.4 Docker image available on the Apache website.
My Airflow is set to run on localhost:8080. Is there a way to know if another application is using port 8080?
I'm not sure what information I should post here to clarify my problem. I believe that checking whether another application is bound to localhost:8080 (either an instance of Visual Studio or Docker itself) could solve the problem. But how do I do this?
My docker-compose.yaml file:
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Basic Airflow cluster configuration for CeleryExecutor with Redis and PostgreSQL.
#
# WARNING: This configuration is for local development. Do not use it in a production deployment.
#
# This configuration supports basic configuration using environment variables or an .env file
# The following variables are supported:
#
# AIRFLOW_IMAGE_NAME - Docker image name used to run Airflow.
# Default: apache/airflow:2.1.4
# AIRFLOW_UID - User ID in Airflow containers
# Default: 50000
# AIRFLOW_GID - Group ID in Airflow containers
# Default: 0
#
# Those configurations are useful mostly in case of standalone testing/running Airflow in test/try-out mode
#
# _AIRFLOW_WWW_USER_USERNAME - Username for the administrator account (if requested).
# Default: airflow
# _AIRFLOW_WWW_USER_PASSWORD - Password for the administrator account (if requested).
# Default: airflow
# _PIP_ADDITIONAL_REQUIREMENTS - Additional PIP requirements to add when starting all containers.
# Default: ''
#
# Feel free to modify this file to suit your needs.
---
version: "3"
x-airflow-common: &airflow-common
# In order to add custom dependencies or upgrade provider packages you can use your extended image.
# Comment the image line, place your Dockerfile in the directory where you placed the docker-compose.yaml
# and uncomment the "build" line below, Then run `docker-compose build` to build the images.
image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.1.4}
# build: .
environment: &airflow-common-env
AIRFLOW__CORE__EXECUTOR: CeleryExecutor
AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
AIRFLOW__CORE__FERNET_KEY: ""
AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: "true"
AIRFLOW__CORE__LOAD_EXAMPLES: "true"
AIRFLOW__API__AUTH_BACKEND: "airflow.api.auth.backend.basic_auth"
_PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
volumes:
- ./dags:/opt/airflow/dags
- ./logs:/opt/airflow/logs
- ./plugins:/opt/airflow/plugins
user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-0}"
depends_on: &airflow-common-depends-on
redis:
condition: service_healthy
postgres:
condition: service_healthy
services:
postgres:
image: postgres:13
environment:
POSTGRES_USER: airflow
POSTGRES_PASSWORD: airflow
POSTGRES_DB: airflow
volumes:
- postgres-db-volume:/var/lib/postgresql/data
healthcheck:
test: ["CMD", "pg_isready", "-U", "airflow"]
interval: 5s
retries: 5
restart: always
redis:
image: redis:latest
expose:
- 6379
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 5s
timeout: 30s
retries: 50
restart: always
airflow-webserver:
<<: *airflow-common
command: webserver
ports:
- 5000:8080
healthcheck:
test: ["CMD", "curl", "--fail", "http://localhost:5000/health"]
interval: 10s
timeout: 10s
retries: 5
restart: always
depends_on:
<<: *airflow-common-depends-on
airflow-init:
condition: service_completed_successfully
airflow-scheduler:
<<: *airflow-common
command: scheduler
healthcheck:
test:
[
"CMD-SHELL",
'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"',
]
interval: 10s
timeout: 10s
retries: 5
restart: always
depends_on:
<<: *airflow-common-depends-on
airflow-init:
condition: service_completed_successfully
airflow-worker:
<<: *airflow-common
command: celery worker
healthcheck:
test:
- "CMD-SHELL"
- 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
interval: 10s
timeout: 10s
retries: 5
environment:
<<: *airflow-common-env
# Required to handle warm shutdown of the celery workers properly
# See https://airflow.apache.org/docs/docker-stack/entrypoint.html#signal-propagation
DUMB_INIT_SETSID: "0"
restart: always
depends_on:
<<: *airflow-common-depends-on
airflow-init:
condition: service_completed_successfully
airflow-init:
<<: *airflow-common
entrypoint: /bin/bash
# yamllint disable rule:line-length
command:
- -c
- |
function ver() {
printf "dddd" $${1//./ }
}
airflow_version=$$(gosu airflow airflow version)
airflow_version_comparable=$$(ver $${airflow_version})
min_airflow_version=2.1.0
min_airflow_version_comparable=$$(ver $${min_airflow_version})
if (( airflow_version_comparable < min_airflow_version_comparable )); then
echo
echo -e "\033[1;31mERROR!!!: Too old Airflow version $${airflow_version}!\e[0m"
echo "The minimum Airflow version supported: $${min_airflow_version}. Only use this or higher!"
echo
exit 1
fi
if [[ -z "${AIRFLOW_UID}" ]]; then
echo
echo -e "\033[1;33mWARNING!!!: AIRFLOW_UID not set!\e[0m"
echo "If you are on Linux, you SHOULD follow the instructions below to set "
echo "AIRFLOW_UID and AIRFLOW_GID environment variables, otherwise files will be owned by root."
echo "For other operating systems you can get rid of the warning with manually created .env file:"
echo " See: https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#setting-the-right-airflow-user"
echo
fi
one_meg=1048576
mem_available=$$(($$(getconf _PHYS_PAGES) * $$(getconf PAGE_SIZE) / one_meg))
cpus_available=$$(grep -cE 'cpu[0-9]+' /proc/stat)
disk_available=$$(df / | tail -1 | awk '{print $$4}')
warning_resources="false"
if (( mem_available < 4000 )) ; then
echo
echo -e "\033[1;33mWARNING!!!: Not enough memory available for Docker.\e[0m"
echo "At least 4GB of memory required. You have $$(numfmt --to iec $$((mem_available * one_meg)))"
echo
warning_resources="true"
fi
if (( cpus_available < 2 )); then
echo
echo -e "\033[1;33mWARNING!!!: Not enough CPUS available for Docker.\e[0m"
echo "At least 2 CPUs recommended. You have $${cpus_available}"
echo
warning_resources="true"
fi
if (( disk_available < one_meg * 10 )); then
echo
echo -e "\033[1;33mWARNING!!!: Not enough Disk space available for Docker.\e[0m"
echo "At least 10 GBs recommended. You have $$(numfmt --to iec $$((disk_available * 1024 )))"
echo
warning_resources="true"
fi
if [[ $${warning_resources} == "true" ]]; then
echo
echo -e "\033[1;33mWARNING!!!: You have not enough resources to run Airflow (see above)!\e[0m"
echo "Please follow the instructions to increase amount of resources available:"
echo " https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#before-you-begin"
echo
fi
mkdir -p /sources/logs /sources/dags /sources/plugins
chown -R "${AIRFLOW_UID}:${AIRFLOW_GID}" /sources/{logs,dags,plugins}
exec /entrypoint airflow version
# yamllint enable rule:line-length
environment:
<<: *airflow-common-env
_AIRFLOW_DB_UPGRADE: "true"
_AIRFLOW_WWW_USER_CREATE: "true"
_AIRFLOW_WWW_USER_USERNAME: ${_AIRFLOW_WWW_USER_USERNAME:-airflow}
_AIRFLOW_WWW_USER_PASSWORD: ${_AIRFLOW_WWW_USER_PASSWORD:-airflow}
user: "0:${AIRFLOW_GID:-0}"
volumes:
- .:/sources
airflow-cli:
<<: *airflow-common
profiles:
- debug
environment:
<<: *airflow-common-env
CONNECTION_CHECK_MAX_COUNT: "0"
# Workaround for entrypoint issue. See: https://github.com/apache/airflow/issues/16252
command:
- bash
- -c
- airflow
flower:
<<: *airflow-common
command: celery flower
ports:
- 5555:5555
healthcheck:
test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
interval: 10s
timeout: 10s
retries: 5
restart: always
depends_on:
<<: *airflow-common-depends-on
airflow-init:
condition: service_completed_successfully
volumes:
postgres-db-volume:
My Airflow webserver log:
airflow-webserver_1 | ....................
airflow-webserver_1 | ERROR! Maximum number of retries (20) reached.
airflow-webserver_1 |
airflow-webserver_1 | Last check result:
airflow-webserver_1 | $ airflow db check
airflow-webserver_1 | Unable to load the config, contains a configuration error.
airflow-webserver_1 | Traceback (most recent call last):
airflow-webserver_1 | File "/usr/local/lib/python3.6/pathlib.py", line 1248, in mkdir
airflow-webserver_1 | self._accessor.mkdir(self, mode)
airflow-webserver_1 | File "/usr/local/lib/python3.6/pathlib.py", line 387, in wrapped
airflow-webserver_1 | return strfunc(str(pathobj), *args)
airflow-webserver_1 | FileNotFoundError: [Errno 2] No such file or directory: '/opt/airflow/logs/scheduler/2021-09-20'
airflow-webserver_1 |
airflow-webserver_1 | During handling of the above exception, another exception occurred:
airflow-webserver_1 |
airflow-webserver_1 | Traceback (most recent call last):
airflow-webserver_1 | File "/usr/local/lib/python3.6/logging/config.py", line 565, in configure
airflow-webserver_1 | handler = self.configure_handler(handlers[name])
airflow-webserver_1 | File "/usr/local/lib/python3.6/logging/config.py", line 738, in configure_handler
airflow-webserver_1 | result = factory(**kwargs)
airflow-webserver_1 | File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/log/file_processor_handler.py", line 47, in __init__
airflow-webserver_1 | Path(self._get_log_directory()).mkdir(parents=True, exist_ok=True)
airflow-webserver_1 | File "/usr/local/lib/python3.6/pathlib.py", line 1252, in mkdir
airflow-webserver_1 | self.parent.mkdir(parents=True, exist_ok=True)
airflow-webserver_1 | File "/usr/local/lib/python3.6/pathlib.py", line 1248, in mkdir
airflow-webserver_1 | self._accessor.mkdir(self, mode)
airflow-webserver_1 | File "/usr/local/lib/python3.6/pathlib.py", line 387, in wrapped
airflow-webserver_1 | return strfunc(str(pathobj), *args)
airflow-webserver_1 | PermissionError: [Errno 13] Permission denied: '/opt/airflow/logs/scheduler'
airflow-webserver_1 |
airflow-webserver_1 | During handling of the above exception, another exception occurred:
airflow-webserver_1 |
airflow-webserver_1 | Traceback (most recent call last):
airflow-webserver_1 | File "/home/airflow/.local/bin/airflow", line 5, in <module>
airflow-webserver_1 | from airflow.__main__ import main
airflow-webserver_1 | File "/home/airflow/.local/lib/python3.6/site-packages/airflow/__init__.py", line 46, in <module>
airflow-webserver_1 | settings.initialize()
airflow-webserver_1 | File "/home/airflow/.local/lib/python3.6/site-packages/airflow/settings.py", line 444, in initialize
airflow-webserver_1 | LOGGING_CLASS_PATH = configure_logging()
airflow-webserver_1 | File "/home/airflow/.local/lib/python3.6/site-packages/airflow/logging_config.py", line 73, in configure_logging
airflow-webserver_1 | raise e
airflow-webserver_1 | File "/home/airflow/.local/lib/python3.6/site-packages/airflow/logging_config.py", line 68, in configure_logging
airflow-webserver_1 | dictConfig(logging_config)
airflow-webserver_1 | File "/usr/local/lib/python3.6/logging/config.py", line 802, in dictConfig
airflow-webserver_1 | dictConfigClass(config).configure()
airflow-webserver_1 | File "/usr/local/lib/python3.6/logging/config.py", line 573, in configure
airflow-webserver_1 | '%r: %s' % (name, e))
airflow-webserver_1 | ValueError: Unable to configure handler 'processor': [Errno 13] Permission denied: '/opt/airflow/logs/scheduler'
airflow-webserver_1 |
airflow-webserver_1 exited with code 1
I'm using Docker Desktop on Windows 10.
A screenshot of my docker ps output shows that localhost is bound to 8080 and the container is running but Unhealthy.
CodePudding user response:
To check all the containers already running in Docker, you can run the following command and check whether any of them is using port 8080:
docker container ls
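If you only want to see containers publishing port 8080, docker ps also accepts a publish filter, for example:
docker ps --filter "publish=8080"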
Everything related to web development is usually deployed on 80/8080/443/8443 by default, so the best workaround here is to change the port binding of your Docker container to a port other than 8080.
If you are using the docker command directly, you will see a "-p local_port:container_port" option; set a different port for "local_port" so that the container's 8080 is not bound to your local 8080. That way everything will run, as sketched below.
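With the docker-compose.yaml shown in the question, that means changing only the host side of the webserver port mapping; a minimal sketch (assuming 8081 is free on the host) would be:
airflow-webserver:
  <<: *airflow-common
  command: webserver
  ports:
    - 8081:8080
The container side stays 8080 because that is the port the webserver listens on inside the container; only the host side changes, so the UI would then be reachable at localhost:8081.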
To help you a bit more, we would need at least your OS (so we can give you a valid command to check whether any process is using port 8080) and how you start your Docker image (the arguments, command, docker-compose file, ...).
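Since the question mentions Docker Desktop on Windows 10, a quick way to check for a process listening on 8080 (a sketch to run in Command Prompt or PowerShell; the last column of the netstat output is the PID to look up) is:
netstat -ano | findstr :8080
tasklist /FI "PID eq <pid_from_netstat>"
On Linux or macOS, lsof -i :8080 does the same job.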
CodePudding user response:
The logs for your server show
FileNotFoundError: [Errno 2] No such file or directory: '/opt/airflow/logs/scheduler/2021-09-20'
and
PermissionError: [Errno 13] Permission denied: '/opt/airflow/logs/scheduler'
Your compose file contains
volumes:
- ./dags:/opt/airflow/dags
- ./logs:/opt/airflow/logs
- ./plugins:/opt/airflow/plugins
user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-0}"
Which means that all of your containers share several folders from your host, and run as whatever user the environment variable AIRFLOW_UID specifies. One of two things probably happened:
- That env variable changed. It was set and is now unset, or something like that, or
- You ran chown on those folders with your own user. This is probably the issue. The dags folder in particular is almost certainly owned by your personal user, and uid 50000 would not have permission to read those files. (If the files were owned by 50000, you wouldn't be able to read them yourself.) For local work, I would recommend running these commands in such a way that AIRFLOW_UID is set to your user id, like AIRFLOW_UID=$(id -u) docker-compose up.
Now... I don't develop on Windows, and I am aware that the line I just gave you would not work on Windows, and that filesystem permissions are different with docker-desktop and even between different versions of it. But your issue is going to come down to the permissions on those shared bind mount folders.
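Since AIRFLOW_UID=$(id -u) won't expand in a Windows shell, one workaround (following the Airflow quick-start docs rather than anything specific to this question) is to put the value in a .env file next to docker-compose.yaml so Compose picks it up automatically:
AIRFLOW_UID=50000
AIRFLOW_GID=0
On Linux you could generate the same file with echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env, and then make sure the mounted dags, logs and plugins folders are readable by that uid.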
CodePudding user response:
I used the following command to check the container health log:
docker inspect --format='{{json .State.Health}}' dags_airflow-webserver_1
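If the single-line JSON is hard to read, the same call can be narrowed to just the health-check log entries, or piped through a JSON pretty-printer (assuming Python is available on the host):
docker inspect --format='{{json .State.Health.Log}}' dags_airflow-webserver_1
docker inspect --format='{{json .State.Health}}' dags_airflow-webserver_1 | python -m json.tool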
It showed me the error:
0curl: (7) Failed to connect to localhost port 5000: Connection refused\n"
I searched on Stack Overflow and found the same issue here.
What solved the problem for me was running VSCode and Docker Desktop as Administrator.