According to the Airflow connection management docs, connections can be created via environment variables:
export AIRFLOW_CONN_MY_PROD_DATABASE='my-conn-type://login:password@host:port/schema?param1=val1&param2=val2'
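The URI packs all connection fields into a single string. As a rough illustration of how it decomposes (using only Python's standard library, and substituting a made-up numeric port `1234` because `host:port` above are placeholders):

```python
from urllib.parse import urlsplit, parse_qs

# Illustrative URI; all values here are placeholders
uri = "my-conn-type://login:password@host:1234/schema?param1=val1&param2=val2"
parts = urlsplit(uri)

print(parts.scheme)            # connection type: my-conn-type
print(parts.username)          # login
print(parts.hostname)          # host
print(parts.port)              # 1234
print(parts.path.lstrip("/"))  # schema
print(parse_qs(parts.query))   # extras: {'param1': ['val1'], 'param2': ['val2']}
```

Note that this is just to show the URI anatomy; Airflow does its own parsing internally.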
So I downloaded the official docker-compose.yaml:
$ curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.2.0/docker-compose.yaml'
Then I added an environment variable for the connection:
...
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.2.0}
  # build: .
  environment:
    &airflow-common-env
    AIRFLOW_CONN_MY_PROD_DB: my-conn-type://login:password@host:port/schema?param1=val1&param2=val2
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
...
Then I started all the containers with docker-compose up
and opened a shell in the airflow-worker service:
$ docker-compose exec airflow-worker /bin/bash
and listed all connections:
airflow@52d9c6ab9309:/opt/airflow$ airflow connections list
But it said:
No data found
Am I missing something?
CodePudding user response:
Airflow only lists connections from the underlying metastore, so connections defined via environment variables, or in a different secrets backend such as Vault, will not appear in the Airflow UI or in the CLI listing.
To check if the connection works, you can open a Python terminal and run:
from airflow.hooks.base import BaseHook

# The env var AIRFLOW_CONN_MY_PROD_DB maps to the conn id "my_prod_db"
conn = BaseHook.get_connection("my_prod_db")

# now you can inspect the fields, e.g.
print(conn.host)
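The reason the conn id works without being in the metastore is that Airflow's environment-variables secrets backend simply looks up `AIRFLOW_CONN_<CONN_ID>` with the conn id uppercased. A minimal sketch of that lookup (a simplification, not Airflow's actual implementation, and using only the standard library):

```python
import os
from typing import Optional

def env_conn_uri(conn_id: str) -> Optional[str]:
    # Mimics the lookup: conn id is uppercased and prefixed with AIRFLOW_CONN_
    return os.environ.get(f"AIRFLOW_CONN_{conn_id.upper()}")

# Simulate the variable set in docker-compose.yaml
os.environ["AIRFLOW_CONN_MY_PROD_DB"] = "my-conn-type://login:password@host:1234/schema"

print(env_conn_uri("my_prod_db"))  # my-conn-type://login:password@host:1234/schema
print(env_conn_uri("missing"))     # None
```

This also explains the "No data found" output: `airflow connections list` queries the database, and the env-var lookup never writes anything there.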