Use Airflow Bash Operator with Airflow Config values automatically included

We are using Airflow 2.3.4.

We want to use the BashOperator to run Airflow CLI commands. Following the documentation on the BashOperator, one can pass environment variables to the operator so they can be used inside the commands.

Is there a way to also add values from the Airflow config that are stored as environment variables?

Or can we just pass all environment variables through without having to list them all?
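
For context, Airflow config options can be supplied through environment variables of the form AIRFLOW__{SECTION}__{KEY} (for example, AIRFLOW__CORE__DAGS_FOLDER). A minimal sketch of the kind of task we have in mind, with the variable names being illustrative only:

import os

from airflow.operators.bash import BashOperator

# Illustrative only: expose one Airflow config value (read from its
# AIRFLOW__{SECTION}__{KEY} environment variable) to the bash command.
show_dags_folder = BashOperator(
    task_id="show_dags_folder",
    bash_command="echo $DAGS_FOLDER",
    env={"DAGS_FOLDER": os.environ.get("AIRFLOW__CORE__DAGS_FOLDER", "")},
)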

CodePudding user response:

You can add whatever environment variables you want through the env parameter:

import os

from airflow.operators.bash import BashOperator

# one env
bash_task = BashOperator(
    task_id="bash_task",
    bash_command="echo $var1_name && echo $var2_name",
    env={
        "var1_name": "{{ <any jinja var> }}",
        "var2_name": "static value",
    },
)

# all env from airflow host
bash_task = BashOperator(
    task_id="bash_task",
    bash_command="echo $<any env var>",
    env=os.environ,
)

# all env from airflow host + extra env
bash_task = BashOperator(
    task_id="bash_task",
    bash_command="echo $<any env var>",
    env={
        **os.environ,
        "var1_name": "{{ <any jinja var> }}",
        "var2_name": "static value",
    },
)
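
Note that when env is set, it replaces the environment the bash command would otherwise inherit, which is why os.environ is merged into the dict in the last two examples. As an alternative sketch (assuming your Airflow version has it; the append_env flag was added to BashOperator around the 2.3 release, so it should be available on 2.3.4), you can keep the host environment and only list the extras:

# keep host env and only add extras (assumes append_env is available
# in your Airflow version)
bash_task = BashOperator(
    task_id="bash_task",
    bash_command="echo $var1_name",
    env={"var1_name": "{{ <any jinja var> }}"},
    append_env=True,
)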