How to pass JSON variable to external bash script in Airflow BashOperator


Here I am getting variables from the Airflow GUI in the form of JSON:

from airflow import DAG
from airflow.models import Variable
from airflow.operators.bash import BashOperator

example_vars = Variable.get("example_vars", deserialize_json=True)

dag = DAG('NEW_DAG', description='NEW_DAG', schedule_interval=None,
          default_args=default_args, catchup=False)

shell_command = 'path-to-my-file/my_script.sh '

running_task = BashOperator(
    task_id='task_to_run',
    bash_command=shell_command,
    trigger_rule="all_done",
    params={"val": example_vars},
    dag=dag,
)

running_task

I tried the approach from this page: Airflow BashOperator: Passing parameter to external bash script

In the bash script: echo {{ params.val }}

It prints the literal string {{ params.val }}, not the JSON.

CodePudding user response:

You need to pass the value as an argument on the command line; Jinja templating is applied to the bash_command string itself, not inside the external script:

shell_command = "path-to-my-file/my_script.sh '{{ var.value.example_vars }}'"

({{ var.value.example_vars }} renders to the raw string stored in the Airflow Variable, i.e. the JSON text.)

See the examples at https://airflow.apache.org/docs/apache-airflow/stable/howto/operator/bash.html
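
Putting that together, a minimal runnable sketch (assuming an Airflow Variable named example_vars exists; the start_date is a placeholder, and the single quotes around the template keep the rendered JSON as one shell argument, which holds as long as the JSON itself contains no single quotes):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {'start_date': datetime(2023, 1, 1)}  # placeholder

# {{ var.value.example_vars }} renders to the raw string stored in the
# Variable, so the script receives the JSON text as its first argument.
shell_command = "path-to-my-file/my_script.sh '{{ var.value.example_vars }}'"

with DAG('NEW_DAG', schedule_interval=None, default_args=default_args,
         catchup=False) as dag:
    running_task = BashOperator(
        task_id='task_to_run',
        bash_command=shell_command,
    )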

CodePudding user response:

import json
import shlex

from airflow.operators.bash import BashOperator

# JSON variable
data = {'key': 'value'}

# Convert the JSON variable to a string
json_data = json.dumps(data)

# Quote the string to escape any special characters
escaped_json_data = shlex.quote(json_data)

# Pass the quoted string to the bash script
bash_command = './script.sh ' + escaped_json_data

# Create a BashOperator
bash_task = BashOperator(task_id='bash_task', bash_command=bash_command, dag=dag)
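
Why shlex.quote matters: the JSON contains double quotes, and often spaces, that the shell would otherwise split or strip. A quick check (the payload here is just an illustration):

import json
import shlex

data = {'key': 'value', 'note': 'has spaces and "quotes"'}
print(shlex.quote(json.dumps(data)))
# Prints the JSON wrapped in single quotes, so bash passes it
# through as a single argument.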

In the bash script:

#!/bin/bash

# The quoted JSON arrives as the first positional argument.
json_data="$1"
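
From there the script can hand the string to a JSON-aware tool rather than parsing it with bash itself; a sketch assuming jq is installed and the payload has a top-level "key" field, as in the example above:

#!/bin/bash
json_data="$1"
# Extract a single field from the JSON argument (requires jq).
key=$(echo "$json_data" | jq -r '.key')
echo "key is: $key"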
