How to execute multiple sql files in airflow using PostgresOperator?


I have multiple SQL files in my sql folder. How can I execute all of them within a single DAG?

  - dags
    - sql
      - dummy1.sql
      - dummy2.sql

For a single file, the code below works:

sql_insert = PostgresOperator(task_id='sql_insert',
                              postgres_conn_id='postgres_conn',
                              sql='sql/dummy1.sql')

CodePudding user response:

The `sql` parameter also accepts a list of files:

sql_insert = PostgresOperator(task_id='sql_insert',
                              postgres_conn_id='postgres_conn',
                              sql=['sql/dummy1.sql', 'sql/dummy2.sql'])

Or you can build the list dynamically:

import glob

sql_insert = PostgresOperator(task_id='sql_insert',
                              postgres_conn_id='postgres_conn',
                              sql=glob.glob("sql/*.sql"))
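One caveat with the glob approach: `glob.glob("sql/*.sql")` resolves the pattern against the scheduler's current working directory, which is not guaranteed to be the DAG folder, and the match order is not deterministic. A sketch of a more robust variant, assuming the DAG file sits directly in the `dags/` folder next to `sql/` (the helper name `list_sql_files` is hypothetical, not an Airflow API):

```python
from pathlib import Path

def list_sql_files(dag_dir):
    """Return sorted 'sql/<name>.sql' paths relative to the DAG folder,
    so the file list is stable regardless of the working directory."""
    dag_dir = Path(dag_dir)
    return sorted(str(p.relative_to(dag_dir)) for p in (dag_dir / "sql").glob("*.sql"))

# In the DAG file (assumed to live directly in the dags/ folder):
# sql_insert = PostgresOperator(task_id='sql_insert',
#                               postgres_conn_id='postgres_conn',
#                               sql=list_sql_files(Path(__file__).parent))
```

The relative paths this returns (e.g. `sql/dummy1.sql`) are resolved by the operator's template engine the same way as the hard-coded single-file path above.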