
We are using Airflow 2.1.4 via Google Cloud Composer and reference our queries in `BigQueryInsertJobOperator` by pointing at a path in the Composer GCS bucket (i.e. `"query": "{% include ' ...`). This works fine, except that in some of our DAGs the first step compiles new queries, which are then referenced by subsequent tasks. In those cases the DAG does not pick up the newly generated queries but always uses the ones that were present before.

Is there a parameter we can set so that the operator refreshes at a certain interval and always picks up the latest query file available, rather than a cached copy of a previous version?
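For reference, Jinja itself does not have to serve stale includes: with its template cache disabled, every render re-reads the file from disk. Below is a minimal sketch using plain jinja2 (which Airflow uses for templating) to illustrate the mechanism outside Airflow; the file name and the `cache_size=0` setting are assumptions for illustration, not Airflow operator parameters.

```python
# Minimal sketch, outside Airflow, showing that Jinja's {% include %} is
# resolved through the environment's loader at render time. The file name
# and settings here are illustrative only.
import pathlib
import tempfile

from jinja2 import Environment, FileSystemLoader

sql_dir = pathlib.Path(tempfile.mkdtemp())
(sql_dir / "query.sql").write_text("SELECT 1")

# cache_size=0 disables Jinja's template cache, so every render re-reads
# the included file from disk instead of serving a previously loaded copy.
env = Environment(loader=FileSystemLoader(str(sql_dir)), cache_size=0)
template = env.from_string("{% include 'query.sql' %}")

first = template.render()   # renders the original file contents

# Simulate the "compile new queries" step overwriting the file.
(sql_dir / "query.sql").write_text("SELECT 2")
second = template.render()  # re-reads the file, picking up the new contents
```

If the operator still renders stale SQL, the staleness is more likely in the GCS-to-worker sync or in when Airflow renders the template than in Jinja itself.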

Thank you for your help.
