I am trying to create a DAG in Airflow 2+ that triggers multiple Data Fusion pipelines in parallel using the CloudDataFusionStartPipelineOperator.
However, I want to assign the parameter values (pipeline name, runtime arguments, etc.) for each Data Fusion pipeline dynamically, based on the output of a previous Python task.
The flow I am trying to build is:

start >> read_bq >> [df_1, ..., df_n]
Here, read_bq is a Python task that reads values (pipeline name, runtime arguments, etc.) from a BigQuery table and returns them as a list.
Looping over that list, I want to determine how many Data Fusion pipelines to trigger and pass the values returned from BigQuery to each of them; a sketch of what I am attempting is below.
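Here is a minimal sketch of the DAG I have in mind (the instance name, location, and the hard-coded config list are placeholders; the real list would come from read_bq's XCom, which is exactly the part I cannot wire up):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.operators.datafusion import (
    CloudDataFusionStartPipelineOperator,
)


def read_pipeline_configs(**context):
    # In reality this queries a BigQuery table; hard-coded here for brevity.
    # The returned list is pushed to XCom automatically.
    return [
        {"pipeline_name": "pipeline_a", "runtime_args": {"key": "a"}},
        {"pipeline_name": "pipeline_b", "runtime_args": {"key": "b"}},
    ]


with DAG(
    dag_id="trigger_df_pipelines",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    read_bq = PythonOperator(
        task_id="read_bq",
        python_callable=read_pipeline_configs,
    )

    # This static list stands in for what read_bq returns. The real list
    # only exists in XCom at runtime, after read_bq has run, so this loop
    # cannot see it at DAG-parse time -- that is my problem.
    configs = [
        {"pipeline_name": "pipeline_a", "runtime_args": {"key": "a"}},
        {"pipeline_name": "pipeline_b", "runtime_args": {"key": "b"}},
    ]
    for cfg in configs:
        df_task = CloudDataFusionStartPipelineOperator(
            task_id=f"df_{cfg['pipeline_name']}",
            pipeline_name=cfg["pipeline_name"],
            instance_name="my-datafusion-instance",  # placeholder
            location="us-central1",                  # placeholder
            runtime_args=cfg["runtime_args"],
        )
        read_bq >> df_task
```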
The problem I am facing is twofold: CloudDataFusionStartPipelineOperator does not expose a task_instance argument that I could use for an XCom pull, and I cannot loop over the XCom value at the top level of the DAG file, because xcom_pull only works inside a running task. The closest I have gotten is templating individual fields, shown below, but that cannot change the number of tasks.
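For a single, known task I can get part of the way with Jinja templating (pipeline_name and, as far as I can tell, runtime_args are templated fields of the operator), but that only substitutes values into a task that must already exist at parse time:

```python
df_templated = CloudDataFusionStartPipelineOperator(
    task_id="df_templated",
    # Pulls one value from read_bq's XCom at runtime, but the number of
    # tasks in the DAG is still fixed when the file is parsed.
    pipeline_name="{{ ti.xcom_pull(task_ids='read_bq')[0]['pipeline_name'] }}",
    instance_name="my-datafusion-instance",  # placeholder
    location="us-central1",                  # placeholder
)
```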
Any technical help or suggestion is appreciated.
Thanks, Santanu