
I am trying to share a common variable with all the tasks, e.g. a pipeline_id which I calculate from the current system time.

Is there a way to pass this variable to all the tasks in a DAG? Currently I get a different value in different tasks because they run in different processes.

I have some logic to separate different pipeline runs.

  • You can create the pipeline_id in a PythonOperator at the beginning of the DAG and pass it to each task (via op_kwargs, conf, or params, depending on your operators). – ozs Jun 28 '22 at 10:45

2 Answers


In Airflow, Variables have global scope and can be used for overall configuration; they are intended for values that are runtime-dependent.

As you want to share a common variable, you can try using XComs, which can be used to pass data from one task/operator to another. XComs (cross-communications) are identified by a key and the task_id or dag_id they come from. They are per-task-instance and are used for communication between tasks. For more information, you can check this link.
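A minimal sketch of the XCom pattern for this case (function and key names are my own, not from the question): one task generates the pipeline_id once and pushes it; every downstream task pulls the same value. In a real DAG these callables would be wired up with PythonOperator or the @task decorator; `ti` is the TaskInstance Airflow passes in the task context.

```python
import time

def make_pipeline_id(ti, **context):
    # Runs once, in the first task of the DAG; all other tasks read this value.
    pipeline_id = f"pipeline_{int(time.time())}"
    ti.xcom_push(key="pipeline_id", value=pipeline_id)
    return pipeline_id

def downstream_task(ti, **context):
    # Pull by key and by the producing task's task_id, so every task
    # sees the exact same pipeline_id regardless of which process it runs in.
    pipeline_id = ti.xcom_pull(task_ids="make_pipeline_id", key="pipeline_id")
    print(f"processing run {pipeline_id}")
    return pipeline_id
```

Because the value is computed exactly once and only read afterwards, the different worker processes no longer matter.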

Shipra Sarkar

I think I have found a better way. When DAG execution starts, Airflow populates AIRFLOW_CTX_DAG_RUN_ID=manual__2022-07-08T16:30:02.549233+00:00 in each task's environment,

so we can extract the timestamp part of this variable and use it as a common global time for all of the tasks.

I don't need to use XComs with this, as the environment variable can be accessed inside each task.
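A sketch of the extraction step, assuming the `<trigger>__<ISO timestamp>` run_id format shown above (the prefix is `manual__` for manual triggers and `scheduled__` for scheduled runs; the helper name is my own):

```python
import os
from datetime import datetime

def shared_run_timestamp(run_id=None):
    # Every task process in the same DAG run sees the same run_id,
    # so this yields one common timestamp without any XCom traffic.
    run_id = run_id or os.environ["AIRFLOW_CTX_DAG_RUN_ID"]
    # Split off the trigger prefix ("manual", "scheduled", ...) at "__".
    _, _, stamp = run_id.partition("__")
    return datetime.fromisoformat(stamp)
```

Each task calls `shared_run_timestamp()` independently and still arrives at the identical value, which is exactly the property the original current-time approach was missing.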