
Is it possible to pass parameters to Airflow's jobs through UI?

AFAIK, the 'params' argument of a DAG is defined in Python code, so it can't be changed at runtime.

Alexander Ershov

4 Answers


Depending on what you're trying to do, you might be able to leverage Airflow Variables. These can be defined or edited in the UI under the Admin tab. Then your DAG code can read the value of the variable and pass the value to the DAG(s) it creates.

Note, however, that although Variables let you decouple values from code, all runs of a DAG will read the same value for the variable. If you want different runs to receive different values, your best bet is probably to use Airflow's templating macros and differentiate runs with the run_id macro or similar.
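For illustration, a minimal sketch of reading a Variable from DAG code (the variable name my_param is hypothetical; default_var is a fallback returned when the variable is unset):

from airflow.models import Variable

# Reads the value defined under Admin > Variables in the UI.
# "my_param" is a hypothetical name; default_var is used if it is undefined.
my_param = Variable.get("my_param", default_var="fallback-value")

In templated fields the same value is also available as {{ var.value.my_param }}.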

Bryan
  • In general, I want to run one script in parallel with different parameters. I can't do that with global variables. As I understand it, the macros package contains constants and some functions like date and uuid, but I want to pass an arbitrary string. So, all in all, I see this solution: create n scripts and n global variables. In this case, it will be possible to run n jobs in parallel. Anyway, thanks for the answer. – Alexander Ershov Nov 20 '17 at 19:14
  • **@Bryan**, **@AlexanderErshov** I'm sufficiently sold on `template macro`s: **[1]** [offloading processing from scheduler to executors](https://docs.astronomer.io/v2/apache_airflow/best-practices-guide.html#macros) **[2]** custom arguments (are there more [3], [4] .. ?). But even after a fine bit of research, I'm not clear on how *macro*s can produce the effect of **passing params to `DAG`s / `Operator`s from `Airflow`'s `WebUI`**. Any pointers on this? – y2k-shubham Aug 09 '18 at 08:04

Two ways to change your DAG behavior:

  1. Use Airflow Variables, as mentioned by Bryan in his answer.
  2. Use the DAG run conf to pass JSON data to a single DAG run. The JSON can be passed either from the UI (a manual trigger from the tree view, or a new record created under Browse > DAG Runs > Create) or from the CLI:

airflow trigger_dag 'MY_DAG' -r 'test-run-1' --conf '{"exec_date":"2021-09-14"}'

(In Airflow 2.x the equivalent command is airflow dags trigger 'MY_DAG' --conf '{"exec_date":"2021-09-14"}'.)

Within the DAG, this JSON can be accessed through Jinja templates or via the context parameter of the operator's callable:

# Airflow 1.x imports (in Airflow 2.x use airflow.operators.python / airflow.operators.bash)
from airflow.operators.python_operator import PythonOperator
from airflow.operators.bash_operator import BashOperator


def do_some_task(**context):
    # dag_run.conf holds the JSON passed when the run was triggered
    print(context['dag_run'].conf['exec_date'])


# assumes a `dag` object defined elsewhere in the file
task1 = PythonOperator(
    task_id='task1_id',
    provide_context=True,  # needed in Airflow 1.x only; the context is implicit in 2.x
    python_callable=do_some_task,
    dag=dag,
)

# access in templated fields
task2 = BashOperator(
    task_id="task2_id",
    bash_command="echo {{ dag_run.conf['exec_date'] }}",
    dag=dag,
)

Note that the JSON conf will not be present during scheduled runs. The best use case for JSON conf is overriding the default DAG behavior, so set meaningful defaults in the DAG code and let scheduled runs work without a conf.
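For example, a minimal sketch of falling back to the run's logical date when no conf is supplied (reusing the do_some_task callable above; the fallback choice is illustrative):

def do_some_task(**context):
    # conf is None or empty for scheduled runs, so fall back to the logical date
    conf = context['dag_run'].conf or {}
    exec_date = conf.get('exec_date', context['ds'])
    print(exec_date)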

ns15

You can improve the usability of ns15's answer by building a user interface within the Airflow webserver. Airflow's interface can be extended with plugins, for instance custom web views. Plugins are saved in the Airflow plugins folder, normally $AIRFLOW_HOME/plugins.
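As a rough sketch, a plugin registering a custom web view could look like the following (Airflow 2.x plugin API; the view name and the trigger_form.html template are illustrative, not part of Airflow):

from airflow.plugins_manager import AirflowPlugin
from flask_appbuilder import BaseView, expose


class TriggerFormView(BaseView):
    default_view = "form"

    @expose("/")
    def form(self):
        # Render a custom HTML form; trigger_form.html is a hypothetical
        # template shipped alongside the plugin.
        return self.render_template("trigger_form.html")


class TriggerFormPlugin(AirflowPlugin):
    name = "trigger_form_plugin"
    appbuilder_views = [
        {"name": "Trigger Form", "category": "Custom", "view": TriggerFormView()}
    ]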

A full example is given here, where this UI screenshot has been found:

[screenshot: manual DAG trigger with a custom parameter]

mayeulk

A form builder will be built into Airflow 2.6.0, without the need for a plugin, thanks to AIP-50 (Airflow Improvement Proposal 50).

Sample views:

[screenshot: Yes/No switch]

[screenshot: date picker]

[screenshot: select of the recent configs]
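With AIP-50, the trigger form is generated from the DAG's params. A minimal sketch, assuming Airflow >= 2.6 (the dag_id and param names are illustrative):

import pendulum

from airflow import DAG
from airflow.models.param import Param
from airflow.operators.empty import EmptyOperator

with DAG(
    dag_id="params_form_example",
    start_date=pendulum.datetime(2023, 1, 1),
    schedule=None,  # manual triggers only
    params={
        "enabled": Param(True, type="boolean"),  # rendered as a Yes/No switch
        "run_date": Param("2023-01-01", type="string", format="date"),  # date picker
    },
) as dag:
    EmptyOperator(task_id="noop")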

mayeulk