
We have 100 DAGs whose IDs match the prefix "dag_EDW_HC_*". We use the command below to pause a DAG:

Command: airflow pause dag_id

Is there any way we can pause all 100 "dag_EDW_HC_*" DAGs in a single go (programmatically in Python, or any other way)?

Ravi
  • I also referred to a few links like https://stackoverflow.com/questions/44360354/airflow-unpause-dag-programmatically, but I haven't found any reference for pausing multiple DAGs. – Ravi Aug 06 '20 at 14:09

3 Answers


The absolute easiest (and likely fastest) way I can think of is to update the database:

UPDATE  dag
   SET  is_paused = true
 WHERE  dag_id LIKE 'dag_EDW_HC%';
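
If you would rather not run SQL against the metadata database by hand, the same update can be expressed through Airflow's own ORM. This is a minimal sketch, assuming an Airflow 2.x installation (DagModel and create_session are available) and the same dag_EDW_HC prefix:

    # Sketch: pause every DAG whose dag_id matches the prefix by updating
    # the DagModel rows through Airflow's own session (assumes Airflow 2.x).
    from airflow.models import DagModel
    from airflow.utils.session import create_session

    with create_session() as session:
        session.query(DagModel) \
            .filter(DagModel.dag_id.like('dag_EDW_HC%')) \
            .update({DagModel.is_paused: True}, synchronize_session=False)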
joebeeson
  • Thanks a lot for the reply. So it means that in the Airflow database, inside the dag table, we can set that flag ourselves? – Ravi Aug 06 '20 at 16:17
  • What happens if DAGs are currently running and we carry out this update? – hopeIsTheonlyWeapon Aug 06 '20 at 17:58
  • The same thing that would happen if you did it through any other means: new tasks/runs won't be scheduled. – joebeeson Aug 06 '20 at 18:08
  • Since Feb 2022, Google provides a composer_dags.py script to do this. Check my answer below (https://stackoverflow.com/a/74026479/391034) on how to use it – mkumar118 Oct 11 '22 at 10:17

If you want to do this regularly, you can create a DAG specifically for this purpose with a PythonOperator and pass parameters when triggering it. From a running task instance (in the python_callable passed to a PythonOperator, or in the execute method of a custom operator) you have access to the DagBag object, which contains the dag_ids of all DAGs loaded into the Airflow environment. You can use those to fetch the corresponding DagModel objects, loop through them, and pause each DAG:

from airflow.models import DagBag, DagModel

def python_callable():
    # Load all DAGs from the DAG files (not from the metadata database)
    dag_bag = DagBag(read_dags_from_db=False)
    for dag_id_ in dag_bag.dag_ids:
        # Fetch the DagModel row for this DAG and mark it as paused
        dag_model = DagModel.get_dagmodel(dag_id_)
        dag_model.set_is_paused(True)

The code above is for Airflow 2.0.1 and may differ for other versions. Check the documentation for your version of Airflow if this callable does not work for you.
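
For completeness, here is a minimal sketch of a maintenance DAG that wraps such a callable in a PythonOperator and pauses only the DAGs matching the prefix. The dag_id, schedule, and prefix filter are illustrative assumptions, not part of the original answer:

    # Sketch (Airflow 2.x style): a manually triggered DAG that pauses all
    # DAGs whose IDs start with a given prefix. Names below are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.models import DagBag, DagModel
    from airflow.operators.python import PythonOperator


    def pause_matching_dags(prefix="dag_EDW_HC_"):
        # Collect all known dag_ids and pause only those starting with the prefix
        dag_bag = DagBag(read_dags_from_db=False)
        for dag_id_ in dag_bag.dag_ids:
            if dag_id_.startswith(prefix):
                DagModel.get_dagmodel(dag_id_).set_is_paused(True)


    with DAG(
        dag_id="pause_edw_hc_dags",   # hypothetical name
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,       # trigger manually when needed
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="pause_dags",
            python_callable=pause_matching_dags,
        )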

alaptiko

Since February 2022, Google provides a composer_dags.py script that can be used to pause all DAGs in a given Cloud Composer environment with the command below:

python3 composer_dags.py --environment COMPOSER_1_ENV \
  --project PROJECT_ID \
  --location COMPOSER_1_LOCATION \
  --operation pause

source: https://cloud.google.com/composer/docs/migrate-composer-2-snapshots-af-1#step_pause_dags_in_your_environment

mkumar118