
Recently, we have been getting some errors in Airflow where certain DAGs will not run any tasks but are marked as complete. We had set the start_date using days_ago from Airflow:

from airflow.utils.dates import days_ago


Shivangi Singh

2 Answers


From: https://forum.astronomer.io/t/dag-run-marked-as-success-but-no-tasks-even-started/1423

If you see dag runs that are marked as success but don’t have any task runs, this means the dag runs’ execution_date was earlier than the dag’s start_date.

This is most commonly seen when the start_date is set to some dynamic value, e.g. airflow.utils.dates.days_ago(0). This creates the opportunity for the execution date of a delayed dag execution to be before what the dag now thinks is its start_date. This can even happen in a cyclic pattern, where a few dagruns will work, and then at the beginning of every day a dagrun will experience this problem.
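To make the timing concrete, here is a small sketch of that failure mode using plain datetime, with a stub standing in for airflow.utils.dates.days_ago (which returns midnight UTC, n days back); the specific dates are illustrative, not from the question:

```python
from datetime import datetime, timedelta

def days_ago_stub(n, now):
    # Mimics airflow.utils.dates.days_ago: n days back, truncated to midnight.
    return (now - timedelta(days=n)).replace(
        hour=0, minute=0, second=0, microsecond=0
    )

# A daily run with a logical date of 23:00 yesterday that the scheduler
# only picks up after midnight: by then days_ago(0) has rolled forward
# to *today's* midnight, so execution_date < start_date and the run is
# marked success without scheduling any tasks.
now = datetime(2022, 9, 15, 0, 5)           # scheduler wakes just after midnight
execution_date = datetime(2022, 9, 14, 23)  # the delayed run's logical date
start_date = days_ago_stub(0, now)          # 2022-09-15 00:00

print(execution_date < start_date)  # True -> run skipped, no tasks
```

Each new day the dynamic start_date jumps forward again, which is why the problem can recur in the cyclic pattern described above.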

The simplest way to avoid this problem is to never use a dynamic start_date. It is always better to specify a static start_date. If you are concerned about accidentally triggering multiple runs of the same dag, just set catchup=False.
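Following that advice, a minimal DAG sketch with a static start_date and catchup=False might look like the following (the dag_id, schedule, and task are placeholders, and EmptyOperator assumes Airflow 2.3+; on older versions DummyOperator plays the same role):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator

# A static start_date never moves, so a run's execution_date can never
# drift before it. catchup=False stops Airflow from backfilling every
# interval between the static start_date and "now" on first deploy.
with DAG(
    dag_id="my_dag",                       # placeholder name
    start_date=datetime(2022, 9, 1),       # static, not days_ago(...)
    schedule_interval="@daily",
    catchup=False,
) as dag:
    EmptyOperator(task_id="start")
```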

Shivangi Singh
  • I have also seen this occur when `start_date` was set in both the `default_args` dict and the `DAG()` object but to different values. DAG runs were marked success between `earlier_start` and `later_start` without running, but ran normally from `later_start` onward – Connor Dibble Sep 14 '22 at 22:41

There is an open ticket in Airflow project with this issue: https://github.com/apache/airflow/issues/17977