from functools import partial
from airflow.operators.python_operator import PythonOperator

create_volume = PythonOperator(
    task_id='dummy',
    python_callable=create_temp_volume,
    provide_context=True,
    retries=3,
    dag=dag,
    on_success_callback=partial(update_status_test, DAG_ID),
    on_failure_callback=partial(update_status_test, DAG_ID),
    on_retry_callback=partial(update_status_test, DAG_ID),
)

The above is the task, which should call the "update_status_test" function on success, failure, or up_for_retry. The callbacks are being invoked properly.

The function used in the callbacks is shown below.

def update_status_test(parent_dag_id, context):
    ti = context['task_instance']
    ti.refresh_from_db()
    print(parent_dag_id)
    print('start_date', ti.start_date)
    print('duration', ti.duration)
    # Note: current_state is a method; printing it without calling it shows a
    # bound-method repr (use ti.current_state() to get the state string).
    print('current_state', ti.current_state)
    print('end_date', ti.end_date)
    print('task_id', ti.task_id)
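For reference, Airflow invokes each callback with a single context argument, so `functools.partial` is what pre-binds `parent_dag_id` in the task definition above. A minimal, Airflow-free sketch of that calling convention (the dag id and `'ti_stub'` placeholder are mine, for illustration only):

```python
from functools import partial

def update_status_test(parent_dag_id, context):
    # Same signature as the real callback; return instead of print
    # so the behavior is easy to check.
    return (parent_dag_id, context['task_instance'])

# partial pre-binds the first argument, leaving a one-argument callable,
# which matches how Airflow fires callbacks: callback(context)
cb = partial(update_status_test, 'my_dag_id')
result = cb({'task_instance': 'ti_stub'})  # → ('my_dag_id', 'ti_stub')
```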

INFO - start_date 2022-08-06 04:22:13.843927+00:00
INFO - e2e_workflow_project21_1
INFO - duration None
INFO - current_state <bound method [running]>>
INFO - end_date None
INFO - task_id create_temp_volume
[2022-08-06 06:22:16,501] {taskinstance.py:1070} INFO - Marking task as UP_FOR_RETRY. dag_id=e2e_workflow_project21_1, ...

The expected pattern is that once the task is marked success, failed, or up_for_retry, the callable is executed. But sometimes it executes before the task status change is persisted, which gives me None for the task's end_date and duration while the current state is still "running".

Sometimes it works as expected, giving the proper end date, duration, and final state (success, failed, or up_for_retry), but not in every case.
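One workaround for this race (my sketch, not from the question) is to poll `refresh_from_db()` inside the callback until the scheduler has persisted an `end_date`, instead of refreshing once. The helper name `wait_for_terminal_state` and its timeout values are hypothetical; the `now`/`sleep` parameters exist only to make the helper testable without Airflow:

```python
import time

def wait_for_terminal_state(ti, timeout=30.0, interval=0.5,
                            now=time.time, sleep=time.sleep):
    """Poll ti.refresh_from_db() until end_date is persisted or we time out.

    `now` and `sleep` are injectable for testing; with a real Airflow
    TaskInstance the defaults are fine.
    """
    deadline = now() + timeout
    while True:
        ti.refresh_from_db()
        if ti.end_date is not None:
            return True
        if now() >= deadline:
            return False
        sleep(interval)

def update_status_test(parent_dag_id, context):
    ti = context['task_instance']
    if wait_for_terminal_state(ti):
        print(parent_dag_id, ti.state, ti.start_date, ti.end_date, ti.duration)
    else:
        print(parent_dag_id, 'state not yet persisted for', ti.task_id)
```

This does not remove the underlying race (the callback can still fire before the scheduler writes the final state), but it bounds how long the callback waits for the database to catch up.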

I am using the callbacks at the task level; should I use them at the DAG level instead? I am on Airflow 1.10.11 and Python 2.9.
