
I am trying to run a DAG with an MSSQL (version 19) backend and Airflow version 2.5.0.

Below is the DAG:

```python
import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator  # non-deprecated import path in Airflow 2.x
from datetime import timedelta

# Define the DAG with a timezone-aware start date
dag = DAG(
    "my_tz_dag",
    start_date=pendulum.datetime(2016, 1, 1, tz="Europe/Amsterdam"),
    schedule_interval=timedelta(minutes=15),
)

# Define the task function
def print_hello():
    print("Hello, world!")

# Define the task using the PythonOperator
task = PythonOperator(task_id="print_hello", python_callable=print_hello, dag=dag)
```
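For context on the timezone handling: Airflow stores dates internally in UTC, so the aware `start_date` above gets converted. A minimal stdlib sketch of the same conversion (using `zoneinfo` instead of `pendulum`, just for illustration):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib in Python 3.9+

# 2016-01-01 00:00 in Amsterdam (CET, UTC+1 in winter) corresponds
# to 2015-12-31 23:00 UTC, which is what the scheduler works with.
start = datetime(2016, 1, 1, tzinfo=ZoneInfo("Europe/Amsterdam"))
start_utc = start.astimezone(ZoneInfo("UTC"))
print(start_utc)  # → 2015-12-31 23:00:00+00:00
```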

I am getting this in the logs and the scheduler goes down:

```
[Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Transaction (Process ID 194) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction. (1205) (SQLExecDirectW)') [SQL: UPDATE dag_run SET last_scheduling_decision=?, updated_at=? WHERE dag_run.id = ?] [parameters: ((datetime.datetime(2023, 4, 3, 18, 43, 12, 737104, tzinfo=Timezone('UTC')),
```

The scheduler crashes with this error every time.
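From what I have read, one suggested mitigation for scheduler deadlocks on database backends without solid `SELECT ... FOR UPDATE SKIP LOCKED` support is disabling row-level locking. I have not confirmed this fixes the MSSQL case, and it assumes only a single scheduler instance is running:

```ini
# airflow.cfg — assumption: exactly ONE scheduler instance,
# since disabling row-level locking is unsafe with multiple schedulers
[scheduler]
use_row_level_locking = False
```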

I also tried without specifying a timezone, as below, but this DAG does not start by itself. A manual trigger works fine, but scheduling does not:

```python
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash import BashOperator  # non-deprecated import path in Airflow 2.x

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2023, 3, 1),
    'retries': 1,
    'retry_delay': timedelta(minutes=5)
}

dag = DAG(
    'log_cleaning',
    default_args=default_args,
    schedule_interval='*/5 * * * *',
    catchup=False
)

# Delete log files older than 7 days
log_cleaning = BashOperator(
    task_id='log_cleaning',
    bash_command='find /usr/local/airflow/logs/* -mtime +7 -exec rm {} \;',
    dag=dag
)
```
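For the second DAG, my understanding is that the scheduler only creates a run once a full schedule interval after `start_date` has elapsed: a run for interval `[T, T + interval)` is triggered at `T + interval`, not at `T`. A rough pure-Python sketch of that rule (`last_closed_interval_start` is my own hypothetical helper, not Airflow code):

```python
from datetime import datetime, timedelta

def last_closed_interval_start(start_date, interval, now):
    """Hypothetical sketch (not Airflow's actual implementation):
    a DAG run for interval [T, T + interval) is only created once
    the interval has fully elapsed, i.e. when now >= T + interval."""
    completed = int((now - start_date) / interval)  # fully elapsed intervals
    if completed < 1:
        return None  # no interval has closed yet, so no run exists
    return start_date + (completed - 1) * interval

# With start_date 2023-03-01 and a 5-minute interval, at 00:12 the
# latest closed interval is [00:05, 00:10), so its start is 00:05.
print(last_closed_interval_start(
    datetime(2023, 3, 1), timedelta(minutes=5), datetime(2023, 3, 1, 0, 12)))
# → 2023-03-01 00:05:00
```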
