I have scheduled my Airflow DAGs to run, and each DAG contains a single task. When a DAG run starts, the task inside it never gets executed.
Here's my code (I am trying to SSH into an EC2 server and run a bash command):
from datetime import timedelta, datetime

from airflow import DAG
from airflow.contrib.operators.ssh_operator import SSHOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email': ['removed@example.com'],
    'email_on_failure': True,
    'email_on_retry': True,
    'start_date': datetime.now() - timedelta(days=1),
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(
    dag_id='back_fill_reactivated_photo_dimension',
    default_args=default_args,
    schedule_interval='55 * * * *',
    dagrun_timeout=timedelta(seconds=120),
)

t1_bash = """
/usr/local/bin/dp/database_jobs/run_py.sh "backfill_photo_dim_reactivated.py"
"""

t1 = SSHOperator(
    ssh_conn_id='ssh_aws_ec2',
    task_id='backfill_photo_dim',
    command=t1_bash,
    dag=dag,
)
The Airflow UI shows the DAG run in the "running" state, but the task inside it never actually runs. Am I missing something in my code?
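In case it helps with debugging: I believe a single task can be exercised outside the scheduler with the airflow test CLI command. A minimal sketch, assuming the Airflow 1.x CLI (the execution date below is just a placeholder):

# Run the task directly, bypassing the scheduler; 'airflow test' executes
# the operator without recording any state in the metadata DB, which should
# show whether the SSHOperator itself works.
airflow test back_fill_reactivated_photo_dimension backfill_photo_dim 2017-06-01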
Also, is there a way to force-run a DAG regardless of its cron schedule?
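The closest thing I've found is the trigger_dag CLI command; a sketch of what I mean, assuming the Airflow 1.x CLI (I understand this became "airflow dags trigger" in 2.x). Would this kick off a run even outside the schedule?

# Kick off a DAG run immediately, independent of the cron schedule
# (the scheduler still needs to be running and the DAG unpaused).
airflow trigger_dag back_fill_reactivated_photo_dimension

# Optionally pin the execution date (placeholder date):
airflow trigger_dag back_fill_reactivated_photo_dimension -e 2017-06-01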