I have created the DAG with the following configuration:
job_type = 'daily'
SOURCE_PATH = '/home/ubuntu/daily_data'

def get_to_know_details(job_type, SOURCE_PATH):
    print("************************", job_type, SOURCE_PATH)

with DAG(
    dag_id="transformer_daily_v1",
    is_paused_upon_creation=False,
    default_args=default_args,
    description="transformer to insert data",
    start_date=datetime(2022, 9, 20),
    schedule_interval='31 12 * * *',
    catchup=False
) as dag:
    task1 = PythonOperator(
        task_id="dag_task_1",
        python_callable=get_to_know_details(job_type, SOURCE_PATH),
    )
Each time I start Airflow with the command

airflow standalone

the function is executed automatically, without the DAG ever being triggered, as seen in the logs:
standalone | Starting Airflow Standalone
standalone | Checking database is initialized
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
WARNI [airflow.models.crypto] empty cryptography key - values will not be stored encrypted.
************************ daily /home/ubuntu/daily_data
WARNI [unusual_prefix_8fc9338bb4cf0c5518fed57dffa1a11abec44c36_example_kubernetes_executor] The example_kubernetes_executor example DAG requires the kubernetes provider. Please install it with: pip install apache-airflow[cncf.kubernetes]
Airflow version: 2.2.5
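My understanding so far, sketched in plain Python below (no Airflow involved; the calls list is my own illustration): writing python_callable=get_to_know_details(job_type, SOURCE_PATH) calls the function the moment the DAG file is parsed, whereas passing the function object itself, with its arguments supplied separately, would defer the call until something actually invokes it.

```python
calls = []

def get_to_know_details(job_type, source_path):
    # Stand-in for the real task function; records each invocation.
    calls.append((job_type, source_path))

# Calling the function in the assignment runs it immediately -- this mirrors
# what happens at DAG parse time when `airflow standalone` imports the file.
value = get_to_know_details('daily', '/home/ubuntu/daily_data')
print(len(calls))  # already 1, before any "task" has run

# Passing a reference (what python_callable expects) with the arguments kept
# separately (as op_args would be) defers execution entirely.
callable_ref = get_to_know_details
op_args = ['daily', '/home/ubuntu/daily_data']
print(len(calls))  # still 1 -- nothing new has executed

callable_ref(*op_args)  # only now does the "task" body run
print(len(calls))  # 2
```

Note also that value above is None, since the function only prints, so the original python_callable=... ends up receiving None rather than a callable.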