Hello, I am trying to use the experimental Airflow REST API. Below is my DAG definition:
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
from datetime import datetime, timedelta

seven_days_ago = datetime.combine(datetime.today() - timedelta(7),
                                  datetime.min.time())

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': seven_days_ago,
    'email': ['airflow@airflow.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG('python_test', default_args=default_args)

t1 = BashOperator(
    task_id='testairflow',
    # Note: no arguments are forwarded to the script here yet
    bash_command='python ${AIRFLOW_HOME}/dags/python_test.py',
    dag=dag)
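From what I can tell, bash_command is a templated field, so my guess is that I would have to pull values out of dag_run.conf with Jinja and forward them as CLI arguments myself, roughly like the variant of t1 sketched below (the conf keys "env" and "config_path" are just my assumption about how the payload would need to be structured), but I have not confirmed this is the right approach:

# Sketch only: forward conf values to the script via Jinja templating on bash_command.
# Assumes conf is posted as {"env": "...", "config_path": "..."} rather than my current payload.
t1 = BashOperator(
    task_id='testairflow',
    bash_command=(
        'python ${AIRFLOW_HOME}/dags/python_test.py '
        '--env {{ dag_run.conf["env"] }} '
        '--config_path {{ dag_run.conf["config_path"] }}'
    ),
    dag=dag)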
And below is the script it runs, which simply reads the passed arguments and prints them:
import argparse

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Argument... ')
    parser.add_argument('--env', metavar='path', required=True, default='dev', help='Execution Environment')
    parser.add_argument('--config_path', required=True, help='Custom config s3 path..')
    parser.add_argument('--data_lake_bucket', required=False, default="s3://dl_bucket")
    args = parser.parse_args()

    print("--------------------------------------")
    print(args.env)
    print(args.config_path)
    print(args.data_lake_bucket)
    print("--------------------------------------")
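Running the script by hand with explicit arguments works as expected, for example:

python ${AIRFLOW_HOME}/dags/python_test.py --env dev --config_path this_is_conf
--------------------------------------
dev
this_is_conf
s3://dl_bucket
--------------------------------------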
And below is my curl command to trigger it.
curl -X POST \
http://localhost:8080/api/experimental/dags/python_test/dag_runs \
-H 'Cache-Control: no-cache' \
-H 'Content-Type: application/json' \
-d '{"conf":"{\"--env\":\"dev\", \"--config_path\":\"this_is_conf\"}"}'
How do I read the parameters passed with -d in the Python script (or in any other task)? Currently I see the job getting triggered, but the parameters never reach the script.
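For what it's worth, I also wondered whether a PythonOperator reading the conf from the task context would be a cleaner way to get at the -d payload. This is only a rough sketch of what I had in mind (the task_id and callable name are made up by me), not something I have working:

from airflow.operators.python_operator import PythonOperator

def read_conf(**kwargs):
    # With provide_context=True (Airflow 1.x), the triggering DagRun is available
    # in the context, and its .conf should hold whatever was posted to the API.
    conf = kwargs['dag_run'].conf or {}
    print(conf.get('--env'))
    print(conf.get('--config_path'))

t2 = PythonOperator(
    task_id='read_conf_test',
    python_callable=read_conf,
    provide_context=True,
    dag=dag)

Would something like this work, or is templating the bash_command the intended way?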