I have my DAG's folder in the airflow/dags directory. When I run airflow dags list
while logged into the webserver, the DAG's ID shows up in the list. However, running airflow dags list
while logged into the scheduler returns the following error:
Killed
command terminated with exit code 137
The DAG also does not show up in the DAG list on the webserver UI. When I manually enter the dag_id in the URL, the DAG renders with every task in the right place, but triggering a manual run via the Trigger DAG
button results in a pop-up stating Cannot find dag <dag_id>.
Has anyone run into this issue before? Is this a memory problem? Exit code 137 looks like the process being killed, possibly by the OOM killer.
My DAG code is written in Python, and the resulting DAG object has a large number of tasks (>80). I am running Airflow 1.10.15 with the Kubernetes executor.
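
For reference, here is a minimal, stripped-down sketch of how the DAG is generated. The dag_id, task names, and exact fan-out below are placeholders rather than my real code; the real file builds its ~80 tasks in a loop in the same way:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.dummy_operator import DummyOperator

    # Placeholder sketch of the real DAG: a single start task fanning out
    # to roughly 80 generated downstream tasks.
    with DAG(
        dag_id="my_large_dag",            # placeholder dag_id
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        start = DummyOperator(task_id="start")
        for i in range(80):
            task = DummyOperator(task_id="task_{}".format(i))
            start >> task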