I have been using Netflix Conductor for workflow orchestration, and Apache Airflow is new to me. In Conductor, workflow execution works in these steps:
- A workflow is started via a REST API call
- Each worker (a separate service) polls for its own tasks by repeatedly calling Conductor's REST API
- After completing or failing a task, the worker calls the REST API again to update the task status, which advances the workflow
Each of these task workers is a separate service, and they are implemented in different programming languages. A rough sketch of that polling loop is below.
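This is roughly what I mean by the polling pattern (a minimal sketch in Python; the endpoint paths, payload fields, task type, and `do_work` function are illustrative, not exact Conductor API details):

```python
import time
import requests

CONDUCTOR_URL = "http://conductor-server:8080/api"  # assumed server address
TASK_TYPE = "encode_video"                          # hypothetical task type this worker handles
WORKER_ID = "encode-worker-1"


def do_work(input_data):
    # Hypothetical placeholder for the worker's actual business logic
    return {"processed": True}


while True:
    # Poll Conductor for a pending task of this worker's type
    resp = requests.get(
        f"{CONDUCTOR_URL}/tasks/poll/{TASK_TYPE}",
        params={"workerid": WORKER_ID},
    )
    if resp.status_code == 200 and resp.text:
        task = resp.json()
        try:
            result = do_work(task["inputData"])
            status = "COMPLETED"
        except Exception:
            result, status = {}, "FAILED"
        # Report the outcome back so Conductor can advance the workflow
        requests.post(
            f"{CONDUCTOR_URL}/tasks",
            json={
                "taskId": task["taskId"],
                "workflowInstanceId": task["workflowInstanceId"],
                "status": status,
                "outputData": result,
            },
        )
    time.sleep(1)
```

In Conductor the worker owns this loop entirely, so the worker can be written in any language that can make HTTP calls.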
I can't seem to find any examples of how to apply these concepts in Apache Airflow. Wrapping every task in a BashOperator, along the lines of the sketch below, seems like a very bad solution to me.
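For clarity, this is the kind of DAG I mean (a minimal sketch; the DAG id, task id, worker hostname, and curl target are made up, and the exact BashOperator import path depends on the Airflow version):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="external_workers_via_bash",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,  # triggered manually or via the REST API
    catchup=False,
) as dag:
    # Shell out to a worker written in another language via its HTTP API
    run_java_worker = BashOperator(
        task_id="run_java_worker",
        bash_command="curl -sf http://java-worker:8080/run-task",
    )
```

Here Airflow pushes work to the worker instead of the worker polling for it, and every call is just a shell command, which is why this feels wrong to me.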
Are there any examples that show how workers, some of which are not written in Python, can listen for and execute their tasks as defined in DAGs?