
I have been using Netflix Conductor for workflow orchestration before, and Apache Airflow is new to me. In Conductor, workflow execution works in these steps:

  1. Workflow starts via REST API call
  2. Each separate worker (service) polls its own tasks by constantly calling REST API methods of Conductor
  3. After completing or failing these tasks, each worker calls REST API to change status of workflow

Each of these task workers is a separate service, and they are implemented in different programming languages.
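
For illustration, each Conductor worker is essentially a polling loop like the minimal sketch below (the host, task type, and business logic are placeholders, and the endpoint paths are only meant to illustrate the poll/update cycle, not a specific deployment):

```python
import time
import requests

CONDUCTOR_API = "http://conductor-server:8080/api"  # placeholder host
TASK_TYPE = "encode_video"                          # placeholder task type


def do_work(input_data):
    """The worker's actual business logic (stub for illustration)."""
    return {"result": "ok"}


def run_worker():
    while True:
        # 2. Poll Conductor for a pending task of this worker's type
        resp = requests.get(f"{CONDUCTOR_API}/tasks/poll/{TASK_TYPE}")
        if resp.status_code != 200 or not resp.text:
            time.sleep(1)
            continue
        task = resp.json()

        try:
            output, status = do_work(task["inputData"]), "COMPLETED"
        except Exception:
            output, status = {}, "FAILED"

        # 3. Report the result back so Conductor can advance the workflow
        requests.post(f"{CONDUCTOR_API}/tasks", json={
            "taskId": task["taskId"],
            "workflowInstanceId": task["workflowInstanceId"],
            "status": status,
            "outputData": output,
        })


if __name__ == "__main__":
    run_worker()
```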

I can't seem to find any examples of how to apply these concepts in Apache Airflow. Constantly using BashOperator seems like a very bad solution to me.
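
The only pattern I can picture so far is wrapping every call to an external service in a shell command, roughly like this minimal sketch (the curl targets stand in for my non-Python services; import path is for Airflow 1.10.x):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # Airflow 1.10.x import path

# Every step just shells out to an external service -- this is the pattern
# that feels like a workaround. The curl targets are placeholders.
with DAG(
    dag_id="conductor_style_workflow",
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,  # triggered externally, like a Conductor workflow
) as dag:

    encode_video = BashOperator(
        task_id="encode_video",
        bash_command="curl -X POST http://encoder-service:9000/encode",
    )

    publish_result = BashOperator(
        task_id="publish_result",
        bash_command="curl -X POST http://publisher-service:9001/publish",
    )

    encode_video >> publish_result
```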

Are there any examples that show how to use workers, some of which are not written in Python, to listen for and execute their tasks as defined in DAGs?

  • Depending on your environment and deployment, Airflow supports a lot of different operators. E.g. there are also KubernetesPodOperators, allowing arbitrary docker containers to be executed. But it depends on how your airflow environment is set up. – Blokje5 Jan 27 '20 at 11:01
  • @Blokje5 I just want all my services to listen for their specific task ids and execute them when their turn in the workflow arrives – Mr.D Jan 27 '20 at 12:16
  • Airflow has the worker concept, but on another level. Instead of your service polling the airflow API, you would have an Airflow worker executing one of the operators. The airflow worker would poll the API for available tasks to execute. (see for example the celery executor: https://airflow.readthedocs.io/en/stable/executor/celery.html). – Blokje5 Jan 27 '20 at 14:36
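
For reference, a minimal sketch of the KubernetesPodOperator approach mentioned in the comments, assuming a Kubernetes-backed Airflow 1.10.x deployment (the image, namespace, and ids below are placeholders):

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator  # Airflow 1.10.x

# Each non-Python worker is packaged as a docker image; Airflow only
# schedules the container, so the task code can be written in any language.
with DAG(
    dag_id="polyglot_workflow",
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,
) as dag:

    encode_video = KubernetesPodOperator(
        task_id="encode_video",
        name="encode-video",
        namespace="default",                     # placeholder namespace
        image="my-registry/encoder-worker:1.0",  # placeholder image containing the non-Python worker
        get_logs=True,
    )
```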
