We would like to use Apache Airflow to orchestrate work across global data centers (regions). From what I can tell, the only way to make this work is to give every task access/permission to write directly to some cloud-exposed database. Does anyone know of a better way to implement this? I would prefer a way for tasks to communicate back to the central Airflow database asynchronously through a message queue, but I've seen no mention of that. Any suggestions?
1 Answer
To my knowledge, a central metadata database for Airflow's internals is required; there is currently no way around it. All components (scheduler, webserver, workers) need to reach that database.
See the docs on setting up a backend: https://airflow.apache.org/configuration.html?highlight=initdb#setting-up-a-backend
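As a rough illustration, here is what that looks like in airflow.cfg; this is a minimal sketch assuming a central Postgres metadata database and the CeleryExecutor, and the hostnames and credentials are placeholders you would replace with your own (exact key names can vary slightly between Airflow versions):

    [core]
    # Every Airflow component in every region (scheduler, webserver, workers)
    # must be able to reach this one central metadata database.
    sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@central-db.example.com:5432/airflow
    executor = CeleryExecutor

    [celery]
    # With CeleryExecutor, tasks are dispatched to remote workers through a
    # message broker, but task state is still recorded in the central
    # database configured above.
    broker_url = redis://central-broker.example.com:6379/0

So a message broker can carry the task dispatching, but it does not replace the shared metadata database: workers still write state back to it directly.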

Antoine Augusti