How should I configure or send task logs from Airflow 1.9 to Elasticsearch? I found the config templates in the current Git repository, but I am not sure whether that can be done in v1.9.
- https://github.com/apache/incubator-airflow/tree/master/airflow/config_templates – Amit Kumar May 17 '18 at 06:14
1 Answer
As far as I understand your question, you are asking whether it is possible to simply configure Airflow to send its logs directly to Elasticsearch.
Reference
This is, at least for v1.9.0, not a trivial configuration issue. Even though the configuration file suggests that you can write data to Elasticsearch, I could not find
1) a way to set up a proper Elasticsearch connection
2) any code in the Airflow repository that uses those settings to push logs to a web interface or to Elasticsearch
It seems to me this will be a new feature in a future release; for reference, see https://issues.apache.org/jira/browse/AIRFLOW-1454
Conclusion
So the current standard way to do this is to write the Airflow logs to a specific folder, commonly set like this in airflow.cfg:
base_log_folder = {AIRFLOW_HOME}/logs
Then use a typical setup with e.g. Filebeat to ship those logs, depending on your setup, either directly into Elasticsearch or into Logstash first.
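To make that concrete, here is a minimal sketch of what such a Filebeat configuration could look like. This is an illustrative fragment, not a verified Airflow-specific config: the log path, the multiline pattern, and the Elasticsearch host are all assumptions you would adapt to your environment and Filebeat version.

```yaml
# filebeat.yml -- illustrative sketch only; paths, pattern, and hosts are assumptions.
filebeat.prospectors:            # Filebeat 5.x/6.0 syntax; newer versions use filebeat.inputs
  - type: log
    paths:
      - /path/to/airflow/logs/**/*.log   # point this at your base_log_folder
    multiline.pattern: '^\['             # join stack traces etc. onto the preceding log line
    multiline.negate: true
    multiline.match: after

output.elasticsearch:            # or output.logstash, if you want Logstash in between
  hosts: ["localhost:9200"]
```

The multiline settings matter for Airflow task logs, since Python tracebacks span several lines and would otherwise arrive in Elasticsearch as unrelated events.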

- Sure, that is what I am planning to do: use a Fluentd forwarder to forward the logs into Kafka, and from there into Elasticsearch – Amit Kumar May 17 '18 at 19:40
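The Fluentd-to-Kafka route mentioned in the comment could be sketched roughly as below. This is a hypothetical fragment: the tag, paths, broker address, and topic are placeholders, and the exact plugin parameters should be checked against the fluent-plugin-kafka documentation for your Fluentd version.

```
# fluentd.conf -- illustrative sketch of tailing Airflow logs into Kafka.
# All paths, tags, brokers, and topics here are assumptions.
<source>
  @type tail
  path /path/to/airflow/logs/**/*.log
  pos_file /var/log/td-agent/airflow-logs.pos
  tag airflow.task
  format none                  # forward raw lines; parse later in the pipeline
</source>

<match airflow.**>
  @type kafka2                 # provided by fluent-plugin-kafka
  brokers localhost:9092
  default_topic airflow-logs
  <format>
    @type json
  </format>
</match>
```

A separate consumer (e.g. Logstash or Kafka Connect) would then move the messages from the Kafka topic into Elasticsearch, which decouples log shipping from indexing.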