I'm trying to ship all Airflow logs to Kafka by attaching a new handler to the root logger, but not all logs are being published. Do I need to configure anything else?
This is what I'm doing:
custom_log_config.py:

```python
from copy import deepcopy

from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

# Configure a new handler for publishing logs to Kafka
environment = get_app_env()  # our own helper
LOGGING_CONFIG["handlers"]["kafka_handler"] = {
    "class": "com.test.log_handler.KafkaHandler",
    "formatter": "airflow",
    "version": environment.version,
    "log_file": log_file,
    "filters": ["mask_secrets"],
}

# Attach the handler to Airflow's root logger
LOGGING_CONFIG["root"]["handlers"].append("kafka_handler")
```
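For completeness, the handler itself is conceptually just a `logging.Handler` subclass. This is a simplified sketch, not our real code: the producer wiring and topic name are placeholders, and `version` / `log_file` are simply the keyword arguments that `dictConfig` passes through from the handler entry above:

```python
import logging


class KafkaHandler(logging.Handler):
    """Sketch of a Kafka-publishing log handler.

    The real handler would wrap an actual Kafka producer; here the
    producer is a placeholder so only the shape is shown.
    """

    def __init__(self, version=None, log_file=None, producer=None):
        super().__init__()
        self.version = version    # passed in from the dictConfig entry
        self.log_file = log_file  # passed in from the dictConfig entry
        self.producer = producer  # placeholder for a real Kafka producer
        self.sent = []            # records published so far (sketch only)

    def emit(self, record):
        # Format the record, then hand it to the producer if one exists;
        # otherwise just collect it locally (placeholder behaviour).
        msg = self.format(record)
        if self.producer is not None:
            self.producer.produce("airflow-logs", msg.encode("utf-8"))
        else:
            self.sent.append(msg)


# Usage: any record reaching this handler would be published
handler = KafkaHandler(version="1.0")
logger = logging.getLogger("demo")
logger.setLevel(logging.INFO)
logger.addHandler(handler)
logger.info("hello")
```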
And finally I'm setting the Airflow config to use the logging config defined above:

```
airflow.logging__logging_config_class=com.test.log_handler.custom_log_config.LOGGING_CONFIG
```
While some logs do flow to Kafka, I'm missing the task run logs (e.g. from the loggers in taskinstance.py, standard_task_runner.py, and cli_action_loggers.py).
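My current suspicion, reproduced here with plain stdlib logging: if a child logger has `propagate=False` (which I believe Airflow's default config sets on the `"airflow.task"` logger that task runs log through), then handlers attached to the root logger never see its records. The logger names below are just stand-ins for the real setup:

```python
import logging

captured = []


class ListHandler(logging.Handler):
    """Stand-in for the kafka_handler attached to root."""

    def emit(self, record):
        captured.append(record.getMessage())


# Attach the stand-in handler to the root logger
logging.getLogger().addHandler(ListHandler())

# Stand-in for Airflow's task logger; propagate=False mirrors what
# I believe the default Airflow config does for "airflow.task"
task_logger = logging.getLogger("airflow.task")
task_logger.propagate = False
task_logger.warning("task log line")

# The root handler never received the record
print(captured)
```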