
We have a requirement to send Airflow metrics to Datadog. I tried to follow the steps described here: https://docs.datadoghq.com/integrations/airflow/?tab=host

Accordingly, I included StatsD in the Airflow installation and updated the Airflow configuration file (steps 1 and 2).
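For reference, steps 1 and 2 from the linked docs amount to installing the `apache-airflow[statsd]` extra and enabling StatsD in `airflow.cfg`, roughly like this (assuming Airflow 2.x; in 1.10.x the same keys live under `[scheduler]`):

```ini
[metrics]
# Emit Airflow metrics over StatsD
statsd_on = True
# Host/port the Datadog agent's DogStatsD listener is expected on
statsd_host = localhost
statsd_port = 8125
statsd_prefix = airflow
```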

After this point, I am not able to figure out how to send my metrics to Datadog. Should I follow the host configuration or the containerized configuration? For the host configuration we would have to update the datadog.yaml file, which is not in our repo, and the containerized instructions only cover Kubernetes, which we don't use.

We run Airflow by creating a Docker build and running it on Amazon ECS. We also have a Datadog agent running in parallel in the same task (not part of our repo). However, I am not able to figure out what configuration is needed to send the StatsD metrics to Datadog. Please let me know if anyone has an answer.
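For context, my understanding is that when the agent runs as a sidecar in the same ECS task (with awsvpc network mode, the containers share a network namespace, so `localhost` works between them), the wiring can be sketched with environment variables in the task definition. Container names and values here are illustrative, not our actual task definition; `DD_DOGSTATSD_NON_LOCAL_TRAFFIC` and the `AIRFLOW__METRICS__*` override pattern are from the Datadog and Airflow docs respectively:

```json
{
  "containerDefinitions": [
    {
      "name": "airflow",
      "environment": [
        { "name": "AIRFLOW__METRICS__STATSD_ON", "value": "True" },
        { "name": "AIRFLOW__METRICS__STATSD_HOST", "value": "localhost" },
        { "name": "AIRFLOW__METRICS__STATSD_PORT", "value": "8125" },
        { "name": "AIRFLOW__METRICS__STATSD_PREFIX", "value": "airflow" }
      ]
    },
    {
      "name": "datadog-agent",
      "environment": [
        { "name": "DD_API_KEY", "value": "<redacted>" },
        { "name": "DD_DOGSTATSD_PORT", "value": "8125" },
        { "name": "DD_DOGSTATSD_NON_LOCAL_TRAFFIC", "value": "true" }
      ]
    }
  ]
}
```

Is something along these lines the right approach, or is additional datadog.yaml configuration still required?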

Vibhav
  • I am having a similar issue. There seem to be no good, real working examples for the containerized configuration. The examples out there are for docker-compose or host environments, and any containerized example isn't for Airflow. – alltej Feb 24 '22 at 14:50
