
The name, similar to the 2FA security scheme, comes from a scenario in which I want to periodically make sure that certain ETL triggers are in place.

Not only do I want to monitor, with alarms, whether a certain procedure worked or failed (which I have already implemented), but I also want a second infrastructure checking that the triggers themselves are in place. That way, if I don't get an error, I can be sure everything worked.

Imagine you have an ETL whose project requirement is to run every month. You schedule it monthly and everything works for some time. But after a quarter, someone makes a config change and the ETL is no longer triggered. It obviously doesn't throw any error, because it never runs. I need a tool outside my environment that holds these requirements and enforces them.

Is there a tool, methodology, or technique like the one described above?

dinigo

1 Answer


I would suggest using Cloud Monitoring, which gives you visibility into the performance, uptime, and overall health of cloud-powered applications. If you are planning to use Dataflow, it includes a web-based monitoring interface that lets you see and interact with your Dataflow jobs, create alerts and dashboards, and determine the cause of failures in your pipeline.

  • Thanks Isaac. I (we) already use Cloud Monitoring. It's just that it has certain "time constraints" on how it can monitor. For example, I want to monitor whether there are NO entries for a week for a certain resource, but I can only monitor a single day; that is the widest the time window goes – dinigo Mar 09 '20 at 17:50
  • What about a [Metric-absence policy](https://cloud.google.com/monitoring/alerts/policies-in-json#json-metric-absence)? "A metric-absence policy is triggered when no data is written to a metric for the specified duration" – Isaac Miliani Mar 10 '20 at 13:03
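For reference, a metric-absence condition like the one the comment above suggests can be sketched in the JSON policy format from the linked docs. This is an illustrative fragment, not a tested policy: the metric name `logging.googleapis.com/user/etl_success` is a hypothetical log-based metric, and the `duration` and aggregation values are placeholders (Cloud Monitoring imposes limits on how long an absence duration may be, so check the current documentation).

```json
{
  "displayName": "ETL heartbeat absent",
  "combiner": "OR",
  "conditions": [
    {
      "displayName": "No ETL success entries written",
      "conditionAbsent": {
        "filter": "metric.type=\"logging.googleapis.com/user/etl_success\" resource.type=\"global\"",
        "duration": "86400s",
        "aggregations": [
          {
            "alignmentPeriod": "3600s",
            "perSeriesAligner": "ALIGN_COUNT"
          }
        ]
      }
    }
  ]
}
```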