
I want to design an alerting system that sends an alert to a prescribed channel whenever any GCP service exceeds a prescribed percentage of its quota limit, across all metrics. I have manually configured alerts for all Cloud Run and Cloud Functions metrics, but I am stuck on GCS, BigQuery, and Cloud Logging.

Edit:

Here is the specific case:

My Cloud Function needs to read logs using the Cloud Logging API. This function is generating a quota-limit-exceeded error for ReadRequestsPerMinutePerProject for Cloud Logging.

I also tried to find a matching metric under Resource type: Logging and, to be safe, under Resource type: Cloud Function. But there is no metric matching this description in either of those resource types.

  • Please provide more details in what you did: what worked and what didn't. If possible you can describe your setup which will help to troubleshoot this. – Wojtek_B May 21 '21 at 10:21
  • @Wojtek_B I am trying to make a monitoring system for GCP, to monitor resources utilised by services like Cloud Run or BigQuery. Say, for example, the maximum compute time for a Cloud Run instance is capped at 540s; here I want to configure an alert which will send me an email whenever a Run instance computes for more than 75% (i.e. 405s) of the capped limit. I want to configure an alert for every metric in every service, so that I can prevent any service instance from exceeding the quota limit imposed by Google. – KISHAN DHRANGADHARIYA May 22 '21 at 05:56

1 Answer


Having quota alerts for every GCP feature seems complicated, but we can try to split it into pieces and analyse them. I will focus on the services you mentioned in your question.

However, Cloud Storage imposes limits on so many parameters (even considering buckets alone) that having an alert for everything would be overkill.

It is the same for BigQuery; there are limits to consider such as the concurrent rate limit for interactive queries, cross-region federated querying, or the daily destination table update limit.

You have to define what you want to monitor and then design a proper solution:

  • find the relevant logs in Cloud Logging that hold information about quota usage
  • create a log-based metric from those logs
  • create an alert based on that metric
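As a rough sketch of the first two steps, a Logs Explorer filter along these lines can surface quota errors. The exact message text and payload field vary by service, so treat this filter as an assumption to verify against your project's actual log entries:

```
severity>=WARNING
protoPayload.status.message=~"Quota exceeded"
```

Once the filter matches the entries you care about, the same filter string can be passed to `gcloud logging metrics create` via the `--log-filter` flag to create the log-based metric, which the alert policy then references.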

You can read more about quotas in the documentation for each of these services.

A similar topic has been discussed here.

You can find the complete list of APIs (over 500) that have quotas/limits under IAM & Admin > Quotas.
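As a side note on the metric you could not find: per-API quota usage is also exposed through Cloud Monitoring's serviceruntime metrics, which live under the Consumer Quota resource type rather than under the individual service — which may be why nothing matched under Logging or Cloud Function. A Metrics Explorer filter along these lines may be worth trying (the service label value here is specific to Cloud Logging):

```
metric.type = "serviceruntime.googleapis.com/quota/rate/net_usage"
resource.type = "consumer_quota"
resource.label.service = "logging.googleapis.com"
```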

------- UPDATE -------

You can try to limit the number of checks the function performs, or ask for an increased quota for the Cloud Logging API.

Keep in mind that the checks performed by the function also count towards the quota.
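One way to limit the number of checks is to throttle the function's own API calls on the client side so they stay under the per-minute quota. This is a minimal sketch: the limit value and the `read_logs_page` helper are hypothetical — check your project's actual ReadRequestsPerMinutePerProject value in IAM & Admin > Quotas and substitute the real Cloud Logging API call.

```python
import time
from collections import deque

class MinuteRateLimiter:
    """Client-side throttle: allow at most max_per_minute calls
    in any rolling 60-second window, sleeping when over the cap."""

    def __init__(self, max_per_minute: int):
        self.max_per_minute = max_per_minute
        self.calls = deque()  # monotonic timestamps of recent calls

    def acquire(self) -> None:
        now = time.monotonic()
        # Drop timestamps that have left the 60-second window.
        while self.calls and now - self.calls[0] >= 60:
            self.calls.popleft()
        if len(self.calls) >= self.max_per_minute:
            # Sleep until the oldest call falls out of the window.
            time.sleep(60 - (now - self.calls[0]))
        self.calls.append(time.monotonic())

# Example limit; the real quota value must be taken from the console.
limiter = MinuteRateLimiter(max_per_minute=60)

def read_logs_page(page_token=None):
    """Hypothetical wrapper around the real Cloud Logging read call
    (e.g. entries.list); only the throttling is sketched here."""
    limiter.acquire()
    return []  # placeholder for the actual API response
```

Wrapping every read in `limiter.acquire()` keeps the function from bursting past the quota, at the cost of slower runs; batching reads into fewer, larger requests is the complementary option.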

  • @Wojtek_B I followed your instructions and configured alerts for most of the metrics, but I am facing a problem with a Cloud Function instance: my function needs to read logs using the Logging API, and that function is generating a quota limit error. I am attaching a link to an image – KISHAN DHRANGADHARIYA May 27 '21 at 07:03
  • Please update your question with the details regarding this cloud function instance (it's better than having it in comments). – Wojtek_B May 27 '21 at 07:05
  • @Wojtek_B I am sorry, I am new to this platform and still figuring out how to use it. I will edit the question; please ignore the comments above. – KISHAN DHRANGADHARIYA May 27 '21 at 07:08