
I have some scripts with functions running on triggers. Some functions are scheduled to run once per day and a few run every 10 minutes. I am using a Google Workspace account, which has a quota of 100,000 calls/day for UrlFetch. I am quite sure that I haven't exceeded this quota, so I am not sure why I keep getting this exception.

Exception: Service invoked too many times for one day: premium urlfetch.

Also, this exception occurs for about an hour every day, after which it resolves.

Please advise on the following:

  1. Root cause and resolution
  2. Difference between urlfetch and premium urlfetch (as the exception says premium)
ab.it.gcp

2 Answers


Apps Script quotas and limitations

To clarify the difference between "urlfetch" and "premium urlfetch": as shown in the Quotas for Google Services page for Apps Script, the quotas are split into two categories, one for consumer (gmail.com) accounts and another for Google Workspace accounts. The "premium" in your error refers to the edition under which your Apps Script projects run; the limit for consumer Gmail accounts is only 20,000 URL Fetch calls/day.

Identifying the root cause can take you down different paths. Most of the time the error is due to the service being called too many times, for example multiple Sheets API calls from different projects or scripts running under the same account, or a constant increase in the amount of data being fetched by your scripts. One key thing to take into consideration is that the quota is tied to the user, so you could create a new user and run the scripts with that user as the owner of a new project or a copy.
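
If you want to check how close you actually are to the daily limit, one option is to keep a simple per-day counter in script properties and increment it around every fetch. Below is a minimal sketch; the wrapper name countedFetch_ is hypothetical, and the count is approximate because script properties are not updated atomically across concurrent executions:

/**
 * Hypothetical wrapper around UrlFetchApp.fetch() that counts how many
 * fetches have been made today, using a per-day key in script properties.
 *
 * @param {String} url The URL to fetch.
 * @param {Object} params Optional parameters to use in fetch.
 * @return {HTTPResponse} The response returned by UrlFetchApp.fetch().
 */
function countedFetch_(url, params) {
  const props = PropertiesService.getScriptProperties();
  const key = 'fetchCount_' + Utilities.formatDate(new Date(), Session.getScriptTimeZone(), 'yyyy-MM-dd');
  const count = Number(props.getProperty(key) || 0) + 1; // approximate under concurrent executions
  props.setProperty(key, String(count));
  console.log(`countedFetch_ call #${count} today, url: ${url}`);
  return UrlFetchApp.fetch(url, params || {});
}

Calling countedFetch_(url, params) everywhere instead of UrlFetchApp.fetch(url, params) lets you compare the logged daily count against the 100,000 calls/day Workspace quota.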

A similar scenario is presented in this thread, about functions where exponentially growing data leads to the same error, along with a potential solution; I would highly suggest reviewing it:

References:

  • I don't think this should be linked to no. of calls/second as suggested in another post which you've shared. Because the error clearly states "Service invoked too many times in a day". For the other scenario, the error should have been something like "Exception: Service invoked too many times in a short time.." – ab.it.gcp Feb 25 '23 at 14:48
  • I've created an issue for this in Google Issue tracker to get more clarity as the error msg seems to be a bit deceiving/unclear. https://issuetracker.google.com/issues/270047073 – ab.it.gcp Feb 25 '23 at 15:03
  • There is no response on the issue tracker. Any help on how to escalate it please? – ab.it.gcp Mar 09 '23 at 06:13

Quota exceeded errors usually mean that you really are exceeding the quota. Use logging to ensure that the logic in your code is correct and that there are no unintended repeated calls to UrlFetchApp.fetch().

To find how many times you are actually calling UrlFetchApp.fetch(), use console.log(). One easy way to do that is to replace all UrlFetchApp.fetch(...).getContentText() calls with a call to a helper utility function such as this one:

/**
* Caches and logs UrlFetchApp.fetch().getContentText().
*
* @param {String} url The URL to fetch.
* @param {Object} params The parameters to use in fetch.
* @param {String} optContext Optional. An identifier string to log. Use false to skip logging.
* @return {String} The text returned by HTTPResponse.getContentText().
*/
function cachedUrlFetchContentText_(url, params, optContext) {
  // version 1.1, written by --Hyde, 21 March 2023
  //  - see https://stackoverflow.com/a/75705228/13045193
  const cacheKey = JSON.stringify([url, params]);
  const cache = CacheService.getScriptCache();
  let cacheHit = false;
  let result;
  let resultJson = cache.get(cacheKey);
  if (resultJson) {
    result = JSON.parse(resultJson);
    cacheHit = true;
  } else {
    result = UrlFetchApp.fetch(url, params)
      .getContentText(); // replace with .getContent() to get raw result
    resultJson = JSON.stringify(result);
    cache.put(cacheKey, resultJson, 21600);
  }
  if (optContext !== false) {
    console.log(`cachedUrlFetchContentText_ context: ${optContext || '-'} url: ${url} cacheHit: ${cacheHit}`);
  }
  return result;
}

When you use a utility function to wrap all calls to UrlFetchApp.fetch(), it is also easy to incorporate caching, as shown above. In many use cases, caching can help you avoid hitting quota limits in the first place.
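
For example, assuming an endpoint that returns JSON, a call like UrlFetchApp.fetch(url, params).getContentText() could be replaced as follows; the URL and context string below are placeholders for illustration only:

function getExampleData() {
  // hypothetical endpoint, used only to illustrate the replacement
  const url = 'https://api.example.com/data';
  const params = { method: 'get', muteHttpExceptions: true };

  // before: const text = UrlFetchApp.fetch(url, params).getContentText();
  const text = cachedUrlFetchContentText_(url, params, 'getExampleData');

  const data = JSON.parse(text);
  console.log(data);
}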

To view the logs, visit the My Executions dashboard or the Logs Explorer.

See console, Cloud Logging and Cache Service.

doubleunary
  • Thanks for your post; however, I'm pretty sure that the function is not called more than 100,000 times, which is the quota limit. In fact, the number would be much lower than that. Also, I've already implemented caching in some of the calls. The others return an access token that expires after a short time, so I avoided using the cache for those. – ab.it.gcp Mar 12 '23 at 06:48
  • Quota exceeded errors usually mean that you _really_ are exceeding the quota. Use logging to ensure that the logic in your code is correct and that you are not calling `UrlFetchApp.fetch()` in a never-ending loop or something like that. – doubleunary Mar 12 '23 at 07:21
  • It was indeed a loop. Not a never-ending one, but it ran many more times than I thought and was in the wrong place too. Thank you, logging helped identify it. Although I didn't use a common function/cache, I'm marking your answer as correct for the logging suggestion. – ab.it.gcp Mar 14 '23 at 04:27