
I have the Firebase extension for streaming data to BigQuery installed: https://extensions.dev/extensions/firebase/firestore-bigquery-export.

Each month I run a job to import data into my Firestore collection in batches. This month I imported 2706 rows but only 2646 made it into BigQuery (60 fewer).

I got the following errors from the extension: [![Extension errors][1]][1]

I contacted Firebase support and they suggested I upgrade to the latest firebase-admin and firebase-functions packages, but these have breaking changes. Updating to the latest version of firebase-admin gave me errors. I have not received any further help from them, and it is still happening for multiple collections.

The options I see are:

  1. Update to the latest firebase-admin and firebase-functions packages and change my code to work with the breaking changes. I think this is unlikely to help.
  2. Update the Firebase extension to the latest version, from 0.1.24 to 0.1.29, which now includes a flag called "Use new query syntax for snapshots" that can be turned on. I can't find much information about this.
  3. Increase the BigQuery quota somehow.
  4. Slow down the data being entered into Firestore, or add it daily/weekly rather than monthly (a throttled version of my batch loop is sketched after the code below).

Here is my code in Node.js:

  • firebase-admin: 9.12.0
  • firebase-functions: 3.24.1
  • firebase/firestore-bigquery-export@0.1.24

```js
// db is an initialized firebase-admin Firestore instance and
// getTimestamp() is a helper that returns a Firestore timestamp.
const platformFeesCollectionPath = `platformFees`;
const limit = 500; // a Firestore batched write allows at most 500 operations
let batch = db.batch();
let totalFeeCount = 0;
let counter = 0;

for (const af of applicationFees) {
  const docRef = db.collection(platformFeesCollectionPath).doc();
  batch.set(docRef, { ...af, dateCreated: getTimestamp(), dateModified: getTimestamp() });

  counter++;
  if (counter === limit || counter === applicationFees.length) {
    await batch.commit();
    console.log(`Platform fees batch run for ${counter} platform fees`);
    batch = db.batch();
    totalFeeCount = totalFeeCount + counter;
    counter = 0;
  }
}

if (applicationFees.length > limit && counter > 0) {
  // Commit the final partial batch; the counter === applicationFees.length
  // check above only fires when everything fits in a single batch.
  await batch.commit();
  totalFeeCount = totalFeeCount + counter;
}

if (totalFeeCount > 0) {
  console.log(`Platform fees batch run for ${totalFeeCount} platform fees`);
}
```
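
For option 4, a throttled version of this loop would pace the writes so the extension's streaming inserts into BigQuery are spread out over time. A minimal sketch, assuming a simple fixed delay between commits (the 2-second value is an arbitrary starting point, not a tested number):

```js
// Sketch only: same write logic as above, but paced with a delay between
// batch commits. Relies on db, getTimestamp, applicationFees, limit and
// platformFeesCollectionPath as defined in the code above.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

for (let i = 0; i < applicationFees.length; i += limit) {
  const chunk = applicationFees.slice(i, i + limit);
  const throttledBatch = db.batch();
  for (const af of chunk) {
    const docRef = db.collection(platformFeesCollectionPath).doc();
    throttledBatch.set(docRef, { ...af, dateCreated: getTimestamp(), dateModified: getTimestamp() });
  }
  await throttledBatch.commit();
  console.log(`Committed ${chunk.length} platform fees`);
  await sleep(2000); // arbitrary pause so the triggered inserts can drain
}
```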

Update: If I look in the GCP logs using the query:

```
protoPayload.status.code = "7"
protoPayload.status.message: ("Quota exceeded" OR "limit")
```

I can see many of these errors:
[![Errors][2]][2]


  [1]: https://i.stack.imgur.com/BAgTm.png
  [2]: https://i.stack.imgur.com/eswzI.png

Edit:
Added an issue to the repo:
https://github.com/firebase/extensions/issues/1394

Update:
It is still not working with v0.1.29 of the BigQuery extension. I am getting the same errors.
MadMac
  • I think I know what is causing your problem. A question about your code, what is it doing exactly? Do you copy multiple elements to a specific collection? – GabrielNexT Jan 08 '23 at 01:13
  • @GabrielNexT I just copy multiple transaction records from Stripe into a collection in Firestore. 2706 of them at 500 a batch as mentioned above. – MadMac Jan 08 '23 at 01:52
  • Forgive me, for a moment I thought you were moving data between collections. No problem, let's think of another solution that applies more to your case. I deleted my answer as it doesn't solve the problem. – GabrielNexT Jan 09 '23 at 00:01
  • Probably the problem is that the extension doesn't batch update. Have you tried using the bigquery package for Nodejs? That way you add it to firestore and bigquery. – GabrielNexT Jan 09 '23 at 00:02
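
For reference, a minimal sketch of GabrielNexT's suggestion: write each row to BigQuery yourself with the official @google-cloud/bigquery package instead of relying only on the extension's `onWrite` trigger. The dataset and table names below are hypothetical placeholders, and the table schema is assumed to match the rows:

```js
// Sketch only: stream rows straight into BigQuery alongside the
// Firestore writes.
const { BigQuery } = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function insertPlatformFees(rows) {
  // Streaming insert; note this API has its own quotas and pricing.
  await bigquery
    .dataset('my_dataset')   // hypothetical dataset name
    .table('platform_fees')  // hypothetical table; schema assumed to match
    .insert(rows);
}
```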

2 Answers


Would it be possible to provide the BigQuery extension version number for your current installation? This can be found on your Firebase console => Extensions tab.

The error "Exceeded rate limits: too many api requests", was an error we hoped to resolve with a release in June 2022. So perhaps be resolved with an upgrade, at the very least using the above example will provide the maintenance team a way of reproducing the bug.

In addition, if you could create an issue on the repository, it would be easier for the maintainers to track.

Darren Ackers
  • The version is in the description firebase/firestore-bigquery-export@0.1.24. I have updated it to 0.1.29 with "Use new query syntax for snapshots" turned to on so will find out next month if it is working. Will add to repo. – MadMac Jan 09 '23 at 18:37
  • https://github.com/firebase/extensions/issues/1394 – MadMac Jan 09 '23 at 18:47
  • It is still not working with v0.1.29 of the bigquery extension. I am getting the same errors. – MadMac Feb 02 '23 at 21:34
  • With this custom script writing the data, the extension fires on the `onWrite` trigger in Firestore for every document. Is there any way you could throttle the speed of your requests as a temporary solution? I wonder if so many concurrent requests are causing the service to fail. – Darren Ackers Feb 02 '23 at 22:17

The error you are facing occurs because the maximum number of API requests per second per user per method has been exceeded. BigQuery returns this error when you hit that rate limit. For more information, see the "Maximum number of API requests per second per user per method" rate limit in All BigQuery API.
To prevent this error, you could try the following:

  • Reduce the number of API requests or add a delay between multiple API requests so that the number of requests stays under this limit.

  • The streaming inserts API has costs associated with it and has its own set of limits and quotas. To learn about the cost of streaming inserts, see BigQuery pricing.

  • You can request a quota increase by contacting support or sales. For additional quota, see Request a quota increase. Requesting a quota increase might take several days to process. To support your request, we recommend including the priority of the job, the user running the query, and the affected method.

  • You can retry the operation after a few seconds. Use exponential backoff between retry attempts. That is, increase the delay between each retry.
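
A minimal sketch of that last suggestion in Node.js, retrying with exponential backoff (the retry count and base delay are illustrative assumptions, not recommended values):

```js
// Sketch only: retry an async operation with exponentially growing delays.
async function withBackoff(operation, maxRetries = 5) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await operation();
    } catch (err) {
      if (attempt === maxRetries) throw err;
      const delayMs = 500 * 2 ** attempt; // 500 ms, 1 s, 2 s, 4 s, ...
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Usage, for example: await withBackoff(() => batch.commit());
```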

Vaidehi Jamankar
  • This is pretty much just a copy and paste of the page here: https://cloud.google.com/bigquery/docs/troubleshoot-quotas – MadMac Jan 09 '23 at 18:39