
I have a Firestore collection structure as follows: /documents/{documentID}/events/{eventId}. I want to insert a row into a BigQuery table using a Firebase trigger function that listens to the events collection with

functions.firestore
  .document("documents/{documentID}/events/{eventId}")
  .onCreate(async (snap, context) => {
    // insert into BQ using DML
  });
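
Inside the handler I run one DML statement per event document, roughly like this (a minimal sketch; the project, dataset, table, and column names here are placeholders):

const functions = require("firebase-functions");
const { BigQuery } = require("@google-cloud/bigquery");

const bigquery = new BigQuery();

exports.onEventCreate = functions.firestore
  .document("documents/{documentID}/events/{eventId}")
  .onCreate(async (snap, context) => {
    // One DML INSERT per created event; each statement counts
    // against BigQuery's per-table DML rate quota.
    await bigquery.query({
      query:
        "INSERT INTO `myproject.mydataset.events` (id, payload) VALUES (@id, @payload)",
      params: {
        id: context.params.eventId,
        payload: JSON.stringify(snap.data()),
      },
    });
  });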

When many events are added to the collection at once, the INSERT DML statement doesn't succeed for all of them (on average, about 40 of 45 events get written to the BigQuery table), and it fails with an error like this: message: 'Exceeded rate limits: too many table dml insert operations for this table. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas'

I can't use streaming inserts because they deny me the ability to delete an inserted row before a certain amount of time passes (90 minutes or so).

How can I solve this problem? Thanks in advance.

melihozcann

1 Answer

  1. You could batch multiple rows and load them together in a single statement (see the batching sketch at the end of this answer).

  2. Take advantage of streaming and, instead of trying to remove rows from your main table within the 90-minute window, keep a separate table that serves as a "removed rows" record (sketched below).

You could then query the main table as:

SELECT * FROM mainTable WHERE id NOT IN (SELECT id FROM removedRowsTable)

Periodically, you could still clean up your main table by deleting the rows identified in the "removed rows" table, once they are past the streaming-buffer window.
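
With streaming, both the insert and the "removal" become simple table writes (a minimal sketch; the dataset, table, and column names are placeholders, and removedRowsTable only needs to store ids):

const { BigQuery } = require("@google-cloud/bigquery");

const dataset = new BigQuery().dataset("mydataset");

// Stream the new event into the main table; streaming inserts are not
// subject to the per-table DML rate quota.
async function insertEvent(id, payload) {
  await dataset.table("mainTable").insert([{ id, payload }]);
}

// "Delete" a row by streaming its id into the removed-rows table instead
// of running a DML DELETE against a row that may still be in the
// streaming buffer.
async function removeEvent(id) {
  await dataset.table("removedRowsTable").insert([{ id }]);
}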
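
For option 1, one way to batch is to have the onCreate trigger write into a staging collection and flush it with a scheduled function, so many rows go into a single DML statement (a minimal sketch; the pendingEvents collection, the schedule, and the table name are assumptions, not part of the question):

const functions = require("firebase-functions");
const admin = require("firebase-admin");
const { BigQuery } = require("@google-cloud/bigquery");

admin.initializeApp();
const bigquery = new BigQuery();

exports.flushEvents = functions.pubsub
  .schedule("every 5 minutes")
  .onRun(async () => {
    // Drain up to 500 buffered events per run (500 is also the
    // Firestore batched-write limit used below).
    const snap = await admin
      .firestore()
      .collection("pendingEvents")
      .limit(500)
      .get();
    if (snap.empty) return null;

    // A single multi-row INSERT counts as one DML operation against
    // the per-table quota, however many rows it carries.
    const rows = snap.docs.map((d) => ({
      id: d.id,
      payload: JSON.stringify(d.data()),
    }));
    await bigquery.query({
      query:
        "INSERT INTO `myproject.mydataset.events` (id, payload) " +
        "SELECT id, payload FROM UNNEST(@rows)",
      params: { rows },
    });

    // Clear the flushed documents from the buffer.
    const batch = admin.firestore().batch();
    snap.docs.forEach((d) => batch.delete(d.ref));
    await batch.commit();
    return null;
  });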

Luka