I have around a million data items being inserted into BigQuery using the streaming API (the BigQuery Python client's insert_rows function), but there is some data loss: roughly 10,000 items go missing during insertion. Is there a chance BigQuery might be dropping some of the data? There are no insertion errors (or any errors whatsoever, for that matter).
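
The insert loop looks roughly like the sketch below (simplified; the table name, schema, and batching are placeholders). I am checking the list of per-row errors that insert_rows returns, and it always comes back empty:

```python
from google.cloud import bigquery

client = bigquery.Client()
# Placeholder table; the real dataset/table names differ.
table = client.get_table("my_dataset.my_table")

def insert_batch(rows):
    # insert_rows does not raise on per-row failures; it returns a list of
    # error mappings (one per failed row). An empty list means every row
    # in the batch was accepted.
    errors = client.insert_rows(table, rows)
    if errors:
        print("Insert errors:", errors)
    return errors
```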

Dinesh
- If BigQuery had a problem with dropping data, everybody would be screaming and nobody would use the product. You have a problem with your code or processes. However, you did not include anything in your question that would help diagnose it. – John Hanley Dec 03 '18 at 17:46
- Please report with more specifics on the BigQuery issue tracker https://issuetracker.google.com/savedsearches/559654 – Felipe Hoffa Dec 03 '18 at 22:03
1 Answer
I would recommend filing a private issue in the Issue Tracker so that the BigQuery engineers can look into this. Make sure to provide the affected project, the source of the data, the code that you are using to stream into BigQuery, and the client library version.
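
For example, the installed client library version can be read off the package itself (a small sketch, assuming the google-cloud-bigquery package):

```python
from google.cloud import bigquery

# Include this in the issue report so the BigQuery engineers know which
# client library version produced the streaming inserts.
print("google-cloud-bigquery version:", bigquery.__version__)
```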

Steeve