Questions tagged [google-bigquery-storage-api]

23 questions
2 votes, 1 answer

What is the best way to extract data from BigQuery and load into SQL Server?

I want to create a generic pipeline where I can pass a table name or custom SQL as input and load the required data from BigQuery to SQL Server. This pipeline should handle the daily incremental load as well as the initial historical load (around 100…
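A generic incremental pull like the one described is often parameterized on a watermark column. A minimal sketch of building the per-run extract query (the table, column names, and helper are hypothetical, not from any library):

```python
def incremental_extract_sql(table: str, watermark_col: str, last_value: str) -> str:
    # Build the BigQuery query for one incremental run: everything newer
    # than the watermark recorded by the previous successful run.
    return (
        f"SELECT * FROM `{table}` "
        f"WHERE {watermark_col} > TIMESTAMP '{last_value}'"
    )

# The initial historical load can reuse the same helper with an epoch watermark:
print(incremental_extract_sql("proj.sales.orders", "updated_at", "1970-01-01 00:00:00"))
```

The same query string can then be passed to whatever extraction client the pipeline uses, keeping the incremental and historical paths identical.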
2 votes, 1 answer

Using managedwriter without a protocol definition

I am trying to use the Go BigQuery Storage API client. I have verified that data can be inserted for most field types, but preparing a protocol buffer definition every time I insert data is tedious. The Java client has a JsonStreamWriter class; it…
1 vote, 1 answer

Querying INFORMATION_SCHEMA.TABLE_STORAGE in BigQuery gives an access denied error

I'm trying to query the TABLE_STORAGE view in my BigQuery project to get storage information about my tables. I'm using the following query: SELECT * FROM `region-us`.INFORMATION_SCHEMA.TABLE_STORAGE, but I get the following error: Access Denied:…
Pratzz
1 vote, 2 answers

Google Dataflow store to specific Partition using BigQuery Storage Write API

I want to store data to BigQuery by using specific partitions. The partitions are ingestion-time based. I want to use a range of partitions spanning over two years. I use the partition alias destination project-id:data-set.table-id$partition-date. I…
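The `$` partition decorator mentioned here addresses one ingestion-time partition per day. A small sketch of how such a destination string is assembled (project, dataset, and table names are hypothetical; the `table$YYYYMMDD` decorator format itself is BigQuery's):

```python
from datetime import date

def partition_destination(project: str, dataset: str, table: str, day: date) -> str:
    # The `$` decorator targets one ingestion-time partition: table$YYYYMMDD
    return f"{project}:{dataset}.{table}${day.strftime('%Y%m%d')}"

print(partition_destination("my-project", "my_dataset", "events", date(2023, 5, 1)))
# my-project:my_dataset.events$20230501
```

Iterating `day` over the two-year range yields one destination alias per partition.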
1 vote, 1 answer

Avoid session shutdown on BigQuery Storage API with Dataflow

I am implementing an ETL job that migrates a non-partitioned BigQuery table to a partitioned one. To do so I use the BigQuery Storage API, which creates a number of sessions to pull data from. In order to route the BigQuery writes to the right…
1 vote, 1 answer

How to use BigQuery Storage API to concurrently read streams in Python threads

I have a large table (external to BigQuery as the data is in Google Cloud Storage). I want to scan the table using BigQuery to a client machine. For throughput, I fetch multiple streams concurrently in multiple threads. From all I can tell,…
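The fan-out pattern this question describes (one worker per read stream) can be sketched independently of the client library. Here `fetch_stream` is a hypothetical stand-in; a real implementation would call the Storage API's read-rows method for each stream name returned by the read session:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_stream(stream_name: str) -> list:
    # Hypothetical stand-in: a real version would read rows for this
    # stream via the BigQuery Storage read client and collect them.
    return [f"{stream_name}-row-{i}" for i in range(3)]

def read_all(stream_names, max_workers=4):
    # One worker per stream; map() returns results in stream order
    # even though the fetches run concurrently.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch_stream, stream_names))

rows = read_all(["stream-0", "stream-1"])
print(sum(len(r) for r in rows))  # 6
```

Because the streams are independent, throughput scales with the worker count until the network or client machine saturates.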
0 votes, 1 answer

Dataflow WriteToBigQuery with STORAGE_WRITE_API doesn't accept fields starting with underscores

When deploying this step WriteToBigQuery( method='STORAGE_WRITE_API', table=[TABLE], schema=[PATH TO SCHEMA ON GCS BUCKET], …
0 votes, 0 answers

BigQuery Data Transfer for CM360

I am currently trying to do a BigQuery Data Transfer for CM360. My Google Account Manager has placed the Data Transfer files into a Google Cloud Storage (GCS) bucket. Access to the GCS bucket is managed by a Google Group of which I am an admin, but…
0 votes, 0 answers

Problem importing google-cloud-bigquerystorage maven dependency

I'm trying to build a service to store data in BigQuery. One of the imported dependencies is com.google.cloud:google-cloud-bigquerystorage:2.41.0
0 votes, 1 answer

BigQuery Storage API: what happened to AppendRowsStream

I am seeing no errors, but no data is submitted. I am trying to use the new Python BigQuery Storage API (google.cloud.bigquery_storage instead of google.cloud.bigquery_storage_v1 or google.cloud.bigquery_storage_v1beta2), as can be seen here. But there are no…
0 votes, 1 answer

How to restrict BigQuery API usage for selected users?

I want to set a daily BigQuery query-usage quota for selected users. According to the documentation, we can set a query usage quota for all users, but not for selected users. Is there any way to set up a quota on the BigQuery API in the same project as…
0 votes, 0 answers

BigQuery Storage Write API calls not logged to Audit logs

I'm trying to create a Cloud Function that is called when new rows are written to a BigQuery table using the Storage Write API (it's the Firebase Crashlytics to BigQuery stream). However, the Cloud Function is never triggered. I checked the Audit…
0 votes, 1 answer

How to delete a column with a reserved word as the column name?

I'm new to BigQuery and I created a column that has a reserved word as its name in one of the tables. Now that I'm trying to remove it by rewriting the table with a new query, I always get this error…
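Reserved words are legal as BigQuery column names when backtick-quoted, which also applies in DDL such as ALTER TABLE … DROP COLUMN. A small sketch of quoting an identifier (the table name and helper are hypothetical; `select` stands in for the reserved word):

```python
def quote_identifier(name: str) -> str:
    # Backtick-quoting makes reserved words legal as BigQuery identifiers;
    # any embedded backtick must itself be escaped.
    return "`" + name.replace("`", "\\`") + "`"

# Dropping a column whose name collides with a reserved word:
print(f"ALTER TABLE mydataset.mytable DROP COLUMN {quote_identifier('select')}")
# ALTER TABLE mydataset.mytable DROP COLUMN `select`
```

Quoting the name in the SELECT list of the rewriting query works the same way, avoiding the need to drop and recreate the table.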
0 votes, 0 answers

Problems streaming arrow to DuckDB wasm

Apologies in advance for the wall of text. I am working on a PoC to see whether we can expose data from our data warehouse in a relatively effective way to a browser client. Data warehouse: BigQuery. API: BigQuery Storage Read API. One…
0 votes, 1 answer

Using 'adapt' from @google-cloud/bigquery-storage

I'm trying to use @google-cloud/bigquery-storage and it seems I can't get adapt. This is what I'm trying to do. const { v1, adapt } = require('@google-cloud/bigquery-storage'); const { BigQueryWriteClient } = v1; const { BigQuery } =…
DbxD