Questions tagged [google-cloud-data-transfer]

Use this tag for Google Cloud Platform Data Transfer products (https://cloud.google.com/products/data-transfer/), which help customers move unstructured data between Google Cloud and other clouds or storage systems in private data centers. This includes products such as Storage Transfer Service, Transfer Service for on-premises data, Transfer Appliance, and the gsutil cp command.

71 questions
0
votes
1 answer

Long delay in transfer of 74 GB CSV from AWS S3 to BigQuery using BigQuery Data Transfer Service

I'm trying to move a 74 GB CSV from AWS S3 to BigQuery using the BigQuery Data Transfer Service. It's been 9 hours, and it's still not done. The logs don't show any errors, but one message keeps showing up: Transfer from Amazon S3 to Google Cloud…
0
votes
1 answer

YouTube Data Transfer high views

I am using BigQuery Data Transfers to regularly get YouTube Content Owner data. The transfers are running successfully, but for some videos the views I get in the p_content_owner_basic_a3 table are much higher than the ones seen in the YouTube Analytics…
0
votes
0 answers

Where do I find search terms for shopping campaigns in Google Ads Data Transfer service tables?

We are migrating our app to use the Google Ads transfer service tables rather than the Google AdWords transfer service tables in BigQuery. We are having trouble finding the search terms that used to be in the table SearchQueryStats for "SHOPPING"…
0
votes
1 answer

How do I exclude a filename that contains '_All_Data' when using the Data Transfer feature on BigQuery?

I'm using Google Cloud Storage as the source and BigQuery as the destination for the Data Transfer feature that's available in BigQuery. So, in the data source details when creating a new data transfer, I need to input the 'Cloud Storage URI', but I don't…
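The Cloud Storage URI that a BigQuery data transfer accepts appears to support only the `*` wildcard, with no way to express an exclusion, so a common workaround is to list the objects yourself and filter the names client-side before loading. A minimal stdlib sketch of such a filter (the object names and patterns below are made up for illustration):

```python
import fnmatch

def matching_uris(object_names, include_pattern="*.csv", exclude_substring="_All_Data"):
    """Keep objects that match the wildcard but do not contain the excluded substring."""
    return [
        name for name in object_names
        if fnmatch.fnmatch(name, include_pattern) and exclude_substring not in name
    ]

names = [
    "exports/daily_report.csv",
    "exports/daily_All_Data.csv",
    "exports/summary.csv",
]
print(matching_uris(names))  # → ['exports/daily_report.csv', 'exports/summary.csv']
```

The surviving names could then be loaded individually (e.g. via load jobs) instead of relying on a single wildcard URI.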
0
votes
0 answers

gcloud storage cp: skipping files already copied to the destination bucket

I am using the gcloud storage cp command to copy a large number of files from one GCP bucket to another using the command below: gcloud storage cp -r "gs://test-1/*" "gs://test-3" --encryption-key=XXXXXXXXXXXXXXXXXXXXXXX --storage-class=REGIONAL I have…
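gcloud storage cp offers a --no-clobber flag that skips objects already present at the destination; another approach is to diff the two bucket listings and copy only the missing names. A rough stdlib sketch of that diff (the file names are hypothetical, and this compares names only, not content):

```python
def objects_to_copy(source_names, dest_names):
    """Return source objects not yet present in the destination (name-only check)."""
    return sorted(set(source_names) - set(dest_names))

src = ["a.csv", "b.csv", "c.csv"]
dst = ["a.csv"]
print(objects_to_copy(src, dst))  # → ['b.csv', 'c.csv']
```

For large buckets, a name-only diff avoids re-reading object data, but it will not detect objects whose contents changed after the first copy.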
0
votes
1 answer

gcloud storage cp not preserving directory structure when copying cloud to cloud

I am trying to copy all the data from the source bucket to the destination bucket recursively using the command below: gcloud storage cp -r "gs://test-1" "gs://test-3" --encryption-key=XXXXXXXXXXXXXXXX --manifest-path=test-manifest.csv…
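The name mapping a recursive copy is normally expected to apply (destination name = destination prefix plus the object's path relative to the source prefix) can be sketched with the stdlib; the prefixes below are hypothetical:

```python
import posixpath

def dest_name(source_name, source_prefix, dest_prefix):
    """Map a source object name to a destination name, preserving the
    directory structure below source_prefix."""
    relative = posixpath.relpath(source_name, source_prefix)
    return posixpath.join(dest_prefix, relative)

print(dest_name("data/2023/01/part-0.avro", "data", "backup"))
# → backup/2023/01/part-0.avro
```

Comparing an object's actual destination name against this expected mapping is one way to pin down whether the copy flattened the hierarchy or nested it one level deeper than intended.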
0
votes
1 answer

Copy large data from one cloud storage bucket to another with CSEK

I want to copy data from one GCP Cloud Storage bucket to another. The source bucket has millions of small files encrypted with AES-256 keys (i.e. customer-supplied encryption keys). customerEncryption: …
0
votes
0 answers

Google BigQuery Data Transfer Service not updating GBQ table after first run

We are fetching data from an AppsFlyer source into a Google Cloud Storage bucket, then trying to insert it into a GBQ table through the Data Transfer Service. The transfer runs successfully, but the data is not shown in the GBQ table, but whenever we are…
0
votes
0 answers

BigQuery Data Transfer Service - Amazon S3 transfer - sometimes cannot match the source files

I'm migrating data from Amazon S3 to BigQuery using the BigQuery Data Transfer Service; the source files are CSVs generated into a specific bucket. Normally in this bucket (s3://landing_data/to_load/) I have two types of…
0
votes
1 answer

Update query string in scheduled query using Python Client for BigQuery Data Transfer Service

I'm struggling to find documentation and examples for the Python Client for the BigQuery Data Transfer Service. A new query string is generated by my application from time to time, and I'd like to update the existing scheduled query accordingly. This is the…
0
votes
1 answer

How to automatically transfer newly added avro data from GCS to BigQuery

I want to schedule a data transfer job from Cloud Storage to BigQuery. I have one application that continuously dumps data to the GCS bucket path (let's say gs://test-bucket/data1/*.avro) that I want to move to BigQuery as soon as the object is…
0
votes
1 answer

How to transfer files with the Google Workspace Admin SDK (Python)

I am trying to write a program that transfers users' Drive and Docs files from one user to another. It looks like I can do it using this documentation. I created the data transfer object, which looks like this: datatransfer = { …
0
votes
1 answer

Google Cloud Transfer Job is creating one extra folder

I have created a Transfer Job to import some of my website's static resources into Google Cloud Storage. The job was supposed to import the data into a bucket named www.pretty-story.com. It is importing from a TSV file located here. For instance, the first URL…
0
votes
1 answer

BigQuery LoadJobConfig: delete source files after transfer

When creating a BigQuery Data Transfer Service job manually through the UI, I can select an option to delete source files after transfer. When I try to use the CLI or the Python client to create on-demand Data Transfer Service jobs, I do not see an…
0
votes
1 answer

How to delete a GCP Storage Transfer job in Python

I want to create a small Python program (Cloud Functions) to create, run, and delete a few GCP Storage Transfer services. I can create a GCP storage transfer job as follows: from google.cloud import storage_transfer transfer_job_request =…