Questions tagged [gcs]
237 questions
0 votes
3 answers
Table expiration in GCS to BQ Airflow task
I am copying a CSV into a new BQ table using the GCSToBigQueryOperator task in Airflow. Is there a way to add a table expiration to this table within this task?
new_table_task = GCSToBigQueryOperator(
    task_id='insert_gcs_to_bq_tmp_table',
    …

Hayley Guillou
- 3,953
- 4
- 24
- 34
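A hedged sketch of one common approach to the question above: as far as I know, GCSToBigQueryOperator exposes no expiration argument, so a follow-up task can set it with the BigQuery client. The project, dataset, and table names here are placeholders.

from datetime import datetime, timedelta, timezone

from google.cloud import bigquery

def set_table_expiration(project, dataset, table, hours=24):
    # Patch only the table's expiration timestamp; everything else is untouched.
    client = bigquery.Client(project=project)
    tbl = client.get_table(f"{project}.{dataset}.{table}")
    tbl.expires = datetime.now(timezone.utc) + timedelta(hours=hours)
    client.update_table(tbl, ["expires"])

Alternatively, a default table expiration set on the dataset applies to every table created in it, which avoids the extra task entirely.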
0 votes
2 answers
Unable to unzip password-protected .gz file in GCS bucket using Python
Trying to unzip a password-protected file in GCS but getting an error in the code below. The code works fine with normal .gz files but fails on password-protected files.
storage_client = storage.Client()
source_bucket = 'bucket'
source_bucket1 =…

abhi
- 11
- 4
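Worth noting for the question above: the gzip format itself has no password mechanism, so a "password-protected .gz" is usually a ZIP archive in disguise. A minimal sketch, assuming a legacy ZipCrypto-encrypted ZIP (Python's zipfile does not handle AES-encrypted archives); bucket, blob, and output paths are placeholders.

import io
import zipfile

from google.cloud import storage

def extract_protected_zip(bucket_name, blob_name, password):
    # Download the archive into memory, then extract with the password.
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    data = io.BytesIO(blob.download_as_bytes())
    with zipfile.ZipFile(data) as zf:
        zf.extractall(path="/tmp/out", pwd=password.encode())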
0 votes
3 answers
Upload file to Google bucket directly from SFTP server using Python
I am trying to upload a file from an SFTP server to a GCS bucket using a Cloud Function, but this code is not working. I am able to connect over SFTP, but when I try to upload the file to the GCS bucket it fails. The requirement is to use a Cloud Function with Python.
Any…

Chandra
- 7
- 1
- 3
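One plausible shape for the Cloud Function above, sketched with paramiko; the host, credentials, bucket, and file paths are all placeholders. Streaming the SFTP file handle straight into upload_from_file avoids writing to the function's limited local disk.

import paramiko
from google.cloud import storage

def sftp_to_gcs(event, context):
    transport = paramiko.Transport(("sftp.example.com", 22))  # placeholder host
    transport.connect(username="user", password="secret")     # placeholder creds
    sftp = paramiko.SFTPClient.from_transport(transport)

    blob = storage.Client().bucket("my-bucket").blob("incoming/file.csv")
    with sftp.open("/remote/file.csv", "rb") as remote_file:
        remote_file.prefetch()              # speeds up sequential reads
        blob.upload_from_file(remote_file)  # streams without touching local disk

    sftp.close()
    transport.close()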
0 votes
1 answer
A signed GCS URL gets 'SignatureDoesNotMatch'; the URL was signed by a service using Workload Identity
In a Workload Identity-enabled GKE cluster, a service signed a GCS URL, but after about 10 days it started returning 'SignatureDoesNotMatch'.
Does system-managed private key rotation cause this?
What should I do to resolve it?
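Two hedged observations on the question above: V4 signed URLs cannot be valid for more than 7 days, so any URL expected to work after ~10 days will fail regardless of key rotation; and with Workload Identity there is no local private key, so signing typically goes through the IAM signBlob API via impersonated credentials. A sketch, with the service account name as a placeholder:

from datetime import timedelta

import google.auth
from google.auth import impersonated_credentials
from google.cloud import storage

def make_signed_url(bucket_name, blob_name):
    # Sign with the IAM signBlob API instead of a local key file.
    source_creds, _ = google.auth.default()
    signing_creds = impersonated_credentials.Credentials(
        source_credentials=source_creds,
        target_principal="signer@my-project.iam.gserviceaccount.com",  # placeholder
        target_scopes=["https://www.googleapis.com/auth/devstorage.read_only"],
    )
    blob = storage.Client().bucket(bucket_name).blob(blob_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(days=7),  # V4 URLs cannot exceed 7 days
        credentials=signing_creds,
    )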
0 votes
1 answer
Deleting a folder inside a GCP bucket
I have a temporary folder in a GCP bucket that I want to delete along with all of its contents. My idea is to pass the path of this temp folder as a prefix, list all the blobs inside it, and delete every blob, but I had…

Mee
- 1,413
- 5
- 24
- 40
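The prefix-listing approach described above is the standard one, since GCS "folders" are only a naming convention. A minimal sketch; the bucket name and prefix are placeholders.

from google.cloud import storage

def delete_folder(bucket_name, prefix):
    # A "folder" is just a shared prefix; delete every blob under it.
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blobs = list(client.list_blobs(bucket_name, prefix=prefix))
    bucket.delete_blobs(blobs)

delete_folder("my-bucket", "tmp/session-123/")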
0 votes
2 answers
Python script for GCS - calling in terminal with argument - json.loads() stopped working
I'm uploading files to GCS (Google Cloud Storage). My script was working.
But now, when I try to run it in the terminal with python uploadtogcs.py 'C:/Users/AS/Documents/GCSUploadParameters.json'
I receive an error:
Traceback (most recent call last):
File…

AnnaSh
- 11
- 1
- 4
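A common cause of this symptom (a guess, since the traceback is truncated): json.loads() parses a JSON string, so passing it a file path raises a JSONDecodeError. When the script receives a path as a command-line argument, open the file and use json.load() instead:

import json
import sys

# sys.argv[1] is the path passed on the command line, not JSON text.
with open(sys.argv[1], encoding="utf-8") as fh:
    params = json.load(fh)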
0 votes
0 answers
Avro Files from GCS to BigQuery Using Dataflow
I want to export the data from Avro files in GCS into a BigQuery table using Dataflow and Python.
Can anyone let me know how to do it? In Dataflow there is no ready-made template for transferring Avro files in batch mode to…

Rahul Wagh
- 281
- 6
- 20
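Since there is no ready-made template, a custom Apache Beam pipeline is the usual route. A hedged sketch; the project, bucket, and table names are placeholders, and it assumes the destination table already exists so no schema has to be supplied.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    opts = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=opts) as p:
        (p
         # ReadFromAvro yields each Avro record as a dict.
         | "ReadAvro" >> beam.io.ReadFromAvro("gs://my-bucket/data/*.avro")
         | "WriteBQ" >> beam.io.WriteToBigQuery(
             "my-project:my_dataset.my_table",
             create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))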
0 votes
1 answer
BigQuery Client bigquery.listTableData with Filter condition
I am using the lines of code below to fetch data from Google BigQuery.
public class BQTEST {
    public static void main(String... args) throws Exception {
        String datasetName = "mydataset";
        String tableName =…

1stenjoydmoment
- 229
- 3
- 14
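As far as I know, tabledata.list (which listTableData wraps) has no server-side filter; the usual workaround in any client is to run a query with a WHERE clause instead. A Python sketch of the same idea, with the table and column names as placeholders:

from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT *
    FROM `my-project.mydataset.mytable`
    WHERE status = @status
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("status", "STRING", "ACTIVE")]
)
for row in client.query(query, job_config=job_config).result():
    print(dict(row))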
0 votes
1 answer
Python script for automating GCP Storage uploads
I'm new to GCP and GCP Storage. I want to upload files from the file system on my PC to a GCS bucket.
I found the following code and am altering it.
But I have files that sit in folders like this: \F1\Data\Export\PlayOnUsers\2021\12\
That is, year 2021 and month 12 -…

Anna
- 1
- 1
- 4
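A sketch of one way to mirror a local folder tree into a bucket while keeping the year/month layout in the blob names; the bucket name and paths are placeholders. relative_to() plus as_posix() turns Windows backslash paths into the forward-slash names GCS expects.

from pathlib import Path

from google.cloud import storage

def upload_tree(local_root, bucket_name, prefix=""):
    bucket = storage.Client().bucket(bucket_name)
    root = Path(local_root)
    for path in root.rglob("*"):
        if path.is_file():
            # e.g. "PlayOnUsers/2021/12/file.csv", preserving folder structure
            blob_name = prefix + path.relative_to(root).as_posix()
            bucket.blob(blob_name).upload_from_filename(str(path))

upload_tree(r"\F1\Data\Export\PlayOnUsers", "my-bucket", "PlayOnUsers/")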
0 votes
1 answer
GCS, signed URL and CORS
I'm trying to upload a file to GCS from the browser using a signed URL.
I've set the CORS policy on the bucket:
$ gsutil cors get gs://some-bucket/
[{"maxAgeSeconds": 3600, "method": ["HEAD", "GET", "OPTIONS", "PUT"], "origin": ["*"],…

Miki Tebeka
- 13,428
- 4
- 37
- 49
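For browser uploads like the one above, two details commonly bite (hedged, since the question is truncated): the CORS policy must expose the headers the browser sends, and the Content-Type the browser PUTs must match the one baked into the V4 signature. A sketch; the bucket and object names are placeholders.

from datetime import timedelta

from google.cloud import storage

bucket = storage.Client().get_bucket("some-bucket")
bucket.cors = [{
    "origin": ["*"],
    "method": ["PUT", "GET", "HEAD", "OPTIONS"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3600,
}]
bucket.patch()  # persist the CORS policy on the bucket

url = bucket.blob("upload.bin").generate_signed_url(
    version="v4",
    method="PUT",
    expiration=timedelta(minutes=15),
    content_type="application/octet-stream",  # the PUT must send this exact type
)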
0 votes
0 answers
How to configure GCS as a Filebeat input
We are storing our audit logs in a GCS bucket. We would like to ingest them into Elasticsearch when required - not regularly - using Filebeat. I have checked the S3 option, which lets us use S3-like storage as an input via providers.
I'm using the following…

Arvin
- 315
- 1
- 3
- 15
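Recent Filebeat releases (8.4+, if I recall correctly) ship a dedicated gcs input, which may be a better fit than the S3-provider route. A sketch of what the configuration might look like; the field names are from memory of the docs and worth verifying, and the project ID, credentials path, and bucket name are placeholders.

filebeat.inputs:
  - type: gcs
    id: audit-logs-gcs
    project_id: my-project
    auth.credentials_file.path: /etc/filebeat/gcs-creds.json
    buckets:
      - name: audit-logs-bucket
        max_workers: 3
        poll: true
        poll_interval: 300s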
0 votes
1 answer
No AbstractFileSystem configured for scheme: gs
I am getting the error below while running a Gobblin job.
My core-site.xml looks fine and has the required value.
core-site.xml:
<property>
  <name>fs.AbstractFileSystem.gs.impl</name>
  <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS</value>
</property>
…

1stenjoydmoment
- 229
- 3
- 14
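In my experience this error usually means the setting is not actually reaching the job (core-site.xml not on the classpath) or the GCS connector jar is missing. It is also common to need the plain FileSystem mapping alongside the AbstractFileSystem one, something like:

<property>
  <name>fs.gs.impl</name>
  <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem</value>
</property>

and to make sure the gcs-connector jar matching your Hadoop version is on the job's classpath.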
0 votes
1 answer
Spark read job from GCS object stuck
I'm trying to read an object locally with a Spark job; I previously created the object with another Spark job, also run locally.
Looking at the logs I see nothing weird, and in the Spark UI the job is just stuck.
Before I kick off the read job I update the Spark config…

bachr
- 5,780
- 12
- 57
- 92
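Since the config lines are truncated, here is a hedged sketch of a local Spark session that reads from GCS; the connector version, key-file path, and object path are placeholders, and the auth property names shown are the ones used by the 2.x connector.

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("gcs-read")
         # GCS connector jar must be on the classpath (version is a placeholder)
         .config("spark.jars.packages",
                 "com.google.cloud.bigdataoss:gcs-connector:hadoop3-2.2.0")
         .config("spark.hadoop.fs.gs.impl",
                 "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem")
         .config("spark.hadoop.google.cloud.auth.service.account.enable", "true")
         .config("spark.hadoop.google.cloud.auth.service.account.json.keyfile",
                 "/path/to/key.json")
         .getOrCreate())

df = spark.read.parquet("gs://my-bucket/my-object/")  # placeholder path
df.show()

If the job hangs with nothing in the logs, misconfigured credentials are a common culprit worth ruling out first.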
0 votes
2 answers
RESOURCE_EXHAUSTED error when linking a GCS bucket to Firebase (projects.buckets.addFirebase)
I have a script that creates a GCS bucket, links it to Firebase, and applies Firebase rules to the bucket. Recently I have been running into an error where it cannot link the GCS bucket to Firebase.
I am using the REST method projects.buckets.addFirebase to…

davidbilla
- 2,120
- 1
- 15
- 26
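RESOURCE_EXHAUSTED generally signals a quota or rate limit, so the usual first-line fix is exponential backoff around the call. A sketch against the documented REST method; the project and bucket names are placeholders, and the URL shape is my reading of the projects.buckets.addFirebase docs.

import time

import requests

def add_firebase_with_backoff(project, bucket, token, max_tries=5):
    url = (f"https://firebasestorage.googleapis.com/v1beta/"
           f"projects/{project}/buckets/{bucket}:addFirebase")
    for attempt in range(max_tries):
        resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
        if resp.status_code != 429:  # RESOURCE_EXHAUSTED surfaces as HTTP 429
            resp.raise_for_status()
            return resp.json()
        time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s, ...
    raise RuntimeError("addFirebase still rate-limited after retries")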
0 votes
2 answers
Download and upload files and folders between GCS and Cloud Shell
I have some files and folders on GCS (Google Cloud Storage), and I have some files and folders on Cloud Shell as well.
Now I want to download and upload these files and folders between GCS and Cloud Shell.
Are there any ways to do that?

Super Kai - Kazuya Ito
- 22,221
- 10
- 124
- 129
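For the Cloud Shell question above, gsutil (preinstalled and already authenticated in Cloud Shell) handles both directions; -r recurses into folders. The bucket and folder names are placeholders.

# GCS -> Cloud Shell
gsutil cp -r gs://my-bucket/some-folder .
# Cloud Shell -> GCS
gsutil cp -r ./local-folder gs://my-bucket/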