Questions tagged [gcs]

237 questions
2
votes
1 answer

Google BigQuery TableResult getValues() vs iterateAll()

Option 1: TableResult results = bigquery.query(queryConfig); while(results.hasNextPage()) { results = results.getNextPage(); for (FieldValueList row : results.getNextPage().getValues()) { //Print row } } Option…
1stenjoydmoment
  • 229
  • 3
  • 14
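The question above is about the Java client, where iterating page-by-page with getNextPage() inside the loop skips pages. A hedged Python analogue of the same choice, using the google-cloud-bigquery client (the client object is passed in; fetch_all and fetch_page_by_page are illustrative names): iterating the result object pages transparently (like iterateAll()), while .pages walks one page per fetch (like getValues() + getNextPage()).

```python
from typing import List

def fetch_all(client, sql: str) -> List[dict]:
    """Collect every row; iterating the result object
    transparently fetches each page (like iterateAll())."""
    rows = client.query(sql).result()  # row iterator
    return [dict(row) for row in rows]

def fetch_page_by_page(client, sql: str) -> List[dict]:
    """Same rows, but walking pages explicitly
    (like getValues() + getNextPage())."""
    result = client.query(sql).result()
    out: List[dict] = []
    for page in result.pages:  # one fetch per page
        out.extend(dict(row) for row in page)
    return out
```

In practice the direct iteration form is both simpler and harder to get wrong, since the paging state lives inside the iterator.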
2
votes
0 answers

Is it possible to generate one signed URL for a folder and upload multiple files into it?

Is it possible to use a GCS signed URL like "{gcs_domain}/anybucket/folder1/?{gcs_headers}" to upload multiple different files (with one or more requests, it doesn't matter)? So that it finally gives: -folder1 --file1.txt --file2.txt ... Or the…
Xin Wang
  • 21
  • 1
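A signed URL is signed over a single object name, so there is no supported way to sign one URL for a whole "folder" prefix; the usual workaround is to mint one PUT URL per file (or use a signed POST policy document). A sketch assuming the google-cloud-storage client; the bucket object and prefix are illustrative:

```python
import datetime

def object_names(prefix: str, filenames):
    """Build per-object names under a 'folder' prefix
    (GCS has no real folders, only name prefixes)."""
    prefix = prefix.rstrip("/")
    return [f"{prefix}/{name}" for name in filenames]

def signed_put_urls(bucket, prefix, filenames, minutes=15):
    """One V4 signed PUT URL per file; no single signed URL
    can cover a whole prefix."""
    urls = {}
    for obj in object_names(prefix, filenames):
        blob = bucket.blob(obj)
        urls[obj] = blob.generate_signed_url(
            version="v4",
            expiration=datetime.timedelta(minutes=minutes),
            method="PUT",
        )
    return urls
```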
2
votes
1 answer

Is BigQuery Data Transfer Service for Cloud Storage Transfer possible to implement using Terraform?

This is more of a question about the possibility because there is no documentation on GCP regarding this. I am using BigQuery DTS for moving my CSVs from a GCS bucket to BQ Table. I have tried it out manually and it works but I need some automation…
Sam
  • 161
  • 1
  • 9
1
vote
0 answers

On Dataproc simple spark job is very slow

I'm facing slow performance issues with my Spark jobs running on Dataproc when reading data from Google Cloud Storage (GCS) as parquet files. I have run two experiments with different data sizes and observed significant latency. Here are the…
1
vote
3 answers

How to enable SFTP for a GCS Bucket

A business partner wants to drop files for us via SFTP, and we want those files on a GCS bucket. As far as I can tell, GCS doesn't support SFTP access, but there's a pre-GA connector, described here. If I understand the doc correctly, this connector…
Yanay Lehavi
  • 166
  • 11
1
vote
0 answers

pyarrow fails to create ParquetFile from blob in Google Cloud Storage

This issue is tricky and hard to explain clearly, because it is not quite deterministic, and the code can't be listed in full. There are a couple of variations; some work and others don't, and I can't see the difference in the code that has a bearing…
zpz
  • 354
  • 1
  • 3
  • 16
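One common cause of non-deterministic pyarrow failures on GCS blobs is handing ParquetFile a non-seekable stream: the parquet footer lives at the end of the file, so the reader needs random access. A hedged sketch that downloads the whole object and wraps it in a seekable buffer (blob is a google-cloud-storage Blob; pyarrow is imported lazily so only the real call needs it):

```python
import io

def as_seekable(data: bytes) -> io.BytesIO:
    """Wrap raw bytes in a seekable, rewound buffer."""
    buf = io.BytesIO(data)
    buf.seek(0)
    return buf

def parquet_from_blob(blob):
    """Open a GCS object as a ParquetFile. Parquet readers need
    a seekable stream (the footer is at the end of the file),
    so a plain download stream is not enough."""
    import pyarrow.parquet as pq  # deferred: only needed here
    return pq.ParquetFile(as_seekable(blob.download_as_bytes()))
```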
1
vote
1 answer

How to Add text qualifier and Escape char while Doing Bigquery Export

I'm trying to export one table into a CSV file on GCS using the query below: EXPORT DATA OPTIONS ( uri = 'gs://Filepath/Filename*.csv', format = 'CSV', OVERWRITE = TRUE, header = TRUE, field_delimiter = ',') AS ( SELECT {Columns} FROM…
1
vote
1 answer

How can this code make the Dataflow streaming job listen only to newly inserted files in the input pattern, rather than both new and existing files

I want this code to pick up only the newly added files matching the input pattern and write them to a BQ table; as it stands it processes both existing and new files, not only the new…
1
vote
0 answers

Google Cloud Storage: how to show the list of deleted blobs

I am trying to list the blobs deleted in a bucket when versioning is enabled in Google Cloud Storage. I am using this Python code: def bucketVersionHistory(self,bucket_name,serviceAccount): os.environ['GOOGLE_APPLICATION_CREDENTIALS']…
user1056388
  • 15
  • 1
  • 6
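With object versioning enabled, list_blobs(versions=True) returns noncurrent generations too, and a generation that has been deleted carries a time_deleted timestamp; filtering on that field gives the "deleted list". A sketch assuming the google-cloud-storage client; the filter itself is pure and works on any blob-like objects:

```python
def deleted_blobs(blobs):
    """Keep only generations that have been deleted
    (time_deleted is set once a generation is removed)."""
    return [b for b in blobs if getattr(b, "time_deleted", None) is not None]

def list_deleted(client, bucket_name):
    # versions=True includes noncurrent/deleted generations
    blobs = client.list_blobs(bucket_name, versions=True)
    return [(b.name, b.generation, b.time_deleted) for b in deleted_blobs(blobs)]
```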
1
vote
0 answers

How can we host a static website using a global HTTPS load balancer and CDN while keeping the bucket private in GCP?

I have hosted a website (index.html) in a GCS bucket, keeping the bucket private and not accessible from the internet. I have also configured a global HTTPS load balancer with Cloud CDN enabled, but when I try to hit the load balancer's IP I get an access denied error. Although I…
1
vote
1 answer

How to resolve an 'Unable to get public no-arg constructor' error while trying to push data to GCS and load it into BigQuery?

I've set up a pyspark session and provided it specific configuration settings based on what I've read: self.spark_session = SparkSession.builder.appName( "Example Session" ).config("spark.jars",…
OpenDataAlex
  • 1,375
  • 5
  • 19
  • 39
1
vote
1 answer

Chunked GCS uploads with the Go SDK?

I'm trying to upload large files using the GCS writer: bucketHandle := m.Client.Bucket(bucket) objectHandle := bucketHandle.Object(path) writer := objectHandle.NewWriter(context.Background()) then for chunks of size N I call…
user38643
  • 341
  • 1
  • 7
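In the Python client the analogous knob is blob.chunk_size, which switches the upload to the resumable protocol in fixed-size chunks; GCS requires the chunk size to be a multiple of 256 KiB (the Go writer exposes a similar ChunkSize field on storage.Writer). A hedged sketch; upload_chunked and its parameters are illustrative names:

```python
CHUNK_QUANTUM = 256 * 1024  # resumable uploads use multiples of 256 KiB

def round_chunk_size(requested: int) -> int:
    """Round a requested chunk size up to the nearest valid multiple."""
    if requested <= 0:
        return CHUNK_QUANTUM
    return ((requested + CHUNK_QUANTUM - 1) // CHUNK_QUANTUM) * CHUNK_QUANTUM

def upload_chunked(bucket, object_name, fileobj, chunk_size=8 * 1024 * 1024):
    """Upload via the resumable protocol in fixed-size chunks."""
    blob = bucket.blob(object_name)
    blob.chunk_size = round_chunk_size(chunk_size)
    blob.upload_from_file(fileobj)
```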
1
vote
0 answers

How do I unzip a .gz file in google cloud storage bucket?

I modified the code from this thread ("How do I unzip a .zip file in google cloud storage?") to unzip a .gz file, but I'm getting the following error: AttributeError: 'GzipFile' object has no attribute 'namelist' My code is as follows: from…
AldanaBRZ
  • 87
  • 5
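namelist() is a zipfile API; a .gz file is a single compressed stream with no member list, so the fix is simply to decompress the bytes. The transform itself is stdlib-only (downloading and re-uploading the blob around it would use the google-cloud-storage client, with hypothetical object names):

```python
import gzip

def gunzip_bytes(data: bytes) -> bytes:
    """Decompress a .gz payload; unlike a .zip archive there is
    no member list, just one compressed stream."""
    return gzip.decompress(data)

def gunzip_blob(bucket, src_name, dst_name):
    """Download a .gz object, decompress it, upload the result."""
    data = gunzip_bytes(bucket.blob(src_name).download_as_bytes())
    bucket.blob(dst_name).upload_from_string(data)
```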
1
vote
1 answer

How to load fonts from GCS

I want to load fonts from Google Storage. I've tried two ways, but neither of them works. Any pointers? I'd appreciate any advice. First, I followed the instruction load_font_from_gcs(uri) given in the answer here, but I received a NameError:…
april
  • 53
  • 1
  • 7
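load_font_from_gcs is not a library function; the NameError means it was a helper defined in that answer, so it has to be written out. A sketch assuming Pillow and google-cloud-storage (both imported lazily); the gs:// parsing part is pure stdlib:

```python
import io
from urllib.parse import urlparse

def parse_gs_uri(uri: str):
    """Split gs://bucket/path/to/font.ttf into (bucket, object name)."""
    parts = urlparse(uri)
    if parts.scheme != "gs":
        raise ValueError(f"not a gs:// URI: {uri}")
    return parts.netloc, parts.path.lstrip("/")

def load_font_from_gcs(uri: str, size: int = 24):
    """Download a font object and hand its bytes to Pillow."""
    from google.cloud import storage  # deferred imports: only
    from PIL import ImageFont         # needed for the real call
    bucket_name, blob_name = parse_gs_uri(uri)
    data = storage.Client().bucket(bucket_name).blob(blob_name).download_as_bytes()
    return ImageFont.truetype(io.BytesIO(data), size)
```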
1
vote
2 answers

How to use delta live table with google cloud storage

[Cross-posting from the Databricks community: link] I have been working on a POC exploring Delta Live Tables with a GCS location. I have some doubts: how do we access the GCS bucket? We have to establish a connection using a Databricks service account. In a…