Use this tag for questions about the Python client libraries and API clients for Google Cloud Platform.
Questions tagged [google-cloud-python]
207 questions
1
vote
3 answers
Using add_done_callback() and response.metadata for AutoML
I'm trying to use a standard example for AutoML. I would expect create_model to launch a long-running operation that updates the operation response once it's done, and to then access the metadata (to get the model_id of the newly trained…

superneutrino
- 23
- 5
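The long-running operation returned by create_model is built on the same callback mechanism as concurrent.futures, so the pattern can be sketched with the standard library alone (the AutoML client itself is only hinted at in comments, and the model-id string below is a placeholder; note that on the real Operation, metadata is a property, not a method):

```python
from concurrent import futures

results = []

def on_done(fut):
    # With a real google.api_core.operation.Operation you would read
    # fut.metadata (a property, not a method) to get e.g. the model id.
    results.append(fut.result())

# response = client.create_model(parent, my_model)  # returns an Operation
# response.add_done_callback(on_done)
# Simulated here with a plain future:
executor = futures.ThreadPoolExecutor(max_workers=1)
future = executor.submit(lambda: "trained-model-id")
future.add_done_callback(on_done)
executor.shutdown(wait=True)  # by now the callback has fired
print(results)
```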
1
vote
1 answer
How to query Spanner and get metadata, especially column names?
I'm trying to run custom SQL on Spanner and convert the results into a pandas DataFrame, so I need both the data and the column names, but I can't find a way to get the column names.
According to the documentation, I can get columns using metadata or fields…

Adamo Figueroa
- 330
- 2
- 14
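A common gotcha here: the result set's metadata is only populated after the rows have been consumed. The real client calls are shown as comments (an assumption based on google-cloud-spanner's API); the column-extraction step itself is simulated below with plain objects so it is runnable:

```python
# Real usage (assumption; the StreamedResultSet exposes .fields only
# after the stream has been consumed):
#     with database.snapshot() as snapshot:
#         results = snapshot.execute_sql("SELECT id, name FROM t")
#         rows = list(results)                  # consume the stream first
#         cols = [field.name for field in results.fields]
# Simulated stand-ins for the field objects:
class Field:
    def __init__(self, name):
        self.name = name

fields = [Field("id"), Field("name")]
rows = [(1, "a"), (2, "b")]
cols = [field.name for field in fields]
records = [dict(zip(cols, row)) for row in rows]  # ready for pd.DataFrame(records)
print(records)
```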
1
vote
1 answer
Unable to authenticate using a Google Cloud service account key created by the Python API
The sample below demonstrates a failure to authenticate to a Google service account using a key created just a few lines earlier via the Python API.
I was not able to find any documentation on how these programmatically created keys can be used.
The keys created by…

R. Simac
- 737
- 9
- 9
1
vote
1 answer
HappyBase Table.cells not returning any result
Below is my code:
from google.cloud import bigtable

# Create a Cloud Bigtable client.
client = bigtable.Client(project=config.GCP_PROJECT_ID)
# Connect to an existing Cloud Bigtable instance.
instance = client.instance(config.BIGTABLE_INSTANCE_ID)
# Open an existing…

Hana Alaydrus
- 2,225
- 16
- 19
1
vote
0 answers
Bigquery, save clusters of clustered table to cloud storage
I have a bigquery table that's clustered by several columns, let's call them client_id and attribute_id.
What I'd like is to submit one job or command that exports that table data to cloud storage, but saves each cluster (so each combination of…

blaineh
- 2,263
- 3
- 28
- 46
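As far as I know, BigQuery has no single job that writes one output per cluster, so the usual workaround is one filtered export per (client_id, attribute_id) combination. A sketch using EXPORT DATA statements (table and bucket names are placeholders, and the pair list would really come from a SELECT DISTINCT query):

```python
# Hypothetical names throughout (ds.t, gs://my-bucket). One EXPORT DATA
# statement per cluster combination; each runs as its own query job.
pairs = [("c1", "a1"), ("c1", "a2"), ("c2", "a1")]
template = (
    "EXPORT DATA OPTIONS (uri='gs://my-bucket/{c}/{a}/*.csv', format='CSV') AS "
    "SELECT * FROM ds.t WHERE client_id = '{c}' AND attribute_id = '{a}'"
)
statements = [template.format(c=c, a=a) for c, a in pairs]
# for sql in statements:
#     client.query(sql).result()   # google.cloud.bigquery client, omitted
print(len(statements))
```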
1
vote
0 answers
bigquery.Client().extract_table() does not (always) divide a big table into small CSV files
My Python application needs to export BigQuery tables into small CSV files in GCS (like smaller than 1GB).
I referred to the documentation and wrote the following code:
from google.cloud import…

Taichi
- 2,297
- 6
- 25
- 47
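The detail that usually explains this: a single destination URI forces BigQuery to write one file (subject to the 1 GB export limit), while a `*` wildcard lets it shard the output across multiple files. A sketch (project, dataset, and bucket names are placeholders; the client calls are commented out since they need real credentials):

```python
# Sketch of a sharded extract:
#     from google.cloud import bigquery
#     client = bigquery.Client()
#     job = client.extract_table(
#         "my-project.my_dataset.my_table",
#         "gs://my-bucket/export-*.csv",   # '*' enables multi-file output
#     )
#     job.result()
# BigQuery replaces '*' with a zero-padded 12-digit shard index:
shard_name = "export-*.csv".replace("*", "{:012d}".format(0))
print(shard_name)
```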
1
vote
1 answer
How do I write a Cloud Function to receive, parse, and publish PubSub messages?
This can be considered a follow-up to this thread, but I need more help with moving things along. Hopefully someone can have a look over my attempts below and provide further guidance.
To summarize, I need a cloud function that
Is triggered by a…

Larry Cai
- 881
- 1
- 11
- 24
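The receive-and-parse half of this is standard-library work: background Pub/Sub triggers deliver the message payload base64-encoded under event['data']. A runnable sketch (the function and message names are made up, and the re-publish step is only indicated in comments because it needs real credentials):

```python
import base64
import json

def handle_pubsub(event, context=None):
    # Pub/Sub-triggered Cloud Functions receive the payload base64-encoded
    # in event['data']; decode it, then parse the JSON body.
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    # Re-publishing would go here, e.g. (assumption, omitted):
    #     publisher = pubsub_v1.PublisherClient()
    #     publisher.publish(topic_path, json.dumps(payload).encode("utf-8"))
    return payload

# Simulate what the trigger would hand the function:
event = {"data": base64.b64encode(json.dumps({"msg": "hi"}).encode()).decode()}
print(handle_pubsub(event))
```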
1
vote
1 answer
Creating a new project in Google Cloud using python without service account credentials
I am aiming to build a Pythonic, automated Google Cloud project manager, just for testing a bunch of TensorFlow models and such. Even though I can fully access training, deploying and testing models inside a project, I can't make any new projects since I…

Irribarra Cristián
- 107
- 13
1
vote
0 answers
Unable to run Google Cloud Python Pubsub examples
I'm running Python 2.7.14 on a Windows machine, and I've been trying to set up a simple Pub/Sub sample using the Google Cloud Python library, but for some reason none of the examples I've tried actually work. The script I've got is:…

Ciaran Blewitt
- 11
- 3
1
vote
0 answers
How do I limit the number of threads on Publisher in Google Pub/Sub?
I am currently using Google Pub/Sub with the Google Pub/Sub Python library. I can't find a way to limit the number of threads spawned on my machine, which uses most of my resources.
Here's my code:
def publish(data):
def…

Dinesh
- 11
- 1
1
vote
1 answer
App Engine Python, Standard Environment, using time.sleep
I read that using time.sleep counts against some CPU quota. I had never heard of such a quota before, nor could I find anything about it in the docs. Can someone enlighten me please?
EDIT: Since the other questions are 8 years old and I couldn't find anything in the…

aydunno
- 91
- 10
1
vote
1 answer
Google Cloud BigQuery library error
I am receiving the error
Cannot set destination table in jobs with DDL statements
when I try to resubmit a job from the job.build_resource() function in the google.cloud.bigquery library.
It seems that the destination table is set to something…

dillon
- 125
- 2
- 2
- 6
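The error message suggests the resubmitted resource still carries the destinationTable that the service filled in on the original run, which DDL jobs reject. One plausible fix is to strip that field before resubmitting; the resource dict below is a minimal hand-made example mirroring the REST API shape, not real build_resource() output:

```python
# Minimal hand-made job resource in the REST API shape (assumption):
resource = {
    "configuration": {
        "query": {
            "query": "CREATE TABLE ds.t AS SELECT 1 AS x",
            "destinationTable": {
                "projectId": "p", "datasetId": "ds", "tableId": "t",
            },
        }
    }
}
# Drop the server-populated destination before resubmitting the DDL job:
resource["configuration"]["query"].pop("destinationTable", None)
print("destinationTable" in resource["configuration"]["query"])
```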
1
vote
1 answer
Google Cloud storage bucket CORS configuration not getting set using python client
The CORS configuration for a Cloud Storage bucket is not getting set using Python.
I am following the steps given in: http://google-cloud-python.readthedocs.io/en/latest/storage/buckets.html#google.cloud.storage.bucket.Bucket.cors
That is,
>>> policies =…

Ashley Thomas
- 77
- 2
- 8
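A likely cause worth checking: assigning to bucket.cors only changes the local object, and an explicit update() (or patch()) call is needed to persist it. A sketch with the client calls commented out (the bucket name is a placeholder):

```python
# A CORS policy in the shape the storage library expects:
policy = [{
    "origin": ["*"],
    "method": ["GET", "HEAD"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3600,
}]
# from google.cloud import storage
# bucket = storage.Client().get_bucket("my-bucket")
# bucket.cors = policy
# bucket.update()   # without this, the change never reaches the API
print(policy[0]["maxAgeSeconds"])
```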
1
vote
1 answer
Retrieving Firebase database data from a Google Cloud YAML Python app
I'm building a Google Cloud function which runs every minute. The purpose is to fetch data from my database (Firebase) and to delete a post once it is more than one hour old.
I was thinking about making a firebase cloud…

Florian Birolleau
- 313
- 4
- 15
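The age check at the heart of this is easy to isolate and test; the actual deletion call is only indicated in a comment because it depends on the Firebase Admin SDK and real credentials (the path below is a made-up example):

```python
import time

ONE_HOUR = 3600

def expired(created_at, now=None):
    # A post is deletable once it is more than one hour old.
    now = time.time() if now is None else now
    return now - created_at > ONE_HOUR

now = 1_000_000_000
posts = {"old": now - 7200, "fresh": now - 60}  # post_id -> epoch seconds
to_delete = sorted(k for k, ts in posts.items() if expired(ts, now=now))
# Actual deletion would use the Firebase Admin SDK, e.g. (assumption):
#     db.reference("posts/" + post_id).delete()
print(to_delete)
```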
1
vote
2 answers
How is ndb (and cloud datastore) being used in the firebase tic-tac-toe example
In the google app engine firebase tic-tac-toe example here: https://cloud.google.com/solutions/using-firebase-real-time-events-app-engine
ndb is used to create the Game data model. This model is used in the code to store the state of the tic-tac-toe…

BrainPermafrost
- 644
- 2
- 7
- 20