Is there a way to predict a large amount of data in bulk (about 100 MB) using the Google Prediction API? I see that it is possible to train a model using data stored on Google Cloud Storage. Is it also possible to make predictions based on a file stored on Google Cloud Storage?
1 Answer
Yes, that works. Here is a Python code sample:
import httplib2
from oauth2client.gce import AppAssertionCredentials
from googleapiclient.discovery import build

# Get predefined credentials (see https://cloud.google.com/sdk/gcloud/#gcloud.auth)
http = AppAssertionCredentials('https://www.googleapis.com/auth/prediction').authorize(httplib2.Http())

# Create a service object for the Prediction API
service = build('prediction', 'v1.6', http=http)

# Request body for a new model, pointing at the CSV training data stored in Google Cloud Storage
request_body = {'id': '<ModelID>', 'storageDataLocation': '<bucket_with_filename>'}

# Train the model
response = service.trainedmodels().insert(project='<ProjectID>', body=request_body).execute()
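
For the prediction side of the question, the v1.6 predict() call takes each input row inline (csvInstance) rather than a Cloud Storage path, so once training has finished you query the model row by row. A minimal sketch building on the service object above (the feature values 'value1' and 'value2' are placeholders, not part of the original answer):

# Poll training status, then request a prediction for one input row
status = service.trainedmodels().get(project='<ProjectID>', id='<ModelID>').execute()
if status.get('trainingStatus') == 'DONE':
    result = service.trainedmodels().predict(
        project='<ProjectID>', id='<ModelID>',
        body={'input': {'csvInstance': ['value1', 'value2']}}).execute()
    print(result)

Scoring a large file therefore means iterating over its rows and issuing one predict() call per row; the batch mode discussed in the comment below can at least bundle those calls.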

Alexander Graebe
insert() is to batch update (or create?) the model, not to request predictions. There appears to be a batch mode for the underlying HTTP API, but it's not clear to me from the documentation how best to use it from Python. https://cloud.google.com/prediction/docs/reference/v1.6/batch – Sean McCullough Sep 29 '15 at 17:15
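
The batch mode the comment refers to is exposed in the Python client library as new_batch_http_request(), which bundles several predict() calls into a single HTTP round trip. A minimal sketch, reusing the service object from the answer above and assuming the model has finished training; the local file name rows_to_score.csv and the IDs are placeholders, and a batch is capped at a limited number of calls, so a 100 MB input would still need to be split across multiple batches:

import csv

results = []

# The callback receives one response (or exception) per queued request
def handle_response(request_id, response, exception):
    results.append((request_id, response, exception))

# Queue one predict() call per input row in a single batch HTTP request
batch = service.new_batch_http_request(callback=handle_response)
with open('rows_to_score.csv') as f:
    for row in csv.reader(f):
        batch.add(service.trainedmodels().predict(
            project='<ProjectID>', id='<ModelID>',
            body={'input': {'csvInstance': row}}))

batch.execute()  # sends all queued predictions together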