
As per my understanding, online prediction works with JSON data. Currently I am running online prediction from localhost, where each image gets converted to JSON. The ML Engine API uses this JSON from localhost for prediction. Internally, the ML Engine API might be uploading the JSON to the cloud for prediction.

Is there any way to run online prediction on json files already uploaded to cloud bucket?

  • Do you mind clarifying what you mean by running "online prediction on local host"? Are you using `gcloud ml-engine local predict`? And is your question whether or not you can use files on GCS to use that command? – rhaertel80 Jul 27 '17 at 14:56

1 Answer


Internally, we parse the input directly from the request payload for serving; we do not store the requests on disk. Reading inputs from Cloud Storage is currently not supported for online prediction. You may consider using batch prediction instead, which reads data from files stored in a cloud bucket.

There is a small discrepancy between the online and batch inputs for a model that accepts only a single string input (which is probably your case). For online prediction you must base64-encode the image bytes and put them in a JSON file, while for batch prediction you need to pack the image bytes into records in TFRecord format and save them as tfrecord file(s). Other than that, the inputs are compatible.
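To illustrate the online-prediction side, here is a minimal sketch of building one JSON instance from raw image bytes. The input key `image_bytes` is an assumed alias; it must match whatever input name your model's serving signature actually uses.

```python
import base64
import json

# Placeholder image bytes; in practice, read these from your image file.
image_bytes = b"\x89PNG\r\n\x1a\n...example image data..."

# Online prediction: base64-encode the raw bytes and wrap them in a JSON
# instance. The {"b64": ...} wrapper tells the service to decode the value
# back into bytes server-side. "image_bytes" is an assumed input alias.
instance = {"image_bytes": {"b64": base64.b64encode(image_bytes).decode("utf-8")}}
json_line = json.dumps(instance)
print(json_line)

# For batch prediction, by contrast, the same raw bytes would instead be
# written as records into TFRecord file(s) on Cloud Storage (e.g. using
# tf.io.TFRecordWriter), with no base64/JSON step.
```

Each such JSON line corresponds to one instance in the online prediction request body.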

yxshi