
I want to submit a batch prediction job for a custom model (in my case a torch model, but I think that is irrelevant here). So I read the documentation: batch prediction from file-list

But since there are no examples, I cannot be sure what the schema of the JSON object that Vertex AI will send to my model will be. Has anyone made this work?

My best guess is that the request will have the following body:

{'instance' : <b64-encoded-content-of-the-file>}

But from reading the documentation (for other Vertex AI features), I could imagine the following body as well:

{'instance': {'b64' : <b64-encoded-content-of-the-file>}}

Does anybody actually know?

Another thing I did was build a 'fake model' that simply returns the request it receives. When I submit the batch prediction job it actually finishes successfully, but when I check the output file it is empty. So I need help, or more time to think of other ways to decipher the Vertex AI docs.
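For reference, this is roughly how I built the echo model. It is a minimal sketch, assuming a Flask service; the `/predict` and `/health` routes and the port are my choices and must match the `predictRoute`, `healthRoute`, and port configured in the container spec when the model is uploaded. Note that I return the received instances under the `predictions` key, since as far as I can tell Vertex AI only writes that field to the batch output file:

```python
# Minimal "echo" container for inspecting what Vertex AI sends.
# Assumptions: Flask, routes /predict and /health, port 8080 -- these
# must match the containerSpec registered with the model.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    # Vertex AI polls this route to decide the container is ready.
    return "ok", 200

@app.route("/predict", methods=["POST"])
def predict():
    body = request.get_json(force=True)
    instances = body.get("instances", [])
    # Echo every instance back as a "prediction" so the batch output
    # file records exactly what the service received.
    return jsonify({"predictions": instances})

# In the container, run with e.g.: gunicorn -b :8080 app:app
```

If the echoed instances are not returned under `predictions`, that might explain the empty output file, but I am not certain.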

Thanks in advance!

Vasil Yordanov

1 Answer


A Vertex AI custom container should wrap a service with a predict endpoint that receives a list of instances, each a JSON-serializable object:

{'instances': [{'b64' : <b64-encoded-content-of-the-file1>}, {'b64' : <b64-encoded-content-of-the-file2>}, ...]}
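A sketch of how the predict endpoint might unpack that body, assuming each instance carries one file's bytes base64-encoded under the `b64` key (the function name and the model call are placeholders, not part of the Vertex AI API):

```python
# Hedged sketch: decoding the request body shape shown above,
# {"instances": [{"b64": ...}, ...]}, back into raw file bytes.
import base64
import json

def decode_batch_instances(body_text):
    """Return the raw bytes of every file in a batch request body."""
    body = json.loads(body_text)
    files = []
    for instance in body["instances"]:
        # Each instance holds one file's contents, base64-encoded.
        files.append(base64.b64decode(instance["b64"]))
    return files

# Build a request body like the one Vertex AI would send, then decode it.
request_body = json.dumps({
    "instances": [
        {"b64": base64.b64encode(b"file one contents").decode("ascii")},
        {"b64": base64.b64encode(b"file two contents").decode("ascii")},
    ]
})
print(decode_batch_instances(request_body))
```

The decoded bytes would then be fed to the model, and the results returned under a `predictions` key with one entry per instance.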
Tsvi Sabo