
I've used mlflow.pyfunc.log_model and was able to get model inference with it, but not with mlflow.pytorch.log_model. The error was: "Verify that the serialized input Dataframe is compatible with the model for inference."

    import json

    import requests
    import torch

    data = torch.randn(10, 3, 224, 224)  # shape: [bs, channel, size, size]
    model_input = {
        "inputs": {
            "x": data.tolist()
        }
    }
    request = json.dumps(model_input)
    headers = {"content-type": "application/json"}
    response = requests.post(URL, data=request, headers=headers)  # to the MLflow scoring server
    response = response.json()
    print(response)
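For context, here is a stdlib-only sketch of how that payload round-trips through JSON (with the tensor shrunk to 2×3×4 and plain nested lists standing in for torch; the remark about the DataFrame conversion is my assumption about MLflow's failure mode, not something verified here):

```python
import json

# Small stand-in for data.tolist() from the question: shape [2, 3, 4]
# instead of the real [10, 3, 224, 224].
data = [[[0.0] * 4 for _ in range(3)] for _ in range(2)]
payload = json.dumps({"inputs": {"x": data}})

# The scoring server decodes the JSON back into nested lists. My guess
# (an assumption, not verified) is that without a tensor signature the
# pyfunc server tries to coerce this structure into a pandas DataFrame,
# which is where the "serialized input Dataframe" error would surface.
decoded = json.loads(payload)["inputs"]["x"]
print(len(decoded), len(decoded[0]), len(decoded[0][0]))  # → 2 3 4
```

So the payload itself survives serialization intact; the difference between the two flavours must lie in how the server interprets it.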

It is the very same input to the model, yet I get inference with one flavour but not the other. Am I missing something here? I would like to use mlflow.pytorch.log_model so I don't have to write a custom model wrapper for mlflow.pyfunc.log_model just for generalisation.

Can anyone help me with this, please?

Eusto