
I have installed Kubeflow via MiniKF on my laptop, and I am trying to run a CNN on MNIST data using TensorFlow 1.4.0. I am following this tutorial: https://codelabs.arrikto.com/codelabs/minikf-kale-katib-kfserving/index.html#0

The code is in a Jupyter Notebook server and runs completely fine. When I build a pipeline, it completes successfully. But at the "Serving" step, when I run the kfserver command on my model, it gets stuck at "Waiting for InferenceService."

[Screenshot of a successful run, where the process ends with the service being created.]


1 Answer


Being stuck at "Waiting for InferenceService" means that the pipeline never receives the hostname of the servable. You can validate this by running kubectl get inferenceservices: your new InferenceService should have its READY state set to True. If it does not, the deployment has failed.
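For example, a minimal check from the command line (the service name and namespace below are placeholders; use the ones from your own deployment):

    # List all InferenceServices and check the READY column
    kubectl get inferenceservices --all-namespaces

    # If READY is False, describe the service to see its failure conditions
    kubectl describe inferenceservice <service-name> -n <namespace>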

There can be many different reasons why the InferenceService is not in a ready state; there is a good troubleshooting guide in KFServing's repo.
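If the describe output is not enough, it is usually worth inspecting the pods behind the predictor. As a sketch, assuming a standard KFServing deployment where the model server runs in a container named kfserving-container, something like this can surface the underlying error (e.g. CrashLoopBackOff or ImagePullBackOff):

    # Find the predictor pod(s) for the failing service
    kubectl get pods -n <namespace> | grep <service-name>

    # Read the model server's logs (kfserving-container is the default
    # user-container name in KFServing predictor pods)
    kubectl logs <predictor-pod-name> -n <namespace> -c kfserving-container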

Theofilos Papapanagiotou