
Hi, I want to set up Kubeflow's NVIDIA TensorRT Inference Server with a model repository located in MinIO.

I don't know how to change gs://inference-server-model-store/tf_model_store so that it points at MinIO instead.

ks init my-inference-server
cd my-inference-server
ks registry add kubeflow https://github.com/kubeflow/kubeflow/tree/master/kubeflow
ks pkg install kubeflow/nvidia-inference-server
ks generate nvidia-inference-server iscomp --name=inference-server --image=nvcr.io/nvidia/tensorrtserver:19.04-py3 --modelRepositoryPath=gs://inference-server-model-store/tf_model_store

1 Answer


It looks like the NVIDIA TensorRT Inference Server documentation on Kubeflow has not been updated yet.

  • Ksonnet is being deprecated after the Heptio acquisition, so Kubeflow is moving to Kustomize.

  • If you switch to Kustomize, you can configure the S3 API with Kubeflow as described here. You should then be able to use s3://bucket/object paths for your models.
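As a rough sketch of what that could look like: the inference server reads S3 credentials from the standard AWS environment variables, and recent server releases accept a MinIO endpoint embedded in the repository path as s3://host:port/bucket/path (check that your image version supports S3 model stores). The service name, port, bucket, and secret names below are assumptions, not values from the Kubeflow docs:

```yaml
# Hypothetical container spec fragment pointing the model
# repository at a MinIO bucket over the S3 API.
# Service name, port, and secret names are assumptions.
containers:
  - name: inference-server
    image: nvcr.io/nvidia/tensorrtserver:19.04-py3
    args:
      - trtserver
      # MinIO endpoint embedded in the S3 path: s3://host:port/bucket/path
      - --model-store=s3://minio-service.kubeflow:9000/inference-server-model-store/tf_model_store
    env:
      # The server picks up S3 credentials from the standard AWS variables.
      - name: AWS_ACCESS_KEY_ID
        valueFrom:
          secretKeyRef:
            name: minio-secret
            key: accesskey
      - name: AWS_SECRET_ACCESS_KEY
        valueFrom:
          secretKeyRef:
            name: minio-secret
            key: secretkey
```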

Nitish T