
I am struggling to find an answer, and every example uses TensorFlow. I am trying to load a saved_model, optimized with TensorRT, without TensorFlow.

I performed training with TensorFlow, then optimized and saved the model with TensorRT.

Now, on a different machine, which only has TensorRT installed, I would like to load the model and run inference. The goal of not having TensorFlow on that machine is to save build time and disk space.

Is it at least possible? If not, and I convert the model to ONNX, can I load and run it with TensorRT alone?

Thank you in advance.

1 Answer


The requirements of TensorRT contain TensorFlow, so no.

Minh-Long Luu
  • Ok, but when I run this container: docker run --gpus all -it nvcr.io/nvidia/tensorrt:22.09-py3, I cannot import tensorflow in my script, though I can import tensorrt. And all I can find in my searches is how to load the model with TensorFlow (which I cannot import). Is there a way to import my .pb model with tensorrt? – Simon Nathan Mar 17 '23 at 21:32
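  • For what it's worth, that container also ships the `trtexec` command-line tool. TensorRT cannot parse a TensorFlow `.pb` directly, but assuming the model has first been exported to ONNX (e.g. with tf2onnx on the training machine, where TensorFlow is available), a sketch of the workflow might look like this; the file names are hypothetical:

    ```shell
    # Build and serialize a TensorRT engine from an ONNX file
    trtexec --onnx=model.onnx --saveEngine=model.engine

    # Later, load the prebuilt engine and benchmark inference
    trtexec --loadEngine=model.engine
    ```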