Questions tagged [tensorrt-python]
18 questions
2
votes
1 answer
TRT inference using onnx - Error Code 1: Cuda Driver (invalid resource handle)
Currently I'm trying to convert a given ONNX file to a TensorRT file and run inference on the generated TensorRT engine.
To do so, I used the TensorRT Python binding API, but
"Error Code 1: Cuda Driver (invalid resource handle)" occurs and there is no kind…

happychild
- 83
- 7
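For questions like this one, a minimal sketch of the usual TensorRT 8.x Python flow may help frame answers: build an engine from ONNX, then allocate buffers and run inference in the same CUDA context (mixing contexts is a common trigger for "invalid resource handle"). The paths, shapes and pycuda usage below are illustrative assumptions, not the asker's code.

import numpy as np
import pycuda.autoinit          # creates one CUDA context and keeps it current
import pycuda.driver as cuda
import tensorrt as trt

LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path):
    builder = trt.Builder(LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, LOGGER)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))
    config = builder.create_builder_config()
    serialized = builder.build_serialized_network(network, config)
    return trt.Runtime(LOGGER).deserialize_cuda_engine(serialized)

engine = build_engine("model.onnx")                        # placeholder path
context = engine.create_execution_context()

# Allocate buffers and execute in the same CUDA context and stream.
inp = np.random.rand(1, 3, 224, 224).astype(np.float32)   # placeholder shape
out = np.empty((1, 1000), dtype=np.float32)                # placeholder shape
d_inp, d_out = cuda.mem_alloc(inp.nbytes), cuda.mem_alloc(out.nbytes)
stream = cuda.Stream()
cuda.memcpy_htod_async(d_inp, inp, stream)
context.execute_async_v2([int(d_inp), int(d_out)], stream.handle)
cuda.memcpy_dtoh_async(out, d_out, stream)
stream.synchronize()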
2
votes
1 answer
Unable to install python3-libnvinfer package, unmet dependencies
I'm trying to install python3-libnvinfer-dev for TensorRT (TensorFlow). I have Ubuntu 22.04 with Python 3.10.4, but I use Anaconda's Python 3.9.7. I have NVIDIA 510 drivers, CUDA 11.6 and cuDNN 8. The issue is, when I run sudo apt-get install…

awhitesong
- 105
- 2
- 11
1
vote
1 answer
Convert a TensorRT engine file back to the source ONNX file, or PyTorch model weights
I wanted to explore possible options for model conversions.
Converting a PyTorch model to ONNX is pretty straightforward.
After that, is it possible to convert an ONNX model file into a TensorRT engine file using the TensorRT Python API?
I wanted to get…

Homagni Saha
- 11
- 1
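The reverse direction asked about here is not something the TensorRT API offers: an engine is a compiled, hardware-specific artifact and cannot be turned back into ONNX or PyTorch weights. The forward direction (ONNX to an engine file) is available from the Python API. A minimal sketch, assuming TensorRT 8.x; the file names are placeholders, not from the question:

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("model.onnx", "rb") as f:        # placeholder ONNX path
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
engine_bytes = builder.build_serialized_network(network, config)
with open("model.engine", "wb") as f:      # placeholder engine path
    f.write(engine_bytes)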
1
vote
0 answers
Test the .trt file using TensorFlow
In the directory output_saved_model_dir below, I have a .trt file named final_model_gender_classification_gpu0_int8.trt
output_saved_model_dir='/home/cocoslabs/Downloads/age_gender_trt'
saved_model_loaded =…

Pavithran
- 41
- 3
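If the directory is a TF-TRT converted SavedModel (rather than a standalone serialized engine), it can be loaded back with TensorFlow's SavedModel API. A minimal sketch; the tags and the dummy input shape are assumptions, and the path is the one from the question:

import numpy as np
import tensorflow as tf
from tensorflow.python.saved_model import tag_constants

output_saved_model_dir = '/home/cocoslabs/Downloads/age_gender_trt'
saved_model_loaded = tf.saved_model.load(
    output_saved_model_dir, tags=[tag_constants.SERVING])
infer = saved_model_loaded.signatures['serving_default']

dummy = tf.constant(np.zeros((1, 224, 224, 3), dtype=np.float32))  # placeholder shape
print(infer(dummy))

A bare .trt engine file, by contrast, is loaded with the TensorRT runtime rather than TensorFlow.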
1
vote
0 answers
TensorRT build engine gives error with static input dimensions
I am trying to build a CUDA engine using static dimensions, referring to this documentation: https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html
However, I keep seeing the below error:
[TensorRT] ERROR: (Unnamed Layer* 249)…

pree
- 2,297
- 6
- 37
- 55
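When an ONNX model carries dynamic (-1) dimensions, one way to build with fully static dimensions is to pin the network input shape before building. A minimal sketch, assuming TensorRT 8.x; the path and shape are placeholders, not the asker's:

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("model.onnx", "rb") as f:            # placeholder path
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

network.get_input(0).shape = (1, 3, 224, 224)  # pin a static shape (placeholder)

config = builder.create_builder_config()
engine_bytes = builder.build_serialized_network(network, config)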
0
votes
1 answer
importing tensorrt gives module not found error
Importing tensorrt gives a "module not found" error. Here are some commands I ran on my terminal. I am working on the Jetson Xavier NX developer kit. TensorRT is installed by default with the JetPack flash.
Python version:3.8
tensorrt version:…

AI Pro
- 1
0
votes
0 answers
Can't use the NVIDIA GPU on an Ubuntu server; TensorFlow ends with "Skipping registering GPU devices"
I tried to run a simple TensorFlow verification script.
But it always fails to use my NVIDIA GPU; "Skipping registering GPU devices..." always appears, and in the end the script runs on the CPU.
I expect it to show, as on my MacBook: "Plugin optimizer for device_type…

Kao-Yuan Lin
- 65
- 1
- 6
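For this kind of report, the usual first check is whether TensorFlow sees the GPU at all; if the driver/CUDA stack is broken, the list below comes back empty and execution silently falls back to the CPU. A minimal verification sketch (not the asker's script):

import tensorflow as tf

# An empty list here means TensorFlow could not register the GPU
# (driver / CUDA / cuDNN version mismatch is the usual culprit).
print(tf.config.list_physical_devices('GPU'))
print(tf.sysconfig.get_build_info().get('cuda_version'))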
0
votes
0 answers
How to do explicit quantization with TensorRT by setting weights, biases and scales?
I have done the following steps as inputs to the problem:
trained an MNIST model using TensorFlow 2.11 (see link below)
made the model Quantization Aware (QA) using tfmot.quantization.keras.quantize_model
trained the QA model a bit extra to adapt to…

SpaceKees
- 31
- 3
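For context on the steps listed above, this is roughly what the quantization-aware part looks like with tensorflow-model-optimization; the tiny MNIST-style model and the commented training call are placeholders, not the asker's code:

import tensorflow as tf
import tensorflow_model_optimization as tfmot

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10),
])

# Wrap the model so fake-quantization nodes record scales during fine-tuning.
qat_model = tfmot.quantization.keras.quantize_model(model)
qat_model.compile(
    optimizer='adam',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=['accuracy'])
# qat_model.fit(x_train, y_train, epochs=1)   # brief extra training, as described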
0
votes
1 answer
Load a TensorRT model without TensorFlow
I am struggling to find an answer, and every example uses TensorFlow.
I am trying to load a saved_model, optimized with TensorRT, without TensorFlow.
I performed training with TensorFlow, and optimized and saved the model with TensorRT.
Now, on a…
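If the optimized artifact is (or can be exported as) a serialized TensorRT engine/plan file, it can be deserialized with the TensorRT runtime alone, with no TensorFlow import; a TF-TRT SavedModel directory, by contrast, still requires TensorFlow to load. A minimal sketch; the file name is a placeholder:

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
with open("model.plan", "rb") as f:     # placeholder path to a serialized engine
    engine = trt.Runtime(logger).deserialize_cuda_engine(f.read())
context = engine.create_execution_context()
# Inference then follows the plain TensorRT + pycuda flow; TensorFlow is never imported.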
0
votes
1 answer
Modules lost after upgrading to Python 3.11
I just installed Python 3.11 and noticed modules are missing.
I am working on ARM, on an NVIDIA Jetson Xavier AGX with JetPack 5.0.2; it comes with Ubuntu 20.04 and Python 3.8.
In Python 3.8 I can run import tensorrt, but in Python 3.11 I get No module named…

Joachim Spange
- 85
- 1
- 1
- 10
0
votes
1 answer
cuMemcpyHtoDAsync failed: invalid argument by using TensorRT (Python)
I am trying to copy an np array to the GPU using TensorRT in Python but I keep getting the error 'cuMemcpyHtoDAsync failed: invalid argument'. The array has the correct format (float32) and size, but the error remains. Does anyone have an idea of…

Peter T.
- 13
- 2
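With pycuda, "cuMemcpyHtoDAsync failed: invalid argument" can come down to the host array not being C-contiguous or the device allocation not matching its byte size. A minimal sketch of a copy that avoids both pitfalls; the shape is a placeholder, not the asker's:

import numpy as np
import pycuda.autoinit
import pycuda.driver as cuda

host = np.ascontiguousarray(
    np.random.rand(1, 3, 224, 224).astype(np.float32))   # placeholder shape
device = cuda.mem_alloc(host.nbytes)    # allocate exactly host.nbytes on the GPU
stream = cuda.Stream()

cuda.memcpy_htod_async(device, host, stream)
stream.synchronize()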
0
votes
1 answer
TensorRT installation
I am trying to install TensorRT on Windows 11. I have installed the necessary CUDA and cuDNN versions, but it says that the system is not supported. So, is TensorRT compatible with Windows 11?
I have tried installing it every possible way but failed.
0
votes
0 answers
Is it possible to load only weights to TF-TRT model?
I have two models with the exact same architecture, but different weights as the same network is used for two different problems. We're using TF-TRT to optimize the model in order to use it on edge devices.
We'd like to be able to switch from one…

nachomendi
- 51
- 2
- 6
0
votes
0 answers
Getting a black image after inference from a .engine file using Python
I converted the PyTorch Real-ESRGAN model to a model.engine file using C++ code. After conversion, inference works well in C++. But when I try to run inference on an image with this model.engine in Python, it gives me a black image, as given…
0
votes
0 answers
Error Code 1: Myelin (Compiled against cuBLASLt 10.2.2.0 but running against cuBLASLt 11.4.2.0.) : Tensorrt
Hi, I am using TensorRT for image detection in Python but am getting this issue. We have tested this on Linux and it works well, but we got issues on Windows. The issue only occurs in Python; C++ inference works smoothly. I am also attaching…

NaQi
- 28
- 1
- 1
- 6