
I am using a Jetson Nano with JetPack 4.4.1, TensorFlow 2.3.1, and TensorRT 7.1.3. I have a Keras model that I converted to a TF-TRT model.
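For context, the conversion step looks roughly like this (a minimal sketch; the paths, precision mode, and workspace size are placeholders, not my exact settings):

```python
# Sketch of the Keras SavedModel -> TF-TRT conversion (assumed
# paths/parameters, not the exact code from my project).

def make_conversion_params(precision_mode="FP16", workspace_bytes=1 << 30):
    """Collect the TF-TRT conversion settings in one place."""
    return {
        "precision_mode": precision_mode,
        "max_workspace_size_bytes": workspace_bytes,
    }

def convert_saved_model(input_dir, output_dir, **overrides):
    # TensorFlow is imported lazily so the helper above can be used
    # (and tested) on machines without TensorFlow/TensorRT installed.
    from tensorflow.python.compiler.tensorrt import trt_convert as trt

    params = trt.TrtConversionParams(**make_conversion_params(**overrides))
    converter = trt.TrtGraphConverterV2(
        input_saved_model_dir=input_dir,
        conversion_params=params,
    )
    converter.convert()      # rewrites the graph with TRTEngineOp segments
    converter.save(output_dir)

if __name__ == "__main__":
    # Assumed directory names for the Keras SavedModel and the output.
    convert_saved_model("keras_saved_model", "tftrt_saved_model")
```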

When performing inference on the model, I get the following error:

TF-TRT Warning: Engine creation for PartitionedCall/TRTEngineOp_0_0 failed. The native segment will be used instead. Reason: Internal: Failed to build TensorRT engine

During inference I get:

W tensorflow/compiler/tf2tensorrt/kernels/trt_engine_op.cc:629] TF-TRT Warning: Engine retrieval for input shapes: [[1,100,68,3]] failed. Running native segment for PartitionedCall/TRTEngineOp_0_0

What does it mean?

It seems like TRT is not building engines, but inference still works. I have performed the same inference on another PC (TF 2.4.1 and TRT 7.2) and do not get this error. I have also compared the inference results between the Keras model and the TF-TRT model, and they are identical (both on the Jetson Nano with the error and on the PC without it).

Why are my results the same? How do I solve this?

1 Answer


It is hard to tell what is happening without a little more information about your code. Also, I am not quite sure where exactly your two errors occur; could you elaborate a little further?

As general information: TF-TRT falls back to the native TensorFlow implementation of a graph segment whenever that segment cannot be turned into a TensorRT engine (for example, because an operation is unsupported or engine building fails, as in your warning). The fallback computes the same result, just without the TensorRT speed-up, which would explain why your outputs match after all.
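One way to check how much of your graph was actually converted is to count the TRTEngineOp nodes in the saved TF-TRT model. A sketch (the model path is an assumption; if the count is zero, everything is running on the native TensorFlow fallback):

```python
# Count TRTEngineOp nodes in a TF-TRT SavedModel to see how many
# segments were actually converted. The directory name is assumed.

def count_ops(op_types, target="TRTEngineOp"):
    """Count how often `target` appears in a list of graph-node op types."""
    return sum(1 for op in op_types if op == target)

def trt_engine_ops_in_saved_model(saved_model_dir):
    # Lazy import so count_ops stays usable without TensorFlow installed.
    import tensorflow as tf

    model = tf.saved_model.load(saved_model_dir)
    func = model.signatures["serving_default"]
    graph_def = func.graph.as_graph_def()

    # TRTEngineOp nodes can sit in the top-level graph or inside
    # functions in the function library, so collect both.
    nodes = list(graph_def.node)
    for fn in graph_def.library.function:
        nodes.extend(fn.node_def)
    return count_ops([node.op for node in nodes])

if __name__ == "__main__":
    print(trt_engine_ops_in_saved_model("tftrt_saved_model"))  # assumed path
```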