In case you are interested in running inference with C++, you can compile TFLite 2.4.1 on your Jetson device like I did on the Xavier NX:
$ sudo apt-get install cmake curl
$ wget -O tensorflow.zip https://github.com/tensorflow/tensorflow/archive/v2.4.1.zip
$ unzip tensorflow.zip
$ mv tensorflow-2.4.1 tensorflow
$ cd tensorflow
$ ./tensorflow/lite/tools/make/download_dependencies.sh
$ ./tensorflow/lite/tools/make/build_aarch64_lib.sh
After that you will also have to build and install the FlatBuffers library that comes with TF Lite, like this:
$ cd tensorflow/lite/tools/make/downloads/flatbuffers
$ mkdir build && cd build
$ cmake ..
$ make -j
$ sudo make install
$ sudo ldconfig
After that you will find the static library at tensorflow/tensorflow/lite/tools/make/gen/linux_aarch64/lib/libtensorflow-lite.a
You can build your inference application against that library like this:
$ g++ main.cpp -I tensorflow tensorflow/tensorflow/lite/tools/make/gen/linux_aarch64/lib/libtensorflow-lite.a -ledgetpu -lpthread -ldl -o main
Note that you need g++ rather than gcc for the C++ standard library, the source file has to come before the libraries, and the static archive is easiest to pass by its full path.
You will also need to install libedgetpu.so as shown on Coral.ai.
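In case it helps, here is a minimal sketch of what such a main.cpp could look like, following the interpreter setup from the Coral C++ docs. The filename "model_edgetpu.tflite" is a placeholder, and I assume a model compiled for the Edge TPU with a uint8 input:

// main.cpp - minimal TF Lite + Edge TPU inference sketch
#include <cstdio>
#include <memory>

#include "edgetpu.h"  // from libedgetpu
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Placeholder filename: use a model compiled by the Edge TPU compiler.
  auto model = tflite::FlatBufferModel::BuildFromFile("model_edgetpu.tflite");
  if (!model) { std::fprintf(stderr, "Failed to load model\n"); return 1; }

  // Open the Edge TPU and register its custom op with the resolver.
  auto tpu_context = edgetpu::EdgeTpuManager::GetSingleton()->OpenDevice();
  tflite::ops::builtin::BuiltinOpResolver resolver;
  resolver.AddCustom(edgetpu::kCustomOp, edgetpu::RegisterCustomOp());

  // Build the interpreter and bind the Edge TPU context to it.
  std::unique_ptr<tflite::Interpreter> interpreter;
  if (tflite::InterpreterBuilder(*model, resolver)(&interpreter) != kTfLiteOk) {
    std::fprintf(stderr, "Failed to build interpreter\n"); return 1;
  }
  interpreter->SetExternalContext(kTfLiteEdgeTpuContext, tpu_context.get());
  interpreter->SetNumThreads(1);
  if (interpreter->AllocateTensors() != kTfLiteOk) {
    std::fprintf(stderr, "Failed to allocate tensors\n"); return 1;
  }

  // Fill interpreter->typed_input_tensor<uint8_t>(0) with your input
  // data here, then run inference.
  if (interpreter->Invoke() != kTfLiteOk) {
    std::fprintf(stderr, "Invoke failed\n"); return 1;
  }
  std::printf("Inference done, %zu output tensor(s)\n",
              interpreter->outputs().size());
  return 0;
}

The Edge TPU context has to stay alive for as long as the interpreter uses it, which is why tpu_context is kept in scope for the whole run.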
Best
Alexander