I've got some models from the ONNX Model Zoo. I'd like to use models from here in a TensorFlow Lite (Android) application, and I'm running into problems figuring out how to get the models converted.
From what I've read, the process I need to follow is to convert the ONNX model to a TensorFlow model, then convert that TensorFlow model to a TensorFlow Lite model.
import onnx
from onnx_tf.backend import prepare
import tensorflow as tf

# Load the ONNX model and convert it to a TensorFlow representation
onnx_model = onnx.load('./some-model.onnx')
tf_rep = prepare(onnx_model)

# Export the TensorFlow representation to disk
tf_rep.export_graph("some-model.pb")
After the above executes, I have the file some-model.pb, which I believe contains a TensorFlow frozen graph. From here I am not sure where to go. When I search, I find a lot of answers that are for TensorFlow 1.x (which I only realize after the samples I find fail to execute). I'm trying to use TensorFlow 2.x.
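Based on the TensorFlow 2.x docs, I believe the next step would be something like the sketch below, using tf.lite.TFLiteConverter.from_saved_model, but I'm not sure it applies here since I don't know whether export_graph() actually produces a SavedModel directory or a single frozen-graph file (the path "some-model.pb" is just what I passed to export_graph above):

import tensorflow as tf

# Assumption: export_graph() produced a SavedModel directory at this path.
converter = tf.lite.TFLiteConverter.from_saved_model("some-model.pb")
tflite_model = converter.convert()

with open("some-model.tflite", "wb") as f:
    f.write(tflite_model)

Is something along these lines the right direction for TensorFlow 2.x, or is there an intermediate step I'm missing between the exported graph and the converter?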
If it matters, the specific model I'm starting off with is here.
Per the ReadMe.md, the shape of the input is (1x3x416x416) and the output shape is (1x125x13x13).
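Once I do manage to produce a .tflite file, my plan is to sanity-check that its input and output shapes match the ReadMe.md with something like this (the file name is hypothetical):

import tensorflow as tf

# Hypothetical output file from a successful conversion.
interpreter = tf.lite.Interpreter(model_path="some-model.tflite")
interpreter.allocate_tensors()

print(interpreter.get_input_details())   # expecting a (1, 3, 416, 416) input
print(interpreter.get_output_details())  # expecting a (1, 125, 13, 13) output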