
I would like to convert an integer-quantized TFLite model into a frozen graph (.pb) in TensorFlow. I have read through and tried many solutions on Stack Overflow, and none of them worked. Specifically, toco didn't work (output_format cannot be TENSORFLOW_GRAPHDEF).

My ultimate goal is to get a quantized ONNX model through tf2onnx, but tf2onnx does not accept tflite as input (only saved_model, checkpoint, and graph_def are supported). However, after quantizing the trained model with TFLiteConverter, all I get back is a tflite file, as in the sketch below. This is where the problem arises.
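For reference, a minimal sketch of that quantization step, assuming a TF 2.x SavedModel and a made-up representative_dataset generator (paths, names, and shapes are placeholders):

import numpy as np
import tensorflow as tf

# Hypothetical calibration data generator; substitute real samples.
def representative_dataset():
    for _ in range(100):
        yield [np.random.rand(1, 128, 128, 3).astype(np.float32)]

# "saved_model_dir" is a placeholder path to the trained float32 model.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

# convert() returns a serialized flatbuffer; a .tflite file is the only output.
with open("model_int8.tflite", "wb") as f:
    f.write(converter.convert())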

The ideal flow is essentially this: TF model in float32 -> TFLite model in int8 -> graph_def -> ONNX model. I am stuck at the second arrow; the last arrow is sketched below.
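For completeness, the last arrow (graph_def -> ONNX) can be done with tf2onnx's Python API. A minimal sketch, assuming a recent tf2onnx (>= 1.9) and placeholder file and tensor names:

import tensorflow as tf
import tf2onnx

# Load the frozen graph produced by the tflite -> .pb step.
with tf.io.gfile.GFile("frozen_graph.pb", "rb") as f:
    graph_def = tf.compat.v1.GraphDef()
    graph_def.ParseFromString(f.read())

# "input:0" and "output:0" are placeholders; use the model's real tensor names.
model_proto, _ = tf2onnx.convert.from_graph_def(
    graph_def,
    input_names=["input:0"],
    output_names=["output:0"],
    output_path="model.onnx",
)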

nikolai_ye

1 Answer


The ability to convert tflite models to .pb was removed after TensorFlow version r1.9. Try downgrading your TF version to 1.9 and then running something like this:

bazel run --config=opt \
  //tensorflow/contrib/lite/toco:toco -- \
  --input_file=/tmp/foo.tflite \
  --output_file=/tmp/foo.pb \
  --input_format=TFLITE \
  --output_format=TENSORFLOW_GRAPHDEF \
  --input_shape=1,128,128,3 \
  --input_array=input \
  --output_array=MobilenetV1/Predictions/Reshape_1

Here is the source.
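To sanity-check the resulting .pb before handing it to tf2onnx, here is a TF 1.x-style sketch (paths taken from the command above; everything else is a placeholder):

import tensorflow as tf  # TF 1.9, matching the downgrade above

# Load the frozen graph written by toco.
with tf.gfile.GFile("/tmp/foo.pb", "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

# Import it and list the operation names to confirm the
# input/output arrays survived the round trip.
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")
print([op.name for op in graph.get_operations()])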

sakumoil