
I was following the official guide for XLA AOT compilation (https://www.tensorflow.org/xla/tfcompile), and compiling the examples (inside aot/tests) works just fine.

But when I tried to compile some slightly bigger models, a problem arose: XLA AOT requires a frozen graph as input (as I understand from the guide), but frozen graphs are no longer supported in TensorFlow 2. So what input does XLA expect now?

SC94

1 Answer

It seems there are still ways to freeze a graph in TensorFlow 2. I followed this post to create a frozen graph, and compiling it afterwards worked: https://leimao.github.io/blog/Save-Load-Inference-From-TF2-Frozen-Graph/

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2)

# `model` is an existing tf.keras model.
# Convert the Keras model to a ConcreteFunction.
full_model = tf.function(lambda x: model(x))
full_model = full_model.get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))

# Freeze the ConcreteFunction: variables become graph constants.
frozen_func = convert_variables_to_constants_v2(full_model)

layers = [op.name for op in frozen_func.graph.get_operations()]
print("-" * 50)
print("Frozen model layers: ")
for layer in layers:
    print(layer)

print("-" * 50)
print("Frozen model inputs: ")
print(frozen_func.inputs)
print("Frozen model outputs: ")
print(frozen_func.outputs)

# Save frozen graph from frozen ConcreteFunction to hard drive
tf.io.write_graph(graph_or_graph_def=frozen_func.graph,
                  logdir="./frozen_models",
                  name="frozen_graph.pb",
                  as_text=False)
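
To then compile frozen_graph.pb with tfcompile, you still need a tf2xla.Config describing the feeds and fetches. Here is a minimal sketch; the node names ("x", "Identity") and the 1x28x28 shape are placeholders — use the actual names and shapes that the snippet above prints as "Frozen model inputs/outputs":

```
# config.pbtxt -- a tf2xla.Config in protobuf text format
feed {
  id { node_name: "x" }
  shape {
    dim { size: 1 }
    dim { size: 28 }
    dim { size: 28 }
  }
}
fetch {
  id { node_name: "Identity" }
}
```

Both files are then passed to the tf_library Bazel macro, as in the tfcompile guide (the name and cpp_class below are arbitrary):

```
tf_library(
    name = "frozen_model",
    graph = "frozen_graph.pb",
    config = "config.pbtxt",
    cpp_class = "FrozenModel",
)
```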
lucgig