If I understand your question correctly, you have a saved_model.pb generated either by tf.saved_model.simple_save, tf.saved_model.builder.SavedModelBuilder, or estimator.export_savedmodel. In other words, you are exporting the training and inference graphs to saved_model.pb.
The point you mention from the guide on the TF website states that, in addition to exporting the training graph, we need to export an evaluation graph as well. That is called the EvalSavedModel. The evaluation graph comprises the metrics for the model, so that you can evaluate the model's performance using visualizations.
Before we export the EvalSavedModel, we should prepare an eval_input_receiver_fn, similar to serving_input_receiver_fn.
We can configure other functionality as well, for example whether the metrics should be computed in a distributed manner, or whether we want to evaluate the model on slices of the data rather than the entire dataset. Slicing in particular is specified via a slice_spec when the analysis is run, rather than in the graph itself.
Then we can export the EvalSavedModel using the code below:

tfma.export.export_eval_savedmodel(
    estimator=estimator,
    export_dir_base=export_dir,
    eval_input_receiver_fn=eval_input_receiver_fn)