I am having trouble running inference from a TensorFlow 2.0 SavedModel loaded via the C API, because I cannot access the input and output operations by name.
I load the session via TF_LoadSessionFromSavedModel(...) successfully:
#include <tensorflow/c/c_api.h>
...
TF_Status* status = TF_NewStatus();
TF_Graph* graph = TF_NewGraph();
TF_Buffer* r_opts = TF_NewBufferFromString("", 0); // empty RunOptions
TF_Buffer* meta_g = TF_NewBuffer();                // receives the MetaGraphDef
TF_SessionOptions* opts = TF_NewSessionOptions();
const char* tags[] = {"serve"};
TF_Session* session = TF_LoadSessionFromSavedModel(opts, r_opts, "saved_model/tf2_model", tags, 1, graph, meta_g, status);
if (TF_GetCode(status) != TF_OK) exit(-1); // does not happen
However, I get an error when trying to set up the input and output tensors using:
TF_Operation* inputOp = TF_GraphOperationByName(graph, "input"); // works with "serving_default_input"
TF_Operation* outputOp = TF_GraphOperationByName(graph, "prediction"); // returns NULL
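For context, this is a sketch of the inference call I am trying to build once the operations are found. The operation names are placeholders: "serving_default_input" is the input name that does resolve, while "output_op_name" stands for whatever the correct output name turns out to be.

```c
#include <tensorflow/c/c_api.h>

// Sketch of the intended inference call. "output_op_name" is a
// placeholder for the output operation name I cannot locate.
void run_inference(TF_Session* session, TF_Graph* graph,
                   TF_Tensor* input_tensor, TF_Status* status) {
    TF_Output input  = { TF_GraphOperationByName(graph, "serving_default_input"), 0 };
    TF_Output output = { TF_GraphOperationByName(graph, "output_op_name"), 0 };

    TF_Tensor* output_tensor = NULL;
    TF_SessionRun(session, /*run_options=*/NULL,
                  &input,  &input_tensor,  1,   // feeds
                  &output, &output_tensor, 1,   // fetches
                  /*target_opers=*/NULL, 0,
                  /*run_metadata=*/NULL, status);
    if (TF_GetCode(status) == TF_OK) {
        // ... consume output_tensor ...
        TF_DeleteTensor(output_tensor);
    }
}
```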
The names I am passing as arguments are the ones assigned to the input and output Keras layers of the saved model, but they do not appear in the loaded graph. Running saved_model_cli (following the TensorFlow SavedModel tutorial) shows that tensors with these names exist under the SignatureDef serving_default, so I guess I need to instantiate serving_default into a graph (in other words, create a graph according to that signature). However, I could not find a way to do this using the C API.
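As a debugging aid (not a real solution), I can enumerate all operations in the loaded graph with TF_GraphNextOperation to see which names actually exist; this is how I found that only names like "serving_default_input" are present:

```c
#include <stdio.h>
#include <tensorflow/c/c_api.h>

// Print the name and op type of every operation in the loaded graph.
void dump_graph_ops(TF_Graph* graph) {
    size_t pos = 0;
    TF_Operation* op;
    while ((op = TF_GraphNextOperation(graph, &pos)) != NULL) {
        printf("%s (%s)\n", TF_OperationName(op), TF_OperationOpType(op));
    }
}
```

This lists the operations of the graph as loaded, but it still does not tell me which one corresponds to the "prediction" tensor named in the signature.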
Note that TensorFlow's C API test uses C++ functionality from tensorflow/core/ to load the signature definition map from the MetaGraphDef and uses it to find the input and output operation names, but I would like to avoid the dependency on C++.
Also note that accessing the operations by name works for frozen .pb graphs; however, that format is being deprecated.
Thanks in advance for any ideas and hints!