
I have a TensorFlow Keras model that looks like the following:

import tensorflow as tf
from tensorflow.keras.regularizers import l2

# feature_layer is defined elsewhere (a rough sketch follows below the model)
model = tf.keras.Sequential([
    feature_layer,
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(
        6, kernel_regularizer=l2(0.01), activation=tf.nn.softmax
    )
])
model.compile(loss='squared_hinge',
              optimizer='adadelta',
              metrics=['accuracy'])
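
feature_layer itself is not shown above; for context, here is a rough sketch of how a DenseFeatures layer for these inputs could be built from feature columns (the column types and hash bucket size are assumptions for illustration, not my actual columns):

# Illustrative sketch only: the real feature columns may differ.
feature_columns = [
    tf.feature_column.numeric_column('int1', dtype=tf.int32),
    tf.feature_column.numeric_column('int2', dtype=tf.int32),
    tf.feature_column.numeric_column('int3', dtype=tf.int32),
    tf.feature_column.indicator_column(
        tf.feature_column.categorical_column_with_hash_bucket(
            'string1', hash_bucket_size=100)),
]
feature_layer = tf.keras.layers.DenseFeatures(feature_columns)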

I export it to a SavedModel servable by TensorFlow Serving with the following code (model.inputs and model.outputs are shown in the comments):

tf.saved_model.simple_save(
    keras.backend.get_session(),
    export_path,
    inputs=model.inputs,
    outputs={t.name:t for t in model.outputs})

#model.inputs:
#{
# 'int1': <tf.Tensor 'int1_5:0' shape=(?,) dtype=int32>, 
# 'int2': <tf.Tensor 'int2_5:0' shape=(?,) dtype=int32>,
# 'int3': <tf.Tensor 'int3_5:0' shape=(?,) dtype=int32>,
# 'string1': <tf.Tensor 'string1_5:0' shape=(?,) dtype=string>
#}

#model.outputs:
# [<tf.Tensor 'sequential/Identity:0' shape=(?, 6) dtype=float32>]
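
Since, as I understand it, simple_save writes out whatever variables live in the session it is given, I assume a check along these lines (an untested sketch) would show whether the Keras session's graph actually contains the model's variables at export time:

# Untested sketch: list the global variables in the session's graph.
# If this prints an empty list, simple_save has nothing to write into variables/.
sess = tf.keras.backend.get_session()
print(sess.graph.get_collection(tf.GraphKeys.GLOBAL_VARIABLES))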

I expect six outputs, one per class, each giving that class's likelihood. Using model.predict(), this seems to work fine: the model returns arrays like [0.15914398 0.152271 0.18949589 0.14985411 0.17449048 0.1747446 ].
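
For reference, this is roughly how I call predict (the feature values below are made up for illustration):

import numpy as np

# Illustrative call only: a single example with made-up feature values.
example = {
    'int1': np.array([3], dtype=np.int32),
    'int2': np.array([7], dtype=np.int32),
    'int3': np.array([1], dtype=np.int32),
    'string1': np.array(['foo']),
}
print(model.predict(example))  # -> a (1, 6) array of class probabilities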

However, the SavedModel produced by this code, which I want to serve with TensorFlow Serving, is generated with an empty 'variables' folder, leaving only the saved_model.pb file. When I run the model with saved_model_cli I get the following:

RuntimeError: The Session graph is empty.  Add operations to the graph before calling run().
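
The command I run is along these lines (the export directory and input values are placeholders; the signature and input key names come from saved_model_cli show):

saved_model_cli show --dir /path/to/export --all

saved_model_cli run --dir /path/to/export \
    --tag_set serve \
    --signature_def serving_default \
    --input_exprs 'int1=[3];int2=[7];int3=[1];string1=["foo"]'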

Why is this the case and how can I get the behaviour I want? Thanks!
