
I would like to add an extra signature to my SavedModel that returns a business description, and serve it with TensorFlow Serving.

@tf.function
def info():
    return json.dumps({
        'name': 'My model',
        'description': 'This is model description.',
        'project': 'Product ABCD',
        'type': 'some_type',
        ...
    })

As described in the TensorFlow Core guide https://www.tensorflow.org/guide/saved_model#identifying_a_signature_to_export, I can easily export a signature that accepts arguments by providing a tf.TensorSpec.

Is it possible to export a signature without arguments and call it on the server?


Added after @EricMcLachlan's comments:

When I try to call a function defined without inputs (input_signature=[]) with code like this:

import json
import requests

data = json.dumps({"signature_name": "info", "inputs": None})
headers = {"content-type": "application/json"}
json_response = requests.post('http://localhost:8501/v1/models/my_model:predict', data=data, headers=headers)

I get the following error in the response:

'_content': b'{ "error": "Failed to get input map for signature: info" }'
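For reference, the request body above serializes "inputs": None to JSON null, which TF Serving cannot map to any input tensor. A hedged variant worth trying sends an empty inputs map instead (the signature name "info" is taken from the question; whether the server accepts an empty map for a zero-argument signature is an assumption, not a guarantee):

```python
import json

# "inputs": None becomes JSON null; an empty object may be a better fit
# for a signature that takes no arguments.
payload = json.dumps({"signature_name": "info", "inputs": {}})
print(payload)

# The request itself would be posted exactly as before:
# requests.post('http://localhost:8501/v1/models/my_model:predict',
#               data=payload, headers={"content-type": "application/json"})
```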

1 Answer


Defining the Signature:

I was going to write my own example, but here's such a great example provided by @AntPhitlok in another StackOverflow post:

class MyModule(tf.Module):
  def __init__(self, model, other_variable):
    super().__init__()
    self.model = model
    self._other_variable = other_variable

  @tf.function(input_signature=[tf.TensorSpec(shape=(None, None, 1), dtype=tf.float32)])
  def score(self, waveform):
    result = self.model(waveform)
    return { "scores": result }

  @tf.function(input_signature=[])
  def metadata(self):
    return { "other_variable": self._other_variable }

In this case, what they're serving is a Module, but it could have been a Keras model as well.
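To make the example above end-to-end, here is a minimal sketch of exporting both signatures with tf.saved_model.save. The toy Keras model, the string stored in _other_variable, and the export path are all hypothetical, purely for illustration:

```python
import tensorflow as tf

class MyModule(tf.Module):
    def __init__(self, model, other_variable):
        super().__init__()
        self.model = model
        self._other_variable = other_variable

    @tf.function(input_signature=[tf.TensorSpec(shape=(None, None, 1), dtype=tf.float32)])
    def score(self, waveform):
        result = self.model(waveform)
        return {"scores": result}

    @tf.function(input_signature=[])
    def metadata(self):
        return {"other_variable": self._other_variable}

# Hypothetical model and export path, purely for illustration.
model = tf.keras.Sequential([tf.keras.Input(shape=(None, 1)),
                             tf.keras.layers.Dense(1)])
module = MyModule(model, tf.constant("Product ABCD"))

# Exporting both tf.functions under explicit signature names makes them
# callable by name from TensorFlow Serving.
tf.saved_model.save(
    module, "/tmp/module_with_multiple_signatures",
    signatures={"score": module.score, "metadata": module.metadata})
```

After export, the no-argument signature can be invoked locally via `tf.saved_model.load(...).signatures["metadata"]()`, which is a quick way to sanity-check the SavedModel before serving it.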


Using the Serving:

I am not 100% sure how to access the serving (I haven't done it myself yet), but I think you'll be able to access it similarly to this:

import logging

import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

channel = grpc.insecure_channel('localhost:8500')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = model_name
request.model_spec.signature_name = 'serving_default'
request.model_spec.version_label = version

tensor_proto = tf.make_tensor_proto(my_input_data, dtype=tf.float32)
request.inputs['my_signature_input'].CopyFrom(tensor_proto)

try:
    response = stub.Predict(request, MAX_TIMEOUT)
except Exception as ex:
    logging.error(str(ex))
    return [None] * len(batch_of_texts)

Here I'm using gRPC to access the TensorFlow Server.

You'd probably need to substitute 'serving_default' with your signature name ('info' in your case). Similarly, 'my_signature_input' should match the input to your tf.function (in your case, I think it's empty).

This is a standard Keras-style prediction and is piggybacking on predict_pb2.PredictRequest. It might be necessary to create a custom protobuf, but that's a bit beyond my abilities at this point.

I hope it's enough to get you going.

Eric McLachlan
  • Thank you very much, @EricMcLachlan. Do you happen to know how I can call this signature? For example, I guess I cannot do that with **saved_model_cli run --dir /tmp/module_with_multiple_signatures --tag_set serve --signature_def metadata**, because I have to provide inputs – Andras Gyacsok Mar 06 '20 at 11:39
  • @AndrasGyacsok: I've tried to include a response in my original answer. I hope it helps. If I'm off, please add an answer of your own when you figure it out. – Eric McLachlan Mar 06 '20 at 11:42
  • 1
    Thank you for your answer. It clarified a lot. I will keep searching for, and will post the result. – Andras Gyacsok Mar 06 '20 at 11:50