
I implemented a simple TF model. The model receives a serialized tensor of a gray image (simply a 2-D ndarray) and restores it to a 2-D tensor; inference is then run on this 2-D tensor.

I deployed the model with TensorFlow Model Serving and tried to send a JSON string to its REST port as follows:

  {
    "instances": [
      {"b64": bin_str}
    ]
  }

I tried things such as tf.io.serialize_tensor to convert the input image into a serialized tensor and pass it to the server, but every attempt failed.
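Concretely, one of my failed attempts looked roughly like the sketch below (the image path, host, port, and model name are placeholders):

    import base64
    import json

    import numpy as np
    import requests
    import tensorflow as tf

    # Load the gray image as a 2-D ndarray (the path is a placeholder).
    image = np.load("gray_image.npy").astype(np.float32)

    # Serialize the 2-D tensor into a bytes string.
    serialized = tf.io.serialize_tensor(tf.constant(image)).numpy()

    # Base64-encode the bytes so they can be embedded in the JSON payload.
    payload = {
        "instances": [
            {"b64": base64.b64encode(serialized).decode("utf-8")}
        ]
    }

    # POST to the REST endpoint (host, port, and model name are placeholders).
    resp = requests.post(
        "http://localhost:8501/v1/models/my_model:predict",
        data=json.dumps(payload))
    print(resp.status_code, resp.text)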

I would like to know how to send a serialized tensor to the serving server.

My saved model has the following signature:

 signatures = {
        "serving_default": _get_serve_tf_examples_fn(
            model,
            transform_output).get_concrete_function(
                # explicitly specify input signature of serve_tf_examples_fn
                tf.TensorSpec(
                    shape=[None],
                    dtype=tf.string,
                    name="examples")),
    }
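
For reference, this is how I inspect the exported signature after loading the SavedModel (the export path is a placeholder); it should confirm that the input is a 1-D tf.string tensor named "examples":

    import tensorflow as tf

    # Load the exported SavedModel (the path is a placeholder).
    loaded = tf.saved_model.load("/path/to/exported_model")
    serving_fn = loaded.signatures["serving_default"]

    # Show what the signature expects and returns; the input should be a
    # 1-D tf.string tensor named "examples".
    print(serving_fn.structured_input_signature)
    print(serving_fn.structured_outputs)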

and the definition of _get_serve_tf_examples_fn is:

from typing import Dict

import tensorflow as tf
import tensorflow_transform as tft


def _get_serve_tf_examples_fn(model: tf.keras.models.Model,
                              transform_output: tft.TFTransformOutput):
    # Attach the Transform graph from the Transform component so it is
    # exported together with the model.
    model.tft_layer = transform_output.transform_features_layer()

    @tf.function
    def serve_tf_examples_fn(serialized: tf.Tensor) -> Dict[str, tf.Tensor]:
        """Runs inference on a batch of serialized image examples.

        Args:
            serialized: a 1-D tf.string tensor of serialized image examples.
        """
        feature_spec = transform_output.raw_feature_spec()
        # Remove the label spec; the label is not provided at serving time.
        feature_spec.pop("label")

        # Deserialize the image tensor.
        parsed_features = tf.io.parse_example(serialized, feature_spec)

        # Preprocess the example using the outputs of the Transform pipeline.
        transformed_features = model.tft_layer(parsed_features)
        outputs = model(transformed_features)
        return {"outputs": outputs}

    return serve_tf_examples_fn

The serving function above receives a batch of serialized strings, restores each one to a 2-D image tensor, and then runs inference on that tensor.

What I cannot figure out is how to serialize the gray image (a 2-D ndarray) on the client side and send it to the REST port of the serving server.
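
My current understanding is that, because serve_tf_examples_fn calls tf.io.parse_example with the raw feature spec, the payload probably needs to be a serialized tf.train.Example rather than the raw output of tf.io.serialize_tensor. A minimal sketch of what I think such a client would look like is below; the feature name "image_raw", the image path, and the endpoint are all placeholders, since the real feature names come from the Transform graph:

    import base64
    import json

    import numpy as np
    import requests
    import tensorflow as tf

    # Load the gray image as a 2-D ndarray (the path is a placeholder).
    image = np.load("gray_image.npy").astype(np.float32)

    # Pack the serialized image tensor into a tf.train.Example. The feature
    # name "image_raw" is a placeholder; it must match the raw feature spec
    # produced by the Transform component.
    example = tf.train.Example(features=tf.train.Features(feature={
        "image_raw": tf.train.Feature(bytes_list=tf.train.BytesList(
            value=[tf.io.serialize_tensor(image).numpy()])),
    }))
    serialized_example = example.SerializeToString()

    # Base64-encode the serialized proto and send it as a b64 instance.
    payload = {
        "instances": [
            {"b64": base64.b64encode(serialized_example).decode("utf-8")}
        ]
    }
    resp = requests.post(
        "http://localhost:8501/v1/models/my_model:predict",
        data=json.dumps(payload))
    print(resp.status_code, resp.text)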

Any help would be appreciated.

Zepp L.
  • BTW, a workaround is to read the image file as a byte stream and then encode it as a string. We can send the string to the TF serving server, where the received string is decoded into an ndarray with an image decoder. But in this question I would like to find a way to send a serialized tensor to the server. Thanks in advance. – Zepp L. May 24 '22 at 09:47
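
For completeness, the workaround mentioned in the comment above looks roughly like this on the client side (the file path, host, port, and model name are placeholders); on the server side the received bytes would then be decoded with an image decoder such as tf.io.decode_image:

    import base64
    import json

    import requests

    # Read the image file as a raw byte stream (the path is a placeholder).
    with open("gray_image.png", "rb") as f:
        image_bytes = f.read()

    # Base64-encode the raw file bytes and post them to the REST endpoint;
    # the serving function would decode them with an image decoder.
    payload = {
        "instances": [
            {"b64": base64.b64encode(image_bytes).decode("utf-8")}
        ]
    }
    resp = requests.post(
        "http://localhost:8501/v1/models/my_model:predict",
        data=json.dumps(payload))
    print(resp.status_code, resp.text)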
