I deployed a TensorFlow model on GCP AI Platform. The model predicts whether a text is sarcastic (1) or not (0).
A text is represented (by a given function "tokenize_text") as two tensors, which could look like this:
text = tokenize_text('This is a text')
print(text)
>>> (<tf.Tensor: shape=(1, 512), dtype=int32, numpy=
array([[  101, 71284, 92947, 11962, 10168, 12830,   102,     0, ...]])>,
 <tf.Tensor: shape=(1, 512), dtype=int32, numpy=
array([[1, 1, 1, 1, 1, 1, 1, 0, 0, ...]])>)
Locally, prediction works as expected:
model.predict(text) # result: not sarcasm (4%)
>>> array([[0.04065517]], dtype=float32)
Now I want to run the same prediction against the deployed model on GCP AI Platform. For that, the input ("text") has to be wrapped in JSON, since the prediction service only accepts JSON requests. But when I try to serialize it, I get the following error:
TypeError: Object of type EagerTensor is not JSON serializable
I know that tensors are not directly convertible to JSON. However, before deploying to GCP I only ever used tensors for prediction.
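Here is a minimal reproduction of the error, independent of my tokenizer (the tensor values are just the example IDs from above; "instances" is the field name I use in my request body):

```python
import json

import tensorflow as tf

# An EagerTensor, like the ones tokenize_text returns.
token_ids = tf.constant([[101, 71284, 92947, 11962, 10168, 12830, 102, 0]])

# Passing the tensor straight to json.dumps fails:
try:
    json.dumps({"instances": token_ids})
except TypeError as e:
    print(e)  # Object of type EagerTensor is not JSON serializable

# Converting to plain Python lists first does serialize:
payload = json.dumps({"instances": token_ids.numpy().tolist()})
print(payload)
```

Converting via `.numpy().tolist()` makes the payload serializable locally, but I am not sure whether that is the intended way to build the request for AI Platform.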
Do you have any ideas/approaches?