
Recently I have been using the Estimator API to train and deploy a TensorFlow model, but when I deploy the model (it was exported with an Estimator serving_input_fn that includes tf.py_func) using TensorFlow Serving, I get an error (see below).

I found a question on GitHub saying that TensorFlow Serving cannot support tf.py_func.

Can anyone help?

I want to implement a tokenization function using another tokenizer (NLTK, Jieba).

The error:

Invalid argument: No OpKernel was registered to support Op 'PyFunc' used by {{node map/while/PyFunc}} with these attrs: [Tout=[DT_STRING], token="pyfunc_4", _output_shapes=[<unknown>], Tin=[DT_STRING]]
Registered devices: [CPU]
Registered kernels:
  <no registered kernels>
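
For context, this is roughly the pattern that produces the error above; a minimal sketch assuming TF 1.x, a map over the input batch, and Jieba as the Python tokenizer (function and tensor names are illustrative, not my exact code):

import tensorflow as tf
import jieba  # stand-in for any pure-Python tokenizer (NLTK, Jieba, ...)

def _py_tokenize(raw):
    # Runs in the Python interpreter, outside the TensorFlow graph.
    return " ".join(jieba.cut(raw.decode("utf-8"))).encode("utf-8")

def _tokenize_one(t):
    # tf.py_func embeds a PyFunc op in the graph; the exported SavedModel
    # then needs a Python runtime, which TensorFlow Serving does not have.
    out = tf.py_func(_py_tokenize, [t], tf.string)
    out.set_shape([])  # py_func loses shape information
    return out

def serving_input_receiver_fn():
    raw_text = tf.placeholder(dtype=tf.string, shape=[None], name="input_text")
    tokenized = tf.map_fn(_tokenize_one, raw_text)  # -> map/while/PyFunc node
    return tf.estimator.export.ServingInputReceiver(
        features={"text": tokenized},
        receiver_tensors={"input_text": raw_text})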

1 Answer


Have you tried using the TensorFlow native tokenizers? E.g. see https://www.tensorflow.org/beta/tutorials/tensorflow_text/intro#tokenization
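
A minimal sketch of the idea, assuming TF 2.x and pip install tensorflow-text (note that the serving binary also needs the tensorflow_text ops registered):

import tensorflow as tf
import tensorflow_text as text  # pip install tensorflow-text

# WhitespaceTokenizer is implemented as regular TensorFlow ops, so a
# SavedModel that uses it needs no Python runtime at serving time,
# unlike a graph containing tf.py_func.
tokenizer = text.WhitespaceTokenizer()
tokens = tokenizer.tokenize(["everything not saved will be lost."])
print(tokens.to_list())
# [[b'everything', b'not', b'saved', b'will', b'be', b'lost.']]

For text without whitespace there is also text.UnicodeScriptTokenizer(), although it splits on script boundaries and will not reproduce Jieba's dictionary-based word segmentation.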