
I wanted to save a model so I can run predictions on specific pictures later. Here is my serving function:

def _serving_input_receiver_fn():
    # Note: only handles one image at a time
    feat = tf.placeholder(tf.float32, shape=[None, 120, 50, 1])
    return tf.estimator.export.TensorServingInputReceiver(features=feat, receiver_tensors=feat)

and here is where I export the model:

  export_dir_base = os.path.join(FLAGS.model_dir, 'export')
  export_dir = estimator.export_savedmodel(
    export_dir_base, _serving_input_receiver_fn)

But I get the following error:

ValueError: Both labels and logits must be provided.

I don't understand this error: the serving function should just create a placeholder, so that I can later feed images through that placeholder to get predictions from the saved model.
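
For context, this is roughly how I plan to feed images through the exported model later (just a sketch; the export path and the 'input' signature key are placeholders I am assuming, not something taken from the export above):

import numpy as np
from tensorflow.contrib import predictor

# Path to the timestamped directory returned by estimator.export_savedmodel.
export_dir = '/path/to/model_dir/export/1537888888'

predict_fn = predictor.from_saved_model(export_dir)

# One dummy image matching the placeholder shape [batch, 120, 50, 1].
batch = np.zeros([1, 120, 50, 1], dtype=np.float32)

# The dict key has to match the receiver tensor name in the serving signature;
# 'input' is the assumed default for a single unnamed receiver tensor.
print(predict_fn({'input': batch}))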

Here is the whole traceback:

Traceback (most recent call last):
  File "/home/cezary/models/official/mnist/mnist_tpu.py", line 222, in <module>
    tf.app.run()
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/platform/app.py", line 125, in run
    _sys.exit(main(argv))
  File "/home/cezary/models/official/mnist/mnist_tpu.py", line 206, in main
    export_dir_base, _serving_input_receiver_fn)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/estimator/estimator.py", line 650, in export_savedmodel
    mode=model_fn_lib.ModeKeys.PREDICT)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/estimator/estimator.py", line 703, in _export_saved_model_for_mode
    strip_default_attrs=strip_default_attrs)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/estimator/estimator.py", line 811, in _export_all_saved_models
    mode=model_fn_lib.ModeKeys.PREDICT)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/tpu/python/tpu/tpu_estimator.py", line 1971, in _add_meta_graph_for_mode
    mode=mode)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/estimator/estimator.py", line 879, in _add_meta_graph_for_mode
    config=self.config)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/tpu/python/tpu/tpu_estimator.py", line 1992, in _call_model_fn
    features, labels, mode, config)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/estimator/estimator.py", line 1107, in _call_model_fn
    model_fn_results = self._model_fn(features=features, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/tpu/python/tpu/tpu_estimator.py", line 2203, in _model_fn
    features, labels, is_export_mode=is_export_mode)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/tpu/python/tpu/tpu_estimator.py", line 1131, in call_without_tpu
    return self._call_model_fn(features, labels, is_export_mode=is_export_mode)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/tpu/python/tpu/tpu_estimator.py", line 1337, in _call_model_fn
    estimator_spec = self._model_fn(features=features, **kwargs)
  File "/home/cezary/models/official/mnist/mnist_tpu.py", line 95, in model_fn
    cross_entropy = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/nn_impl.py", line 156, in sigmoid_cross_entropy_with_logits
    labels, logits)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/nn_ops.py", line 1777, in _ensure_xent_args
    raise ValueError("Both labels and logits must be provided.")
ValueError: Both labels and logits must be provided.

Never mind the MNIST naming; I just reused the structure of that code and didn't rename things.

Thanks for any help!

craft

1 Answer


(I can't comment with a brand new account.) I was able to replicate your error by setting features and receiver_tensors to the same value, but I don't think your _serving_input_receiver_fn is implemented correctly. Can you follow the example here?
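
For reference, a raw-tensor serving input receiver usually ends up looking roughly like this (a sketch only; the 'images' name is illustrative, and any preprocessing is left as a pass-through):

import tensorflow as tf

def serving_input_receiver_fn():
    # Raw placeholder that clients feed at serving time.
    images = tf.placeholder(tf.float32, shape=[None, 120, 50, 1], name='images')
    # Any preprocessing of the raw input would go here; this sketch passes it through.
    features = images
    # Name the receiver tensor explicitly and hand the (possibly preprocessed)
    # tensor to the model as the single feature.
    return tf.estimator.export.TensorServingInputReceiver(
        features=features, receiver_tensors={'images': images})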

Alex Ilchenko
  • Hi, welcome to SO. As good practice, don't write answers limited to the OP; an answer should be helpful for the whole community, for anyone searching for a similar issue. Try to provide a code example, or if you are sharing a link, explain why you think the link would solve the problem. Spend some time on https://stackoverflow.com/help/how-to-ask – Amit Phaltankar Sep 25 '18 at 01:10
  • Thanks for the reply! I did follow it and now I get the message from this condition: if mode == tf.estimator.ModeKeys.PREDICT: raise RuntimeError("mode {} is not supported yet".format(mode)). Seems like predicting isn't supported yet on the TPU? – craft Sep 25 '18 at 11:01
  • @craft Hmm, I believe you might be trying to use the generic Estimator and did not provide anything for the predict functionality. However, it is hard for me to say without looking at your code. Could you post this as another question (tag it as `google-cloud-tpu`) with more details and the code? – Alex Ilchenko Sep 25 '18 at 18:18
  • @craft If the above answer resolved your question, feel free to accept it! – Alex Ilchenko Sep 25 '18 at 18:19
  • I definitely use the TPUEstimator. Basically I used this code here: https://github.com/tensorflow/models/blob/master/official/mnist/mnist_tpu.py#L85 . I didn't change anything in particular in there. @AlexIlchenko – craft Sep 26 '18 at 06:10
  • @craft So there is actually a [pull request](https://github.com/tensorflow/models/pull/5379/files) happening to enable that mode. If you can't wait for the pull request to be merged, just take a look at how the predict mode is implemented and use that (see the sketch below). – Alex Ilchenko Sep 26 '18 at 17:35
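
In case it helps anyone who hits the same RuntimeError: the idea behind that pull request is to give the model_fn a PREDICT branch that returns predictions before any loss op touches labels (labels is None when the graph is built for export). A rough sketch of that structure, not the actual patch, with a stand-in network and an assumed use_tpu param:

import tensorflow as tf

def model_fn(features, labels, mode, params):
    # `features` arrives as the raw image tensor from the serving receiver.
    image = features
    # Stand-in for the real network; the original script builds a CNN here.
    logits = tf.layers.dense(tf.layers.flatten(image), units=10)

    # Handle PREDICT first, before anything that needs `labels`.
    if mode == tf.estimator.ModeKeys.PREDICT:
        predictions = {
            'class_ids': tf.argmax(logits, axis=1),
            'probabilities': tf.nn.softmax(logits),
        }
        return tf.contrib.tpu.TPUEstimatorSpec(mode=mode, predictions=predictions)

    # TRAIN/EVAL path, mirroring the loss from the question's traceback;
    # labels are assumed to be float multi-hot vectors with the same shape as logits.
    loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits))
    optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
    if params.get('use_tpu', False):
        # TPU training needs the optimizer wrapped in a CrossShardOptimizer.
        optimizer = tf.contrib.tpu.CrossShardOptimizer(optimizer)
    train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
    return tf.contrib.tpu.TPUEstimatorSpec(mode=mode, loss=loss, train_op=train_op)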