
When calling the following method:

losses = [tf.nn.sparse_softmax_cross_entropy_with_logits(logits, labels)
          for logits, labels in zip(logits_series,labels_series)]

I receive the following ValueError:

ValueError: Only call `sparse_softmax_cross_entropy_with_logits` with named arguments (labels=..., logits=..., ...)

The traceback points at this call:

tf.nn.sparse_softmax_cross_entropy_with_logits(logits, labels)

According to the source of nn_ops.py, I need to ensure that the logits and labels are initialised to something, e.g.:

def _ensure_xent_args(name, sentinel, labels, logits):
  # Make sure that all arguments were passed as named arguments.
  if sentinel is not None:
    raise ValueError("Only call %s with "
                     "named arguments (labels=..., logits=..., ...)" % name)
  if labels is None or logits is None:
    raise ValueError("Both labels and logits must be provided.")

In my code, logits = X and labels = Y.

What is the cause here? Am I supposed to initialise them to some value, such as the loss?

  • I found a similar issue here: https://stackoverflow.com/questions/45038024/tensorflow-valueerror-only-call-sparse-softmax-cross-entropy-with-logits-with – Glennismade Dec 20 '17 at 15:36
  • Do I have to explicitly tell TensorFlow what the logits and labels are? – Glennismade Dec 20 '17 at 15:49

1 Answer


The cause is that the first argument of tf.nn.sparse_softmax_cross_entropy_with_logits is _sentinel:

_sentinel: Used to prevent positional parameters. Internal, do not use.

This API encourages you to name your arguments, like this:

tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=labels)

... so that you don't accidentally pass logits to labels or vice versa.
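For intuition about what the op computes once the arguments are named correctly, here is a minimal NumPy sketch of sparse softmax cross-entropy (an illustration of the math only, not TensorFlow's actual implementation; the function name `sparse_xent` and the example values are made up for this demo). Each per-example loss is -log(softmax(logits)[label]):

```python
import numpy as np

def sparse_xent(labels, logits):
    """Per-example loss -log(softmax(logits)[label]), computed stably
    via log-sum-exp with max subtraction."""
    logits = np.asarray(logits, dtype=np.float64)
    labels = np.asarray(labels, dtype=np.int64)
    m = logits.max(axis=-1, keepdims=True)               # for numerical stability
    lse = np.log(np.exp(logits - m).sum(axis=-1)) + m.squeeze(-1)
    # pick out the logit of the true class for each row
    return lse - logits[np.arange(labels.shape[0]), labels]

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
labels = np.array([0, 1])
print(sparse_xent(labels=labels, logits=logits))  # ~[0.417, 0.220]
```

Note that `labels` here are integer class indices (not one-hot vectors and not floats), which is exactly what the sparse variant of the TF op expects.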

Maxim
  • Ahhh, awesome, fixed. Also, spotted I was passing a float to labels. My bad. Cheers for the help on this. Tensorflow is very new to me, so trying to learn amongst all this module shuffling that has happened between 0. and 1. versions. – Glennismade Dec 20 '17 at 17:08
  • 1
    No problem, Tensorflow isn't trivial. Don't hesitate to ask further questions when you get confused by something – Maxim Dec 20 '17 at 17:10
  • @Maxim excuse me, I'm facing this problem and the error still exists. I tried losses = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels) and get the same error. Can you help, please? – user1 Oct 11 '18 at 12:29