
Here is my code:

import tensorflow as tf

with tf.Session() as sess:
    y = tf.constant([0,0,1])
    x = tf.constant([0,1,0])
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
    sess.run()
    print(r.eval())

It generates the following error:

ValueError                                Traceback (most recent call last)
<ipython-input-10-28a8854a9457> in <module>()
      4     y = tf.constant([0,0,1])
      5     x = tf.constant([0,1,0])
----> 6     r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
      7     sess.run()
      8     print(r.eval())

~\AppData\Local\conda\conda\envs\tensorflow\lib\site-packages\tensorflow\python\ops\nn_ops.py in sparse_softmax_cross_entropy_with_logits(_sentinel, labels, logits, name)
   1687       raise ValueError("Rank mismatch: Rank of labels (received %s) should "
   1688                        "equal rank of logits minus 1 (received %s)." %
-> 1689                        (labels_static_shape.ndims, logits.get_shape().ndims))
   1690     # Check if no reshapes are required.
   1691     if logits.get_shape().ndims == 2:

ValueError: Rank mismatch: Rank of labels (received 1) should equal rank of logits minus 1 (received 1).

Could somebody help me to understand this error? It is fairly straightforward to compute the softmax and the cross-entropy manually.

Also, how would I use this function? I need to feed a batch (a 2-dimensional array) into it.
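
For reference, here is what I mean by computing it manually, as a NumPy sketch (the values and names are my own):

import numpy as np

logits = np.array([0.0, 1.0, 0.0])
label = 2  # index of the true class

# softmax: exponentiate, then normalize
probs = np.exp(logits) / np.sum(np.exp(logits))

# cross-entropy: negative log-probability of the true class
print(-np.log(probs[label]))  # ~1.5514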

UPDATE

I also tried:

import tensorflow as tf

with tf.Session() as sess:
    y = tf.constant([1])
    x = tf.constant([0,1,0])
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
    sess.run()
    print(r.eval())

and it generated the same error.

user1700890
    Not sure, but have you tried with rank 2 tensors? Softmax is usually employed in multi-class problems. – E_net4 Sep 27 '17 at 11:47

1 Answer


Fixed it for you. x needs to be a rank-2 (2-D) tensor, not a 1-D vector. sparse_softmax_cross_entropy_with_logits expects logits of shape [batch_size, num_classes] and labels as integer class indices of shape [batch_size], i.e. one rank lower than the logits, which is exactly what the rank-mismatch error is complaining about. The labels are indices, not one-hot vectors, and the logits must be floating point. (If you want to pass one-hot labels like [0, 0, 1], use tf.nn.softmax_cross_entropy_with_logits instead.)

import tensorflow as tf

with tf.Session() as sess:
    y = tf.constant([1])  # integer class index for one example, shape [1]
    # expand_dims turns the logits into shape [1, 3]: a batch of one
    x = tf.expand_dims(tf.constant([0.0, 1.0, 0.0]), 0)
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
    print(r.eval())
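
To feed a batch, make the logits a rank-2 float tensor of shape [batch_size, num_classes] and the labels a rank-1 integer tensor of shape [batch_size]; the labels are always one rank lower than the logits. A minimal sketch with made-up values:

import tensorflow as tf

with tf.Session() as sess:
    # batch of 2 examples with 3 classes each; logits must be float
    logits = tf.constant([[0.0, 1.0, 0.0],
                          [2.0, 0.5, 0.1]])
    # one integer class index per example, not a one-hot vector
    labels = tf.constant([1, 0])
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
    print(r.eval())  # one loss value per example
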
Aaron