
TensorFlow has a wonderful function:

tf.nn.softmax_cross_entropy_with_logits

Later I came across another function:

tf.nn.softmax_cross_entropy_with_logits_v2

What is the reason for this new function?

When I use the earlier function, TensorFlow warns:

Future major versions of TensorFlow will allow gradients to flow into the labels input on backprop by default.
See tf.nn.softmax_cross_entropy_with_logits_v2.

I don't understand what this actually means. The signatures of the two functions appear to be the same:

tf.nn.softmax_cross_entropy_with_logits_v2(
    _sentinel=None,
    labels=None,
    logits=None,
    dim=-1,
    name=None
)
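
To make the question concrete, here is a minimal sketch of how I tried to probe the difference, assuming TF 1.x graph mode (where both functions exist). The gradient checks are my own experiment, and the comments reflect what I expect the warning to mean:

    import tensorflow as tf

    logits = tf.constant([[2.0, 1.0, 0.1]])
    labels = tf.constant([[0.7, 0.2, 0.1]])  # soft labels, not one-hot

    loss_v1 = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
    loss_v2 = tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=logits)

    # Gradient of each loss with respect to the *labels*:
    print(tf.gradients(loss_v1, labels))  # [None]: v1 apparently never backprops into labels
    print(tf.gradients(loss_v2, labels))  # [<Tensor>]: v2 lets gradients flow into labels

    # Wrapping labels in tf.stop_gradient should make v2 behave like v1:
    loss_v2_stop = tf.nn.softmax_cross_entropy_with_logits_v2(
        labels=tf.stop_gradient(labels), logits=logits)
    print(tf.gradients(loss_v2_stop, labels))  # [None] again, as far as I can tell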

The documentation is kind of over my head (and it is nearly identical for both functions). Can anyone give a better explanation?

  • Thanks @mikkola, yes it's a duplicate. Should I delete the question, or just write an answer based on those posts? – Maruf Mar 26 '18 at 08:53
  • No need to make any changes; once it's marked as a duplicate, that's sufficient because there's now a link to the original question. – David Parks Mar 26 '18 at 16:09
