TensorFlow has a wonderful function,
tf.nn.softmax_cross_entropy_with_logits
Later I came across another function,
tf.nn.softmax_cross_entropy_with_logits_v2
What is the reason for this new function?
When I use the earlier function, TensorFlow warns:
Future major versions of TensorFlow will allow gradients to flow into the labels input on backprop by default.
See tf.nn.softmax_cross_entropy_with_logits_v2.
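For context, here is a minimal snippet that triggers the warning for me (TensorFlow 1.x graph mode; the toy tensors are purely illustrative):

import tensorflow as tf

# toy one-hot labels and raw scores, just for illustration
labels = tf.constant([[0.0, 1.0], [1.0, 0.0]])
logits = tf.constant([[2.0, 0.5], [0.1, 1.5]])

# constructing this op prints the deprecation warning quoted above
loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)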
I don't understand what that actually means, especially since the two functions' signatures appear to be identical:
tf.nn.softmax_cross_entropy_with_logits_v2(
    _sentinel=None,
    labels=None,
    logits=None,
    dim=-1,
    name=None
)
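For what it's worth, the v2 version accepts the same call and does not print the warning (same toy tensors as above; the tf.stop_gradient line is only my guess at what the warning is hinting at):

# no warning is printed for the v2 call
loss_v2 = tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=logits)

# if I read the warning correctly, stopping gradients on the labels should
# reproduce the old behaviour, but I'm not sure that's the intent
loss_old = tf.nn.softmax_cross_entropy_with_logits_v2(
    labels=tf.stop_gradient(labels), logits=logits)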
The documentation is somewhat over my head (and the two functions' docs read almost identically). Can anyone give a clearer explanation?