My current code using sparse_softmax_cross_entropy works fine:
loss_normal = (
    tf.reduce_mean(tf.losses
                   .sparse_softmax_cross_entropy(labels=labels,
                                                 logits=logits,
                                                 weights=class_weights))
)
However, when I try to use hinge_loss:
loss_normal = (
    tf.reduce_mean(tf.losses
                   .hinge_loss(labels=labels,
                               logits=logits,
                               weights=class_weights))
)
It reports an error:
ValueError: Shapes (1024, 2) and (1024,) are incompatible
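To spell out the shapes involved (a NumPy sketch of my understanding, with the batch size 1024 and 2 classes taken from the error message): sparse_softmax_cross_entropy pairs integer class labels of shape (batch,) with logits of shape (batch, num_classes), while hinge_loss compares labels and logits element-wise and therefore asserts that their shapes match.

```python
import numpy as np

batch, num_classes = 1024, 2  # shapes taken from the error message

# sparse_softmax_cross_entropy convention: one integer class ID per example
labels = np.random.randint(0, num_classes, size=(batch,))
logits = np.random.randn(batch, num_classes)

# The sparse loss expects these two shapes to differ:
assert labels.shape == (batch,)                 # (1024,)
assert logits.shape == (batch, num_classes)     # (1024, 2)

# hinge_loss instead requires labels and logits to be shape-compatible,
# which fails both before and after slicing off one logits column:
assert labels.shape != logits.shape             # (1024,) vs (1024, 2)
assert labels.shape != logits[:, 1:].shape      # (1024,) vs (1024, 1)
```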
The error seems to originate from this function in the losses_impl.py file:
with ops.name_scope(scope, "hinge_loss", (logits, labels)) as scope:
    ...
    logits.get_shape().assert_is_compatible_with(labels.get_shape())
    ...
I modified my code as below to extract just one column of the logits tensor:
loss_normal = (
    tf.reduce_mean(tf.losses
                   .hinge_loss(labels=labels,
                               logits=logits[:, 1:],
                               weights=class_weights))
)
But it still reports a similar error:
ValueError: Shapes (1024, 1) and (1024,) are incompatible.
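My guess is that the check would pass if I also reshaped the labels to (1024, 1) to line up with the sliced logits. A NumPy sketch of what I believe the shapes and the loss would look like (assuming, per the tf.losses.hinge_loss docs as I read them, that labels are in {0, 1} and get mapped internally to {-1, +1}):

```python
import numpy as np

batch = 1024
labels = np.random.randint(0, 2, size=(batch,)).astype(np.float32)
logits = np.random.randn(batch, 2).astype(np.float32)

# Reshape labels from (1024,) to (1024, 1) so they match the
# sliced logits column of shape (1024, 1).
labels_2d = labels.reshape(-1, 1)
sliced = logits[:, 1:]
assert labels_2d.shape == sliced.shape == (batch, 1)

# Hinge loss with {0, 1} labels mapped to {-1, +1}:
# max(0, 1 - (2*y - 1) * logit), element-wise over the batch.
hinge = np.maximum(0.0, 1.0 - (2.0 * labels_2d - 1.0) * sliced)
assert hinge.shape == (batch, 1)
```

I have not verified this against my full pipeline, so I may be misreading what the shape check actually requires.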
Can someone please help point out why my code works fine with the sparse_softmax_cross_entropy loss but not with hinge_loss?