predicted_scores = tf.constant([
    [0.32,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5],
    [0.31,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5],
    [0.31,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5],
    [0.3111,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5],
    [0.33423,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5],
    [0.33243,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5],
    [0.334,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5],
    [0.32,0.2,0.15,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5]
])  # predicted_scores: (N, 8, n_classes)

true_classes = tf.constant([
    [ 5,  5,  0, 10,  0,  0, 10,  5]
])  # true_classes: (N, 8)

If I have predicted_scores and true_classes like this, in PyTorch I used

conf_loss_all = F.cross_entropy(predicted_scores.view(-1, n_classes), true_classes.view(-1), reduction='none')  # (N * 8732)

to find the cross entropy.
How should I find the cross entropy with TensorFlow?

Berriel
user6507246

1 Answer

You can use the SparseCategoricalCrossentropy loss.

scce = tf.keras.losses.SparseCategoricalCrossentropy()
scce(true_classes[0], predicted_scores)
<tf.Tensor: shape=(), dtype=float32, numpy=2.8711867>
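Note that by default SparseCategoricalCrossentropy expects probabilities (from_logits=False). If your predicted_scores are raw, unnormalized logits, pass from_logits=True; the result then matches averaging the lower-level tf.nn.sparse_softmax_cross_entropy_with_logits op over the batch. A minimal sketch with made-up scores of the same shape as in the question:

```python
import tensorflow as tf

# Hypothetical scores shaped (8, n_classes) and one row of class indices,
# mirroring predicted_scores.view(-1, n_classes) / true_classes.view(-1).
predicted_scores = tf.random.uniform((8, 18))
true_classes = tf.constant([5, 5, 0, 10, 0, 0, 10, 5])

# from_logits=True applies the softmax internally.
scce = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
loss = scce(true_classes, predicted_scores)

# Same value via the lower-level op, averaged over the 8 elements.
per_elem = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=true_classes, logits=predicted_scores)
manual = tf.reduce_mean(per_elem)

assert abs(float(loss) - float(manual)) < 1e-5
```

To keep the per-element losses (as the reduction='none' PyTorch call does), pass reduction=tf.keras.losses.Reduction.NONE to the loss constructor.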
Allen Qin