
If y is the label and hat y is my prediction, would the following formula for cross-entropy with C possible classes be right:

-sum_{c=1}^{C} [ y_c * log(hat(y_c)) + (1 - y_c) * log(1 - hat(y_c)) ]

In the case of binary cross-entropy, can I just remove the sum over C, or say C = 1?

For calculating the loss over the whole dataset or a mini-batch of size M, I just add (1/M) * sum over m in front of the sum over c, right?

Thanks!

user3352632

1 Answer


This is the formula for binary cross-entropy, and C is not the number of classes; C is the number of examples in a mini-batch. To take the average loss instead of the sum, just add 1/C at the beginning of the formula. Both the sum and the average can be used while training, which is why you sometimes see 1/C in the formulas and sometimes not. Multi-class cross-entropy looks different: -sum_i y_i * log(hat(y_i)), where the sum runs over the classes.
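
To make the sum-vs-average point concrete, here is a minimal NumPy sketch (the labels and predicted probabilities are made up for illustration) that computes binary cross-entropy over a mini-batch both as a sum and as a 1/C average:

    import numpy as np

    # Hypothetical mini-batch of C = 4 examples (values made up for illustration).
    y = np.array([1.0, 0.0, 1.0, 0.0])        # ground-truth labels
    y_hat = np.array([0.9, 0.2, 0.7, 0.4])    # predicted probabilities in (0, 1)

    # Binary cross-entropy per example.
    per_example = -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

    bce_sum = per_example.sum()     # sum over the C examples
    bce_mean = per_example.mean()   # same thing with the 1/C factor in front

    print(bce_sum, bce_mean)

Both quantities lead to the same gradient direction; the 1/C version just rescales the loss by the batch size.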

There is a slight difference in the logic between binary and multi-class cross-entropy. Binary cross-entropy requires a single value in the 0...1 range for each example; that's why there is the (1 - y) term on the right for class 0. On the other hand, multi-class cross-entropy requires a vector of values per example, in which the target class is expected to have a higher value than the rest.
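
A tiny sketch of that difference in inputs (numbers made up): in the binary case the prediction is one scalar, and when y = 0 only the (1 - y) * log(1 - hat_y) term is active; in the multi-class case each example gets a whole vector of class scores:

    import numpy as np

    # Binary case: the label is 0 or 1 and the prediction is a single probability.
    y, y_hat = 0.0, 0.1                        # made-up example with true class 0
    loss = -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
    # y = 0, so the left term vanishes and (1 - y) * log(1 - y_hat) does the work.

    # Multi-class case: each example is scored with a vector over all classes,
    # and the target class (here class 2) should receive the highest value.
    y_hat_vec = np.array([0.1, 0.2, 0.6, 0.1])  # made-up predicted distribution
    print(loss, y_hat_vec.argmax())             # argmax picks the predicted class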

  • Thanks! My formula is incorrect! I think this is the most precise formulation of cross-entropy that contains the sum over classes: https://cdn.analyticsvidhya.com/wp-content/uploads/2021/03/Screenshot-from-2021-03-03-11-43-42.png – user3352632 Jul 14 '22 at 13:32
  • y_i times log(hat(y_i)) is a dot product, right? Is y_i a vector whose length is the number of classes? – user3352632 Jul 14 '22 at 13:44
  • No, it is not a dot product; it is a multiplication of 2 scalar values. The formula at the link is good, but take into account that the ground-truth target is usually a one-hot encoded vector, so the sum over classes equals **log(hat(y_i))** of the target class; the other classes are multiplied by zero in this sum. This is probably the reason why the sum over classes is sometimes skipped in cross-entropy formulas (see the sketch after these comments). – Serhii Maksymenko Jul 14 '22 at 15:10
  • Are you talking about your formula or the formula in my link? – user3352632 Jul 15 '22 at 00:44
  • Sorry for the confusion, I meant the formula in your link – Serhii Maksymenko Jul 15 '22 at 07:24
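
As a quick check of the one-hot point in the comments above, here is a minimal NumPy sketch (probabilities made up) showing that the sum over classes collapses to the log-probability of the target class:

    import numpy as np

    # Made-up predicted class probabilities for a single example (3 classes).
    y_hat = np.array([0.2, 0.7, 0.1])

    # One-hot encoded ground truth: the target is class 1.
    y = np.array([0.0, 1.0, 0.0])

    # Full sum over classes vs. picking out the target class directly.
    ce_full = -np.sum(y * np.log(y_hat))   # -(0*log 0.2 + 1*log 0.7 + 0*log 0.1)
    ce_target_only = -np.log(y_hat[1])     # just -log(hat(y)) of the target class

    assert np.isclose(ce_full, ce_target_only)
    print(ce_full, ce_target_only)         # both are about 0.3567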