
Here is the formula I built, my equation for weighted binary cross-entropy loss:

Where:

alpha(i) = #Negative samples / Total samples if y(i)=1

alpha(i) = #Positive samples / Total samples if y(i)=0

Is it correct? I cannot find any similar formula on the internet. Thank you!
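A minimal NumPy sketch of the definition above (the mean over samples and the name `p` for the predicted probabilities are assumptions; the question leaves both implicit):

```python
import numpy as np

def weighted_bce(y, p, eps=1e-12):
    """Binary cross-entropy with the per-sample alpha(i) defined above.

    Assumptions: the per-sample terms are averaged, and `p` holds the
    predicted probabilities for the positive class.
    """
    y = np.asarray(y, dtype=float)
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0 - eps)
    n = len(y)
    n_pos = y.sum()
    n_neg = n - n_pos
    # alpha(i) = #negatives/total if y(i) = 1, #positives/total if y(i) = 0
    alpha = np.where(y == 1, n_neg / n, n_pos / n)
    return -np.mean(alpha * (y * np.log(p) + (1 - y) * np.log(1 - p)))
```

With balanced classes every alpha(i) is 0.5, so the result is exactly half the unweighted binary cross-entropy.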

– inginging
1 Answer


The alpha(i) value inside the summation seems misplaced. The binary loss is calculated for each sample and then summed to get the total binary log loss (binary cross-entropy). But alpha(i) does not belong to an individual sample; it is an aggregate property. Hence the formula is not logical. Refer to https://www.analyticsvidhya.com/blog/2021/03/binary-cross-entropy-log-loss-for-binary-classification/

  • Sorry, I can't understand your answer. I agree that alpha is an aggregate property, but you use it to compute the total loss. Since every time you have a positive label you should multiply it by the number of negative samples divided by the total number of samples, I came up with this formula. Could you explain again? – inginging Feb 06 '22 at 13:45
  • I get your point; I missed the relevance of the term "weighted" in your question. The formula looks fine in this case. Still, since alpha is not sample-dependent, why don't we keep the weights outside the summation? It avoids confusion. – Vighnesh Sablok Feb 06 '22 at 14:02
  • I agree it would avoid confusion, but I am not completely sure how to write the formula in that case, since alpha(i) depends on the value of y(i). – inginging Feb 06 '22 at 14:21
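Following up on the last comment: since alpha(i) takes only two values, the sum can be split by class and the two weights pulled outside the class-wise summations. A hypothetical sketch (function names are my own) checking that the split form matches the per-sample form:

```python
import numpy as np

def wbce_inside(y, p, eps=1e-12):
    """Per-sample alpha(i) inside the sum, as in the question's formula."""
    y = np.asarray(y, dtype=float)
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0 - eps)
    # alpha(i) = #neg/total for positives, #pos/total for negatives
    alpha = np.where(y == 1, (y == 0).mean(), (y == 1).mean())
    return -np.mean(alpha * (y * np.log(p) + (1 - y) * np.log(1 - p)))

def wbce_outside(y, p, eps=1e-12):
    """Same quantity with the two alpha values factored out of the sums."""
    y = np.asarray(y, dtype=float)
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0 - eps)
    a_pos = (y == 0).mean()  # weight for positive samples: #neg / total
    a_neg = (y == 1).mean()  # weight for negative samples: #pos / total
    pos_loss = -np.log(p[y == 1]).sum()        # positives contribute -log p
    neg_loss = -np.log(1 - p[y == 0]).sum()    # negatives contribute -log(1-p)
    return (a_pos * pos_loss + a_neg * neg_loss) / len(y)
```

The two functions compute the same number; the split form just makes it explicit that only two distinct weights exist, which resolves the notational question about alpha(i) depending on y(i).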