What is the general logic behind choosing the weight for calculating weighted sigmoid cross-entropy loss, or for any weighted loss in case of an imbalanced dataset? The problem domain is based on vision/image classification.
1 Answer
A good reference is the CVPR '19 paper "Class-Balanced Loss Based on Effective Number of Samples". The authors use a re-weighting scheme based on the effective number of samples for each class to re-balance the loss and handle inter-class imbalance. You could also refer to a Medium article explaining the same work.
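
For illustration, here is a minimal sketch (assuming PyTorch, with made-up per-class counts and a placeholder β) of how the effective-number re-weighting from that paper can be plugged into a weighted sigmoid cross-entropy loss; the counts, β, and tensor shapes below are illustrative, not values from the paper.

```python
import torch
import torch.nn.functional as F

# Hypothetical per-class sample counts for a 3-class, long-tailed dataset.
samples_per_class = torch.tensor([5000.0, 500.0, 50.0])

# Effective number of samples (Cui et al., CVPR 2019):
#   E_n = (1 - beta^n) / (1 - beta), with class weight proportional to 1 / E_n.
beta = 0.999
effective_num = (1.0 - torch.pow(beta, samples_per_class)) / (1.0 - beta)
weights = 1.0 / effective_num
# Normalize so the weights sum to the number of classes.
weights = weights / weights.sum() * samples_per_class.numel()

# Apply the class weights in a sigmoid (binary) cross-entropy over a toy batch.
logits = torch.randn(4, 3)                      # 4 samples, 3 classes
targets = torch.randint(0, 2, (4, 3)).float()   # multi-hot targets
loss = F.binary_cross_entropy_with_logits(
    logits, targets, weight=weights.expand_as(logits)
)
print(weights, loss.item())
```

As the paper discusses, β = 0 corresponds to no re-weighting, while β → 1 makes the weights approach inverse class frequency, so β interpolates between those two extremes.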

Balraj Ashwath
A very good starting point, thanks. But why is the weighting norm set to the inverse of the class support or the inverse of the square root of the class support? – Solaiman Salvi Jan 04 '20 at 08:50