
So I have a neural network using cross-entropy (CE) as the loss function for a binary classification problem. I am using AUC as a validation metric, and when I plot the training and validation error (i.e. the CE), the training error falls and the validation error rises, BUT the validation AUC rises as well.

I struggle to see how that is possible. Shouldn't the AUC rise when the CE falls? If not, how is this possible?

EDIT:

Here is the CE function:

import torch

def cross_entropy(ys, ts):
    # small epsilon avoids log(0) when a prediction saturates at 0 or 1
    eps = 1e-5
    return -torch.sum(ts * torch.log(ys + eps) + (1 - ts) * torch.log(1 - ys + eps))
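
For reference, here is a toy example (numbers made up by me, not from my actual run) showing that the behaviour is at least mathematically possible: AUC depends only on how the scores rank the examples, while CE also punishes overconfident wrong probabilities, so the two can move in the same direction. It assumes the cross_entropy function above and sklearn's roc_auc_score.

from sklearn.metrics import roc_auc_score
import torch

ts = torch.tensor([0., 0., 1., 1.])  # two negatives, two positives

# "earlier epoch": one negative (0.60) outranks a positive -> AUC = 0.75
ys_early = torch.tensor([0.40, 0.60, 0.55, 0.65])
# "later epoch": ranking is now perfect (AUC = 1.0), but the second negative
# gets a very overconfident 0.90, which CE punishes heavily
ys_late = torch.tensor([0.35, 0.90, 0.92, 0.95])

for name, ys in [("early", ys_early), ("late", ys_late)]:
    print(name, "CE =", round(cross_entropy(ys, ts).item(), 2),
          "AUC =", roc_auc_score(ts.numpy(), ys.numpy()))
# early CE = 2.46 AUC = 0.75
# late  CE = 2.87 AUC = 1.0   -> CE and AUC rise together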
