I have a feed-forward neural network and a binary classification problem.
I define the loss function as
import torch

def cross_entropy(ys, ts):
    # Binary cross-entropy summed over the batch; the 1e-5 term avoids log(0).
    return -torch.sum(ts * torch.log(ys + 1e-5) + (1 - ts) * torch.log(1 - ys + 1e-5))
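As a sanity check (my own toy numbers, assuming ys are sigmoid outputs in (0, 1) and ts are float 0/1 targets of the same shape), this should closely match PyTorch's built-in binary cross-entropy with sum reduction, up to the 1e-5 smoothing term:

import torch
import torch.nn.functional as F

ys = torch.tensor([0.9, 0.2, 0.7])  # hypothetical sigmoid outputs
ts = torch.tensor([1.0, 0.0, 1.0])  # hypothetical 0/1 targets
print(cross_entropy(ys, ts).item())                            # ~0.685
print(F.binary_cross_entropy(ys, ts, reduction='sum').item())  # ~0.685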
The AUC is computed as
from sklearn.metrics import roc_auc_score

def auc(ys, ts):
    # roc_auc_score expects NumPy arrays: true labels first, then the scores for class 1.
    ts = ts.detach().numpy()
    ys = ys.detach().numpy()
    return roc_auc_score(ts, ys)
where ts are the targets and ys are the network outputs (predicted probabilities for class 1).
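For concreteness, a made-up example of the inputs I'm assuming here (1-D tensors of length N, with ys holding probabilities for class 1):

import torch

ts = torch.tensor([1., 0., 1., 0.])      # 0/1 labels
ys = torch.tensor([0.8, 0.3, 0.6, 0.4])  # predicted probabilities for class 1
print(auc(ys, ts))  # 1.0, since every positive is scored above every negative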
For some reason, when I train, the cross-entropy and the AUC both rise. I would have expected one of them to fall while the other grows.
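To make the observation concrete, here is a hypothetical example (reusing the two functions above, not my actual data) where both quantities increase from one set of predictions to the next; since AUC only depends on how the scores are ranked, while the cross-entropy also penalises how confident each score is, a single very confident mistake can raise the loss even though the overall ranking improves:

import torch

ts = torch.tensor([1., 1., 1., 1., 0., 0., 0., 0.])
# "Earlier epoch": poorly separated scores, but nothing is extremely confident.
ys_early = torch.tensor([0.60, 0.55, 0.45, 0.50, 0.52, 0.48, 0.53, 0.47])
# "Later epoch": better ranking overall, but one negative is scored 0.999.
ys_late = torch.tensor([0.90, 0.85, 0.80, 0.75, 0.999, 0.20, 0.15, 0.10])

print(cross_entropy(ys_early, ts).item(), auc(ys_early, ts))  # ~5.4, AUC 0.625
print(cross_entropy(ys_late, ts).item(), auc(ys_late, ts))    # ~8.2, AUC 0.75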