The negative log-likelihood for logistic regression is given by […] This is also called the cross-entropy error function.
— Page 246, Machine Learning: A Probabilistic Perspective, 2012
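For reference, I take the elided formula to be the usual binary negative log-likelihood, written here in my own notation rather than the book's:

$$\mathrm{NLL} = -\sum_{i=1}^{N}\big[\, y_i \log \hat{y}_i + (1 - y_i)\log(1 - \hat{y}_i) \,\big]$$

where $\hat{y}_i$ is the predicted probability that example $i$ belongs to class 1.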
So I tried to check this numerically, and the two computations give very different answers:
from sklearn.metrics import log_loss

y_true = [0, 0, 0, 0]
y_pred = [0.5, 0.5, 0.5, 0.5]

# sklearn's log loss, with the label set passed explicitly
log_loss(y_true, y_pred, labels=[0, 1])  # 0.6931471805599453
from math import log2

def cross_entropy(p, q):
    # H(p, q) = -sum_i p[i] * log2(q[i]), treating y_true as p and y_pred as q
    return -sum(p[i] * log2(q[i]) for i in range(len(p)))

cross_entropy(y_true, y_pred)  # -0.0
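For comparison, averaging the per-sample binary cross-entropy with the natural log (which is what I understand log_loss to be doing; mean_binary_cross_entropy below is just a helper I wrote for this question) reproduces sklearn's number:

from math import log

def mean_binary_cross_entropy(y_true, y_pred):
    # mean over samples of -(y*log(p) + (1-y)*log(1-p))
    return -sum(y * log(p) + (1 - y) * log(1 - p)
                for y, p in zip(y_true, y_pred)) / len(y_true)

mean_binary_cross_entropy(y_true, y_pred)  # 0.6931471805599453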
Why do log_loss and my cross_entropy disagree so badly (0.693 vs -0.0)?