
What is the logloss of a binary classifier if we choose a constant prediction f=0.5?

Is this derivation correct?

logloss = (1/N) * sum(-y*log(f) - (1-y)*log(1-f))
logloss(y=0) = -(1-0)*log(1-f) = -log(1-f) = -log(0.5)
logloss(y=1) = -1*log(f) = -log(f) = -log(0.5)
// That is, whether the ground truth y is 0 or 1, the per-sample cost is -log(0.5)
logloss = (1/N) * N * (-log(0.5)) = -log(0.5) = 0.69315
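For reference, a quick NumPy check reproduces the 0.69315 figure for any mix of 0/1 labels (a minimal sketch; the `logloss` helper below is hand-rolled for this check, not taken from the quiz):

```python
import numpy as np

def logloss(y, f):
    # average binary cross-entropy over N samples
    y = np.asarray(y, dtype=float)
    f = np.asarray(f, dtype=float)
    return np.mean(-y * np.log(f) - (1 - y) * np.log(1 - f))

y = np.random.randint(0, 2, size=1000)    # arbitrary 0/1 labels
print(logloss(y, np.full(y.shape, 0.5)))  # 0.6931471805599453 == -log(0.5)
```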
  • This is a duplicate of https://stats.stackexchange.com/questions/214177/scaling-up-random-guess-benchmark-of-log-loss, on cross validated stack exchange. – Him Apr 23 '18 at 13:47
  • But, yes, this is correct. – Him Apr 23 '18 at 13:47
  • Thanks Scott, this is a non-graded question on a quiz from Coursera (How to win a data science competition). The Coursera quiz grader says that 0.69315 is not a correct answer and I do not understand why. – Seguy Apr 24 '18 at 07:33

0 Answers