
Attribute GradientBoostingClassifier.loss_ has been deprecated.

What should we use instead of, e.g.:

from sklearn.ensemble import GradientBoostingClassifier

GBC = GradientBoostingClassifier(...)
GBC.fit(X, y)
for y_pred in GBC.staged_decision_function(X_test):
    print(GBC.loss_(y_test, y_pred))

NB. The loss argument to GradientBoostingClassifier defaults to log_loss, but sklearn.metrics.log_loss and GBC.loss_ return dramatically different values (the former is mostly useless here, since it expects probabilities rather than the raw scores that staged_decision_function yields). In fact, loss_ is an instance of sklearn.ensemble._gb_losses.BinomialDeviance, an internal class.


1 Answer


When loss is the default (log_loss), one can recover deviance like this:

import math

import numpy as np
import sklearn.metrics
from sklearn.ensemble import GradientBoostingClassifier

def sigmoid(z):
    "Convert log odds to probabilities."
    # https://stackoverflow.com/q/60746851/850781
    return 1 / (1 + np.exp(-z))

GBC = GradientBoostingClassifier(...)
GBC.fit(X, y)
for y_pred in GBC.staged_decision_function(X_test):
    # BinomialDeviance is twice the mean negative log-likelihood,
    # hence the division by 2.
    assert math.isclose(GBC.loss_(y_test, y_pred) / 2,
                        sklearn.metrics.log_loss(y_test, sigmoid(y_pred)))

However, the arbitrary loss case is still missing.
