
In the past, XGBoost did not allow the absolute error to be used as an objective function, since it is not differentiable at zero and its Hessian is zero everywhere else. However, it does allow it now (https://xgboost.readthedocs.io/en/stable/parameter.html).
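
For reference, the built-in objective can be used directly (a minimal sketch; reg:absoluteerror is the objective name listed on that page, and the random X, y here are placeholder training data):

import numpy as np
import xgboost as xgb

# Placeholder data: 100 samples, 5 features
X = np.random.rand(100, 5)
y = np.random.rand(100)

dtrain = xgb.DMatrix(X, label=y)
booster = xgb.train({"objective": "reg:absoluteerror"}, dtrain, num_boost_round=100)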

How does it do this, and how can I define non-differentiable custom objective functions?

I've tried to implement it simply as:

import numpy as np

def absolute_error(predt, dtrain):
    y_true = dtrain.get_label()
    errors = y_true - predt
    grad = -1.0 * np.sign(errors)  # gradient: negative of the sign of the error
    hess = np.zeros_like(y_true)   # Hessian: identically zero for absolute error
    return grad, hess

But it obviously does not work.
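
For what it's worth, the one workaround I've seen suggested is to fake a constant positive Hessian instead of returning zeros. This is only a sketch of that trick, not what XGBoost does internally for its built-in objective; absolute_error_ones and the random X, y data are placeholders of my own:

import numpy as np
import xgboost as xgb

def absolute_error_ones(predt, dtrain):
    y_true = dtrain.get_label()
    grad = np.sign(predt - y_true)  # same gradient as above (sign of the residual)
    hess = np.ones_like(y_true)     # constant 1 instead of 0 to keep the Newton step finite
    return grad, hess

# Placeholder data
X = np.random.rand(100, 5)
y = np.random.rand(100)
dtrain = xgb.DMatrix(X, label=y)

booster = xgb.train({"max_depth": 3, "eta": 0.1}, dtrain,
                    num_boost_round=100, obj=absolute_error_ones)

With hess = 1 for every row, the leaf weight -sum(grad) / (sum(hess) + lambda) reduces to a regularized mean of the gradients, i.e., a plain gradient step per leaf. But I don't know whether this is what XGBoost actually does for reg:absoluteerror.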
