According to the documentation (http://xgboost.readthedocs.io/en/latest/python/python_api.html), a custom objective function should have the signature
```
objective(y_true, y_pred) -> grad, hess
```

where

```
hess: array_like of shape [n_samples]
    The value of the second derivative for each sample point
```
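For concreteness, here is a minimal sketch of a custom objective matching that signature, assuming the scikit-learn wrapper (`XGBRegressor`), which accepts such a callable via its `objective` parameter; the function name and the `X_train`/`y_train` data are just placeholders:

```python
import numpy as np
import xgboost as xgb

def squared_error(y_true, y_pred):
    # Per-sample derivatives of the loss 0.5 * (y_pred - y_true)**2
    grad = y_pred - y_true         # first derivative, shape [n_samples]
    hess = np.ones_like(y_pred)    # second derivative, shape [n_samples], not [n_samples, n_samples]
    return grad, hess

# Illustrative usage; X_train and y_train are placeholders
model = xgb.XGBRegressor(objective=squared_error)
model.fit(X_train, y_train)
```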
But if our loss function depends on N variables, the full matrix of second derivatives is NxN, while our hess has shape Nx1. Should we exclude the "cross-variable" derivatives, i.e. pass only the diagonal of the Hessian? Or is something else intended?