Making sure I am getting this right:
If we use sklearn.metrics.log_loss standalone, i.e. log_loss(y_true, y_pred), it returns a positive score -- the smaller the score, the better the performance.
However, if we use 'neg_log_loss' as a scoring scheme, as in cross_val_score, the score is negative -- the bigger (closer to zero) the score, the better the performance.
And this is because the scorer API is built to follow a single convention -- higher is always better -- so the usual log_loss is negated to match that convention, and it is negated solely for that purpose. Is this understanding correct?
[Background: I got positive scores from metrics.log_loss and negative scores from 'neg_log_loss', and both refer to the same documentation page.]
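For what it's worth, here is a small sketch of what I ran to see the two signs side by side (using the iris dataset and LogisticRegression purely as a convenient example; the specific model and data are my choice, not anything prescribed by the docs):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Standalone metric: positive value, lower is better.
raw = log_loss(y, clf.predict_proba(X))

# Scorer used by cross_val_score: negated, so higher (closer to 0) is better.
scores = cross_val_score(clf, X, y, scoring='neg_log_loss', cv=3)

print(raw > 0)                      # the raw metric is positive
print(all(s < 0 for s in scores))   # the scorer's values are negative
```

So the same underlying quantity is reported with opposite signs depending on whether it is used as a loss (metric) or as a score (scorer).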