I have an XGBoost model with the `binary:logistic` objective, trained in Python with the following hyperparameters (obtained via grid search):

```
{'gamma': 0, 'learning_rate': 0.1, 'max_depth': 3, 'min_child_weight': 1, 'n_estimators': 125}
```
This is the distribution of my observed target values over the support:
When I plot my predicted values against my observations, this is what I get:
Why does the model not predict any values in the ranges [0, 0.05] and [0.8, 1], not even on the training set? What can be done to improve the model? I suspect it is related to the hyperparameter tuning, but I can't find a solution.
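For context, my understanding is that with `binary:logistic` the predicted probability is the sigmoid of the summed tree outputs (the raw margin), so if the learned margins stay in a narrow band, the probabilities can never get close to 0 or 1. A minimal sketch of that relationship (the margin bounds here are illustrative, not taken from my model):

```python
import math

def sigmoid(x):
    """Map a raw boosting margin to a probability, as binary:logistic does."""
    return 1.0 / (1.0 + math.exp(-x))

# If the summed tree outputs never leave, say, [-3.0, 1.4], then the
# predicted probabilities are confined to roughly [0.047, 0.80]:
lo = sigmoid(-3.0)  # ~0.047 -- model can never predict below this
hi = sigmoid(1.4)   # ~0.802 -- model can never predict above this
print(lo, hi)
```

So my question may reduce to: why do the fitted margins stay in such a narrow band, and which hyperparameters (e.g. `n_estimators`, `learning_rate`, `max_depth`) would let them grow?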