I used XGBClassifier to fit my dataset and got the feature scores using the code below:
feature_score = clf.get_booster().get_score()
The fscores I got are all greater than 1, but how is that possible?
It is not the F1 score that XGBoost is using to assess feature importance. The F1 score conveys the balance between precision and recall and lies in [0, 1]. What get_score() returns by default is a raw feature score (importance_type="weight"): the number of times each feature is used to split the data across all trees in the ensemble. Because a feature can be split on many times, even within a single tree, these counts are typically well above 1.
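A minimal sketch of the difference, using xgboost's sklearn wrapper; the toy dataset and hyperparameters here are made up purely for illustration:

```python
import numpy as np
from sklearn.datasets import make_classification
from xgboost import XGBClassifier

# Toy dataset, just to have something to fit
X, y = make_classification(n_samples=500, n_features=5, random_state=0)

clf = XGBClassifier(n_estimators=50, max_depth=3)
clf.fit(X, y)

booster = clf.get_booster()

# Default importance_type is "weight": the number of splits that use each
# feature across the whole ensemble. These are raw counts, so values well
# above 1 are expected.
raw_counts = booster.get_score(importance_type="weight")
print(raw_counts)

# If you want importances on a [0, 1] scale, normalize the counts yourself
total = sum(raw_counts.values())
normalized = {feat: count / total for feat, count in raw_counts.items()}
print(normalized)

# The sklearn-style attribute is already normalized to sum to 1
# (note: its default importance_type may be "gain" rather than "weight",
# depending on your xgboost version)
print(clf.feature_importances_)
```

If you want a [0, 1]-scaled importance rather than a count, either normalize the get_score() output as above or pass a different importance_type such as "gain".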