I have a multiclass classification problem and I extracted feature importances based on impurity decrease. I compared a decision tree and an AdaBoost classifier and observed that a feature ranked at the top by the decision tree has a much lower importance according to AdaBoost. Is that normal behavior? Thanks
Yes, it is normal behavior. Feature importance assigns a score to each input feature of a model, but each model computes that score in a (slightly) different way. For example, a linear regression looks at linear relationships: a feature with a perfect linear relationship to the target will get a high importance, while a feature with a non-linear relationship may not improve the fit and will therefore receive a lower score. In your case, the single decision tree sums the impurity decrease of a feature within one deep tree, whereas AdaBoost averages importances over many shallow weak learners fitted on reweighted samples, so the two rankings can legitimately disagree.
There is some research on the differences between feature importance measures; one example is: https://link.springer.com/article/10.1007/s42452-021-04148-9
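You can see this directly with scikit-learn by comparing the importances of the two models on the same multiclass data. A minimal sketch (the wine dataset and default parameters are just illustrative, not taken from your setup):

```python
from sklearn.datasets import load_wine
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import AdaBoostClassifier

X, y = load_wine(return_X_y=True)  # 3-class problem, 13 features

tree = DecisionTreeClassifier(random_state=0).fit(X, y)
boost = AdaBoostClassifier(random_state=0).fit(X, y)  # shallow stumps by default

# Both expose impurity-based feature_importances_, but the tree sums impurity
# decrease within one deep tree while AdaBoost averages over many weak learners,
# so the top-ranked features can differ.
for name, model in [("DecisionTree", tree), ("AdaBoost", boost)]:
    ranking = sorted(enumerate(model.feature_importances_),
                     key=lambda t: t[1], reverse=True)
    print(name, [(i, round(v, 3)) for i, v in ranking[:5]])
```

Printing the top few features for each model should show the kind of rank disagreement you describe.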

Kylian