I'm looking at the Mallet source code, and it seems that most of the classifier implementations (e.g. Naive Bayes) don't really take feature selection into account, even though the InstanceList class has a setFeatureSelection method.
Now I want to run some quick experiments on my datasets with feature selection involved. As a technical shortcut, I'm thinking of taking the lowest-ranked features and setting their values to 0 in the instance vectors. Is this equivalent, in machine learning terms, to feature selection during classifier training, where those features are not considered at all (assuming no smoothing such as Laplace estimation is involved)?
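For context, the route I originally expected to use looks roughly like the sketch below. This is only my assumption of how InfoGain and the FeatureSelection(ranking, k) constructor fit together, and whether any particular trainer actually honors the selection is exactly what I'm unsure about:

```java
import cc.mallet.types.FeatureSelection;
import cc.mallet.types.InfoGain;
import cc.mallet.types.InstanceList;

public class FeatureSelectionSketch {

    // Sketch: rank features by information gain and keep only the top k.
    // Whether a given ClassifierTrainer actually consults the selection
    // during training is the open question above.
    public static void selectTopFeatures(InstanceList instances, int k) {
        // Rank all features by information gain over the training data.
        InfoGain ig = new InfoGain(instances);

        // Build a FeatureSelection containing the k highest-ranked features
        // (assuming this constructor keeps the top-k entries of the ranking).
        FeatureSelection fs = new FeatureSelection(ig, k);

        // Attach the selection to the InstanceList; a trainer that respects
        // the stored selection would then ignore the remaining features.
        instances.setFeatureSelection(fs);
    }
}
```

My proposed shortcut would instead skip setFeatureSelection entirely and just zero out the values of the low-ranked features directly in each instance's feature vector.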
Thank you.