I have a population that is almost evenly split (very random) which I'm trying to classify with a binary decision tree.
Class probabilities:

TRUE:  51%
FALSE: 49%
So the entropy is 1 (rounded to 3 decimal places). My reasoning is that since the parent entropy is already essentially 1, the entropy after splitting on any feature will also be 1 (the same), and thus there is no information gain.
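For reference, here is a quick Python sketch of how I'm computing the entropy of the binary split (function name and structure are just my own illustration):

```python
import math

def binary_entropy(p_true):
    """Shannon entropy (base 2) of a two-class split with P(TRUE) = p_true."""
    if p_true in (0.0, 1.0):
        return 0.0  # a pure node has zero entropy
    p_false = 1.0 - p_true
    return -(p_true * math.log2(p_true) + p_false * math.log2(p_false))

# Entropy of my 51/49 population: very close to the maximum of 1 bit.
print(round(binary_entropy(0.51), 3))
```

which prints 1.0 for my population, versus exactly 1.0 for a perfect 50/50 split.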
Am I doing this right? In my attempts to learn this, I haven't come across anything saying that entropy is useless for two classes.