I can't find a rational explanation for this:
```js
null > 0;  // false, as expected
null == 0; // false, as expected
```

But:

```js
null >= 0; // true
```
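For completeness, here are a few related checks I ran in a browser console (the `<=` comparison and the explicit number conversions are extra experiments of mine, not part of the original snippet):

```js
null <= 0;         // also true
null == undefined; // true

// converting null to a number explicitly gives 0
+null;        // 0
Number(null); // 0
```

If `null` converts to `0`, I would have expected `null == 0` to be `true` as well, which only deepens my confusion.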
From my understanding, `null` is "nothing"; it is non-measurable. So comparing it to zero should return `false`, `NaN`, or something similar. Can somebody explain why JavaScript implements this rule the way it does?