I just discovered this oddity in JavaScript:
null > 0
// false
null == 0
// false
null >= 0
// true
Why does the final test evaluate to true? It seems more than a little counterintuitive.
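For anyone puzzling over the same thing: the short version is that relational operators (`<`, `>`, `<=`, `>=`) coerce their operands with ToNumber, and `Number(null)` is `0`, so `null >= 0` is effectively `0 >= 0`. The `==` operator follows a separate set of rules under which `null` is only loosely equal to `undefined`, never to `0`. A small sketch demonstrating this:

```javascript
// Relational operators coerce via ToNumber: Number(null) is 0,
// so `null >= 0` is evaluated like `0 >= 0`.
console.log(Number(null));      // 0
console.log(null >= 0);         // true  (0 >= 0)
console.log(null > 0);          // false (0 > 0)

// Loose equality (==) uses different rules: null is only
// loosely equal to undefined, so it never equals 0.
console.log(null == undefined); // true
console.log(null == 0);         // false
```

So the apparent contradiction comes from `>=` and `==` using two unrelated coercion algorithms, not from `null` behaving inconsistently within either one.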