I have noticed that PHP and JavaScript handle octal and hexadecimal number strings inconsistently during type juggling and casting:
PHP:
echo 16 == '0x10' ? 'true' : 'false'; //true, as expected (on PHP 5; PHP 7+ no longer treats hex strings as numeric, so it prints false there)
echo 8 == '010' ? 'true' : 'false'; //false, o_O
echo (int)'0x10'; //0, o_O
echo intval('0x10'); //0, o_O
echo (int)'010'; //10, o_O
echo intval('010'); //10, o_O
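As far as I can tell, plain casting just reads a leading base-10 number and stops at the first invalid character, which would explain the last four results:
echo (int)'12abc'; //12, parsing stops at the 'a'
echo (int)'0x10'; //0, parsing stops at the 'x'
echo (int)'010'; //10, the leading zero is not treated as an octal prefix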
JavaScript:
console.log(16 == '0x10' ? 'true' : 'false'); //true, as expected
console.log(8 == '010' ? 'true' : 'false'); //false, o_O
console.log(parseInt('0x10')); //16, as expected
console.log(parseInt('010')); //8, as expected (in older engines; ES5 dropped octal auto-detection, so current engines print 10)
console.log(Number('0x10')); //16, as expected
console.log(Number('010')); //10, o_O
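For what it's worth, parseInt() takes an explicit radix that removes the ambiguity either way:
console.log(parseInt('010', 8)); //8
console.log(parseInt('010', 10)); //10
console.log(parseInt('0x10', 16)); //16, the 0x prefix is allowed with radix 16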
I know that PHP has the octdec() and hexdec() functions to remedy the octal/hexadecimal misbehaviour, but I'd expect intval() to deal with octal and hexadecimal numbers just as JavaScript's parseInt() does.
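To be fair, intval() can handle both when given an explicit base, or base 0 for prefix auto-detection; it's only the default base 10 that ignores the prefixes:
echo intval('0x10', 16); //16
echo intval('010', 8); //8
echo intval('0x10', 0); //16, base 0 auto-detects the 0x prefix
echo intval('010', 0); //8, base 0 auto-detects the leading zero
echo hexdec('0x10'); //16, non-hex characters are simply skipped
echo octdec('010'); //8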
Anyway, what is the rationale behind this odd behaviour?