For example, char hello = 'a'
and char hello2 = 'b'.
Now I write:
if (hello < hello2) ...
The condition is true, but what does Java compare, the Unicode value or the ASCII code?
The integer values are compared. A char in Java is 2 bytes (16 bits), so the comparison is done on that 16-bit numeric value (the UTF-16 code unit).
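A minimal sketch, reusing the hello and hello2 variables from the question (the class name CharCompare is just for illustration):

    public class CharCompare {
        public static void main(String[] args) {
            char hello = 'a';
            char hello2 = 'b';

            // Both chars are promoted to int before the comparison,
            // so this compares their numeric code unit values: 97 < 98.
            if (hello < hello2) {
                System.out.println((int) hello + " < " + (int) hello2);
            }
        }
    }

Casting to int, as in the println above, shows exactly which values take part in the comparison.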
It's basically UTF-16. The reason is that when Java came out, we had just decided ASCII wasn't good enough, but didn't yet realize how many characters we would actually need. Going from 8 to 16 bits seemed like plenty. It wasn't, but by the time UTF-8 won out, it was far too late to change.