
For example, `char hello = 'a'` and `char hello2 = 'b'`.

Now I write `if (hello < hello2) ...`

The condition is true, but what does Java compare, the Unicode or the ASCII code?

Mark Rotteveel
qoidewjo
  • Please add more code and `char hello2 = 'b'` not double quotes – Zahid Khan Dec 09 '21 at 19:16
  • unicode is the same as standard ascii for codes `00` through `7F`. Beyond that, standard ascii doesn't apply. – Green Cloak Guy Dec 09 '21 at 19:24
  • See [Java Tutorial Primitive Data Types](https://docs.oracle.com/javase/tutorial/java/nutsandbolts/datatypes.html): _"**`char`**: The `char` data type is a single 16-bit Unicode character. It has a minimum value of `'\u0000'` (or 0) and a maximum value of `'\uffff'` (or 65,535 inclusive)."_ – Mark Rotteveel Dec 09 '21 at 19:40

1 Answer


The integer values are compared. A `char` is 2 bytes (16 bits), so its 16-bit numeric value is what gets compared.
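A minimal sketch of this, reusing the variable names from the question: the `<` operator promotes both `char` operands to `int` and compares those numeric (UTF-16 code unit) values.

```java
public class CharCompare {
    public static void main(String[] args) {
        char hello = 'a';   // U+0061, numeric value 97
        char hello2 = 'b';  // U+0062, numeric value 98

        // Both chars are promoted to int, and the ints are compared.
        System.out.println(hello < hello2);   // true, because 97 < 98

        // Casting to int makes the compared values visible.
        System.out.println((int) hello);      // 97
        System.out.println((int) hello2);     // 98
    }
}
```

Since the first 128 Unicode code points coincide with ASCII, the result looks the same either way for characters like `'a'` and `'b'`.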

A `char` is basically a UTF-16 code unit. The reason is historical: when Java came out, the industry had just decided ASCII wasn't good enough, but didn't yet realize how many characters would ultimately be needed. Going from 8 to 16 bits was thought to be enough. It wasn't, but by the time UTF-8 won out, it was far too late to change.

Gabe Sechan