I'm learning about Unicode basics and I came across this passage:
"The Unicode standard describes how characters are represented by code points. A code point is an integer value, usually denoted in base 16. In the standard, a code point is written using the notation U+12ca to mean the character with value 0x12ca (4810 decimal)."
I have three questions about it:
- What does the "ca" part stand for? In some places I've seen code points written as just U+12. What's the difference?
- Where did the 0 in 0x12ca come from? What does it mean?
- How does the value 0x12ca become 4810 in decimal? (I tried to check this in Python; see the snippet after this list.)
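Here's a small Python 3 snippet I put together while trying to check my understanding of the third question. I'm assuming the built-ins hex() and chr() are the right tools for this, and I'm not sure I'm reading the output correctly:

```python
# Poking at the code point U+12CA from the quoted passage (Python 3).

cp = 0x12CA          # integer literal in base 16; the 0x prefix marks hexadecimal
print(cp)            # 4810 -- the same number printed in base 10

# Expanding the base-16 digits by hand: 1, 2, C (12), A (10)
print(1 * 16**3 + 2 * 16**2 + 12 * 16**1 + 10 * 16**0)   # 4810 again

print(hex(4810))     # '0x12ca' -- back to hexadecimal notation
print(chr(0x12CA))   # 'ዊ' -- the character at that code point (ETHIOPIC SYLLABLE WI)
```

If the digit expansion in the middle is the wrong way to think about the conversion, please correct me!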
It's my first post here and I would appreciate any help. Have a nice day, y'all!