I am basing this on this article: https://kishuagarwal.github.io/unicode.html
As an example, I took the code point 0x1F9F0, which I want to encode in UTF-16.
In hex:
0x1F9F0
In binary:
0001 1111 1001 1111 0000
Following the explanation in the article, I should have a surrogate-pair template like this:
1101 10XX XXXX XXXX 1101 11XX XXXX XXXX
Populating the Xs with the bits from the code point gives me:
binary:
1101 1000 0111 1110 1101 1101 1111 0000
hex:
\uD87E \uDDF0
But according to this page, the correct value is:
hex:
\uD83E\uDDF0
binary:
1101 1000 0011 1110 1101 1101 1111 0000
So...
my hex: \uD87E \uDDF0
correct hex: \uD83E \uDDF0
I have a single bit misplaced, and I can't figure out why...
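For comparison, here is a minimal Python sketch of the standard UTF-16 surrogate-pair algorithm (subtract 0x10000 from the code point, then split the remaining 20 bits into two 10-bit halves); I wrote it to check my own steps against:

```python
def to_surrogate_pair(cp: int) -> tuple[int, int]:
    """Encode a code point above U+FFFF as a UTF-16 surrogate pair."""
    assert 0x10000 <= cp <= 0x10FFFF, "surrogate pairs only apply above the BMP"
    v = cp - 0x10000                  # standard UTF-16: subtract 0x10000 first
    high = 0xD800 + (v >> 10)         # top 10 bits go into the high surrogate
    low = 0xDC00 + (v & 0x3FF)        # bottom 10 bits go into the low surrogate
    return high, low

print(tuple(hex(u) for u in to_surrogate_pair(0x1F9F0)))
# ('0xd83e', '0xddf0')
```

Running it on 0x1F9F0 does produce \uD83E \uDDF0, matching the "correct" value above.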