I have some code that translates an ASCII char array to a hex char array:
#include <stdio.h>
#include <string.h>

void ASCIIFormatCharArray2HexFormatCharArray(char chrASCII[72], char chrHex[144])
{
    int i, j;
    memset(chrHex, 0, 144);                                /* clear the output buffer */
    for (i = 0, j = 0; i < strlen(chrASCII); i++, j += 2)
    {
        sprintf((char*)chrHex + j, "%02X", chrASCII[i]);   /* write each input char as two hex digits */
    }
    chrHex[j] = '\0';
}
When I pass the char 'א' (Alef, the Hebrew equivalent of 'A') to the function, it produces this:
chrHex = "FFFF"
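For reference, here is a minimal sketch of how I call it (the main wrapper, the buffer names, and the sizes are just for illustration; the function above is assumed to be defined in the same file):

#include <stdio.h>

int main(void)
{
    char input[72] = "א";    /* how the compiler stores this literal depends on the source encoding */
    char output[144];

    ASCIIFormatCharArray2HexFormatCharArray(input, output);
    printf("%s\n", output);  /* this is where I see the hex string shown above */
    return 0;
}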
I don't understand how one char translates to two bytes of hex ("FFFF") instead of one byte (the way 'u' in ASCII is "75" in hex), especially since it isn't an English letter. I would love an explanation of how the compiler treats 'א' to produce this.
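To make the question concrete, here is a small sketch that dumps the raw bytes the compiler actually stores for the 'א' literal (the cast to unsigned char is only there so the printout itself is not sign-extended; the exact bytes depend on the source file's encoding):

#include <stdio.h>

int main(void)
{
    const char *s = "א";                       /* byte contents depend on the source file's encoding */
    for (size_t i = 0; s[i] != '\0'; i++)
        printf("%02X ", (unsigned char)s[i]);  /* print each byte without sign extension */
    printf("\n");
    return 0;
}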