I'm doing this conversion because the data I receive from the network is an array of unsigned char[8], where every two bytes represent a UINT16 number. I want to send these bytes through a socket; however, sockets require the chars to be signed (I'm working in little-endian).
Since it seemed impossible to convert an unsigned char to a signed char, I thought I could form a UINT16 number from the two unsigned chars and then decompose that number into two signed chars.
The code below works when the UINT16 value is in [0,127] or [256,65535], but when the value is in [128,255] the first char blows up to roughly 4294967295 and the UINT16 comes back as roughly 65535. Any explanations/solutions?
uint16_t in = 129;
cout << "input is\t" << in << endl;

char lo = in & 0xFF;   // intended low byte
char hi = in >> 8;     // intended high byte

printf("%u", lo);
cout << "\n" << endl;
printf("%u", hi);
cout << "\n" << endl;

uint16_t out = lo | uint16_t(hi) << 8;   // recombine the two bytes into a UINT16
cout << "out is\t" << out << endl;
Here is the error I get when I send an unsigned char[] through the socket. If I use the following initialization, the buffer is an acceptable argument for Winsock's send() function and it works just fine:
char buffer[8];
for (int i = 0; i < 8; i++) {
    buffer[i] = 0x016;   // random data
}
send(clientSocket, buffer, 8, 0);
But if I initialize the buffer as follows:
unsigned char buffer[8];
for (int i = 0; i < 8; i++) {
    buffer[i] = 0x016;   // again random data
}
send(clientSocket, buffer, 8, 0);
I get an error stating:
'int send(SOCKET, const char *, int, int)': cannot convert argument 2 from 'unsigned char [8]' to 'const char *'
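The only workaround I can think of is a cast, roughly as in the sketch below (assuming the send() signature shown in the error message, int send(SOCKET, const char *, int, int)); I don't know whether this is the right approach, which is part of why I'm asking:

unsigned char buffer[8];
for (int i = 0; i < 8; i++) {
    buffer[i] = 0x016;   // again random data
}
// reinterpret the unsigned bytes as const char * so the argument type matches
send(clientSocket, reinterpret_cast<const char *>(buffer), 8, 0);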