Why doesn't this uint16_t variable declaration work?
Well, it does...
You are probably on a little-endian machine, so the result is correct.
On a little-endian machine the memory layout is:
uint16_t v1 = 1; --> Memory: 0x01 0x00
uint16_t v256 = 256; --> Memory: 0x00 0x01
In network byte order (big endian) it is:
uint16_t v1 = 1; --> Memory: 0x00 0x01
uint16_t v256 = 256; --> Memory: 0x01 0x00
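If you are not sure which byte order your host uses, a minimal sketch like this one checks it at runtime by inspecting the first byte of a known 16-bit value:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint16_t probe = 1; /* 0x0001 */
    /* On little endian the least significant byte (0x01) is stored first. */
    if (*(unsigned char*)&probe == 1)
        printf("little endian\n");
    else
        printf("big endian\n");
    return 0;
}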
So k has the value 1, which in memory will be 0x01 0x00. Then op will be assigned k converted to network order, so in memory it will be 0x00 0x01. When you print that as a uint16_t on the host, you will see the result 256.
This little program illustrates what is going on:
#include <stdio.h>
#include <inttypes.h>
#include <arpa/inet.h>

int main(void)
{
    uint16_t k = 1;
    unsigned char* p = (unsigned char*)&k;
    printf("k=%" PRIu16 "\n", k);
    /* Dump the bytes of k as they sit in memory (host order). */
    for (size_t i = 0; i < sizeof k; ++i)
    {
        printf("%02X\n", *p++);
    }

    uint16_t op = htons(k);
    unsigned char* pop = (unsigned char*)&op;
    printf("op=%" PRIu16 "\n", op);
    /* Dump the bytes of op as they sit in memory (network order). */
    for (size_t i = 0; i < sizeof op; ++i)
    {
        printf("%02X\n", *pop++);
    }
    return 0;
}
Output:
k=1
01
00
op=256
00
01
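As a sanity check, ntohs undoes the conversion, so converting op back gives you 1 again on any host. A minimal sketch:

#include <stdio.h>
#include <inttypes.h>
#include <arpa/inet.h>

int main(void)
{
    uint16_t op = htons(1);              /* network order                */
    uint16_t back = ntohs(op);           /* back to host order           */
    printf("back=%" PRIu16 "\n", back);  /* prints back=1 on any machine */
    return 0;
}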