I want to convert a four-byte string to either an int32 or a uint32 in C++. This answer helped me write the following code:
#include <string>
#include <iostream>
#include <stdint.h>
int main()
{
    std::string a("\xaa\x00\x00\xaa", 4);

    // reinterpret the first four bytes of the buffer through each pointer type
    int u = *(int *) a.c_str();
    int v = *(unsigned int *) a.c_str();
    int x = *(int32_t *) a.c_str();
    int y = *(uint32_t *) a.c_str();

    std::cout << a << std::endl;
    std::cout << "int: " << u << std::endl;
    std::cout << "uint: " << v << std::endl;
    std::cout << "int32_t: " << x << std::endl;
    std::cout << "uint32_t: " << y << std::endl;
    return 0;
}
However, the outputs are all the same: the int (and int32_t) values are correct, but the unsigned ones are wrong. Why don't the unsigned conversions work?
// output
int: -1442840406
uint: -1442840406
int32_t: -1442840406
uint32_t: -1442840406
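As a sanity check that the string itself holds the bytes I expect, a minimal dump along these lines prints aa 0 0 aa, so the problem seems to be in the conversion, not in the string:

#include <iostream>
#include <string>

int main()
{
    std::string a("\xaa\x00\x00\xaa", 4);

    // print each byte as hex: "aa 0 0 aa"
    for (unsigned char c : a)
        std::cout << std::hex << static_cast<int>(c) << ' ';
    std::cout << std::endl;
    return 0;
}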
Python's struct.unpack gives the right conversions:
In [1]: import struct
In [2]: struct.unpack("<i", b"\xaa\x00\x00\xaa")
Out[2]: (-1442840406,)
In [3]: struct.unpack("<I", b"\xaa\x00\x00\xaa")
Out[3]: (2852126890,)
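(For reference, the two Python results are the same 32-bit pattern read with different signedness: 2852126890 = 2^32 - 1442840406.)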
I would also like a similar solution to work for int16 and uint16, but first things first; I guess the extension will be trivial once I solve this problem.