I expect strtol("ffffffffffffffff", NULL, 16)
to return -1
and strtol("7fffffffffffffff", NULL, 16)
to return LONG_MAX
, since the linux man-page first sentence seems to imply strtol returns signed long. The second call does return the expected result. But the first call returns LONG_MAX
! Like, the input hex strings are not even the same. As if that's not confusing enough, strtol("8000000000000000", NULL, 16)
, which I expect to return LONG_MIN
, returns the same value as the previous two calls, LONG_MAX
. For the previous two calls, I thought strtol was ignoring the most significant bit in the input string, but the third call refutes this hypothesis.
Is this some weird case of casting going on or I am mixing mathematical reality with C reality?
Here is my source code:
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char const *argv[]) {
    /* parse the first command-line argument as a base-16 long and print it */
    long x = strtol(argv[1], NULL, 16);
    printf("%ld\n", x);
    return 0;
}
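Since the man page also mentions that errno is set to ERANGE on overflow, I assume a variant of the above with that check would look roughly like this (just a sketch for reference):

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char const *argv[]) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s <hex-string>\n", argv[0]);
        return 1;
    }
    errno = 0;  /* clear errno before the call so a leftover value is not misread */
    long x = strtol(argv[1], NULL, 16);
    if (errno == ERANGE) {
        /* per the man page, strtol returns LONG_MAX or LONG_MIN and sets
           errno to ERANGE when the converted value is out of range */
        printf("%ld (errno == ERANGE)\n", x);
    } else {
        printf("%ld\n", x);
    }
    return 0;
}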
I am compiling the original program with gcc scratch.c on Ubuntu, and I am getting my results by passing a hex string as a command-line argument. For example:
./a.out ffffffffffffffff // gives LONG_MAX
./a.out 8000000000000000 // gives LONG_MAX
./a.out 7fffffffffffffff // gives LONG_MAX
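For reference, when I say a run "gives LONG_MAX" I am comparing against what I assume a quick check like this prints on the same machine:

#include <limits.h>
#include <stdio.h>

int main(void) {
    /* print the size of long and its limits, for comparison with a.out's output */
    printf("sizeof(long) = %zu\n", sizeof(long));
    printf("LONG_MAX = %ld (hex 0x%lx)\n", LONG_MAX, (unsigned long)LONG_MAX);
    printf("LONG_MIN = %ld\n", LONG_MIN);
    return 0;
}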