I have the following C code:
#include <stdint.h>
#include <stdio.h>
int i;
/* second initializer is intended to set the high bit of the address of i */
uint64_t a[] = { (uint64_t)&i, (uint64_t)&i + 0x8000000000000000 };
int main() {
    printf("%p %llx %llx\n", &i, a[0], a[1]);
}
If I compile this (as C or as C++) with Microsoft Visual Studio Community 2015 and then run it, the output is similar to the following:
013E9154 13e9154 13e9154
It seems that the + 0x8000000000000000, which I expected to set the high bit of a[1], has been silently ignored.
However, if I move the initialization of a inside main, the output is what I would expect:
00179154 179154 8000000000179154
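To be explicit, the version that produces this output is identical except that the initialization of a has moved from file scope into main (the printf is unchanged):

#include <stdint.h>
#include <stdio.h>
int i;
int main() {
    /* same initializer as before, but now evaluated at run time inside main */
    uint64_t a[] = { (uint64_t)&i, (uint64_t)&i + 0x8000000000000000 };
    printf("%p %llx %llx\n", &i, a[0], a[1]);
}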
With a defined at global scope, why is the addition being silently ignored? Should the attempted addition actually set the high bit of a[1], or should it cause a compiler error?
Interestingly, if + 0x8000000000000000 in the above code is replaced by | 0x8000000000000000, I get "error C2099: initializer is not a constant".
Edit: A similar issue can occur even in the absence of casts. Compiled for x64, the following code prints the same value (e.g. 000000013FB8D180) three times:
#include <stdio.h>
int i;
int * a[] = { &i, &i + 0x100000000 };   /* no cast this time; the offset is plain pointer arithmetic */
int main() {
    printf("%p %p %p\n", &i, a[0], a[1]);
}