I've encountered some interesting behavior in how gcc interprets the signedness of constants. I have a piece of code which, greatly simplified, looks like this:
#include <stdint.h>

#define SPECIFIC_VALUE 0xFFFFFFFF
// ...
int32_t value = SOMETHING; // SOMETHING stands in for the real expression
if (value == SPECIFIC_VALUE) {
    // Do something
}
When I compile the above, I get this warning:

warning: comparison between signed and unsigned integer expressions [-Wsign-compare]

All well and good -- it seems that gcc interprets the hex constant as unsigned and doesn't like the comparison against a signed integer. However, if I change the define to

#define SPECIFIC_VALUE 0x7FFFFFFF

the warning goes away. Again, I'm not particularly surprised -- with the sign bit clear, gcc is evidently happy to treat the constant as a signed value.
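As far as I understand, an unsuffixed hex constant takes the first type from int, unsigned int, long, unsigned long (and so on) that can represent its value, so on a platform where int is 32 bits, 0xFFFFFFFF lands on unsigned int while 0x7FFFFFFF stays int. A quick sketch I used to confirm what the compiler deduces (assuming a C11 compiler for _Generic, and 32-bit int):

#include <stdio.h>

/* Report the deduced type of an expression via C11 _Generic */
#define TYPE_NAME(x) _Generic((x), \
    int: "int", \
    unsigned int: "unsigned int", \
    long: "long", \
    unsigned long: "unsigned long", \
    default: "something else")

int main(void) {
    /* 0xFFFFFFFF doesn't fit in a 32-bit int, so it takes the next
       type in the ladder that can hold it */
    printf("0xFFFFFFFF -> %s\n", TYPE_NAME(0xFFFFFFFF)); /* unsigned int */
    /* 0x7FFFFFFF fits in a 32-bit int, so it stays signed */
    printf("0x7FFFFFFF -> %s\n", TYPE_NAME(0x7FFFFFFF)); /* int */
    return 0;
}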
What really surprises me is that if I change the definition to

#define SPECIFIC_VALUE INT32_C(0xFFFFFFFF)

I STILL get the warning. I would have expected that explicitly telling the compiler to interpret my constant as a signed value would silence it.
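For reference, running the macro's result through the same _Generic check shows the deduced type is unchanged; at least with glibc's <stdint.h>, where as far as I can tell INT32_C(c) expands to plain c on a 32-bit-int platform, the macro adds no suffix:

#include <stdint.h>
#include <stdio.h>

#define TYPE_NAME(x) _Generic((x), \
    int: "int", \
    unsigned int: "unsigned int", \
    default: "something else")

int main(void) {
    /* Still prints "unsigned int" here: the macro doesn't change
       the literal's deduced type */
    printf("INT32_C(0xFFFFFFFF) -> %s\n", TYPE_NAME(INT32_C(0xFFFFFFFF)));
    return 0;
}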