Because sizeof gives you an unsigned value, which you probably would have noticed had you turned up the warning level, such as using -Wall -Wextra with gcc (a):
xyzzy.c: In function 'main':
xyzzy.c:8: warning: comparison between signed and unsigned
If you force it to signed, it works fine:
#define TOTAL_ELEMENTS ((int)(sizeof(array) / sizeof(array[0])))
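To make that concrete, here's a minimal sketch of the kind of program in question with the cast applied (the array contents are an assumption, sized so that TOTAL_ELEMENTS is seven, matching the 5 from TOTAL_ELEMENTS-2 below):

#include <stdio.h>

int array[] = {23, 34, 12, 17, 204, 99, 16};   /* assumed contents; seven elements */

#define TOTAL_ELEMENTS ((int)(sizeof(array) / sizeof(array[0])))

int main(void) {
    int d;
    /* With the cast, -1 <= 5 is an ordinary signed comparison,
       so the loop runs; without it, d would be converted to size_t. */
    for (d = -1; d <= (TOTAL_ELEMENTS - 2); d++)
        printf("%d\n", array[d + 1]);
    return 0;
}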
What happens in detail can be gleaned from the ISO standard. In comparisons between different types, the usual arithmetic conversions are performed to bring both operands to a common type. The common type chosen depends on several factors, such as signedness, precision and rank but, in this case, it was deemed that the unsigned type size_t was the common type, so d was converted to that type.
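A two-line illustration of that conversion rule (not from your program, just a sketch of the standard's behaviour; compiling it with -Wall -Wextra produces the same signed/unsigned warning):

#include <stdio.h>

int main(void) {
    size_t n = 5;          /* unsigned, like the result of sizeof */
    int d = -1;
    /* d is converted to size_t before the comparison, so the
       test is really (size_t)-1 <= 5, which is false. */
    if (d <= n)
        printf("loop body would run\n");
    else
        printf("loop body never runs\n");   /* this branch is taken */
    return 0;
}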
Unfortunately, converting -1 to an unsigned type (at least for two's complement, which is almost certainly what you're using) results in a rather large positive number, one that's certainly larger than the 5 you get from (TOTAL_ELEMENTS-2). In other words, your for statement effectively becomes:
for (d = some big honking number way greater than five;
d <= 5;
d++
) {
// fat chance of getting in here !!
}
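Just how big that number is depends on the width of size_t; assuming a typical 64-bit platform, you can print it directly:

#include <stdio.h>

int main(void) {
    /* Conversion of -1 to an unsigned type wraps modulo 2^N,
       yielding the largest value size_t can hold. */
    printf("%zu\n", (size_t)-1);   /* 18446744073709551615 with a 64-bit size_t */
    return 0;
}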
(a) This requirement to use -Wextra remains a point of contention between the gcc developers and myself. They're obviously using some new definition of the word "all" of which I was previously unaware (with apologies to Douglas Adams).