I found a strange (to me) behavior when casting to int in C. Apologies if this is a basic question, but I'm unable to find an answer to why the following code produces an unexpected result.
#include <stdio.h>

int main(void)
{
    printf("1000 * 0.1 = %d\n", (1000 * 0.1));
    printf("1000 * (10/100) = %d\n", (1000 * (10/100)));
    printf("(int)1000 * 0.1 = %d\n", (int)(1000 * 0.1));
    printf("(int)1000 * (10/100) = %d\n", (int)(1000 * (10/100)));
    return 0;
}
The result is the same with both -O0 and -O3:
1000 * 0.1 = -957043896
1000 * (10/100) = 0
(int)1000 * 0.1 = 100
(int)1000 * (10/100) = 0
I expected a nonsensical result for the first two (I don't know exactly why, but I assumed passing a double where printf expects an int argument shouldn't work). However, the difference between the third and fourth lines puzzles me. I expected (10/100) to be calculated at compile time and to give the same result as the third line.
Can someone explain why this happens, and what is the proper/safe way to do integer-based division here?
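
For reference, my guess at a workaround below (multiplying before dividing and keeping everything as int, so nothing is truncated to zero early) does print 100, but I'd still like to understand why the original version behaves the way it does:

#include <stdio.h>

int main(void)
{
    /* Multiply first, then divide: 1000 * 10 = 10000, and 10000 / 100 = 100 */
    printf("1000 * 10 / 100 = %d\n", 1000 * 10 / 100);
    return 0;
}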