#include <stdio.h>
#include <math.h>

int main()
{
    printf("%f", 3 / 2);
    printf(" %d", 3 / 2);
    printf(" %d", 3 << 2);
    return 0;
}
Here is my code. I was expecting to get 1.500000 1 12, but I received 2.168831 1 12 as my output.
You get a mixture of mis-converted bytes (the integer you passed) and transient data off the stack, formatted as a floating-point number. @H2CO3 gave a good reference.

This happens because you are passing an int, but you have told printf() to expect a floating-point value (specifically, a double). If you use %f as the format, you must pass a double. Failing to do so is undefined behavior: printf() ends up reading garbage values from the stack.
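As a minimal sketch, you can fix the mismatch from either side: convert the argument to a double so it matches %f, or keep the int and use the matching conversion specifier (the cast here is just one way to make the types agree):

    /* Mismatched: %f expects a double, but 3 / 2 is an int -- undefined behavior */
    printf("%f", 3 / 2);

    /* Matched: pass a double to %f */
    printf("%f", (double)3 / 2);   /* prints 1.500000 */

    /* Or keep the int and use %d */
    printf("%d", 3 / 2);           /* prints 1 */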
Also, because both operands of 3 / 2 are integers, the division is integer division. Make one or both operands a double, i.e. 3.0 / 2, and you get floating-point division, and printf() will behave as you expect.
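Putting both fixes together, here is a minimal corrected sketch of the program, assuming the output you want is 1.500000 1 12 (math.h is not needed for any of these expressions):

    #include <stdio.h>

    int main(void)
    {
        printf("%f", 3.0 / 2);   /* floating-point division; the double matches %f: prints 1.500000 */
        printf(" %d", 3 / 2);    /* integer division: prints 1 */
        printf(" %d", 3 << 2);   /* left shift by 2, i.e. 3 * 4: prints 12 */
        printf("\n");
        return 0;
    }

    /* Output: 1.500000 1 12 */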