I have a program in C, which is as follows:
#include <stdio.h>

int main() {
    int sum = 17, count = 5;
    double mean;

    printf("Value of mean (without casting): %f\n", sum / count);

    mean = (double) sum / count;
    printf("Value of mean (with casting): %f\n", mean);

    return 0;
}
For the above program, I'm getting the following output:
Value of mean (without casting): 0.000000
Value of mean (with casting): 3.400000
I don't understand why I'm getting 0.000000 before performing the typecast, even though sum/count should return a decimal (float) value, so I believe both values should come out the same. Any help would be highly appreciated. Thanks!
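To make my assumption explicit, here is a small standalone sketch (separate from the program above, reusing the same variable names) of what I expected the division to give:

#include <stdio.h>

int main() {
    int sum = 17, count = 5;

    /* My assumption: sum/count should already be the decimal value 3.4,
       so I expected the cast not to be needed just to print it with %f. */
    printf("with cast: %f\n", (double) sum / count);  /* prints 3.400000 */
    printf("as an int: %d\n", sum / count);           /* prints 3 */

    return 0;
}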