
I am learning the C programming language and, in Xcode 13.x, when I write this:

float a = 3 / 2;
float b = 1 / 3;
printf("3 divided by 2 as a float is %f, 1 divided by 3 as a float is %f\n", a, b);

The console shouts out this:
3 divided by 2 as a float is 1.000000, 1 divided by 3 as a float is 0.000000

I would expect it to show 1.500000 and 0.333333, so why doesn't it? I am sure the answer is obvious and simple, but basic googling/searching did not help.
I tried adding #include <float.h>, thinking it would help, but no luck there.
What am I blindly missing?
Thank you

NotationMaster

1 Answer


You are trying to store the result of the division into a float, but the division itself happens before the assignment. In both cases, both operands are int constants (they have no fractional part), so the compiler performs integer division: the fractional part of the result is truncated (not rounded), and only that integer result is then converted to float and assigned to your variable.
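You can see this by printing the quotient as an int before any conversion to float takes place. A minimal sketch (standard C, nothing beyond stdio assumed):

#include <stdio.h>

int main(void)
{
    /* Both operands are ints, so 3 / 2 is evaluated as integer
       division: the result is 1, the .5 is truncated away. */
    int quotient = 3 / 2;
    printf("3 / 2 as an int is %d\n", quotient);   /* prints 1 */

    /* Only after this truncated value exists is it converted
       to float by the assignment. */
    float a = 3 / 2;
    printf("3 / 2 stored in a float is %f\n", a);  /* prints 1.000000 */
    return 0;
}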

How do you correct this? Make at least one of the two operands a floating-point value, for example:

float a = 3.0/2;

This should do it.
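For completeness, here is a small self-contained program (my own sketch, not from the original post) showing a few equivalent ways to force floating-point division, including a cast, which is useful when both operands are variables rather than constants:

#include <stdio.h>

int main(void)
{
    float a = 3.0f / 2;        /* a float constant forces float division */
    float b = 1 / 3.0;         /* same idea, fractional part on the other operand */

    int x = 1, y = 3;
    float c = (float)x / y;    /* cast one variable when both operands are ints */

    printf("%f %f %f\n", a, b, c);   /* 1.500000 0.333333 0.333333 */
    return 0;
}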

Mario Mateaș