4

Consider the following in C# and Java:

double d = 3 / 2 * 3.2;

Java

System.out.println(d); // 3.2

C#

Console.WriteLine(d); // 3.2

It seems to skip the 3 / 2 part.

We know that the correct answer should be 4.8.

If I change it to

double d = 3.00 / 2 * 3.2;

I get 4.8.

So I want to ask: if (3 / 2 * 3.2) is illegal, why do Eclipse and VS2008 show no error? And how can I prevent this problem in both C# and Java?

Austin Salonen
Cheung

8 Answers

25

3 / 2 is integer division, whose result is 1.

1 * 3.2 equals 3.2, which is the result you receive.

This is a perfectly well-defined expression with a well-defined result, hence no error from the compiler. Using 3.0 (or 3.00) is the most straightforward way to force the compiler to use floating-point math.

From Thorbjørn's answer: another option is to standardize all your floating-point calculations by starting them with 1.0 *; in your example that would be 1.0 * 3 / 2 * 3.2.
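
To make both remedies concrete, here is a minimal Java sketch (the class and variable names are placeholders I made up; the C# version behaves the same way). Note that the printed result may carry the usual binary rounding tail rather than an exact 4.8:

public class DivisionDemo {
    public static void main(String[] args) {
        double broken = 3 / 2 * 3.2;            // integer division happens first: 1 * 3.2
        double withLiteral = 3.0 / 2 * 3.2;     // 3.0 forces floating-point division
        double withPrefix = 1.0 * 3 / 2 * 3.2;  // the 1.0 * prefix has the same effect

        System.out.println(broken);      // 3.2
        System.out.println(withLiteral); // roughly 4.8 (exact digits depend on rounding of 3.2)
        System.out.println(withPrefix);  // same value as withLiteral
    }
}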

Roee Adler
11

3 / 2 is an integer division, so the result comes out to be 1.

Then, multiplying 1 by 3.2 causes the integer 1 to be promoted to the floating-point value 1.0, resulting in 3.2.

The idea is:

// Both "3" and "2" are integers, so integer division is performed.
3 / 2 == 1

// "3.2" is a floating point value, while "1" is an integer, so "1" is
// promoted to an floating point value.
1 * 3.2  -->  1.0 * 3.2 == 3.2

When you write 2.0 (or 3.0), the decimal point tells Java to treat the literal as a floating point value (in this case, a double), so the result of the calculation is 4.8 as expected. Without the decimal point, the value is an integer literal.

It is not an error, but an issue with how the literals are handled.
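
To see that it really is about the literals, and not about the variable you assign to, consider this small Java sketch (variable names are only illustrative). The target type of the assignment does not help, because the integer division has already happened by the time the value is converted:

double a = 3 / 2;         // right-hand side is evaluated as int first, so a == 1.0
double b = 3 / 2.0;       // one double literal is enough, so b == 1.5
double c = 3.0 / 2 * 3.2; // the whole chain is now done in double, so c is roughly 4.8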


Bernhard Barker
coobird
7

It's perfectly legal, but 3 / 2 is an integer division (which is like doing the division and then rounding toward 0; in some languages it rounds toward -infinity). You need to give the compiler some kind of "hint" about which division you want.

You can simply avoid it by writing

3.0 / 2 * 3.2          or      3 / 2.0 * 3.2
3.0 / 2.0 * 3.2
3d / 2d * 3.2 (C#)
(double)3 / 2 * 3.2
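
If you go with the cast, keep in mind that where the cast is placed matters; a quick Java illustration (the second line is my own addition to show the pitfall):

double ok  = (double) 3 / 2 * 3.2;   // the cast binds to 3, so the division is 1.5 -> roughly 4.8
double bad = (double) (3 / 2) * 3.2; // too late: 3 / 2 is already the int 1, so this is 3.2
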
Marcel Jackwerth
6

If you want to be CERTAIN that a calculation happens with decimals instead of integers, then either start with

 1.0 * 3 / 2 * 3.2

or tell the compiler that the first number is a real/double:

3d / 2 * 3.2
Thorbjørn Ravn Andersen
3

When one integer literal is divided by another, that division is performed as integer division: 3 / 2 thus becomes 1, not 1.5 (because both operands are integers). Not until the next part, 1 * 3.2, is the integer promoted to a double; only at that point does it actually become 1.0.

EDIT: There's no error flagged because it isn't illegal.

Williham Totland
1

3 / 2 is the first operation performed.

Both 3 and 2 are integers. When two ints are divided, the answer is an int (in this case, the answer is 1).

So from 3 / 2 * 3.2 you end up with 1 * 3.2. When you multiply an int by a floating-point value, the int is promoted to floating point. So the answer is 3.2.

Alex Reynolds
1

This is due to casting and number type issues.

In C# (and apparently in Java too), 3 and 2 are both integer values, so the division 3 / 2 is an integer division, which yields the integer 1.

Since 3.2 is a floating-point value, the integer result of the division is converted to a floating-point value and then multiplied; and since 1 * 3.2 is 3.2, that's the result.

There is no error in this code, which is why Java and C# accept it.

The safest way would always be to tell the compiler that you need floating point values here by using

3.0 / 2.0 * 3.2

This works in all cases.

mmmmmmmm
0

In C#

3d / 2d * 3.2

As stated before

George