
I am trying to divide an int by an unsigned int and I get an unexpected result:

int b;
unsigned int c;
int res;
float res_f;

b = -25;
c = 5;

res = b / c;   // res = 858993454
res_f = b / c; // res_f = -5.000000

The same works just fine for '+', '-' and '*', but fails for '/'. What am I missing here?

P.S.

It was tested on different compilers and the result was the same.

shushu

1 Answer


Assuming this is C or similar (e.g. Objective C), change:

res = b / c;

to:

res = b / (int)c;

Explanation: b is converted from int to unsigned int, according to C's usual arithmetic conversions for expressions that mix signed and unsigned operands. In the process it wraps around from -25 to 0xFFFFFFE7 == 4294967271 (assuming 32-bit ints). The division then yields an unsigned int result of 4294967271 / 5U = 858993454U, which is implicitly converted back to an int (no overflow in this step, as the result is in the range of both signed and unsigned 32-bit ints).
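For illustration, here is a minimal, self-contained sketch of what effectively happens (my own example, assuming 32-bit int; the main() and printf calls are only there to demonstrate the values):

#include <stdio.h>

int main(void)
{
    int b = -25;
    unsigned int c = 5;

    /* Usual arithmetic conversions: b is converted to unsigned int,
       wrapping around to 4294967271 (0xFFFFFFE7) on a 32-bit system. */
    unsigned int b_as_unsigned = (unsigned int)b;
    printf("%u\n", b_as_unsigned);      /* 4294967271 */
    printf("%u\n", b_as_unsigned / c);  /* 858993454 */

    /* Casting c to int keeps the division signed. */
    printf("%d\n", b / (int)c);         /* -5 */

    return 0;
}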

By the way, the float result should be the same, within the precision limits of a float (I get 858993472.0). I'm surprised that you get -5.0 in this case.
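To check that, here is a quick sketch (again my own example, assuming 32-bit int and an IEEE-754 float, whose 24-bit significand cannot represent 858993454 exactly):

#include <stdio.h>

int main(void)
{
    int b = -25;
    unsigned int c = 5;

    /* The division is still done in unsigned arithmetic; only the
       integer result 858993454 is converted to float afterwards,
       rounding to the nearest representable value. */
    float res_f = b / c;
    printf("%f\n", res_f);  /* expected: 858993472.000000 */

    return 0;
}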

Paul R
  • This is why languages with implicit type conversions are evil. If you don't want crap like this happening to you, use a language with a real strong type system, like Ada. – T.E.D. Mar 16 '11 at 16:33
  • 1
  • Making `c` an `unsigned short` also works if that makes more sense in context. – aaz Mar 16 '11 at 16:42
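A sketch of the `unsigned short` variant from the last comment (my own example, assuming the common case where int is wider than short, so the integer promotions convert c to a signed int and the division stays signed):

#include <stdio.h>

int main(void)
{
    int b = -25;
    unsigned short c = 5;

    /* c is promoted to (signed) int before the division, because int
       can represent every unsigned short value, so nothing wraps around. */
    int res = b / c;
    printf("%d\n", res);  /* -5 */

    return 0;
}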