
I've seen many implementations use the following to find the midpoint of two indices:

int mid = lo + (hi - lo) / 2;

instead of

int mid = (lo + hi) / 2;

Mathematically, I see no difference, and yet I've never seen anyone use the second one. Is there a difference between the two computationally?

Gurwinder Singh

1 Answer


There exists a maximum positive value for a 32-bit signed integer in computing.

For illustration, assume this maximum value is 100.

int lo = 60;
int hi = 80;

Then lo + hi = 60 + 80 = 140 > 100, so computing (lo + hi) / 2 is dangerous: the addition overflows before the division ever happens. By contrast, lo + (hi - lo) / 2 never produces an intermediate value larger than hi, so it cannot overflow as long as lo and hi are valid non-negative indices with lo <= hi.
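The same effect can be demonstrated with real 32-bit limits rather than the toy maximum of 100. In Java, int wraps around at 2,147,483,647, so a sketch like the following (class and method names are mine, for illustration) shows the naive formula going wrong while the safe one stays correct:

```java
public class Midpoint {
    // Naive midpoint: the sum lo + hi can overflow a 32-bit int.
    static int midUnsafe(int lo, int hi) {
        return (lo + hi) / 2;
    }

    // Safe midpoint: hi - lo fits in an int whenever 0 <= lo <= hi,
    // and lo + (hi - lo) / 2 never exceeds hi.
    static int midSafe(int lo, int hi) {
        return lo + (hi - lo) / 2;
    }

    public static void main(String[] args) {
        int lo = 2_000_000_000;
        int hi = 2_100_000_000;
        // lo + hi = 4,100,000,000 exceeds Integer.MAX_VALUE (2,147,483,647)
        // and wraps to a negative number, so the naive midpoint is negative.
        System.out.println(midUnsafe(lo, hi)); // prints -97483648
        System.out.println(midSafe(lo, hi));   // prints 2050000000
    }
}
```

In a binary search over a large array, that negative "midpoint" would immediately cause an ArrayIndexOutOfBoundsException, which is why the lo + (hi - lo) / 2 form is the idiomatic choice.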

OneCricketeer
Vau
  • "it will cause an integer overflow error" Integer overflow isn't really an error, is it? It's just the way computers handle the erroneous situation of a number requiring too many bits to represent. – The SE I loved is dead Nov 17 '16 at 02:50