I am debugging production code written in C; its simplest form can be reduced to this:
void
test_fun(int sr)
{
    int hr = 0;
#define ME 65535
#define SE 256
    sr = sr / SE;             /* this should yield 0 */
    if (sr == 1)
        hr = ME;
    else
        hr = (ME + 1) / sr;   /* we should crash here */
}
We are passing sr as 128, which should trigger a divide-by-zero error in the processor. Instead, I see that the division completes successfully with the quotient 0x7fffffff (hr ends up holding this value).
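Just to confirm the truncation step on its own (this check program is mine, not part of the production code): integer division in C truncates toward zero, so 128/256 really is 0, which is what makes the divisor zero in the else branch.

#include <stdio.h>

int main(void)
{
    /* Integer division truncates toward zero, so 128 / 256 == 0. */
    printf("128 / 256 = %d\n", 128 / 256);
    return 0;
}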
On an Intel platform with gcc, the same code behaves differently: it crashes when it attempts the division by zero.
I want to understand the principle behind this large quotient. I am not sure whether it is just some other bug I still need to uncover. Can someone help me with another program that does the same thing?
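For reference, this is roughly the stripped-down standalone program I have been testing with. The volatile qualifier and the main/printf scaffolding are my additions, only there to keep the compiler from folding the division away at compile time; the rest mirrors the production function above.

#include <stdio.h>

#define ME 65535
#define SE 256

int
test_fun(int sr)
{
    int hr = 0;

    /* volatile forces the divisions to happen at run time on the
     * processor instead of being evaluated by the compiler. */
    volatile int divisor = sr / SE;   /* 128 / 256 == 0 */

    if (divisor == 1)
        hr = ME;
    else
        hr = (ME + 1) / divisor;      /* divide by zero: undefined behaviour */

    return hr;
}

int
main(void)
{
    /* On x86 with gcc this traps (SIGFPE); on the other platform the
     * hardware apparently returns a large quotient instead of trapping. */
    printf("hr = 0x%x\n", test_fun(128));
    return 0;
}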