I have recently been working with probability distributions of order statistics. In that area it is common to see formulas involving very high powers of numbers from the real interval [0, 1].
Consider two positive numbers a and b, both close to 0 and close to each other. I want to calculate something like a^n / b^m, where n and m are huge integers.
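To give an idea of the scale (the concrete numbers here are just an illustration, not from my application): the smallest positive normal double is about 2.2e-308, so even moderate exponents already underflow:

#include <float.h>
#include <math.h>
#include <stdio.h>

int main(void)
{
    printf("%e\n", DBL_MIN);        /* ~2.225074e-308, smallest normal double */
    printf("%e\n", pow(0.01, 200)); /* mathematically 1e-400, underflows to 0 */
    return 0;
}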
My question is the following: If I use C code like
#include <math.h>   /* for pow() */

double f(double a, double b, long m, long n)
{
    return pow(a, n) / pow(b, m);
}
will this be stable? The evaluation order is fixed: first a^n, then b^m, then the division. But if a and b are close to 0 and n, m are large, both powers can underflow to zero, and the quotient comes out as 0, infinity, or NaN even though the true ratio a^n / b^m might be perfectly representable. Instead I could do something like this:
double g(double a, double b, long m, long n)
{
    double res = 1.0;
    long k = (m < n) ? m : n;   /* min(m, n); C has no standard min() for long */
    long i;

    /* interleave multiplication and division so the intermediate result
       stays near (a/b)^i instead of underflowing */
    for (i = 0; i < k; ++i)
    {
        res = res * a / b;
    }

    /* handle the leftover exponent */
    if (n > m)
    {
        res = res * pow(a, n - m);
    } else {
        res = res / pow(b, m - n);
    }

    return res;
}
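For concreteness, here is the kind of case I am worried about, using f and g from above (the numbers are made up, but the effect is real for IEEE-754 doubles):

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* 0.001^500 and 0.002^500 are both far below DBL_MIN, so in f()
       both pow() calls underflow to 0 and the quotient is 0/0 = NaN.
       The true ratio is (0.001/0.002)^500 = 0.5^500 ~ 3e-151, which is
       perfectly representable, and g() recovers it. */
    printf("f: %e\n", f(0.001, 0.002, 500, 500));   /* prints nan        */
    printf("g: %e\n", g(0.001, 0.002, 500, 500));   /* prints ~3.05e-151 */
    return 0;
}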
Do you know whether the first version, f, is subject to any optimization in such cases (by the compiler or the math library) that would avoid the underflow? If not, how would one handle such cases for a fast and stable evaluation?
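One more option I have considered, though I do not know how it compares in speed or accuracy (which is really what I am asking), is to work in log space and exponentiate only once at the end. A rough sketch (call it h; this assumes a > 0 and b > 0 so that log() is defined):

#include <math.h>

/* sketch only: compute a^n / b^m as exp(n*log(a) - m*log(b));
   the intermediate value n*log(a) - m*log(b) stays in a modest range,
   so over-/underflow can only occur if the final result itself does */
double h(double a, double b, long m, long n)
{
    return exp((double)n * log(a) - (double)m * log(b));
}

My worry with this version is accuracy: any rounding error in log(a) or log(b) gets multiplied by n or m before the final exp().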