I have a Lagrange interpolation algorithm whose output begins to diverge after many incremented input steps, and I can't seem to figure out why. As a quick review, if I had two arrays
int x[11] = {0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100};
int y[11] = {0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20};
and I input an x-value of 15 into the algorithm, the output (i.e. the interpolated y-value) should be 3. The algorithm below gets that interpolated value correct, but as I cycle through incremented inputs the outputs eventually begin to diverge, and I am not sure what is causing it. The code fills two arrays with the values -100 to +100 (so x and y are identical, and the interpolated y-value should always equal the x-input) and interpolates at an x-input that is incremented on each pass through the loop. The outputs match the inputs as they should at first, but around an input of 55 or so the interpolated y-value begins to drift away from the input. The code is below. Any insight would be greatly appreciated.
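For reference, the formula I'm computing is the standard Lagrange form

P(x) = sum_i y_i * L_i(x), where L_i(x) = prod_{j != i} (x - x_j) / (x_i - x_j)

so each factor[i] in the code below is one basis polynomial L_i evaluated at the current input.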
#include <stdio.h>

#define SIZE 201

int main()
{
    double x[SIZE], y[SIZE], value, sum, factor[SIZE];

    /* Nodes and sample values both run from -100 to +100, so y = x exactly
       and the interpolated output should always equal the input. */
    for (int i = 0; i < SIZE; i++)
    {
        x[i] = -100 + i;
    }
    for (int i = 0; i < SIZE; i++)
    {
        y[i] = -100 + i;
    }

    value = 0.0;
    while (1)
    {
        sum = 0.0;
        printf("Input is: %lf\n", value);

        /* Lagrange form: factor[i] is the i-th basis polynomial evaluated
           at 'value'; the result is the sum of factor[i] * y[i]. */
        for (int i = 0; i < SIZE; i++)
        {
            factor[i] = 1.0;
            for (int j = 0; j < SIZE; j++)
            {
                if (i != j)
                {
                    factor[i] = factor[i] * (value - x[j]) / (x[i] - x[j]);
                }
            }
            sum = sum + factor[i] * y[i];
        }
        printf("Output is: %lf\n", sum);

        // if ((value - sum) > 0.01) break;
        if (value < 100) value += 0.001;
        else break;
    }
    return 0;
}
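For completeness, here is a stripped-down sketch of the same loop run on the small 11-point arrays from the quick review above (the query point is hard-coded, so treat it only as an illustration). Evaluating at x = 15 it should print 3, up to rounding, which suggests the formula itself is fine at this size:

#include <stdio.h>

int main(void)
{
    /* The 11-point example arrays from the intro, stored as doubles. */
    double x[11] = {0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100};
    double y[11] = {0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20};
    double value = 15.0, sum = 0.0;

    for (int i = 0; i < 11; i++)
    {
        double factor = 1.0;                          /* L_i(value) */
        for (int j = 0; j < 11; j++)
        {
            if (i != j)
                factor *= (value - x[j]) / (x[i] - x[j]);
        }
        sum += factor * y[i];                         /* accumulate y_i * L_i(value) */
    }
    printf("Output is: %lf\n", sum);                  /* expect 3.000000 */
    return 0;
}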