For simplicity, let's assume we have the function sin(x) and have calculated 1000 samples of it between -1 and 1. We can plot those samples. In the next step we want to plot the integral of sin(x), which would be -cos(x) + C. I can calculate the integral from my existing samples like this:
y[n] = x[n] + y[n-1]
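Here is a minimal sketch of this setup (Python with NumPy; the variable names are just illustrative):

    import numpy as np

    t = np.linspace(-1, 1, 1000)  # 1000 sample positions between -1 and 1
    x = np.sin(t)                 # the samples of sin(t)

    # y[n] = x[n] + y[n-1] is exactly a cumulative sum
    y = np.cumsum(x)

(A proper Riemann sum would also multiply each term by the step size, but that only changes the overall scale, which the normalization below removes anyway.)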
Because it's a cumulative sum, we need to normalize the result to get samples between -1 and 1 on the y axis. To normalize we need the minimum and maximum of the samples:

y = 2 * (y - min(y)) / (max(y) - min(y)) - 1
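In code, a sketch of this normalization could look like this:

    def normalize(y):
        # Min-max normalization of the samples to the range [-1, 1]
        return 2 * (y - y.min()) / (y.max() - y.min()) - 1

    y_norm = normalize(y)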
Now we want to calculate the next 1000 samples of sin(x) and compute the integral again. Because it's a cumulative sum, we will get a new maximum, which means we have to renormalize all 2000 samples.
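To make the problem concrete, continuing the sketch from above (the continued sample positions are only illustrative):

    # Next 1000 samples of sin(t), continuing where the first block ended
    t2 = np.linspace(1, 3, 1000)
    x2 = np.sin(t2)

    # Continue the cumulative sum from the last value of the first block
    y = np.concatenate([y, y[-1] + np.cumsum(x2)])

    # The running sum now has a new maximum/minimum, so all 2000 samples
    # have to be pushed through the normalization again:
    y_norm = normalize(y)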
Now my question basically is:
How can I normalize samples in this context without knowing the final maximum and minimum? And how can I avoid renormalizing all previous samples every time a new set of samples brings a new maximum or minimum?