I am working on an assignment that requires me to calculate the first 9 partial sums of the Taylor series for ln(2), which is the alternating sum ln(2) = sum_{n=1}^inf (-1)^(n+1) / n = 1 - 1/2 + 1/3 - 1/4 + ...
I have to use Aitken's delta-squared process to accelerate it, so that I can compare it to the normal sequential summation. As I understand it, given the partial sums x_n, the accelerated sequence is A_n = x_n - (x_{n+1} - x_n)^2 / (x_{n+2} - 2*x_{n+1} + x_n); the Wikipedia article also has some pseudo code for this, which I tried to follow after not being able to figure it out on my own.
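To check that I at least understand the formula itself, here is a bare-bones sketch of that update applied to the partial sums in plain Python (just my own illustration of the formula above, with made-up names, not the assignment's pseudo code):

def aitken_ln2(n_terms=9):
    # partial sums s[0..n_terms-1] of the alternating series 1 - 1/2 + 1/3 - ...
    s = []
    total = 0.0
    for k in range(1, n_terms + 1):
        total += (-1) ** (k + 1) / k
        s.append(total)
    # Aitken update: A_k = s[k] - (s[k+1] - s[k])**2 / (s[k+2] - 2*s[k+1] + s[k])
    accel = []
    for k in range(len(s) - 2):
        denom = s[k + 2] - 2 * s[k + 1] + s[k]
        accel.append(s[k] - (s[k + 1] - s[k]) ** 2 / denom)
    return s, accel

With nine partial sums this produces seven accelerated values, and the last of them should already sit much closer to ln(2) ≈ 0.693 than the ninth partial sum does.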
So my question is: why isn't my code working? It seems to converge, but to the wrong value. What am I doing wrong?
import numpy as np
from numba import jit

@jit(nopython=True, cache=True)
def f():
    # attempted Aitken's delta-squared acceleration
    y_0 = 1
    y_3 = 0
    for i in range(4, 10):
        y_1 = (-1) ** (i + 1) * (1 / i)              # i-th term of the series
        y_2 = (-1) ** ((i + 1) + 1) * (1 / (i + 1))  # (i+1)-th term
        y_3 += y_2 - ((y_2 - y_1) ** 2) / ((y_2 - y_1) / (y_1 - y_0))
        y_0 = y_3
    return y_0

@jit(nopython=True, cache=True)
def g():
    # plain partial sum of the alternating series
    x = 0
    for i in range(1, 10):
        x += (-1) ** (i + 1) * (1 / i)
    return x

x = g()
y = f()
print(x, y)
Here, f() is my attempt at the iterated Aitken's process, and g() is the plain summation approach. Ignore the @jit decorators; that's just to keep the runtime under control if I ever have to scale this up.
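For reference, the true value can be checked against math.log(2); a quick error check (my own addition, not required by the assignment) looks like this:

import math

x = g()  # plain partial sum after 9 terms
y = f()  # attempted Aitken-accelerated value
print(x - math.log(2), y - math.log(2))  # errors relative to the true value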
Any and all help is appreciated, thanks!