I'm working on a project that requires computing numerical derivatives of a function accurately (up to a given precision $N$).
The usual approach is a finite-difference scheme, e.g. the forward difference
$$\frac{f(x+h)-f(x)}{h}.$$
However, there is little documentation on how small $h$ should be, or on how to check the convergence and stability of the result, since $h=0$ means something different in numerical computation than in symbolic computation.
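To illustrate what I mean, here is a minimal sketch (my own experiment, using $f(x)=\sin x$ as a stand-in for my actual function, and a helper `forward_diff` I wrote for the test): sweeping $h$ in double precision shows the error first shrinking and then growing again, so "as small as possible" is clearly not the right rule.

```python
# Sketch: sweep h for the forward difference of sin at x = 1.0,
# where the exact derivative cos(1.0) is known, and watch the error.
import numpy as np

def forward_diff(f, x, h):
    """One-sided finite difference (f(x+h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

x0 = 1.0
exact = np.cos(x0)
for h in [10.0**(-k) for k in range(1, 16)]:
    err = abs(forward_diff(np.sin, x0, h) - exact)
    print(f"h = {h:.0e}   error = {err:.3e}")
# The error first decreases like O(h) (truncation) and then increases
# again roughly like O(eps/h) (round-off); in double precision the
# minimum sits near h ~ sqrt(eps) ~ 1e-8, far away from h = 0.
```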
How can I tell how small $h$ must be in the finite difference to reach precision $N$? Also, is there any numerical differentiation method that is stable under repeated application, since the aim is to apply the differentiation up to 1000 times?
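For context, this is the kind of instability I'm worried about (again a sketch with $\sin$ as a stand-in and a helper `nest_forward_diff` of my own): nesting the forward difference $n$ times to approximate the $n$-th derivative already loses all accuracy after a handful of applications, let alone 1000.

```python
# Sketch: nest the forward difference n times and compare with the
# exact n-th derivative of sin, which is sin(x + n*pi/2).
import numpy as np

def nest_forward_diff(f, n, h):
    """Approximate the n-th derivative by nesting the forward difference n times."""
    g = f
    for _ in range(n):
        g = (lambda g: lambda x: (g(x + h) - g(x)) / h)(g)
    return g

x0, h = 1.0, 1e-5
for n in range(1, 8):
    approx = nest_forward_diff(np.sin, n, h)(x0)
    exact = np.sin(x0 + n * np.pi / 2)
    print(f"n = {n}   approx = {approx: .6e}   exact = {exact: .6e}")
# Each application amplifies the round-off error by roughly a factor 1/h,
# so by n ~ 4 or 5 the result is pure noise.
```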