I have a pointwise-defined function. It comes from a deposit scenario where you deposit 1k USD each month at 5% interest and plot the result with NumPy. There are multiple ways to compute the marginal interest, such as Finite Differences, Automatic Derivatives, Symbolic Differentiation, and by hand in closed form (rough by-hand figures below, with a closed-form sketch after the list):
0. 0-10 months: 10 USD
1. 10-20 months: 50 USD
2. 20-30 months: 100 USD
3. 30-40 months: 130 USD
4. 40-50 months: 200 USD
5. 50-60 months: 260 USD
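For reference, the by-hand/closed-form route can be made concrete: because the accumulated interest is a partial sum, consecutive values telescope. This is only a minimal sketch assuming the 1000*exp(0.05*k/12) deposit model from the code below, and its exact numbers need not match the rough figures in the list:

import numpy as np

def marginal_interest_closed_form(n):
    # interests(n) = sum_{k=0}^{n-1} 1000*(exp(0.05*k/12) - 1), so the increment
    # telescopes: interests(n+1) - interests(n) = 1000*(exp(0.05*n/12) - 1).
    return 1000 * (np.exp(0.05 * n / 12) - 1)

print([round(marginal_interest_closed_form(n), 2) for n in (10, 20, 30, 40, 50)])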
and in Python, the code without the marginal differentiation looks like this:
import numpy as np
import matplotlib.pyplot as plt

def situationNmonth(n):
    # Balance after the nth month: deposit k contributes 1000*exp(0.05*k/12).
    return np.sum([1000 * np.exp(0.05 * k / 12) for k in range(n)])

def myHistory():
    # Balance for each of the 60 months.
    return [situationNmonth(n) for n in range(60)]

def interests(n):
    # Accumulated interest after month n: balance minus the bare deposits.
    return situationNmonth(n) - n * 1000

def plotInterests():
    plt.plot(range(60), [interests(n) for n in range(60)])
    plt.title("5% interest over 60 months with 1k USD per month.")
    plt.show()
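The baseline I can think of is a plain finite difference on the sampled values, reusing interests from above; this sketch assumes that by "marginal interest" I mean the month-over-month change in the accumulated interest:

# Finite differences on the pointwise-defined interest curve.
months = np.arange(60)
curve = np.array([interests(m) for m in months])

marginal_forward = np.diff(curve)               # 59 forward differences, one per month step
marginal_centered = np.gradient(curve, months)  # 60 centered estimates on the same grid

plt.plot(months[1:], marginal_forward, label="np.diff (forward)")
plt.plot(months, marginal_centered, label="np.gradient (centered)")
plt.legend()
plt.title("Marginal interest per month via finite differences")
plt.show()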
What is the easiest way here to differentiate the curve that plotInterests draws, i.e. a pointwise-defined function, so I can see the interest gained in each individual month? What is the status of Finite Differences, Automatic Derivatives and Symbolic Differentiation in Python, and can they be computed robustly here with Python 3?
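For the symbolic route, my understanding is that SymPy would be the tool; the sketch below is an assumption on my part in that it replaces the discrete sum in situationNmonth with its geometric-series closed form, so there is a continuous expression in n to differentiate (an automatic-differentiation library such as JAX or autograd would need the same kind of continuous surrogate):

import sympy as sp

n = sp.symbols('n', positive=True)
r = sp.Rational(5, 100) / 12   # monthly rate 0.05/12, kept exact

# Closed form of the sum in situationNmonth:
# sum_{k=0}^{n-1} 1000*exp(r*k) = 1000*(exp(r*n) - 1)/(exp(r) - 1)
balance = 1000 * (sp.exp(r * n) - 1) / (sp.exp(r) - 1)
interest = balance - 1000 * n

d_interest = sp.diff(interest, n)            # exact symbolic derivative with respect to n
marginal = sp.lambdify(n, d_interest, 'numpy')

print(sp.simplify(d_interest))
print(marginal(30.0))                        # derivative of the interest curve near month 30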