
I have a pointwise-defined function. It comes from a deposit scenario where you deposit 1k USD each month at 5% interest and plot the balance with NumPy. There are several ways to compute the marginal interest, such as finite differences, automatic differentiation, symbolic differentiation and by hand (some aspects are covered here, but in closed form):

0. 0-10  months: 10 USD
1. 10-20 months: 50 USD
2. 20-30 months: 100 USD
3. 30-40 months: 130 USD
4. 40-50 months: 200 USD
5. 50-60 months: 260 USD

and in Python the code without marginal differentiation:

import numpy as np
import matplotlib.pyplot as plt

def situationNmonth(n):
    # Balance after the nth month: each deposit of 1000 USD grows
    # with continuously compounded 5% annual interest.
    return np.sum([1000*np.exp(0.05*k/12) for k in range(n)])

def myHistory():
    return [situationNmonth(n) for n in range(60)]

def interests(n):
    # Accumulated interest after month n: balance minus the raw deposits.
    return situationNmonth(n) - n*1000

def plotInterests():
    plt.plot(range(60), [interests(n) for n in range(60)])
    plt.title("5% interests over 60 months with 1k USD per month.")
    plt.show()

[plot: 5% interests over 60 months with 1k USD per month]

What is the easiest way here to differentiate the pointwise-defined interests function, so that I can see each month's additional interest? What is the status of finite differences, automatic differentiation and symbolic differentiation in Python, and can they be computed robustly here with Python 3?

hhh

2 Answers


Parametric solutions

Parametric solutions include NumPy's gradient function:

Return the gradient of an N-dimensional array.

The gradient is computed using second order accurate central differences in the interior points and either first or second order accurate one-sides (forward or backwards) differences at the boundaries. The returned gradient hence has the same shape as the input array. (docs)

Here the second-order central differences at the interior points are what matter. For example, you can apply np.gradient directly to the interest values:

>>> data = (list(range(60)), [interests(n) for n in range(60)])

>>> np.gradient(data[1])
array([   0.        ,    2.08767965,    6.27175575,   10.47330187,
         14.69239096,   18.92909627,   23.18349133,   27.45565003,
         31.74564653,   36.0535553 ,   40.37945114,   44.72340914,
         49.08550474,   53.46581365,   57.86441192,   62.28137592,
         66.71678233,   71.17070816,   75.64323073,   80.13442769,
         84.64437701,   89.17315698,   93.72084624,   98.28752374,
        102.87326876,  107.47816091,  112.10228014,  116.74570672,
        121.40852129,  126.09080477,  130.79263848,  135.51410403,
        140.25528339,  145.01625888,  149.79711316,  154.59792922,
        159.41879041,  164.25978043,  169.12098332,  174.00248348,
        178.90436566,  183.82671495,  188.76961683,  193.73315709,
        198.71742192,  203.72249785,  208.74847176,  213.79543092,
        218.86346295,  223.95265584,  229.06309793,  234.19487796,
        239.34808501,  244.52280855,  249.71913842,  254.93716484,
        260.17697839,  265.43867004,  270.72233115,  273.36966551])

where you can see the monthly additional interest. Note that np.gradient(data[1], 2) would not give the second derivative: the second positional argument is the sample spacing. To approximate the second derivative, apply np.gradient twice, i.e. np.gradient(np.gradient(data[1])).
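To illustrate this on a synthetic series (a sketch, not from the original post): with f(x) = x² sampled at unit spacing, the central differences reproduce f'(x) = 2x exactly at the interior points, and applying np.gradient twice recovers f''(x) = 2 away from the boundaries:

```python
import numpy as np

# f(x) = x**2 sampled at unit spacing, so f' = 2x and f'' = 2.
x = np.arange(10.0)
y = x**2

first = np.gradient(y)                 # central differences in the interior
second = np.gradient(np.gradient(y))   # apply twice for the 2nd derivative

print(first[1:-1])    # exactly 2*x at the interior points
print(second[2:-2])   # ~2 away from the one-sided boundary estimates
```

The boundary values are less accurate because np.gradient falls back to one-sided differences there, which is why the slices above exclude the edges.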

[plot: np.gradient of the interest curve, i.e. the monthly marginal interest]

Non-parametric solutions and end-to-end approaches

Non-parametric solutions include Bayesian approaches: the original data points are treated as noisy observations with uncertainty, and the derivative is read off the posterior, e.g. of a Gaussian process.

  1. https://github.com/HIPS/autograd

  2. ftp://ftp.tuebingen.mpg.de/pub/kyb/antonio/pub/ebio/chrisd/GPtutorial.pdf

  3. http://scikit-learn.org/stable/auto_examples/gaussian_process/plot_gpr_noisy_targets.html

  4. Gaussian Processes For Machine Learning

I leave this section open; perhaps an expert can explain how to use these in Python, and which kinds of solutions exist to compute the derivatives robustly from the given points.
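On the symbolic-differentiation part of the question: the sum in situationNmonth is a geometric series with closed form 1000·(exp(r·n) − 1)/(exp(r) − 1) for r = 0.05/12, so SymPy can differentiate the interest function exactly. A sketch under that assumption (the closed form is my derivation, not from the original post):

```python
import sympy as sp

n = sp.symbols('n', positive=True)
r = sp.Rational(5, 100) / 12  # monthly rate 0.05/12

# Closed form of sum_{k=0}^{n-1} 1000*exp(r*k) (geometric series),
# minus the raw deposits, i.e. the accumulated interest as a smooth function of n.
interest = 1000 * (sp.exp(r*n) - 1) / (sp.exp(r) - 1) - 1000*n

marginal = sp.diff(interest, n)        # exact symbolic derivative
f = sp.lambdify(n, marginal, 'math')   # numeric evaluation
print(f(30))  # ~130.8, close to the np.gradient value at month 30
```

This gives the exact marginal interest without any finite-difference error, and it agrees well with the np.gradient estimate above in the interior.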

hhh

You can interpolate your discrete data points, and differentiate the resulting interpolant.
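For example (a sketch, not from the original answer), SciPy's CubicSpline fits a piecewise cubic whose derivative is available exactly via its derivative() method. Here it is checked on a stand-in series, sin(x), whose true derivative is cos(x):

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Stand-in data: sin(x) sampled coarsely; the true derivative is cos(x).
x = np.linspace(0, 2*np.pi, 30)
y = np.sin(x)

spline = CubicSpline(x, y)
dspline = spline.derivative()   # a new spline representing the derivative

xs = np.linspace(0, 2*np.pi, 200)
max_err = np.max(np.abs(dspline(xs) - np.cos(xs)))
print(max_err)  # small interpolation error
```

Applied to the question's data, CubicSpline(range(60), [interests(n) for n in range(60)]).derivative() would give the marginal interest as a smooth function of the month.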

ev-br
  • I am looking more for an end-to-end solution without explicitly specifying an interpolant; I found [this](https://github.com/HIPS/autograd) but am not yet sure whether they have done a demo. – hhh Jul 15 '18 at 20:59