numpy.diff
The first order difference is given by out[n] = a[n+1] - a[n]
https://docs.scipy.org/doc/numpy-1.10.1/reference/generated/numpy.diff.html
import numpy as np

data = [1, 2, 4, 5, 8, 7, 6, 4, 1]
data = np.array(data, dtype=float)

# each successive np.diff is a discrete analogue of the next derivative
velocity = np.diff(data)
acceleration = np.diff(velocity)
jerk = np.diff(acceleration)
jounce = np.diff(jerk)

print(data)
print(velocity)
print(acceleration)
print(jerk)
print(jounce)
>>>
[1. 2. 4. 5. 8. 7. 6. 4. 1.]
# positive numbers = rising
[ 1.  2.  1.  3. -1. -1. -2. -3.]
# positive numbers = concave up
[ 1. -1.  2. -4.  0. -1. -1.]
# positive numbers = curling up
[-2.  3. -6.  4. -1.  0.]
# positive numbers = snapping up
[ 5. -9. 10. -5.  1.]
https://en.wikipedia.org/wiki/Velocity
https://en.wikipedia.org/wiki/Acceleration
https://en.wikipedia.org/wiki/Jerk_(physics)
https://en.wikipedia.org/wiki/Jounce
my tendency is then to divide the 1st derivative (velocity) by a moving average and multiply by 100 to convert it to a % rate of change (%ROC); sometimes acceleration, the concavity, is also important; the further out you go (jerk, jounce) the more stochastic/noisy the data becomes
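for example, a minimal sketch of that %ROC idea; the window size (3) and how the moving average is lined up against the diff output are my assumptions, not a fixed recipe:

import numpy as np

data = np.array([1, 2, 4, 5, 8, 7, 6, 4, 1], dtype=float)
velocity = np.diff(data)

# trailing 3-point moving average; mode='valid' keeps only full windows
window = 3
moving_avg = np.convolve(data, np.ones(window) / window, mode='valid')

# divide each difference by the average of the window ending where
# that difference starts, then scale to a percentage
roc_pct = velocity[window - 1:] / moving_avg[:-1] * 100
print(roc_pct)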
you can also calculate the mean of each:
print(np.mean(data))
print(np.mean(velocity))
print(np.mean(acceleration))
to make generalizations about the shape, for this sample set:
>>>
4.222222222222222 # average value
0.0 # generally sideways; no trend
-0.5714285714285714 # concave mostly down
and then the relative standard deviation (the coefficient of variation):
import numpy as np
data = [1, 2, 4, 5, 8, 7, 6, 4, 1]
coef_variance = np.std(data) / np.mean(data)
print(coef_variance)
>>>
0.566859453383
which I'd call "fairly volatile", but not extreme by orders of magnitude; typically a value > 1 is considered "highly variant"
https://en.wikipedia.org/wiki/Coefficient_of_variation
and if we plot:
import matplotlib.pyplot as plt

data = [1, 2, 4, 5, 8, 7, 6, 4, 1]
x = range(9)
plt.plot(x, data, c='red', ms=2)
plt.show()
we can see that this is a generally good description of what we find:
[plot of the data]
no overall up/down trend, fairly volatile, concave down; mean just over 4
you can also polyfit:
import matplotlib.pyplot as plt
import numpy as np

data = [1, 2, 4, 5, 8, 7, 6, 4, 1]
x = np.arange(9)
plt.plot(x, data, c='red', ms=2)

# fit a degree-2 polynomial and evaluate it at each x
poly = np.polyfit(x, data, 2)
z = np.polyval(poly, x)

plt.plot(x, z, c='blue', ms=2)
print(poly)
plt.show()
which returns:
[-0.37445887  3.195671   -0.07272727]
in other words:
-0.374x^2 + 3.195x - 0.072
which plots:
[plot of the data with the fitted quadratic]
from there you can calculate the sum of squared differences to see how accurate your model is:
Sum of Square Differences (SSD) in numpy/scipy
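for instance, a minimal way to compute it, reusing poly and x from the snippet above (np.polyval evaluates the fitted polynomial at each x):

ssd = np.sum((np.polyval(poly, x) - np.asarray(data)) ** 2)
print(ssd)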
and you could iterate the polyfit process, increasing the degree each time
np.polyfit(x, data, degree)
until you attain an adequately low SSD for your needs, which would tell you whether your data is more x^2-ish, x^3-ish, x^4-ish, etc.
degree = 1
ssd = float('inf')
while ssd > your_desire:  # your_desire: whatever threshold fits your needs
    poly = np.polyfit(x, data, degree)
    ssd = np.sum((np.polyval(poly, x) - data) ** 2)
    degree += 1