Let's say I have a plotted line, where I use the x-coordinate of each point as the index into an array and the y-coordinate as the value stored at that index. With this array, I need to find the line's minimum acceleration (the lowest value of the slope of the slope).
My initial idea was to first compute the slope at every point as
slope[i] = (line[i] - line[i-1]) / (i - (i-1))
with
slope[0] = 0
and then the acceleration as
acceleration[i] = (slope[i] - slope[i-1]) / (i - (i-1))
with
acceleration[0] = 0
and finally sorting acceleration so that the lowest value comes first. But I'm not sure whether this approach will actually work.
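
Here is roughly what I had in mind as a Python sketch (untested, and assuming the x step between consecutive points is always 1, so the denominators drop out):

    def min_acceleration(line):
        n = len(line)
        slope = [0.0] * n
        acceleration = [0.0] * n

        # First differences: slope[i] = line[i] - line[i-1], with slope[0] = 0
        for i in range(1, n):
            slope[i] = line[i] - line[i - 1]

        # Second differences: acceleration[i] = slope[i] - slope[i-1],
        # with acceleration[0] = 0
        for i in range(1, n):
            acceleration[i] = slope[i] - slope[i - 1]

        # Sort so the lowest acceleration comes first, then take it
        return sorted(acceleration)[0]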