I'm trying to fit a function of the form

    y(t) = A*cos(c_0 + c_1*t + ... + c_n*t**n)**2 + B

where A and B are fixed constants and the c_i are the polynomial (phase) coefficients to be fitted. In scipy, my usual (and I think reasonably canonical) approach to such problems goes like so:
    def func(t, coefs):
        phase = np.poly1d(coefs)(t)
        return A * np.cos(phase)**2 + B

    def fit(time, data, guess_coefs):
        residuals = lambda p: func(time, p) - data
        fit_coefs, ier = scipy.optimize.leastsq(residuals, guess_coefs)
        return fit_coefs
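For reference, this baseline can be exercised end to end. The sketch below is self-contained, with made-up values for A and B and synthetic noise-free data (note that leastsq returns a (solution, status) tuple, so the solution is its first element):

```python
import numpy as np
import scipy.optimize

A, B = 2.0, 0.5  # placeholder values for the fixed constants

def func(t, coefs):
    phase = np.poly1d(coefs)(t)          # coefs ordered highest power first
    return A * np.cos(phase)**2 + B

def fit(time, data, guess_coefs):
    residuals = lambda p: func(time, p) - data
    fit_coefs, ier = scipy.optimize.leastsq(residuals, guess_coefs)
    return fit_coefs

# synthetic, noise-free data with a quadratic phase
time = np.linspace(0, 1, 200)
true_coefs = [3.0, 2.0, 0.1]
data = func(time, true_coefs)
fitted = fit(time, data, guess_coefs=[3.1, 1.9, 0.15])
print(fitted)
```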
This works OK, but I'd like to supply an analytic Jacobian to improve convergence. Thus:
    def jacobian(t, coefs):
        phase = np.poly1d(coefs)(t)
        the_jac = []
        for i in np.arange(len(coefs)):
            the_jac.append(-A * np.sin(2 * phase) * t**i)
        return the_jac
    def fit(time, data, guess_coefs):
        residuals = lambda p: func(time, p) - data
        jac = lambda p: jacobian(time, p)
        fit_coefs, ier = scipy.optimize.leastsq(residuals, guess_coefs,
                                                Dfun=jac, col_deriv=True)
        return fit_coefs
This doesn't work as well, even at polynomial orders of 2 or less, and a quick check with scipy.optimize.check_grad doesn't give positive results either.
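Incidentally, scipy.optimize.check_grad expects a scalar-valued function, so for a vector of residuals it's often easier to compare the analytic Jacobian against central finite differences directly. A minimal sketch with placeholder constants (note that np.poly1d orders coefficients from the highest power down, so the derivative of the phase with respect to coefs[i] is t**(deg - i)):

```python
import numpy as np

A, B = 2.0, 0.5  # placeholder values for the fixed constants

def func(t, coefs):
    phase = np.poly1d(coefs)(t)
    return A * np.cos(phase)**2 + B

def jacobian(t, coefs):
    # rows = parameters, columns = data points (the col_deriv=True layout);
    # poly1d puts the highest power first, so d(phase)/d(coefs[i]) = t**(deg - i)
    phase = np.poly1d(coefs)(t)
    deg = len(coefs) - 1
    return np.array([-A * np.sin(2 * phase) * t**(deg - i)
                     for i in range(len(coefs))])

def fd_jacobian(t, coefs, eps=1e-7):
    # central finite differences in the same (n_params, n_points) layout
    coefs = np.asarray(coefs, dtype=float)
    rows = []
    for i in range(len(coefs)):
        dp = np.zeros_like(coefs)
        dp[i] = eps
        rows.append((func(t, coefs + dp) - func(t, coefs - dp)) / (2 * eps))
    return np.array(rows)

t = np.linspace(0, 1, 50)
coefs = [3.0, 2.0, 0.1]
err = np.max(np.abs(jacobian(t, coefs) - fd_jacobian(t, coefs)))
print(err)  # small if the analytic Jacobian is right
```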
I'm almost certain the Jacobian and the code are correct (although please correct me if not), and that the problem is more fundamental: the t**i terms in the Jacobian cause overflow errors. This isn't a problem in the function itself, because there the monomial terms are multiplied by their coefficients, which are very small.
My questions are:
- Is there something wrong code-wise with what I've done above?
- Or is there some other problem?
- If my hypothesis is correct, is there a way to precondition the fit function so the Jacobian is better behaved? Maybe I could fit to logarithms of the data and time, or something.
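On that last point: rather than taking logarithms, a common preconditioning trick (sketched below under the same placeholder constants) is to map the time axis onto [-1, 1] and fit the phase polynomial in the scaled variable, so the monomials s**i stay O(1) at any order; the guess coefficients are then in the scaled variable too, and the model is evaluated back at real times through the same mapping:

```python
import numpy as np
import scipy.optimize

A, B = 2.0, 0.5  # placeholder values for the fixed constants

def fit_scaled(time, data, guess_coefs):
    # map time onto s in [-1, 1] so the monomials s**i never blow up
    t0 = 0.5 * (time.min() + time.max())
    half = 0.5 * (time.max() - time.min())
    s = (time - t0) / half

    def func_s(s, coefs):
        phase = np.poly1d(coefs)(s)
        return A * np.cos(phase)**2 + B

    residuals = lambda p: func_s(s, p) - data
    fit_coefs, ier = scipy.optimize.leastsq(residuals, guess_coefs)
    # the fitted polynomial lives in s; evaluate the model at any t via the map
    model = lambda t: func_s((t - t0) / half, fit_coefs)
    return fit_coefs, model
```

The fitted coefficients are those of the scaled variable, which is usually fine for evaluating or interpolating the model; recovering coefficients in raw t would require composing the polynomial with the affine map.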
Thanks!
Edit: forgot the square in the original functional form