
I am new to IPython and need to solve a specific curve-fitting problem; I have the concept, but my programming knowledge is still too limited. I have experimental data (x, y) to fit to an equation (curve fitting) with four coefficients (a, b, c, d). I would like to fix one of these coefficients (e.g. a) to a specific value and refit my experimental data (non-linear least squares). Coefficients b, c and d are not independent of one another: they are related by a system of equations.

Is it more adequate to use curve_fit or lmfit?

I started this with curve_fit:

import numpy as np
from scipy.optimize import curve_fit

def fitfunc(x, a, b, c, d):
    return a + b*x + c/x + d*np.log10(x)

popt, fitcov = curve_fit(fitfunc, x, y)

or a code like this with lmfit:

import numpy as np
from lmfit import minimize, Parameters, Parameter, report_fit

def fctmin(params, x, y):
    a = params['a'].value
    b = params['b'].value
    c = params['c'].value
    d = params['d'].value

    model = a+b*x+c/x+d*np.log10(x)

    return model - y

#create parameters
params = Parameters()
params.add('a', value = -89)
params.add('b', value =b)
params.add('c', value = c)
params.add('d', value = d)

#fit leastsq model
result = minimize(fctmin, params, args=(x, y))

#calculate results
final = y +  result.residual
report_fit(params)
  • The key here seems to be that your coefficients are not independent. If they were, `curve_fit` should do the job just fine. But you don't seem to be handling the dependence in either of your solutions above, so there is no guarantee that the output from `curve_fit` or `lmfit` will preserve that dependence. Can you solve the dependence equations for one or more independent variables? – Jsl Jun 14 '14 at 01:22

1 Answer


I'll admit to being biased. curve_fit() is designed to simplify scipy.optimize.leastsq() by assuming that you are fitting y(x) data to a model for y(x, parameters), so the function you pass to curve_fit() is one that calculates the model for the values to be fit. lmfit is a bit more general and flexible in that your objective function has to return the array to be minimized in the least-squares sense, i.e. "model - data" instead of just "model".

But, lmfit has features that appear to do exactly what you want: fix one of the parameters in the model without having to rewrite the objective function.
That is, you could say

params.add('a', value = -89, vary=False)

and the parameter 'a' will stay fixed. To do that with curve_fit() you have to rewrite your model function.
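For comparison, here is a sketch of that curve_fit() rewrite: a wrapper function hard-codes a, so only b, c and d are exposed to the fitter. The data below is synthetic and the fixed value -89 is taken from your snippet, purely for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

def fitfunc(x, a, b, c, d):
    return a + b * x + c / x + d * np.log10(x)

a_fixed = -89.0  # value to hold constant during the fit

# wrapper exposing only the free parameters to curve_fit
def fitfunc_fixed_a(x, b, c, d):
    return fitfunc(x, a_fixed, b, c, d)

# synthetic data, for illustration only
x = np.linspace(1, 10, 50)
y = fitfunc(x, a_fixed, 2.0, 3.0, 4.0)

popt, pcov = curve_fit(fitfunc_fixed_a, x, y)
# popt holds the fitted (b, c, d); a stays at a_fixed
```

The downside is exactly the point above: every different choice of which parameters to fix needs its own wrapper function.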

In addition, you say that b, c and d are related by equations, but don't give details. With lmfit, you might be able to include these equations as constraints. You have

params.add('b', value =b)
params.add('c', value = c)
params.add('d', value = d)

though I don't see a value for b. Even assuming there is a value, this creates three independent variables with the same starting value. You might mean "vary b, and force c and d to have the same value". lmfit can do this with:

params.add('b', value = 10)
params.add('c', expr = 'b')
params.add('d', expr = 'c')

That will have one independent variable, and the value for c will be forced to the value of b (and d to c). You can use almost any valid Python expression as a constraint expression, for example:

params.add('b', value = 10)
params.add('c', expr = 'sqrt(b/10)')
params.add('d', expr = '1-c')

I think that might be the sort of thing you're looking for.

  • thank you!!! It's exactly what I was looking for. I also found out that if I want to use the calculated coefficients in a later equation, I need to use: `a = params['a'].value`, `b = params['b'].value`, `c = params['c'].value`, `d = params['d'].value` – zircon_34 Jun 14 '14 at 12:56