
To summarize:

My post is quite long but my two questions are:

Can we use equality or inequality constraints in scipy.optimize.curve_fit?

How can I use minimize when my initial guesses are far from the optimal values?

Some details:

I have been trying to use scipy.optimize.curve_fit and scipy.optimize.minimize to find the optimal parameters to fit an experimental curve.

I have experimental x, y data that I want to fit. As an example, the function F to optimize is a sum of several f(x, arg1, arg2, arg3, arg4) terms plus a linear function (ax + b). With curve_fit it works almost fine, but I would like to add a constraint on the linear part (ax + b) to make sure that I never get y < 0.

And I didn't find a way to add equality or inequality constraints in curve_fit. Is it possible?
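
(As far as I understand, curve_fit only takes simple box bounds on each parameter through its bounds argument, not general constraints. A minimal sketch with a made-up linear model, just to show what I mean:)

import numpy as np
from scipy import optimize

def f(x, a, b):          # toy model, purely for illustration
    return a * x + b

rng = np.random.default_rng(0)
xdata = np.linspace(0.0, 10.0, 50)
ydata = f(xdata, 2.0, 1.0) + 0.1 * rng.normal(size=xdata.size)

# curve_fit only supports per-parameter (lower, upper) box bounds,
# not general equality/inequality constraints
popt, pcov = optimize.curve_fit(f, xdata, ydata, p0=[1.0, 0.5],
                                bounds=([-np.inf, 0.0], [np.inf, np.inf]))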

So I tried to switch to minimize by creating a diff function (a sum of squared differences):

def diff(x, F, y, args):
    # x is the parameter vector that minimize varies; F is the model,
    # y the experimental data, and args the remaining model arguments
    intmodel = F(x, args)
    summdiff = 0
    for i, item in enumerate(intmodel):
        summdiff = summdiff + (y[i] - item) ** 2
    return summdiff
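
(For what it's worth, here is a self-contained toy version of what I think the objective should look like when minimize calls it: the first argument receives the parameter vector, and everything else has to be forwarded through args. The model line and the data below are made up purely for illustration.)

import numpy as np
from scipy import optimize

def sum_sq(params, model, xdata, ydata):
    # sum of squared residuals; `params` is what minimize varies
    return np.sum((ydata - model(xdata, *params)) ** 2)

def line(x, a, b):       # toy model, for illustration only
    return a * x + b

rng = np.random.default_rng(0)
xdata = np.linspace(0.0, 10.0, 50)
ydata = line(xdata, 2.0, 1.0) + 0.1 * rng.normal(size=xdata.size)

# extra objective arguments are forwarded through `args`
res = optimize.minimize(sum_sq, x0=[1.0, 0.0], args=(line, xdata, ydata))
print(res.x)   # roughly [2.0, 1.0]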

and I added

cons1 = {'type': 'ineq', 'fun': lambda arg: arg[-2] * min(x) + arg[-1]}  # a*min(x) + b >= 0

cons2 = {'type': 'ineq', 'fun': lambda arg: arg[-2] * max(x) + arg[-1]}  # a*max(x) + b >= 0

Constr = [cons1, cons2]

(since ax + b is linear, forcing it to be nonnegative at min(x) and max(x) makes it nonnegative for every x in the data range)

And finally:

resultminimize = scipy.optimize.minimize(fun=diff, x0=initialguess, bounds=bdns, constraints=Constr)

So it runs and:

[{'type': 'ineq', 'fun': <function fit.<locals>.<lambda> at 0x0000016B96DC8DC8>}, {'type': 'ineq', 'fun': <function fit.<locals>.<lambda> at 0x0000016B96DC8EE8>}]
14 14 2 [{'type': 'ineq', 'fun': <function fit.<locals>.<lambda> at 0x0000016B96DC8DC8>}, {'type': 'ineq', 'fun': <function fit.<locals>.<lambda> at 0x0000016B96DC8EE8>}]
     fun: 23403018.409918513
     jac: array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])
 message: 'Optimization terminated successfully.'
    nfev: 16
     nit: 1
    njev: 1
  status: 0
 success: True
       x: array([ 14.45      , 100.        ,   1.        ,   1.        ,
        15.        ,  10.        ,   1.        ,   1.        ,
        15.15      , 100.        ,   1.        ,   1.        ,
        -7.54765688, 215.89131193])

It didn't change the parameters at all (x is identical to what I put in initialguess), yet it reports that the optimization terminated successfully.

The exact same initial guesses give really nice results with curve_fit (except that I can't add constraints there), so the bounds shouldn't be the problem. I tried increasing the tol (because I read it could help), but it doesn't change anything.
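
For reference, this is roughly the call I end up with when I pass the tolerance explicitly (a sketch with the names from above; I'm assuming SLSQP, which I believe is the default once constraints are given, and that the extra objective arguments are forwarded through args):

resultminimize = scipy.optimize.minimize(
    diff, x0=initialguess,
    args=(F, y, x),          # assumption: extra objective arguments forwarded here
    method='SLSQP',          # default when constraints are given
    bounds=bdns,
    constraints=Constr,
    options={'ftol': 1e-12, 'eps': 1e-6, 'maxiter': 500},
)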

I often run into this kind of problem with minimize, which is why I prefer curve_fit, but I think I am doing something wrong. Could you give me some clues?

Thanks

Lorene

1 Answer


Have you tried LinearConstraint? In your case, the matrix system should be something like this:

0 < A [Arg1; Arg2; Arg3; Arg4; a; b] < +inf

where a and b denote the parameters you want to optimize and A a matrix. Since you want to ensure 0 < ax+b, for each x in your experimental set, A must be:

A = [[0, 0, 0, 0, x_1, 1],
     [0, 0, 0, 0, x_2, 1],
     [0, 0, 0, 0, x_3, 1],
     ...
     [0, 0, 0, 0, x_n, 1]]

That is:

import numpy as np

n = len(x)
A = np.hstack([np.zeros(shape=(n, 4)), x.reshape((n, 1)), np.ones(shape=(n, 1))])

Finally, your constraint is given by:

lincons = scipy.optimize.LinearConstraint(A, 0, np.inf)
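
A rough sketch of how this plugs into minimize, reusing diff, initialguess and bdns from your question and assuming the extra objective arguments are forwarded through args (LinearConstraint objects are accepted by, e.g., method='trust-constr'):

result = scipy.optimize.minimize(
    diff, x0=initialguess,
    args=(F, y, x),            # assumption: objective extras forwarded here
    method='trust-constr',     # handles LinearConstraint directly
    bounds=bdns,
    constraints=[lincons],
)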

Hope this helps.

Dorian
  • Thank you for your answer. I had already tried that (and I tried your code to be sure), but the problem is with minimize: it doesn't change the values at all, they stay equal to the initial guess. I tried removing the bounds and constraints, thinking the problem might be over-constrained, but it doesn't change anything. – Lorene Jun 05 '20 at 06:29
  • According to [`scipy.optimize.minimize`](https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html)'s docstring, the default method used for constrained problems is `SLSQP`. Maybe you should try another method, e.g. `method='trust-constr'`. – Dorian Jun 05 '20 at 09:31