
I am tackling an optimization problem, and to do so I chain 3 minimizers. symfit is imported as sft and sympy as sy. [EDIT] The code below is a minimal example of my situation, producing the same error message.

import numpy as np
import sympy as sy
import symfit as sft
from symfit import DifferentialEvolution, LBFGSB, BasinHopping

# parameters to be optimized
k, a = sft.parameters('k, a')
k.min, k.max = 0.01, 1
a.min, a.max = 0.01, 1
L = sft.Parameter('L', value=5, fixed=True)  # this parameter is known,
                                             # so I don't want it to move

# variables
x = sft.Variable('x')
A = sft.Variable('A')
P = sft.Variable('P')

# model
model_dict = {
    sy.Derivative(A, x): k * A - P**a / L,
    sy.Derivative(P, x): -k * P**2 / L,
}

odemodel = sft.ODEModel(model_dict, initial={x: 0.,
                                             A: 0,
                                             P: 0})

# some mock data (inspired by the symfit docs by tBuLi)
x = np.linspace(0, 20, 40)
mock_data = odemodel(x=x, k=0.1, a=0.08, L=5)._asdict()
sigma_data = 0.5
np.random.seed(42)
for var in mock_data:
    mock_data[var] += np.random.normal(0, sigma_data, size=len(x))

fit = sft.Fit(odemodel, x=x,
              A=mock_data[A], P=mock_data[P],
              minimizer=[DifferentialEvolution, LBFGSB, BasinHopping])
fit_result = fit.execute()

The following error message pops up:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-8-ea4a7a6e9a8e> in <module>
     34               A = mock_data[A], P = mock_data[P],
     35               minimizer = [DifferentialEvolution, LBFGSB, BasinHopping]) #DifferentialEvolution #BasinHopping
---> 36 fit_result = fit.execute()

C:\ProgramData\Anaconda3\lib\site-packages\symfit\core\fit.py in execute(self, **minimize_options)
    577         :return: FitResults instance
    578         """
--> 579         minimizer_ans = self.minimizer.execute(**minimize_options)
    580         minimizer_ans.covariance_matrix = self.covariance_matrix(
    581             dict(zip(self.model.params, minimizer_ans._popt))

C:\ProgramData\Anaconda3\lib\site-packages\symfit\core\minimizers.py in execute(self, **minimizer_kwargs)
    270         for minimizer, kwargs in zip(self.minimizers, bound_arguments.arguments.values()):
    271             minimizer.initial_guesses = next_guess
--> 272             ans = minimizer.execute(**kwargs)
    273             next_guess = list(ans.params.values())
    274             answers.append(ans)

C:\ProgramData\Anaconda3\lib\site-packages\symfit\core\support.py in wrapped_func(*args, **kwargs)
    421                     else:
    422                         bound_args.arguments[param.name] = param.default
--> 423             return func(*bound_args.args, **bound_args.kwargs)
    424         return wrapped_func
    425 

C:\ProgramData\Anaconda3\lib\site-packages\symfit\core\minimizers.py in execute(self, **minimize_options)
    408         if jacobian is None:
    409             jacobian = self.wrapped_jacobian
--> 410         return super(ScipyGradientMinimize, self).execute(jacobian=jacobian, **minimize_options)
    411 
    412     def scipy_constraints(self, constraints):

C:\ProgramData\Anaconda3\lib\site-packages\symfit\core\minimizers.py in execute(self, **minimize_options)
    428     def execute(self, **minimize_options):
    429         return super(ScipyBoundedMinimizer, self).execute(bounds=self.bounds,
--> 430                                                           **minimize_options)
    431 
    432 

C:\ProgramData\Anaconda3\lib\site-packages\symfit\core\support.py in wrapped_func(*args, **kwargs)
    421                     else:
    422                         bound_args.arguments[param.name] = param.default
--> 423             return func(*bound_args.args, **bound_args.kwargs)
    424         return wrapped_func
    425 

C:\ProgramData\Anaconda3\lib\site-packages\symfit\core\minimizers.py in execute(self, bounds, jacobian, hessian, constraints, **minimize_options)
    353             jac=jacobian,
    354             hess=hessian,
--> 355             **minimize_options
    356         )
    357         return self._pack_output(ans)

C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_minimize.py in minimize(fun, x0, args, method, jac, hess, hessp, bounds, constraints, tol, callback, options)
    608     elif meth == 'l-bfgs-b':
    609         return _minimize_lbfgsb(fun, x0, args, jac, bounds,
--> 610                                 callback=callback, **options)
    611     elif meth == 'tnc':
    612         return _minimize_tnc(fun, x0, args, jac, bounds, callback=callback,

C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\lbfgsb.py in _minimize_lbfgsb(fun, x0, args, jac, bounds, disp, maxcor, ftol, gtol, eps, maxfun, maxiter, iprint, callback, maxls, **unknown_options)
    275         bounds = [(None, None)] * n
    276     if len(bounds) != n:
--> 277         raise ValueError('length of x0 != length of bounds')
    278     # unbounded variables must use None, not +-inf, for optimizer to work properly
    279     bounds = [(None if l == -np.inf else l, None if u == np.inf else u) for l, u in bounds]

ValueError: length of x0 != length of bounds

As this (long) traceback shows, it is the second minimizer, LBFGSB, which is causing trouble, but I don't know at all how to overcome this. In my full code, when I use only one minimizer, the program runs forever and I cannot find out why. This is why one minimizer alone seems not to be enough and I want to chain them. I think this is due to the complexity of the problem: two coupled ODEs with 7 parameters to optimize, and an initial guess performed with InteractiveGuess (a very nice tool, by the way).
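For what it's worth, the ValueError at the bottom is raised by scipy's L-BFGS-B wrapper, which insists on exactly one (min, max) pair per entry of the starting point x0. The check can be reproduced with plain scipy alone (the numbers below are illustrative, not taken from my model), which suggests the chained minimizer ends up passing more bound pairs than free parameters:

```python
# Illustrative only: trigger scipy's internal check that raises the same
# ValueError. x0 has 2 entries but bounds has 3 pairs, which is what happens
# when a parameter is dropped from x0 but its bounds are still passed along.
import numpy as np
from scipy.optimize import minimize

def cost(p):
    return float(np.sum(p ** 2))

x0 = np.array([0.5, 0.5])                   # 2 free parameters
bounds = [(0.01, 1), (0.01, 1), (0.01, 1)]  # 3 (min, max) pairs

try:
    minimize(cost, x0, method='L-BFGS-B', bounds=bounds)
except ValueError as err:
    print(err)  # complains about the x0 / bounds length mismatch
```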

Thanks in advance for your help!

  • Does it run if you replace LBFGSB with e.g. BFGS? Does it work if you use only LBFGSB without chaining it with others? And perhaps you can try to find a Minimal Working Example which reproduces the error? Formulating that might already help. A couple of things I can think of going wrong: you have `k_in` etc. as parameters first, but then give fixed values for them when you initialize ODEModel. I'm not sure if symfit will like this. And both fixing the parameters and giving bounds shouldn't be necessary; it shouldn't be a problem if I did my work correctly, but you never know ;). – tBuLi Jul 22 '20 at 14:22
  • I tried the replacement with BFGS and get the same error message. But when I use LBFGSB only, it works! Sadly it is not sufficient to solve my optimization problem; the solution returned is not satisfying. Thus, I would still like to chain the minimizers. Thanks for the remark on ODEModel; I looked at the symfit examples and you are totally right. Despite the changes, the solution returned still does not fit my data, but at least the code is better. I am working on finding a MWE, but I have never done this before, so I still haven't come up with one. Thanks for your help so far! – Mr.Smalltree Jul 23 '20 at 03:16
  • If the chained fit does not work, I can still copy-paste my code several times and only change the minimizer in symfit.Fit(), but I don't understand how to give symfit an initial guess for the parameters. Is it with the "value" keyword? Interestingly, when I chain LBFGSB and BFGS, I get a slightly different error message: [...]`<__array_function__ internals> in dot(*args, **kwargs) ValueError: shapes (8,8) and (5,) not aligned: 8 (dim 1) != 5 (dim 0) ` – Mr.Smalltree Jul 23 '20 at 03:37
  • Perhaps you can try to figure out if the problem is coming from BasinHopping or DifferentialEvolution by chaining only one of them with LBFGSB. I would also think that you don't need both of those since they are both global minimizers, personally I would use DifferentialEvolution chained with LBFGSB or BasinHopping alone. – tBuLi Jul 23 '20 at 09:29
  • And indeed, initial guesses are given to parameters using the `value` keyword. With regards to your latest issue, it is probably exactly what the traceback says, the shapes you feed to the model are incompatible. – tBuLi Jul 23 '20 at 09:31
  • Thank you for these precisions. I edited my post according to your previous message. When I chain DifferentialEvolution with LBFGSB alone, the program takes a lot of time; I cannot see the end of it. Regarding your last comment, thanks for the "value" tip; I was not seeing other options for parameter initialization, but now I am sure. I am sorry, it probably seems very obvious to you, but what do you mean by "the shapes you feed to the model are incompatible"? Should I provide either bounds (k.min, k.max) or the "value" argument, but not both? I have fixed parameters. – Mr.Smalltree Jul 24 '20 at 07:42
  • I set max and min to the same value for my fixed parameters, instead of mixing bounds with (value = myvalue, fixed = True). I don't really know why fixing the value of a parameter creates problems, but there we are, this problem is solved, thank you! Now I just need to find out how to get the results faster; sometimes the program runs for an hour without returning and I'm forced to shut it down. – Mr.Smalltree Jul 24 '20 at 08:14
  • Easier still: I deleted my fixed parameters and just set their values directly in my model. Yet I still don't see why fixed = True creates such problems. @tBuLi, could you post your comment as an answer so I can mark it as solving my issue? Thanks again :) – Mr.Smalltree Jul 24 '20 at 12:39
  • DifferentialEvolution can definitely be slow; that's quite a problem. Perhaps you can start with better initial guesses? Or try BasinHopping instead, that might work better for you. – tBuLi Jul 24 '20 at 14:35
  • The shape issue comes from numpy, probably if you would call the odemodel directly with your data you would get the same error. That might help you debug it. Essentially, what do you expect a dot product between an 8x8 matrix and a 5-vector to look like? – tBuLi Jul 24 '20 at 14:37
  • Regarding the dot product, I totally get why it doesn't work, but I don't see where the 8x8 matrix and the 5-vector come from. From what I saw, the 8x8 matrix is associated with the parameters to be optimized, because when I remove one it becomes 7x7. For the other one, I don't know. – Mr.Smalltree Jul 27 '20 at 19:24
  • Well, thank you tBuLi for your advice; with InteractiveGuess I was able to find nice initial conditions. Sadly the program won't chain into Fit, so I have to write down the initial guesses, restart the kernel, and launch the fit with the guessed parameters, and it works pretty well with DifferentialEvolution and LBFGSB. If I could solve the issue of guess.execute() never ending after I close the window, it would be perfect! – Mr.Smalltree Jul 28 '20 at 14:17
  • The code is as simple as this: _definition of variables and parameters + model dictionary+ odemodel_
    `guess = InteractiveGuess(odemodel, t=dataarray[0], mean_Ab=dataarray[1], mean_Pb=dataarray[2], n_points=100)
    guess.execute()
    fit = sft.Fit(odemodel, t = dataarray[0], mean_Ab = dataarray[1], mean_Pb = dataarray[2], minimizer = [DifferentialEvolution, LBFGSB])
    fit_result = fit.execute()`
    – Mr.Smalltree Jul 28 '20 at 14:25
  • @tBuLi please let me tag you for this so you get notified, you were of great help with your previous comments. – Mr.Smalltree Jul 29 '20 at 08:55
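To summarize the fix that emerged in the comments above: drop the fixed `Parameter` and substitute its known value straight into the model expressions, so the minimizers only ever see the parameters that are actually free. A minimal sketch in plain sympy (symfit model expressions are sympy expressions; the symbol names mirror the question's model):

```python
# Sketch of the workaround: instead of a fixed symfit Parameter L,
# bake the known constant directly into the model expressions.
import sympy as sy

x, A, P = sy.symbols('x A P')
k, a = sy.symbols('k a')   # the parameters that stay free
L_value = 5.0              # known constant, substituted directly

dA_dx = k * A - P**a / L_value
dP_dx = -k * P**2 / L_value

# L no longer appears as a free symbol, so nothing about it is passed
# to the minimizers (no bounds, no fixed value).
print(sy.Symbol('L') in dA_dx.free_symbols)  # False
```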
