
What I want to do, essentially, is implement a custom step-taking routine to pass to the `take_step` parameter of the basinhopping algorithm in the scipy library (https://docs.scipy.org/doc/scipy-0.18.1/reference/generated/scipy.optimize.basinhopping.html) in Python, so that it matches the procedure used by the SQP algorithm of MATLAB's fmincon.
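
For concreteness, this is the kind of callable I mean (a minimal skeleton following the `take_step` interface described in the scipy docs; the class name and the placeholder random body are mine):

```python
import numpy as np
from scipy.optimize import basinhopping

class DirectedStep(object):
    # basinhopping only requires a callable x_new = take_step(x);
    # exposing a `stepsize` attribute lets basinhopping adapt it.
    def __init__(self, stepsize=0.5):
        self.stepsize = stepsize

    def __call__(self, x):
        # Placeholder: random displacement, which I would like to replace
        # with a directed step as in fmincon's SQP line search.
        return x + np.random.uniform(-self.stepsize, self.stepsize, np.shape(x))

f = lambda x: np.cos(14.5 * x - 0.3) + (x + 0.2) * x  # toy objective
res = basinhopping(f, x0=[1.0], niter=100, take_step=DirectedStep())
```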

The reference material is here, specifically the Line Search and Merit Function section: https://it.mathworks.com/help/optim/ug/constrained-nonlinear-optimization-algorithms.html#f26965

As far as I understand from reading the docs, the minimize algorithm (on which basinhopping relies for local minimization, in my case a sequential quadratic programming method that embeds bounds and aggregated constraints) is essentially the same up to the point where a new iteration begins, i.e. where the new starting point for the next local minimization is selected. The main difference is that in basinhopping the perturbation is random, whereas it appears to be directed in the MATLAB implementation. That directed behaviour is exactly what I would like to reproduce.

The problem is that I can't seem to find a way to "expose" the quantities needed, namely Zk and Gk from the MATLAB reference doc.
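
Since scipy.optimize.minimize does not expose the solver's internal SQP quantities (nothing comparable to Zk and Gk is available from the result object, apart from `res.jac` for some methods), the closest workaround I have come up with is to approximate the gradient numerically and bias the step along it. A rough sketch of my own construction, which only mimics the *spirit* of a directed step, with the caveat that at a converged local minimum the gradient is nearly zero, so a random component is kept deliberately:

```python
import numpy as np
from scipy.optimize import approx_fprime

class GradientBiasedStep(object):
    """Hypothetical directed step: bias the jump along the negative
    finite-difference gradient instead of a purely random direction.
    This does NOT reproduce fmincon's actual SQP quantities Zk and Gk."""
    def __init__(self, func, stepsize=0.5, eps=1e-8):
        self.func = func
        self.stepsize = stepsize
        self.eps = eps

    def __call__(self, x):
        # finite-difference gradient at the current point
        g = approx_fprime(np.asarray(x, dtype=float), self.func, self.eps)
        norm = np.linalg.norm(g)
        if norm > 1e-12:
            direction = -g / norm
        else:
            # at a converged minimum the gradient vanishes: fall back to random
            direction = np.random.randn(*np.shape(x))
        return (x + self.stepsize * direction
                  + 0.1 * self.stepsize * np.random.randn(*np.shape(x)))
```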

Can anyone point me in the right direction?

Asher11
  • Can you be more precise? Line-searches work on assumptions: Wolfe conditions, Armijo and co. perform operations based on some idea about the function. This is often used to achieve local convergence and is the inner step of basinhopping. You want to use a line-search on the outer level? That does not make any sense to me! Ignoring assumptions and co: the core idea of global optimization is to accept worse solutions in order to reach better solutions in the future (easily seen in simulated annealing; somewhat like a bridge)! By *only going down* (line-search), you can't achieve that. – sascha Jan 13 '18 at 15:16
  • Your comment clearly pinpoints my knowledge gap in mathematical optimization; unfortunately I cannot answer those questions. Suffice to say, I dare say basinhopping's random displacement is a bit of a "raw" method, and I was looking for a more "efficient" and powerful way of addressing new search areas. One that came to mind was the one proposed by MATLAB in the aforementioned algorithm. Having said that, do you have some methodological alternatives I could look into? – Asher11 Jan 13 '18 at 16:13
  • It's very broad, and for the general problem all comments are depressing (NP-hardness, the no-free-lunch theorem and co). Without a-priori knowledge about your optimization surface there is not much you can do. Some people use global-optimization methods based on Bayesian methods (they try to infer the surface by observing it and using some smoothness assumptions), but we won't know if that's something for you. E.g. [rbfopt](https://github.com/coin-or/rbfopt). Your best bet is probably looking for the keywords global / derivative-free / black-box optimization. But it's hard to tackle! – sascha Jan 13 '18 at 16:18
  • My problem is highly non-linear, so I'd say the optimization surface will remain a mystery. I'll definitely have a look at rbfopt. Would you say specialized packages for quadratic programming such as http://cvxopt.org/ would offer something noteworthy, from what you were able to gather about my situation? Also, when you say the black box will be hard to tackle, do you mean implementation-wise? – Asher11 Jan 13 '18 at 16:32
  • Be careful. It seems some basics are missing! If you are trying to solve a convex continuous optimization problem, you can achieve a global optimum in polynomial time. cvxopt (if you can express the problem in one of the supported forms), as well as any nonlinear optimizer with local-convergence guarantees like SLSQP, will do that (quite differently!), since a local optimum equals the global optimum in convex optimization. In this case using basinhopping and co. is completely unnecessary (and wrong). If it's non-convex, cvxopt can't tackle it; general nonlinear optimizers will do and achieve local optima. – sascha Jan 13 '18 at 16:37
  • And in addition to the last sentence above: often you will use multiple initial values, compute the local optima, and take the best (see the sketch after these comments). Basinhopping is a somewhat more clever way to do that (some assumptions made). When I said hard to tackle, I meant theory-wise. Some problems, and we don't know yours, are just NP-hard, and it's infeasible to get exact results. – sascha Jan 13 '18 at 16:39
  • After reading some more on the topics at hand, I can say my problem is not convex, so cvxopt cannot be used. I'll try my hand at black-box optimization and other classic nonlinear optimizers. – Asher11 Jan 14 '18 at 09:57
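
For reference, a plain multi-start baseline along the lines of sascha's suggestion might look like this (a minimal sketch; the uniform sampling of start points within the bounds is just one possible choice):

```python
import numpy as np
from scipy.optimize import minimize

def multistart(func, bounds, n_starts=20, seed=0):
    # run a local optimizer (SLSQP, as discussed above) from several
    # random initial points and keep the best local optimum found
    rng = np.random.RandomState(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    best = None
    for _ in range(n_starts):
        x0 = lo + rng.rand(len(bounds)) * (hi - lo)
        res = minimize(func, x0, method='SLSQP', bounds=bounds)
        if res.success and (best is None or res.fun < best.fun):
            best = res
    return best

best = multistart(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
                  bounds=[(-5, 5), (-5, 5)])
```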

0 Answers