
I want to minimize a scalar function (an energy) of a 1D array of variables (atomic coordinates). The function and its gradient are computed by an external program. My problem is that the scipy routine takes large steps in X, which makes the evaluation of f(X) fail. Is there a way to limit the step size? Even better, is there a way to get minimize() to avoid such broken values of X altogether?

result = scipy.optimize.minimize(qmgrad_wrap, X0, args=args, method='BFGS', jac=True, options=options)
print(result)
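
For context, with `jac=True` the callable passed to `minimize` must return the energy and its gradient together. A minimal sketch of the kind of wrapper involved, where the quadratic `run_external_qm` is only a placeholder for the real external program:

```python
import numpy as np

def run_external_qm(X):
    # Placeholder for the external program: a toy quadratic "energy"
    # and its analytic gradient, evaluated at the coordinates X.
    return 0.5 * float(np.dot(X, X)), X.copy()

def qmgrad_wrap(X, *args):
    # minimize() calls this once per evaluation; with jac=True it must
    # return a (value, gradient) pair.
    energy, grad = run_external_qm(X)
    return energy, np.asarray(grad, dtype=float)
```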
Karl I.
  • No, you can't, as far as I know. The line search sets the step size in BFGS and some other methods. Use bounds or constraints (not supported by BFGS). – sascha Nov 29 '17 at 18:57
  • @Karl, hi, did you ever find a solution for this? I need to do the same. – Kvothe Jan 31 '21 at 14:32
  • @Kvothe, sorry, I did not. (That project is mothballed.) I guess I would look for a minimizer with a trust radius, or try to use the external program, or code something from Numerical Recipes. – Karl I. Feb 01 '21 at 15:46
  • I'm having the same problem; it's annoying that you cannot specify a maximum step size without modifying the code. – benno Apr 28 '21 at 06:22
  • I found that you can get good behaviour by squashing the Jacobian with tanh, _e.g._ `new_jac = np.tanh(jac*100)/100` (see the sketch below these comments). – Rotem Shalev Jul 22 '22 at 15:29
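
As a rough, self-contained illustration of the tanh trick from the last comment: the gradient is squashed so the line search never sees a huge slope. The toy quadratic objective, the `squash_gradient` helper, and the scale factor of 100 are placeholders, not part of the original problem:

```python
import numpy as np
import scipy.optimize

def squash_gradient(fun, scale=100.0):
    # Wrap an (energy, gradient) callable so each gradient component is
    # bounded by 1/scale, per the tanh trick in the comment above.
    def wrapped(X, *args):
        f, grad = fun(X, *args)
        return f, np.tanh(np.asarray(grad) * scale) / scale
    return wrapped

def toy_energy(X):
    # Stand-in for the external energy/gradient code.
    return 0.5 * float(np.dot(X, X)), X.copy()

result = scipy.optimize.minimize(squash_gradient(toy_energy), np.ones(6),
                                 method='BFGS', jac=True)
print(result.x)
```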

0 Answers