I have a constrained minimization problem with around 1000 variables. Currently I'm using scipy's SLSQP routine:

import scipy.optimize

x_min = scipy.optimize.minimize(energy, x0, method='SLSQP', jac=energy_grad,
                                args=(L, N, U, t, mu), constraints=cons)

(with constraints of the form x0[1,1]**2 + x0[1,2]**2 + ... + x0[1,N]**2 = 1)

I had hoped that providing the analytical form of the Jacobian would speed things up at least slightly (compared with having the derivatives approximated numerically). However, when I compare the run times, there seems to be no difference.
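
For concreteness, here is a stripped-down, runnable sketch of the setup (a dummy quadratic stands in for my actual energy functional, and a single normalization constraint stands in for the full set):

import numpy as np
import scipy.optimize

rng = np.random.default_rng(0)
n = 100                              # ~1000 in the real problem

# Dummy quadratic standing in for the real energy functional.
A = rng.standard_normal((n, n))
A = A.T @ A + n * np.eye(n)          # make it positive definite

def energy(x):
    return 0.5 * x @ A @ x

def energy_grad(x):
    return A @ x

# One normalization constraint x.x = 1 (the real problem has one per row of x0).
cons = [{'type': 'eq', 'fun': lambda x: x @ x - 1.0}]

x0 = rng.standard_normal(n)
x0 /= np.linalg.norm(x0)

x_min = scipy.optimize.minimize(energy, x0, method='SLSQP',
                                jac=energy_grad, constraints=cons)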

  • It will usually give some performance benefit, but how much depends on lots of factors. How often is the Jacobian being evaluated relative to the cost function/constraints? How costly is each Jacobian evaluation relative to the cost function/constraints? Have you tried passing analytic Jacobians for your constraints? For example, you might be spending almost all of your time evaluating the constraints, so you would see virtually no benefit from speeding up the Jacobian calculation. – ali_m Mar 07 '16 at 02:03
  • You should definitely make an [MCVE](http://stackoverflow.com/help/mcve). – ali_m Mar 07 '16 at 02:03
  • Yes, analytic gradients can help both in reliability and in performance. In general it is highly recommended to provide gradients. Some modeling systems do automatic differentiation so the user does not have to provide them. – Erwin Kalvelagen Mar 07 '16 at 08:52
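
To illustrate the suggestions in the comments above: the constraint dicts accept an analytic Jacobian through a 'jac' key, and wrapping the callbacks with simple counters shows where the evaluations actually go. A minimal sketch, again with a placeholder objective rather than the asker's real one:

import numpy as np
import scipy.optimize

calls = {'fun': 0, 'jac': 0, 'con': 0}
w = np.linspace(1.0, 2.0, 1000)      # weights for a placeholder objective

def energy(x):
    calls['fun'] += 1
    return 0.5 * x @ (w * x)         # placeholder objective

def energy_grad(x):
    calls['jac'] += 1
    return w * x                     # its analytic gradient

def con_fun(x):
    calls['con'] += 1
    return x @ x - 1.0

# Analytic constraint Jacobian supplied via the 'jac' key of the constraint dict.
cons = [{'type': 'eq', 'fun': con_fun, 'jac': lambda x: 2.0 * x}]

x0 = np.ones(1000) / np.sqrt(1000.0)
scipy.optimize.minimize(energy, x0, method='SLSQP',
                        jac=energy_grad, constraints=cons)
print(calls)                         # relative call counts reveal the bottleneck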

0 Answers