
I mean the following method: scipy.optimize.minimize(method='SLSQP')

I read in this issue that "the memory required by COBYLA and SLSQP is quadratic in the number of variables. These algorithms are not suitable for solving such large problems."

Does anyone know the exact time and space complexity of this algorithm?


1 Answer


Yes, it looks like the memory complexity is O(n^2), where n is the problem dimension (the number of variables). This is due to the B-matrix (the approximation of the Hessian matrix), since SLSQP is a quasi-Newton method.
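As a rough illustration (this is not SciPy's actual internals, just a sketch of the storage argument): a dense n × n array like the B-matrix needs n^2 entries, so its memory footprint quadruples every time n doubles.

```python
import numpy as np

# Sketch of the quadratic-memory argument: a dense n x n array
# (a stand-in for the quasi-Newton B-matrix) holds n**2 float64
# entries, i.e. 8 * n**2 bytes.
for n in (100, 200, 400):
    B = np.eye(n)              # stand-in for the Hessian approximation
    print(n, B.nbytes)         # bytes grow as 8 * n**2
```

For n = 100,000 variables this already means roughly 80 GB for the B-matrix alone, which is why the issue you quote says these methods are unsuitable for very large problems.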

Regarding time complexity, I also found this related issue suggesting a time complexity of O(n^3), but I am not convinced. From what I found, the SciPy implementation is a wrapper around old Fortran code, and the reference given for it is:

Kraft, D. A software package for sequential quadratic programming. (Wiss. Berichtswesen d. DFVLR, 1988).

One computational bottleneck would be the LDL^T decomposition, but only rank-one updates of an existing LDL^T decomposition are needed, and those can be done in O(n^2) rather than the O(n^3) of a full refactorization.
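To make the O(n^2) claim concrete, here is a sketch of a rank-one update of an LDL^T factorization (method C1 of Gill, Golub, Murray & Saunders). This is my own illustration of the technique, not the routine the SLSQP Fortran code actually uses. The two nested loops each run over at most n elements, hence O(n^2) work.

```python
import numpy as np

def ldl_rank_one_update(L, d, alpha, z):
    """Update L and d in place so that L @ diag(d) @ L.T, which
    initially equals A, afterwards equals A + alpha * z z^T.
    O(n^2) time: one pass over the columns, each touching at most
    n rows. L is unit lower-triangular, d the diagonal of D."""
    n = len(d)
    w = z.astype(float).copy()
    for j in range(n):
        p = w[j]
        d_new = d[j] + alpha * p * p
        beta = p * alpha / d_new
        alpha = alpha * d[j] / d_new
        d[j] = d_new
        for r in range(j + 1, n):
            w[r] -= p * L[r, j]
            L[r, j] += beta * w[r]

# Quick check against direct recomputation on a small SPD matrix.
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite
z = rng.standard_normal(n)

L0 = np.linalg.cholesky(A)         # A = L0 @ L0.T
d = np.diag(L0) ** 2               # convert Cholesky to LDL^T form
L = L0 / np.diag(L0)               # unit lower-triangular factor

ldl_rank_one_update(L, d, 0.5, z)
A_updated = L @ np.diag(d) @ L.T
print(np.allclose(A_updated, A + 0.5 * np.outer(z, z)))
```

So a quasi-Newton update of the factorized B-matrix does not by itself force O(n^3) per iteration.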

The O(n^3) may still come from the algorithm used to solve the quadratic subproblem at each iteration. I am still reading the references, so I may edit my answer here.

paulduf