
The QP problem is convex. According to Wikipedia, the problem can be solved in polynomial time. But what exactly is the order?

Anguslilei
1 Answer


That is an interesting question with (in my opinion) no clear answer. I am going to assume that your problem is convex and that you are interested in run-time complexity (as opposed to iteration complexity).

  • As you may know, quadprog is not a single algorithm but rather a generic name for a solver of quadratic programs. Underneath, it uses one of several algorithms: interior-point (the default), trust-region-reflective, and active-set. Source.
  • Depending on which one you choose, each of these algorithms has its own complexity analysis. For trust-region and active-set methods, the analysis is extremely hard; in fact, active-set methods are not polynomial to begin with: counterexamples exist in which they take an exponential number of iterations to converge (the same is true of the simplex method for linear programs). Source.
  • Now, assuming that you choose an interior-point method, the answer is still not straightforward, because these methods come in many flavours. When Karmarkar proposed his method in 1984, it was the first polynomial-time algorithm for linear programming that was also efficient in practice (the ellipsoid method was polynomial earlier, but impractical), and it had a complexity of O(n^3.5). Source. These bounds were improved considerably later. However, that is for linear programs.
  • Finally, to answer your question: Ye and Tse proved in 1989 that an interior-point method for convex QP can achieve O(n^3) complexity. Whether MATLAB uses this exact flavour of interior-point method is tricky to know, but O(n^3) would be my best guess.
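To make the per-iteration cost concrete, here is a minimal sketch (my own illustration in Python/NumPy, not MATLAB's actual implementation) of one Newton step of a log-barrier interior-point method for min 0.5 x'Hx + f'x subject to Ax <= b. The dense n-by-n linear solve is the O(n^3) bottleneck; multiplied by a polynomially bounded iteration count, it yields the overall polynomial complexity.

```python
import numpy as np

def barrier_newton_step(H, f, A, b, x, t):
    """One Newton step on t*(0.5 x'Hx + f'x) - sum(log(b - Ax))
    at a strictly feasible point x (all slacks b - Ax > 0)."""
    s = b - A @ x                          # slacks, must be strictly positive
    assert np.all(s > 0), "x must be strictly feasible"
    grad = t * (H @ x + f) + A.T @ (1.0 / s)
    hess = t * H + A.T @ np.diag(1.0 / s**2) @ A
    # Solving this dense n x n system is the O(n^3) per-iteration bottleneck.
    return np.linalg.solve(hess, -grad)
```

For a tiny 1-D example, min 0.5 x^2 - x subject to -2 <= x <= 2 has minimizer x = 1; starting from x = 0 with a large barrier weight t, a single Newton step lands close to it.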

Of course, my answer is rather theoretical; if you want to test it empirically, you can gradually increase the number of variables and plot the CPU time required. A log-log plot makes the exponent easy to read off as a slope.
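A hedged sketch of that experiment (Python/NumPy stand-in, since quadprog is a MATLAB routine): it times the dense solve Hx = -f, which is both the exact solution of an unconstrained convex QP and the same O(n^3) kernel at the core of each interior-point iteration, then fits the slope of log(time) against log(n).

```python
import time
import numpy as np

def solve_unconstrained_qp(H, f):
    """Minimizer of 0.5 x'Hx + f'x for positive definite H: solve Hx = -f."""
    return np.linalg.solve(H, -f)

def estimate_order(sizes, trials=3, seed=0):
    """Return the fitted exponent p in time ~ n^p over the given problem sizes."""
    rng = np.random.default_rng(seed)
    times = []
    for n in sizes:
        M = rng.standard_normal((n, n))
        H = M @ M.T + n * np.eye(n)        # random positive definite Hessian
        f = rng.standard_normal(n)
        best = float("inf")                 # best-of-trials damps timing noise
        for _ in range(trials):
            t0 = time.perf_counter()
            solve_unconstrained_qp(H, f)
            best = min(best, time.perf_counter() - t0)
        times.append(best)
    slope, _ = np.polyfit(np.log(sizes), np.log(times), 1)
    return slope
```

Expect the fitted slope to sit somewhat below 3 for moderate n, since BLAS overheads and multithreading flatten the curve at small sizes.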

Nitish
  • I empirically tested this for unconstrained quadratic programs, and interestingly, between 100 and 1,000 variables the run time seems to scale as O(n^2.5). Below 100 variables the curve is flatter than this, but I suspect that constant factors and overhead are dominating the run time. I haven't tested more than 1,000 variables. – Chris Taylor Aug 31 '17 at 10:13