
I am using SciPy's implementation of L-BFGS-B to minimize a non-convex objective function. The result is not too bad, but the convergence status reported is "ABNORMAL_TERMINATION_IN_LNSRCH".

Is it possible that this is because my objective function is non-convex? Or could it mean that my gradients (calculated analytically by hand and passed as an argument to SciPy's L-BFGS-B) are wrong?
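
For context, the call I am making is roughly of this shape (the objective and gradient below are simplified placeholders, not my actual non-convex function):

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # placeholder objective; my real function is non-convex
    return np.sum(x ** 2)

def gradient(x):
    # analytically derived gradient, passed via `jac`
    return 2.0 * x

x0 = np.zeros(10)
res = minimize(objective, x0, jac=gradient, method='L-BFGS-B')
# with my real objective, res.message reports ABNORMAL_TERMINATION_IN_LNSRCH
print(res.status, res.message)
```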

Satarupa Guha
    Most likely your gradient is wrong. Non-convex only means that in general you do not find the global minimum. – cel Oct 23 '15 at 18:51
  • @cel OK. Actually I have checked my gradients several times, and to the best of my knowledge they are correct. Could this be a result of the function being non-smooth? – Satarupa Guha Oct 24 '15 at 05:21
  • 2
    All standard optimization routines require smooth objective functions. – cel Oct 24 '15 at 05:24
  • @cel - So that means, if my objective function is non-smooth, that could lead to such abnormal terminations? – Satarupa Guha Oct 25 '15 at 20:21
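
Regarding the gradient check discussed in the comments above: one way to verify an analytic gradient numerically is `scipy.optimize.check_grad`, which compares it against a finite-difference approximation. A minimal sketch with a stand-in objective (not the asker's actual function):

```python
import numpy as np
from scipy.optimize import check_grad

def objective(x):
    # stand-in objective; substitute the real one
    return np.sum(np.sin(x) + x ** 2)

def gradient(x):
    # analytic gradient of the stand-in objective
    return np.cos(x) + 2.0 * x

# check at several random points; check_grad returns the 2-norm of the
# difference between the analytic and finite-difference gradients
rng = np.random.RandomState(0)
for _ in range(5):
    x = rng.randn(10)
    err = check_grad(objective, gradient, x)
    print(err)  # should be small (~1e-6 or less) if the gradient is correct
```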

1 Answer


This is fine. L-BFGS and gradient descent are, strictly speaking, convex optimization methods: they assume the objective is smooth, and their guarantee of reaching the global minimum holds only for convex functions. A non-convex function has a more complicated landscape containing multiple local minima. So when you apply these methods to a non-convex objective, the optimization procedure can settle in a local minimum that is not the globally optimal answer.
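
A minimal sketch of this point, assuming the standard `scipy.optimize.minimize` API (the objective here is a toy non-convex function, not the asker's): one common workaround is to restart L-BFGS-B from several initial points, keep the best local minimum found, and inspect each run's termination message.

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # toy non-convex objective with many local minima
    return np.sum(x ** 2 + 3.0 * np.sin(5.0 * x))

def gradient(x):
    # analytic gradient of the toy objective
    return 2.0 * x + 15.0 * np.cos(5.0 * x)

# restart from several initial points and keep the best local minimum
rng = np.random.RandomState(0)
best = None
for _ in range(20):
    x0 = rng.uniform(-3.0, 3.0, size=5)
    res = minimize(objective, x0, jac=gradient, method='L-BFGS-B')
    # res.message explains how each run terminated (e.g. line-search problems)
    if best is None or res.fun < best.fun:
        best = res

print(best.fun, best.x)
```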

Shamane Siriwardhana