I'm trying to minimize a function with Nelder-Mead as implemented in scipy.optimize.minimize(method='Nelder-Mead'). The function has about 30 inputs, and I have been optimizing sequentially: optimize over the first 5 while keeping the remaining 25 fixed, then gradually increase the number of variables optimized over. (I'm not using gradient-based methods because there is simulation noise in my objective function, which makes it non-smooth at small step sizes and makes gradients unreliable.)
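A minimal sketch of this sequential approach, in case it helps clarify the setup (the quadratic objective here is a cheap stand-in for my actual simulation, and the names are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Placeholder for the expensive, noisy simulation objective.
    return np.sum((x - 1.0) ** 2)

x = np.zeros(30)      # current best guess for all 30 inputs
free = np.arange(5)   # indices of the variables optimized in this round

def restricted(sub, x_full=x, idx=free):
    # Evaluate the full objective while varying only a subset of inputs.
    trial = x_full.copy()
    trial[idx] = sub
    return objective(trial)

res = minimize(restricted, x[free], method='Nelder-Mead')
x[free] = res.x       # fold the partial optimum back into the full vector
```

In later rounds `free` grows to cover more of the 30 inputs.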
However, the iterations are very slow. A single function evaluation takes about 60 seconds, yet each iteration of the optimization takes at least 10 minutes. I'm using the callback option to gauge the time per iteration.
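Concretely, I'm timing iterations along these lines (with a cheap objective substituted here so the snippet runs quickly):

```python
import time
import numpy as np
from scipy.optimize import minimize

def objective(x):
    return np.sum(x ** 2)  # cheap stand-in for the ~60 s simulation

last = time.perf_counter()
times = []

def timed_callback(xk):
    # SciPy invokes the callback once per iteration of the outer loop.
    global last
    now = time.perf_counter()
    times.append(now - last)
    last = now

res = minimize(objective, np.ones(20), method='Nelder-Mead',
               callback=timed_callback)
print(f"{res.nit} iterations, mean {np.mean(times):.2e} s/iteration")
```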
What does it do in each iteration? Does it actually take N^2 steps through the initial simplex? And what is done in each step of the algorithm? I know for a fact that it differs from the MATLAB implementation (fminsearch), which takes only a single step per iteration (and occasionally a few more when it shrinks or expands the simplex). Or is it just a matter of when the callback function is called?
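One way to probe this empirically is to compare `res.nit` with `res.nfev`. If I understand standard Nelder-Mead correctly, building the initial simplex costs N+1 evaluations, a typical iteration costs one or two (reflection, possibly expansion or contraction), and a shrink step re-evaluates N vertices, so `nfev` should be noticeably larger than `nit` (toy quadratic again, not my simulation):

```python
import numpy as np
from scipy.optimize import minimize

# 20-dimensional toy problem, mirroring the number of variables I optimize over.
res = minimize(lambda x: np.sum((x - 2.0) ** 2), np.ones(20),
               method='Nelder-Mead')
# Each reported iteration can cost several function evaluations.
print(f"iterations: {res.nit}, function evaluations: {res.nfev}")
```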
By the way, I'm running this in a Jupyter notebook. I have had it running for over 3 days with only 221 iterations completed while optimizing over 20 variables; that is more than 20 minutes per iteration on average.