
I am running nested optimization code.

sp.optimize.minimize(fun=A, x0=D, method="SLSQP", bounds=E, constraints=[{'type': 'eq', 'fun': constrains}], options={'disp': True, 'maxiter': 100, 'ftol': 1e-05})

sp.optimize.minimize(fun=B, x0=C, method="Nelder-Mead", options={'disp': True})

The first minimization is part of the function B, so it effectively runs inside the second minimization.
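For context, a minimal sketch of this nested structure (with hypothetical stand-ins for A, B, C, D, and E, since I can't post the real functions) would be:

```python
import scipy.optimize as sp_opt

# Hypothetical stand-ins for the real objective and constraint functions.
def inner_objective(x, p):
    return (x[0] - p) ** 2 + (x[1] - 2.0 * p) ** 2

def constrains(x):                      # equality constraint: x0 + x1 == 3
    return x[0] + x[1] - 3.0

def B(p):
    # The SLSQP solve runs inside the outer objective function.
    res = sp_opt.minimize(
        fun=inner_objective, x0=[0.0, 0.0], args=(p[0],), method="SLSQP",
        bounds=[(-10.0, 10.0), (-10.0, 10.0)],
        constraints=[{'type': 'eq', 'fun': constrains}],
        options={'maxiter': 100, 'ftol': 1e-05},
    )
    return res.fun                      # outer loop minimizes the inner optimum

# Outer, derivative-free minimization over the parameter p.
outer = sp_opt.minimize(fun=B, x0=[0.0], method="Nelder-Mead")
```

With these stand-ins the inner optimum is (3p - 3)^2 / 2, so the outer loop should drive p toward 1.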

The whole optimization is driven by data; there are no random numbers involved.

I run exactly the same code on two different computers and get completely different results.

I have different versions of Anaconda installed, but scipy, numpy, and all the other packages used have the same versions.

I don't really think the OS should matter, but one machine runs Windows 10 (64-bit) and the other Windows 8.1 (64-bit).

I am trying to figure out what might be causing this.

Even though I did not list all the options, if two computers run the same code, shouldn't the results be the same?

Or are there any sp.optimize options whose default values differ from computer to computer?

PS. I was looking at the option "eps". Is it possible that the default value of "eps" differs on these two computers?
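For reference, SLSQP's finite-difference step is derived from machine epsilon, which is fixed by IEEE-754 and should be bit-identical on any two 64-bit machines. A quick check (a sketch, assuming the default "eps" is on the order of the square root of machine epsilon):

```python
import numpy as np

# Machine epsilon for double precision is fixed by IEEE-754, so it should
# print identically on both computers.
machine_eps = np.finfo(float).eps
print(machine_eps)            # 2.220446049250313e-16 (exactly 2**-52)

# SLSQP's default finite-difference step is on the order of sqrt(eps);
# printing it on both machines would rule "eps" out as the culprit.
print(np.sqrt(machine_eps))   # 1.4901161193847656e-08 (exactly 2**-26)
```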

WKW
  • Is there a difference in the Python version? – Roland Smith Sep 04 '17 at 20:42
  • Oh. you are right. I missed this.. One is Python 3.6.0 and the other one is Python 3.6.1 Will it be causing the difference ? I mean.. It is a bit frustrating, because on one computer it converges in 3000 iterations, but on the other computer, it does not converge until 7000 iterations.. – WKW Sep 04 '17 at 20:50
  • Are both Python instances 64 bit? That might also be a factor. Check that numpy arrays use the same `dtype` on both instances. – Roland Smith Sep 04 '17 at 22:50

2 Answers


You should never expect numerical methods to perform identically on different devices, or even across different runs of the same code on the same device. Due to the machine's finite precision you can never compute the "real" result, only numerical approximations, and over a long optimization these small differences can accumulate.
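A tiny illustration of how rounding alone changes results (this is standard IEEE-754 behaviour, not specific to SciPy):

```python
# Floating-point addition is not associative: the grouping changes the
# rounding, so mathematically equal expressions differ in the last bits.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)
print(a)        # 0.6000000000000001
print(b)        # 0.6
print(a == b)   # False
```

If two machines (or two library builds) evaluate sums in a different order, such last-bit differences feed into every subsequent iteration.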

Furthermore, some optimization methods use randomness internally to avoid getting stuck in local minima: they add a small, almost vanishing noise to the previously calculated solution so that the algorithm converges to the global minimum faster instead of stalling in a local minimum or at a saddle point. (Note, though, that SLSQP and Nelder-Mead as implemented in SciPy are deterministic, so this alone is unlikely to explain your case.)

Can you try to plot the landscape of the function you want to minimize? This can help you analyze the problem: if both results (one per machine) are local minima, then the behaviour is explained by my description above.
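If plotting is awkward, even a coarse grid scan can reveal whether there are multiple basins. A sketch with a hypothetical 1-D objective standing in for the real one:

```python
import numpy as np

# Hypothetical objective with two local minima, standing in for yours.
def f(p):
    return (p ** 2 - 1.0) ** 2 + 0.3 * p

grid = np.linspace(-2.0, 2.0, 4001)
vals = f(grid)

# An interior grid point lower than both neighbours marks a local minimum.
interior = (vals[1:-1] < vals[:-2]) & (vals[1:-1] < vals[2:])
local_minima = grid[1:-1][interior]
print(local_minima)   # two basins: one near p = -1, one near p = +1
```

If each machine's answer sits in a different one of these basins, the discrepancy is a property of the problem, not a bug.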

If this is not the case, you should check the version of scipy installed on both machines. You might also be implicitly using single-precision floats on one device and double-precision values on the other.
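To check the precision point concretely: the same literal stored as single vs. double precision is already a different number, which is easy to verify on both machines (and, per the comments above, the `dtype` of your numpy arrays is worth checking too):

```python
import numpy as np

single = np.float32(0.1)    # ~0.10000000149011612 as a double
double = np.float64(0.1)    # ~0.1000000000000000055511
print(float(single) == float(double))   # False

# Check what your arrays actually use; a stray float32 array on one
# machine would perturb every downstream computation.
x = np.array([1.0, 2.0, 3.0])
print(x.dtype)              # float64 (numpy's default for float input)
```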

You see: there are many possible explanations for this (at first glance) strange numerical behaviour; you will have to give us more details to pin it down.

zimmerrol
  • First off, thank you for the answer! But I have been getting the same results on the same computer over many times. And I just tried on the other computer, and got different results and a bit surprised.. I really wish I could just upload my code, but I think I shouldn't.. :( Could you be very kind and tell me more about "implicitly using float and double" ? Does this have something to do with machine precision ? How can I check this? (scipy versions are the same..) – WKW Sep 04 '17 at 21:18
  • @WonkiWoo Try to plot the landscape of the function you want to minimize, as I told you. – zimmerrol Sep 04 '17 at 21:19
  • I have the same problem: running optimisation problems with many iterations while using the same packages results on different results on two hpc clusters. The minima these optimisers terminate in are local, I'm certain of that. Can I pass certain numerical settings, such as when to use floats and doubles, to python when running it? Or the number of bytes values must at least be? – H. Vabri Jun 25 '18 at 05:59
  • In case you are using numpy you have to make sure you use the same openblas version, and that you have configured it identically. But even then, it is not guaranteed that the results will be the same. – zimmerrol Jun 25 '18 at 07:27

I found that different versions of SciPy do or do not allow the minimum and maximum bounds of a parameter to be equal. For example, in SciPy 1.5.4, a parameter with equal min and max bounds sends that term's Jacobian entry to nan, which brings the minimization to a premature stop.
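A workaround sketch (with a hypothetical objective): instead of pinning a parameter via equal min/max bounds, drop it from the optimization vector and close over the constant, so no finite-difference step is ever taken along it:

```python
import scipy.optimize as sp_opt

FIXED_X1 = 2.0   # the value we would otherwise pin with bounds (2.0, 2.0)

def objective(free):
    # Only the free parameter is optimized; the pinned one is a plain
    # constant, so its (undefined) gradient never enters the Jacobian.
    x0 = free[0]
    return (x0 - 1.0) ** 2 + (FIXED_X1 - 2.0) ** 2

res = sp_opt.minimize(objective, x0=[0.0], method="SLSQP",
                      bounds=[(-5.0, 5.0)])
print(res.x)     # minimum near x0 = 1.0
```

This avoids the version-dependent equal-bounds behaviour entirely.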

Eric Saund