
I need to find the global minimum of a complex function. I use basinhopping from scipy.optimize. When I change the method (for example method="nelder-mead" vs "L-BFGS-B") or the initial guess x0, I get different results, especially in the values of x, which I need for the next steps. For example, x[5] = 0.6 with "nelder-mead" but x[5] = 0.0008 with "L-BFGS-B", although the function values are similar: 2055.7795 vs 2055.7756 (and all of these runs report success: True). I thought basinhopping finds the global minimum, so it should give the same result no matter what method or initial guess I use. Can anyone explain why this happens, and suggest what I should do to find the global minimum and to check that it really is global (not local)?

Thank you

Nick ODell
Helena
  • Can you show your code? – Frank Yellin Jan 13 '22 at 21:05
  • Hi Frank Yellin, here is my code:
        a = np.array([0.45275208, 0.32413238, -0.30483703, 0.99848052, 0.55297566, 4.26764045, 1.03994821, 0.25612404, -4.58725626, 0.67264732, 0.17100111, -3.20751967])
        bnds = ((-1, 1), (-1, 1), (-1, 1), (0, None), (0, None), (None, None), (0, None), (0, None), (None, None), (0, None), (0, None), (None, None))
        minimizer_kwargs = dict(method="nelder-mead", bounds=bnds)
        res = basinhopping(neg_loglike, a, minimizer_kwargs=minimizer_kwargs)
    Sorry, I am new to Stack. I tried to read how to format comments but could not make it work. – Helena Jan 14 '22 at 09:14
  • Please provide enough code so others can better understand or reproduce the problem. – Community Jan 22 '22 at 21:50
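For context, below is a runnable reconstruction of the snippet from the comments that compares the two local minimizers mentioned in the question. The real neg_loglike is not shown anywhere in the thread, so a simple smooth stand-in objective is used here purely so the example executes; the starting point and bounds are the ones Helena posted.

    import numpy as np
    from scipy.optimize import basinhopping

    # Placeholder objective: the question's neg_loglike is not shown,
    # so a simple smooth function of the 12 parameters stands in for it here.
    def neg_loglike(x):
        return np.sum((x - 0.5) ** 2)

    a = np.array([0.45275208, 0.32413238, -0.30483703, 0.99848052, 0.55297566,
                  4.26764045, 1.03994821, 0.25612404, -4.58725626, 0.67264732,
                  0.17100111, -3.20751967])
    bnds = ((-1, 1), (-1, 1), (-1, 1), (0, None), (0, None), (None, None),
            (0, None), (0, None), (None, None), (0, None), (0, None), (None, None))

    # Run the same global search with two different local minimizers and
    # compare the objective value and the component of x the question asks about.
    for method in ("nelder-mead", "L-BFGS-B"):
        minimizer_kwargs = dict(method=method, bounds=bnds)
        res = basinhopping(neg_loglike, a, minimizer_kwargs=minimizer_kwargs)
        print(method, res.fun, res.x[5])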

1 Answer


The basin-hopping method is not guaranteed to find the global minimum for an arbitrary function. It is also not deterministic: there is a random component in the way it explores the vicinity of the current point, as described in the documentation of the take_step argument.

If you want to reproduce the same result in two different calls, then in addition to using the same method you must use the same seed parameter.

Using the same seed should also increase the likelihood of getting the same result with different local optimizer methods.
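As a minimal sketch of the reproducibility point, assuming a simple stand-in objective (the question's neg_loglike is not available), fixing the seed makes two otherwise identical basinhopping calls return the same result:

    import numpy as np
    from scipy.optimize import basinhopping

    # Stand-in 1-D objective with several local minima; the question's
    # neg_loglike is not shown, so this is only for illustration.
    def f(x):
        return np.cos(14.5 * x[0] - 0.3) + (x[0] + 0.2) * x[0]

    x0 = [1.0]
    kwargs = dict(method="L-BFGS-B")

    # Same seed -> the random step sequence is identical, so both calls
    # explore the same basins and return the same result.
    res1 = basinhopping(f, x0, minimizer_kwargs=kwargs, niter=200, seed=1)
    res2 = basinhopping(f, x0, minimizer_kwargs=kwargs, niter=200, seed=1)
    print(res1.x, res1.fun)  # identical to res2.x, res2.fun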

Bob