
The tutorial from scipy only shows one possible solution for differential evolution (https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.differential_evolution.html).

How can I get multiple solutions? And if not, is it because of the scipy implementation, or are differential evolution algorithms just designed that way?

import numpy as np
from scipy.optimize import rosen, differential_evolution

# Minimize the 5-dimensional Rosenbrock function over the box [0, 2]^5.
bounds = [(0, 2), (0, 2), (0, 2), (0, 2), (0, 2)]
result = differential_evolution(rosen, bounds)
result.x, result.fun
sayuri
  • Setting a different seed should give you a different solution: `differential_evolution(rosen, bounds, seed=42)` and `differential_evolution(rosen, bounds, seed=24)`. – alvas Mar 23 '23 at 09:33

2 Answers


There are several hyperparameters you can tune so that the evolution algorithm converges to a different solution; see the parameter list in the docs:

https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.differential_evolution.html

E.g.

import numpy as np
from scipy.optimize import rosen, differential_evolution
bounds = [(0, 2), (1, 3), (1, 2), (4, 7), (500, 900)]

# Different seeds give different random initial populations, so each
# run can land on a slightly different final point.
result = differential_evolution(rosen, bounds, seed=234)
print(result.x, result.fun)

result = differential_evolution(rosen, bounds, seed=42)
print(result.x, result.fun)

# Changing mutation/recombination alters the search trajectory as well.
result = differential_evolution(rosen, bounds, seed=23, mutation=(0.9, 1), recombination=0.8)
print(result.x, result.fun)

# A tiny maxiter stops the search early, before it has converged.
result = differential_evolution(rosen, bounds, seed=23, maxiter=2, mutation=(0.9, 1), recombination=0.8)
print(result.x, result.fun)

[out]:


(array([  1.1880044 ,   1.41300298,   2.        ,   7.        ,
        500.        ]),
 20341037.207360283)

(array([  1.18838044,   1.41362179,   2.        ,   7.        ,
        500.        ]),
 20341037.207038924)

(array([  1.18891057,   1.41438122,   2.        ,   7.        ,
        500.        ]),
 20341037.207497682)

(array([  1.1885353 ,   1.41414795,   2.        ,   7.        ,
        500.        ]),
 20341037.207302164)

Using a large variance in the bounds

But since the rosen function is smooth and formulaic, the bounds need to span a very wide range before you see significant changes in the results.

import numpy as np
from scipy.optimize import rosen, differential_evolution
bounds = [(0, 221529234), (123121, 31231232), (1231, 291231235), (30434, 1232317), (500, 900)]

result = differential_evolution(rosen, bounds, seed=234)
print(result.x, result.fun)

result = differential_evolution(rosen, bounds, seed=42)
print(result.x, result.fun)

[out]:

(array([  8141.41766062, 123121.        ,   1231.        ,  30434.        ,
           813.59702423]),
 2.3065086995396433e+22)

(array([     0.        , 123121.        ,   3838.30391681,  30434.        ,
           881.09558529]),
 2.30646627657552e+22)
alvas
  • I should have clarified that the function I am using is more complex and takes far longer to reach the optimum, which is why running multiple searches is not a valid option for me. – sayuri Mar 24 '23 at 12:14
  • There's no easy way to get around running multiple seeds or multiple hyperparameters unless you do something like bandits; take a look at https://optuna.org/#code_examples . But if you want the output of different generations, simply iterate over different `maxiter` values, as in the sketch below. – alvas Mar 24 '23 at 13:20
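To illustrate the `maxiter` idea from the comment above, here is a minimal sketch (not from the original thread): running the same seeded search with progressively larger `maxiter` values snapshots the best solution at different stages of convergence. `polish=False` is assumed here so the raw population best is returned instead of the L-BFGS-B polished point.

import numpy as np
from scipy.optimize import rosen, differential_evolution

bounds = [(0, 2), (0, 2), (0, 2), (0, 2), (0, 2)]

# With a fixed seed, each run replays the same random stream, so a
# smaller maxiter is simply an earlier snapshot of the same search.
for max_generations in (5, 20, 100):
    result = differential_evolution(
        rosen, bounds, maxiter=max_generations, seed=1, polish=False
    )
    print(max_generations, result.x, result.fun)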

There is only one global minimum in the rosen function, so it's not clear why you expect there to be multiple solutions. `differential_evolution` is designed to have a greater probability of finding that single global minimum. If you have a problem with multiple minima and you want to record them all, then `shgo` might be your best option. Alternatively, you could track the progress of the `differential_evolution` population to identify different minima in the energy surface.
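For a concrete picture of what `shgo` gives you: it stores all the local minima it finds in `result.xl` (the minimizer locations) and `result.funl` (the corresponding function values). A minimal sketch on a deliberately multimodal objective (the double-well function below is an illustrative choice, not part of the original answer):

from scipy.optimize import shgo

# A 1-D double-well with two global minima, at x = -1 and x = +1.
def double_well(x):
    return (x[0] ** 2 - 1.0) ** 2

bounds = [(-2, 2)]
result = shgo(double_well, bounds, n=64, sampling_method='sobol')

print(result.xl)    # locations of every local minimum found
print(result.funl)  # objective values at each of those minima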

Andrew Nelson
  • I was asking because someone said that when using genetic algorithms, you can output not only the best optimum found but also the whole last generation. – sayuri Mar 24 '23 at 12:14
  • By the last generation the population should've converged on a solution, so there won't be much variation. It's possible to record every generation, but it's a little involved; a sketch follows below. As I said above, if you want to record the multiple local minima you should use `shgo`. – Andrew Nelson Mar 24 '23 at 21:35
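A hedged sketch of the "record every generation" idea from the comment above, using the documented `callback` hook of `differential_evolution`: the callback is invoked once per generation with the best vector found so far, so appending each argument yields a per-generation trace. Note this records only the best member of each generation, not the whole population.

import numpy as np
from scipy.optimize import rosen, differential_evolution

bounds = [(0, 2), (0, 2), (0, 2), (0, 2), (0, 2)]
history = []

# Called after every generation; xk is the best solution so far and
# convergence is the fractional tolerance reached at that point.
def record(xk, convergence=None):
    history.append(np.copy(xk))

result = differential_evolution(rosen, bounds, callback=record, seed=1)
print(len(history), "generations recorded")
print("final best:", result.x, result.fun)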