
I want to know how to trigger early stopping for scipy optimization algorithms like differential evolution, dual annealing, and basin hopping.

I know you can pass a callback function that returns True or False, but I wanted to know how to make the whole algorithm stop when the fitness function converges to the same value for several iterations. For instance, if maxiter is set to 250 but the fitness function converged at 100 iterations, how do I trigger it to stop once it is no longer making any improvements?
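What I have in mind is roughly a callback that counts stagnant iterations and returns True once the best value stops improving (the Rastrigin function, the patience of 20, and the min_delta threshold below are just illustrative choices, not anything specific to my real problem):

```python
import numpy as np
from scipy.optimize import differential_evolution

def rastrigin(x):
    # Standard Rastrigin test function, used here only for illustration.
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

class StagnationStopper:
    """Return True (i.e., stop) when the best energy seen has not
    improved by more than min_delta for `patience` consecutive calls."""
    def __init__(self, func, patience=20, min_delta=1e-8):
        self.func = func
        self.patience = patience
        self.min_delta = min_delta
        self.best = np.inf
        self.stalled = 0

    def __call__(self, xk, convergence=None):
        energy = self.func(xk)
        if energy < self.best - self.min_delta:
            self.best = energy
            self.stalled = 0
        else:
            self.stalled += 1
        return self.stalled >= self.patience

result = differential_evolution(
    rastrigin,
    bounds=[(-5.12, 5.12)] * 3,
    maxiter=250,
    callback=StagnationStopper(rastrigin, patience=20),
    seed=1,
)
print(result.nit)  # stops as soon as the callback returns True
```

Is something like this the intended approach, or is there a built-in way to do it?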

  • `maxiter` is when to stop if it *doesn't* converge (i.e., fails). If you get the same number more than once, it will terminate. That's the whole point. – Mad Physicist Dec 23 '21 at 22:44

1 Answer


In addition to the maxiter parameter, there is a tolerance that determines convergence. All the algorithms stop once they reach the convergence criterion.

Take differential_evolution as an example. It has two tolerance parameters (most have only the first one):

tol : float, optional

Relative tolerance for convergence, the solving stops when np.std(pop) <= atol + tol * np.abs(np.mean(population_energies)), where atol and tol are the absolute and relative tolerance respectively.

...

atol : float, optional

Absolute tolerance for convergence, the solving stops when np.std(pop) <= atol + tol * np.abs(np.mean(population_energies)), where atol and tol are the absolute and relative tolerance respectively.
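A minimal sketch of the effect: with a looser tol the population spread satisfies the criterion sooner, so the solver returns in fewer generations than with a tight tol. The sphere objective, bounds, and seed below are arbitrary illustrative choices:

```python
import numpy as np
from scipy.optimize import differential_evolution

def sphere(x):
    # Simple convex test objective, for illustration only.
    return np.sum(x**2)

bounds = [(-5, 5)] * 3

# Loose relative tolerance: the convergence criterion is met early.
loose = differential_evolution(sphere, bounds, tol=1e-2, maxiter=250, seed=1)

# Tight relative tolerance: the solver keeps iterating until the
# population's energies have collapsed to a much smaller spread.
tight = differential_evolution(sphere, bounds, tol=1e-10, maxiter=250, seed=1)

print(loose.nit, tight.nit)  # the loose run stops in fewer generations
```

So rather than writing a stagnation callback yourself, you can often just set tol (and atol, where available) to the point at which further improvement no longer matters to you.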
