
From browsing the DEAP documentation and examples (here, here, here), I found a few instances of DEAP being used for multi-objective optimization, but nothing on multi-modal optimization.

Is it possible to use the DEAP framework for evolutionary multimodal optimization, similar to what is described in this article? Are there examples of this being done?

  • Yes and no. Genetic algorithms have facilities to escape local solutions (mutation). But as they are just heuristics, they typically will not find the global optimum solution. Of course in many practical cases, a good solution may be quite acceptable. – Erwin Kalvelagen Oct 30 '19 at 07:58
  • @ErwinKalvelagen Thank you for taking the time to comment. However, I don't see how this relates to my question. Genetic algorithms can be used for multimodal optimization by using niching/crowding strategies. I would like to know if there are examples of DEAP being used to do this. – usernumber Oct 30 '19 at 08:40

1 Answer


DEAP doesn't have built-in support for multimodal optimization. However, it can still be used to solve such problems by supplying a suitable fitness function, in this case one based on fitness sharing: each individual's raw fitness f(i) is divided by a niche count sum_j sh(d(i, j)), where sh(d) = 1 - (d/sigma)**alpha for d < sigma and 0 otherwise, so that crowded regions of the search space are penalized and the population spreads across several peaks.

import numpy as np
import random
import math
import matplotlib.pyplot as plt
from scipy.spatial import distance_matrix
from deap import base, tools, creator, algorithms

npop = 1000
sigma = 0.3   # niche radius for fitness sharing
alpha = 2     # sharing exponent

# Here is a function with many local maxima
def objective(x, y):
    return np.exp(-9*abs(x*y)) * np.sin(3*math.pi*x)**2 * np.sin(3*math.pi*y)**2

def fitness(individual):
    return objective(individual[0], individual[1]),

# Evaluate the objective on a grid for the background plot
xrange = np.arange(0., 1., 0.01)
X, Y = np.meshgrid(xrange, xrange)
zs = np.array(objective(np.ravel(X), np.ravel(Y)))
Z = zs.reshape(X.shape)

# Setup the DEAP toolbox
toolbox = base.Toolbox()

# Maximize a single objective; individuals are pairs of floats drawn uniformly from [0, 1)
creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)
toolbox.register("individual", tools.initRepeat, creator.Individual, random.random, n=2)

toolbox.register("population", tools.initRepeat, list, toolbox.individual)

# Bounded polynomial mutation keeps individuals inside [0, 1] x [0, 1]
toolbox.register('mutate', tools.mutPolynomialBounded, eta=.6, low=[0,0], up=[1,1], indpb=0.1)
toolbox.register('mate', tools.cxUniform, indpb=0.5)
toolbox.register('select', tools.selBest)

# Define a shared fitness function: raw fitness is divided by a niche
# count, penalizing individuals that sit in crowded regions
def sharing(distance, sigma, alpha):
    # Sharing kernel: 1 - (d/sigma)**alpha for d < sigma, 0 otherwise
    res = 0
    if distance < sigma:
        res += 1 - (distance/sigma)**alpha
    return res

def shared_fitness(individual, population, sigma, alpha):
    num = fitness(individual)[0]

    # Niche count: sum of the sharing kernel over the whole population.
    # If `individual` is in `population`, its zero self-distance contributes 1.
    dists = distance_matrix([individual], population)[0]
    tmp = [sharing(d, sigma, alpha) for d in dists]
    den = sum(tmp)

    return num/den,

pop = toolbox.population(n=npop)

# Shared fitness is evaluated against the current population (bound by reference)
toolbox.register('evaluate', shared_fitness, population=pop, sigma=sigma, alpha=alpha)

# Run a (mu + lambda) strategy: each generation produces lambda_ offspring
# and keeps the best mu individuals from parents and offspring combined
mu = int(len(pop)*0.5)
lambda_ = int(len(pop)*0.5)
cxpb = 0.4
mutpb = 0.5
ngen = 10

pop, logbook = algorithms.eaMuPlusLambda(pop, toolbox, mu, lambda_, cxpb, mutpb, ngen)

# Plot the objective surface and overlay the final population in red
fig = plt.figure()
ax = fig.add_subplot(111)
sc = ax.scatter(X, Y, c=Z)
plt.colorbar(sc)
ax.scatter(*zip(*pop), color='red')

With this, the population gets distributed across niches, and each local maximum can be identified.

[Figure: final population (red) clustered around the local maxima of the objective surface]
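
To turn the niched population into an explicit list of peaks, one simple option (not part of the original answer) is a greedy filter: visit individuals in order of decreasing raw objective value and keep those at least `sigma` away from every peak kept so far. A minimal sketch, reusing `pop`, `fitness`, and `sigma` from above:

def niche_peaks(pop, min_dist=sigma):
    # Greedy peak extraction: best-first, skipping points near an accepted peak
    ranked = sorted(pop, key=lambda ind: fitness(ind)[0], reverse=True)
    peaks = []
    for ind in ranked:
        if all(np.linalg.norm(np.subtract(ind, p)) >= min_dist for p in peaks):
            peaks.append(ind)
    return peaks

for p in niche_peaks(pop):
    print(p, fitness(p)[0])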

  • Works as advertised, thanks. Minor recommendation: in the `shared_fitness()` function you may get division-by-zero NumPy warnings. Consider trapping them. – András Aszódi Nov 12 '19 at 19:48
  • @LaryxDecidua Trapping? – usernumber Nov 13 '19 at 15:26
  • 1
    "trapping" in the sense of `if abs(den) < 1.0e-7: ...`, i.e. detecting when the denominator is close to 0.0 and then handle it. NumPy issues a warning and continues. Standard Python would raise an exception. – András Aszódi Nov 13 '19 at 15:34
  • @LaryxDecidua The distance of a point to itself is 0, and `sharing(0, ...)` is equal to `1`. So the denominator is 1 plus a sum of nonnegative sharing values, meaning it is always at least 1 and can never cause a division-by-zero error. – usernumber Nov 19 '19 at 09:03
  • Hmm, that's interesting, I did get such error messages. Need to check the code once again. Thanks – András Aszódi Nov 19 '19 at 12:57
  • @LaryxDecidua I figured out where such an error message can come from. When evaluating offspring, the offspring individuals are not yet in the population list. So the shared fitness of those offspring can be very close to 0. – usernumber Nov 19 '19 at 14:07
  • A solution would be to check whether the individual is in the population, and if it isn't, add 1 to the denominator (see the sketch after these comments). – usernumber Nov 19 '19 at 14:08
  • 1
    @LaryxDecidua thanks for pointing that out. It raised interesting questions on the inner workings of the algorithm. – usernumber Nov 19 '19 at 14:09
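
Following the fix proposed in the comments, here is a minimal sketch of a guarded shared fitness (the `shared_fitness_guarded` name and the explicit membership check are additions; it reuses `fitness`, `sharing`, and `distance_matrix` from the answer above):

def shared_fitness_guarded(individual, population, sigma, alpha):
    # Shared fitness that is also safe for offspring not yet in `population`
    num = fitness(individual)[0]

    dists = distance_matrix([individual], population)[0]
    den = sum(sharing(d, sigma, alpha) for d in dists)

    # Offspring are evaluated before they are inserted into `population`, so
    # their zero self-distance term is missing; add it back to keep den >= 1.
    if individual not in population:
        den += 1

    return num/den,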