
What does the energy variable represent in a simulated annealing algorithm? I'm guessing it's similar to the fitness variable in a GA?

Undefined

3 Answers


Yeah, it's very similar to a fitness function in genetic programming or a GA. The energy (E) of the system starts in some arbitrary, typically high-energy state. At each step the energy is evaluated and the system attempts to move to a lower-energy state.

In the beginning, when the system has a high "temperature", larger moves away from the optimal state are allowed so that the system can escape local optima. Over many steps, the temperature is decreased (and, hopefully, so is the energy level).
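A minimal sketch of that loop, assuming a generic `energy` function and `neighbor` move (the names, the geometric cooling schedule, and the default temperatures are all illustrative, not taken from any particular library):

```python
import math
import random

def simulated_annealing(energy, neighbor, initial_state,
                        t_start=10.0, t_end=0.01, cooling=0.95):
    """Minimal SA loop: `energy` plays roughly the role a fitness function
    plays in a GA, except that lower is better and it feeds the acceptance
    test directly."""
    state = initial_state
    e = energy(state)
    t = t_start
    while t > t_end:
        candidate = neighbor(state)
        e_new = energy(candidate)
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature drops, so early on the search can
        # still climb out of local optima.
        if e_new < e or random.random() < math.exp((e - e_new) / t):
            state, e = candidate, e_new
        t *= cooling  # geometric cooling schedule
    return state, e
```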

There are a lot of good write-ups about simulated annealing out there. Here is a good PPT overview: Link

Ryan O.

I do not see a clear relationship between energy in SA and fitness in GA.

The energy in SA defines the search space for the next iteration: as the energy variable shrinks, the volume of the search space shrinks. For instance, if you were doing some kind of musical search and you had a note that was "C", a high SA energy might allow that value to become anything from an "A" to a "G", while a low SA energy might allow it only to become a C flat or a C sharp.
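A sketch of that shrinking-neighborhood idea in the music example (the note list and the 0-to-1 control parameter are made up for illustration):

```python
import random

NOTES = ["A", "B", "C", "D", "E", "F", "G"]

def neighbor_note(current, energy):
    """Pick a new note within a window around the current one.
    A high `energy` (near 1.0) allows any note from A to G; a low
    `energy` (near 0.0) only allows a step to an adjacent note."""
    i = NOTES.index(current)
    radius = max(1, round(energy * (len(NOTES) - 1)))  # window shrinks with energy
    lo, hi = max(0, i - radius), min(len(NOTES) - 1, i + radius)
    return random.choice(NOTES[lo:hi + 1])
```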

In GA, the search space is defined by the entropy of the values at a given genotypic location. Thus, if at position 1 in your music search every individual has a "C" note, the children will have a "C" at that position (barring mutation), and there's no real search along that dimension of the solution space. If, though, the values at position 2 in the genotype are spread evenly across "A" through "G", then the search space there is very large.
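That spread can be made concrete by measuring the entropy of the values at one position; the helper below is only a toy sketch, with `population` assumed to be a list of note tuples:

```python
import math
from collections import Counter

def positional_entropy(population, position):
    """Shannon entropy of the values a population carries at one genotype
    position: 0.0 when every individual agrees (no search along that
    dimension), maximal when the values are spread evenly over A-G."""
    counts = Counter(individual[position] for individual in population)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# population = [("C", "A"), ("C", "G"), ("C", "D")]
# positional_entropy(population, 0) == 0.0   -> everyone has "C"
# positional_entropy(population, 1) ~= 1.58  -> wide-open search space
```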

The fitness in a GA is simply the quality of a complete solution. It's a description of an individual and not a parameter to the next iteration (except indirectly, as it influences selection). So I just do not see any good conceptual mapping to SA energy.
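By contrast, a GA fitness function in the same toy setting just scores a finished melody and plays no direct part in generating the next candidates (the `target` melody below is invented for the example):

```python
def fitness(melody, target=("C", "E", "G")):
    """GA-style fitness: a quality score for a *complete* candidate.
    It only ranks individuals for selection; it does not shape where
    the next candidates come from."""
    return sum(1 for note, want in zip(melody, target) if note == want)

# fitness(("C", "E", "A")) == 2
```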

Larry OBrien

In Simulated Annealing, the energy (E) of a point determines its probability of being accepted as a solution. When the temperature parameter is high, the algorithm accepts new solutions almost at random, whether their energy is low or high. When the temperature is low, it accepts almost exclusively solutions whose energy is lower than the current one.

In typical implementations, the algorithm decreases its temperature parameter as the iterations pass. This produces a smooth transition from random to deterministic behavior, which is a key characteristic of Simulated Annealing.
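A sketch of that acceptance rule (the standard Metropolis criterion; the numbers in the comments are made up to show the two regimes):

```python
import math
import random

def accept(e_current, e_candidate, temperature):
    """Metropolis acceptance test: at high temperature even much worse
    candidates pass often, so behavior looks random; at low temperature
    only lower-energy (or barely worse) candidates get through."""
    if e_candidate <= e_current:
        return True
    return random.random() < math.exp((e_current - e_candidate) / temperature)

# High temperature: a worse candidate (energy 5 vs. 1) is accepted ~67% of the
#   time, since math.exp((1 - 5) / 10.0) ~= 0.67.
# Low temperature: the same candidate is essentially never accepted,
#   since math.exp((1 - 5) / 0.1) ~= 4e-18.
```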

There are some texts that explain both Simulated Annealing and Genetic algorithms in depth. I recommend:

[1] Vöcking, B., Alt, H., Dietzfelbinger, M., Reischuk, R., Scheideler, C., Vollmer, H., Wagner, D. "Algorithms Unplugged", Ed. Berlin, Germany: Springer-Verlag Berlin Heidelberg, 2011, ch. 41, pp. 393-400.

[2] Pham, D. T., Karaboga, D., "Intelligent Optimisation Techniques". London, United Kingdom: Springer-Verlag London, 2000.

Ricardo Alejos