Differential Evolution: Getting Started, Python implementation: de.py

by Amit

Differential Evolution (DE) is a population-based global optimization method inspired by the same principles as Genetic Algorithms, Evolution Strategies, and Evolutionary Programming. The basic idea remains the same: the search space of the given problem is explored in an inherently parallel manner, with the algorithm processing a population of candidate solutions at each step. Like the others, DE has crossover, mutation, and selection operators, which guide the algorithm toward a solution. However, DE is based on very simple principles, which makes it easy not only to understand but also to implement (~200 lines of Python, as you will see, and possibly even fewer, since I have probably written C in Python) and hence to experiment with.

DE was introduced by Storn and Price in their paper Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. At this point, I would suggest you go and read that paper directly; it's super simple to follow. Then come back, if you haven't already started writing your own DE in your favorite language.

Working principles in brief

  1. Define an objective function to minimize (or maximize)
  2. Set the population size, NP; here it is set to 20 × the dimension of the problem
  3. Generate the initial population. In DE terminology, each individual is called a vector, and for each variable in the objective function there is a corresponding vector component. The value of each component is chosen at random between the specified lower and upper bounds
  4. Differential mutation: for each of the NP individuals, called the target vector, generate a mutant vector v as follows: v_{i,g} = x_{r0,g} + F.(x_{r1,g} - x_{r2,g}), where g \in [1,dimension] indexes the components. r0, r1, r2 are three random indices in [1,NP] such that r0, r1, r2, and i are all distinct. F is the scaling factor, which determines the rate of the population's evolution (yet to grok this fully); 0.9 is a recommended value
  5. Crossover: the mutant vector then undergoes crossover, a mix of its vector components with those of the target vector. A trial vector is formed by replacing some components of the mutant vector with the corresponding components of the target vector; the number of components replaced is determined, indirectly, by the crossover probability, cr
  6. Selection: the trial vector obtained from the previous step is compared with the target vector. If it gives a better (lower, in this case) objective function value, it replaces the target vector in the population and takes part in the next generation
  7. Continue steps 4, 5, and 6 until the maximum number of generations, maxgen, is reached or a convergence criterion is met
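The steps above can be sketched in a few lines of Python. This is a minimal, self-contained illustration of DE/rand/1 with binomial crossover, not the de.py discussed below; the function name `de_rand_1` and its default parameters are my own choices for the example.

```python
import random

def de_rand_1(f, bounds, np_size=40, F=0.9, cr=0.5, maxgen=200, seed=0):
    """Minimize f over the box given by bounds using DE/rand/1/bin."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Step 3: initial population, each component uniform within its bounds
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_size)]
    cost = [f(x) for x in pop]
    for _ in range(maxgen):
        for i in range(np_size):
            # Step 4: three distinct random indices, all different from i
            r0, r1, r2 = rng.sample([j for j in range(np_size) if j != i], 3)
            mutant = [pop[r0][g] + F * (pop[r1][g] - pop[r2][g])
                      for g in range(dim)]
            # Step 5: binomial crossover; jrand guarantees the trial vector
            # takes at least one component from the mutant
            jrand = rng.randrange(dim)
            trial = [mutant[g] if (rng.random() < cr or g == jrand) else pop[i][g]
                     for g in range(dim)]
            # keep the trial vector inside the bounds
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            # Step 6: greedy selection against the target vector
            fc = f(trial)
            if fc <= cost[i]:
                pop[i], cost[i] = trial, fc
    best = min(range(np_size), key=cost.__getitem__)
    return pop[best], cost[best]

# usage: minimize the 3-dimensional sphere function
x, fx = de_rand_1(lambda v: sum(c * c for c in v), [(-5.0, 5.0)] * 3)
```

Note that the selection step is greedy and elitist per individual: a target vector is only ever replaced by a trial vector that is at least as good, so the best solution found is never lost.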

Python implementation

The above working principles have been used to write de.py, which is a DE implementation of the DE/rand/1 variety. I have great hopes of improving this piece of code to incorporate a number of other features so that it can be used for DE research.

Learn more

Here are a couple of resources you should definitely refer to:

  1. Differential evolution-a simple and efficient adaptive scheme for global optimization (PDF)
  2. Differential evolution: a practical approach to global optimization (Google Books)