The Evolution Simulation

The simulation is described in "The Blind Watchmaker", by Richard Dawkins. It is an example showing how natural selection produces adaptation far more effectively than random chance alone.

The idea is to start with a goal string g of upper case letters and spaces, for example

   METHINKS IT IS LIKE A WEASEL
An initial string of the same length is chosen randomly as the first parent, p. Unless you are really lucky, this string will not resemble the goal string.

Measuring the closeness of two strings

We need a way to measure the closeness of such a string to the goal string. If g[0], ..., g[n] are the ASCII codes of the characters of g and p[0], ..., p[n] are the codes for p, then the closeness of p to g is defined as
   closeness(p, g) = abs(p[0] - g[0]) + ... + abs(p[n] - g[n])
This has the useful property that closeness(p, g) = 0 if and only if p = g.
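
As a concrete illustration, here is a minimal sketch of such a closeness function in modern Java (not the original Java 1.0 applet code); the method name and the assumption that both strings have the same length are ours.

   // Sum the absolute differences of the character codes.
   // The result is 0 exactly when p equals g.
   static int closeness(String p, String g) {
      int total = 0;
      for (int i = 0; i < g.length(); i++) {
         total += Math.abs(p.charAt(i) - g.charAt(i));
      }
      return total;
   }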

Introducing mutations into a string

We want to see if the initial parent can be made to evolve toward the goal string by introducing random mutations in the initial parent string. Thus, we define a mutated string or child string c derived from p as follows
   c = p with one character replaced by a random letter or space
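
One possible implementation of this mutation step, again as a modern Java sketch rather than the applet's own code, picks a random position in p and writes a random letter or space there.

   // Alphabet of allowed characters: upper case letters and space.
   static final String ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ ";

   // Return a copy of p with one randomly chosen character replaced
   // by a random letter or space.
   static String mutate(String p, java.util.Random rand) {
      char[] chars = p.toCharArray();
      int pos = rand.nextInt(chars.length);
      chars[pos] = ALPHABET.charAt(rand.nextInt(ALPHABET.length()));
      return new String(chars);
   }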

A random chance algorithm

As a first attempt at evolution we can produce one child at each step and use it as the parent for the next step. Here is the algorithm
   Choose a goal string g
   Choose an initial parent p randomly
   while (closeness(p,g) > 0)
      Choose new p = mutation of p
   end while
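In Java this random-chance loop might look like the sketch below. The randomString helper, which builds the random initial parent, is our own addition; the loop reuses the closeness and mutate methods sketched above.

   // Build a random string of letters and spaces of the given length,
   // used as the initial parent.
   static String randomString(int length, java.util.Random rand) {
      StringBuilder sb = new StringBuilder();
      for (int i = 0; i < length; i++) {
         sb.append(ALPHABET.charAt(rand.nextInt(ALPHABET.length())));
      }
      return sb.toString();
   }

   // Random-chance loop: one child per generation, and the child always
   // replaces the parent whether or not it is closer to g.  For a goal
   // string of any realistic length this loop will, in practice, never
   // terminate.
   static void randomChance(String g, java.util.Random rand) {
      String p = randomString(g.length(), rand);
      while (closeness(p, g) > 0) {
         p = mutate(p, rand);
      }
   }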
You can try this with the applet below by choosing 1 as the number of children and pressing the start button. The applet uses METHINKS IT IS LIKE A WEASEL as the default goal string. You will quickly discover that there is no convergence to the goal string. This is not surprising: with only one child per generation there is no natural selection, only random chance. Click the suspend button to stop the applet.

You may think that the goal string is too long, so type a shorter one such as HELLO and press the start button. Again, the chances of producing the goal string are pretty slim.

A natural selection algorithm

The trick is to produce several children at each generation, each obtained by a single-character mutation of the previous parent. From these children choose the one closest to the goal string as the next parent and continue. Here is the algorithm
   Choose the number of children, n, per generation
   Choose a goal string g
   Choose an initial parent p randomly
   while (closeness(p,g) > 0)
      Choose n children c[0], ..., c[n-1] as mutations of p
      Choose new p as the closest child to g
   end while
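A Java sketch of this selection loop, reusing the closeness, mutate and randomString methods sketched above, might look as follows; it is our own illustration, not the applet's source.

   // Natural-selection loop: n children per generation, and the child
   // closest to g becomes the next parent.
   static void evolve(String g, int n, java.util.Random rand) {
      String p = randomString(g.length(), rand);
      int generation = 0;
      while (closeness(p, g) > 0) {
         String best = mutate(p, rand);
         for (int i = 1; i < n; i++) {
            String c = mutate(p, rand);
            if (closeness(c, g) < closeness(best, g)) {
               best = c;
            }
         }
         p = best;
         generation++;
         System.out.println(generation + ": " + p);
      }
   }

For example, calling evolve("METHINKS IT IS LIKE A WEASEL", 100, new java.util.Random()) prints one line per generation and stops when the goal string is reached.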
Now, with a sufficient number of children, we obtain rapid evolution to the goal string. For example, try 100 children in the applet with the default goal string. Convergence is typically obtained in 50 to 60 generations. Experiment with different strings and numbers of children.

The Evolution Simulation Applet (Java 1.0)