allele: the basic genetic unit, expressed as some trait
chromosome: a collection of alleles that make up an individual's genotype
genotype: the encoded description of an individual
phenotype: the individual as expressed by the genotype
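to make those terms concrete, here's a rough python sketch (my own toy example, not the code linked below) of how they might map onto data structures in a simple bit-string GA:

    import random

    ALLELES = "01"                      # an allele: the basic unit, here just a bit

    def random_genotype(length=8):
        # a chromosome/genotype: a string of alleles encoding an individual
        return "".join(random.choice(ALLELES) for _ in range(length))

    def phenotype(genotype):
        # the phenotype: the individual as expressed by the genotype.
        # here we just decode the bit string as an integer, but it could be
        # a melody, a circuit, a set of synth parameters, etc.
        return int(genotype, 2)

    g = random_genotype()
    print(g, "->", phenotype(g))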
i've put some source code showing how to use the simple GA here:
http://music.columbia.edu/~douglas/evo.tar.gz
"Genetic programming is much more powerful than genetic algorithms. The output of the genetic algorithm is a quantity, while the output of the genetic programming is a another computer program. In essence, this is the beginning of computer programs that program themselves.
Genetic programming works best for several types of problems. The first type is where there is no ideal solution (for example, a program that drives a car). There is no one solution to driving a car. Some solutions drive safely at the expense of time, while others drive fast at a high safety risk. Therefore, driving a car consists of making compromises of speed versus safety, as well as many other variables. In this case genetic programming will find a solution that attempts to compromise and be the most efficient solution from a large list of variables.
Furthermore, genetic programming is useful in finding solutions where the variables are constantly changing. In the previous car example, the program will find one solution for a smooth concrete highway, while it will find a totally different solution for a rough unpaved road."
from: http://www.geneticprogramming.com/Tutorial/index.html
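to get a feel for "the output is another computer program", here's a very small sketch (my own, not from that tutorial) of one common GP representation: a program as an expression tree that can be run and mutated.

    import random
    import operator

    OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

    def random_tree(depth=2):
        # a program is a tree: internal nodes are operators, leaves are
        # variables or constants
        if depth == 0 or random.random() < 0.3:
            return random.choice(["x", random.randint(0, 9)])
        return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

    def evaluate(tree, x):
        # running the evolved program on an input
        if tree == "x":
            return x
        if isinstance(tree, int):
            return tree
        op, left, right = tree
        return OPS[op](evaluate(left, x), evaluate(right, x))

    def mutate(tree, depth=2):
        # mutation: replace a random subtree with a fresh random one
        if not isinstance(tree, tuple) or random.random() < 0.2:
            return random_tree(depth)
        op, left, right = tree
        if random.random() < 0.5:
            return (op, mutate(left, depth - 1), right)
        return (op, left, mutate(right, depth - 1))

    t = random_tree()
    print(t, "->", evaluate(t, 3), "mutated:", mutate(t))

a real GP system would add crossover (swapping subtrees between programs) and a fitness measure, but the point is just that the individuals being evolved are runnable programs, not single values.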
the basic ant colony algorithm (ACA) simulation needs:
a two-dimensional space
an ant nest
a bunch of ants
ant pheromones
a food source/target
an ant moves randomly about the space, except when it encounters a pheromone-based ant trail. when a pheromone scent is present, its strength determines the probability that the ant will follow the trail.
the ant continues wandering around (and following trails) until it encounters the food source, at which point it grabs some food and returns directly to the nest. on its way back to the nest it leaves a pheromone trail. when it reaches the nest it dumps the food and goes back to wandering around the space.
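here's one way that single-ant behavior might look in python. this is a rough sketch with made-up parameters (grid size, DEPOSIT, the follow-probability formula), not a full ACA implementation:

    import random

    GRID = 20                      # the two-dimensional space
    NEST, FOOD = (0, 0), (15, 15)  # nest and food source positions (arbitrary)
    pheromone = {}                 # pheromone strength per cell, 0 if absent
    DEPOSIT = 1.0                  # pheromone laid per step on the way home (made up)

    def neighbors(pos):
        x, y = pos
        cells = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        return [(cx, cy) for cx, cy in cells if 0 <= cx < GRID and 0 <= cy < GRID]

    def step(ant):
        # ant is a dict: {"pos": (x, y), "has_food": bool}
        if ant["has_food"]:
            # head straight back to the nest, laying pheromone along the way
            x, y = ant["pos"]
            pheromone[(x, y)] = pheromone.get((x, y), 0.0) + DEPOSIT
            nx = x - (x > NEST[0]) + (x < NEST[0])
            ny = y - (y > NEST[1]) + (y < NEST[1])
            ant["pos"] = (nx, ny)
            if ant["pos"] == NEST:
                ant["has_food"] = False   # dump the food, go back to wandering
        else:
            # wander, but follow pheromone with probability that grows with strength
            cells = neighbors(ant["pos"])
            scents = [pheromone.get(c, 0.0) for c in cells]
            if sum(scents) > 0 and random.random() < max(scents) / (1.0 + max(scents)):
                ant["pos"] = cells[scents.index(max(scents))]
            else:
                ant["pos"] = random.choice(cells)
            if ant["pos"] == FOOD:
                ant["has_food"] = True    # grab some food

    ants = [{"pos": NEST, "has_food": False} for _ in range(25)]
    for _ in range(1000):
        for a in ants:
            step(a)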
over time as ants find the food source, various routes will be discovered. the shorter routes will build up stronger pheromone trails since in a given amount of time the ants can make more trips on a shorter trail than on a longer one (thus leaving the shorter trail with more pheromones on it).
this process establishes a strong positive feedback loop: the shorter the trail, the stronger the pheromones on that trail. the stronger the pheromones on a trail, the more likely it is that an ant will follow that trail. thus ants will tend, over time, to favor shorter trails.
while the ants will favor shorter trails, those trails may not be the shortest trails. there are a few ways to inject some entropy into the process to keep the ants from falling into local minima too quickly. one is to add time-based pheromone strength decay. this will help avoid having the ants settle into a trail that is shorter than many other trails, but is still much longer than the best trails. since it will take the ants a long time to traverse the shortest-but-still-too-long trail, most of the pheromone they lay down will decay by the time they get back to the nest. that leaves other ants free to search for other, shorter trails.
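in code that decay is usually just a per-step evaporation applied to every cell, continuing the sketch above (EVAPORATION is a made-up rate):

    EVAPORATION = 0.02   # fraction of pheromone lost each time step (made up)

    def decay(pheromone):
        # weaken every trail a little; long round trips lose most of their
        # scent before the ant makes it back to the nest
        for cell in list(pheromone):
            pheromone[cell] *= (1.0 - EVAPORATION)
            if pheromone[cell] < 1e-3:
                del pheromone[cell]   # drop trails that have effectively vanished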
another technique is to put barriers across some trails. this forces the ants to make their way around the barriers, thereby possibly finding shorter trails (or more interesting ones anyway).
you can also play with the pheromone-following probability settings. for instance you could make the probability increase as the inverse log of the pheromone strength so that a trail would have to get pretty strong before most ants would start following it. that would encourage ants to explore more of the space before settling into certain paths.
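if i'm reading "inverse log" as "the probability grows only logarithmically with strength", a follow-probability function like this would do it (the constants are arbitrary):

    import math

    def follow_probability(strength):
        # probability rises slowly (logarithmically) with pheromone strength,
        # so a trail has to get quite strong before most ants will commit to it
        return min(1.0, 0.1 * math.log(1.0 + strength))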
so what? here are some ideas for using the ACA:
"11.1.4 Simulated Annealing
Simulated annealing [Fox:88mm], [Hajek:88a], [Kirkpatrick:83a], [Otten:89a] is a very general optimization method which stochastically simulates the slow cooling of a physical system. The idea is that there is a cost function H (in physical terms, a Hamiltonian) which associates a cost with a state of the system, a "temperature" T, and various ways to change the state of the system. The algorithm works by iteratively proposing changes and either accepting or rejecting each change. Having proposed a change we may evaluate the change in H. The proposed change may be accepted or rejected by the Metropolis criterion: if the cost function decreases, the change is accepted unconditionally; otherwise it is accepted only with probability exp(-dH/T). A crucial requirement for the proposed changes is reachability or ergodicity: that there be a sufficient variety of possible changes that one can always find a sequence of changes so that any system state may be reached from any other.
When the temperature is zero, changes are accepted only if H decreases, an algorithm also known as hill-climbing, or more generally, the greedy algorithm [Aho:83a]. The system soon reaches a state in which none of the proposed changes can decrease the cost function, but this is usually a poor optimum. In real life, we might be trying to achieve the highest point of a mountain range by simply walking upwards; we soon arrive at the peak of a small foothill and can go no further.
On the contrary, if the temperature is very large, all changes are accepted, and we simply move at random ignoring the cost function. Because of the reachability property of the set of changes, we explore all states of the system, including the global optimum.
Simulated annealing consists of running the accept/reject algorithm between the temperature extremes. We propose many changes, starting at a high temperature and exploring the state space, and gradually decreasing the temperature to zero while hopefully settling on the global optimum. It can be shown that if the temperature decreases sufficiently slowly (the inverse of the logarithm of the time), then the probability of being in a global optimum tends to certainty [Hajek:88a]."
from: http://www.npac.syr.edu/copywrite/pcw/node252.html
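as a concrete (toy) example, here's a sketch of that accept/reject loop in python, minimizing a made-up 1-d cost function with a slow logarithmic cooling schedule:

    import math
    import random

    def cost(x):
        # a toy cost function H with lots of local minima
        return x * x + 10.0 * math.sin(3.0 * x)

    def anneal(steps=20000, t0=10.0):
        x = random.uniform(-10, 10)
        best = x
        for k in range(1, steps + 1):
            t = t0 / math.log(k + 1)           # slow (logarithmic) cooling
            x_new = x + random.uniform(-1, 1)  # propose a small change
            dh = cost(x_new) - cost(x)
            # metropolis criterion: always accept improvements, sometimes
            # accept uphill moves with probability exp(-dH/T)
            if dh < 0 or random.random() < math.exp(-dh / t):
                x = x_new
            if cost(x) < cost(best):
                best = x
        return best

    print(anneal())

at high T almost every move is accepted (random exploration); as T falls toward zero it turns into plain hill-climbing, which is the whole idea of the schedule.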
http://www.aridolan.com/ga/gaa/gaa.html
(java applets)
http://www.aic.nrl.navy.mil/galist/src/
(source code)
http://www.taygete.demon.co.uk/java/ga/
(simple ga animated applet)
http://www.jps.net/shamblen/pro/thesis.html
(music notation GA)
http://www.cs.cmu.edu/Groups/AI/html/faqs/ai/genetic/
(genetic algorithms FAQ)