Genetic Algorithm Notes

Updated 5th September 2007: First content added.

Creative Commons License Genetic Algorithm Notes by Matt Oates is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 License.


The following are some notes about Genetic Algorithms (GAs) adapted from my undergraduate dissertation. The material has been updated to include more practical examples of how to create a GA for yourself.


Programmed evolutionary systems have been investigated since the availability of early digital computers in the 1950s and 60s. In the 1970s the Genetic Algorithm emerged as the culmination of this early work, with John Holland's Schema Theorem (Holland, 1975, 1992) formally describing a general case for solving problems iteratively through the evolved sexual combination of bit strings. Specifically, the Schema Theorem outlined the use of schemas as an abstraction over directly evolving long and complicated bit sequences: instead, patterns of commonality between fit bit strings were evolved. These early works give the common heritage of genetic crossover, mutation, fitness measurement, and evolutionary iteration to any evolutionary system that now comes under the umbrella term Genetic Algorithm. Genetic Algorithms were popularised in the 1980s and 90s by David E. Goldberg and his book "Genetic Algorithms in Search, Optimization, and Machine Learning", which is amongst the most cited works in Computer Science (CiteSeer, 2006). Goldberg furthered the effort of showing that GAs were not just an abstract novelty but could be applied to real-world problems reliably and efficiently, typified by the gas pipeline optimisation problems detailed in his early PhD work.

General Algorithm

  1. Create a random initial population of chromosomes: bit-strings, character-strings, object-arrays, or any other data representation of your problem, so long as it is string or list based.
  2. Calculate and record the fitness or suitability of each solution in the population to solve the problem.
  3. Select a number of solutions from the population, or pool. Selection should be biased towards fitter solutions in each algorithm epoch. Many selection methods have been explored, but the most effective leave open the possibility for less fit solutions to pass from one generation to the next.
  4. Genetic operators now act on the selected chromosomes, each with a given probability of affecting any given chromosome; this is conceptually equivalent to reproduction in Biology. The process should be blind to the fitness of any solution, but tailored so that it is not so destructive to the internal structure of a chromosome that it cripples its fitness in the next generation. These operators commonly include sexual crossover and random mutation.
  5. Collate a new generation from the set of solutions output by the genetic operators.
  6. If an upper limit of iterations (epochs) has been reached, or a desired level of fitness within the population has been achieved, end the algorithm, offering the most fit solution as output. If these conditions are not met, return to stage 2 for another epoch.
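The stages above can be sketched in a few dozen lines of Python. The problem chosen here ("OneMax", maximising the number of 1 bits in a bit-string), the parameter values, and the operator choices (tournament selection, single-point crossover, bit-flip mutation) are illustrative assumptions on my part, not prescriptions; any of them can be swapped for alternatives that fit your problem.

```python
import random

# A minimal GA sketch, assuming a bit-string chromosome and the
# "OneMax" fitness function (count of 1 bits). All parameters below
# are illustrative choices, not recommended values.
CHROMOSOME_LENGTH = 20
POPULATION_SIZE = 30
CROSSOVER_RATE = 0.7
MUTATION_RATE = 0.01
MAX_EPOCHS = 200

def random_chromosome():
    # Stage 1: a random bit-string chromosome.
    return [random.randint(0, 1) for _ in range(CHROMOSOME_LENGTH)]

def fitness(chromosome):
    # Stage 2: fitness is simply the number of 1 bits (OneMax).
    return sum(chromosome)

def tournament_select(population):
    # Stage 3: binary tournament selection biases towards fitter
    # solutions while still letting less fit ones through occasionally.
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(parent_a, parent_b):
    # Stage 4a: single-point sexual crossover, applied with a given
    # probability; otherwise the parents pass through unchanged.
    if random.random() < CROSSOVER_RATE:
        point = random.randrange(1, CHROMOSOME_LENGTH)
        return (parent_a[:point] + parent_b[point:],
                parent_b[:point] + parent_a[point:])
    return parent_a[:], parent_b[:]

def mutate(chromosome):
    # Stage 4b: random bit-flip mutation, kept rare so it is not
    # overly destructive to the chromosome's internal structure.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in chromosome]

def evolve():
    population = [random_chromosome() for _ in range(POPULATION_SIZE)]
    for epoch in range(MAX_EPOCHS):
        # Stage 6: stop early once a perfect solution appears.
        best = max(population, key=fitness)
        if fitness(best) == CHROMOSOME_LENGTH:
            break
        # Stage 5: collate the next generation from operator output.
        next_generation = []
        while len(next_generation) < POPULATION_SIZE:
            child_a, child_b = crossover(tournament_select(population),
                                         tournament_select(population))
            next_generation.extend([mutate(child_a), mutate(child_b)])
        population = next_generation[:POPULATION_SIZE]
    return max(population, key=fitness)

if __name__ == "__main__":
    solution = evolve()
    print(fitness(solution), solution)
```

Note that selection and reproduction are kept separate, as in the stages above: `tournament_select` consults fitness, while `crossover` and `mutate` are blind to it.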









The information provided on this and other pages by me, Matt Oates, is under my own personal responsibility and not that of Aberystwyth University or the University of Bristol. Similarly, any opinions expressed are my own and are in no way to be taken as those of either institution.