# Genetic Algorithms

## Techniques
- combinatorial and optimization problems are characterized by searching for a solution to a problem from among many potential solutions
- in many cases an exhaustive search is infeasible, so some form of directed search is undertaken:
  - branch-and-bound search
  - dynamic programming
  - hill climbing
  - simulated annealing
  - genetic algorithms

The name and inspiration of simulated annealing come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one.
## Genetic Algorithms

- attempt to mimic the process of natural evolution in a population of individuals
- use the principles of selection and evolution to produce several candidate solutions to a given problem
- employ biologically derived techniques such as inheritance, mutation, natural selection, and recombination
- a computer simulation in which a population of abstract representations (called chromosomes) of candidate solutions (called individuals) to an optimization problem evolves toward better solutions
- over time, those genetic changes which enhance the viability of an organism tend to predominate
## Crossover (Recombination)

- evolution works at the chromosome level through the reproductive process
- portions of the genetic information of each parent are combined to generate the chromosomes of the offspring
- this combination is termed crossover
## Mutation

- in addition to crossover, a random change may occur in the chromosome pattern of an individual, termed a mutation
- can lead to individuals whose chromosome patterns differ significantly from those of their parents
- similar to the effects of radiation or disease
- a high mutation rate leads to nonconvergence and instability
## Genetic Algorithms

- the evolution starts from a population of completely random individuals and proceeds in generations
- in each generation:
  - multiple individuals are stochastically selected from the current population, biased by the relative "fitness" of reproducing members
  - the selected individuals are modified (mutated or recombined) to form a new population, which becomes the current population in the next iteration of the algorithm
## Sequential Genetic Algorithm
```
generation_no = 0;
initialize Population(generation_no);
evaluate Population(generation_no);
set termination_condition to False;
while (not termination_condition) {
    generation_no++;
    select Parents(generation_no) from Population(generation_no - 1);
    apply crossover to Parents(generation_no) to get Offspring(generation_no);
    apply mutation to Offspring(generation_no) to get Population(generation_no);
    evaluate Population(generation_no);
    update termination_condition;
}
```
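The loop above maps directly onto Python. A minimal sketch, assuming binary-string chromosomes, binary tournament selection, single-point crossover, and per-bit mutation (the pseudocode itself does not fix these operator choices):

```python
import random

def run_ga(fitness, n_bits, pop_size=50, mutation_rate=0.01, generations=100):
    """Minimal sequential GA over fixed-length bit strings (pop_size even)."""
    # initialize Population(0): random bit strings
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # select Parents: binary tournament, one per parent slot
        def tournament():
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        parents = [tournament() for _ in range(pop_size)]
        # apply crossover: single-point, on consecutive parent pairs
        offspring = []
        for i in range(0, pop_size, 2):
            cut = random.randint(1, n_bits - 1)
            offspring.append(parents[i][:cut] + parents[i + 1][cut:])
            offspring.append(parents[i + 1][:cut] + parents[i][cut:])
        # apply mutation: flip each bit independently
        pop = [[b ^ 1 if random.random() < mutation_rate else b for b in c]
               for c in offspring]
    return max(pop, key=fitness)

random.seed(42)  # reproducible run
best = run_ga(lambda c: sum(c), n_bits=20)  # toy "OneMax" fitness: count of 1-bits
```

Here a fixed iteration count stands in for the termination condition; any of the conditions discussed later could replace it.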
## Initial Population

- solution "chromosomes" are usually represented by 0's and 1's (binary strings)
- initially, several chromosomes are generated to form the first-generation pool
  - may be totally random, or seeded with "hints" to form an initial pool of possible solutions
- enough individuals so as not to restrict the solution, usually 20 to 1000
- e.g. find the maximum of

    f(x, y, z) = -x² + 1000000x - y² - 40000y - z²
## Finding the Maximum of a Function

- the i-th potential solution consists of a 3-tuple (xᵢ, yᵢ, zᵢ) in which each component is an integer in the interval -1,000,000 <= component <= +1,000,000
- 21 bits represent each component (2²¹ = 2,097,152 covers the 2,000,001 values in the interval)
- the components are concatenated, so a single solution has 63 bits
- a random number generator produces the initial population
- the equation itself is the fitness function:

    f(x, y, z) = -x² + 1000000x - y² - 40000y - z²
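A possible concrete encoding: the slides specify 21 bits per component but not the exact bit-to-integer mapping, so the offset decoding below is an assumption, with the small overshoot past +1,000,000 clamped back into the interval.

```python
def decode(chrom):
    """Split a 63-bit chromosome into three 21-bit fields and map each
    unsigned field value v to the integer v - 1_000_000 (an assumed
    decoding; values past +1_000_000 are clamped into the interval)."""
    vals = []
    for i in range(3):
        field = chrom[21 * i: 21 * (i + 1)]
        v = int("".join(map(str, field)), 2) - 1_000_000
        vals.append(min(v, 1_000_000))
    return tuple(vals)

def fitness(chrom):
    """The objective function itself serves as the fitness function."""
    x, y, z = decode(chrom)
    return -x * x + 1_000_000 * x - y * y - 40_000 * y - z * z
```

Under this decoding, the 63-bit chromosome built from three copies of the 21-bit pattern for 1,000,000 encodes the point (0, 0, 0), whose fitness is 0.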
## Population

- how many individuals?
  - a small number decreases the likelihood of finding a good solution
  - a large number increases the computation required
- usually 20 to 1000
## Selection Process

- in each successive generation, each individual is evaluated with the fitness function
- the next generation's pool of organisms is generated using any or all of the genetic operators: selection, crossover (or recombination), and mutation
- a pair of organisms is selected for breeding
- selection is biased toward individuals with greater fitness (selective pressure)
- it is usually not so biased that poorer elements have no chance to participate, in order to prevent the solution set from converging too early to a sub-optimal or local solution
- several well-defined organism selection methods exist, including roulette wheel selection and tournament selection
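Roulette wheel selection can be sketched as follows. This is a minimal version; the shift applied to negative fitness values is an added assumption, since raw proportions only make sense for non-negative scores:

```python
import random

def roulette_select(pop, fitness, k):
    """Fitness-proportionate (roulette wheel) selection of k individuals."""
    scores = [fitness(ind) for ind in pop]
    low = min(scores)
    if low < 0:                 # shift so every wheel slice is non-negative
        scores = [s - low for s in scores]
    if sum(scores) == 0:        # all-equal fitness: fall back to uniform
        return random.choices(pop, k=k)
    # each individual's slice of the wheel is proportional to its score
    return random.choices(pop, weights=scores, k=k)
```

Fitter individuals get proportionally larger slices of the wheel, but weaker ones with nonzero score still have a chance, which preserves diversity.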
## Selection Process

- tournament selection:
  - each individual is equally likely to be selected to participate in a tournament
  - a set of k individuals is selected for each tournament and evaluated
  - the winner of each tournament is the fittest and becomes a parent
  - n tournaments are conducted to produce n parents
  - when k is 1 there is no selective pressure; when k is large the selective pressure is large
  - k = 2 is a typical value (a medieval joust)
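The tournament procedure above renders directly into Python (a sketch; `fitness` is whatever evaluation function the problem defines):

```python
import random

def tournament_select(pop, fitness, n_parents, k=2):
    """Run n_parents k-way tournaments; k=2 is the typical value."""
    parents = []
    for _ in range(n_parents):
        contestants = random.sample(pop, k)   # uniform draw, no bias
        parents.append(max(contestants, key=fitness))
    return parents
```

With k equal to the population size, every tournament returns the single fittest individual (maximum selective pressure); with k = 1, selection is uniformly random (no pressure).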
## Crossover Methods

- single-point crossover
  - a randomly located cut is made at the p-th bit of each parent and crossover occurs
  - produces 2 different offspring
- multi-point crossover
  - more cuts, greater intermingling
- uniform crossover
  - each bit is randomly selected from either parent
  - random selection from a gene pool of parents
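The single-point and uniform variants can be sketched as follows, assuming equal-length list chromosomes:

```python
import random

def single_point_crossover(p1, p2):
    """One random cut; swap the tails to produce two different offspring."""
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def uniform_crossover(p1, p2):
    """Each bit independently taken from either parent."""
    c1, c2 = [], []
    for a, b in zip(p1, p2):
        if random.random() < 0.5:
            a, b = b, a                 # swap this position between children
        c1.append(a)
        c2.append(b)
    return c1, c2
```

In both variants every parental bit ends up in exactly one of the two offspring, so no genetic material is lost in the pair.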
## Mutation Methods

- typical genetic algorithms have a fixed, very small probability of mutation, perhaps 0.01 or less
- a random number between 0 and 1 is generated; if it falls within the mutation range, the new child organism's chromosome is randomly mutated in some way, typically by simply randomly altering bits in the chromosome data structure
- beneficial mutations will persist; detrimental ones will die out
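A per-bit rendering of the mutation step (one common variant; the slides describe a single per-child random test, and applying the test independently to each bit is an equally common choice):

```python
import random

def mutate(chrom, rate=0.01):
    """Flip each bit independently with probability `rate` (a per-bit
    variant of the per-child mutation test described above)."""
    return [bit ^ 1 if random.random() < rate else bit for bit in chrom]
```

At rate 0 the chromosome passes through unchanged; at rate 1 every bit is flipped, which illustrates why high mutation rates destroy convergence.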
## Genetic Algorithms: Variations

- allow some of the better organisms from the first generation to carry over to the second, unaltered; this form of genetic algorithm is known as an elite selection strategy
- randomly create a few new individuals in each generation, rather than just at the beginning
- allow the population size to vary from one generation to the next
## Termination Conditions

- a fixed number of iterations
- a measure of the degree of improvement between successive generations
  - may suffer from oscillating solutions
- a measure of the degree of similarity between individuals in a population
  - similarity will increase as the population converges
- a combination of a few termination conditions
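The similarity-based condition can be sketched by measuring how uniform the population has become. This is a simple proxy; other similarity measures (e.g. average pairwise Hamming distance) work equally well:

```python
def converged(pop, threshold=0.95):
    """True when at least `threshold` of the population shares the same
    chromosome -- a simple uniformity proxy for convergence."""
    counts = {}
    for chrom in pop:
        key = tuple(chrom)                    # lists aren't hashable; tuples are
        counts[key] = counts.get(key, 0) + 1
    return max(counts.values()) / len(pop) >= threshold
```

Combining this with an iteration cap avoids both oscillation (improvement-based tests) and unbounded runtime.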
## Genetic Algorithms: Strengths

- do an excellent job of searching through a large and complex search space
- most effective in a search space for which little is known:
  - a very large set of candidate solutions
  - a search space that is uneven and has many hills and valleys
## Parallel Genetic Algorithms

- genetic algorithms are extremely easy to parallelize
- three different models for parallel genetic algorithms:
  - the global model
  - the diffusion model
  - the migration model
## Parallel Genetic Algorithms: Global Model

- self-scheduling farmer/worker architecture
  - one node, the farmer, is responsible for recombination and mutation
  - the other nodes, the workers, are responsible for fitness evaluation
- both synchronous and asynchronous algorithms are possible
  - an asynchronous algorithm changes the search dynamics
## Parallel Genetic Algorithms: Diffusion Model

- the diffusion model handles every individual separately and selects the mating partner from a local neighbourhood, similar to local selection
- thus, a diffusion of information through the population takes place
- during the search, virtual islands will evolve
## Migration Model

- the migration model divides the population into multiple isolated subpopulations; these subpopulations evolve independently from each other for a certain number of generations (the isolation time)
- each processor:
  - independently generates its own initial subpopulation of individuals
  - carries out n generations
  - after every k generations, distributes a number of individuals between the subpopulations (migration)
## Migration Model

- the migration of individuals from one population to another is controlled by several parameters:
  - the topology that defines the connections between subpopulations
  - the number of exchanged individuals (the migration rate)
  - the migration scheme: which individuals migrate and which are replaced
  - the migration interval
- the migration model is the most popular method
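A single-process sketch of one migration step under a ring (stepping stone) topology, with "best individuals emigrate, worst are replaced" as the assumed migration scheme; topology, rate, and scheme are exactly the parameters listed above:

```python
def migrate_ring(islands, n_migrants, fitness):
    """One migration step: each island sends copies of its n_migrants
    fittest individuals to the next island in the ring, where they
    replace the least-fit residents."""
    emigrants = [sorted(isl, key=fitness, reverse=True)[:n_migrants]
                 for isl in islands]
    for i, isl in enumerate(islands):
        incoming = emigrants[(i - 1) % len(islands)]    # from the left neighbour
        isl.sort(key=fitness)                           # least fit first
        isl[:n_migrants] = [list(m) for m in incoming]  # copy, don't share
    return islands
```

In a true parallel implementation each island runs on its own processor and `incoming` arrives as a message from the neighbouring processor every k generations.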
## Isolated Subpopulations

- migration models:
  - Island model
    - individuals are allowed to be sent to any other subpopulation; no restrictions on migration
    - a better model of nature in some ways, at the cost of more communication and delay
## Isolated Subpopulations

- migration models:
  - Stepping stone model / ring model
    - reduces communication by restricting migration to neighbours (neighbourhood migration)
## Parallel Pseudocode for a Worker Process
```
generation_no = 0;
initialize Population(generation_no);
evaluate Population(generation_no);
set termination_condition to False;
while (not termination_condition) {
    generation_no++;
    select Parents(generation_no) from Population(generation_no - 1);
    apply crossover to Parents(generation_no) to get Offspring(generation_no);
    apply mutation to Offspring(generation_no) to get Population(generation_no);
    apply migration to Population(generation_no);
    evaluate Population(generation_no);
    update termination_condition;
}
```
## Parallel Algorithms

- isolated populations
  - embarrassingly parallel except for the migrations
  - the quality of the solution differs from the serial version: this is a new class of algorithm that searches the solution space differently
  - the relatively isolated subpopulations may result in missing the global solution
  - permits speciation
## Parallel Algorithms

- parallelizing a common population
  - each processor works on a subset of the population
  - however, the ratio of computation to communication is not likely to be favourable
  - may be acceptable on shared-memory systems
  - the quality of the solution is unaffected
