



             Introduction to Evolutionary Computation and Evolutionary
                                Computation Module
                                      Tutorial 2

                                  V. Landassuri-Moreno
                               v.landassuri-moreno@cs.bham.ac.uk


                                    School of Computer Science
                                     University of Birmingham



                                     October 23, 2009








 Outline



 Questions from Exercise 02



 GA Examples - Web page



 Summary






 Q1. Random number in a given range




  Here is the formula (code) you can use to generate a random number between two values:
      ◮ Given two numbers A and B,
      ◮ where A < B,
      ◮ you can use:
  randRange = A + (random ∗ (B − A))
      ◮ Is the interval generated for randRange inclusive or exclusive of A and B, i.e. (A, B) or [A, B]? (A minimal sketch follows below.)
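
  A minimal sketch of the formula above in C, assuming the standard-library rand() as the uniform source; whether the endpoints A and B can actually be returned depends on whether that source can produce exactly 0 or 1.

      #include <stdlib.h>

      /* Sketch: random double between A and B (A < B), using rand() as   */
      /* the uniform source. rand()/RAND_MAX lies in [0, 1], so the       */
      /* result lies between A and B.                                      */
      double rand_range(double A, double B)
      {
          double r = (double)rand() / RAND_MAX;  /* uniform random in [0, 1] */
          return A + r * (B - A);                /* randRange = A + random*(B-A) */
      }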






 Q2. General questions / EAs comparisons




      ◮ What to measure to compare two algorithms / implementations
          ◮ Normally you focus on the fitness / error.
          ◮ Note that the fitness could be different from the error.
          ◮ There are different metrics you can use (see the sketch after this list):
              1. MSE
              2. RMSE
              3. NRMSE
              4. ...
          ◮ The measure you use will depend on the problem at hand, e.g. for Artificial Neural Networks (ANNs) you can measure:
              1. Error of the best individual: normally RMSE or NRMSE
              2. Average error over the entire population
              3. Average number of inputs, hidden nodes and connections (in case you are evolving architectures)
              4. ...
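
  As a minimal sketch (assuming the error is computed between network outputs and target values), MSE and RMSE can be implemented as follows; the exact NRMSE normalisation (by range, mean or standard deviation of the targets) varies between works.

      #include <math.h>

      /* Sketch: mean squared error and root mean squared error between  */
      /* predicted outputs and target values over n samples.             */
      double mse(const double *pred, const double *target, int n)
      {
          double sum = 0.0;
          for (int i = 0; i < n; i++) {
              double d = pred[i] - target[i];
              sum += d * d;
          }
          return sum / n;
      }

      double rmse(const double *pred, const double *target, int n)
      {
          return sqrt(mse(pred, target, n));
      }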




 Q2. Example

  [Figure: ANN testing two values (A and B) of the same parameter. Panels: (a) average NRMS, (b) average inputs, (c) average hidden nodes, (d) average connections; each plotted against generations (0-300).]


 Q2.



      ◮ Only one run
          ◮ Given the nature of EAs, you cannot run your algorithm only once and compare it against another.
          ◮ Every time you run it, you will probably obtain different values (fortunately, close values).
          ◮ For that reason you need to run your algorithm more than once and perform some statistics to compare algorithms, using the average values over many independent runs.
      ◮ Many runs
          ◮ You can find some works in the literature that use 90 or more runs to test their algorithms.
          ◮ But that number will depend on the complexity of your problem and the time you need to solve it.
          ◮ A common value could be 30 independent runs.
          ◮ After your algorithm finishes and you obtain the error per run over all generations, you can use the standard deviation, min and max values as another point of view to analyse your results (a sketch follows below).
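
  A minimal sketch of those statistics over, say, 30 independent runs, assuming best_err[] holds the best error of each run:

      #include <math.h>
      #include <float.h>

      /* Sketch: mean, sample standard deviation, min and max of the best */
      /* error obtained in each of n_runs independent runs.               */
      void run_statistics(const double *best_err, int n_runs,
                          double *mean, double *std, double *min, double *max)
      {
          double sum = 0.0, sq = 0.0;
          *min = DBL_MAX;
          *max = -DBL_MAX;
          for (int i = 0; i < n_runs; i++) {
              sum += best_err[i];
              if (best_err[i] < *min) *min = best_err[i];
              if (best_err[i] > *max) *max = best_err[i];
          }
          *mean = sum / n_runs;
          for (int i = 0; i < n_runs; i++)
              sq += (best_err[i] - *mean) * (best_err[i] - *mean);
          *std = sqrt(sq / (n_runs - 1));   /* sample standard deviation */
      }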






 Q2. Example




  Table: Average, Std, Min and Max values of the best individual from 30 independent runs for a
  given ANN solving the X problem. Values obtained after 300 generations of evolution.

       Prob X                      Mean          Std            Min        Max
       Number of Connections       200.3666667   46.30072863    121        285
       Number of Inputs            12.7          1.841101619    9          16
       Number of Hidden Nodes      13.96666667   2.235810943    10         18
       Error Training Set          14.56171314   55.46580903    6.60E-05   231.286
       Error Test Set Inside EA    0.570126023   0.344171878    8.90E-02   1.25983
       Error Final Test Set        0.615407433   0.176054045    0.305909   0.926824






 Q3. Elitism




      ◮ It is useful to maintain a subpopulation of the best individuals.
      ◮ It takes the best individuals and passes them directly to the next generation without any modification (see the sketch below).
      ◮ If it is not used, you can lose the best individual found in a given generation.
      ◮ It may cause the population to get stuck in a local minimum or maximum.
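
  A minimal sketch of elitism, assuming the population is already sorted by fitness (best first) and a simple fixed-length binary representation; adapt the Individual struct to your own encoding.

      #define CHROM_LEN 20
      typedef struct { int genes[CHROM_LEN]; double fitness; } Individual;

      /* Sketch: copy the n_elite best individuals unchanged into the new */
      /* population; the remaining slots are filled by selection,         */
      /* crossover and mutation as usual.                                 */
      void apply_elitism(const Individual *old_pop, Individual *new_pop, int n_elite)
      {
          for (int i = 0; i < n_elite; i++)
              new_pop[i] = old_pop[i];   /* passed on without modification */
      }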






 Q4. Code for fitness function f(x1, x2) = x1² + x2
      ◮ From Tutorial 1 we know that the minimum is located where the function has a value of zero, i.e. f(x1, x2) = 0.
      ◮ Given any values of x1 and x2 in every generation, the algorithm could give results similar to the following for the best individual.
      ◮   Gen 0 -> 59
      ◮   Gen 1 -> 25
      ◮   Gen 2 -> 20
      ◮   ...
  But if you want to calculate the probability of selecting one individual, you need a
  value that gives more weight to the best individuals. Thus, one way is to use the inverse of
  the fitness:

    /* f(x1, x2) = x1^2 + x2 for individual i: gene j holds x1, gene j+1 holds x2 */
    fitness[i] = (fitness_Pop[i][j] * fitness_Pop[i][j]) + fitness_Pop[i][j+1];
    /* inverse fitness: smaller errors receive larger weights */
    fitness_inv[i] = 1.0/fitness[i];


  where i indexes the individual, and positions j and j+1 of its chromosome hold the variables x1 and x2, respectively.
  Note: this code only handles minimisation; check/modify it as required for negative (or zero) fitness values.


 Example - minimisation

                              Table: Roulette-wheel selection with probability = 1 − probability

                      Individual   Fitness   Roulette-wheel probability   1 − roulette
                      1            50        0.5                          0.5
                      2            30        0.3                          0.7
                      3            10        0.1                          0.9
                      4            5         0.05                         0.95
                      5            5         0.05                         0.95
                      sum          100       1                            4

                              Table: Roulette-wheel selection with probability derived from 1/fitness

                      Individual   Fitness   1/fitness   Roulette-wheel probability
                      1            50        0.02        0.0361
                      2            30        0.0333      0.0602
                      3            10        0.1         0.1807
                      4            5         0.2         0.3615
                      5            5         0.2         0.3615
                      sum          100       0.5533      ≈ 1
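
  A minimal sketch that reproduces the second table: the selection probability of each individual is its inverse fitness divided by the sum of all inverse fitness values.

      #include <stdio.h>

      /* Sketch: selection probabilities from inverse fitness (minimisation). */
      int main(void)
      {
          double fitness[5] = { 50, 30, 10, 5, 5 };
          double inv[5], sum_inv = 0.0;
          for (int i = 0; i < 5; i++) {
              inv[i] = 1.0 / fitness[i];
              sum_inv += inv[i];
          }
          for (int i = 0; i < 5; i++)
              printf("individual %d: p = %.4f\n", i + 1, inv[i] / sum_inv);
          /* prints values matching the table above (up to rounding) */
          return 0;
      }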



 Q5. Cauchy mutation in binary space?




      ◮ To create a binary individual at random, an 'if' condition is needed.
      ◮ If a random number is under 0.5, we put a '0', else a '1' (see the sketch below).
      ◮ Cauchy random numbers were developed for real-valued representations; they generate random numbers with no limits (in theory).
      ◮ So, could it be useful to use a Cauchy random number and an 'if' condition to generate a random binary string?
      ◮ Alternatively, first generate a Cauchy random number and then convert it to a binary string. What could happen? Have you implemented this idea?
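
  As a minimal sketch, the uniform-plus-threshold approach from the first two bullets looks like this in C (the Cauchy variants from the last two bullets are left as the exercise suggests):

      #include <stdlib.h>

      /* Sketch: fill a chromosome with random bits, putting a '0' when a */
      /* uniform random number is under 0.5 and a '1' otherwise.          */
      void random_binary_string(int *bits, int length)
      {
          for (int i = 0; i < length; i++) {
              double r = (double)rand() / RAND_MAX;  /* uniform in [0, 1] */
              bits[i] = (r < 0.5) ? 0 : 1;
          }
      }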






 Q6. Cauchy mutation




      ◮ The main point was to implement Normal and Cauchy random number generators. The
        algorithms presented here were taken from [Sau].
      ◮ Cauchy
          ◮ Algorithm:
            (1) Generate a uniform random number U ~ U(−1/2, 1/2)
            (2) Return X = a + b · tan(πU)
            where a = 0 and b = 1/2 (value of b given during lectures)
      ◮ Normal (Gaussian)
          ◮ Algorithm:
            (1) Independently generate U1 ~ U(−1, 1) and U2 ~ U(−1, 1)
            (2) Set U = U1² + U2²
            (3) If U < 1, return X = µ + σ · U1 · √(−2 ln U / U); otherwise, go back to step 1
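
  A minimal sketch of both generators in C, following the two algorithms above; rand() is used as the underlying uniform source.

      #include <math.h>
      #include <stdlib.h>

      #define PI 3.14159265358979323846

      /* uniform random double in (lo, hi), using rand() as the source */
      static double uniform(double lo, double hi)
      {
          return lo + (hi - lo) * ((double)rand() / RAND_MAX);
      }

      /* Cauchy deviate: X = a + b * tan(pi * U), with U ~ U(-1/2, 1/2) */
      double cauchy_rand(double a, double b)
      {
          return a + b * tan(PI * uniform(-0.5, 0.5));
      }

      /* Normal deviate via the polar method above (mean mu, std sigma) */
      double normal_rand(double mu, double sigma)
      {
          double u1, u2, u;
          do {
              u1 = uniform(-1.0, 1.0);
              u2 = uniform(-1.0, 1.0);
              u  = u1 * u1 + u2 * u2;
          } while (u >= 1.0 || u == 0.0);  /* reject points outside the unit circle */
          return mu + sigma * u1 * sqrt(-2.0 * log(u) / u);
      }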






 Q7. Percentage of Crossover and mutation



  Is there a standard value?
      ◮ The usual percentage values that you can find in publications are 70% for crossover and 30% for mutation.
      ◮ Nevertheless, in the end you need to choose the values that best suit your requirements (algorithm).
  What happens if crossover is set to 100% and mutation to 0%?
      ◮ If the algorithm reaches a local optimum, it will get stuck there indefinitely.
  What happens if mutation is set to 100% and crossover to 0%?
      ◮ In this case, the algorithm pays no attention to whether good individuals are selected. There is no bias from parents to offspring driving the evolution; consequently, you are performing a random search (a sketch of how both rates are commonly applied follows below).
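
  As a minimal sketch of one common way these rates are applied (there are variants): crossover is attempted with probability pc per pair of parents, and each gene of the offspring is then mutated with probability pm.

      #include <stdlib.h>

      /* Sketch: build one binary offspring from two parents using the    */
      /* crossover rate pc and the per-gene mutation rate pm.             */
      void reproduce(const int *p1, const int *p2, int *child, int length,
                     double pc, double pm)
      {
          if ((double)rand() / RAND_MAX < pc) {
              int cut = rand() % length;            /* one-point crossover */
              for (int i = 0; i < length; i++)
                  child[i] = (i < cut) ? p1[i] : p2[i];
          } else {
              for (int i = 0; i < length; i++)      /* otherwise copy parent 1 */
                  child[i] = p1[i];
          }
          for (int i = 0; i < length; i++)
              if ((double)rand() / RAND_MAX < pm)
                  child[i] = 1 - child[i];          /* flip gene i (binary case) */
      }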






 GA Examples - Web page




  In this section some graphical examples (from a web page) are presented that let us see what happens when we run a GA.
     ◮ The evolution of individuals over an objective function can be an abstract concept.
     ◮ However, it is possible to understand the behaviour of our algorithm and the evolution of individuals if we plot the function to optimise (only for simple functions) and the individuals in a graphical interface.
     ◮ In this case [Obi] presents some applets that let us plot the objective function and the evolution of a population with a generic GA.






 Basic Description of GA




     ◮ A basic description of a GA can be found on the web page:
       http://www.obitko.com/tutorials/genetic-algorithms/ga-basic-description.php
     ◮ There, a simple function is used to see the evolution of several individuals while they try to reach the minimum of the function.
     ◮ Note that you can run it step by step.






 GA Example 1D function




     ◮ In this example,
       http://www.obitko.com/tutorials/genetic-algorithms/example-function-minimum.php,
       it is possible to see:
         ◮ The parents chosen
         ◮ One-point crossover
         ◮ How the offspring is mutated
         ◮ The fitness of the new individual
         ◮ How the offspring are stored
         ◮ Elitism






 Crossover and mutation probabilities




     ◮ In this section it is possible to run an applet with different probability values for crossover and mutation, as well as to activate an option for whether the algorithm uses elitism or not.
       Web page:
       http://www.obitko.com/tutorials/genetic-algorithms/parameters.php
     ◮ The plot at the bottom presents the best solution with a red line and the average fitness with a blue line.
     ◮ Question 7 showed what happens if you create the whole new population with crossover or with mutation; here you can play with the values and corroborate that.






 GA Example 2D function




     ◮ In this applet you can modify the objective function and see the results in a graphical interface.
     ◮ You can play with the crossover and mutation rates.
     ◮ Change the elitism (on/off) or maximise/minimise.
       Web page:
       http://www.obitko.com/tutorials/genetic-algorithms/example-3d-function.php






 Traveling salesman problem (TSP)




     ◮ The TSP is presented in this section, allowing you to add nodes to the existing solution.
     ◮ Different crossover and mutation operators are available.
     ◮ Web page:
       http://www.obitko.com/tutorials/genetic-algorithms/tsp-example.php








  Summary
     ◮ We reviewed the questions from Exercise 2.
     ◮ The evolution of individuals in a graphical environment has been presented using the applets from
       http://www.obitko.com/tutorials/genetic-algorithms/index.php.
     ◮ You can play with the values and functions to see different scenarios and behaviours.
     ◮ Some important remarks were stated that you need to take into account when you run your EAs, e.g. plot the evolution of fitness and of parameters per generation, such as the average crossover and mutation rates (these are optional).








  References
  [Obi] Marek Obitko. Introduction to Genetic Algorithms.
        http://www.obitko.com/tutorials/genetic-algorithms/index.php. Accessed on 10/10/2009.

  [Sau] Richard Saucier. Computer Generation of Statistical Distributions.
        http://ftp.arl.mil/random/. Accessed on 10/10/2009.



