Neural Networks Critical View
Shah Muhammad Butt MS CS Network & Communication FUUAST, Islamabad
Using Genetic Algorithms with Neural Networks
This essay is a very basic introduction to using genetic algorithms to help train neural networks. I also wrote a simple program to accompany the essay; it uses a genetic algorithm to evolve an XOR network.

Evolving Weights

This is the most common use of a genetic algorithm in conjunction with a neural network. Since genetic algorithms are excellent at searching a state space, searching for neural network weights is an ideal application: simply set up the genetic algorithm to evolve a string of floating-point numbers (within a range that you specify) to be used as the network's weights.

The biggest difficulty with using a GA is specifying the range of the weights. Since we generally don't know the range, we have to estimate it and refine it by trial and error. Network weights should generally not be too large; for example, the XOR net weights never exceed 3 in absolute value.

Another problem with genetic algorithms and neural networks is finding an appropriate method of reproducing and crossing over the weights. It all depends on how you have set your weights up. Returning to the XOR example, the network is small and the weights are easily represented by a 3x3 array:
m_fWeights:
  0 0 0
  0 0 0
  0 0 0
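The weight-evolution scheme above can be sketched as follows. This is not the XORGA code: the 2-2-1 sigmoid layout, the function names, and the fitness definition are my assumptions; only the 3x3 weight array and the roughly +/-3 range come from the essay.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def run_network(w, x0, x1):
    # w is a 3x3 array (assumed layout): rows 0 and 1 each hold a hidden
    # neuron's two input weights plus a bias; row 2 holds the output
    # neuron's two hidden-layer weights plus a bias.
    h0 = sigmoid(w[0][0] * x0 + w[0][1] * x1 + w[0][2])
    h1 = sigmoid(w[1][0] * x0 + w[1][1] * x1 + w[1][2])
    return sigmoid(w[2][0] * h0 + w[2][1] * h1 + w[2][2])

def random_weights(rng=3.0):
    # One genome: a string of floats within the range you specify.
    return [[random.uniform(-rng, rng) for _ in range(3)] for _ in range(3)]

def error(w):
    # Fitness is the total absolute error over the XOR truth table
    # (lower is better); the GA ranks the population by this value.
    cases = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    return sum(abs(run_network(w, a, b) - t) for (a, b), t in cases)
```

A GA would repeatedly evaluate `error` over a population of such genomes and breed the lower-error half.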
For this kind of setup, I swap over groups of weights. In the XORGA program, I select two population members, one from the lower-error (LE) half and one from the higher-error (HE) half, give the HE member the LE member's final layer of weights (the last row of the array above), and put it back into the population. If your neural network is more complicated, your representation of the weights will not be as simple. However you decide to do it, keep related weights grouped together; otherwise the resulting weights are as good as random. Mutation is another genetic operator you should consider. In the example program there is a fairly high (10%) chance of mutation, and a mutated weight is altered by a value between -1 and 1.
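The crossover and mutation operators just described might look like this. The function names are mine, but the behavior follows the text: grouped crossover of the final layer, and a 10% per-weight mutation chance with a perturbation in [-1, 1].

```python
import copy
import random

def crossover(low_error, high_error):
    # Grouped crossover as described: the higher-error member receives
    # the lower-error member's final layer (row 2 of the 3x3 array),
    # so related weights stay together.
    child = copy.deepcopy(high_error)
    child[2] = list(low_error[2])
    return child

def mutate(w, rate=0.10):
    # Each weight has a `rate` chance of being altered by a value in [-1, 1].
    for row in w:
        for i in range(len(row)):
            if random.random() < rate:
                row[i] += random.uniform(-1.0, 1.0)
    return w
```

`copy.deepcopy` keeps the parents intact, so the child can be put back into the population without disturbing them.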
When to Use
Genetic algorithms are an option, but they are by no stretch always the best option. Applied to the XOR problem, the genetic algorithm is a lot slower than back-propagation, although it gives better results than the back-propagation example used:
Back-propagation:
  0,0 = 0.0494681
  0,1 = 0.955633
  1,0 = 0.942529
  1,1 = 0.0433488

Genetic Algorithm:
  0 xor 0 = 2.47602e-005
  0 xor 1 = 0.997028
  1 xor 0 = 0.999292
  1 xor 1 = 0.010474
The genetic algorithm finds more accurate results, but back-propagation is close to instantaneous, whereas the genetic algorithm takes anywhere between 5 and 20 seconds (on a 233 MHz test machine).
Evolving the Architecture

Since the overall architecture of the network is critical to its operation, a lot of research has focused on using evolutionary techniques to evolve the best architecture (much like the evolution of our own brains). One simple method is to use a boolean NxN matrix, where N is the number of neurons in the network: a 1 in row X, column Y denotes a connection from neuron Y to neuron X. For example, for the XORNet:
   1 2 3 4 5
1  0 0 0 0 0
2  0 0 0 0 0
3  1 1 0 0 0
4  1 1 0 0 0
5  0 0 1 1 0
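The matrix above can be decoded into a list of connections. I assume row X, column Y means "neuron Y feeds neuron X" (written 0-based here, so neurons 0-1 are the inputs, 2-3 the hidden layer, and 4 the output); the function name is illustrative.

```python
# The 5x5 boolean connectivity matrix for the XORNet, 0-based.
MATRIX = [
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
]

def connections(matrix):
    # Return the (source, destination) pairs encoded by the matrix:
    # entry [dst][src] == 1 means neuron src feeds neuron dst.
    return [(src, dst)
            for dst, row in enumerate(matrix)
            for src, flag in enumerate(row) if flag]
```

A GA would evolve the bit string behind such a matrix, building and scoring a network from each candidate.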
This is a very simple method, and it becomes inefficient for the large networks that architectural optimization is often applied to. For more information, see Melanie Mitchell's book An Introduction to Genetic Algorithms.
These are only two applications of GAs to neural networking; other areas include function minimization, local-minimum avoidance, and other "tweaking" techniques. Remember that any parameter can be evolved by a genetic algorithm, but how much it affects the overall performance of the network varies from parameter to parameter; the weights are obviously the most important. Only use a genetic algorithm when other training methods are inefficient or impractical, or when you feel that genetic algorithms will provide an advantage over other training methods.