Discovery of Neural Network Weight Update Equations Through Genetic Programming
Keywords:
Software Engineering, Machine Learning Algorithms, Genetic Programming

Abstract
Most machine learning algorithms still rely on manually crafted designs, and how to update the weights in a Neural Network (NN) remains an open research question. Genetic programming (GP) is a machine learning technique that can automatically generate functions. It does this by simulating the process of natural evolution: programs are encoded as chromosomes that undergo mutation, crossover, and selection. This process allows GP to find new and innovative algorithms that may outperform those that are manually designed. We apply GP to the task of discovering functions that describe how the weights of a NN can be updated. The proposed method has the potential to discover new equations that enhance the performance of machine learning algorithms, such as an evolved weight change equation replacing the back-propagation update of neural networks. The performance of these evolved algorithms is compared with that of the commonly used handcrafted designs. This research could lead to improvements in the performance and efficiency of machine learning algorithms, contributing to more advanced and accurate models. The GP developed so far has found equations similar to the manually crafted weight change equation (back-propagation) used for a Neural Network's hidden-layer weights, and it has also re-discovered the standard weight change equation for a perceptron. The end goal is to re-discover the complete weight change equation for all layers of a Neural Network and to show that GP is able to discover complex equations.
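For context on the perceptron result mentioned above, the following is a minimal sketch of the standard perceptron weight change equation, Δwᵢ = η(t − y)xᵢ, which the abstract reports the GP re-discovered. The dataset (logical AND), learning rate, and epoch count here are illustrative assumptions, not values taken from the paper.

```python
# Standard perceptron weight change equation: delta_w_i = eta * (target - output) * x_i.
# Logical AND dataset, eta, and epochs are illustrative assumptions.

def predict(weights, bias, x):
    """Step-function perceptron output: 1 if the weighted sum is non-negative."""
    activation = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if activation >= 0 else 0

def train_perceptron(samples, eta=0.1, epochs=20):
    """Train a perceptron using the classic weight change equation."""
    weights = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(weights, bias, x)
            # The weight change equation the GP re-discovered:
            weights = [w + eta * error * xi for w, xi in zip(weights, x)]
            bias += eta * error
    return weights, bias

# Logical AND is linearly separable, so the perceptron rule converges on it.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x) for x, _ in data])  # matches the targets [0, 0, 0, 1]
```

A GP system searching over arithmetic primitives (addition, subtraction, multiplication, and the terminals error, input, and learning rate) can recover exactly this update expression, which is what makes the perceptron rule a useful sanity check before attempting the multi-layer case.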