Tim Whitfort, Chris Matthews, Belinda Choi and John McCullagh
Genetic algorithms (GAs) and neural networks have been combined in various ways in an effort to develop powerful problem-solving tools. This work presents a two-stage process for specifying high-performing backpropagation neural networks (BPNNs) for four commonly used real-world databases. In the first stage, GAs evolve a set of candidate network architectures and operating parameters. The best networks are then trained and tested further to determine an optimal starting seed and number of training passes. We also compare the network architectures and operating parameters evolved in stage I with BPNN design heuristics previously reported in the literature. To test and compare networks specified in this way, an exhaustive set of experiments using the machine learning induction algorithm C4.5 was also carried out on the four databases. These results, together with previously published benchmark data and BPNN results, were then compared with those obtained from the networks developed using our method. Our results compare more than favorably with the C4.5, benchmark, and previously published BPNN results across all four databases.
Keywords: Backpropagation neural networks, Genetic algorithms, Network architecture, Network parameters, Classification
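The stage-I idea of evolving network specifications can be sketched as a small genetic algorithm over BPNN hyperparameters. This is a minimal illustrative sketch, not the paper's actual procedure: the search ranges, GA settings, and the synthetic fitness function (a stand-in for training a BPNN and measuring validation accuracy) are all assumptions.

```python
import random

random.seed(0)

# Candidate BPNN specification: (hidden units, learning rate, momentum).
# These ranges are illustrative assumptions, not the paper's search space.
HIDDEN = list(range(2, 21))
LRATES = [0.01, 0.05, 0.1, 0.2, 0.5]
MOMENTA = [0.0, 0.3, 0.6, 0.9]

def random_spec():
    return (random.choice(HIDDEN), random.choice(LRATES), random.choice(MOMENTA))

def fitness(spec):
    # Stand-in for "train the BPNN with this spec and score it on held-out data".
    # Synthetic score that peaks at 10 hidden units, lr=0.1, momentum=0.6.
    h, lr, m = spec
    return -abs(h - 10) - 5 * abs(lr - 0.1) - abs(m - 0.6)

def evolve(pop_size=20, generations=30):
    pop = [random_spec() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]  # truncation selection: keep the top half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            # Uniform crossover: each gene comes from one of the two parents.
            child = tuple(random.choice(pair) for pair in zip(a, b))
            if random.random() < 0.2:  # mutation: re-draw one gene at random
                genes = list(child)
                i = random.randrange(3)
                genes[i] = random_spec()[i]
                child = tuple(genes)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()  # best-found (hidden units, learning rate, momentum)
```

In stage II, the surviving specifications would each be retrained under different starting seeds and numbers of training passes to pick the final configuration.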