Evolutionary and Heuristic Approaches for the Selection of Neural Network Architectures and Parameters
Tim Whitfort, Chris Matthews, Belinda Choi and John McCullagh
Division of Information Technology, La Trobe University P.O. Box 199, Bendigo, Victoria, 3552, Australia
Received: April 12, 1998 / Accepted: August 29, 1998 / Published: December 20, 1998
Keywords:Backpropagation neural networks, Genetic algorithms, Network architecture, Network parameters, Classification
Genetic algorithms (GAs) and neural networks have been combined in various ways in an effort to develop powerful tools for problem solving. This work presents a two-stage process for the specification of high-performing backpropagation neural networks for four commonly used real-world databases. In the first stage, GAs are used to evolve a set of potential neural network architectures and operating parameters. The best networks are then trained and tested further to determine an optimal starting seed and number of training passes. We also compare the evolved network architectures and operating parameters from stage I with some of the backpropagation neural network (BPNN) design heuristics previously reported in the literature. In order to test and compare networks specified in this way, an exhaustive set of experiments using the machine learning induction algorithm C4.5 was also carried out on the four databases. These results, together with some previously published benchmark data and BPNN results, were then compared with those obtained from the networks developed using our method. Our results compare more than favorably with the C4.5, benchmark, and previously published BPNN results across the four databases.
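The first stage described above can be sketched as a standard genetic algorithm searching over network hyperparameters. The paper does not give its encoding, operators, or fitness measure, so everything below is an illustrative assumption: each individual is a (hidden-units, learning-rate) pair, and a toy fitness function stands in for what would, in the paper's method, be the validation performance of a trained backpropagation network.

```python
# Minimal sketch of stage I, under stated assumptions: a GA evolving
# BPNN hyperparameters (hidden-layer size and learning rate) with
# truncation selection, uniform crossover, and per-gene mutation.
import random

random.seed(0)

def fitness(hidden, lr):
    # HYPOTHETICAL stand-in for the paper's fitness: in practice this
    # would be the accuracy of a backpropagation net trained with these
    # settings. Here the optimum is placed at hidden=12, lr=0.3.
    return -abs(hidden - 12) - 50 * abs(lr - 0.3)

def mutate(ind):
    hidden, lr = ind
    if random.random() < 0.3:  # perturb hidden-layer size
        hidden = max(1, hidden + random.choice([-2, -1, 1, 2]))
    if random.random() < 0.3:  # perturb learning rate, clipped to range
        lr = min(1.0, max(0.01, lr + random.uniform(-0.1, 0.1)))
    return (hidden, lr)

def crossover(a, b):
    # Uniform crossover: each gene taken from either parent.
    return (random.choice([a[0], b[0]]), random.choice([a[1], b[1]]))

def evolve(generations=30, pop_size=20):
    pop = [(random.randint(1, 40), random.uniform(0.01, 1.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind), reverse=True)
        elite = pop[:pop_size // 2]  # truncation selection: keep top half
        children = [mutate(crossover(random.choice(elite),
                                     random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=lambda ind: fitness(*ind))

best = evolve()
print(best)
```

The paper's second stage, tuning the starting seed and number of training passes for the best evolved networks, would amount to a further grid search around the winner returned here.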
Cite this article as: T. Whitfort, C. Matthews, B. Choi, and J. McCullagh, “Evolutionary and Heuristic Approaches for the Selection of Neural Network Architectures and Parameters,” J. Adv. Comput. Intell. Intell. Inform., Vol.2 No.6, pp. 214-220, 1998.