
JACIII Vol.5 No.5 pp. 300-305
doi: 10.20965/jaciii.2001.p0300
(2001)

Paper:

Results of Bias-variance Tests on Multi-layer Perceptron Neural Networks

Wimpie D. Nortje*, Johann E. W. Holm**, Gerhard P. Hancke***, Imre J. Rudas****, and Laszlo Horvath*****

*,**,***Department of Electrical, Electronic and Computer Engineering, University of Pretoria, Pretoria, South Africa

****,*****Budapest Polytechnic, Népszínház u. 8, Budapest, H-1081 Hungary

Received:
April 1, 2001
Accepted:
June 1, 2001
Published:
September 20, 2001
Keywords:
multi-layer perceptron, neural network, bias-variance test, multiple network, Bayesian network
Abstract
Training a neural network involves selecting a set of network parameters, or weights, so as to fit a non-linear model to data. Because of bias in the training data and small computational errors, the trained networks' outputs are biased. Some improvement is possible when multiple networks are used to perform the classification: this approach amounts to averaging a number of biased opinions in order to remove some of the bias introduced by training. Bayesian networks are effective in removing some of the bias associated with training, but Bayesian techniques are costly in terms of computational time. For this reason, alternatives to Bayesian networks are investigated.
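As a minimal sketch (not taken from the paper), the following Python example illustrates the ensemble-averaging idea described in the abstract, assuming scikit-learn's MLPClassifier and a synthetic dataset: several independently trained multi-layer perceptrons combine their opinions by averaging class probabilities.

# A minimal sketch (not the authors' code): average the predictions of
# several independently trained multi-layer perceptrons, a simple
# stand-in for the multiple-network averaging discussed in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic data purely for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train several MLPs that differ only in their random initialisation.
ensemble = [
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=500,
                  random_state=seed).fit(X_train, y_train)
    for seed in range(5)
]

# Averaging the class probabilities reduces the variance component of the
# error relative to any single trained network.
mean_proba = np.mean([net.predict_proba(X_test) for net in ensemble], axis=0)
ensemble_pred = mean_proba.argmax(axis=1)
print("single-net accuracy:", ensemble[0].score(X_test, y_test))
print("ensemble accuracy:  ", (ensemble_pred == y_test).mean())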
Cite this article as:
W. Nortje, J. Holm, G. Hancke, I. Rudas, and L. Horvath, “Results of Bias-variance Tests on Multi-layer Perceptron Neural Networks,” J. Adv. Comput. Intell. Intell. Inform., Vol.5 No.5, pp. 300-305, 2001.