JACIII Vol.9 No.6 pp. 580-589
doi: 10.20965/jaciii.2005.p0580


Improved MLP Learning via Orthogonal Bipolar Target Vectors

Shigueo Nomura*, Keiji Yamanaka**, Osamu Katai*, Hiroshi Kawakami*, and Takayuki Shiose*

*Graduate School of Informatics, Kyoto University, 606-8501 Kyoto, Japan

**Faculty of Electrical Engineering, Federal University of Uberlândia, 38400-902 Uberlândia, Brazil

Received: February 25, 2005
Accepted: June 20, 2005
Published: November 20, 2005
Keywords: Multilayer Perceptron learning, pattern recognition, degraded image, orthogonal bipolar vector, target vector
In this paper, we present an approach to improving "Multilayer Perceptron" (MLP) learning by adopting orthogonal bipolar vectors as expectation values. These vectors differ from conventional target vectors in two main ways. First, since they are larger than conventional ones, they make use of more of the synaptic connections between hidden and output neurons. Second, since they are orthogonal vectors with a bipolar representation, the Euclidean distance between these vectors (in the Euclidean space R^n) increases as their number n of components increases. We present an algorithm for constructing the orthogonal bipolar vectors used as target vectors. These target vectors are used in experiments in which an MLP model is trained by the backpropagation algorithm to classify characters extracted from degraded license plate images. Experimental results are obtained using both orthogonal bipolar and conventional target vectors. Comparison of the results leads to the conclusion that recognition of degraded characters is considerably improved with the proposed expectation values for MLP learning.
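The paper's own construction algorithm is not reproduced on this page. As a minimal illustrative sketch, one standard way to obtain mutually orthogonal bipolar (+1/-1) vectors of the kind the abstract describes is to take rows of a Sylvester-type Hadamard matrix; the function name and this choice of construction are assumptions for illustration, not the authors' method:

```python
import numpy as np

def hadamard_bipolar_targets(num_classes: int) -> np.ndarray:
    """Return `num_classes` mutually orthogonal bipolar (+1/-1) vectors
    as the first rows of a Sylvester-type Hadamard matrix.

    Illustrative stand-in only; the paper presents its own algorithm.
    """
    # The Sylvester construction yields orders that are powers of two,
    # so round the vector length up to the smallest power of two >= num_classes.
    n = 1
    while n < num_classes:
        n *= 2
    # Sylvester recursion: H_{2m} = [[H_m, H_m], [H_m, -H_m]], starting from H_1 = [1].
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H[:num_classes]

# Example: 10 character classes get 16-component bipolar target vectors,
# pairwise orthogonal (dot product of any two distinct rows is 0).
targets = hadamard_bipolar_targets(10)
```

Note how the vector length grows with the number of classes, consistent with the abstract's point that larger target vectors engage more connections between the hidden and output layers.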
Cite this article as:
S. Nomura, K. Yamanaka, O. Katai, H. Kawakami, and T. Shiose, “Improved MLP Learning via Orthogonal Bipolar Target Vectors,” J. Adv. Comput. Intell. Intell. Inform., Vol.9 No.6, pp. 580-589, 2005.