
JACIII Vol.8 No.6 pp. 621-626
doi: 10.20965/jaciii.2004.p0621
(2004)

Paper:

Studies on Effects of Initialization on Structure Formation and Generalization of Structural Learning with Forgetting

Hiroshi Shiratsuchi*, Hiromu Gotanda**, Katsuhiro Inoue***, and Kousuke Kumamaru***

*Faculty of Engineering, University of the Ryukyus, Nishihara, Okinawa 903-0213, Japan

**Kinki University School of Humanity-Oriented Science and Engineering

***Faculty of Computer Science and Systems Engineering, Kyushu Institute of Technology

Received:
October 6, 2003
Accepted:
July 8, 2004
Published:
November 20, 2004
Keywords:
multilayer neural network, initialization, generalization, structural learning
Abstract

In this paper, we apply our proposed initialization for multilayer neural networks (NNs) to structural learning with forgetting. The initialization consists of two steps: the weights of the hidden units are initialized so that their hyperplanes pass through the center of gravity of the input pattern set, and the weights of the output units are initialized to zero. Several simulations were performed to study how the initialization affects the structure formation of the NN. The simulation results confirmed that the initialization yields a better network structure and higher generalization ability.
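The two-step initialization described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the random scale of the hidden-unit normals and the function/parameter names (`init_weights`, `n_hidden`, `n_out`) are assumptions for illustration only.

```python
import numpy as np

def init_weights(X, n_hidden, n_out, seed=None):
    """Sketch of the two-step initialization from the abstract.

    Step 1: each hidden unit's hyperplane w.x + b = 0 is shifted so it
            passes through the center of gravity of the input patterns X.
    Step 2: output-layer weights are initialized to zero.
    (The unit normal scale for the random directions is an assumption.)
    """
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    c = X.mean(axis=0)                             # center of gravity of input set
    W_h = rng.normal(0.0, 1.0, (n_hidden, n_in))   # random hyperplane normals
    b_h = -W_h @ c                                 # bias so each hyperplane contains c
    W_o = np.zeros((n_out, n_hidden))              # output weights start at zero
    b_o = np.zeros(n_out)
    return W_h, b_h, W_o, b_o
```

By construction, `W_h @ c + b_h == 0` for the input centroid `c`, so every hidden-unit decision boundary initially cuts through the data cloud rather than lying far from it.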

Cite this article as:
Hiroshi Shiratsuchi, Hiromu Gotanda, Katsuhiro Inoue, and Kousuke Kumamaru, “Studies on Effects of Initialization on Structure Formation and Generalization of Structural Learning with Forgetting,” J. Adv. Comput. Intell. Intell. Inform., Vol.8, No.6, pp. 621-626, 2004.