Extending Learning Feasibility Through Feedforward Sequential Learning
Michael K. Weir* and Li Hui Chen**
*Department of Mathematics and Computer Science, University of St Andrews, St Andrews, U.K.
**Division of Information Engineering, School of EEE, Nanyang Technological University, Singapore.
In this paper, a sequence-based neural network approach called feedforward sequential learning (FSL) is proposed for extending the range of feasibility for feedforward networks in three areas: architecture, training, and generalization. The extension is enabled through a spatio-temporal indexing scheme that decomposes the task into a sequence of simpler subproblems, each solved by a separate weight state. The separately trained weight states are then combined into a continuous final weight state sequence to enable smooth generalization. FSL can be used to train mappings of analog or discrete I/O with underlying continuity for pattern association or classification. Implementation of FSL is illustrated and tested by learning the 2-spirals problem and an extended 4-spiral version. Training is found to be faster and more robust than its single-state counterpart, and the generalization obtained indicates that the underlying patterns are classified more smoothly with FSL. Overall, the results suggest that FSL is a feasible approach for complex, decomposable tasks.
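The decompose-train-combine idea summarized above can be sketched in miniature. The following is an illustrative assumption only, not the paper's implementation: each "weight state" is reduced to a single fitted weight, each subproblem to a tiny regression set, and the continuous weight state sequence to linear interpolation between trained states.

```python
# Hypothetical sketch of the FSL idea from the abstract: decompose a task
# into subproblems, train a separate weight state per subproblem, then join
# the trained states into a continuous sequence. The one-weight "network",
# the toy data, and the interpolation scheme are illustrative assumptions.

def train_subproblem(data):
    """Fit a single weight w minimising squared error for y = w * x."""
    num = sum(x * y for x, y in data)
    den = sum(x * x for x, _ in data)
    return num / den

def weight_state_sequence(states, t):
    """Continuous weight state at index t in [0, len(states) - 1],
    obtained by linear interpolation between adjacent trained states."""
    i = min(int(t), len(states) - 2)
    frac = t - i
    return (1 - frac) * states[i] + frac * states[i + 1]

# Decompose the task: each subproblem gets its own training set.
subproblems = [
    [(1.0, 2.0), (2.0, 4.0)],   # underlying slope 2
    [(1.0, 4.0), (2.0, 8.0)],   # underlying slope 4
]
states = [train_subproblem(d) for d in subproblems]

# Midway along the sequence, the weight varies smoothly between states.
print(weight_state_sequence(states, 0.5))  # -> 3.0
```

Here the interpolated state at t = 0.5 lies between the two trained states, mirroring the abstract's point that joining separate weight states into a continuous sequence yields smooth behaviour across subproblem boundaries.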
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.