Paper:
An Incremental Neural Network for Online Supervised Learning and Topology Learning
Youki Kamiya*, Shen Furao**, and Osamu Hasegawa**,***
*Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, R2-52, 4259 Nagatsuta, Midori-ku, Yokohama 226-8503, Japan
**Imaging Science and Engineering Lab., Tokyo Institute of Technology
***PRESTO, Japan Science and Technology Agency (JST)
- [1] B. Santosa, T. B. Trafalis, and T. Conway, “Knowledge Base-Clustering and Application of Multi-Class SVM for Genes Expression Analysis,” ASME Press, 2002.
- [2] L. P. Li, C. R. Weinberg, T. A. Darden, and L. G. Pedersen, “Gene selection for sample classification based on gene expression data: study of sensitivity to choice of parameters of the GA/KNN method,” Bioinformatics, Vol.17, No.12, pp. 1131-1142, 2001.
- [3] Y. Freund and R. E. Schapire, “A decision-theoretic generalization of online learning and an application to boosting,” Journal of Computer and System Sciences, Vol.55, No.1, pp. 119-139, 1997.
- [4] L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, “Classification and Regression Trees,” Belmont, CA: Wadsworth, 1984.
- [5] M. A. Mendez, C. Hodar, C. Vulpe, M. Gonzalez, and V. Cambiazo, “Discriminant analysis to evaluate clustering of gene expression data,” Federation of European Biochemical Societies Letters, Vol.522, Nos.1-3, pp. 24-28, 2002.
- [6] T. Kohonen, “Self-organization and associative memory,” Springer, Berlin, 3rd edition, 1989.
- [7] T. Kohonen, “Self-organized formation of topologically correct feature maps,” Biological Cybernetics, Vol.43, pp. 59-69, 1982.
- [8] R. Feraud and R. Clerot, “A methodology to explain neural network classification,” Neural Networks, Vol.15, No.2, pp. 237-246, 2002.
- [9] T. M. Cover and P. E. Hart, “Nearest neighbor pattern classification,” IEEE Trans. on Information Theory, Vol.IT-13, No.1, pp. 21-27, 1967.
- [10] T. Shibata, T. Kato, and T. Wada, “K-D decision tree: An accelerated and memory efficient nearest neighbor classifier,” MIRU2004.
- [11] B. V. Dasarathy, “Minimal consistent set (MCS) identification for optimal nearest neighbor decision systems design,” IEEE Trans. Syst. Man Cybern., Vol.24, No.3, pp. 511-517, 1994.
- [12] G. W. Gates, “The reduced nearest neighbor rule,” IEEE Trans. Inf. Theory, Vol.IT-18, No.3, pp. 431-433, 1972.
- [13] S. Furao and O. Hasegawa, “An incremental neural network for non-stationary unsupervised learning,” International Conference on Neural Information Processing (ICONIP2004), Calcutta, India, 2004.
- [14] B. Fritzke, “A Growing Neural Gas Network Learns Topologies,” In Advances in Neural Information Processing Systems, Vol.7, pp. 625-632, 1995.
- [15] S. Furao and O. Hasegawa, “An On-line Learning Mechanism for Unsupervised Classification and Topology Representation,” IEEE International Conference on Computer Vision and Pattern Recognition (CVPR05), San Diego, 2005 (accepted).
- [16] T. M. Martinetz, “Competitive Hebbian learning rule forms perfectly topology preserving maps,” ICANN, pp. 427-434, 1993.
- [17] C. J. Merz and P. M. Murphy, “UCI repository of machine learning databases,” Irvine, CA: University of California, Department of Information and Computer Science, 1996. http://www.ics.uci.edu/~mlearn/MLRepository.html
- [18] A. Passerini, M. Pontil, and P. Frasconi, “From Margins to Probabilities in Multiclass Learning Problems,” in F. van Harmelen (ed.), Proc. 15th European Conf. on Artificial Intelligence, 2002.
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.