Paper:
A Minimal Neural Network Ensemble Construction Method: A Constructive Approach
M. A. H. Akhand* and Kazuyuki Murase*, **
*Department of Human and Artificial Intelligence Systems, Graduate School of Engineering, University of Fukui
**Research and Education Program for Life Science, University of Fukui, 3-9-1 Bunkyo, Fukui 910-8507, Japan
- [1] S. Haykin, “Neural Networks – A Comprehensive Foundation,” Prentice Hall, 2nd edition, 1999.
- [2] D. J. Newman, S. Hettich, C. L. Blake, and C. J. Merz, “UCI Repository of Machine Learning Databases,” Dept. of Information and Computer Sciences, University of California, Irvine, 1998.
(Available: http://www.ics.uci.edu/~mlearn/MLRepository.html)
- [3] T. G. Dietterich, “Ensemble Learning,” The Handbook of Brain Theory and Neural Networks, Second Edition, pp. 405-408, 2002.
- [4] A. J. C. Sharkey, “On combining artificial neural nets,” Connection Science, 8-3/4, pp. 299-314, 1996.
- [5] A. J. C. Sharkey and N. E. Sharkey, “Combining Diverse Neural Nets,” Knowledge Engineering Review, 12-3, pp. 299-314, 1997.
- [6] G. Brown, J. Wyatt, R. Harris, and X. Yao, “Diversity Creation Methods: A Survey and Categorization,” Information Fusion, 6-1, pp. 5-20, 2005.
- [7] D. W. Opitz and R. Maclin, “Popular ensemble methods: An empirical study,” Journal of Artificial Intelligence Research, 11, pp. 169-198, 1999.
- [8] L. Breiman, “Bagging predictors,” Machine Learning, 24, pp. 123-140, 1996.
- [9] Y. Freund and R. E. Schapire, “Experiments with a new boosting algorithm,” Proc. of the 13th International Conference on Machine Learning, Morgan Kaufmann, pp. 148-156, 1996.
- [10] P. Melville and R. J. Mooney, “Creating diversity in ensembles using artificial data,” Information Fusion, 6, pp. 99-111, 2005.
- [11] T.-Y. Kwok and D.-Y. Yeung, “Constructive algorithms for structure learning in feedforward neural networks for regression problems,” IEEE Transactions on Neural Networks, 8-3, pp. 630-645, 1997.
- [12] D. Partridge and W. B. Yates, “Engineering Multiversion Neural-net Systems,” Neural Computation, 8-4, pp. 869-893, 1996.
- [13] L. Prechelt, “Proben1 – A Set of Benchmarks and Benching Rules for Neural Network Training Algorithms,” Tech. Rep. 21/94, Fakultät für Informatik, University of Karlsruhe, Germany, 1994.
- [14] Y. Liu and X. Yao, “Ensemble learning via negative correlation,” Neural Networks, 12, pp. 1399-1404, 1999.
- [15] X. Yao and Y. Liu, “A New Evolutionary System for Evolving Artificial Neural Networks,” IEEE Transactions on Neural Networks, 8, pp. 694-713, 1997.
- [16] A. Krogh and J. Vedelsby, “Neural Network Ensembles, Cross Validation, and Active Learning,” in Proc. of Advances in Neural Information Processing Systems, pp. 231-238, 1995.
- [17] M. M. Nelson and W. T. Illingworth, “A practical guide to neural networks,” Addison-Wesley, USA, 1992.
- [18] A. Tsymbal, M. Pechenizkiy, and P. Cunningham, “Diversity in search strategies for ensemble feature selection,” Information Fusion, 6, pp. 83-98, 2005.
- [19] D. Wolpert and W. Macready, “No Free Lunch Theorems for optimization,” IEEE Transactions on Evolutionary Computation, 1-1, pp. 67-82, 1997.
- [20] T. Y. Kwok and D. Y. Yeung, “Objective Functions for Training New Hidden Units in Constructive Neural Networks,” IEEE Trans. Neural Networks, 8, pp. 1131-1148, 1997.
- [21] R. A. Flores-Mendez, “Standardization of Multi-Agent System Frameworks,” ACM Crossroads, 5-4, pp. 18-24, 1999.
- [22] Multi-Agents Research Group, “Intelligent Agents and Multi-Agents,” School of Computer Science and Information Technology, University of Nottingham, http://www.asap.cs.nott.ac.uk/themes/ma.shtml.
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.