JACIII Vol.11 No.6 pp. 582-592
doi: 10.20965/jaciii.2007.p0582


A Minimal Neural Network Ensemble Construction Method: A Constructive Approach

M. A. H. Akhand* and Kazuyuki Murase*, **

*Department of Human and Artificial Intelligence Systems, Graduate School of Engineering, University of Fukui

**Research and Education Program for Life Science, University of Fukui, 3-9-1 Bunkyo, Fukui 910-8507, Japan

Received: January 9, 2007
Accepted: March 19, 2007
Published: July 20, 2007
Keywords: constructive approach, diversity, generalization ability, neural network ensemble

This paper presents a neural network ensemble (NNE) construction method for classification problems. The proposed method automatically determines a minimal NNE architecture and is therefore called the Minimal Neural Network Ensemble Construction (MNNEC) method. To determine the minimal architecture, it starts with a single neural network (NN) that has a minimal number of hidden units. During training, it adds additional NNs with cumulatively larger numbers of hidden units. In conventional methods, by contrast, the number of NNs in the ensemble and the number of hidden nodes in each NN must be predetermined. When a NN is added in MNNEC, the new NN specializes in the portion of the input space that the existing networks have not yet solved. Finally, all the NNs are trained simultaneously to improve generalization ability. Consequently, for easy problems where multiple NNs are not required and a single NN is sufficient, MNNEC generates a single NN with a minimal number of hidden units. MNNEC has been tested extensively on several machine learning and NN benchmark problems. The results show that MNNEC constructs NNEs of much smaller size than conventional methods.
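The constructive loop described in the abstract — begin with one small network, grow the ensemble only while the current ensemble still misclassifies part of the data, and have each added network focus on the unsolved portion — can be sketched as follows. This is a rough illustration, not the authors' exact MNNEC algorithm: the growth schedule, accuracy threshold, and the 25% resampling of already-solved points are illustrative assumptions, and scikit-learn's MLPClassifier stands in for the paper's trained NNs.

```python
# Sketch of a minimal constructive ensemble in the spirit of MNNEC.
# Assumptions (not from the paper): hidden units grow linearly per
# member, growth stops at a fixed training-accuracy target, and each
# new member trains on misclassified points plus a 25% sample of
# solved ones so both classes remain represented.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

def build_minimal_ensemble(X, y, base_hidden=2, max_members=5, target_acc=0.95):
    """Grow an ensemble one network at a time; each new member
    specializes on the examples the current ensemble still gets wrong."""
    members = []
    train_X, train_y = X, y
    for k in range(max_members):
        hidden = base_hidden * (k + 1)  # cumulatively larger hidden layer
        net = MLPClassifier(hidden_layer_sizes=(hidden,),
                            max_iter=2000, random_state=k)
        net.fit(train_X, train_y)
        members.append(net)
        # Majority vote of the current ensemble over the full training set.
        votes = np.stack([m.predict(X) for m in members])
        pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
        if (pred == y).mean() >= target_acc:
            break  # a minimal architecture suffices; stop growing
        # Next member trains on the unsolved portion of the input space,
        # plus a small sample of solved points.
        wrong = pred != y
        keep = wrong | (np.random.default_rng(k).random(len(y)) < 0.25)
        train_X, train_y = X[keep], y[keep]
    return members

X, y = make_classification(n_samples=300, n_features=8, random_state=0)
ensemble = build_minimal_ensemble(X, y)
print(len(ensemble))  # number of networks actually constructed
```

For an easy dataset the loop exits after the first member, mirroring the paper's claim that MNNEC can return a single small NN when an ensemble is unnecessary; the final simultaneous-training phase of MNNEC is omitted here for brevity.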

Cite this article as:
M. A. H. Akhand and Kazuyuki Murase, “A Minimal Neural Network Ensemble Construction Method: A Constructive Approach,” J. Adv. Comput. Intell. Intell. Inform., Vol.11, No.6, pp. 582-592, 2007.

