
JACIII Vol.19 No.3 pp. 381-388
doi: 10.20965/jaciii.2015.p0381
(2015)

Paper:

A MultiBoosting Based Transfer Learning Algorithm

Xiaobo Liu*, Guangjun Wang*, Zhihua Cai**, and Harry Zhang***

*School of Automation, China University of Geosciences
388 Lumo Road, Wuhan, Hubei 430074, China

**School of Computer Science, China University of Geosciences
388 Lumo Road, Wuhan, Hubei 430074, China

***Faculty of Computer Science, University of New Brunswick
P.O. Box 4400, Fredericton, NB E3B 5A3, Canada

Received:
October 28, 2014
Accepted:
February 20, 2015
Published:
May 20, 2015
Keywords:
ensemble learning, transfer learning, wagging, AdaBoost
Abstract
Ensemble learning is a sophisticated machine learning technique used to solve many problems in practical applications. MultiBoosting, a cutting-edge ensemble learning approach, combines AdaBoost with wagging: it retains AdaBoost's bias reduction while adding wagging's variance reduction to that already achieved by AdaBoost, thus reducing the total classification error. Real-world data, however, do not always satisfy the traditional machine learning assumption that training and test data follow the same distribution; transfer learning addresses this problem. We propose the TrMultiBoosting algorithm, which combines MultiBoosting with TrAdaBoost, a state-of-the-art transfer learning algorithm, using naive Bayes as the base learner. Experiments on UCI data sets show that TrMultiBoosting produces a decision committee with higher prediction accuracy than either TrAdaBoost or MultiBoosting.
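
The abstract gives only the outline of the algorithm, but the combination it describes (TrAdaBoost-style instance reweighting inside MultiBoosting's wagging-reset sub-committees, with naive Bayes as the base learner) can be sketched in code. The following is a minimal, illustrative Python sketch, not the paper's implementation: the function names, the sub-committee schedule, and the choice of scikit-learn's GaussianNB are all assumptions made for illustration.

# A minimal, illustrative sketch of a TrMultiBoosting-style learner, assuming:
#   - TrAdaBoost-style weight updates (Dai et al.) in the inner boosting loop,
#   - MultiBoosting-style wagging resets (Webb) at sub-committee boundaries,
#   - Gaussian naive Bayes from scikit-learn as the base learner.
# All names and the exact schedule below are hypothetical, not from the paper.
import numpy as np
from sklearn.naive_bayes import GaussianNB


def wagging_weights(size, rng):
    # Wagging assigns random instance weights drawn from a continuous
    # Poisson (exponential) distribution, normalized to sum to one.
    w = rng.exponential(scale=1.0, size=size)
    return w / w.sum()


def fit_tr_multiboost(Xs, ys, Xt, yt, n_rounds=20, n_subcommittees=4, seed=0):
    """Source data (Xs, ys) come from the old domain; (Xt, yt) from the target."""
    rng = np.random.default_rng(seed)
    ns, nt = len(ys), len(yt)
    X = np.vstack([Xs, Xt])
    y = np.concatenate([ys, yt])
    ws = np.full(ns, 1.0 / (ns + nt))   # weights on source instances
    wt = np.full(nt, 1.0 / (ns + nt))   # weights on target instances
    # Fixed source down-weighting rate, as in TrAdaBoost.
    beta_src = 1.0 / (1.0 + np.sqrt(2.0 * np.log(ns) / n_rounds))
    reset_every = max(1, n_rounds // n_subcommittees)
    committee = []
    for t in range(n_rounds):
        if t > 0 and t % reset_every == 0:
            # Wagging-style reset: start a new sub-committee with fresh
            # random target weights to reduce variance.
            wt = wagging_weights(nt, rng) * (nt / (ns + nt))
        w = np.concatenate([ws, wt])
        clf = GaussianNB().fit(X, y, sample_weight=w / w.sum())
        pred_t = clf.predict(Xt)
        miss_t = (pred_t != yt).astype(float)
        err = float(np.sum(wt * miss_t) / np.sum(wt))
        err = min(max(err, 1e-10), 0.499)   # keep beta_t in (0, 1)
        beta_t = err / (1.0 - err)
        # TrAdaBoost updates: shrink weights of misclassified source
        # instances, grow weights of misclassified target instances.
        miss_s = (clf.predict(Xs) != ys).astype(float)
        ws *= beta_src ** miss_s
        wt *= beta_t ** (-miss_t)
        committee.append((clf, np.log(1.0 / beta_t)))
    return committee


def predict_tr_multiboost(committee, X, classes):
    # Weighted vote; TrAdaBoost-style schemes conventionally vote with
    # the later boosting rounds only, so this sketch uses the second half.
    scores = np.zeros((len(X), len(classes)))
    for clf, alpha in committee[len(committee) // 2:]:
        pred = clf.predict(X)
        for j, c in enumerate(classes):
            scores[:, j] += alpha * (pred == c)
    return np.asarray(classes)[np.argmax(scores, axis=1)]

In this sketch, the inner loop follows TrAdaBoost's convention of gradually down-weighting misleading source instances while boosting misclassified target instances, and the wagging-style resets between sub-committees supply the variance reduction that MultiBoosting adds on top of boosting. Given arrays Xs, ys from a related source domain and a small labeled target set Xt, yt, one would call fit_tr_multiboost(Xs, ys, Xt, yt) and then predict_tr_multiboost(committee, X_test, np.unique(yt)).
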
Cite this article as:
X. Liu, G. Wang, Z. Cai, and H. Zhang, “A MultiBoosting Based Transfer Learning Algorithm,” J. Adv. Comput. Intell. Intell. Inform., Vol.19 No.3, pp. 381-388, 2015.
