JACIII Vol.21 No.6, pp. 989-997 (2017)
doi: 10.20965/jaciii.2017.p0989

Paper:

Robust and Sparse LP-Norm Support Vector Regression

Ya-Fen Ye*, Chao Ying**, Yuan-Hai Shao*, Chun-Na Li*, and Yu-Juan Chen***

*Zhijiang College, Zhejiang University of Technology
182 Zhijiang Road, Hangzhou 310024, China

**Rainbow City Primary School
501 Weiye Road, Hangzhou 310013, China

***School of Data Sciences, Zhejiang University of Finance and Economics
18 Xueyuan Road, Hangzhou 310018, China

Received: December 25, 2016
Accepted: April 20, 2017
Published: October 20, 2017

Keywords: support vector regression, Lp-norm, sparse solution, feature selection
Abstract

A robust and sparse Lp-norm support vector regression (Lp-RSVR) is proposed in this paper. The feature selection built into Lp-RSVR not only preserves regression performance but also improves robustness. The main characteristics of Lp-RSVR are as follows: (i) By using an absolute-value constraint, Lp-RSVR is robust against outliers. (ii) Theoretical analysis guarantees that Lp-RSVR selects useful features. (iii) Based on the feature-selection results, a nonlinear Lp-RSVR can be used when the data are structurally nonlinear. Experimental results demonstrate the superiority of the proposed Lp-RSVR in feature selection, regression performance, and robustness.
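As a rough illustration of the model described above, the sketch below fits an Lp-norm-regularized, ε-insensitive linear regression by smoothed subgradient descent. This is a minimal sketch under assumed details, not the authors' Lp-RSVR algorithm: the solver, the smoothing constant, the pruning threshold, and all hyperparameter values are assumptions introduced here. The Lp penalty (0 < p < 1) drives uninformative coefficients toward zero (feature selection), while the absolute, ε-insensitive loss grows only linearly in the residual, which is the source of robustness to outliers.

# Minimal sketch of an Lp-norm, eps-insensitive SVR (illustrative only; not
# the paper's exact Lp-RSVR formulation or solver). Assumes numpy only.
import numpy as np

def lp_svr_fit(X, y, p=0.5, C=10.0, eps=0.1, lr=1e-4, n_iter=20000, delta=1e-8):
    """Minimize  sum_j |w_j|^p + C * sum_i max(0, |y_i - x_i.w - b| - eps)
    by subgradient descent with a decaying step size. The Lp term (p < 1)
    encourages sparse w; the linear-growth loss limits outlier influence."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for t in range(n_iter):
        r = y - X @ w - b                              # residuals
        # Subgradient of the eps-insensitive absolute loss w.r.t. the score:
        g = np.where(np.abs(r) > eps, -np.sign(r), 0.0)
        # delta smooths |w|^(p-1), which is unbounded at w = 0.
        grad_w = C * (X.T @ g) + p * np.sign(w) * (np.abs(w) + delta) ** (p - 1)
        grad_b = C * g.sum()
        step = lr / np.sqrt(1.0 + t)                   # decaying step size
        w -= step * grad_w
        b -= step * grad_b
    w[np.abs(w) < 5e-2] = 0.0                          # prune near-zero weights
    return w, b

# Toy usage: only the first two of ten features carry signal, so a sparse
# solution should keep (roughly) features 0 and 1 and zero out the rest.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.05 * rng.normal(size=200)
w, b = lp_svr_fit(X, y)
print("selected features:", np.nonzero(w)[0])

In a kernelized variant (characteristic (iii) above), the same loss would be applied to a kernel expansion rather than to X @ w, but the sparsity mechanism on the feature weights is the part the linear sketch is meant to show.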

Cite this article as:
Y. Ye, C. Ying, Y. Shao, C. Li, and Y. Chen, “Robust and Sparse LP-Norm Support Vector Regression,” J. Adv. Comput. Intell. Intell. Inform., Vol.21 No.6, pp. 989-997, 2017.
