
JACIII Vol.21 No.1, pp. 31-48, 2017
doi: 10.20965/jaciii.2017.p0031

Invited Paper:

A Review of Data Mining Techniques and Applications

Ratchakoon Pruengkarn, Kok Wai Wong, and Chun Che Fung

School of Engineering and Information Technology, Murdoch University
Perth, Australia

Received: October 24, 2016
Accepted: December 19, 2016
Published: January 20, 2017
Keywords: data mining, data mining techniques, data mining application, big data
Abstract
Data mining is the analytics and knowledge discovery process of analyzing large volumes of data from various sources and transforming the data into useful information. Various disciplines have contributed to its development, and it is becoming increasingly important in the scientific and industrial world. This article presents a review of data mining techniques and applications from 1996 to 2016. Techniques are divided into two main categories: predictive methods and descriptive methods. Due to the huge number of publications available on this topic, only a selected number are used in this review to highlight the developments of the past 20 years. Applications are included to provide some insights into how each data mining technique has evolved over the last two decades. Recent research trends focus more on large data sets and big data. With the advent of newer algorithms, there have also been more applications in the area of health informatics.
Cite this article as:
R. Pruengkarn, K. Wong, and C. Fung, “A Review of Data Mining Techniques and Applications,” J. Adv. Comput. Intell. Intell. Inform., Vol.21 No.1, pp. 31-48, 2017.
