JACIII Vol.21 No.5 pp. 885-894
doi: 10.20965/jaciii.2017.p0885


Approach to Clustering with Variance-Based XCS

Caili Zhang*, Takato Tatsumi*, Masaya Nakata**, and Keiki Takadama*

*The University of Electro-Communications
1-5-1 Chofugaoka, Chofu-shi, Tokyo 182-8585, Japan

**Yokohama National University
79-1 Tokiwadai, Hodogaya-ku, Yokohama, Japan

Received: April 14, 2017
Accepted: July 21, 2017
Published: September 20, 2017
Keywords: machine learning, learning classifier system

This paper presents an approach to clustering that extends the variance-based Learning Classifier System (XCS-VR). In real-world knowledge discovery and data mining problems, the ability to combine similar rules is crucial. Conventional XCS-VR can acquire generalized rules, but it cannot derive still more generalized rules from them. The proposed approach (called XCS-VRc) accomplishes this by integrating similar generalized rules. To validate the approach, we designed a benchmark problem to examine whether XCS-VRc can cluster both the generalized and the more generalized features in the input data. XCS-VRc proved more efficient than both XCS and the conventional XCS-VR.
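The core idea described above, integrating similar generalized rules into a more general one, can be illustrated with a minimal sketch. This is an illustrative assumption of how such integration might look, not the paper's actual XCS-VRc algorithm: here a rule is an interval condition with a predicted payoff, two rules count as "similar" when their intervals overlap and their predictions are close, and merging takes the union of the intervals. The `Rule`, `similar`, and `integrate` names and the `eps` threshold are all hypothetical.

```python
# Illustrative sketch of rule integration (NOT the paper's XCS-VRc algorithm):
# merge two generalized rules when their interval conditions overlap and
# their predicted payoffs are close.
from dataclasses import dataclass


@dataclass
class Rule:
    lower: float       # lower bound of the interval condition
    upper: float       # upper bound of the interval condition
    prediction: float  # predicted payoff of the rule


def similar(a: Rule, b: Rule, eps: float = 0.1) -> bool:
    """Treat rules as similar if intervals overlap and predictions are close."""
    overlap = a.lower <= b.upper and b.lower <= a.upper
    return overlap and abs(a.prediction - b.prediction) <= eps


def integrate(a: Rule, b: Rule) -> Rule:
    """Merge two similar rules into one more general rule."""
    return Rule(min(a.lower, b.lower),
                max(a.upper, b.upper),
                (a.prediction + b.prediction) / 2)


r1 = Rule(0.0, 0.5, 1.0)
r2 = Rule(0.4, 0.9, 1.05)
merged = integrate(r1, r2) if similar(r1, r2) else None
# merged covers [0.0, 0.9], a more general rule than either r1 or r2
```

In the actual system, such integration would be applied to the evolved rule population so that clusters in the input data are each covered by a single, maximally general rule.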

Cite this article as:
C. Zhang, T. Tatsumi, M. Nakata, and K. Takadama, “Approach to Clustering with Variance-Based XCS,” J. Adv. Comput. Intell. Intell. Inform., Vol.21 No.5, pp. 885-894, 2017.
