JACIII Vol.18 No.6 pp. 1007-1012
doi: 10.20965/jaciii.2014.p1007
(2014)

Paper:

On Bayesian Clustering with a Structured Gaussian Mixture

Keisuke Yamazaki

Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, G5-19, 4259 Nagatsuta, Midori-ku, Yokohama 226-8503, Japan

Received: September 25, 2013
Accepted: May 15, 2014
Published: November 20, 2014
Keywords: cluster analysis, Bayes statistics, unsupervised learning, asymptotic analysis
Abstract
Cluster analysis is commonly used in the fields of computational intelligence and pattern recognition. The task is to detect the unobservable labels indicating to which clusters the observable data belong. A Gaussian mixture is a representative hierarchical model that is often used when taking a probabilistic approach to this task. Although the model is widely used, the statistical properties of cluster analysis based on it have not yet been clarified. The present paper analyzes the theory of Bayesian clustering for the case in which the number of clusters is unknown and the variance-covariance matrix of the Gaussian components is constrained. We refer to this constraint as the structure of the components. The analysis shows that, even when the estimation method does not take the structure into account, the Bayes method provides an effective, tractable, and efficient algorithm. Based on an experiment with simulated data, we confirmed the advantages of the Bayes method over the expectation-maximization (EM) method.
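
To make the setting concrete, the sketch below clusters one-dimensional data with a Bayesian Gaussian mixture via Gibbs sampling, the kind of posterior-sampling scheme Bayesian clustering relies on. This is only a minimal illustration, not the paper's algorithm: the dimension, the fixed number of components K, the known component variance, the conjugate hyperparameters (alpha, mu0, tau2), and the omission of any structural constraint on the covariance are all assumptions made for the demonstration.

    # Minimal illustrative Gibbs sampler for Bayesian clustering with a
    # 1-D Gaussian mixture (known component variance). NOT the paper's
    # algorithm; all hyperparameters here are assumptions for the demo.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data from a 2-component mixture; true labels are unobserved.
    n = 200
    true_z = rng.integers(0, 2, size=n)
    x = np.where(true_z == 0, rng.normal(-2.0, 1.0, n), rng.normal(2.0, 1.0, n))

    K, sigma2 = 3, 1.0                 # K deliberately larger than the truth
    alpha, mu0, tau2 = 1.0, 0.0, 10.0  # Dirichlet / Gaussian hyperparameters
    z = rng.integers(0, K, size=n)     # random initial label assignment

    for sweep in range(500):
        # Sample mixture weights from the Dirichlet posterior.
        counts = np.bincount(z, minlength=K)
        w = rng.dirichlet(alpha + counts)
        # Sample each component mean from its conjugate Gaussian posterior.
        mu = np.empty(K)
        for k in range(K):
            var_post = 1.0 / (1.0 / tau2 + counts[k] / sigma2)
            mean_post = var_post * (mu0 / tau2 + x[z == k].sum() / sigma2)
            mu[k] = rng.normal(mean_post, np.sqrt(var_post))
        # Sample labels from their full conditionals (categorical draw).
        log_p = np.log(w) - 0.5 * (x[:, None] - mu[None, :]) ** 2 / sigma2
        p = np.exp(log_p - log_p.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = (p.cumsum(axis=1) > rng.random((n, 1))).argmax(axis=1)

    print("posterior cluster sizes:", np.bincount(z, minlength=K))

In this scheme the labels are repeatedly resampled from their posterior given the sampled parameters, so uncertainty about cluster membership is carried through the estimation; EM would instead alternate soft responsibilities with point estimates of the weights and means, which is the contrast examined in the paper's experiment.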
Cite this article as:
K. Yamazaki, “On Bayesian Clustering with a Structured Gaussian Mixture,” J. Adv. Comput. Intell. Intell. Inform., Vol.18 No.6, pp. 1007-1012, 2014.
