Fuzzy c-Means Algorithms Using Kullback-Leibler Divergence and Hellinger Distance Based on Multinomial Manifold
Ryo Inokuchi* and Sadaaki Miyamoto**
*Doctoral Program in Risk Engineering, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki 305-8573, Japan
**Department of Risk Engineering, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki 305-8573, Japan
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.