JACIII Vol.9 No.6 pp. 599-606 (2005)
doi: 10.20965/jaciii.2005.p0599

Paper:

Adaptive Vector Quantization with Creation and Reduction Grounded in the Equinumber Principle

Michiharu Maeda*, Noritaka Shigei**, and Hiromi Miyajima**

*Kurume National College of Technology, Kurume 830-8555, Japan

**Kagoshima University, Kagoshima 890-0065, Japan

Received: February 28, 2005
Accepted: June 13, 2005
Published: November 20, 2005
Keywords:
equinumber of inputs, creation and reduction, adaptive vector quantization, partition space, image coding
Abstract
This paper concerns the constitution of unit structures in neural networks for adaptive vector quantization. When the numbers of inputs in the partition spaces are mutually equal, the partition errors are mutually equivalent and the average distortion is asymptotically minimized. We term this the equinumber principle and, based on it, present two types of adaptive vector quantization that avoid dependence on the initial reference vectors. Conventional techniques, such as structural learning with forgetting, keep the same number of output units from start to finish. Our approach instead explicitly changes the number of output units until a predetermined number is reached, without neighboring relations, equalizing the numbers of inputs in the partition spaces. In the first technique, output units are sequentially created during learning based on the equinumber principle. In the second, output units are sequentially deleted until a prespecified number remains. Experimental results demonstrate the effectiveness of both techniques in terms of average distortion. The approaches are also applied to image data, and their feasibility in image coding is confirmed.
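
The abstract outlines a two-phase procedure: reference vectors (output units) are first created one by one and later deleted until a prespecified number remains, so that each partition ends up holding roughly the same number of inputs. The sketch below is only an illustration of that idea under assumed criteria (splitting the most populated partition during creation, removing the least populated one during reduction); it is not the authors' algorithm, and the function name, thresholds, and update rule are hypothetical.

    # Illustrative sketch of creation/reduction vector quantization guided by
    # the equinumber idea. The growth/deletion criteria and learning rate are
    # assumptions for demonstration, not the method of Maeda et al.
    import numpy as np

    def adaptive_vq(data, target_units, grow_until, epochs=20, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        # Start from a single reference vector placed at a random input.
        codebook = [data[rng.integers(len(data))].copy()]

        for _ in range(epochs):
            C = np.asarray(codebook)
            # Partition step: assign each input to its nearest reference vector.
            dists = np.linalg.norm(data[:, None, :] - C[None, :, :], axis=2)
            winners = dists.argmin(axis=1)
            counts = np.bincount(winners, minlength=len(C))

            # Competitive update: move each winning unit toward its inputs.
            for i, x in zip(winners, data):
                codebook[i] += lr * (x - codebook[i])

            if len(codebook) < grow_until:
                # Creation phase (assumed rule): split the partition that
                # currently holds the most inputs.
                j = counts.argmax()
                codebook.append(codebook[j] + rng.normal(scale=1e-3, size=codebook[j].shape))
            elif len(codebook) > target_units:
                # Reduction phase (assumed rule): delete the unit whose
                # partition holds the fewest inputs.
                codebook.pop(counts.argmin())

        return np.asarray(codebook)

    # Usage: quantize 2-D Gaussian data with 8 final reference vectors,
    # temporarily growing to 12 units before reduction.
    X = np.random.default_rng(1).normal(size=(1000, 2))
    C = adaptive_vq(X, target_units=8, grow_until=12)
    print(C.shape)  # (8, 2) after creation and reduction

In this toy setting, equalizing the partition counts tends to place more reference vectors where the data are dense, which is the intuition behind minimizing average distortion stated in the abstract.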
Cite this article as:
M. Maeda, N. Shigei, and H. Miyajima, “Adaptive Vector Quantization with Creation and Reduction Grounded in the Equinumber Principle,” J. Adv. Comput. Intell. Intell. Inform., Vol.9 No.6, pp. 599-606, 2005.