
JRM Vol.16 No.4 pp. 420-425 (2004)
doi: 10.20965/jrm.2004.p0420

Paper:

Input of Japanese Characters by Recognizing the Number of Fingers

Takashi Yamagishi* and Kazunori Umeda**

*Graduate School of Sciences and Engineering, Chuo University (currently Hitachi, Ltd.)

**Faculty of Science and Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan

Received: December 26, 2003
Accepted: February 9, 2004
Published: August 20, 2004
Keywords: man-machine interface, gesture recognition, range image, recognition of number of fingers, input method
Abstract
Many recent studies on recognizing human gestures from images for man-machine interfaces do not deal with inputting Japanese or other characters. We propose a method for inputting Japanese characters by recognizing the number of fingers shown on one hand. The recognized numbers are converted to characters using the “pocket-bell” rule. The number of fingers is recognized by image processing, with range images used as the image information. We also discuss how to control recognition timing and how to cancel input data. Experimental results using a range image sensor show the effectiveness of the proposed methods.
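
The “pocket-bell” rule referred to above is the two-digit kana input scheme used on Japanese pagers: the first digit selects a consonant row of the 50-sound table and the second digit selects the vowel column. The following is a minimal Python sketch of that conversion, assuming the commonly used table (the ya and wa rows are simplified, and the paper's exact table, recognition-timing control, and cancel operation are not reproduced here); decode() is a hypothetical helper for illustration, not the authors' implementation.

# Assumed pocket-bell (pager) two-digit kana table:
# first digit = consonant row, second digit = vowel column.
KANA_TABLE = {
    1: "あいうえお",
    2: "かきくけこ",
    3: "さしすせそ",
    4: "たちつてと",
    5: "なにぬねの",
    6: "はひふへほ",
    7: "まみむめも",
    8: "やゆよ",    # simplified; real tables also assign symbols to this row
    9: "らりるれろ",
    0: "わをん",    # simplified
}

def decode(first_digit: int, second_digit: int) -> str:
    """Convert two recognized digits (e.g. two finger-count readings) into one kana."""
    row = KANA_TABLE[first_digit]
    return row[second_digit - 1]

# Example: the digit pair (2, 3) yields "ku" under this assumed table.
print(decode(2, 3))
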
Cite this article as:
T. Yamagishi and K. Umeda, “Input of Japanese Characters by Recognizing the Number of Fingers,” J. Robot. Mechatron., Vol.16 No.4, pp. 420-425, 2004.
