
JRM Vol.21 No.6 pp. 726-738
doi: 10.20965/jrm.2009.p0726
(2009)

Paper:

Fast Hand Feature Extraction Based on Connected Component Labeling, Distance Transform and Hough Transform

Le Dung and Makoto Mizukawa

Department of Electrical Engineering, Shibaura Institute of Technology, 3-7-5 Toyosu, Koto-ku, Tokyo 135-8548, Japan

Received:
May 11, 2009
Accepted:
September 30, 2009
Published:
December 20, 2009
Keywords:
hand feature extraction, fingertip positioning, distance transform, Hough transform, connected component labeling
Abstract

In hand gesture recognition or hand tracking systems that rely on hand modeling, it is usually necessary to extract hand features from a hand image. This paper presents a new robust method based on connected component labeling (CCL), the distance transform (DT) and the Hough transform (HT) to quickly and precisely extract the center of the hand, and the directions and fingertip positions of all outstretched fingers, from a skin color detection image. First, the method uses a simple but reliable technique, performed on both the connected component labeling image and the distance transform image, to extract the center of the hand and a set of feature pixels, called distance-based feature pixels. Then, the Hough transform is computed on these feature pixels to detect all outstretched fingers as lines. From the line detection result, the finger directions and fingertip positions are determined easily and precisely. The method works quickly and accurately even when the skin color detection image includes hands, faces and some noise. Moreover, since the number of distance-based feature pixels is usually small, the Hough-transform-based line detection can be performed very fast. This can satisfy the demands of a real-time human-robot interaction system based on hand gestures or hand tracking.
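The first stage described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a 3-4 chamfer approximation of the distance transform on a toy binary "palm" mask, takes the DT maximum as the hand center, and thresholds the DT to obtain distance-based feature pixels. The mask, the chamfer weights, and the 0.5 threshold are illustrative assumptions; the paper's own thresholding and the subsequent Hough line voting on the feature pixels are not reproduced here.

```python
import numpy as np

def chamfer_dt(mask):
    """Two-pass 3-4 chamfer approximation of the distance transform.

    mask: boolean array, True = foreground (skin) pixels.
    Returns integer distances (3 per axial step, 4 per diagonal step).
    """
    h, w = mask.shape
    INF = 10**6
    d = np.where(mask, INF, 0).astype(np.int64)
    # Forward pass: propagate distances from the top-left.
    for y in range(h):
        for x in range(w):
            if d[y, x]:
                best = d[y, x]
                if x > 0:
                    best = min(best, d[y, x - 1] + 3)
                if y > 0:
                    best = min(best, d[y - 1, x] + 3)
                    if x > 0:
                        best = min(best, d[y - 1, x - 1] + 4)
                    if x < w - 1:
                        best = min(best, d[y - 1, x + 1] + 4)
                d[y, x] = best
    # Backward pass: propagate distances from the bottom-right.
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            best = d[y, x]
            if x < w - 1:
                best = min(best, d[y, x + 1] + 3)
            if y < h - 1:
                best = min(best, d[y + 1, x] + 3)
                if x < w - 1:
                    best = min(best, d[y + 1, x + 1] + 4)
                if x > 0:
                    best = min(best, d[y + 1, x - 1] + 4)
            d[y, x] = best
    return d

# Toy hand-like blob: a filled 5x5 square standing in for the palm region
# (in the paper this would be the largest skin-colored connected component).
mask = np.zeros((9, 9), dtype=bool)
mask[2:7, 2:7] = True

dt = chamfer_dt(mask)
# Hand center = the foreground pixel farthest from the background.
cy, cx = np.unravel_index(np.argmax(dt), dt.shape)
# Distance-based feature pixels: foreground pixels whose DT value
# exceeds a fraction (here 0.5, an illustrative choice) of the maximum.
feature = mask & (dt >= 0.5 * dt.max())

print((int(cy), int(cx)), int(feature.sum()))  # → (4, 4) 9
```

In a full pipeline the Hough transform would then vote over only these feature pixels; because they are far fewer than the raw contour pixels, the line detection step stays cheap, which is the speed argument the abstract makes.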

Cite this article as:
Le Dung and Makoto Mizukawa, “Fast Hand Feature Extraction Based on Connected Component Labeling, Distance Transform and Hough Transform,” J. Robot. Mechatron., Vol.21, No.6, pp. 726-738, 2009.
References
  [1] M. Hasanuzzaman, V. Ampornaramveth, Z. Tao, M. A. Bhuiyan, Y. Shirai, and H. Ueno, “Real-time Vision-based Gesture Recognition for Human Robot Interaction,” IEEE Int. Conf. on Robotics and Biomimetics (ROBIO2004), pp. 413-418, 2004.
  [2] E. Sato, T. Yamaguchi, and F. Harashima, “Natural Interface Using Pointing Behavior for Human-Robot Gestural Interaction,” IEEE Trans. on Industrial Electronics, Vol.54, No.2, pp. 1105-1112, 2007.
  [3] K. Oka, Y. Sato, and H. Koike, “Real-time fingertip tracking and gesture recognition,” IEEE Computer Graphics and Applications, Vol.22, No.6, pp. 64-71, 2002.
  [4] X. Iturbe, A. Altuna, A. Ruiz de Olano, and I. Martinez, “VHDL described finger tracking system for real-time human-machine interaction,” Int. Conf. on Signals and Electronic Systems (ICSES '08), pp. 171-176, 2008.
  [5] Y. Sriboonruang, P. Kumhom, and K. Chamnongthai, “Hand Gesture Interface for Computer Board Game Control,” IEEE Tenth Int. Symposium on Consumer Electronics (ISCE), pp. 1-5, 2006.
  [6] S. M. Dominguez, T. Keaton, and A. H. Sayed, “A Robust Finger Tracking Method for Multimodal Wearable Computer Interfacing,” IEEE Trans. on Multimedia, Vol.8, No.5, pp. 956-972, 2006.
  [7] G. Boreki and A. Zimmer, “Hand geometry: a new approach for feature extraction,” Fourth IEEE Workshop on Automatic Identification Advanced Technologies, pp. 149-154, 2005.
  [8] L. Gupta and M. Suwei, “Gesture-based interaction and communication: automated classification of hand gesture contours,” IEEE Trans. on Systems, Man, and Cybernetics, Part C: Applications and Reviews, Vol.31, No.1, pp. 114-120, 2001.
  [9] L. Jin, D. Yang, L.-X. Zhen, and J.-C. Huang, “A Novel Vision based Finger-writing Character Recognition System,” Proc. of ICPR2006, Hong Kong, August 20-24, 2006.
  [10] J.-M. Kim and W.-K. Lee, “Hand Shape Recognition Using Fingertips,” Fifth Int. Conf. on Fuzzy Systems and Knowledge Discovery (FSKD '08), Vol.4, pp. 44-48, 2008.
  [11] K.-J. Hsiao, T.-W. Chen, and S.-Y. Chien, “Fast fingertip positioning by combining particle filtering with particle random diffusion,” in Proc. of the IEEE Int. Conf. on Multimedia and Expo (ICME2008), pp. 977-980, 2008.
  [12] A. Rosenfeld and J. Pfaltz, “Distance Functions in Digital Pictures,” Pattern Recognition, Vol.1, pp. 33-61, 1968.
  [13] L. Shapiro and G. Stockman, “Computer Vision,” Prentice Hall, pp. 69-73, 2002.
  [14] J. Y. Lee and S. I. Yoo, “An elliptical boundary model for skin color detection,” in Proc. of Int. Conf. on Imaging Science, Systems, and Technology, 2002.
  [15] D. H. Ballard, “Generalizing the Hough Transform to Detect Arbitrary Shapes,” Pattern Recognition, Vol.13, pp. 111-122, 1981.
  [16] R. O. Duda and P. E. Hart, “Use of the Hough Transformation to Detect Lines and Curves in Pictures,” Comm. ACM, Vol.15, pp. 11-15, 1972.
  [17] W. Ying and T. S. Huang, “Hand modeling, analysis and recognition,” IEEE Signal Processing Magazine, Vol.18, pp. 55-60, 2001.
  [18] L. Bretzner, I. Laptev, and T. Lindeberg, “Hand gesture recognition using multi-scale colour features, hierarchical models and particle filtering,” in Proc. of the Fifth IEEE Int. Conf. on Automatic Face and Gesture Recognition (AFGR), IEEE Computer Society, pp. 423-428, 2002.
  [19] T. Lindeberg, “Feature Detection with Automatic Scale Selection,” Int. J. of Computer Vision, Vol.30, pp. 79-116, 2004.
  [20] H. Bay, T. Tuytelaars, and L. V. Gool, “Surf: Speeded up robust features,” in Proc. of European Conf. on Computer Vision, pp. 404-417, 2006.
  [21] F. Yikai, C. Jian, W. Kongqiao, and L. Hanqing, “Hand Gesture Recognition Using Fast Multi-scale Analysis,” in Proc. of the Fourth Int. Conf. on Image and Graphics (ICIG '07), IEEE Computer Society, pp. 694-698, 2007.
  [22] P. Viola and M. Jones, “Rapid Object Detection Using a Boosted Cascade of Simple Features,” in Proc. of Computer Vision and Pattern Recognition, pp. 511-518, 2001.
  [23] J. Triesch and C. von der Malsburg, “Robust classification of hand postures against complex background,” in Proc. of Int. Conf. on Face and Gesture Recognition, pp. 170-175, 1999.
  [24] P. Kakumanu, S. Makrogiannis, and N. Bourbakis, “A survey of skin-color modeling and detection methods,” Pattern Recognition, Vol.40, No.3, pp. 1106-1122, 2007.
  [25] S. Chang, “Extracting Skeletons from Distance Maps,” IJCSNS Int. J. of Computer Science and Network Security, Vol.7, No.7, pp. 213-219, 2007.
  [26] K. Hoshino, E. Tamaki, and T. Tanimoto, “Copycat hand - Robot hand imitating human motions at high speed and with high accuracy,” Advanced Robotics, Vol.21, No.15, pp. 1743-1761, 2007.
