JRM Vol.18 No.6 pp. 751-759
doi: 10.20965/jrm.2006.p0751


Gestures Recognition Based on the Fusion of Hand Positioning and Arm Gestures

Didier Coquin*, Eric Benoit*, Hideyuki Sawada**,
and Bogdan Ionescu*,***

*LISTIC - University of Savoie, Domaine Universitaire, B.P. 806, 74016 Annecy-Cedex, France

**Kagawa University, 2217-20 Hayashi-cho, Takamatsu, Kagawa 761-0396, Japan

***LAPI, University “Politehnica” Bucharest, 061071 Romania

Received: April 4, 2006 / Accepted: August 11, 2006 / Published: December 20, 2006
Keywords: hand positioning, arm gestures, fusion process, gesture recognition

To improve the link between operators and equipment, communication systems have begun using natural (user-oriented) languages such as speech and gestures. Our goal is to present gesture recognition based on the fusion of measurements from different sources. The sensors must capture at least the location and orientation of the hand, as is done here by a Dataglove and a video camera. The Dataglove gives the hand position, while the video camera gives the general arm gesture, representing the gesture’s physical and spatial properties through a two-dimensional (2D) skeleton representation of the arm. The two measurements are partly complementary and partly redundant. The application is distributed over intelligent cooperating sensors. We detail the measurement of hand positioning and arm gestures, the fusion process, and the implementation.
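The abstract describes fusing partly redundant, partly complementary gesture measurements from two sources. As a hedged illustration only (the paper's actual fusion rule is not given on this page), the sketch below combines two hypothetical possibility distributions over gesture classes with an adaptive possibilistic combination rule in the style of Dubois and Prade: a renormalized conjunction when the sources agree, tempered by a disjunctive fallback weighted by their conflict. All labels and values are invented.

```python
def fuse(pi_glove, pi_camera):
    """Adaptive (Dubois-Prade style) combination of two possibility
    distributions defined over the same set of gesture labels."""
    labels = pi_glove.keys()
    # h: degree of agreement between the two sources (height of their
    # intersection); h = 1 means full consistency, h = 0 total conflict.
    h = max(min(pi_glove[c], pi_camera[c]) for c in labels)
    fused = {}
    for c in labels:
        # Conjunctive part, renormalized: dominant when the sources agree.
        conj = (min(pi_glove[c], pi_camera[c]) / h) if h > 0 else 0.0
        # Disjunctive fallback, capped by the conflict level 1 - h.
        disj = min(max(pi_glove[c], pi_camera[c]), 1.0 - h)
        fused[c] = max(conj, disj)
    return fused

# Hypothetical example: the glove strongly favors "point", while the
# camera hesitates between "point" and "grasp".
pi_glove = {"point": 1.0, "grasp": 0.4, "wave": 0.1}
pi_camera = {"point": 0.8, "grasp": 0.9, "wave": 0.0}
fused = fuse(pi_glove, pi_camera)
recognized = max(fused, key=fused.get)  # -> "point"
```

Because the two sources largely agree here (h = 0.8), the renormalized conjunction dominates and the fused distribution keeps "point" at possibility 1.0, illustrating how redundancy reinforces a decision while conflict would instead widen the set of plausible gestures.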

Cite this article as:
Didier Coquin, Eric Benoit, Hideyuki Sawada, and Bogdan Ionescu, “Gestures Recognition Based on the Fusion of Hand Positioning and Arm Gestures,” J. Robot. Mechatron., Vol.18, No.6, pp. 751-759, 2006.

