
IJAT Vol.11 No.3, pp. 378-384, 2017
doi: 10.20965/ijat.2017.p0378

Paper:

Three-Dimensional Input System Employing Pinching Gestures for Robot Design

Kiyoshi Hoshino and Keita Hamamatsu

University of Tsukuba
1-1-1 Tennodai, Tsukuba 305-8573, Japan


Received: October 5, 2016
Accepted: February 21, 2017
Online released: April 28, 2017
Published: May 5, 2017
Keywords: three-dimensional input system, pinching gestures, depth sensor, robot design
Abstract
Several studies have been conducted on input interfaces capable of recognizing gestures, but most of them use the user’s fingers to enter position data. Such finger-based input interfaces have difficulty providing a mouse-like click-and-drag function, and some require the user to perform uncomfortable gestures. When people pinch an object, however, their thumb and index finger essentially either come into contact with each other or separate from each other. These pinching gestures therefore offer advantages that can benefit input interfaces. This study proposes a method for detecting 3D finger positions and estimating 3D hand postures during pinching gestures from depth images captured by a depth sensor, particularly from the viewpoint of robot design. The method offers several benefits: button-click-like input by means of contact between the fingers; comfortable gestures resembling those of daily life; a clicking action independent of the input of positions and postures; and a clear distinction between ON and OFF. To evaluate the proposed 3D input interface, the authors designed real products using the system and a 3D printer; the results suggest that users can design precise and fine 3D objects with comfortable, everyday gestures and high usability.
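
The binary nature of the pinch (thumb and index finger either touching or apart) is what gives the interface its clear ON/OFF click. As a rough illustration only (a minimal sketch, not the authors' implementation; the depth range, the hole-based contact test, and all function names here are assumptions), thumb-index contact can be detected from a single depth frame by segmenting the hand and checking whether its silhouette contains a hole, since touching fingers close a loop:

import numpy as np
import cv2

def detect_pinch(depth_mm: np.ndarray, near: int = 300, far: int = 800) -> bool:
    """Return True when the thumb-index loop is closed (click "ON").

    depth_mm: 2D array of per-pixel depth values in millimeters.
    near/far: assumed working range of the hand in front of the sensor.
    """
    # 1. Segment the hand: keep pixels within the assumed depth range.
    hand = ((depth_mm > near) & (depth_mm < far)).astype(np.uint8) * 255
    # Remove the speckle noise typical of depth sensors.
    hand = cv2.morphologyEx(hand, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

    # 2. Extract contours with a two-level hierarchy; a closed thumb-index
    #    loop appears as an inner (child) contour, i.e. a hole in the
    #    hand silhouette.
    contours, hierarchy = cv2.findContours(hand, cv2.RETR_CCOMP,
                                           cv2.CHAIN_APPROX_SIMPLE)
    if hierarchy is None:
        return False
    for i, info in enumerate(hierarchy[0]):
        is_hole = info[3] != -1                           # has a parent contour
        if is_hole and cv2.contourArea(contours[i]) > 50:  # ignore tiny noise
            return True                                    # fingers touching: ON
    return False                                           # fingers apart: OFF

A hole test of this kind yields an unambiguous ON/OFF state without any distance threshold on the fingertips, which matches the clear ON/OFF distinction claimed in the abstract; the paper itself additionally estimates 3D finger positions and hand postures, which this sketch does not attempt.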
Cite this article as:
K. Hoshino and K. Hamamatsu, “Three-Dimensional Input System Employing Pinching Gestures for Robot Design,” Int. J. Automation Technol., Vol.11 No.3, pp. 378-384, 2017.
