JACIII Vol.11 No.3 pp. 276-281 (2007)
doi: 10.20965/jaciii.2007.p0276

Paper:

Toward Natural Communication: Human-Robot Gestural Interaction Using Pointing

Eri Sato*, Aika Nakajima**, Jun Nakazato**, and Toru Yamaguchi**

*Tokyo Metropolitan Institute of Technology, Yamaguchi Lab., 6-6 Asahigaoka, Hino, Tokyo 191-0065, Japan

**Tokyo Metropolitan University, Yamaguchi Lab., 6-6 Asahigaoka, Hino, Tokyo 191-0065, Japan

Received: April 19, 2006
Accepted: July 29, 2006
Published: March 20, 2007
Keywords: gestural interface, human-machine interaction, pointing, behavior
Abstract
We are studying human-robot interaction based on interpersonal communication, focusing on pointing. Pointing, while useful in communicating with others, is highly context-dependent, making it difficult for robots to interpret accurately. We conducted three experiments on robot behavior, creating basic motions with a virtual robot because developing them directly on a real robot is time-consuming. We then had the virtual robot interact with two real robots with different degrees of freedom and ranges of movement.
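
The abstract does not spell out how pointing is interpreted or how motions are carried over to robots with different joint ranges, so the following Python sketch is only an illustration under assumptions of our own: it estimates a pointed-at floor position by extending the head-to-fingertip ray, then clamps the resulting aiming angle to each robot's joint limits as one simple way to drive robots with different ranges of movement from the same gesture. The function names, coordinates, and joint limits are hypothetical, not the authors' implementation.

    # Illustrative sketch only: the pointing model (head-to-fingertip ray
    # intersected with the floor plane) and the joint-limit clamping are
    # assumptions; the paper's own algorithm is not given in this abstract.
    import math

    def pointing_target(head, fingertip, floor_z=0.0):
        """Extend the ray from the head through the fingertip until it meets
        the plane z = floor_z; positions are (x, y, z) in metres. Returns the
        pointed-at (x, y) on the floor, or None if the arm is level or raised
        so the ray never reaches the floor."""
        dx, dy, dz = (fingertip[i] - head[i] for i in range(3))
        if dz >= 0:
            return None
        t = (floor_z - head[2]) / dz          # ray parameter at the floor
        return (head[0] + t * dx, head[1] + t * dy)

    def aim_command(target, robot_pos, pan_limits):
        """Convert a floor target into a pan angle for one robot, clamped to
        that robot's joint range so the same command can drive robots whose
        ranges of movement differ."""
        pan = math.atan2(target[1] - robot_pos[1], target[0] - robot_pos[0])
        lo, hi = pan_limits
        return max(lo, min(hi, pan))

    if __name__ == "__main__":
        head, hand = (0.0, 0.0, 1.6), (0.3, 0.1, 1.2)    # hypothetical input
        target = pointing_target(head, hand)              # -> (1.2, 0.4)
        # Two hypothetical robots whose pan joints cover different ranges.
        wide = aim_command(target, (1.0, -1.0, 0.0), (-math.pi, math.pi))
        narrow = aim_command(target, (1.0, -1.0, 0.0), (-math.pi / 4, math.pi / 4))
        print(f"target={target}, wide pan={wide:.2f}, clamped pan={narrow:.2f}")

Clamping is the crudest possible retargeting; it merely shows why robots with narrower joint ranges cannot reproduce the virtual robot's motion exactly, which is the kind of difference the experiments compare.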
Cite this article as:
E. Sato, A. Nakajima, J. Nakazato, and T. Yamaguchi, “Toward Natural Communication: Human-Robot Gestural Interaction Using Pointing,” J. Adv. Comput. Intell. Intell. Inform., Vol.11 No.3, pp. 276-281, 2007.
