
JACIII Vol.21 No.4, pp. 716-721, 2017
doi: 10.20965/jaciii.2017.p0716

Paper:

Study on Motion of Sight Line of Communication Robot in Standby State

Akinari Kurosu* and Tomomi Hashimoto**

*Meitec Fielders Co., Ltd.
Sumitomo Fudosan Aoyama Bldg. Nishikan, 8-5-26 Akasaka, Minato-ku, Tokyo 107-0052, Japan

**Department of Information Systems, Faculty of Engineering, Saitama Institute of Technology
1690 Fusaiji, Fukaya-shi, Saitama 369-0293, Japan

Received: December 28, 2016
Accepted: March 10, 2017
Published: July 20, 2017
Keywords: communication robot, eye robot, standby time
Abstract

The crucial role that gaze plays in nonverbal communication during social interactions has been widely noted. In this study, an eye robot with two degrees of freedom was developed as a communication robot, and the motion of its line of sight was investigated in the standby state, that is, when the robot is not communicating with anyone. We compared and evaluated the impressions that the robot, and a person imitating the robot's motion, made on observers, and demonstrated that both the robot and the person made a better impression when the line of sight was kept stationary.
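
For illustration only, the following is a minimal sketch of how a standby-gaze policy for a two-degree-of-freedom (pan/tilt) eye robot might be expressed, contrasting a stationary gaze with random wandering of the line of sight. The class name, angle limits, and update loop are assumptions made for this sketch and are not taken from the paper; servo interfacing is omitted.

import random
import time

class StandbyGaze:
    """Generates pan/tilt gaze targets (in degrees) while the robot is idle."""

    def __init__(self, mode="stationary", limit_deg=15.0):
        # "stationary" holds a fixed, straight-ahead gaze;
        # "wandering" produces small random saccades within the assumed mechanical limits.
        self.mode = mode
        self.limit_deg = limit_deg

    def next_target(self):
        if self.mode == "stationary":
            return 0.0, 0.0
        pan = random.uniform(-self.limit_deg, self.limit_deg)
        tilt = random.uniform(-self.limit_deg, self.limit_deg)
        return pan, tilt

if __name__ == "__main__":
    # Print a few standby gaze targets for the stationary condition.
    gaze = StandbyGaze(mode="stationary")
    for _ in range(5):
        pan, tilt = gaze.next_target()
        print(f"pan={pan:+.1f} deg, tilt={tilt:+.1f} deg")
        time.sleep(0.5)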

Cite this article as:
A. Kurosu and T. Hashimoto, “Study on Motion of Sight Line of Communication Robot in Standby State,” J. Adv. Comput. Intell. Intell. Inform., Vol.21 No.4, pp. 716-721, 2017.
