
J. Adv. Comput. Intell. Intell. Inform. (JACIII), Vol.14, No.3, pp. 316-322, 2010
doi: 10.20965/jaciii.2010.p0316

Paper:

Investigation on Robot User Interface for Information Access

Yasufumi Takama and Hiroki Namba

Graduate School of System Design, Tokyo Metropolitan University, 6-6 Asahigaoka, Hino, Tokyo 191-0065, Japan

Received:
November 19, 2009
Accepted:
January 6, 2010
Published:
April 20, 2010
Keywords:
robot UI, partner robot, human-robot interaction, information recommendation
Abstract

This paper investigates the characteristics of robots for non-industrial use, such as home robots, when they are used as an interface for accessing information. Although information support is one of the important capabilities home robots should have, the merits of accessing information via a robot, compared with access via a PC or a mobile phone, have yet to be fully explored. This paper focuses on the physical presence of robots, which is supposed to be important when robots provide users with information. To investigate these merits, two experiments with participants are performed. The main contributions of the paper are as follows. First, it is shown that a robot can effectively attract participants through its movements, even when they pay little attention to it. Second, the possibility of using robot actions to provide additional information about the information being accessed by the participant is investigated. Finally, the effect of a robot User Interface (UI) prototype on communication among users when providing information to them is also investigated. The obtained results support the significance of information support by home robots and will be useful for designing home robots with an information support facility.

Cite this article as:
Yasufumi Takama and Hiroki Namba, “Investigation on Robot User Interface for Information Access,” J. Adv. Comput. Intell. Intell. Inform., Vol.14, No.3, pp. 316-322, 2010.
References
[1] N. Matsuhira and H. Ogawa, “Trends in Development of Home Robots Leading Advanced Technologies,” Toshiba Review, Vol.59, No.9, 2004. (in Japanese)
[2] S. Satake, H. Kawashima, and M. Imai, “Browsing Robot: Browsing Web Contents through a Communication Robot,” IPSJ SIG Notes ICS, Vol.2004, No.85, pp. 49-55, 2004. (in Japanese)
[3] K. Ueno, T. Hasegawa, and A. Ohsuga, “Agent Enabling Robots to Join Ubiquitous World,” Toshiba Review, Vol.59, No.9, pp. 45-48, 2004. (in Japanese)
[4] H. Ueda et al., “Real Living Experiments with Conversational Robots at Ubiquitous-Home,” Review of NICT, Vol.53, No.3, pp. 145-152, 2007. (in Japanese)
[5] Y. Takama, H. Namba, Y. Iwase, Y. Muto, and S. Hattori, “Concept of Humatronics and its Application to Human-Robot Communication Support Under TV Watching Environment,” J. of Advanced Computational Intelligence and Intelligent Informatics, Vol.12, No.6, pp. 494-502, 2008.
[6] Ministry of Economy, Trade and Industry, “Report of Working Group on Robot Policy,” 2006. (in Japanese)
http://www.meti.go.jp/press/20060516002/robot-houkokushoset.pdf
[7] T. Shibata, “An Overview of Human Interactive Robot for Psychological Enrichment,” Proc. of the IEEE, Vol.92, No.11, pp. 1749-1758, 2004.
[8] iRobot Corporation, http://store.irobot.com/corp/index.jsp
[9] J. Lee, N. Ando, and H. Hashimoto, “Mobile Robot Architecture in Intelligent Space,” J. of Robotics and Mechatronics, Vol.11, No.2, pp. 165-170, 1999.
[10] M. Imai, Y. Hirota, S. Satake, and H. Kawashima, “Semantic Sensor Network for Physically Grounded Applications,” ICARCV2006, p. 1493, 2006.
[11] M. Narita and M. Shimamura, “A Report on RSi (Robot Service Initiative) Activities,” 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 265-268, 2005.
[12] K. Ikeda, “Communication,” The University of Tokyo Press, 2000. (in Japanese)
[13] A. Yamaguchi, Y. Yano, S. Doki, and S. Okuma, “A Study of Emotional Motion Description by Motion Modification Rules Using Adjectival Expressions,” IEEE Int. Conf. on SMC, pp. 2837-2842, 2006.
[14] Y. Yamazaki, F. Dong, Y. Uehara, Y. Hatakeyama, H. Nobuhara, Y. Takama, and K. Hirota, “Mentality Expression in an Affinity Pleasure-Arousal Space Using the Ocular and Eyelid Motion of an Eye Robot,” SCIS&ISIS2006, pp. 422-425, 2006.
[15] H. Kozima, C. Nakagawa, and Y. Yasuda, “Robot-Mediated Communication for Autism Therapy,” J. of Information Processing Society of Japan, Vol.49, No.1, pp. 36-42, 2008. (in Japanese)
[16] T. Kanda, H. Ishiguro, T. Ono, M. Imai, and R. Nakatsu, “Effects of Observation of Robot-Robot Communication on Human-Robot Communication,” IEICE Trans. on Information and Systems, Vol.J85-D-1, No.7, pp. 691-700, 2002. (in Japanese)
[17] A. Taylor and R. Harper, “Switching On to Switch Off: An Analysis of Routine TV Watching Habits and Their Implications for Electronic Programme Guide Design,” usableiTV, 1, pp. 7-13, 2002.
[18] ZMP Inc., Nuvo, http://nuvo.jp/nuvo_home_e.html
[19] N. Ando, T. Suehiro, K. Kitagaki, T. Kotoku, and W. Yoon, “RT-Component Object Model in RT-Middleware – Distributed Component Middleware for RT (Robot Technology) –,” 2005 IEEE Int. Symposium on Computational Intelligence in Robotics and Automation (CIRA2005), We-B2-5, 2005.
