JACIII Vol.21 No.4, pp. 675-685, 2017
doi: 10.20965/jaciii.2017.p0675

Paper:

Retaining Human-Robots Conversation: Comparing Single Robot to Multiple Robots in a Real Event

Takamasa Iio, Yuichiro Yoshikawa, and Hiroshi Ishiguro

Osaka University / JST ERATO
1-3 Machikaneyama, Toyonaka, Osaka 560-8531, Japan

Received: November 20, 2016
Accepted: May 2, 2017
Published: July 20, 2017

Keywords: human-robot interaction, multiple robots, social robot, field trial
Abstract

In human-robot conversation in real environments, poor speech recognition and unnatural response generation are critical issues. Most autonomous conversational robotic systems avoid these issues by restricting user input and robot responses. However, such restrictions often render the interaction boring because the conversation becomes predictable. In this study, we propose the use of multiple robots as a solution to this problem. To explore the effect of multiple robots on a conversation, we developed an autonomous conversational robotic system and conducted a field trial at a real event. Our system adopted a button interface that restricted user input to positive or negative intentions, and it maintained a conversation by choosing the most suitable of the prepared static scenarios. Through the field trial, we found that visitors who conversed with multiple robots continued their conversations for longer periods, and the experience improved their impression of the conversation, compared with visitors who conversed with a single robot.
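The dialogue strategy described above (user input restricted to a positive or negative button press, with the robot advancing along prepared static scenarios) can be illustrated with a minimal sketch. This is an assumed structure for illustration only, not the authors' implementation; the class and function names are hypothetical.

```python
# Minimal sketch of button-restricted scenario selection (assumed design,
# not the authors' actual system): each scenario node holds one robot
# utterance and a prepared follow-up for each of the two button presses.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Scenario:
    """A static scenario node: an utterance plus follow-ups per button press."""
    utterance: str
    on_positive: Optional["Scenario"] = None
    on_negative: Optional["Scenario"] = None


def run_turn(current: Scenario, pressed_positive: bool) -> Optional[Scenario]:
    """Advance the conversation one turn based on a single button press.

    Because input is limited to two intentions, the system never needs
    speech recognition: it only selects the prepared branch.
    """
    return current.on_positive if pressed_positive else current.on_negative


# Hypothetical scenario tree for illustration.
root = Scenario(
    "Do you like robots?",
    on_positive=Scenario("Great! Then let us tell you more."),
    on_negative=Scenario("That's okay. Maybe we can change your mind."),
)

nxt = run_turn(root, pressed_positive=True)
```

Restricting input this way trades expressiveness for robustness: every user action maps to a prepared branch, which is what keeps the conversation coherent without speech recognition.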

