
JRM Vol.32 No.1 pp. 224-235
doi: 10.20965/jrm.2020.p0224
(2020)

Paper:

Investigation of Robot Expression Style in Human-Robot Interaction

Wei-Fen Hsieh, Eri Sato-Shimokawara, and Toru Yamaguchi

Tokyo Metropolitan University
6-6 Asahigaoka, Hino, Tokyo 191-0065, Japan

Received:
May 10, 2019
Accepted:
November 19, 2019
Published:
February 20, 2020
Keywords:
human-robot interaction, robot impression analysis, robot expression style, nonverbal communication
Abstract

In our daily conversation, we obtain considerable information from our interlocutor’s nonverbal behaviors, such as gaze and gestures. Several studies have shown that nonverbal messages are prominent factors in smoothing the process of human-robot interaction. Our previous studies have shown that not only a robot’s appearance but also its gestures, tone, and other nonverbal factors influence a person’s impression of it. This paper presents an analysis of the impressions made when human motions are implemented on a humanoid robot; experiments were conducted in which participants evaluated the impressions made by different robot expressions. The results showed the relation between robot expression patterns and human preferences. To further investigate the biofeedback elicited by different robot expression styles, a scenario-based experiment was conducted. The results revealed that people’s emotions can indeed be affected by robot behavior, and that the robot’s style of expression is the factor that most strongly influences whether it is perceived as friendly. These results suggest that it is potentially useful to incorporate our concept into a robot system to meet individual needs.
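The abstract refers to biofeedback elicited by different robot expression styles without specifying the processing involved. As a purely illustrative sketch, the snippet below shows one common way such physiological feedback is quantified: the LF/HF ratio of heart-rate variability, computed from inter-beat (RR) intervals. This is an assumption for illustration only, not the authors' actual pipeline; the function name, sampling rate, and synthetic data are hypothetical.

import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch
from scipy.integrate import trapezoid

def lf_hf_ratio(rr_ms, fs=4.0):
    """Estimate the LF/HF heart-rate-variability ratio from RR intervals given in milliseconds."""
    rr_s = np.asarray(rr_ms, dtype=float) / 1000.0    # inter-beat intervals in seconds
    t = np.cumsum(rr_s)                                # beat timestamps
    grid = np.arange(t[0], t[-1], 1.0 / fs)            # uniform time grid (default 4 Hz)
    rr_even = interp1d(t, rr_s, kind="cubic")(grid)    # evenly resampled RR series
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(grid)))
    lf_band = (f >= 0.04) & (f < 0.15)                 # standard LF band: 0.04-0.15 Hz
    hf_band = (f >= 0.15) & (f < 0.40)                 # standard HF band: 0.15-0.40 Hz
    lf = trapezoid(pxx[lf_band], f[lf_band])
    hf = trapezoid(pxx[hf_band], f[hf_band])
    return lf / hf

# Toy usage with synthetic RR intervals around 850 ms (about 70 bpm):
rng = np.random.default_rng(0)
rr = 850 + 30 * rng.standard_normal(300)
print(f"LF/HF = {lf_hf_ratio(rr):.2f}")

A higher LF/HF ratio is conventionally read as a shift toward sympathetic activity; whether that mapping applies to the scenario-based experiment here is not claimed by this sketch.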

Pepper interacting with a participant

Cite this article as:
W. Hsieh, E. Sato-Shimokawara, and T. Yamaguchi, “Investigation of Robot Expression Style in Human-Robot Interaction,” J. Robot. Mechatron., Vol.32 No.1, pp. 224-235, 2020.
