JRM Vol.32 No.1 pp. 86-96
doi: 10.20965/jrm.2020.p0086


Effect of Robot’s Play-Biting in Non-Verbal Communication

Kayako Nakagawa*,**, Reo Matsumura*,**, and Masahiro Shiomi**

*Karakuri Products Inc.
2-1-1 Nihonbashi-hongokucho, Chuo-ku, Tokyo 103-0021, Japan

**Advanced Telecommunications Research Institute International (ATR)
2-2-2 Hikaridai, Keihanna Science City, Kyoto 619-0288, Japan

Received: August 20, 2019
Accepted: November 29, 2019
Published: February 20, 2020

Keywords: human-robot interaction, play-biting, touch, therapy, stress

This paper focuses on “play-biting” as a touch communication method for robots. We investigated an appropriate play-biting behavior and its effect on interaction. Touching actions have positive effects in human-robot interaction; however, because biting is a defenseless act, it may also have a negative effect. Therefore, in Experiment 1 we first examined the biting manner and the appearance of the robot using a virtual play-biting system. Next, based on the results of Experiment 1, the play-biting system was implemented in a stuffed animal robot. In Experiment 2, we verified the impressions created by the robot and its effect on mitigating stress. Consequently, play-biting communication gave a positive and lively impression and reduced a physiological index of stress, in comparison to only touching the robot.

Play-biting communication with the robot


Cite this article as:
K. Nakagawa, R. Matsumura, and M. Shiomi, “Effect of Robot’s Play-Biting in Non-Verbal Communication,” J. Robot. Mechatron., Vol.32 No.1, pp. 86-96, 2020.
References:
  [1] J. D. Fisher, M. Rytting, and R. Heslin, “Hands Touching Hands: Affective and Evaluative Effects of an Interpersonal Touch,” Sociometry, Vol.39, pp. 416-421, 1976.
  [2] N. Gueguen, C. Jacob, and G. Boulbry, “The effect of touch on compliance with a restaurant’s employee suggestion,” Int. J. of Hospitality Management, Vol.26, No.4, pp. 1019-1023, 2007.
  [3] S. J. Whitcher and J. Fisher, “Multidimensional reaction to therapeutic touch in a hospital setting,” J. of Personality and Social Psychology, Vol.37, No.1, pp. 87-96, 1979.
  [4] K. Nakagawa, M. Shiomi, K. Shinozawa, R. Matsumura, H. Ishiguro, and N. Hagita, “Effect of Robot’s active touch on people’s motivation,” Proc. of the 6th ACM/IEEE Int. Conf. on Human-Robot Interaction (HRI 2011), pp. 465-472, 2011.
  [5] M. Shiomi, K. Nakagawa, K. Shinozawa, R. Matsumura, H. Ishiguro, and N. Hagita, “Does A Robot’s Touch Encourage Human Effort?,” Int. J. of Social Robotics, Vol.9, No.1, pp. 5-15, 2017.
  [6] M. Shiomi and N. Hagita, “Audio-Visual Stimuli Change not Only Robot’s Hug Impressions but Also Its Stress-Buffering Effects,” Int. J. of Social Robotics, February 8, 2019.
  [7] T. Shibata and J. F. Coughlin, “Trends of Robot Therapy with Neurological Therapeutic Seal Robot, PARO,” J. Robot. Mechatron., Vol.26, No.4, pp. 418-425, 2014.
  [8] S. Hauser, S. McIntyre, A. Israr, H. Olausson, and G. Gerling, “Uncovering Human-to-Human Physical Interactions that Underlie Emotional and Affective Touch Communication,” Proc. of IEEE World Haptics Conf., pp. 407-412, 2019.
  [9] T. Kasuga and M. Hashimoto, “Human-Robot Handshaking using Neural Oscillators,” Proc. of the 2005 IEEE Int. Conf. on Robotics and Automation, pp. 3802-3807, 2005.
  [10] Z. Wang, E. Giannopoulos, M. Slater, and A. Peer, “Handshake: Realistic Human-Robot Interaction in Haptic Enhanced Virtual Reality,” Presence, Vol.20, No.4, pp. 371-392, 2011.
  [11] M. Jindai, T. Watanabe, S. Shibata, and T. Yamamoto, “Development of a Handshake Robot System Based on a Handshake Approaching Motion Model,” J. Robot. Mechatron., Vol.20, No.4, pp. 650-659, 2008.
  [12] J. N. Bailenson, N. Yee, S. Brave, D. Merget, and D. Koslow, “Virtual Interpersonal Touch: Expressing and Recognizing Emotions Through Haptic Devices,” Human-Computer Interaction, Vol.22, No.3, pp. 325-353, 2007.
  [13] Y. Yamashita, H. Ishihara, T. Ikeda, and M. Asada, “Investigation of Causal Relationship Between Touch Sensations of Robots and Personality Impressions by Path Analysis,” Int. J. of Social Robotics, Vol.11, No.1, pp. 141-150, 2019.
  [14] H. Fukuda, M. Shiomi, K. Nakagawa, and K. Ueda, “‘Midas touch’ in human-robot interaction: Evidence from event-related potentials during the ultimatum game,” Proc. of the 7th Annual ACM/IEEE Int. Conf. on Human-Robot Interaction (HRI’12), pp. 131-132, 2012.
  [15] M. Shiomi, A. Nakata, M. Kanbara, and N. Hagita, “A hug from a robot encourages prosocial behavior,” 2017 26th IEEE Int. Symp. on Robot and Human Interactive Communication (RO-MAN), pp. 418-423, 2017.
  [16] H. Sumioka, A. Nakae, R. Kanai, and H. Ishiguro, “Huggable communication medium decreases cortisol levels,” Scientific Reports, Vol.3, 3034, 2013.
  [17] H. Cramer, A. K. Amin, V. Evers, and N. Kemper, “Touched by robots: Effects of physical contact and robot proactiveness,” Workshop on the Reign of Catz and Dogz in CHI’09, 2009.
  [18] C. J. A. M. Willemse, A. Toet, and J. B. F. van Erp, “Affective and Behavioral Responses to Robot-Initiated Social Touch: Toward Understanding the Opportunities and Limitations of Physical Contact in Human-Robot Interaction,” Frontiers in ICT, Vol.4, 12, 2017.
  [19] M. Kanoh, “Babyloid,” J. Robot. Mechatron., Vol.26, No.4, pp. 513-514, 2014.
  [20] R. Hayashi and S. Kato, “Psychological effects of physical embodiment in artificial pet therapy,” Artificial Life and Robotics, Vol.22, pp. 58-63, 2016.
  [21] R. Hayashi and S. Kato, “Importance of Soft Tactility on Robot-Assisted Therapy,” J. of Japan Society of Kansei Engineering, Vol.18, No.1, pp. 23-29, 2019.
  [22] W. A. Bainbridge, J. Hart, E. S. Kim, and B. Scassellati, “The effect of presence on human-robot interaction,” The 17th IEEE Int. Symp. on Robot and Human Interactive Communication (RO-MAN 2008), pp. 701-706, 2008.
  [23] K. Shinozawa, F. Naya, J. Yamato, and K. Kogure, “Differences in effect of robot and screen agent recommendations on human decision-making,” Int. J. of Human-Computer Studies, Vol.62, No.2, pp. 267-279, 2005.
  [24] A. Powers, S. Kiesler, S. Fussell, and C. Torrey, “Comparing a Computer Agent with a Humanoid Robot,” Proc. of the ACM/IEEE Int. Conf. on Human-Robot Interaction (HRI ’07), pp. 145-152, 2007.
  [25] L. Hoffmann and N. C. Krämer, “Investigating the Effects of Physical and Virtual Embodiment in Task-oriented and Conversational Contexts,” Int. J. Hum.-Comput. Stud., Vol.71, Nos.7-8, pp. 763-774, 2013.
  [26] M. B. Mathur and D. B. Reichling, “Navigating a social world with robot partners: A quantitative cartography of the Uncanny Valley,” Cognition, Vol.146, pp. 22-32, 2016.
  [27] C. Bartneck, D. Kulić, E. Croft, and S. Zoghbi, “Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots,” Int. J. of Social Robotics, Vol.1, No.1, pp. 71-81, 2009.
  [28] K. Tokuda, “The Validity of Temporary Mood Scale,” Ritsumeikan J. of Human Sciences, No.22, pp. 1-6, 2011.
  [29] M. Malik, J. T. Bigger, A. J. Camm, R. E. Kleiger, A. Malliani, A. J. Moss, and P. J. Schwartz, “Heart rate variability: Standards of measurement, physiological interpretation, and clinical use,” European Heart J., Vol.17, No.3, pp. 354-381, 1996.
  [30] X. Zheng, M. Shiomi, T. Minato, and H. Ishiguro, “What kinds of robot’s touch will match expressed emotions?,” IEEE Robotics and Automation Letters, Vol.5, No.1, pp. 127-134, 2020.

