JRM Vol.32 No.1 pp. 51-58
doi: 10.20965/jrm.2020.p0051


How Can Robots Make People Feel Intimacy Through Touch?

Xiqian Zheng*,**, Masahiro Shiomi*, Takashi Minato*, and Hiroshi Ishiguro*,**

*Advanced Telecommunications Research Institute International (ATR)
2-2-2 Hikaridai Seika-cho, Sorakugun, Kyoto 619-0288, Japan

**Graduate School of Engineering Science, Osaka University
1-3 Machikaneyamacho, Toyonaka, Osaka 560-0043, Japan

Received: July 19, 2019
Accepted: November 13, 2019
Published: February 20, 2020
Keywords: human-robot interaction, haptic interaction, perceived intimacy

[Figure] Android ERICA touching a participant

This study investigates the touch characteristics that change the intimacy humans perceive in touch interaction with an android robot that has a human-like feminine appearance. Past studies on human-robot touch interaction focused on understanding which types of human touches are used to express emotions to robots; however, they paid less attention to how a robot's touch characteristics affect the intimacy perceived by humans. In this study, we first concentrated on two touch characteristics (type and place) and their effects on the perceived intimacy of an emotion commonly used in human-robot interaction, namely happiness. The results showed that the touch type is useful for changing the perceived intimacy, although the touched place did not exhibit any significant effects. Based on the results of the first experiment, we then investigated the effects of two further touch characteristics (length and part). We concluded that the touch part is useful for changing the perceived intimacy, although the touch length did not exhibit any significant effects. Finally, the results suggested that a pat (type) by the fingers (part) is a better combination for expressing intimacy with our robot.
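The abstract reports comparing perceived-intimacy ratings across touch conditions (e.g., touch type). As a rough illustration only, and not the authors' actual analysis pipeline, the sketch below compares hypothetical 7-point intimacy ratings for two touch types with Welch's t statistic in pure Python; all rating values, condition names, and the choice of test are assumptions for demonstration.

```python
import statistics
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variances
    standard_error = sqrt(var_a / len(a) + var_b / len(b))
    return (mean_a - mean_b) / standard_error

# Hypothetical 7-point perceived-intimacy ratings per touch-type condition.
pat_ratings = [6, 5, 6, 7, 5, 6]
poke_ratings = [3, 4, 2, 3, 4, 3]

t = welch_t(pat_ratings, poke_ratings)
print(f"mean(pat)={statistics.mean(pat_ratings):.2f}, "
      f"mean(poke)={statistics.mean(poke_ratings):.2f}, t={t:.2f}")
```

A positive t here would simply indicate that the hypothetical "pat" condition was rated as more intimate on average; the paper's two-factor designs (type x place, length x part) would call for a factorial analysis rather than a single pairwise test.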

Cite this article as:
X. Zheng, M. Shiomi, T. Minato, and H. Ishiguro, “How Can Robots Make People Feel Intimacy Through Touch?,” J. Robot. Mechatron., Vol.32, No.1, pp. 51-58, 2020.

