JRM Vol.25 No.3 pp. 449-457
doi: 10.20965/jrm.2013.p0449


Applicability of Equilibrium Theory of Intimacy to Non-Verbal Interaction with Robots: Multi-Channel Approach Using Duration of Gazing and Distance Between a Human Subject and Robot

Hiroko Kamide*, Koji Kawabe**, Satoshi Shigemi**,
and Tatsuo Arai*

*Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka 560-8531, Japan

**Honda R&D Co., Ltd., 8-1 Honcho, Wako, Saitama 351-0188, Japan

Received: December 27, 2012
Accepted: February 18, 2013
Published: June 20, 2013
Keywords: humanoid, non-verbal communication, multi-channel approach, duration of eye contact (gazing), interpersonal distance (distance between a human and a robot)
This study examines the applicability of the equilibrium theory of intimacy to non-verbal interaction between a human and a robot, taking a multi-channel approach based on the duration of eye contact and the distance between them. According to the equilibrium theory of intimacy, in interpersonal communication multiple non-verbal channels are adjusted simultaneously to maintain an equilibrium level of intimacy. In this study, a robot introduces itself to participants while the duration of its eye contact and the distance between the robot and the participant are manipulated, and the time the participant spends looking at the robot is measured. As the equilibrium theory of intimacy predicts, participants look at the robot longer when the distance between them is great and shorter when the distance is short. We conclude by discussing problems with and prospects for applying the equilibrium theory of intimacy to interaction with robots.
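The compensation effect the theory predicts — total intimacy held roughly constant across channels, so gaze duration rises as interpersonal distance grows — can be illustrated with a toy model. All names, weights, and the linear formulation below are illustrative assumptions for the sketch, not values or equations taken from the paper:

```python
# Toy sketch of the equilibrium theory of intimacy (Argyle & Dean):
# assume intimacy combines gaze duration (s) with proximity (the
# inverse of distance, m), and that the gaze channel compensates
# to hold total intimacy at a fixed equilibrium level.
EQUILIBRIUM = 10.0          # arbitrary target intimacy level
GAZE_WEIGHT = 1.0           # arbitrary per-channel weights
PROXIMITY_WEIGHT = 8.0

def compensating_gaze(distance_m: float) -> float:
    """Gaze duration that keeps total intimacy at equilibrium
    for a given interpersonal distance (toy formulation)."""
    proximity = PROXIMITY_WEIGHT / distance_m
    return max(0.0, (EQUILIBRIUM - proximity) / GAZE_WEIGHT)

# The model reproduces the predicted direction of compensation:
near = compensating_gaze(1.0)   # small distance -> shorter gaze
far = compensating_gaze(4.0)    # large distance -> longer gaze
assert far > near
```

Under this toy formulation, shrinking the distance raises proximity-driven intimacy, so the model shortens gaze to restore equilibrium — the same direction of compensation the experiment observes.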
Cite this article as:
H. Kamide, K. Kawabe, S. Shigemi, and T. Arai, “Applicability of Equilibrium Theory of Intimacy to Non-Verbal Interaction with Robots: Multi-Channel Approach Using Duration of Gazing and Distance Between a Human Subject and Robot,” J. Robot. Mechatron., Vol.25 No.3, pp. 449-457, 2013.
