JACIII Vol.17 No.3 pp. 433-442
doi: 10.20965/jaciii.2013.p0433
(2013)

Paper:

Psychological Effects of a Synchronously Reliant Agent on Human Beings

Felix Jimenez*, Teruaki Ando*, Masayoshi Kanoh*,
and Tsuyoshi Nakamura**

*Chukyo University, 101-2 Yagoto Honmachi, Showa-ku, Nagoya 466-8666, Japan

**Nagoya Institute of Technology, Gokiso-cho, Showa-ku, Nagoya, Aichi 466-8555, Japan

Received:
October 11, 2012
Accepted:
March 21, 2013
Published:
May 20, 2013
Keywords:
urge system, self-sufficiency, Kansei agents, human-agent interaction
Abstract

The ability of human symbiosis robots to communicate is indispensable for their coexistence with humans, so studies on the interaction between humans and robots are important. In this paper, we propose a robot self-sufficiency model that empathizes with human emotions, in which we apply the urge system to an autonomous emotion system. We carry out simulation experiments on this model and verify the psychological interaction between the software robot and its users.

Cite this article as:
Felix Jimenez, Teruaki Ando, Masayoshi Kanoh, and Tsuyoshi Nakamura, “Psychological Effects of a Synchronously Reliant Agent on Human Beings,” J. Adv. Comput. Intell. Intell. Inform., Vol.17, No.3, pp. 433-442, 2013.