JRM Vol.25 No.6 pp. 1060-1069
doi: 10.20965/jrm.2013.p1060


Implementation Approach of Affective Interaction for Caregiver Support Robot

Yutaka Miyaji* and Ken Tomiyama**

*Aoyama Gakuin University, 5-10-1 Fuchinobe, Chuou-ku, Sagamihara-shi, Kanagawa 252-5258, Japan

**Chiba Institute of Technology, 2-17-1 Tsudanuma, Narashino-shi, Chiba 275-0016, Japan

Received: May 24, 2013
Accepted: November 3, 2013
Published: December 20, 2013

Keywords: affective robotics, virtual kansei, caregiver support robot

This paper describes a series of our studies on developing functions that allow robots to interact better with humans, especially in the welfare field. The caregiver support robot is proposed to help caregivers in the welfare field, and functions for realizing affective behavior were studied. We believe that, in order to behave affectively, such a robot must understand the human emotion state, have its own virtual emotion state, and be able to express emotion. The Virtual Kansei (VK) was proposed to answer this set of requirements, and various elements of VK were developed. The VK consists of three parts: the Kansei detector, the Kansei generator, and the Kansei expressive regulator. The Kansei detector detects the human partner’s emotion state from facial images, voice sounds, and body movements. The Kansei generator generates a human-like virtual emotion for the robot. We devised a mimicking approach for developing the generator, in which emotion distances are defined and used in training and evaluating it. The Kansei expressive regulator makes the robot behave emotionally while executing everyday tasks by modulating the basic robot motion according to the generated virtual emotion. This paper focuses on the concept and the relationships among these elements.
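The three-part structure described above can be sketched as a simple pipeline: detect the partner's emotion, generate a matching virtual emotion, and modulate a motion parameter accordingly. The sketch below is purely illustrative; all class names, the 2-D valence/arousal representation, the averaging fusion, and the modulation rule are assumptions for this example, not the paper's actual SVM-, HMM-, or Petri-net-based implementations.

```python
from dataclasses import dataclass


@dataclass
class EmotionState:
    # A 2-D valence/arousal emotion representation, assumed here for
    # illustration; the paper's own emotion model may differ.
    valence: float
    arousal: float


def emotion_distance(a: EmotionState, b: EmotionState) -> float:
    # Euclidean distance between emotion states: a stand-in for the
    # "emotion distance" the authors define for training and evaluation.
    return ((a.valence - b.valence) ** 2 + (a.arousal - b.arousal) ** 2) ** 0.5


class KanseiDetector:
    def detect(self, face: EmotionState, voice: EmotionState,
               motion: EmotionState) -> EmotionState:
        # Placeholder fusion of the three modalities (facial image, voice,
        # body movement) by simple averaging of per-modality estimates.
        v = (face.valence + voice.valence + motion.valence) / 3
        a = (face.arousal + voice.arousal + motion.arousal) / 3
        return EmotionState(v, a)


class KanseiGenerator:
    def generate(self, human: EmotionState) -> EmotionState:
        # Mimicking approach: the robot's virtual emotion tracks the
        # detected human emotion (trivially copied in this sketch).
        return EmotionState(human.valence, human.arousal)


class KanseiExpressiveRegulator:
    def modulate(self, base_speed: float, emotion: EmotionState) -> float:
        # Toy modulation rule: scale a basic motion parameter by arousal,
        # standing in for emotional modulation of everyday-task motions.
        return base_speed * (1.0 + 0.5 * emotion.arousal)
```

For example, feeding per-modality estimates through the detector, copying the result through the generator (zero emotion distance between detected and generated states), and modulating a nominal speed illustrates the full detect-generate-regulate flow in a few lines.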

Cite this article as:
Yutaka Miyaji and Ken Tomiyama, “Implementation Approach of Affective Interaction for Caregiver Support Robot,” J. Robot. Mechatron., Vol.25, No.6, pp. 1060-1069, 2013.
References:
  [1] K. Tomiyama and Y. Miyaji, “Towards Realization of Care Worker Support Robot,” in S. Ohara and I. Kaminaga (Eds.), Current Status of Welfare in Japan, Chapter 13, pp. 301-329, Ibunsya, July 2001 (in Japanese). ISBN: 4-7531-0217-3
  [2] M. Fujita, “Digital creatures for future entertainment robotics,” Proc. of the IEEE Int. Conf. on Robotics and Automation, Vol.1, pp. 801-806, 2000.
  [3] K. Wada, T. Shibata, T. Saito, and K. Tanie, “Effects of robot assisted activity to elderly people who stay at a health service facility for the aged,” Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems 2003 (IROS 2003), Vol.3, pp. 2847-2852, 2003.
  [4] Y. Miyaji and K. Tomiyama, “Construction of Virtual KANSEI by Petri-net with GA and Method of Constructing Personality,” Proc. ROMAN2003, 12th IEEE Workshop on Robot and Human Interactive Communication, p. 6B4 (CD-ROM), November 2003.
  [5] Y. Miyaji and K. Tomiyama, “Virtual KANSEI for Robots in Welfare,” IEEE/ICME Int. Conf. on Complex Medical Engineering 2007 (CME 2007), pp. 1323-1326, May 2007.
  [6] S. Sugano, H. Morita, and K. Tomiyama, “Study on Kawaii-ness in Motion – Classifying Kawaii Motion using Roomba –,” Int. Conf. on Applied Human Factors and Ergonomics 2012, 2012 (in Japanese).
  [7] Y. Miyaji and K. Tomiyama, “Emotion Detecting Method based on Various Attributes of Human Voice,” Korean J. of the Science of Emotion and Sensibility, Vol.81, No.1, pp. 1-7, March 2005.
  [8] Y. Katsuno, Y. Miyaji, and K. Tomiyama, “Head Posture Invariant Detection of Facial Expression,” 2005 JSME Annual Conf. on Robotics and Mechatronics (ROBOMEC ’05), p. 2A1-N-028 (CD-ROM), 2005 (in Japanese).
  [9] R. Ueno and K. Tomiyama, “Emotion Detection Based on Fusion of Data From Facial Image and Voice Sound – Comfort/Discomfort State Detection using SVM –,” The 12th Annual Conf. of Japan Society of Kansei Engineering, Tokyo, September 2010.
  [10] C. Breazeal, “Regulation and entrainment for human-robot interaction,” Int. J. of Experimental Robotics, Vol.21, No.10-11, pp. 883-902, 2003.
  [11] H. Miwa, T. Okuchi, K. Itoh, H. Takanobu, and A. Takanishi, “A new mental model for humanoid robots for human friendly communication – introduction of learning system, mood vector and second order equations of emotion –,” Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems 2003 (IROS 2003), Vol.3, pp. 3588-3593, 2003.
  [12] J. Kogami, K. Tomiyama, and Y. Miyaji, “Kansei Generator using HMM for Virtual Kansei in Caretaker Support Robot,” Kansei Engineering Int. (J. of Japan Society of Kansei Engineering), Vol.8, No.1, pp. 83-90, January 2009.
  [13] P. Ekman, “Emotions revealed: Recognizing faces and feelings to improve communication and emotional life,” Henry Holt and Company, 2007.
  [14] M. Zenkyoh and K. Tomiyama, “Surprise Generator for Virtual KANSEI Based on Human Surprise Characteristics,” Proc. of HCI2011, 2011.
  [15] H. Schlosberg, “Three dimensions of emotion,” Psychological Review, Vol.61, pp. 81-88, 1954.
  [16] N. Inoue, Y. Miyaji, and K. Tomiyama, “KANSEI Generator of Virtual Kansei Reflecting Environment Information,” Proc. of Spring Conf. of JSKE 2007, p. B21, 2007 (in Japanese).
  [17] T. Yanadori, N. Kiuchi, H. Takeuchi, Y. Miyaji, and K. Tomiyama, “Development of a Test Bed Care-Worker Support Robot,” 2004 JSME Annual Conf. on Robotics and Mechatronics (ROBOMEC ’04), p. 1A1-H-63 (CD-ROM), June 2004 (in Japanese).
  [18] T. Naito, Y. Miyaji, and K. Tomiyama, “Action Generation for Care-Worker Support Robot Based on Virtual KANSEI,” Proc. of The 6th Annual Conf. of JSKE 2004, p. 266, September 2004 (in Japanese).
  [19] J. Kogami, Y. Miyaji, and K. Tomiyama, “Construction and Evaluation of a Virtual KANSEI System for Robots,” Trans. of Japan Society of Kansei Engineering, Vol.9, No.4, pp. 601-609, September 2009 (in Japanese).
  [20] Y. Shinohara and N. Ohtsu, “Facial Expression Recognition Using Fisher Weight Maps,” J. of the Institute of Electronics, Information and Communication Engineers, Vol.103, No.737, pp. 79-84 (in Japanese).
