Mapping Facial Expression to Internal States Based on Intuitive Parenting
Ayako Watanabe*, Masaki Ogino**, and Minoru Asada**,***
*Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871, Japan
**Asada Synergistic Intelligence Project, ERATO, Japan Science and Technology Agency, FRC1, Graduate School of Engineering, Osaka University
***Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871, Japan
Sympathy is a key issue in interaction and communication between robots and their users. In developmental psychology, intuitive parenting is considered the maternal scaffolding upon which children develop sympathy: caregivers mimic or exaggerate the child’s emotional facial expressions. We model human intuitive parenting using a robot that associates a caregiver’s mimicked or exaggerated facial expressions with the robot’s internal state in order to learn a sympathetic response. The internal state space and facial expressions are defined based on psychological studies and change dynamically in response to external stimuli. After learning, the robot responds to the caregiver’s internal state by observing human facial expressions, and then expresses its own internal state facially if synchronization evokes a response to the caregiver’s internal state.
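The association described above can be illustrated with a minimal sketch: a Kohonen self-organizing map (cited in the references below) that learns to link facial-expression feature vectors with a two-dimensional internal state (valence, arousal), in the spirit of Russell’s circumplex model. The feature encoding, training data, and class names here are hypothetical, not the authors’ implementation.

```python
# Hypothetical sketch: associate facial-expression features with an internal
# state (valence, arousal) via a small Kohonen self-organizing map.
import numpy as np

class ExpressionSOM:
    def __init__(self, grid=5, dim=4, seed=0):
        rng = np.random.default_rng(seed)
        self.grid = grid
        self.w = rng.random((grid, grid, dim))   # expression-feature weights
        self.state = np.zeros((grid, grid, 2))   # associated (valence, arousal)

    def bmu(self, x):
        """Best-matching unit for an expression feature vector."""
        d = np.linalg.norm(self.w - x, axis=2)
        return np.unravel_index(np.argmin(d), d.shape)

    def train(self, x, internal_state, lr=0.5, sigma=1.5):
        bi, bj = self.bmu(x)
        ii, jj = np.meshgrid(np.arange(self.grid), np.arange(self.grid),
                             indexing="ij")
        # Gaussian neighborhood around the winning unit
        h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
        self.w += lr * h[..., None] * (x - self.w)
        # Pull the associated internal state toward the one observed with x
        self.state += lr * h[..., None] * (internal_state - self.state)

    def recall(self, x):
        """Internal state associated with an observed expression."""
        return self.state[self.bmu(x)]

# Hypothetical 4-dim expression features (e.g. mouth corners, brows, eyes)
smile = np.array([1.0, 0.9, 0.1, 0.8])
frown = np.array([0.1, 0.0, 0.9, 0.2])

som = ExpressionSOM()
for _ in range(200):
    som.train(smile, np.array([0.8, 0.5]))    # pleasant, mildly aroused
    som.train(frown, np.array([-0.7, 0.3]))   # unpleasant

print(som.recall(smile))  # valence component should be positive
print(som.recall(frown))  # valence component should be negative
```

After training, recalling the state for an observed smile returns a positive-valence internal state, which the robot could then express in turn; this mirrors, in toy form, the learned mapping from the caregiver’s expressions back to the robot’s internal state.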
-  G. Gergely and J. S. Watson, “Early socio-emotional development: Contingency perception and the social-biofeedback model,” in P. Rochat (Ed.), “Early Social Cognition: Understanding Others in the First Months of Life,” Mahwah, NJ: Lawrence Erlbaum, pp. 101-136, 1999.
-  A. Mehrabian, “Implicit communication of emotions and attitudes,” Wadsworth, 1981.
-  M. H. Johnson and J. Morton, “Biology and cognitive development,” Blackwell, 1991.
-  D. Rosenstein and H. Oster, “Differential facial responses to four basic tastes in newborns,” Child Development, Vol.59, pp. 1555-1568, Dec., 1988.
-  D. Matsui, T. Minato, K. F. MacDorman, and H. Ishiguro, “Generating natural motion in an android by mapping human motion,” Proc. of IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1089-1096, 2005.
-  T. Hashimoto, M. Sennda, and H. Kobayashi, “Realization of realistic and rich facial expressions by face robot,” 2004 1st IEEE Technical Exhibition Based Conference on Robotics and Automation, pp. 37-38, Nov., 2004.
-  C. Breazeal, D. Buchsbaum, J. Gray, D. Gatenby, and B. Blumberg, “Learning from and about others: Towards using imitation to bootstrap the social understanding of others by robots,” Artificial Life, Vol.11, pp. 31-62, 2005.
-  T. Kobayashi, Y. Ogawa, K. Kato, and K. Yamamoto, “Learning system of human facial expression for a family robot,” Proc. of the Sixth International Conference on Automatic Face and Gesture Recognition, pp. 481-486, 2004.
-  H. Papousek and M. Papousek, “Intuitive parenting: a dialectic counterpart to the infant’s precocity in integrative capacities,” Handbook of Infant Development, pp. 669-720, 1987.
-  P. Rochat, “The infant’s world,” Harvard University Press, 2001.
-  M. Asada, K. F. MacDorman, H. Ishiguro, and Y. Kuniyoshi, “Cognitive developmental robotics as a new paradigm for the design of humanoid robots,” Proc. of the 1st IEEE/RSJ International Conference on Humanoid Robots, CD-ROM, 2000.
-  J. A. Russell, “A circumplex model of affect,” Journal of Personality and Social Psychology, Vol.39, pp. 1161-1178, 1980.
-  H. Yamada, “Visual information for categorizing facial expression of emotion,” Japanese Psychological Review, Vol.35, pp. 172-181, 1993.
-  K. Lorenz, “Studies in animal and human behavior,” London: Methuen, 1970-1971.
-  T. Kohonen, “Self-organizing maps,” Springer-Verlag, Berlin Heidelberg, 1995.
-  T. Singer, B. Seymour, J. O’Doherty, H. Kaube, R. J. Dolan, and C. D. Frith, “Empathy for pain involves the affective but not sensory components of pain,” Science, Vol.303, pp. 1157-1162, Feb., 2004.
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.
Copyright © 2007 by Fuji Technology Press Ltd. and Japan Society of Mechanical Engineers. All rights reserved.