Learning of Joint Attention from Detecting Causality Based on Transfer Entropy
Hidenobu Sumioka*, **, Yuichiro Yoshikawa*, and Minoru Asada*, **
*Asada Synergistic Intelligence Project, ERATO, JST
**Graduate School of Eng., Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871, Japan
Joint attention, i.e., the behavior of looking at the same object that another person is looking at, plays an important role in human-human and human-robot communication. Previous synthetic studies focusing on modeling the early developmental process of joint attention have proposed learning methods that require no explicit instruction for joint attention. In these studies, however, the causal structure between a perception variable (a caregiver’s face direction or an individual object) and an action variable (gaze shift to a caregiver’s face or to an object location) was given in advance to enable learning of joint attention, whereas such a structure should be discoverable by the robot itself through interaction experiences. In this paper, we investigate how transfer entropy, an information-theoretic measure, can be used to quantify the causality inherent in face-to-face interaction. In computer simulations of human-robot interaction, we examine which pairs of perceptions and actions are selected as causal pairs and show that the selected pairs can be used for learning a sensorimotor map for joint attention.
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.
Copyright © 2008 by Fuji Technology Press Ltd. and Japan Society of Mechanical Engineers. All rights reserved.