JACIII Vol.11 No.10 pp. 1274-1280
doi: 10.20965/jaciii.2007.p1274
(2007)

Paper:

Joint Attention Between a Human Being and a Partner Robot Based on Computational Intelligence

Naoyuki Kubota*,**, Toshiyuki Shimizu*, and Minoru Abe***

*Dept. of System Design, Tokyo Metropolitan University, 6-6 Asahigaoka, Hino, Tokyo 191-0065, Japan

**SORST, Japan Science and Technology Agency (JST)

***EQUOS RESEARCH Co., Ltd. EV Laboratory

Received: December 14, 2005
Accepted: April 16, 2006
Published: December 20, 2007
Keywords: intelligent robot, visual perception, human-friendly communication, genetic algorithms, neural network
Abstract

We discuss joint attention between a human being and a partner robot as a basis for human-friendly communication. We extract the direction of human attention from the facial direction in camera images, and propose controlling the partner robot according to the extracted facial direction. We also propose a method for extracting the direction in which a hand points, and discuss experimental results obtained by applying these proposals to partner robots.
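As a rough illustration of the idea in the abstract (not the authors' implementation), the sketch below assumes a face detector has already provided the face position and yaw angle in the robot's frame, and computes the bearing the robot should turn toward to share the human's attention target. All function names, the coordinate convention, and the fixed gaze distance are hypothetical.

```python
import math

def attention_heading(face_x, face_z, face_yaw_deg, target_dist):
    """Estimate the bearing (degrees) of the human's attention target.

    face_x, face_z  : face position in meters, robot-centered frame
                      (z = robot's forward axis), hypothetical convention
    face_yaw_deg    : direction the face points, measured in the robot's
                      frame (0 = parallel to the robot's forward axis)
    target_dist     : assumed distance from the face to the attended point
    """
    yaw = math.radians(face_yaw_deg)
    # Project the gaze ray from the face position out to target_dist.
    tx = face_x + target_dist * math.sin(yaw)
    tz = face_z + target_dist * math.cos(yaw)
    # Bearing of the attended point as seen from the robot.
    return math.degrees(math.atan2(tx, tz))
```

For example, a face two meters ahead of the robot looking along the robot's own forward axis yields a bearing of 0 degrees, so the robot and human attend to the same point straight ahead.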

Cite this article as:
Naoyuki Kubota, Toshiyuki Shimizu, and Minoru Abe, “Joint Attention Between a Human Being and a Partner Robot Based on Computational Intelligence,” J. Adv. Comput. Intell. Intell. Inform., Vol.11, No.10, pp. 1274-1280, 2007.
