
JACIII Vol.14 No.7 pp. 770-775
doi: 10.20965/jaciii.2010.p0770
(2010)

Paper:

An Integrated Perceptual System of Different Perceptual Elements for an Intelligent Robot

Hiroyuki Masuta and Naoyuki Kubota

Department of System Design, Tokyo Metropolitan University, 6-6 Asahigaoka, Hino-shi, Tokyo 191-0065, Japan

Received:
April 1, 2010
Accepted:
July 28, 2010
Published:
November 20, 2010
Keywords:
robot arm, 3D-range camera, service robot, spiking-neural network, human visual perception
Abstract
This paper discusses an integrated perceptual system for intelligent robots. Robots should perceive their environments flexibly enough to realize intelligent behavior. We focus on a perceptual system based on the perceiving-acting cycle discussed in ecological psychology. The perceptual system we propose consists of a retinal model and a spiking neural network that realize the perceiving-acting cycle concept. We apply our proposal to a robot arm with a three-dimensional (3D) range camera. We verified the feasibility of the perceptual system using a single input, such as depth or luminance information. Our proposal integrates different perceptual elements to improve the accuracy of perception. Experimental results show that our proposal perceives the targeted dish accurately by integrating different perceptual elements using the 3D-range camera.
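The abstract describes a spiking neural network that fires when integrated perceptual inputs (e.g., depth and luminance) cross a threshold. As a minimal sketch of that idea, the following assumes a leaky integrate-and-fire neuron with hypothetical weights; the paper's actual network structure and parameters are not given here.

```python
# Hypothetical sketch of a spiking neuron integrating two perceptual
# inputs (depth and luminance), in the spirit of the spiking neural
# network described in the abstract. A leaky integrate-and-fire (LIF)
# formulation is assumed; the paper's exact model may differ.

def simulate_lif(depth_input, luminance_input, threshold=1.0,
                 decay=0.9, w_depth=0.6, w_lum=0.4):
    """Integrate weighted depth/luminance drive; return a spike train."""
    v = 0.0          # membrane potential
    spikes = []
    for d, l in zip(depth_input, luminance_input):
        v = decay * v + w_depth * d + w_lum * l   # leaky integration
        if v >= threshold:                        # threshold crossing
            spikes.append(1)
            v = 0.0                               # reset after spike
        else:
            spikes.append(0)
    return spikes

# A constant joint drive of 0.5 per step makes the potential build up
# and cross threshold periodically, producing a regular spike train.
train = simulate_lif([0.5] * 10, [0.5] * 10)
```

Integrating multiple perceptual elements this way means a weak cue in one channel (e.g., low luminance contrast) can still contribute to firing when the other channel (depth) is informative, which is the intuition behind combining elements to improve perception accuracy.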
Cite this article as:
H. Masuta and N. Kubota, “An Integrated Perceptual System of Different Perceptual Elements for an Intelligent Robot,” J. Adv. Comput. Intell. Intell. Inform., Vol.14 No.7, pp. 770-775, 2010.
References
  [1] N. Mitsunaga, Z. Miyashita, K. Shinozawa, T. Miyashita, H. Ishiguro, and N. Hagita, “What makes people accept a robot in a social environment,” Int. Conf. on Intelligent Robots and Systems, pp. 3336-3343, 2008.
  [2] J. J. Gibson, “The Ecological Approach to Visual Perception,” Lawrence Erlbaum Associates, Hillsdale, NJ, 1979.
  [3] D. N. Lee, “Guiding movement by coupling taus,” Ecological Psychology, Vol.10, No.3-4, pp. 221-250, 1998.
  [4] H. Masuta and N. Kubota, “The Intelligent Control based on Perceiving-Acting Cycle by using 3D-range camera,” 2009 IEEE Int. Conf. on Systems, Man, and Cybernetics, 2009.
  [5] E. Sato, T. Yamaguchi, and F. Harashima, “Natural Interface Using Pointing Behavior for Human-Robot Gestural Interaction,” IEEE Trans. on Industrial Electronics, Vol.54, No.2, pp. 1105-1112, 2007.
  [6] N. Y. Chong, H. Hongu, K. Ohba, S. Hirai, and K. Tanie, “Knowledge Distributed Robot Control Framework,” Proc. Int. Conf. on Control, Automation, and Systems, pp. 22-25, 2003.
  [7] J. G. Garcia, J. G. Ortega, A. S. Garcia, and S. S. Martinez, “Robotic Software Architecture for Multisensor Fusion System,” IEEE Trans. on Industrial Electronics, Vol.56, No.3, pp. 766-777, 2009.
  [8] T. Oggier, M. Lehmann, R. Kaufmann, M. Schweizer, M. Richter, P. Metzler, G. Lang, F. Lustenberger, and N. Blanc, “An all-solid-state optical range camera for 3D real-time imaging with sub-centimeter depth resolution (SwissRanger),” Proc. of SPIE, Vol.5249, pp. 534-545, 2003.
  [9] C. A. Curcio, K. R. Sloan, and R. E. Kalina, “Human photoreceptor topography,” J. of Comparative Neurology, Vol.292, pp. 497-523, 1990.
  [10] W. Maass and C. M. Bishop, “Pulsed Neural Networks,” The MIT Press, 1999.
  [11] H. Masuta and N. Kubota, “Perception in A Partner Robot Based on Human Visual System,” Int. J. of Factory Automation, Robotics and Soft Computing, Issue 2, pp. 48-55, 2008.
