
JRM Vol.21 No.1 pp. 121-127 (2009)
doi: 10.20965/jrm.2009.p0121

Paper:

Eccentricity Compensator for Wide-Angle Fovea Vision Sensor

Sota Shimizu* and Joel W. Burdick**

*California Institute of Technology, Division of Biology, MC 139-74, Pasadena, CA 91125, USA
Waseda University, Advanced Research Institute for Science and Engineering, 17 Kikui-cho, Shinjuku-ku, Tokyo 162-0044, Japan

**California Institute of Technology, Department of Bioengineering

Received: October 6, 2008
Accepted: October 6, 2008
Published: February 20, 2009
Keywords: fovea sensor, space-variant image, wavelet transform, bio-mimetics, image processing
Abstract
This paper aims to acquire a robust feature for rotation-, scale-, and translation-invariant matching of space-variant images from a fovea sensor. The proposed eccentricity compensator model corrects the deformation in log-polar images that arises when the fovea sensor is not centered on the target, that is, when eccentricity exists. An image simulator in a discrete space implements this model through its geometrical formulation. This paper also proposes Unreliable Feature Omission (UFO) using the Discrete Wavelet Transform. UFO reduces the local high-frequency noise that appears in the space-variant image when the eccentricity changes: it discards coefficients regarded as unreliable, based on digitized errors in the input image from the fovea sensor. The first simulation evaluates the compensator by comparing it with other polar images. The result shows that the compensator performs well and that its root mean square error (RMSE) changes by no more than 2.54% as long as the eccentricity is within 34.08°. The second simulation shows that UFO performs well for the log-polar image remapped by the eccentricity compensator when white Gaussian noise (WGN) is added. The result with the Daubechies (7, 9) biorthogonal wavelet shows that UFO reduces the RMSE by up to 0.40%, even when no WGN is added, provided the eccentricity is within 34.08°.
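For intuition only, the following is a minimal Python sketch (not the authors' implementation) of the two ideas summarized in the abstract: a log-polar remap whose sampling grid is centered on an off-axis target point, which is the essence of eccentricity compensation, and a coefficient-omission step after a biorthogonal DWT in the spirit of UFO. The function names, grid sizes (n_rings, n_wedges), the use of PyWavelets' 'bior4.4' (9/7) filter pair, and the magnitude-based discard rule are illustrative assumptions; the paper's UFO criterion is based on digitization error from the fovea sensor, and its compensator operates on the sensor's space-variant image rather than a uniform Cartesian image.

```python
# Illustrative sketch only: log-polar resampling about an off-axis target
# point, followed by a simplified coefficient-omission step after a DWT.
import numpy as np
import pywt  # PyWavelets, for the biorthogonal DWT


def log_polar_remap(img, center, n_rings=64, n_wedges=128, r_min=1.0):
    """Sample `img` on a log-polar grid centered at `center` (x, y).

    Centering the grid on the target rather than on the sensor axis is the
    idea behind eccentricity compensation: the remapped image remains
    approximately rotation/scale invariant even when the target is off-axis.
    """
    h, w = img.shape
    cx, cy = center
    r_max = min(cx, cy, w - 1 - cx, h - 1 - cy)       # largest inscribed radius
    rho = np.geomspace(r_min, r_max, n_rings)         # logarithmically spaced radii
    theta = np.linspace(0.0, 2.0 * np.pi, n_wedges, endpoint=False)
    rr, tt = np.meshgrid(rho, theta, indexing="ij")
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    return img[ys, xs]                                # (n_rings, n_wedges) log-polar image


def omit_unreliable(lp_img, wavelet="bior4.4", rel_threshold=0.05):
    """Crude stand-in for UFO: zero detail coefficients whose magnitude falls
    below a fraction of the largest detail coefficient, then reconstruct.
    (The paper's criterion uses digitization error from the fovea sensor,
    not a global magnitude threshold.)
    """
    cA, (cH, cV, cD) = pywt.dwt2(lp_img.astype(float), wavelet)
    cut = rel_threshold * max(np.abs(cH).max(), np.abs(cV).max(), np.abs(cD).max())
    cH, cV, cD = (np.where(np.abs(c) < cut, 0.0, c) for c in (cH, cV, cD))
    return pywt.idwt2((cA, (cH, cV, cD)), wavelet)


if __name__ == "__main__":
    img = np.random.rand(256, 256)                    # placeholder scene
    lp = log_polar_remap(img, center=(150.0, 120.0))  # target off the optical axis
    cleaned = omit_unreliable(lp)
    print(lp.shape, cleaned.shape)
```

The sketch samples a uniform Cartesian image for simplicity; with a real wide-angle fovea sensor the resampling geometry would have to account for the lens's space-variant resolution, which is what the paper's geometrical formulation provides.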

This paper is the full translation of the original article published in the Transactions of the JSME, Vol.73, No.733.
Cite this article as:
S. Shimizu and J. Burdick, “Eccentricity Compensator for Wide-Angle Fovea Vision Sensor,” J. Robot. Mechatron., Vol.21 No.1, pp. 121-127, 2009.
