
JRM Vol.24 No.4 pp. 677-685 (2012)
doi: 10.20965/jrm.2012.p0677

Paper:

3D Measurement Using a Fish-Eye Camera Based on EPI Analysis

Kenji Terabayashi*, Toru Morita**, Hiroya Okamoto***,
and Kazunori Umeda***

*Department of Mechanical Engineering, Faculty of Engineering, Shizuoka University, 3-5-1 Johoku, Hamamatsu, Shizuoka 432-8561, Japan

**Sony Corporation, 2-15-3 Konan, Minato-ku, Tokyo 108-6201, Japan

***Department of Precision Mechanics, Faculty of Science and Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan

Received: October 14, 2011
Accepted: February 16, 2012
Published: August 20, 2012
Keywords: fish-eye lens camera, three-dimensional (3D) measurement, Epipolar-Plane Image (EPI)
Abstract
In car driving support systems and mobile robots, it is important to perceive the three-dimensional (3D) environment over a wide field of view at once. In this paper, we use a fish-eye camera as a sensor for measuring 3D environments. This camera captures a wide-angle but strongly distorted image and can be easily mounted on cars. We propose a method for reconstructing 3D environments from fish-eye images based on Epipolar-Plane Image (EPI) analysis. This method enables easy and stable matching of feature points. The effectiveness of the proposed method is verified by experiments.
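For readers unfamiliar with EPI analysis, the core idea is that when a camera translates at constant speed, each scene point traces a straight line in the epipolar-plane image (a slice through the stack of captured frames), and the slope of that line encodes the point's depth. The Python sketch below illustrates this relation for an idealized perspective (rectified) EPI only; the focal length F_PX and camera speed CAM_SPEED are hypothetical values, and the paper's actual method additionally compensates for fish-eye distortion before this step.

```python
import numpy as np

# Hypothetical parameters for illustration only (not from the paper).
F_PX = 320.0      # focal length in pixels (assumed)
CAM_SPEED = 0.5   # camera translation per frame in metres (assumed)

def epi_line_slope(xs, ts):
    """Fit a straight line x = a*t + b to a feature's track across
    frames in the EPI and return the slope a (pixels per frame)."""
    a, _b = np.polyfit(ts, xs, 1)
    return a

def depth_from_slope(slope, f_px=F_PX, v=CAM_SPEED):
    """In an ideal perspective EPI with lateral camera motion, a point
    at depth Z moves at dx/dt = f * v / Z pixels per frame, so
    Z = f * v / slope."""
    return f_px * v / slope

if __name__ == "__main__":
    ts = np.arange(5.0)
    xs = 100.0 + 4.0 * ts  # synthetic track with slope 4 px/frame
    z = depth_from_slope(epi_line_slope(xs, ts))
    print(f"estimated depth: {z:.1f} m")  # 320 * 0.5 / 4 = 40.0 m
```

Because every feature's track is constrained to a straight line in the EPI, matching reduces to line fitting rather than frame-to-frame correspondence search, which is what makes the matching "easy and stable" in the sense of the abstract.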
Cite this article as:
K. Terabayashi, T. Morita, H. Okamoto, and K. Umeda, “3D Measurement Using a Fish-Eye Camera Based on EPI Analysis,” J. Robot. Mechatron., Vol.24 No.4, pp. 677-685, 2012.
References
[1] R. Bunschoten and B. Krose, “Robust scene reconstruction from an omnidirectional vision system,” IEEE Trans. on Robotics and Automation, Vol.19, No.2, pp. 351-357, 2003.
[2] R. Kawanishi, A. Yamashita, and T. Kaneko, “Three-Dimensional Environment Model Construction from an Omnidirectional Image Sequence,” J. of Robotics and Mechatronics, Vol.21, No.5, pp. 574-582, 2009.
[3] R. Matsuhisa, S. Ono, H. Kawasaki, A. Banno, and K. Ikeuchi, “Structure from Motion for omnidirectional images using efficient factorization method based on virtual camera rotation,” Proc. of Int. Workshop on Computer Vision and Its Application to Image Media Processing, pp. 1-7, 2009.
[4] L. Shigang, “Monitoring around a vehicle by a spherical image sensor,” IEEE Trans. on Intelligent Transportation Systems, Vol.7, No.4, pp. 541-550, 2006.
[5] R. Okutsu, K. Terabayashi, Y. Aragaki, N. Shimomura, and K. Umeda, “Generation of Overhead View Images by Estimating Intrinsic and Extrinsic Camera Parameters of Multiple Fish-Eye Cameras,” Proc. of IAPR Conf. on Machine Vision Applications (MVA), pp. 447-450, 2009.
[6] Y. Chen, Y. Tu, C. Chiu, and Y. Chen, “An Embedded System for Vehicle Surrounding Monitoring,” Proc. of 2nd Int. Conf. on Power Electronics and Intelligent Transportation System (PEITS), Vol.2, pp. 92-95, 2009.
[7] S. Abraham and W. Forstner, “Fish-eye-stereo calibration and epipolar rectification,” ISPRS J. of Photogrammetry and Remote Sensing, Vol.59, pp. 278-288, 2005.
[8] T. Nishimoto and J. Yamaguchi, “Three dimensional measurement using fisheye stereo vision,” Proc. of SICE Annual Conf. 2007, pp. 2008-2012, 2007.
[9] L. Shigang, “Binocular Spherical Stereo,” IEEE Trans. on Intelligent Transportation Systems, Vol.9, pp. 589-600, 2008.
[10] P. J. Herrera, G. Pajares, M. Guijarro, J. J. Ruz, and J. M. Cruz, “A Stereovision Matching Strategy for Images Captured with Fish-Eye Lenses in Forest Environments,” Sensors, Vol.11, pp. 1756-1783, 2011.
[11] K. Terabayashi, H. Mitsumoto, T. Morita, Y. Aragaki, N. Shimomura, and K. Umeda, “Measurement of Three Dimensional Environment with a Fish-eye Camera Based on Structure From Motion – Error Analysis,” J. of Robotics and Mechatronics, Vol.21, No.6, pp. 680-688, 2009.
[12] D. G. Lowe, “Object Recognition from Local Scale Invariant Features,” Proc. of IEEE Int. Conf. on Computer Vision (ICCV), pp. 1150-1157, 1999.
[13] R. C. Bolles, H. H. Baker, and D. H. Marimont, “Epipolar-Plane Image Analysis: An Approach to Determining Structure from Motion,” Int. J. of Computer Vision, Vol.1, pp. 7-55, 1987.
[14] H. H. Baker and R. C. Bolles, “Generalizing epipolar-plane image analysis on the spatiotemporal surface,” Int. J. of Computer Vision, Vol.3, No.1, pp. 33-49, 1989.
[15] H. Kawasaki, K. Ikeuchi, and M. Sakauchi, “EPI analysis of omnicamera image,” Proc. of 15th IAPR Int. Conf. on Pattern Recognition (ICPR), Vol.1, pp. 379-383, 2000.
[16] H. Kawasaki, K. Ikeuchi, and M. Sakauchi, “Spatio-Temporal Analysis of Omni Image,” Proc. of IEEE Computer Society Conf. on Computer Vision and Pattern Recognition (CVPR), Vol.2, pp. 577-584, 2000.
[17] J. Canny, “A Computational Approach to Edge Detection,” IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol.8, No.6, pp. 679-698, 1986.
Supporting Online Materials:
[a] Around view monitor. Nissan Motor [Online]. Available: http://www.nissan-global.com/EN/TECHNOLOGY/OVERVIEW/avm.html
