JRM Vol.17 No.2 pp. 110-115
doi: 10.20965/jrm.2005.p0110


Development of a 3D Vision Range Sensor Using Equiphase Light Section Method

Masaaki Kumagai

Tohoku Gakuin University, 1-13-1 Chuo, Tagajo, Miyagi 985-8537, Japan

Received: October 18, 2004
Accepted: January 6, 2005
Published: April 20, 2005

Keywords: 3D range sensor, vision, light section method, phase shift method, stereovision

Many autonomous and industrial robots require three-dimensional (3D) environmental sensors; stereovision or laser range finders are often used to detect the distance to obstacles or targets. We propose a 3D measurement system based on active stereovision that combines the light section and phase shift methods. A special light source, consisting of a rotating cylinder and a lamp, projects planes of light whose intensities vary sinusoidally over time. The phase of each plane differs from those of the other planes, so the plane that sectioned the target can be identified from its phase. Images from the video camera are processed into a phase for each pixel, which indicates the distance to the measured target. The method requires only simple calculations for phase detection and allows multiple systems to operate simultaneously. The basic principle and its extension for practical use are proposed, and experimental results obtained with a trial system are detailed, verifying the proposal’s effectiveness.
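The per-pixel phase detection described above can be illustrated with the standard N-step phase-shift formula used in phase-shift profilometry [8]. This is a minimal sketch, not the paper’s implementation; the function name, array layout, and the assumption of equally spaced phase shifts are illustrative choices:

```python
import numpy as np

def estimate_phase(frames):
    """Estimate per-pixel phase from N phase-shifted intensity images.

    frames: array of shape (N, H, W), where frame n is captured at
    phase shift 2*pi*n/N. For I_n = A + B*cos(phi + 2*pi*n/N), the
    sums below reduce to (N*B/2)*cos(phi) and -(N*B/2)*sin(phi),
    so phi is recovered with a single arctan2 per pixel.
    Returns the phase in [-pi, pi) for each pixel.
    """
    n = len(frames)
    shifts = 2.0 * np.pi * np.arange(n) / n
    s = np.tensordot(np.sin(shifts), frames, axes=1)  # weighted sum over frames
    c = np.tensordot(np.cos(shifts), frames, axes=1)
    return np.arctan2(-s, c)
```

Because the computation is a fixed weighted sum followed by one arctangent per pixel, it matches the abstract’s claim that only simple calculations are needed for phase detection.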

Cite this article as:
Masaaki Kumagai, “Development of a 3D Vision Range Sensor Using Equiphase Light Section Method,” J. Robot. Mechatron., Vol.17, No.2, pp. 110-115, 2005.
References:
  [1] M. Kumagai and T. Emura, “Slope-Walk of a Human Type Biped Robot –Estimation of inclination using stereo vision and its application to slope-walk–,” Journal of the Society of Instrument and Control Engineers, Vol.37, No.11, pp. 1040-1047, 2001 (in Japanese).
  [2] H. Kohno, T. Emura, and M. Kumagai, “Vision-Based Dynamic Walk of Biped Robot on Uneven Ground,” Proceedings of JSME ROBOMEC ’02, 1A1-H04, 2002 (in Japanese).
  [3] A. Koseki, T. Emura, and M. Kumagai, “Dynamic Walk of Quadruped Robot on Uneven Ground by Using Stereo Vision,” Proceedings of JSME ROBOMEC ’02, 2P1-E10, 2002 (in Japanese).
  [4] T. Emura, M. Kumagai, and L. Wang, “A Next-Generation Intelligent Car for Safe Drive,” Journal of Robotics and Mechatronics, Vol.12, No.5, pp. 545-551, 2000.
  [5] H. G. Maas, “Robust Automatic Surface Reconstruction with Structured Light,” International Archives of Photogrammetry and Remote Sensing, Vol.29, part B5, pp. 709-713, 1992.
  [6] P. Albrecht and B. Michaelis, “Improvement of the Spatial Resolution of an Optical 3-D Measurement Procedure,” IEEE Trans. on Instrumentation and Measurement, Vol.47, No.1, pp. 158-162, Feb. 1998.
  [7] Y. Oike, H. Shintaku, M. Ikeda, and K. Asada, “A High-Resolution and Real-Time 3-D Imaging System Based on Light-Section Method,” Journal of The Institute of Image Information and Television Engineers, Vol.57, No.9, pp. 1149-1151, 2003 (in Japanese).
  [8] V. Srinivasan, H. C. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-D diffuse objects,” Applied Optics, Vol.23, No.18, pp. 3105-3108, Sep. 1984.
  [9] R. Mitaka and C. Hamada, “High-Speed High-Accuracy Three-Dimensional Measuring System Using Phase-Shift Method,” Matsushita Electric Works Technical Report, No.78, pp. 10-15, Aug. 2002 (in Japanese, available on WWW).

