
JRM Vol.28 No.4, pp. 523-532 (2016)
doi: 10.20965/jrm.2016.p0523

Paper:

Structured Light Field Generated by Two Projectors for High-Speed Three Dimensional Measurement

Akihiro Obara*, Xu Yang*, and Hiromasa Oku**

*School of Science and Technology, Gunma University
1-5-1 Tenjin-cho, Kiryu, Gunma 376-8515, Japan

**Graduate School of Science and Technology, Gunma University
1-5-1 Tenjin-cho, Kiryu, Gunma 376-8515, Japan

Received: January 20, 2016
Accepted: April 21, 2016
Published: August 20, 2016
Keywords: high-speed three dimensional measurement, structured light field, depth map
Abstract
Triangulation is commonly used to reconstruct 3D scenes, but its frame rate of less than 30 fps, limited by time-consuming stereo matching, is an obstacle for applications that require results to be fed back in real time. The structured light field (SLF) our group proposed previously reduces the amount of calculation needed for 3D reconstruction, enabling high-speed measurement. Specifically, the SLF projects distance information directly onto a target so that depth can be estimated from the observed pattern. The SLF synthesized in the earlier work, however, made it difficult to extract image features for depth estimation. In this paper, we propose synthesizing the SLF with two projectors in a specific layout. We analyze the basic properties of the proposed SLF on the basis of an optical model, evaluate its performance with a prototype we developed, and apply it to depth estimation of a randomly moving target at 1000 Hz. We also demonstrate high-speed tracking of the target based on this high-speed depth feedback.
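
The abstract does not specify which image feature the prototype extracts from the projected pattern, so the following minimal Python sketch only illustrates the general idea of depth estimation without stereo matching: a scalar feature of the observed SLF pattern (here a hypothetical local-contrast measure) is assumed to vary monotonically with distance, a feature-to-depth lookup table is calibrated once from images taken at known depths, and each new frame then costs only one feature evaluation and one interpolation, which is what makes kHz-rate operation plausible.

    import numpy as np

    def pattern_contrast(img):
        # Hypothetical feature: intensity standard deviation over the region
        # where the SLF pattern falls on the target. The feature actually
        # used by the prototype is not given in the abstract.
        return float(np.std(img))

    def build_depth_lut(calib_images, calib_depths, feature=pattern_contrast):
        # Calibration step: measure the feature at several known depths and
        # store a lookup table, sorted by feature value for interpolation.
        feats = np.array([feature(img) for img in calib_images], dtype=float)
        depths = np.asarray(calib_depths, dtype=float)
        order = np.argsort(feats)
        return feats[order], depths[order]

    def estimate_depth(img, lut_feats, lut_depths, feature=pattern_contrast):
        # Per-frame step: one feature evaluation plus one 1-D interpolation,
        # with no stereo matching, so the per-frame cost stays very small.
        return float(np.interp(feature(img), lut_feats, lut_depths))

A complete system would also need per-frame segmentation of the target and the two-projector SLF synthesis itself; the sketch above covers only the depth-lookup step that replaces stereo matching.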
Concept of SLF generated by two projectors

Cite this article as:
A. Obara, X. Yang, and H. Oku, “Structured Light Field Generated by Two Projectors for High-Speed Three Dimensional Measurement,” J. Robot. Mechatron., Vol.28 No.4, pp. 523-532, 2016.
References
  1. [1] K. Konolige, “Projected Texture Stereo,” Proc. IEEE Int. Conf. on Robotics and Automation, pp. 148-155, 2010.
  2. [2] H. Kawasaki, R. Furukawa, R. Sagawa, and Y. Yagi, “Dynamic scene shape re-construction using a single structured light pattern,” IEEE Conf. on Computer Vision and Pattern Recognition, pp. 1-8, 2008.
  3. [3] M. Tateishi, H. Ishiyama, and K. Umeda, “Construction of a Very Compact Range Image Sensor Using a Multi-Slit Laser Projector,” Trans. of the JSME Ser. C, Vol.74, No.739, pp. 499-505, 2008 (in Japanese).
  4. [4] Y. Watanabe, T. Komuro, and M. Ishikawa, “955-fps Real-time Shape Measurement of Moving/Deforming Object using High-speed Vision for Numerous-point Analysis,” Proc. IEEE Int. Conf. on Robotics and Automation, pp. 3192-3197, 2007.
  5. [5] J. Takei, S. Kagami, and K. Hashimoto, “3,000-fps 3-D Shape Measurement Using a High-Speed Camera-Projector System,” Proc. 2007 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 3211-3216, 2007.
  6. [6] Y. Liu, H. Gao, Q. Gu, T. Aoyama, T. Takaki, and I. Ishii, “High-Frame-Rate Structured Light 3-D Vision for Fast Moving Object,” J. of Robotics and Mechatronics, Vol.26, No.3, pp. 311-320, 2014.
  7. [7] J. Chen, Q. Gu, T. Aoyama, T. Takaki, and I. Ishii, “Blink-Spot Projection Method for Fast Three-Dimensional Shape Measurement,” J. of Robotics and Mechatronics, Vol.27, No.4, pp. 430-443, 2015.
  8. [8] K. Suzuki and I. Kumazawa, “High-speed 3D measurement method by single camera using trichromatic illumination (translated from the original title),” The 14th Meeting on Image Recognition and Understanding (MIRU2011), pp. 1429-1436, 2011 (in Japanese).
  9. [9] H. Kawasaki, Y. Horita, H. Morinaga, Y. Matugano, S. Ono, M. Kimura, and Y. Takane, “Structured light with coded aperture for wide range 3D measurement,” IEEE Conf. on Image Processing (ICIP), pp. 2777-2780, 2012.
  10. [10] H. Masuyama, H. Kawasaki, and R. Furukawa, “Depth from Projector’s Defocus Based on Multiple Focus Pattern Projection,” IPSJ Trans. on Computer Vision and Applications, Vol.6, pp. 88-92, 2014.
  11. [11] T. Matsumoto, H. Oku, and M. Ishikawa, “High-Speed Real-Time Depth Estimation by Projecting Structured Light Field,” J. of the Robotics Society of Japan, Vol.34, No.1, pp. 48-55, 2016.
  12. [12] Y. Watanabe, H. Oku, and M. Ishikawa, “Architectures and applications of high-speed vision,” Optical Review, Vol.21, Issue 6, pp. 875-882, 2014.
  13. [13] Joseph W. Goodman, “Introduction To Fourier Optics,” Roberts & Co., 2004.
  14. [14] K. Okumura, K. Yokoyama, H. Oku, and M. Ishikawa, “1 ms Auto Pan-Tilt – video shooting technology for objects in motion based on Saccade Mirror with background subtraction,” Advanced Robotics, Vol.29, No.2, pp. 201-200, 2011.
