
J. Robot. Mechatron. Vol.34 No.5, pp. 985-996, 2022
doi: 10.20965/jrm.2022.p0985

Paper:

Angle of View Switching Method at High-Speed Using Motion Blur Compensation for Infrastructure Inspection

Yuriko Ezaki*, Yushi Moko*, Tomohiko Hayakawa*, and Masatoshi Ishikawa*,**

*Information Technology Center, The University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

**Tokyo University of Science
1-3 Kagurazaka, Shinjuku-ku, Tokyo 162-8601, Japan

Received: March 11, 2022
Accepted: August 10, 2022
Published: October 20, 2022
Keywords: high-speed and high-resolution imaging, galvanometer mirror, infrastructure inspection, angle of view, motion blur
Abstract

Efficient imaging under conditions of high relative velocity between the camera and the subject is achieved using the following imaging system: two galvanometer mirrors are placed vertically in front of the camera, one for motion blur compensation and the other for switching the angle of view. The proposed system overcomes the shortcomings of conventional imaging systems with motion blur compensation, such as a small angle of view, and efficiently acquires high-resolution images. If the viewing angle is switched between captures while the mirrors are held stationary during each exposure, vibration at the mirrors' natural frequency produces noise, leading to poor resolution. This issue can be managed by generating and using a drive input that contains no natural-frequency component. A target moving in one dimension was captured, and the obtained images confirm that the angle of view was extended. The system is expected to be used for inspections in which the relative speed between the camera and the target is high, such as highway tunnel inspections.
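The key control idea in the abstract is a mirror drive input with no energy at the mirrors' natural frequency. As a minimal sketch of one way to realize such an input, the Python snippet below applies a zero-vibration (ZV) input shaper to a step angle command; the parameters f_n, zeta, and fs are illustrative assumptions rather than values from the paper, and the authors' actual input-generation method may differ.

import numpy as np

# Hypothetical galvanometer parameters (illustrative, not from the paper).
f_n = 1000.0    # mirror natural frequency [Hz] (assumed)
zeta = 0.05     # damping ratio (assumed)
fs = 100_000.0  # command sample rate [Hz]

# Zero-vibration (ZV) shaper: two impulses spaced half the damped period
# apart cancel residual vibration at the natural frequency.
K = np.exp(-zeta * np.pi / np.sqrt(1.0 - zeta**2))
amps = np.array([1.0, K]) / (1.0 + K)        # impulse amplitudes (sum to 1)
t2 = 0.5 / (f_n * np.sqrt(1.0 - zeta**2))    # second impulse time [s]

# Raw command: step to the next viewing angle between two exposures.
n = int(0.01 * fs)                           # 10 ms command window
raw = np.zeros(n)
raw[n // 2:] = 1.0                           # normalized angle step

# Shaped command = raw command convolved with the shaper impulses.
shaper = np.zeros(int(round(t2 * fs)) + 1)
shaper[0], shaper[-1] = amps
shaped = np.convolve(raw, shaper)[:n]

# Check: the shaped command's spectrum is strongly attenuated near f_n.
freqs = np.fft.rfftfreq(n, 1.0 / fs)
k = np.argmin(np.abs(freqs - f_n))
print("raw    |X(f_n)| =", abs(np.fft.rfft(raw))[k])
print("shaped |X(f_n)| =", abs(np.fft.rfft(shaped))[k])

Because the shaper only delays and splits the step, the mirror still settles at the commanded angle; the cost is a settling delay of half the damped vibration period, which must fit within the inter-frame interval.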

Figure: Extended angle of view images
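As a rough companion to the figure, the sketch below estimates how far mirror-based view switching can extend the horizontal angle of view. All parameter values are hypothetical; the only physics used is that rotating a mirror by an angle theta deflects the reflected ray by 2*theta.

import math

# Hypothetical parameters (illustrative, not from the paper).
cam_aov_deg = 10.0     # camera's native horizontal angle of view [deg]
mirror_mech_deg = 7.5  # mechanical scan range of the switching mirror, peak-to-peak [deg]
overlap_deg = 1.0      # overlap kept between adjacent views for stitching [deg]

# A mirror rotation of theta deflects the reflected ray by 2*theta,
# so the optical scan range is twice the mechanical range.
optical_scan_deg = 2.0 * mirror_mech_deg

# Each switched view covers cam_aov_deg; the scan shifts the view center
# across optical_scan_deg, so the stitched coverage is their sum.
extended_aov_deg = cam_aov_deg + optical_scan_deg
print(f"extended angle of view: {extended_aov_deg:.1f} deg")

# Number of discrete view positions needed so adjacent views overlap.
step_deg = cam_aov_deg - overlap_deg
n_views = math.ceil(optical_scan_deg / step_deg) + 1
print(f"view positions: {n_views}")

With these assumed numbers, three overlapping view positions stitch a 10-degree camera into roughly 25 degrees of coverage.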

Cite this article as:
Y. Ezaki, Y. Moko, T. Hayakawa, and M. Ishikawa, “Angle of View Switching Method at High-Speed Using Motion Blur Compensation for Infrastructure Inspection,” J. Robot. Mechatron., Vol.34 No.5, pp. 985-996, 2022.
