
IJAT Vol.19 No.3 pp. 178-191 (2025)
doi: 10.20965/ijat.2025.p0178

Research Paper:

Change Detection in Image Pairs for Plant Inspection Using Mobile Robot

Susumu Shimizu*1, Takuya Igaue*1,†, Jun Younes Louhi Kasahara*1, Naoya Yamato*2, Seiji Kasahara*2, Hiroyuki Ito*2, Taizo Daito*2, Sunao Tamura*2, Akinobu Sasamura*2, Toshiya Kato*2, Fumihiko Nonaka*2, Shinji Kanda*3, Keiji Nagatani*1, Hajime Asama*1, Qi An*4, and Atsushi Yamashita*4

*1Department of Precision Engineering, Graduate School of Engineering, The University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

†Corresponding author

*2Engineering & Capital Planning Department, ENEOS Corporation
Tokyo, Japan

*3Research into Artifacts, Center for Engineering, The University of Tokyo
Tokyo, Japan

*4Department of Human and Engineered Environmental Studies, Graduate School of Frontier Sciences, The University of Tokyo
Kashiwa, Japan

Received: August 20, 2024
Accepted: November 29, 2024
Published: May 5, 2025
Keywords: robot system, image change detection, autonomous plant inspection
Abstract

In this study, we propose a system that detects changes in three-dimensional (3D) space for autonomous plant visual inspection by a mobile robot. Videos captured by the mobile robot during past inspections are compared with videos obtained during the current inspection using both pose information and the acquired images. To ensure robustness against changes in shooting conditions, change detection is performed using deep learning techniques. The detected changes are then projected onto 3D space to localize them. To verify the effectiveness of the proposed method, experiments were conducted in both a real plant environment and a simulated indoor plant environment. The results of the outdoor experiments showed that the proposed system achieved image pair determination, change detection, and integration into 3D space. The results of the indoor experiments and evaluations confirmed that the proposed image pair determination method was suitable in terms of detection accuracy and computation time.
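To illustrate the pipeline summarized above, the following is a minimal sketch, not the authors' implementation, of two of its steps: pairing current inspection frames with past frames by the nearest recorded robot pose, and back-projecting a detected 2D change mask into 3D. It assumes a simple nearest-pose criterion, a pinhole camera model, and an available depth image; all function names, parameters, and toy data are hypothetical.

# Hypothetical sketch (not the authors' code): nearest-pose image pairing and
# back-projection of a 2D change mask into 3D world coordinates.
import numpy as np

def pair_by_nearest_pose(current_poses, past_poses):
    """For each current camera pose (x, y, z, yaw), return the index of the
    past pose with the smallest positional distance."""
    cur = np.asarray(current_poses)[:, :3]   # positions only
    old = np.asarray(past_poses)[:, :3]
    # Pairwise Euclidean distances between current and past positions.
    d = np.linalg.norm(cur[:, None, :] - old[None, :, :], axis=2)
    return d.argmin(axis=1)

def backproject_change_mask(mask, depth, K, T_world_cam):
    """Lift changed pixels (mask == True) into world coordinates using a
    depth image, intrinsic matrix K, and a camera-to-world transform."""
    v, u = np.nonzero(mask)                  # pixel rows and columns
    z = depth[v, u]
    x = (u - K[0, 2]) * z / K[0, 0]          # pinhole model
    y = (v - K[1, 2]) * z / K[1, 1]
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=0)  # homogeneous
    return (T_world_cam @ pts_cam)[:3].T                    # N x 3 points

if __name__ == "__main__":
    # Toy data: three current poses matched against four past poses.
    current = [(0.0, 0.0, 0.0, 0.0), (1.0, 0.1, 0.0, 0.0), (2.0, 0.0, 0.0, 0.1)]
    past = [(0.1, 0.0, 0.0, 0.0), (0.9, 0.0, 0.0, 0.0),
            (2.1, 0.1, 0.0, 0.0), (3.0, 0.0, 0.0, 0.0)]
    print(pair_by_nearest_pose(current, past))  # -> [0 1 2]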

Cite this article as:
S. Shimizu, T. Igaue, J. Y. L. Kasahara, N. Yamato, S. Kasahara, H. Ito, T. Daito, S. Tamura, A. Sasamura, T. Kato, F. Nonaka, S. Kanda, K. Nagatani, H. Asama, Q. An, and A. Yamashita, “Change Detection in Image Pairs for Plant Inspection Using Mobile Robot,” Int. J. Automation Technol., Vol.19 No.3, pp. 178-191, 2025.
