IJAT Vol.19 No.4 pp. 566-574
doi: 10.20965/ijat.2025.p0566
(2025)

Research Paper:

Robot Localization by Data Integration of Multiple Thermal Cameras in Low-Light Environment

Masaki Chino*,**,†, Junwoon Lee**, Qi An**, and Atsushi Yamashita**

*Hazama Ando Corporation
515-1 Karima, Tsukuba, Ibaraki 305-0822, Japan

**The University of Tokyo
Chiba, Japan

†Corresponding author

Received: January 31, 2025
Accepted: April 17, 2025
Published: July 5, 2025

Keywords: thermal camera, visual odometry, sensor fusion, GNSS
Abstract

A method is proposed for interpolating pose information by integrating data from multiple thermal cameras when a global navigation satellite system (GNSS) temporarily loses accuracy. When the temperature information obtained from the thermal cameras is visualized, a two-stage temperature range restriction is applied so that only areas with temperature variation are emphasized, allowing the data to be converted into clearer images. To compensate for the narrow field of view of thermal cameras, multiple thermal cameras are oriented in different directions. Pose estimation is performed with each camera, and the estimation results of one camera are interpolated using those of the other cameras, weighted by a reliability measure derived from predicted values of the camera pose. Experimental results obtained in a low-light nighttime environment demonstrate that the proposed method achieves higher pose estimation accuracy than other state-of-the-art methods.
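The two-stage temperature range restriction described in the abstract can be sketched as follows. This is an illustrative assumption of how such a conversion might work, not the authors' actual implementation: the function name, the coarse range limits (`coarse_lo`, `coarse_hi`, here in raw sensor counts), and all numeric values are hypothetical.

```python
import numpy as np

def thermal_to_image(raw, coarse_lo=2730, coarse_hi=3730):
    """Convert 16-bit raw thermal data to an 8-bit image via a
    two-stage temperature range restriction (illustrative sketch).

    Stage 1: clip to a coarse, scene-level range so extreme
    outliers (sky, strong heat sources) do not dominate.
    Stage 2: re-restrict to the range actually present after
    stage 1, so the full 8-bit scale is spent only on areas
    with temperature variation.
    """
    # Stage 1: coarse restriction to a plausible scene range.
    stage1 = np.clip(raw.astype(np.float64), coarse_lo, coarse_hi)
    # Stage 2: restrict to the observed min/max within that range.
    lo, hi = stage1.min(), stage1.max()
    if hi - lo < 1e-6:
        # Uniform-temperature frame: return mid-gray.
        return np.full(raw.shape, 128, dtype=np.uint8)
    # Linearly map the restricted range to 0..255.
    return np.round((stage1 - lo) / (hi - lo) * 255).astype(np.uint8)

# Example: a synthetic 16-bit frame with a small warm region.
frame = np.full((4, 4), 3000, dtype=np.uint16)
frame[1:3, 1:3] = 3100
img = thermal_to_image(frame)
```

Because the second stage rescales only the temperatures that survive the coarse clip, a 100-count variation that would occupy a narrow gray band under a fixed mapping instead spans the whole 8-bit range, which is the effect the paper relies on to obtain clearer images for visual odometry.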

Cite this article as:
M. Chino, J. Lee, Q. An, and A. Yamashita, “Robot Localization by Data Integration of Multiple Thermal Cameras in Low-Light Environment,” Int. J. Automation Technol., Vol.19 No.4, pp. 566-574, 2025.
