
JRM Vol.32 No.6 pp. 1233-1243
doi: 10.20965/jrm.2020.p1233
(2020)

Paper:

Arbitrary Viewpoint Visualization for Teleoperated Hydraulic Excavators

Tatsuki Nagano*, Ryosuke Yajima*, Shunsuke Hamasaki*, Keiji Nagatani*, Alessandro Moro*, Hiroyuki Okamoto**, Genki Yamauchi***, Takeshi Hashimoto***, Atsushi Yamashita*, and Hajime Asama*

*The University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

**RITECS Inc.
3-5-11 Shibasaki-cho, Tachikawa-shi, Tokyo 190-0023, Japan

***Public Works Research Institute
1-6 Minamihara, Tsukuba-shi, Ibaraki 300-2621, Japan

Received: May 6, 2020
Accepted: October 5, 2020
Published: December 20, 2020
Keywords: arbitrary viewpoint image, visualization, fish-eye camera, hydraulic excavator, teleoperation
Abstract

[Figure] Arbitrary viewpoint image generated by the proposed system

In this paper, we propose a visualization system for the teleoperation of excavation work using a hydraulic excavator. An arbitrary viewpoint visualization system enables teleoperators to observe the environment around a machine by combining multiple camera images. However, when such a system is applied to machines with arms, such as hydraulic excavators, part of the field of view is blocked by the image of the arm, so an occluded region appears behind it. Furthermore, it is difficult for teleoperators to grasp the three-dimensional (3D) condition of the excavating point because the current system approximates the surrounding environment with a predetermined shape. To solve these problems, we propose two methods: (1) a method to reduce the occluded region and expand the field of view, and (2) a method to measure the 3D shape of the excavating point and integrate it into the image. In addition, we conduct experiments using a real hydraulic excavator and demonstrate that images of sufficient accuracy can be presented in real time.
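At the core of any arbitrary viewpoint system of the kind the abstract describes is the reprojection of measured 3D geometry into the image of a freely placed virtual camera. As a rough illustration of that underlying geometry only (not the authors' implementation, and with hypothetical intrinsic values), a pinhole projection of measured 3D points into a virtual viewpoint can be sketched as:

```python
import numpy as np

def project_points(points_world, R, t, K):
    """Project Nx3 world points into pixel coordinates of a pinhole camera.

    R, t: rotation matrix and translation of the world-to-camera transform.
    K: 3x3 intrinsic matrix of the virtual camera.
    """
    pts_cam = R @ points_world.T + t.reshape(3, 1)  # world frame -> camera frame
    pts_img = K @ pts_cam                           # camera frame -> image plane
    return (pts_img[:2] / pts_img[2]).T             # perspective divide -> (u, v)

# Hypothetical virtual camera: identity pose, 500 px focal length,
# principal point at the center of a 640x480 image.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)

# A measured 3D point 2 m in front of the virtual camera, on its optical axis.
pt = np.array([[0.0, 0.0, 2.0]])
print(project_points(pt, R, t, K))  # lands at the principal point (320, 240)
```

In a full system, each pixel of the rendered viewpoint would instead be colored by looking up the corresponding ray in the fisheye camera images after projecting the measured terrain surface, but the world-to-camera-to-pixel chain above is the common geometric step.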

Cite this article as:
T. Nagano, R. Yajima, S. Hamasaki, K. Nagatani, A. Moro, H. Okamoto, G. Yamauchi, T. Hashimoto, A. Yamashita, and H. Asama, “Arbitrary Viewpoint Visualization for Teleoperated Hydraulic Excavators,” J. Robot. Mechatron., Vol.32, No.6, pp. 1233-1243, 2020.
