
JRM Vol.30 No.1 pp. 106-116 (2018)
doi: 10.20965/jrm.2018.p0106

Paper:

Sensor Data Fusion of a Redundant Dual-Platform Robot for Elevation Mapping

Avi Turgeman*, Shraga Shoval**, and Amir Degani*,***

*Technion Autonomous Systems Program (TASP), Technion
Haifa 32000, Israel

**Department of Industrial Engineering and Management, Ariel University
Ariel 40700, Israel

***Faculty of Civil and Environmental Engineering, Technion
Haifa 32000, Israel

Received: June 25, 2017
Accepted: November 14, 2017
Published: February 20, 2018
Keywords: signal estimation, data fusion, robotic sensing, robot control, robotic terrain mapping
Abstract

This paper presents a novel methodology for localization and terrain mapping along a defined course, such as narrow tunnels and pipes, using a redundant unmanned ground vehicle kinematic design. The vehicle is designed to work in unknown environments without the use of external sensors. The design consists of two platforms connected by a passive, semi-rigid three-bar mechanism. Each platform includes its own set of local sensors and a controller. In addition, a central controller logs the data and synchronizes the platforms' motion. Based on the dynamic patterns of the redundant information, a fusion algorithm built on a centralized Kalman filter receives data from the different sets of inputs (mapping techniques) and produces an elevation map along the traversed route in the x-z sagittal plane. The method is tested in various scenarios using simulated and real-world setups. The experimental results show a high degree of accuracy on different terrains. The proposed system is suitable for mapping terrains in confined spaces, such as underground tunnels and wrecks, where standard mapping devices such as GPS, laser scanners, and cameras are not applicable.
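To make the centralized fusion step concrete, the following is a minimal sketch of a Kalman filter that stacks redundant elevation measurements from the two platforms into a single update. The one-dimensional state, the noise values, and the names fuse_step, z_front, and z_rear are illustrative assumptions for this sketch, not the paper's actual models or parameters.

```python
import numpy as np

# Minimal sketch of a centralized Kalman filter fusing redundant elevation
# measurements from the two platforms (hypothetical state and noise values).

x_est = np.array([0.0])          # initial elevation estimate [m]
P = np.array([[1.0]])            # initial estimate covariance
Q = np.array([[1e-4]])           # process noise (terrain variation between steps)
R = np.diag([4e-4, 4e-4])        # measurement noise of the two platforms
H = np.array([[1.0], [1.0]])     # both platforms observe the same elevation

def fuse_step(x_est, P, z_front, z_rear):
    """One predict/update cycle with a stacked (centralized) measurement vector."""
    # Predict: elevation modeled as a slowly varying signal (random walk).
    x_pred = x_est
    P_pred = P + Q

    # Update: stack both platforms' measurements into a single vector.
    z = np.array([z_front, z_rear])
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(1) - K @ H) @ P_pred
    return x_new, P_new

# Example: fuse a few noisy elevation readings along the traversed route.
for z_f, z_r in [(0.02, 0.01), (0.05, 0.06), (0.11, 0.10)]:
    x_est, P = fuse_step(x_est, P, z_f, z_r)
    print(f"fused elevation: {x_est[0]:.3f} m")
```

Because both platforms observe the terrain through the same linking mechanism, stacking their measurements in one update lets the filter weight them by their respective noise covariances rather than averaging them blindly.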

Redundant dual-platform prototype

Cite this article as:
A. Turgeman, S. Shoval, and A. Degani, “Sensor Data Fusion of a Redundant Dual-Platform Robot for Elevation Mapping,” J. Robot. Mechatron., Vol.30 No.1, pp. 106-116, 2018.
