
JRM Vol.29 No.5 pp. 887-894 (2017)
doi: 10.20965/jrm.2017.p0887

Paper:

Development and Performance Evaluation of Planar Travel Distance Sensors for Mobile Robots in Sandy Terrain

Arata Yanagisawa and Genya Ishigami

Keio University
3-14-1 Hiyoshi, Kohoku, Yokohama, Kanagawa 223-8522, Japan

Received: March 20, 2017
Accepted: July 26, 2017
Published: October 20, 2017
Keywords: localization, mobile robot, optical flow sensor, sandy terrain
Abstract

A planar travel distance sensor (two-dimensional sensor) was developed for mobile robots operating on sandy terrain. The sensor system integrates an optical flow device into a small module with a simple configuration. It achieves a high sampling rate on the order of milliseconds as well as measurement precision on a sub-millimeter order. Its performance was evaluated experimentally in terms of measurement accuracy and repeatability, velocity response, and robustness to variations in the sensor's height above the terrain and in terrain surface characteristics. The experimental results confirm that the two-dimensional sensor system is accurate, with a travel distance error of less than a few percent, and that it covers a wide dynamic range of robot traveling velocities. On the basis of these experimental findings, this paper also discusses the applicability of the two-dimensional sensor to practical scenarios.
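As a rough illustration of the measurement principle described above, the sketch below shows how per-sample (dx, dy) pixel-displacement counts from an optical flow device can be integrated into a planar travel distance, and how the travel distance error in percent can be computed against ground truth. This is a minimal Python sketch, not the authors' implementation: the counts-per-inch resolution, the sample stream, and all numeric values are hypothetical placeholders.

    # Minimal sketch (not the authors' implementation): integrating the
    # per-sample (dx, dy) pixel-displacement counts of an optical flow
    # device into a planar travel distance. The resolution and the
    # sample stream below are hypothetical placeholders.

    COUNTS_PER_INCH = 1600                  # hypothetical sensor resolution
    MM_PER_COUNT = 25.4 / COUNTS_PER_INCH   # metric scale per count (~0.016 mm)

    def integrate_flow(samples):
        """Accumulate (dx, dy) count pairs into metric planar travel."""
        x_mm = y_mm = path_mm = 0.0
        for dx, dy in samples:
            step_x = dx * MM_PER_COUNT
            step_y = dy * MM_PER_COUNT
            x_mm += step_x
            y_mm += step_y
            path_mm += (step_x ** 2 + step_y ** 2) ** 0.5
        return x_mm, y_mm, path_mm

    def percent_error(measured_mm, true_mm):
        """Travel distance error as a percentage of ground truth."""
        return 100.0 * abs(measured_mm - true_mm) / true_mm

    # Example: a fake millisecond-order sample stream (1000 samples at 1 kHz),
    # moving 4 counts per sample along x only.
    samples = [(4, 0)] * 1000
    x, y, dist = integrate_flow(samples)
    print(f"x = {x:.2f} mm, y = {y:.2f} mm, path = {dist:.2f} mm")
    print(f"error vs. 63.5 mm ground truth: {percent_error(dist, 63.5):.2f}%")

With a millisecond-order sampling period, each (dx, dy) pair corresponds to a sub-millimeter displacement step, so the per-count scale factor is what sets the achievable measurement precision.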


Cite this article as:
A. Yanagisawa and G. Ishigami, “Development and Performance Evaluation of Planar Travel Distance Sensors for Mobile Robots in Sandy Terrain,” J. Robot. Mechatron., Vol.29 No.5, pp. 887-894, 2017.
References
  [1] M. Yokozuka and O. Matsumoto, “Accurate Localization for Making Maps to Mobile Robots Using Odometry and GPS Without Scan-Matching,” J. Robotics and Mechatronics, Vol.27, No.4, pp. 410-418, 2015.
  [2] J. Eguchi and K. Ozaki, “Development of Method Using a Combination of DGPS and Scan Matching for the Making of Occupancy Grid Maps for Localization,” J. Robotics and Mechatronics, Vol.25, No.3, pp. 506-514, 2013.
  [3] T. Suzuki, M. Kitamura, and Y. Amano, “Autonomous Navigation of a Mobile Robot Based on GNSS/DR Integration in Outdoor Environments,” J. Robotics and Mechatronics, Vol.26, No.2, pp. 214-224, 2014.
  [4] T. M. Howard, A. Morfopoulos, J. Morrison, Y. Kuwata, C. Villalpando, L. Matthies, and M. McHenry, “Enabling Continuous Planetary Rover Navigation through FPGA Stereo and Visual Odometry,” Proc. IEEE Aerospace Conf., pp. 1-9, 2012.
  [5] D. Nister, O. Naroditsky, and J. Bergen, “Visual Odometry for Ground Vehicle Applications,” J. Field Robotics, Vol.23, pp. 3-20, 2006.
  [6] A. Sujiwo et al., “Monocular Vision-Based Localization Using ORB-SLAM with LIDAR-Aided Mapping in Real-World Robot Challenge,” J. Robotics and Mechatronics, Vol.28, No.4, pp. 479-490, 2016.
  [7] S. Maeyama, N. Ishikawa, and S. Yuta, “Rule based filtering and fusion of odometry and gyroscope for a fail safe dead reckoning system of a mobile robot,” Proc. IEEE Int. Conf. Multisensor Fusion and Integration for Intelligent Systems, pp. 541-548, 1996.
  [8] K. Nagatani, D. Endo, and K. Yoshida, “Improvement of the Odometry Accuracy of a Crawler Vehicle with Consideration of Slippage,” Proc. IEEE Int. Conf. Robotics and Automation, pp. 2752-2757, 2007.
  [9] B. Barshan and H. F. Durrant-Whyte, “Inertial Navigation Systems for Mobile Robots,” IEEE Trans. on Robotics and Automation, Vol.11, No.3, pp. 328-342, 1995.
  [10] L. Ojeda and J. Borenstein, “Improved Position Estimation for Mobile Robots on Rough Terrain Using Attitude Information,” Technical Report, Department of Mechanical Engineering, The University of Michigan, 2001.
  [11] M. Maimone, Y. Cheng, and L. Matthies, “Two Years of Visual Odometry on the Mars Exploration Rovers,” J. Field Robotics, Vol.24, pp. 169-186, 2007.
  [12] K. S. Ali et al., “Attitude and Position on the Mars Exploration Rovers,” Proc. Int. Conf. Systems, Man and Cybernetics, pp. 20-27, 2005.
  [13] S. Lee and J.-B. Song, “Robust Mobile Robot Localization using Optical Flow Sensor and Encoder,” Proc. IEEE Int. Conf. Robotics and Automation, pp. 1039-1044, 2004.
  [14] D. Sekimori and F. Miyazaki, “Self-Localization for Indoor Mobile Robots Based on Optical Mouse Sensor Values and Simple Global Camera Information,” Proc. IEEE Int. Conf. Robotics and Biomimetics, pp. 605-610, 2005.
  [15] N. Tunwattana et al., “Investigations into the effects of illumination and acceleration on optical mouse sensors as contact-free 2D measurement devices,” Sensors and Actuators A: Physical, Vol.149, pp. 87-92, 2008.
  [16] W. Xin and K. Shida, “Optical Mouse Sensor for Detecting Height Variation and Translation of a Surface,” Proc. IEEE Int. Conf. Industrial Technology, pp. 205-208, 2004.
  [17] R. Ross and S. Wang, “Toward Refocused Optical Mouse Sensors for Outdoor Optical Flow Odometry,” IEEE Sensors J., Vol.12, pp. 1925-1932, 2012.
  [18] I. Nagai, K. Watanabe, K. Nagatani, and K. Yoshida, “Noncontact position estimation device with optical sensor and laser sources for mobile robots traversing slippery terrains,” Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, pp. 3422-3427, 2010.
  [19] H. Dahmen, A. Millers, and H. A. Mallot, “Insect-inspired odometry by optic flow recorded with optical mouse chips,” in “Flying Insects and Robots,” Springer, pp. 115-126, 2009.
  [20] F. Expert, S. Viollet, and F. Ruffier, “A mouse sensor and a 2-pixel motion sensor exposed to continuous illuminance changes,” Proc. IEEE Sensors Conf., pp. 974-977, 2011.
  [21] D. Honegger, L. Meier, P. Tanskanen, and M. Pollefeys, “An Open Source and Open Hardware Embedded Metric Optical Flow CMOS Camera for Indoor and Outdoor Applications,” Proc. IEEE Int. Conf. Robotics and Automation, pp. 1736-1741, 2013.
