
JRM Vol.19 No.1 pp. 34-41 (2007)
doi: 10.20965/jrm.2007.p0034

Paper:

Mobile Robot with Floor Tracking Device for Localization and Control

Isaku Nagai and Yutaka Tanaka

Division of Industrial Innovation Sciences, Graduate School of Natural Science and Technology, Okayama University, 3-1-1 Tsushima-Naka, Okayama-shi, Okayama 700-8530, Japan

Received: October 31, 2005
Accepted: September 12, 2006
Published: February 20, 2007

Keywords: mobile robot, localization, visual tracking, floor image, tracked vehicle
Abstract
We developed a visual device that tracks floor images and calculates the movement of a camera mounted on a mobile robot. The robot has caterpillar treads and uses our visual tracking device for localization. It is localized and controlled in real time, based on the estimated position and direction, by an FPGA, SRAM, and a small CPU board. Position and direction errors accumulated over a closed path are eliminated by searching for the original floor image memorized at the point from which the robot started its run. Experimental results demonstrate the advantages of the proposed visual tracking device for localizing a mobile robot under wheel slippage and changing lighting conditions. We also show that the robot runs along a closed path repeatedly without straying from the track by using the original image to correct accumulated error.
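
The localization scheme described above amounts to dead reckoning on per-frame floor-image motion, plus a loop-closure correction when the floor image memorized at the start point is re-found. The sketch below is a minimal illustration of that idea, not the authors' implementation: the names FloorOdometry and ncc and the 0.9 match threshold are hypothetical, and the paper's search for the memorized image is stood in for by a single correlation test.

```python
import math
import numpy as np

class FloorOdometry:
    """Dead reckoning: integrate per-frame floor-image motion
    (dx, dy, dtheta) into a global pose (x, y, theta)."""

    def __init__(self):
        self.x = 0.0      # position [m]
        self.y = 0.0
        self.theta = 0.0  # heading [rad]

    def update(self, dx, dy, dtheta):
        # Rotate the camera-frame translation into the world frame,
        # then accumulate; small per-frame errors build up over a run.
        c, s = math.cos(self.theta), math.sin(self.theta)
        self.x += c * dx - s * dy
        self.y += s * dx + c * dy
        self.theta += dtheta

    def reset_to(self, x_ref, y_ref, theta_ref):
        # Loop closure: snap the pose back to the reference pose
        # recorded where the start image was memorized.
        self.x, self.y, self.theta = x_ref, y_ref, theta_ref

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size
    grayscale patches; a value near 1.0 signals that the memorized
    start image has been re-found."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = math.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

# Example run: integrate one frame of motion, then test for loop closure.
odo = FloorOdometry()
odo.update(0.01, 0.0, 0.002)           # measured floor-image motion
start_img = np.zeros((32, 32))         # image memorized at the start point
current_img = np.zeros((32, 32))       # image seen now
if ncc(current_img, start_img) > 0.9:  # threshold is an assumption
    odo.reset_to(0.0, 0.0, 0.0)        # eliminate accumulated error
```

Resetting the pose at loop closure, rather than letting odometric error grow, is what lets the robot repeat the closed path without straying from the track.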
Cite this article as:
I. Nagai and Y. Tanaka, “Mobile Robot with Floor Tracking Device for Localization and Control,” J. Robot. Mechatron., Vol.19 No.1, pp. 34-41, 2007.
