
JRM Vol.22 No.6 pp. 708-717 (2010)
doi: 10.20965/jrm.2010.p0708

Paper:

Road-Crossing Landmarks Detection by Outdoor Mobile Robots

Aneesh Chand and Shin'ichi Yuta

Graduate School of Systems and Information Engineering, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki 305-8573, Japan

Received:
March 7, 2010
Accepted:
June 13, 2010
Published:
December 20, 2010
Keywords:
outdoor robot navigation
Abstract

A salient shortfall of most outdoor mobile robots is their inability to autonomously cross roads while traveling along pedestrian sidewalks in urban environments. A robot able to cross roads could travel longer and more complex routes than would otherwise be possible. To this end, the authors have been developing technologies to endow outdoor mobile robots with such a road-crossing capability. This paper presents a system for the detection and localization of road-crossing landmarks by outdoor mobile robots. We show how a robot equipped with a single monocular camera and a laser range finder can accurately detect, identify, and localize road-crossing landmarks such as pedestrian push-button boxes, zebra crossings, and pedestrian lights, which the robot must recognize and use in order to cross roads autonomously. Experimental results and future plans are also discussed.
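To illustrate the kind of image processing that zebra-crossing detection involves, the sketch below counts the bright painted bands of a crosswalk in a synthetic grayscale image by thresholding its row-intensity profile. The function name and threshold are hypothetical, and this is a deliberately minimal sketch: the paper's detector must additionally cope with perspective distortion, lighting variation, and false positives, and it fuses the camera result with laser range finder data for localization.

```python
import numpy as np

def count_zebra_stripes(gray, thresh=128):
    """Count bright horizontal stripes in a grayscale image.

    Averages each row, thresholds the resulting intensity profile,
    and counts dark-to-bright transitions. Illustrative only: a real
    detector would also verify stripe width, spacing, and geometry.
    """
    profile = gray.mean(axis=1)            # mean intensity per row
    binary = profile > thresh              # True for bright (painted) rows
    # Count rising edges (dark row followed by bright row).
    rising = np.count_nonzero(~binary[:-1] & binary[1:])
    if binary[0]:
        rising += 1                        # stripe touching the top edge
    return rising

# Synthetic crosswalk patch: four 10-px white bands on dark asphalt.
img = np.full((80, 60), 40, dtype=np.uint8)
for top in range(0, 80, 20):
    img[top:top + 10, :] = 220
print(count_zebra_stripes(img))            # -> 4
```

In practice the row-profile scan would be applied after rectifying the camera image to a top-down view, so that the stripes of a crossing ahead of the robot really do appear as horizontal bands.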

Cite this article as:
Aneesh Chand and Shin'ichi Yuta, "Road-Crossing Landmarks Detection by Outdoor Mobile Robots," J. Robot. Mechatron., Vol.22, No.6, pp. 708-717, 2010.

