
JRM Vol.27 No.4 pp. 356-364
doi: 10.20965/jrm.2015.p0356
(2015)

Paper:

Development of an Autonomous Mobile Robot with Self-Localization and Searching Target in a Real Environment

Masatoshi Nomatsu, Youhei Suganuma, Yosuke Yui, and Yutaka Uchimura

Shibaura Institute of Technology
3-7-5 Toyosu, Koutou-ku, Tokyo 135-8548, Japan

Received:
January 20, 2015
Accepted:
April 16, 2015
Published:
August 20, 2015
Keywords:
autonomous mobile robot, self localization, 3D laser range finder, map matching method, point cloud processing
Abstract

[Figure: Developed autonomous mobile robot]
In describing real-world self-localization and target-search methods, this paper discusses a mobile robot developed to verify a method proposed for Tsukuba Challenge 2014. The Tsukuba Challenge course includes promenades and parks containing ordinary pedestrians and bicyclists, requiring the robot to move toward a goal while avoiding the moving objects around it. Common self-localization methods often rely on two-dimensional (2D) laser range finders (LRFs), but such LRFs do not always capture enough data for localization if, for example, the scanned plane has few landmarks. To solve this problem, we used a three-dimensional (3D) LRF for self-localization. The 3D LRF captures more data than the 2D type, resulting in more robust localization. Robots that provide practical services in real life must, among other functions, recognize a target and serve it autonomously. To enable robots to do so, this paper describes a method for searching for a target by using a clustered point cloud from the 3D LRF together with image processing of color images captured by cameras. In Tsukuba Challenge 2014, the robot we developed, employing the proposed methods, completed the course and found the targets, verifying the effectiveness of our proposals.
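The target-search step described above, clustering the 3D LRF point cloud and then checking each cluster against a color cue from the camera images, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the greedy single-linkage clustering, the `radius`, `min_size`, and color-threshold parameters, and the per-point RGB input are all assumptions made for the sketch.

```python
import numpy as np

def euclidean_cluster(points, radius=0.3, min_size=3):
    """Greedy single-linkage clustering of an (N, 3) point array.

    A simplified stand-in for the paper's point-cloud clustering step:
    points closer than `radius` are chained into the same cluster, and
    clusters smaller than `min_size` are discarded as noise.
    """
    n = len(points)
    labels = -np.ones(n, dtype=int)  # -1 means "not yet assigned"
    cluster_id = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        # Flood-fill over neighbors within `radius` of any cluster member
        frontier = [i]
        labels[i] = cluster_id
        while frontier:
            j = frontier.pop()
            dists = np.linalg.norm(points - points[j], axis=1)
            for k in np.where((dists < radius) & (labels == -1))[0]:
                labels[k] = cluster_id
                frontier.append(k)
        cluster_id += 1
    return [np.where(labels == c)[0]
            for c in range(cluster_id)
            if np.count_nonzero(labels == c) >= min_size]

def color_score(rgb, idx, lo=(180, 60, 0), hi=(255, 160, 90)):
    """Fraction of a cluster's points whose projected RGB value falls in a
    target color band (the band here is a hypothetical orange-ish range)."""
    sub = rgb[idx]
    in_band = np.all((sub >= lo) & (sub <= hi), axis=1)
    return in_band.mean()
```

A cluster would then be reported as a target candidate when its `color_score` exceeds a chosen threshold; in practice the real system must also handle the LRF-to-camera calibration that associates each 3D point with an image pixel, which is omitted here.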
Cite this article as:
M. Nomatsu, Y. Suganuma, Y. Yui, and Y. Uchimura, “Development of an Autonomous Mobile Robot with Self-Localization and Searching Target in a Real Environment,” J. Robot. Mechatron., Vol.27 No.4, pp. 356-364, 2015.
References
  1. N. Akai, K. Inoue, and K. Ozaki, “Autonomous Navigation Based on Magnetic and Geometric Landmarks on Environmental Structure in Real World,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 158-165, 2014.
  2. J. Eguchi and K. Ozaki, “Development of the Autonomous Mobile Robot for Target-Searching in Urban Areas in the Tsukuba Challenge 2013,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 166-176, 2014.
  3. K. Okawa, “Three Tiered Self-Localization of Two Position Estimation Using Three Dimensional Environment Map and Gyro-Odometry,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 196-203, 2014.
  4. S. Muramatsu, T. Tomizawa, S. Kudoh, and T. Suehiro, “Development of Intelligent Mobile Cart in a Crowded Environment – Robust Localization Technique with Unknown Objects –,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 204-213, 2014.
  5. M. Yokozuka and O. Matsumoto, “A Reasonable Path Planning via Path Energy Minimization,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 236-244, 2014.
  6. M. Saito et al., “Pre-Driving Needless System for Autonomous Mobile Robots Navigation in Real World Robot Challenge 2013,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 185-195, 2014.
  7. T. Shioya, K. Kogure, and N. Ohta, “Minimal Autonomous Mover – MG-11 for Tsukuba Challenge –,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 225-235, 2014.
  8. A. Ohya and S. Yuta, “Optical Range Sensor for Mobile Robot’s Environment Recognition and Its Application,” Conf. on System Integration (SICE2004), pp. 1067-1070, 2004 (in Japanese).
  9. T. Suzuki et al., “Autonomous Navigation of a Mobile Robot Based on GNSS/DR Integration in Outdoor Environments,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 214-224, 2014.
  10. K. Akira et al., “Development of Total Recording System for GPS: as an Evaluation Tool of the Positional Presumption Accuracy Degradation by Multipath Propagation,” Technical Report of the Institute of Electronics, Information and Communication Engineers, ITS, Vol.109, No.58, pp. 1-6, 2009 (in Japanese).
  11. J. Kikuchi, H. Date, S. Okawa, Y. Takita, and K. Kobayashi, “Detection of a Sitting Human Using 3D LIDAR,” Annual Conf. of the Robotics Society of Japan 2013, RSJ2013AC1E3-03, 2013 (in Japanese).
  12. K. Yamauchi et al., “Person Detection Method Based on Color Layout in Real World Robot Challenge 2013,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 151-157, 2014.
  13. K. Konolige and K. Chou, “Markov Localization Using Correlation,” Int. Joint Conf. on Artificial Intelligence (IJCAI ’99), pp. 1154-1159, 1999.
  14. E. B. Olson, “Real-Time Correlative Scan Matching,” IEEE Int. Conf. on Robotics and Automation (ICRA ’09), pp. 4387-4393, 2009.
  15. K. Komiya, S. Miyashita, Y. Maruoka, and Y. Uchimura, “Control of Autonomous Mobile Robot Using Map Matching with Optimized Search Range,” Electrical Engineering in Japan, Vol.190, Issue 4, pp. 66-75, 2015.
  16. K. Higuchi, “Particle Filter,” J. of the Institute of Electronics, Information and Communication Engineers, Vol.88, No.12, pp. 989-994, 2005 (in Japanese).
  17. G. Kitagawa, “Monte Carlo Filter and Smoothing,” Proc. of the Institute of Statistical Mathematics, Vol.44, No.1, pp. 31-48, 1996 (in Japanese).
  18. S. Thrun, W. Burgard, and D. Fox, “Probabilistic Robotics,” MIT Press, 2005.
  19. T. Shikina et al., “Approach for Localization and Motion Planning in Tsukuba Challenge 2011,” Conf. on System Integration (SICE SI2011), pp. 1750-1753, 2011 (in Japanese).

