
JRM Vol.28 No.4 pp. 432-440 (2016)
doi: 10.20965/jrm.2016.p0432

Paper:

Development of Autonomous Robot with Simple Navigation System for Tsukuba Challenge 2015

Yuta Kanuki and Naoya Ohta

Gunma University
1-5-1 Tenjin-cho, Kiryu-shi, Gunma 376-8515, Japan

Received:
January 20, 2016
Accepted:
April 23, 2016
Published:
August 20, 2016
Keywords:
minimal autonomous mover, 2D environment map, pyramid map matching, color image processing
Abstract

Fig.: MercuryMega (SICK laser model)

This paper introduces the robot developed for Tsukuba Challenge 2015. One of the team’s design goals was a simple sensor configuration and control algorithm. For example, the robot’s laser range finders (LRFs) were of the fixed type, and localization was accomplished with a single LRF and no other sensor. Environment map matching was performed by an algorithm that applied binary image processing to pyramid images to reduce computational cost. Shape information from the LRF and color information from a camera were combined to detect the signboard placed near each target person. In an actual run in rainy weather, the robot completed the entire course and detected two of the four target persons.
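The pyramid map matching mentioned above can be illustrated with a short sketch. The paper's exact algorithm is not reproduced here; the following is a minimal coarse-to-fine translation search over binary occupancy grids, with all function names and parameters (`build_pyramid`, `pyramid_match`, `levels`, `radius`) chosen for illustration. Each pyramid level OR-pools 2x2 cells, so an exhaustive search at the coarsest level is cheap, and the best shift is refined by only a 3x3 neighborhood at each finer level.

```python
import numpy as np

def build_pyramid(grid, levels):
    """Binary occupancy pyramid: each level halves resolution.
    A coarse cell is occupied if any of its 2x2 children is (logical OR)."""
    pyramid = [grid.astype(bool)]
    for _ in range(levels - 1):
        g = pyramid[-1]
        h, w = g.shape[0] // 2 * 2, g.shape[1] // 2 * 2  # trim to even size
        g = g[:h, :w]
        coarse = g[0::2, 0::2] | g[1::2, 0::2] | g[0::2, 1::2] | g[1::2, 1::2]
        pyramid.append(coarse)
    return pyramid

def match_score(map_img, scan_img, dx, dy):
    """Count occupied cells that coincide when the scan is shifted by (dx, dy)."""
    h, w = scan_img.shape
    H, W = map_img.shape
    x0, y0 = max(0, dx), max(0, dy)
    x1, y1 = min(W, dx + w), min(H, dy + h)
    if x1 <= x0 or y1 <= y0:
        return 0
    sub_map = map_img[y0:y1, x0:x1]
    sub_scan = scan_img[y0 - dy:y1 - dy, x0 - dx:x1 - dx]
    return int(np.count_nonzero(sub_map & sub_scan))

def pyramid_match(map_img, scan_img, levels=2, radius=8):
    """Coarse-to-fine search: exhaustive over +/-radius at the top level,
    then refine the doubled best shift within +/-1 cell per finer level."""
    map_pyr = build_pyramid(map_img, levels)
    scan_pyr = build_pyramid(scan_img, levels)
    best = (0, 0)
    for lvl in range(levels - 1, -1, -1):
        m, s = map_pyr[lvl], scan_pyr[lvl]
        if lvl == levels - 1:
            cands = [(dx, dy) for dx in range(-radius, radius + 1)
                              for dy in range(-radius, radius + 1)]
        else:
            cx, cy = best[0] * 2, best[1] * 2  # propagate to finer level
            cands = [(cx + ex, cy + ey) for ex in (-1, 0, 1)
                                        for ey in (-1, 0, 1)]
        best = max(cands, key=lambda d: match_score(m, s, d[0], d[1]))
    return best
```

The coarse level trades precision for speed: OR-pooling guarantees that an occupied fine cell is never lost at a coarse level, so the coarse search cannot miss the neighborhood of the true shift, and only a constant number of candidates is scored at full resolution. A full 2D localizer would also search over rotation; this sketch covers translation only.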

Cite this article as:
Y. Kanuki and N. Ohta, “Development of Autonomous Robot with Simple Navigation System for Tsukuba Challenge 2015,” J. Robot. Mechatron., Vol.28, No.4, pp. 432-440, 2016.

