JRM Vol.17 No.3 pp. 262-268
doi: 10.20965/jrm.2005.p0262


Navigation Using Local Landmarks in a Corridor Environment

Kazumi Oikawa*, Hidenori Takauji**, Takanori Emaru***,
Shigenori Okubo*, and Takeshi Tsuchiya****

*Yamagata University, 4-3-16 Jonan, Yonezawa, Yamagata 992-8510, Japan

**Hokkaido University, Kita 14, Nishi 9, Kita-ku, Sapporo, Hokkaido 060-0814, Japan

***University of Electro-Communications, 1-5-1 Chofugaoka, Chofu, Tokyo 182-8585, Japan

****Hokkaido Institute of Technology, 7-15-4-1 Maeda, Teine-ku, Sapporo, Hokkaido 006-8585, Japan

Received: October 18, 2004
Accepted: April 11, 2005
Published: June 20, 2005
Keywords: navigation, local landmark, topological map, autonomous mobile robot, behavior-based robot

We propose robot navigation using local landmarks and topological maps to make robots easier to use. To popularize robots in homes and welfare institutions, robots must be easy for anyone to operate. Conventional robots are generally expensive to deploy because of extensive preparation, such as providing maps with Cartesian coordinates and installing precise environmental landmarks; these are called "setup costs" [1]. For robots to navigate precisely, they must measure their location precisely, which may require high-precision sensors and sophisticated computing, e.g., to compensate for accumulated error caused by wheel slippage. Navigation without position coordinates is difficult, but we solve the problem with a user-friendly robot that uses inexpensive sensors and unsophisticated computing – a "human-like" approach.
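The coordinate-free idea in the abstract can be illustrated with a minimal sketch: a topological map stored as a graph whose nodes are local landmarks and whose edges are corridor segments, with no Cartesian coordinates anywhere. The landmark names and map layout below are hypothetical, not taken from the paper; route planning is plain breadth-first search, used here only as a stand-in for whatever planner the authors employ.

```python
from collections import deque

# Hypothetical topological map of a corridor environment: nodes are
# local landmarks (doors, corners, intersections), edges are corridor
# segments traversable by simple behaviors such as wall-following.
# Note that no (x, y) coordinates are stored.
TOPOLOGICAL_MAP = {
    "entrance":       ["corner_A"],
    "corner_A":       ["entrance", "door_101", "intersection_1"],
    "door_101":       ["corner_A"],
    "intersection_1": ["corner_A", "door_102", "corner_B"],
    "door_102":       ["intersection_1"],
    "corner_B":       ["intersection_1"],
}

def plan_route(topo_map, start, goal):
    """Breadth-first search over landmark nodes; returns the sequence
    of landmarks the robot should recognize in order, or None."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in topo_map[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None

print(plan_route(TOPOLOGICAL_MAP, "entrance", "door_102"))
```

At execution time the robot would follow each edge with a reactive behavior and confirm arrival by recognizing the next landmark in the returned sequence, so no accumulated odometry error needs to be corrected.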

Cite this article as:
Kazumi Oikawa, Hidenori Takauji, Takanori Emaru,
Shigenori Okubo, and Takeshi Tsuchiya, “Navigation Using Local Landmarks in a Corridor Environment,” J. Robot. Mechatron., Vol.17, No.3, pp. 262-268, 2005.
References:
  [1] J. Ota, and T. Arai, “Mono-Function Modular Robot Systems,” J. of the Robotics Society of Japan, Vol.21, No.8, pp. 877-881, 2003 (in Japanese).
  [2] S. Maeyama, A. Ohya, and S. Yuta, “Outdoor Navigation of a Mobile Robot Using Natural Landmarks,” Proc. of Intelligent Autonomous Systems (IAS-5), pp. 164-171, 1998.
  [3] T. Yoshida, A. Ohya, and S. Yuta, “Autonomous Mobile Robot Navigation Using Braille Blocks in Outdoor Environment,” J. of the Robotics Society of Japan, Vol.22, No.4, pp. 469-477, 2004 (in Japanese).
  [4] M. Tomono, and S. Yuta, “Indoor Navigation based on an Inaccurate Map using Object Recognition,” J. of the Robotics Society of Japan, Vol.22, No.1, pp. 83-92, 2004 (in Japanese).
  [5] B. J. Kuipers, and Y.-T. Byun, “A Robust, Qualitative Approach to a Spatial Learning Mobile Robot,” SPIE Sensor Fusion: Spatial Reasoning and Scene Interpretation, Vol.1003, pp. 366-375, 1988.
  [6] M. J. Mataric, “Integration of Representation Into Goal-Driven Behavior-Based Robots,” IEEE Trans. on Robotics and Automation, Vol.8, No.3, pp. 304-312, 1992.
  [7] R. A. Brooks, “A Robust Layered Control System For A Mobile Robot,” IEEE J. of Robotics and Automation, Vol.RA-2, No.1, pp. 14-23, 1986.
  [8] U. Nehmzow, “Mobile Robots: a practical introduction – 2nd ed.,” Springer-Verlag, 2003.
  [9] J. L. Jones, and A. M. Flynn, “Mobile Robots: Inspiration to Implementation,” A K Peters, Ltd., 1993.
  [10] K. Oikawa, and T. Tsuchiya, “Navigation using Artificial Landmark without using Coordinate System,” Proc. of the 21st Conference of the Robotics Society of Japan, 3J11, 2003 (in Japanese).
  [11] K. Oikawa, and T. Tsuchiya, “Navigation for a Behavior-Based Autonomous Mobile Robot,” J. of Robotics and Mechatronics, Vol.10, No.5, pp. 407-412, 1998.

