
Int. J. Automation Technol., Vol.15, No.2, pp. 182-190, 2021
doi: 10.20965/ijat.2021.p0182

Paper:

Localization System for Indoor Mobile Robot Using Large Square-Shaped Reflective Marker

Hiroaki Seki*,†, Ken Kawai**, and Masatoshi Hikizu***

*Faculty of Frontier Engineering, Kanazawa University
Kakuma-machi, Kanazawa, Ishikawa 920-1192, Japan

†Corresponding author

**Graduate School of Natural Science and Technology, Kanazawa University, Kanazawa, Japan

***Faculty of Production Systems Engineering and Sciences, Komatsu University, Komatsu, Japan

Received: August 18, 2020
Accepted: January 26, 2021
Published: March 5, 2021
Keywords: localization, mobile robot, retro-reflective marker, infrared LED, Hough transform
Abstract

A localization system using reflective markers and a fisheye camera with blinking infrared lights is useful and safe for mobile robot navigation in environments where humans and robots coexist; however, it suffers from low robustness and a small measurable range in marker detection. A large, square-shaped reflective marker with solid and dotted edges is proposed for more reliable localization of indoor mobile robots. It can be easily detected using the Hough transform and is robust to occlusion. The robot's position and orientation are determined from the coordinates of the marker's four corners. An infrared lighting unit with a new LED arrangement is designed for a wide measurable range via a brightness simulation that accounts for the effects of observation and reflection angles. A prototype system was developed, enabling the 2D position and orientation to be detected with accuracies of 60 mm and 3°, respectively, within a 4 m² area.
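The abstract states that the robot's 2D pose is recovered from the coordinates of the marker's four corners but does not spell out the computation. As a hypothetical illustration (the function name and the least-squares rigid fit are assumptions for this sketch, not necessarily the authors' method), the pose can be obtained by fitting a 2D rotation and translation between the known marker-frame corners and the detected corners:

```python
import numpy as np

def pose_from_corners(marker_pts, detected_pts):
    """Estimate the 2D rotation angle (deg) and translation mapping
    marker-frame corner coordinates onto detected world-frame
    coordinates via a least-squares rigid fit (2D Kabsch)."""
    a = np.asarray(marker_pts, dtype=float)
    b = np.asarray(detected_pts, dtype=float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    H = (a - ca).T @ (b - cb)               # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    s = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
    R = Vt.T @ np.diag([1.0, s]) @ U.T
    t = cb - R @ ca
    theta = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    return theta, t

# Synthetic check: a 1 m square marker rotated 30 deg and shifted (2, 1) m
marker = [(0, 0), (1, 0), (1, 1), (0, 1)]
th = np.radians(30)
Rz = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
det_pts = [Rz @ np.array(p) + np.array([2.0, 1.0]) for p in marker]
angle, trans = pose_from_corners(marker, det_pts)  # recovers 30 deg, (2, 1)
```

Because all four corners constrain the fit jointly, the estimate degrades gracefully when one corner is perturbed by noise, which is consistent with the robustness to partial occlusion claimed for the marker's Hough-based edge detection.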

Cite this article as:
Hiroaki Seki, Ken Kawai, and Masatoshi Hikizu, “Localization System for Indoor Mobile Robot Using Large Square-Shaped Reflective Marker,” Int. J. Automation Technol., Vol.15, No.2, pp. 182-190, 2021.
