
JACIII Vol.15 No.9 pp. 1269-1276
doi: 10.20965/jaciii.2011.p1269
(2011)

Paper:

Mutual Localization of Multiple Sensor Node Robots

Keitaro Naruse, Shigekazu Fukui, and Jie Luo

School of Computer Science and Engineering, University of Aizu, Tsuruga, Ikki-machi, Aizu-Wakamatsu, Fukushima 965-0006, Japan

Received:
May 18, 2011
Accepted:
June 22, 2011
Published:
November 20, 2011
Keywords:
swarm robotics, sensor network, localization, Kalman filter
Abstract
The objective of this paper is to develop a localization system for cooperative multiple mobile robots, in which each robot is assumed to observe a set of known landmarks and is equipped with an omnidirectional camera. In this paper, it is assumed that a robot can detect other robots by using the omnidirectional camera, share its estimated position with others, and utilize the shared positions for its own localization. In other words, each robot can be viewed as a mobile landmark that supplements the set of stationary landmarks. A foremost concern is how well this system performs localization under a limited amount of information. This paper investigates the self-localization error of each robot in a group, using an Extended Kalman Filter to solve the localization problem with insufficient landmarks and inaccurate position information.
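
As a rough illustration of the approach the abstract describes, the following is a minimal sketch of one Extended Kalman Filter measurement update in which a position shared by another robot is fused in the same way as a stationary landmark, only with an inflated measurement noise covariance. The range-bearing measurement model, the function name ekf_update, and all parameter values are illustrative assumptions; the paper's omnidirectional-camera model may differ (e.g., bearing-only observations).

import numpy as np

def ekf_update(mu, Sigma, z, landmark, R):
    """One EKF measurement update for a range-bearing observation of a
    landmark (stationary, or a position shared by another robot).
    mu: (3,) pose estimate [x, y, theta]; Sigma: 3x3 pose covariance;
    z: (2,) measured [range, bearing]; landmark: (2,) landmark position;
    R: 2x2 measurement noise (inflate it for shared robot positions)."""
    dx, dy = landmark[0] - mu[0], landmark[1] - mu[1]
    q = dx**2 + dy**2
    r = np.sqrt(q)
    z_hat = np.array([r, np.arctan2(dy, dx) - mu[2]])  # predicted measurement
    H = np.array([[-dx / r, -dy / r,  0.0],
                  [ dy / q, -dx / q, -1.0]])           # Jacobian of h at mu
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi        # wrap bearing innovation
    S = H @ Sigma @ H.T + R                            # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)                 # Kalman gain
    mu = mu + K @ y
    Sigma = (np.eye(3) - K @ H) @ Sigma
    return mu, Sigma

# Example usage (assumed values): a shared robot position is fused with a
# larger R than a stationary landmark, reflecting its own estimation error.
mu, Sigma = np.array([0.0, 0.0, 0.0]), np.eye(3) * 0.1
R_static = np.diag([0.05, 0.01])
R_shared = R_static + np.diag([0.20, 0.05])
mu, Sigma = ekf_update(mu, Sigma, np.array([2.0, 0.5]),
                       np.array([1.8, 1.0]), R_shared)
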
Cite this article as:
K. Naruse, S. Fukui, and J. Luo, “Mutual Localization of Multiple Sensor Node Robots,” J. Adv. Comput. Intell. Intell. Inform., Vol.15 No.9, pp. 1269-1276, 2011.
