JRM Vol.17 No.2 pp. 218-225
doi: 10.20965/jrm.2005.p0218


Autonomous Mobile Surveillance Based on RTK-GPS in Urban Canyons

Jun-ichi Meguro*, Rui Hirokawa**, Jun-ichi Takiguchi**,
and Takumi Hashizume*

*Advanced Research Institute for Science and Engineering, Waseda University, 17 Kikui-cho, Shinjyuku-ku, Tokyo 162-0044, Japan

**Mitsubishi Electric Corp. Kamakura Works, 325 Kamimachiya, Kamakura, Kanagawa 247-0065, Japan

Received: October 24, 2004
Accepted: January 6, 2005
Published: April 20, 2005
Keywords: moving robot, sensor, network-based RTK-GPS/INS, positioning, augmentation services
This paper describes an autonomous mobile surveillance system for use in plants and at high-rise buildings. The system consists of a wireless LAN, a base station, and an autonomous vehicle. The vehicle uses GPS/INS navigation based on network-based Real-Time Kinematic GPS (RTK-GPS) with Positioning Augmentation Services (PASTM, Mitsubishi Electric Corporation, 2003), together with an Area Laser Radar (ALR), a slave camera, and an Omni-Directional Vision (ODV) sensor for surveillance and reconnaissance. The vehicle switches among three control modes (normal, road tracking, and crossing recognition) according to the vehicle navigation error. A field test shows that the vehicle tracks planned straight paths to within 0.10 m and planned curved paths to within 0.25 m, even without RTK fixed solutions. Field experiments and analysis show that the proposed navigation provides sufficient navigation and guidance accuracy under poor satellite geometry and visibility, and that the panorama image database with absolute positioning is useful for surveillance.
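The mode-switching idea in the abstract can be sketched as a simple threshold test on the estimated navigation error. The sketch below is illustrative only: the threshold values, function names, and the rule that larger error selects a more conservative mode are assumptions, not details taken from the paper.

```python
# Hedged sketch: choosing a control mode from the estimated navigation
# error, in the spirit of the paper's normal / road tracking / crossing
# recognition modes. Thresholds are hypothetical.
from enum import Enum


class ControlMode(Enum):
    NORMAL = "normal"                              # navigation error is small
    ROAD_TRACKING = "road tracking"                # larger error: follow road with ALR
    CROSSING_RECOGNITION = "crossing recognition"  # relocalize at a crossing


def select_mode(nav_error_m: float,
                normal_limit_m: float = 0.10,      # assumed threshold
                tracking_limit_m: float = 0.50) -> ControlMode:  # assumed threshold
    """Return a control mode for the given estimated navigation error (metres)."""
    if nav_error_m <= normal_limit_m:
        return ControlMode.NORMAL
    if nav_error_m <= tracking_limit_m:
        return ControlMode.ROAD_TRACKING
    return ControlMode.CROSSING_RECOGNITION


print(select_mode(0.05).value)  # normal
print(select_mode(0.30).value)  # road tracking
print(select_mode(1.20).value)  # crossing recognition
```

In practice the error estimate would come from the GPS/INS filter covariance; the point of the sketch is only that each mode is entered deterministically from the current error bound.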
Cite this article as:
J. Meguro, R. Hirokawa, J. Takiguchi, and T. Hashizume, “Autonomous Mobile Surveillance Based on RTK-GPS in Urban Canyons,” J. Robot. Mechatron., Vol.17 No.2, pp. 218-225, 2005.
