Special Issue on Mobile Robot
Department of Robotics, Mechanical Engineering Laboratory, 1-2, Namiki, Tsukuba, Ibaraki 305-8564, Japan
Published: February 20, 1999
Mobility, or locomotion, is as important a function for robots as manipulation. A robot can enlarge its workspace by locomotion. It can also recognize its environment better by moving around and observing its surroundings from various directions with its sensors. Much research has been done on mobile robots, and the field may appear mature. Nevertheless, research activity on robot mobility is still very strong; for example, 22% of the sessions at ICRA'98 - the International Conference on Robotics and Automation - and 24% of the sessions at IROS'98 - the International Conference on Intelligent Robots and Systems - dealt with issues directly related to mobile robots. One of the main reasons may be that intelligent mobile robots are considered the closest to practical autonomous robot applications.

This special issue covers a variety of mobile robot research, from mobile mechanisms, localization, and navigation to remote control through networks.

The first paper, entitled "Control of an Omnidirectional Vehicle with Multiple Modular Steerable Drive Wheels," by M. Hashimoto et al., deals with locomotion mechanisms. The authors propose an omnidirectional mobile mechanism consisting of modular steerable drive wheels. The omnidirectional capability of mobile mechanisms will be an important element of human-friendly robots in the near future, enabling flexible movement in indoor environments.

The next three papers focus on audiovisual sensing for localizing and navigating a robot. The second paper, entitled "High-Speed Measurement of Normal Wall Direction by Ultrasonic Sensor," by A. Ohya et al., proposes a method to measure the normal direction of walls with an ultrasonic array sensor. The third paper, entitled "Self-Position Detection System Using a Visual-Sensor for Mobile Robots," is written by T. Tanaka et al. In their method, the position of the robot is determined by measuring marks such as name plates and fire alarm lamps with a visual sensor.
In the fourth paper, entitled "Development of Ultra-Wide-Angle Laser Range Sensor and Navigation of a Mobile Robot in a Corridor Environment," written by Y. Ando et al., a very wide view-angle sensor is realized using five laser fan-beam projectors and three CCD cameras.

The next three papers discuss navigation problems. The fifth paper, entitled "Autonomous Navigation of an Intelligent Vehicle Using 1-Dimensional Optical Flow," by M. Yamada and K. Nakazawa, discusses navigation based on visual feedback. In this work, navigation is realized using general, qualitative knowledge of the environment. The sixth paper, entitled "Development of Sensor-Based Navigation for Mobile Robots Using Target Direction Sensor," by M. Yamamoto et al., proposes a new sensor-based navigation algorithm for environments with unknown obstacles. The seventh paper, entitled "Navigation Based on Vision and DGPS Information for Mobile Robots," by S. Kotani et al., describes a navigation system for an autonomous mobile robot in an outdoor environment. The unique point of their paper is the use of landmarks and a differential global positioning system to determine robot position and orientation.

The last paper deals with the relationship between mobile robots and computer networks. The paper, entitled "Direct Mobile Robot Teleoperation via Internet," by K. Kawabata et al., proposes direct teleoperation of a mobile robot via the Internet. Such network-based robotics will be an important field of robotics application.

We sincerely thank all of the contributors to this special issue for their cooperation from the planning stage through the review process. Many thanks also go to the reviewers for their excellent work. We will be most happy if this issue helps readers understand recent trends in mobile robot research and furthers interest in this research field.
Cite this article as: K. Komoriya, "Special Issue on Mobile Robot," J. Robot. Mechatron., Vol.11 No.1, p. 1, 1999.