J. Robot. Mechatron. Vol.17 No.2, pp. 208-217 (2005)
doi: 10.20965/jrm.2005.p0208

Paper:

Development of Optical Communication Marks for Mobile Robots to Recognize Their Environment and to Handle Objects

Keiji Nagatani*, Hiroyasu Sato**, Hidenori Tasaka***,
Akio Gofuku****, and Yutaka Tanaka*

*The Graduate School of Natural Science and Technology, Okayama University, 3-1-1 Tsushima Naka, Okayama 700-8530, Japan

**Denso Corp.

***Tasaka Ironworks

****Department of Systems Engineering, Faculty of Engineering, Okayama University

Received:
October 30, 2004
Accepted:
January 6, 2005
Published:
April 20, 2005
Keywords:
Marks system, mobile manipulator, object grasping, localization, optical communication
Abstract
In an environment in which robots and human beings coexist, it is difficult for a mobile manipulator to grasp an object autonomously. Generally, the most difficult aspect of realizing such action is object recognition. Such an environment contains many objects of irregular shape and apparent size (because the distance from the sensors to each object is unknown), and vision sensors are frequently occluded. Object abstraction and recognition thus remain far from practical with current sensing technology. To solve the recognition problem, we developed marks recognized through optical communication rather than attempting to improve sensing technology. Marks using light-emitting diodes (LEDs) have the advantages of (1) being detectable as long as they remain in sight, (2) transmitting the properties of the target correctly using digital signals, and (3) enabling localization of the mark position using stereovision. Once the marks are attached to several movable objects, a robot can easily find the location of a target object by communicating with the mark attached to it. Also, if the marks are located at known positions, such as on walls, a mobile robot can localize itself by detecting them. In this research, we developed an Intelligent Mark System (IMS) in which a robot and marks communicate using visible and infrared LEDs. Using this system, we performed (1) object grasping tasks (the target is a small can with IMS) using a mobile manipulator, (2) localization of a mobile robot using IMS marks on several walls, and (3) recognition of large objects (desks). In this paper, we give an overview of communication between marks and robots, and we discuss task performance results for an autonomous mobile manipulator using IMS in a real environment. We also report the feasibility and limitations of our proposal.
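To make the stereovision localization mentioned above concrete, the following is a minimal illustrative sketch in Python (not the authors' implementation) of how a detected LED mark could be triangulated from a rectified stereo pair using the standard pinhole model; the function name and all parameter values are hypothetical.

# Illustrative sketch only: triangulate a mark's 3-D position from a rectified
# stereo pair with the standard pinhole model. Not the paper's code; names and
# numbers below are hypothetical placeholders.

def triangulate_mark(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Return the mark position (X, Y, Z) in the left-camera frame (meters).

    u_left, u_right : horizontal pixel coordinates of the detected LED mark
                      in the left and right images of a rectified pair
    v               : common vertical pixel coordinate of the mark
    focal_px        : focal length in pixels
    baseline_m      : distance between the two camera centers in meters
    cx, cy          : principal point of the left camera in pixels
    """
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("mark must have positive disparity to be triangulated")
    z = focal_px * baseline_m / disparity      # depth from disparity
    x = (u_left - cx) * z / focal_px           # lateral offset
    y = (v - cy) * z / focal_px                # vertical offset
    return x, y, z

# Hypothetical usage: mark seen at (402, 315) in the left image and (386, 315)
# in the right image, with a 0.12 m baseline and a 700 px focal length.
print(triangulate_mark(402, 386, 315, focal_px=700, baseline_m=0.12, cx=320, cy=240))

Under these assumed values the mark triangulates to roughly 5.25 m in front of the left camera, about 0.62 m to the side and 0.56 m below the optical axis.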
Cite this article as:
K. Nagatani, H. Sato, H. Tasaka, A. Gofuku, and Y. Tanaka, “Development of Optical Communication Marks for Mobile Robots to Recognize Their Environment and to Handle Objects,” J. Robot. Mechatron., Vol.17 No.2, pp. 208-217, 2005.
