
JACIII Vol.16 No.6 pp. 704-712 (2012)
doi: 10.20965/jaciii.2012.p0704

Paper:

Proposal of Method “Motion Space” to Express Movement of Robot

Kentarou Kurashige*, Naoki Kitayama*, and Masafumi Kiyohashi**

*Muroran Institute of Technology, 27-1 Mizumoto-cho, Muroran City, Hokkaido 050-8585, Japan

**Sun Information & Service Co., Ltd., 3-15-9 Hongo, Bunkyo-ku, Tokyo 113-0033, Japan

Received: February 19, 2012
Accepted: June 20, 2012
Published: September 20, 2012
Keywords: motion space, knowledge of movement, method to express movement of robot, teaching playback
Abstract
In recent years, the use of robots has been spreading to various fields, and the demands placed on robots are increasing. A method is therefore needed that allows people who are not robotics experts to operate a robot. We aim to develop a method of robot operation that requires no technical knowledge from the user. In this paper, we focus on methods whereby a user assigns movement to a robot and the robot reproduces that movement. One of the most widely used techniques today is teaching playback, in which a teacher moves a robot using controllers, records the movement, and has the robot play it back. However, robots operated via teaching playback cannot adapt to a changing environment, and the environments in which human beings live generally do change; teaching playback is therefore not usable in variable environments. Methods for generating movement robustly in such environments have been studied, but designing a robot's movement with these methods requires an understanding of complicated formulas, so only movement designers with technical knowledge can use them. To solve these problems, we propose a new form of knowledge of movement, that is, information used to generate a robot's movement. In conventional methods, this knowledge takes the form of a complicated formula. With our method, a robot acquires this knowledge from information obtained by moving the robot, much as in teaching playback. We expect that our method will let a user move a robot in the desired manner.
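
For reference, the sketch below illustrates generic teaching playback, the baseline the abstract contrasts against (it is not the proposed motion-space method, which the paper itself describes). The robot interface, with read_joints() and set_joints(), is a hypothetical assumption: the demonstrated joint trajectory is stored verbatim and replayed open-loop, which is precisely why such a robot cannot adapt to a changing environment.

# Minimal sketch of generic teaching playback (illustrative only; the
# robot interface below is a hypothetical assumption, not from the paper).
import time

class TeachingPlayback:
    def __init__(self, robot, sample_period=0.02):
        self.robot = robot        # assumed to expose read_joints() / set_joints()
        self.dt = sample_period   # sampling interval in seconds
        self.trajectory = []      # recorded joint-angle vectors

    def record(self, duration):
        """Sample joint angles while a teacher moves the robot with controllers."""
        self.trajectory = []
        for _ in range(int(duration / self.dt)):
            self.trajectory.append(self.robot.read_joints())
            time.sleep(self.dt)

    def play_back(self):
        """Replay the stored trajectory verbatim, with no adaptation to the environment."""
        for joints in self.trajectory:
            self.robot.set_joints(joints)
            time.sleep(self.dt)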
Cite this article as:
K. Kurashige, N. Kitayama, and M. Kiyohashi, “Proposal of Method “Motion Space” to Express Movement of Robot,” J. Adv. Comput. Intell. Intell. Inform., Vol.16 No.6, pp. 704-712, 2012.
