JRM Vol.29 No.5 pp. 919-927
doi: 10.20965/jrm.2017.p0919


Adaptive Learning of Hand Movement in Human Demonstration for Robot Action

Ngoc Hung Pham and Takashi Yoshimi

Graduate School of Engineering and Science, Shibaura Institute of Technology
3-7-5 Toyosu, Koto, Tokyo 135-8548, Japan

Received: February 21, 2017
Accepted: June 14, 2017
Published: October 20, 2017
Keywords: learning from demonstration, hand movements, dynamic movement primitives, robot actions

Figure: Robot arm LWA3 performing "pick up a cup"

This paper describes a process for adaptively learning hand movements from human demonstration for robot manipulation actions using the Dynamic Movement Primitives (DMPs) framework. The process consists of 1) tracking the hand movement in the human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMPs framework. We implement an extended DMPs model with a modified formulation for the hand movement data observed from the human demonstration, including the 3D hand position, hand orientation, and finger distance. We evaluate the movements generated by the DMPs model, both reproduced without changes and adapted to a changed movement goal. The adapted movement data are used to control a robot arm through the spatial position and orientation of its end-effector, which is fitted with a parallel gripper.


