
IJAT, Vol.15 No.2, pp. 197-205, 2021
doi: 10.20965/ijat.2021.p0197

Technical Paper:

Offline Direct Teaching for a Robotic Manipulator in the Computational Space

Satoshi Makita*,†, Takuya Sasaki**, and Tatsuhiro Urakawa**

*Fukuoka Institute of Technology
3-30-1 Wajirohigashi, Higashi-ku, Fukuoka 811-0295, Japan

†Corresponding author

**National Institute of Technology, Sasebo College, Sasebo, Japan

Received: August 21, 2020
Accepted: January 18, 2021
Published: March 5, 2021

Keywords: robot teaching, direct teaching, augmented reality, virtual reality
Abstract

This paper proposes a robot teaching method that uses augmented and virtual reality technologies. Robot teaching is essential for robots to accomplish various tasks in industrial production. Although there are many approaches to motion planning for robot manipulation, robot teaching is still required for precision and reliability. Online teaching, in which a physical robot is moved in the real space to obtain the desired motion, is widely performed because of its ease and reliability; however, it requires moving the actual robot during teaching. In contrast, offline teaching can be accomplished entirely in the computational space, but it requires constructing the robot’s surroundings as computer graphics models, and planar displays do not provide sufficient information on 3D scenes. The proposed method is a form of offline teaching in which the operator manipulates the robot intuitively in the virtual 3D space using a head-mounted device and its dedicated controllers. We demonstrate two approaches to robot teaching with augmented and virtual reality technologies and present experimental results.
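To make the direct-teaching idea concrete, the sketch below records a sequence of end-effector poses, as an operator might trace them with a tracked controller in the virtual scene, and replays them as a taught trajectory. This is a minimal, self-contained Python sketch, not the authors’ implementation: the Pose type, the DirectTeachingRecorder class, and the linear replay interpolation are all assumptions introduced for illustration; in the paper’s setting the poses would come from an HMD’s tracked controllers.

```python
# Minimal sketch of offline direct teaching (illustrative, not the paper's code):
# record end-effector poses "grabbed" in a virtual scene, then replay them
# as an interpolated trajectory for the manipulator to follow.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Pose:
    """End-effector pose: position (m) and orientation quaternion (x, y, z, w)."""
    position: Tuple[float, float, float]
    orientation: Tuple[float, float, float, float]

class DirectTeachingRecorder:
    def __init__(self) -> None:
        self.waypoints: List[Pose] = []

    def record(self, pose: Pose) -> None:
        """Store the current controller-driven end-effector pose as a waypoint."""
        self.waypoints.append(pose)

    def replay(self, steps_per_segment: int = 10) -> List[Pose]:
        """Linearly interpolate positions between consecutive waypoints.
        (A real system would also interpolate orientations, e.g., by slerp.)"""
        path: List[Pose] = []
        for a, b in zip(self.waypoints, self.waypoints[1:]):
            for i in range(steps_per_segment):
                t = i / steps_per_segment
                pos = tuple(pa + t * (pb - pa)
                            for pa, pb in zip(a.position, b.position))
                path.append(Pose(pos, a.orientation))
        if self.waypoints:
            path.append(self.waypoints[-1])  # end exactly on the last taught pose
        return path

# Usage: record a few poses traced in the virtual space, then replay.
recorder = DirectTeachingRecorder()
recorder.record(Pose((0.4, 0.0, 0.3), (0.0, 0.0, 0.0, 1.0)))
recorder.record(Pose((0.4, 0.2, 0.3), (0.0, 0.0, 0.0, 1.0)))
trajectory = recorder.replay()
print(f"{len(trajectory)} interpolated poses")
```

In an actual deployment, the recorded waypoints would be checked against the reconstructed model of the robot’s surroundings before the physical robot executes the trajectory, which is what allows the teaching itself to stay entirely in the computational space.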

Cite this article as:
S. Makita, T. Sasaki, and T. Urakawa, “Offline Direct Teaching for a Robotic Manipulator in the Computational Space,” Int. J. Automation Technol., Vol.15 No.2, pp. 197-205, 2021.
