
JRM Vol.23 No.4 pp. 484-493 (2011)
doi: 10.20965/jrm.2011.p0484

Paper:

Visual Marker System for Autonomous Object Handling by Assistive Robotic Arm

Hideyuki Tanaka, Tetsuo Tomizawa, Yasushi Sumi,
Jae Hoon Lee, Hyun Min Do, Bong Keun Kim,
Tamio Tanikawa, Hiromu Onda, and Kohtaro Ohba

National Institute of Advanced Industrial Science and Technology (AIST), AIST Tsukuba Central 2, 1-1-1 Umezono, Tsukuba, Ibaraki 305-8568, Japan

Received: January 22, 2011
Accepted: April 27, 2011
Published: August 20, 2011

Keywords: visual marker, environment structuring, assistive robotic arm

Abstract
The environment structuring concept has been used to design and develop a visual marker system for automating a robotic arm that assists people with upper-limb impairments in leading independent lives. Confirmation experiments demonstrate that our system meets a range of practical performance requirements. Combining the marker system with semiautonomous control and simple visual feedback enabled the assistive robotic arm to handle objects autonomously, something that had not been possible with conventional approaches.
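
As an illustration of how such a marker system can feed simple visual feedback, the sketch below detects fiducial markers in a camera image and estimates each marker's pose. This is a minimal sketch, not the authors' implementation: it uses OpenCV's ArUco module (opencv-contrib-python, version 4.7 or later) in place of ARToolKitPlus-style markers, and the camera intrinsics, marker side length, and the helper name marker_poses are assumed values introduced here for illustration.

    # Minimal marker-detection and pose-estimation sketch (assumptions noted above).
    import cv2
    import numpy as np

    MARKER_SIDE = 0.04  # assumed marker side length in metres
    # Assumed pinhole intrinsics; replace with calibrated values.
    K = np.array([[600.0, 0.0, 320.0],
                  [0.0, 600.0, 240.0],
                  [0.0, 0.0, 1.0]])
    DIST = np.zeros(5)  # assume negligible lens distortion

    # 3-D corners of a square marker in its own frame,
    # ordered to match ArUco's corner order (TL, TR, BR, BL).
    s = MARKER_SIDE / 2.0
    OBJ_PTS = np.array([[-s, s, 0], [s, s, 0],
                        [s, -s, 0], [-s, -s, 0]], dtype=np.float32)

    detector = cv2.aruco.ArucoDetector(
        cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
        cv2.aruco.DetectorParameters())

    def marker_poses(image):
        """Return {marker_id: (rvec, tvec)} for each marker found in the image."""
        corners, ids, _rejected = detector.detectMarkers(image)
        poses = {}
        if ids is not None:
            for marker_id, quad in zip(ids.flatten(), corners):
                ok, rvec, tvec = cv2.solvePnP(
                    OBJ_PTS, quad.reshape(4, 2), K, DIST,
                    flags=cv2.SOLVEPNP_IPPE_SQUARE)
                if ok:
                    poses[int(marker_id)] = (rvec, tvec)
        return poses

Each tvec gives a marker's position in the camera frame; a semiautonomous controller could servo the gripper toward it while the user retains high-level command of the task.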
Cite this article as:
H. Tanaka, T. Tomizawa, Y. Sumi, J. Lee, H. Do, B. Kim, T. Tanikawa, H. Onda, and K. Ohba, “Visual Marker System for Autonomous Object Handling by Assistive Robotic Arm,” J. Robot. Mechatron., Vol.23 No.4, pp. 484-493, 2011.
